Adobe introduces Project Morpheus
Adobe has shown off a prototype tool called Project Morpheus that demonstrates both the capabilities and the problems of integrating deepfake technology into its products.
Project Morpheus is essentially a video version of the neural filters the company introduced in Photoshop last year. These filters use machine learning to adjust a person's appearance, tweaking attributes like age, hair color, and facial expression to, for example, turn a surprised look into an angry one.
Project Morpheus applies these same adjustments to video content and adds new filters, such as the ability to change facial hair and glasses.
It should be noted that the results are not flawless, and compared with the wider world of deepfakes the tool's scope is very limited: you can make small, predefined adjustments to how a person appears on camera, but you can't swap things like faces. Still, the quality is improving quickly. The feature is currently only a prototype, and there are no guarantees it will ever appear in Adobe software, but the company is clearly considering the possibility.
Building on neural filters, Project Morpheus automates the editing process frame by frame, keeping results consistent across the video.
During a demo, an employee quickly edited a video of himself, with Adobe's Sensei AI doing most of the work.
The tool could clearly be abused, and the company says it is keeping this in mind. "The Project Morpheus tool is a way for us to preview exploratory and future technologies from research labs and engineering teams," a spokesperson said.
Adobe's Project Morpheus uses artificial intelligence
A company spokesperson added that these proof-of-concept ideas aren't always intended to be incorporated into products: "We know we need to strike a balance between innovation and accountability to ensure our technology is used for the benefit of our customers and communities. Our development of AI follows the principles of accountability, responsibility, and transparency."
Additionally, the company pointed to its work on content authenticity. First announced in 2019, that project saw Adobe collaborate with The New York Times and Twitter to develop a metadata tagging system designed to help identify edited photos and videos online.
Adobe has also started beta testing this attribution system, which lets professionals, including photojournalists and artists, attach attribution data to their images, indicating when the file was edited.