At its MAX 2025 conference in Los Angeles, Adobe Inc. announced a suite of next-generation AI-powered creative tools with the potential to revolutionize the way filmmakers, designers, and content creators edit visual media.
The headline capabilities: remove objects from an entire video with a single-frame edit, or switch on lights and reshape shadows in a finished image.
The new features are powered by Adobe’s Firefly generative-AI models, and they underline how actively the creative-software company is positioning its Creative Cloud platform to stay ahead of the intelligent-editing curve, and not only for professionals.
Generative AI Steps into the Director’s Chair
Adobe’s new AI demos (internally code-named Project Frame Forward and Project Light Touch) drew the most attention during the company’s “Sneaks” event, where engineers reveal experimental tools that may eventually make their way into mainline features.
With Project Frame Forward, Adobe engineers demonstrated something that sounds almost impossible: edit a single frame of a video, and the AI automatically applies the change to every frame in the sequence.
Picture removing a microphone stand, a stray onlooker, or an entire background element, then watching the system seamlessly composite every subsequent frame. The AI tracks motion, texture, and lighting continuity, giving editors a clean, natural-looking result in seconds.
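Adobe has not published how Frame Forward works internally, but the core idea of carrying a single-frame edit through a clip can be sketched with classic computer-vision tools. The Python sketch below is purely illustrative, not Adobe’s method: it uses OpenCV dense optical flow to warp a user-drawn edit mask from the first frame onto later frames, and the function name and the pairing with a generative inpainter are assumptions for the example.

```python
import cv2
import numpy as np

def propagate_mask(frames, first_mask):
    """Illustrative only: warp an edit mask drawn on frame 0 onto every
    later frame by following dense optical flow. A production system
    would combine tracking like this with generative inpainting."""
    masks = [first_mask]
    prev_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    h, w = first_mask.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Backward flow: for each pixel in the current frame, where it
        # came from in the previous frame (Farneback params: pyr_scale,
        # levels, winsize, iterations, poly_n, poly_sigma, flags).
        flow = cv2.calcOpticalFlowFarneback(
            gray, prev_gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
        map_x = (grid_x + flow[..., 0]).astype(np.float32)
        map_y = (grid_y + flow[..., 1]).astype(np.float32)
        # Pull each mask pixel from its source location in the old mask.
        masks.append(cv2.remap(masks[-1], map_x, map_y, cv2.INTER_NEAREST))
        prev_gray = gray
    return masks
```

In a real pipeline a warped mask drifts and needs re-estimation every few frames; the demo suggests Adobe’s model handles that continuity, along with texture and lighting, end to end.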
Project Light Touch, meanwhile, takes lighting control to a whole new level. By dragging an on-screen slider or clicking on a lamp, users can “switch on” light sources that never existed, change the direction and color of illumination, or turn an entire daytime scene into night, all through simple AI-driven inference.
A visual-effects artist who worked on Blade Runner 2049 told the MAX audience that the appeal is lighting experimentation after the fact: a blown lamp is no longer a reason to go back and reshoot a scene.
A Preview of the Future of Editing
These innovations extend the generative-AI vision Adobe began years ago with Generative Fill in Photoshop and Generative Remove in Lightroom. Those features, however, were limited to static images; the 2025 Sneaks make clear that video is next.
Adobe is drawing on years of research in scene understanding, motion prediction, and diffusion models. The aim is a creative platform where generative AI is not merely a helper but a genuine partner.
“We see this as a significant advance, from pixel editing to scene editing,” said Ely Greenfield, Adobe’s CTO of Digital Media. “Our tools are becoming more conversational, predictive, and context-aware.”
This evolution lets editors work more like directors: describing intent rather than performing every mechanical task by hand. It is part of a larger shift toward creative AI, in which human creativity still sets the vision while the brute-force work is automated.
From Tedious to Effortless: What’s in it for the Creator
These tools could change how creative professionals work day to day:
- Faster editing: Removing a distraction from a five-minute video no longer means hours of frame-by-frame rotoscoping and paint work.
- Non-destructive lighting: Adjust illumination after the shoot, rescuing poorly lit footage without a reshoot.
- Budget efficiency: Independent creators and small studios gain post-production capabilities once reserved for major studios.
- Creative experimentation: Directors can instantly shift a scene’s mood from bright daylight to warm candlelight without relighting the set.
For marketers, filmmakers, and digital-first brands, these tools could shrink production cycles while raising cinematic quality, a critical advantage in a world where content velocity equals visibility.
Growing Concerns About Authenticity
While the potential of this machine learning is hard to dispute, Adobe’s new suite of AI capabilities has also revived the debate over authenticity in media. If editors can easily delete, insert, and relight any scene they wish, how does the viewer know what to trust?
Adobe says its tools are designed with transparency in mind. The company is also expanding its Content Authenticity Initiative (CAI), an industry-wide effort that embeds provenance metadata into images and videos edited with Firefly. That metadata lets publishers and audiences verify when and how AI tools were used.
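The CAI approach is built around the open C2PA standard, in which a cryptographically signed manifest travels with the asset. The sketch below is a loose, simplified illustration of that idea in Python, not the real C2PA format or Adobe’s implementation: the field names, the HMAC scheme, and the demo key are all assumptions (real Content Credentials use certificate-based signatures).

```python
import hashlib
import hmac
import json

SECRET_KEY = b"demo-signing-key"  # hypothetical; C2PA actually uses X.509 certificates

def make_manifest(asset_bytes: bytes, tool: str, action: str) -> dict:
    """Toy provenance manifest: hash the asset, record the edit,
    and sign the claim so later tampering is detectable."""
    claim = {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "tool": tool,        # e.g. "Firefly Generative Fill"
        "action": action,    # e.g. "object_removal"
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return claim

def verify_manifest(asset_bytes: bytes, manifest: dict) -> bool:
    """True only if the signature is valid and the asset is unchanged."""
    claim = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, manifest["signature"])
            and claim["asset_sha256"] == hashlib.sha256(asset_bytes).hexdigest())
```

The point of the pattern is that any further edit to the pixels invalidates the recorded hash, so a provenance claim cannot silently survive tampering.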
Experts caution that even responsible innovation creates challenges. “As synthetic edits and synthetic content grow more sophisticated, it becomes harder to draw the line between human intent and machine-generated imagery,” says Dr. Rachel Falk, head of the Cyber Security Cooperative Research Centre. “Creators and platforms must build in ethical guardrails.”
Technology Behind the Magic
Both Project Frame Forward and Project Light Touch run on Adobe’s Firefly family of models, which is trained on licensed, high-quality datasets to ensure brand safety and copyright compliance.
Frame Forward uses temporal diffusion techniques to reason about how a visual edit should carry forward through time. Light Touch relies on a neural lighting model that understands scene depth, surface reflectivity, and the physics of color.
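Adobe has not published the model details, but the physical relationship such a lighting model must capture can be illustrated with textbook shading. Below is a minimal numpy sketch (illustrative only, not Adobe’s neural model) of Lambertian relighting, assuming per-pixel surface normals and albedo have already been estimated, say by a depth- and material-estimation network; the function name and parameters are invented for the example.

```python
import numpy as np

def relight(albedo, normals, light_dir, light_color):
    """Lambertian relighting: shade each pixel by the cosine between
    its surface normal and the direction toward a virtual light.

    albedo      -- (H, W, 3) base surface color in [0, 1]
    normals     -- (H, W, 3) unit surface normals (assumed given)
    light_dir   -- (3,) direction pointing toward the light
    light_color -- (3,) RGB intensity of the virtual light
    """
    L = np.asarray(light_dir, dtype=np.float64)
    L /= np.linalg.norm(L)
    # n . l, clamped at zero: surfaces facing away receive no light.
    ndotl = np.clip(normals @ L, 0.0, None)            # (H, W)
    shading = ndotl[..., None] * np.asarray(light_color)
    return np.clip(albedo * shading, 0.0, 1.0)

# "Switch on" a warm lamp above and to the left of a flat gray scene.
h, w = 480, 640
albedo = np.full((h, w, 3), 0.7)
normals = np.dstack([np.zeros((h, w)), np.zeros((h, w)), np.ones((h, w))])
lit = relight(albedo, normals, light_dir=(-0.5, 0.5, 1.0),
              light_color=(1.0, 0.85, 0.6))
```

A neural model goes far beyond this single cosine term, handling shadows, inter-reflections, and complex materials, but the depth, reflectivity, and color physics Adobe cites correspond to the normals, albedo, and shading terms in exactly this kind of equation.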
Adobe stressed that these projects are prototypes and are not yet part of the public releases of Premiere Pro and After Effects. Insiders suggest, however, that commercial rollouts could arrive in 2026.
A New Era for Creative Cloud
For Adobe, this year’s MAX conference wasn’t just a show of cool demos; it was a statement of direction. As companies such as Runway, Pika Labs, and Stability AI battle for the AI-video space, Adobe hopes to stand out through trusted, enterprise-grade AI tools for its professional ecosystem.
Integrating generative AI directly into Creative Cloud also means that Adobe’s massive base of professional users, from filmmakers to photographers to graphic designers, can adopt new AI workflows without leaving the tools they already use.
Adobe’s point is straightforward: AI shouldn’t replace creativity; it should augment it.
The Broader Impact on the Creative Industry
Adobe’s announcement reaches well beyond post-production studios:
- Education: Design schools and film programs will need to teach AI-assisted editing as a foundational skill.
- Enterprise content: Brands will be able to produce and adapt campaign visuals instantly and personalize content at scale.
- Accessibility: Non-experts can now achieve high-end results, empowering a new generation of creators.
- Ethics & governance: Authenticity labeling and watermarking will become critical to upholding viewer trust.
Analysts predict that by 2027, more than 60% of professional video edits will involve generative AI in some form.
Looking Ahead
From Frame Forward’s intelligent object removal to Light Touch’s photorealistic relighting, Adobe’s sneak previews show just how close the industry is to the long-held dream of editing that feels almost effortless.
Though these tools are not yet publicly available, their unveiling heralds a new era of creativity, one in which AI and artistry blend more seamlessly than ever before. The next few years are likely to redefine what “editing” means in film, design, and digital media.
“Creativity is not something technology replaces,” Adobe said in a post-keynote release. “With these tools, creators are no longer bound by the mechanics; they can concentrate on the content rather than the software.”