Adobe is betting big on its Sensei AI platform, so it's probably no surprise that the company continues to build more AI-powered features into its flagship Photoshop application. At its MAX conference today, Adobe announced a handful of new AI features for Photoshop, with Sky Replacement being the most obvious example. Other new AI-driven features include so-called "Neural Filters," which are essentially the next generation of Photoshop filters, and new and improved tools for selecting parts of images, alongside other tools that improve existing features or simplify the photo-editing workflow.
Photoshop isn't the first tool to offer a Sky Replacement feature. Luminar, for example, has offered one for more than a year already, but it looks like Adobe took its time to get this one right. The idea itself is pretty straightforward: Photoshop can now automatically recognize the sky in your images and then replace it with a sky of your choosing. Because the colors of the sky also influence the overall scene, a naive swap would obviously result in a rather strange image, so Adobe's AI also adjusts the colors of the rest of the image accordingly.
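Adobe hasn't published the details of its pipeline, but the basic recipe described above — segment the sky, composite in a replacement, then harmonize the foreground's colors toward the new sky — can be illustrated with a toy NumPy sketch. This is purely a conceptual illustration, not Adobe's method: the `replace_sky` function, the precomputed boolean `sky_mask`, and the `harmonize` blending factor are all hypothetical stand-ins (in Photoshop, the mask itself comes from a trained segmentation model).

```python
import numpy as np

def replace_sky(image, sky_mask, new_sky, harmonize=0.3):
    """Toy sky-replacement sketch.

    image:    (H, W, 3) uint8 original photo
    sky_mask: (H, W) bool, True where the sky was detected
    new_sky:  (H, W, 3) uint8 replacement sky, same size as image
    harmonize: how strongly to shift foreground colors toward the
               new sky's color balance (0 = none, 1 = full shift)
    """
    # Step 1: composite the new sky wherever the mask says "sky".
    out = np.where(sky_mask[..., None], new_sky, image).astype(np.float32)

    # Step 2: crude color harmonization — nudge the foreground by the
    # difference between the old and new skies' mean colors.
    fg = ~sky_mask
    if sky_mask.any() and fg.any():
        old_mean = image[sky_mask].mean(axis=0)
        new_mean = new_sky[sky_mask].mean(axis=0)
        shift = harmonize * (new_mean - old_mean)
        out[fg] = np.clip(out[fg] + shift, 0, 255)

    return out.astype(np.uint8)
```

A real implementation would also need soft mask edges, relighting, and per-region adjustments, which is presumably where most of Adobe's engineering effort went.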
Image Credits: Adobe
How well all of this works likely depends on the image, too. We haven't been able to try it ourselves, and Adobe's demos, of course, worked flawlessly.
Photoshop will ship with 25 sky replacements, but you can also bring in your own.
Neural Filters are the other highlight of this release. They provide new artistic and restorative filters for improving portraits, for example, or for quickly replacing the background color of an image. The portrait feature will likely get the most immediate use, given that it allows you to change where people are looking, adjust the angle of the light source and "change hair thickness, the intensity of a smile, or add surprise, anger, or make someone older or younger." Some of these are a bit more gimmicky than others, and Adobe says they work best for subtle changes. Either way, making those changes would typically take a lot of manual labor; now it's just a click or two.
Among the other fun new filters are a style-transfer tool and a filter that helps you colorize black-and-white images. On the more practical side, there's a new filter that removes JPEG artifacts.
As Adobe noted, it collaborated with Nvidia on these Neural Filters. While they will work on any device running Photoshop 22.0, there's a real performance benefit to running them on machines with hardware graphics acceleration. No surprise there, given how computationally intensive many of them are.
While improved object selection may not be quite as flashy as Sky Replacement and the new filters, "intelligent refine edge," as Adobe calls it, may just save a few photo editors' sanity. If you've ever tried to use Photoshop's current tools to select a person or animal with complex hair — especially against a busy backdrop — you know how much manual intervention the current crop of tools still requires. Now, with the new "Refine Hair" and "Object Aware Refine Mode," much of that manual work should become unnecessary.
Other new Photoshop features include a new tool for creating patterns, a new Discover panel with improved search, help and contextual actions, faster plugins and more.
Also new is a plugin marketplace for all Creative Cloud apps that makes it easier for developers to sell their plugins.