Firefly, Adobe’s family of generative AI tools, is out of beta testing and ready for commercial use. That means all you creative types now have the green light to use it to create images in Photoshop, try out wild text effects on the Firefly website, recolor artwork in Illustrator, and brighten up posters and videos made with Adobe Express. I did some of that myself.
And now we know what Adobe’s artificial intelligence technology will cost to use. Adobe allots Firefly usage credits in varying amounts depending on which Creative Cloud subscription you pay for, but it will also increase subscription prices in November.
If you have the full Creative Cloud subscription, which gives you access to all Adobe software for $55 per month, you can make up to 1,000 Firefly creations per month. If you have a single-app subscription, for example to use Photoshop or Premiere Pro for $21 per month, that’s 500 creations per month. Subscribers to Adobe Express, a general-purpose design app that costs $10 per month, get 250 uses of Firefly.
“We don’t want anyone to save [credits] or create from a place of scarcity or feeling rationed,” says Deepa Subramaniam, vice president of marketing for Adobe’s Creative Cloud subscription.
But watch out. Adobe will increase its subscription prices by about 9% to 10% in November, citing the addition of Firefly and other AI features, along with new tools and apps. For example, the all-apps subscription will increase from $55 to $60 per month, and a single-app subscription will increase from $21 to $23 per month.
In my experience with Firefly so far, it has delivered some really cool effects, but I’ve also seen its limitations. It’s a cloud-based service, though, so there’s reason to expect Adobe will deliver on its promised improvements as it retrains Firefly for better results.
UBS analyst Karl Keirstead estimated in a report Thursday that Adobe will generate $400 million to $500 million in new revenue from the price increase in the company’s next fiscal year. He had expected, however, that Adobe would charge for a standalone Firefly subscription rather than fold it into overall Creative Cloud pricing. “We wonder if this speaks to Adobe’s confidence in a more direct approach to Firefly monetization,” he said in the report.
Generative AI’s impressive ability to mimic human output entered the public consciousness in 2022 with the arrival of OpenAI’s ChatGPT, a text-based chatbot. Generative AI tools trained on large amounts of data make plenty of mistakes, but Adobe’s customers could be more forgiving because many of them are exploring ideas. Generative AI is better with fantasy than with literal truth.
Customers with paid plans can continue using Firefly after they use up their monthly credit allotment, but generation will be slower, Subramaniam said. Those on the free tier get a taste of the technology with 25 uses per month. Those who expect to exceed their limits can pay $5 per month for 100 additional Firefly usage credits starting in November.
I used Firefly, Photoshop’s generative AI technology, to add this red crab to a photo I took of an American avocet sweeping its beak through a mudflat. Firefly is smart enough to get most of the crab’s reflection right, but if you look closely, the imperfections are obvious.
Adobe pays stock art contributors for AI training images
Also notable about Adobe’s approach: It’s paying the Adobe Stock contributors whose images were used to train Firefly. Adobe will pay out a “meaningful” bonus annually, Subramaniam said. The payout is based primarily on how many times a contributor’s images have been licensed by customers, as well as the total number of images the contributor has had accepted into Adobe Stock.
“This is an opportunity to provide a new revenue stream to our contributors,” said Subramaniam.
Previously, Firefly was available only in beta versions of the software, and Adobe barred its use in commercial projects. To sidestep the copyright issues that can prevent commercial customers from using AI, Adobe trained Firefly on its own corpus of Adobe Stock images and on public domain images.
Firefly is also coming to Adobe’s Premiere Pro video editing tool later this year.
I tried Adobe Firefly AI
During my testing, Firefly was often able to blend new imagery into existing scenes, inserting elements with the Generative Fill tool or widening an image with Generative Expand. It can sometimes match the lighting and perspective of a scene, which is a difficult feat, and even create plausible reflections. It’s particularly adept at reproducing busy environments such as foliage.
But it also often produces deformities or strange glitches – for example, an elephant with a second trunk where its tail should be. Often you’ll have to reject a batch of Firefly duds and try different prompts to get useful results, and so far, at least, it doesn’t seem likely that Midjourney fans will abandon that rival AI image generation tool.
I first used Firefly in Photoshop to expand the original image, left, with new trees and greenery, which it did excellently. The fish I added next is less plausible, but Firefly blended it well with the background.
You can often get better results by splitting the generation into multiple steps. For example, in the skydiving hippo image above, I first asked Photoshop to generate a hippo against a blue sky, then enlarged the image to give it more sky, and then added the parachute.
Images labeled as AI-generated
Many people are alarmed by “deepfake” AI copies of real people and fooled by realistic AI images, such as the pope decked out in a puffy jacket. To help combat those problems, Adobe is using a technology called Content Credentials, which it helped develop to improve transparency.
Images created with Adobe’s tools are labeled as AI-generated using Content Credentials, Subramaniam said.
“That’s really how we’re going to bring some trust and some transparency into the process to demystify all of this,” she said.
Editor’s note: CNET uses an AI engine to help create some stories. See this post for more information.