Adobe's Firefly comes to iOS and Android
Adobe has been on a quest to attract users to its platform for their AI needs. In April, the company launched a redesigned Firefly web app that gives users access to Adobe’s own Firefly image- and video-generation models as well as third-party models.
Now, it is releasing a Firefly app on both iOS and Android that lets people use all of Adobe’s models as well as models from OpenAI (GPT image generation), Google (Imagen 3 and Veo 2), and Black Forest Labs (Flux 1.1 Pro).
Like the web app, the new mobile apps let you generate images or videos from prompts, or convert images into videos. You can also edit parts of an image with generative fill or extend it with generative expand. Adobe Creative Cloud subscribers can start a project in the Firefly mobile app and store it in the cloud to pick it up later in the web or desktop app.
The company is now also supporting more third-party models, including Flux.1 Kontext by Black Forest Labs, Ideogram 3.0 by Ideogram, and Gen-4 Image by Runway.
The company is also updating Adobe Canvas, its collaborative whiteboarding tool, to support video generation using both Adobe’s own video models and those made by its competitors.
Adobe said users have so far created more than 24 billion media assets with its Firefly models, and that its AI features have been a big factor in increasing the number of first-time subscribers by 30% quarter-over-quarter.