Editing photos using text commands – Apple AI work continues

February 12, 2024

Apple’s AI work may be something of a slow burn when it comes to Siri, but the company isn’t sitting back doing nothing. It has just released an open-source AI model for editing photos using simple text commands – something which has been described as a breakthrough achievement.

Named MGIE – for Multimodal large language model Guided Image Editing – the model lets you use natural language to tell the editor what you want to achieve …

For example, a user can simply say “make the sky more blue” and MGIE will interpret that as “identify and select the sky, then increase the saturation in this area by 20%.”
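The two-step flow described above – a terse user instruction expanded into an expressive one, then translated into concrete edit parameters – can be sketched roughly as follows. This is purely an illustrative Python sketch of the idea, not Apple’s actual API; the function names, the lookup table standing in for the language model, and the parameter format are all assumptions.

```python
# Hypothetical sketch of the MGIE-style instruction flow: a terse
# command is first rewritten into an expressive instruction, which is
# then turned into concrete edit parameters. All names here are
# illustrative assumptions, not the real MGIE interface.

def expand_instruction(instruction: str) -> str:
    """Stand-in for the multimodal LLM that rewrites a terse command.
    The real model generates this expansion; here it's a fixed lookup."""
    expansions = {
        "make the sky more blue":
            "identify and select the sky, then increase the saturation "
            "in this area by 20%",
    }
    return expansions.get(instruction, instruction)

def to_edit_params(expanded: str) -> dict:
    """Toy parser turning the expressive instruction into parameters
    an image editor could act on."""
    params = {"region": "sky" if "sky" in expanded else "global"}
    if "saturation" in expanded:
        params["operation"] = "saturation"
        params["amount"] = 0.20  # "increase ... by 20%"
    return params

print(to_edit_params(expand_instruction("make the sky more blue")))
# {'region': 'sky', 'operation': 'saturation', 'amount': 0.2}
```

The point of the intermediate "expressive instruction" step is that a vague request like “make the sky more blue” is ambiguous to a pixel-level editor; the language model first spells out *where* to edit and *by how much* before any pixels change.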

VentureBeat reports that MGIE is able to carry out an impressive range of edits.

MGIE can perform common Photoshop-style edits, such as cropping, resizing, rotating, flipping, and adding filters. The model can also apply more advanced edits, such as changing the background, adding or removing objects, and blending images.

MGIE can optimize the overall quality of a photo, such as brightness, contrast, sharpness, and color balance. The model can also apply artistic effects like sketching, painting and cartooning.

MGIE can edit specific regions or objects in an image, such as faces, eyes, hair, clothes, and accessories. The model can also modify the attributes of these regions or objects, such as shape, size, color, texture and style.

If the model doesn’t deliver the result you expected, you can refine your request, or undo the effect and give a different instruction.

Right now, it’s just an open-source model on GitHub, but there’s an online demo you can use to upload your own images and play with it. A brief play with this shows that it’s definitely an early beta, but I can certainly see the potential.

Here’s my original photo:

My instruction was “make the sky slightly more red” which MGIE interpreted as “Make the sky in the picture a shade of red rather than a shimmering blue. Make the cityscape a shaded shade instead of a stark white sky.” Here’s the result (which is cropped, for unknown reasons):

While it’s not a usable edit as-is (and the demo only supports very low-res output anyway), what is notable for me is the way it accurately adjusted the reflection on the inside of the metal frame to match the sky. It shows early promise for sure.

We don’t know yet when or if Apple might add this capability to iPhone once it delivers more polished results, but it certainly seems a very logical step for a company that has always aimed to make AI photography features as automatic and easy to use as possible.
