
Editing a Photo in Lightroom + Photoshop ... on an iPad

Not too long ago, I never would have considered editing my photos on an iPad. It always felt like something I should save for my desktop. But things have changed. Both Lightroom and Photoshop on the iPad have improved massively, and these days I often use them when traveling. More and more, this mobile workflow is becoming a real option for photographers.

In this walkthrough, I’ll show you how I edited an image completely on the iPad, starting in Lightroom, jumping over to Photoshop when needed, and then finishing off with a print.

Starting in Lightroom on the iPad

The photo I worked on was taken with my iPhone. The first job was the obvious one: straightening the image. In Lightroom, I headed to the Geometry panel and switched on the Upright option, which immediately fixed the horizon.

Next, I dealt with a distraction in the bottom left corner. Using the Remove Tool with Generative AI switched on, I brushed over the wall that had crept into the frame. Lightroom offered three variations, and the second one was perfect.

With those fixes made, I converted the photo to black and white using one of my own synced presets. A quick tweak of the Amount slider gave me just the right level of contrast.

Masking and Sky Adjustments

The sky needed attention, so I created a Select Sky mask. As usual, the AI selection bled slightly into the hills, so I used a Subtract mask to tidy things up. It wasn’t perfect, but it was good enough to move forward.

From there, I added some Dehaze and Clarity to bring detail back into the clouds. A bit of sharpening pushed the image further, but that also revealed halos around a distant lamppost. At that point, I knew it was time to send the photo into Photoshop.

Fixing Halos in Photoshop on the iPad

Jumping into Photoshop on the iPad takes a little getting used to, but once you know where things are, it feels very familiar.

To remove the halos, I used the Clone Stamp Tool on a blank layer set to Darken blend mode. This technique is brilliant because it only darkens areas brighter than the sample point. With a bit of careful cloning, the halos disappeared quickly.
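
To see what the Darken blend mode is actually doing, here's a rough illustration in Python with NumPy; it's purely a sketch of the blend maths, not Photoshop's own code:

    # Illustrative only: Darken keeps whichever value is darker, so cloned
    # pixels can only pull down areas that are brighter than the sample.
    import numpy as np

    def darken_blend(base, clone):
        # base, clone: float arrays in the 0..1 range, same shape
        return np.minimum(base, clone)

    # A bright halo pixel (0.9) painted over with a darker sky sample (0.55)
    # is pulled down, while an already-dark pixel (0.4) is left untouched.
    print(darken_blend(np.array([0.9, 0.4]), np.array([0.55, 0.55])))  # [0.55 0.4]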

I then added a subtle “glow” effect often used on landscapes. By duplicating the layer, applying a Gaussian Blur, and changing the blend mode to Soft Light at low opacity, the image gained a soft, atmospheric look.
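
If you're curious what those steps amount to, here's a small sketch using Pillow (assumed installed); it approximates the duplicate / Gaussian Blur / Soft Light / low opacity recipe rather than reproducing Photoshop's exact maths:

    # Approximate "Orton-style" glow: blur a duplicate, blend it back with
    # Soft Light, then mix the result in at a low opacity.
    from PIL import Image, ImageChops, ImageFilter

    def add_glow(img, blur_radius=20, opacity=0.3):
        blurred = img.filter(ImageFilter.GaussianBlur(blur_radius))  # blurred duplicate layer
        soft = ImageChops.soft_light(img, blurred)                   # Soft Light blend mode
        return Image.blend(img, soft, opacity)                       # low layer opacity

    # Usage (hypothetical file name):
    # result = add_glow(Image.open("landscape.jpg").convert("RGB"))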

Back to Lightroom and Printing

With the edits complete, I sent the image back to Lightroom. From there it synced seamlessly across to my desktop, but the important point is that all of the editing was done entirely on the iPad.

Before printing, I checked the histogram and made some final tweaks. Then it was straight to print on a textured matte fine art paper. Once the ink settled, the result looked fantastic — no halos in sight.

Final Thoughts

I’m not suggesting you should abandon your desktop for editing. Far from it. But the iPad has become a powerful option when you’re traveling, sitting in a café, or simply want to work away from your desk.

This workflow shows what’s possible: you can straighten, retouch, convert to black and white, make sky adjustments, refine details in Photoshop, and even prepare a final print — all from the iPad. And of course, everything syncs back to your desktop for finishing touches if needed.

Exciting times indeed.

AI Just Changed How We ENHANCE EYES in PHOTOSHOP 💥

Two Ways to Add Detail to Dark Eyes in Photoshop

If you’ve ever edited a portrait where the eyes are so dark there’s no detail to recover, you’ll know how tricky it can be. Brightening them often makes things look worse, leaving the subject with flat, lifeless eyes.

In the video above, I walk you through two powerful techniques that solve this problem:

  • A reliable method using Photoshop’s traditional tools

  • A newer approach that uses AI to generate realistic iris detail

Here’s a quick overview of what you’ll see in the tutorial.

The Traditional Photoshop Method

This approach has been in my toolkit for years. It doesn’t try to recover what isn’t there. Instead it creates the impression of natural iris texture.

By adding grain, applying a subtle radial blur, and carefully masking the effect, you can fake detail that looks convincing. A touch of colour adjustment finishes the look, leaving you with eyes that feel alive instead of flat.

It’s a manual process but it gives you full control, and the result is surprisingly realistic.
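
As a toy illustration of why that works, here's a short Python sketch (NumPy and Pillow assumed); averaging slightly rotated copies of random grain is a crude stand-in for a radial spin blur, and the result reads as circular iris texture:

    # Toy example: spin-blurred grain looks like iris fibres.
    import numpy as np
    from PIL import Image

    def fake_iris_texture(size=256, copies=24, max_angle=30):
        grain = Image.fromarray((np.random.rand(size, size) * 255).astype("uint8"))
        acc = np.zeros((size, size), dtype=np.float64)
        for angle in np.linspace(-max_angle / 2, max_angle / 2, copies):
            # Rotating around the centre and averaging approximates a spin blur
            acc += np.asarray(grain.rotate(angle, resample=Image.BILINEAR), dtype=np.float64)
        return Image.fromarray((acc / copies).astype("uint8"))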

The AI-Powered Method

Photoshop’s Generative Fill takes things in a different direction. With a simple selection around the iris and a prompt like “brown iris identification pattern”, Photoshop can generate natural-looking iris textures, the kind of fine patterns you’d expect to see in a close-up eye photo.

Once the AI has created the base texture, you can enhance it further using Camera Raw:

  • brighten the iris

  • increase contrast, clarity, and texture

  • even add a little extra saturation

Add a subtle catchlight and the transformation is incredible. The eyes go from lifeless to full of depth and realism in seconds.

Why These Techniques Matter

Eyes are the focal point of most portraits. If they’re dark and featureless, the whole image suffers.

These two techniques, one traditional and one modern, give you reliable options to fix the problem. Whether you want the hands-on control of Photoshop’s tools or the speed and realism of AI, you’ll be able to bring that essential spark back into the eyes.

🚨 Adobe’s New Cloud Selection Technology: Hype or Reality?

One of the areas Adobe has been relentlessly improving in both Photoshop and Lightroom is selections. Over the years, the tools have become smarter, faster, and more automated. Today, we can make incredibly intricate selections with just a single click. At least, that is what Adobe says.

But if you are anything like me, you will know that demo images shown on stage or in marketing videos always look perfect. They are the kind of photos you would expect to work well in a demo: clean backgrounds, well defined edges, controlled lighting.

That is not real life.
So the question is: what happens when we use these tools on our own photos?

Recently, I tested Adobe’s new Cloud Detailed Results option for Select Subject using nothing more than some quick shots I had taken on my iPhone. The results were genuinely impressive.

Device vs. Cloud: What Is the Difference?

When you click Select Subject in Photoshop, you now have a choice:

  • Device – the selection is processed locally on your computer.

  • Cloud Detailed Results – the file is sent to Adobe’s servers, where the AI analyzes the image and sends back a more refined selection.

The device option is fast but often rough around the edges. The cloud option takes a little longer, but the results are noticeably more accurate.

Putting It to the Test

To really see the difference, I used a handful of everyday photos. Nothing staged, just casual iPhone shots. Here are a few examples:

Motorbike Portrait

With the device option, edges around wheels, helmets, and clothing looked rough and patchy. Switching to the cloud option instantly cleaned things up. Spokes, frames, and even tiny gaps were handled beautifully.

Tree with Branches

This was the kind of subject that used to take several different techniques combined. The cloud option managed to capture the branches and trunk in one go. Yes, there were a few areas that could be tidied up with a brush, but the heavy lifting was done.

Bicycle Spokes

Ordinarily, this is a nightmare for selections. Yet the cloud option picked out individual spokes, valves, and gaps between them. Minimal cleanup needed.

Setting Your Default

If you want Photoshop to always use the cloud option, head to Preferences > Image Processing. Under Select Subject and Remove Background, choose Cloud Detailed Results. From then on, every time you use those tools, Photoshop will default to the cloud method unless you manually switch.

Final Thoughts

I will admit I was skeptical. On demo images these things always look good, but I did not expect my casual iPhone shots to stand up so well. The results from Cloud Detailed Results were consistently sharper, cleaner, and more accurate than anything the device option gave me.

And just to clear up a common question: this does not use your generative AI credits. It is simply sending your image to Adobe’s servers for analysis and returning a selection.

Selections have always been one of the most tedious parts of editing. This new technology does not just save time, it also frees up creative energy. Instead of fighting with edges and masks, you can focus on the fun part: being creative.

Exciting times ahead, and if this is what Adobe is offering now, I cannot wait to see how much better it gets.

The Remove Prompt in Photoshop

The new Remove button in Photoshop, which I mentioned in an earlier post where I shared a video, has been added to prevent what are referred to as "Hallucinations": cases where, instead of removing something, Photoshop would add in a random object.

This works incredibly well BUT doesn't give 3 Variations to choose from, so (and this is new) to use the EXACT SAME technology, make a selection and then type "Remove" in the Contextual Task Bar.

This WILL remove whatever you have selected but now gives you 3 variations to choose from.

Note: Even though this is a removal, because it gives you 3 variations, credits are deducted.

HOW I Edit THIS Portrait in 2025 – Full Lightroom Workflow (No Photoshop!)

In this video I show how I now retouch a stylised portrait entirely in Lightroom, something that until recently was only possible in Photoshop, by making BIG use of Lightroom Masks …

*Newsletter Subscribers can download the same file I use in this tutorial to follow along step by step.

Project Indigo: Adobe Release NEW Pro Camera App for Mobile

A Research-Based Mobile Camera App

OVERVIEW

Project Indigo is a free experimental camera application developed by Adobe Labs, available for iOS devices starting from the iPhone 12 Pro and all non-Pro models beginning with the iPhone 14.

The app is intended as a platform for exploring advanced photography workflows on mobile devices. It combines traditional manual camera controls with computational imaging methods to improve photo quality and give users greater flexibility over how images are captured and processed.

Image Processing and Output

Unlike typical smartphone camera apps that rely heavily on automatic adjustments and stylised enhancements, Indigo emphasises subtle, globally tuned image processing. It also supports capturing both JPEG and RAW (DNG) formats, with computational benefits applied to both. This allows users to maintain editing flexibility while still benefiting from improved noise handling and dynamic range.

One of the core features of Indigo is its multi-frame image capture system. Instead of taking a single photo, Indigo records a short burst of up to 32 frames per shot. These frames are underexposed and then computationally aligned and merged to reduce noise and preserve shadow detail. This technique is applied even when shooting in RAW, which is uncommon for mobile photography apps, and results in cleaner, more editable files with fewer artifacts.
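
To give a rough sense of why merging a burst helps, here's a heavily simplified sketch in Python (alignment, RAW handling, and motion rejection all omitted); Indigo's real pipeline is of course far more sophisticated:

    # Simplified burst merge: averaging N aligned frames cuts random noise
    # by roughly a factor of sqrt(N) while preserving real detail.
    import numpy as np

    def merge_burst(frames):
        # frames: list of already-aligned H x W x 3 arrays (e.g. up to 32 of them)
        stack = np.stack([f.astype(np.float64) for f in frames])
        return stack.mean(axis=0)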

Manual Controls

The app includes full manual controls, giving users the ability to adjust:

  • Shutter speed

  • ISO

  • Focus (including manual focus override)

  • White balance (with both temperature and tint sliders)

  • Number of frames to capture per burst

These settings allow photographers to fine-tune their exposure and image quality, and to optimize for different conditions such as low light or motion.

Specialised Shooting Modes

Indigo also introduces dedicated capture modes for specific photographic techniques. The Long Exposure mode allows users to create effects like light trails or light painting on a mobile phone, without needing an actual long shutter time. This, I hasten to add, is unlike Long Exposure in apps such as EvenLonger or ReeXpose, where we simply choose a time period for the long exposure, from 0.5 seconds up to Bulb mode (ReeXpose) or from 1 second to 24 hours (EvenLonger), and the app then creates the long exposure.

Another feature, Super-Resolution zoom, enables extended zoom up to 10x on the 15 and 16 Pro Max models. How much you get is model dependent, but you can also use pinch to zoom, and the results are actually impressive. This technology is definitely going to affect sales of add-on zoom lenses.

Super-Resolution zoom is active at zoom levels of 10x and above, all the way up to 25x. It increases the effective resolution by 2x horizontally and vertically (4x in pixels). As a result, at 10x you get a 12MP image instead of 3MP, and at 20x you get a 3MP image instead of 0.75MP.
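
A quick back-of-the-envelope check on those numbers (just arithmetic, not anything Adobe has published as code): doubling the resolution in each dimension quadruples the pixel count.

    # 2x horizontally and 2x vertically = 4x the megapixels
    for zoom, native_mp in [(10, 3.0), (20, 0.75)]:
        print(f"{zoom}x: {native_mp} MP -> {native_mp * 4} MP")
    # 10x: 3.0 MP -> 12.0 MP
    # 20x: 0.75 MP -> 3.0 MP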

Lightroom Integration

For users working within the Adobe ecosystem, Indigo integrates directly with Lightroom Mobile. Images can be exported to Lightroom with a single tap. In the case of DNG files, Indigo embeds tone and color metadata that Lightroom can interpret and apply as default settings—though users retain full control to adjust or reset these as needed.

You can also set the Project Indigo camera to be used within Lightroom Mobile.

To do this, make sure you have updated the Lightroom Mobile app. Once updated, go to the app settings from within the Lightroom Mobile app, then go to the EARLY ACCESS section and turn on Use Indigo for Camera (Tech Preview).

Experimental Features

The app also includes access to experimental tools, which currently include AI DeNoise and Remove Reflections.

Personally I’d prefer to see these in the Lightroom Mobile App instead of inside the Camera App itself.

Planned Features and Roadmap

Looking ahead, Adobe plans to expand Indigo with several features:

  • An Android version of the app

  • Additional tone presets and photographic "looks"

  • Support for portrait, panorama, and video capture

  • Exposure and focus bracketing modes for use cases like astrophotography

INITIAL FEEDBACK

By far my favourite feature so far is the Super Resolution Zoom, but there are definitely things I would like to see added, such as a Burst Mode and a Long Exposure function that work very similarly to EvenLonger and ReeXpose, so fingers crossed these appear sooner rather than later.

OVERHEATING

However, on release day I was already receiving feedback from folks saying that an overheating warning appears on their iPhone after only a short period of use.

I discussed this with the development team at Adobe before release, when I was testing the app, and I too noticed my phone getting much warmer than normal.

It was explained to me that this is partly due to the fact that the app uses a different process when handling files, making use of other hardware within the phone, and partly because the heat dispersal of the iPhone 15 Pro Max, which I currently have, is known for not being good.

I have heard from folks using the 16 Pro Max who report the same warning appearing even though that model has improved heat dispersal, so fingers crossed this is something Adobe are able to iron out in the near future.

Check out the official Project Indigo Blog Post for more information …