You NEVER Know ❤️

One story I often tell is of a time, a few years back, when I was presenting at a photography conference in The Netherlands called “Professional Imaging”. At the end of the presentation, a man and his wife came to the side of the stage to speak to me, and he mentioned that his father had died and how devastated he was that the last photograph he took of him was out of focus.

Well, today, completely out of the blue, an email came through from him, mentioning our conversation all those (11) years ago.

I've attached it here for you, for no other reason than to say ... you NEVER know who you are affecting and you never know who is looking in.

Each of us goes about our thing day to day. We take pictures, we create and we share, but believe me when I say that someone is watching, someone is looking in, and is being inspired and encouraged by what you do.

You've not met them yet, and maybe you never will, but they are 100% out there, and for that reason alone ... we have to keep on keeping on!

Best to you,
Glyn

⛔️ Stop Policing Creativity

I don’t normally write a post such as this, but I’ve seen a fair bit of ‘this’ lately, so I just felt the need to put pen to paper, so to speak.

I’m tired of seeing people tell others what they should or shouldn’t be doing with their photography and editing.

We see it all the time in comments and forums; people acting like there is a "correct" way to be creative.

It's tedious. It’s exhausting.

The escape is the point

Photography and editing are personal.

For loads of us, picking up a camera is a break from all the rules, deadlines and stress that come with modern life.

When someone sits down to create, that might be the only hour in their day where they actually feel in control of something. If they want to use a tool that makes things easier or more enjoyable, that's up to them.

The minute we start slapping "rules" on creativity, we turn what should be a release valve into just another chore. We make people second-guess themselves before they share their work or, even worse, push them to stop creating altogether because they're worried about being judged by the purists.

Use the tools you want

This goes for the tools we choose too.

If someone wants to use a particular bit of software or decides to use AI, so what? That's their choice.

If what someone else is doing has absolutely no impact on you, your life, or your own creativity, then why let it concern you?

As long as they're not trying to deceive people or claim credit for something they didn't actually do, let them get on with it. And even if someone does try to be deceptive, they'll get found out eventually. We'd probably do better spending our time keeping our own house in order before we start telling everyone else how to run theirs.

The elitism of the "right" way

Then you've got the phrases that always come up, like "getting it right in camera" or "we should all go back to basics."

Every time I see or hear this, it comes across as elitist. It feels like they're saying "I'm better than you."

Do the people who say this honestly think everyone else is deliberately trying to get things "wrong" in camera?

We all try to do our best at the point of capture, but for many people, that's just the start of the process.

And as for going back to basics, who are we to say that?

Just because one person finds joy in the traditional way of doing things doesn't mean everyone else has to. Why should someone else do what you reckon they should do?

Leave them be

Life's tough enough as it is. We're all different, and thank goodness for that; the world would be a boring place if we all worked the same way.

If someone's getting enjoyment out of what they're doing, leave them be. The world doesn't need more critics, it needs more people finding a way to enjoy themselves.

If their process made their day a bit better, they didn't break a rule, they won.

🙅🏼‍♂️ How to NEVER forget your Photoshop edits again ✅

I have lost count of the times I have finished an edit, loved the result, and then completely forgotten how I actually got there.

In this video, I show you a simple trick using the Photoshop History Log and AI to create a perfect, step-by-step record of every single move you make.

No more guessing which filter you used or what that specific slider value was; it’s like having a digital assistant write your editing recipes for you while you work.

What I cover:

✅ How to turn on the hidden History Log in Photoshop.
✅ Exporting your editing steps as a simple text file.
✅ Using a clever AI prompt to turn that messy log into a clear workflow.
✅ Why this is a game-changer for your consistency and learning.
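As a rough illustration of the idea, the History Log (switched on under Preferences > History Log in Photoshop) is just a plain text file, so turning it into a numbered recipe is easy to script. Here's a minimal Python sketch, assuming roughly one action per line as the "Concise" log setting produces (the exact format varies by log setting and Photoshop version, and the log excerpt below is a made-up example, not a real export):

```python
def summarise_history_log(log_text: str) -> list[str]:
    """Turn a raw Photoshop History Log dump into numbered steps.

    Assumes roughly one action per line (the 'Concise' setting);
    blank lines are ignored.
    """
    steps = [line.strip() for line in log_text.splitlines() if line.strip()]
    return [f"{i}. {step}" for i, step in enumerate(steps, start=1)]


# Hypothetical log excerpt for illustration only:
raw = """Open
Camera Raw Filter
Curves
Gaussian Blur
Save"""

for step in summarise_history_log(raw):
    print(step)
```

From there, the numbered list can be pasted into an AI chat with a prompt along the lines of "rewrite these steps as a clear workflow", which is the part the video walks through.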

Evoto's AI Headshots: When Your Favourite Tool Turns Against You

Evoto's AI headshot generator has become a cautionary tale about how quickly an AI company can burn through the trust of the very professionals who helped build its reputation.

When your retouching app becomes a rival

At Imaging USA 2026 in Nashville, portrait and headshot photographers discovered that Evoto had been quietly running a separate "Online AI Headshot Generator" site. The service let anyone upload a selfie and receive polished, corporate-style portraits, with marketing that openly pitched it as a cheaper, easier alternative to booking a photographer.

This wasn't a hidden experiment tucked away behind a login. The headshot generator had a public URL, example images, an FAQ and a clear path from upload to final "professional" headshot. For photographers who had built Evoto into their workflow, it felt like discovering that a trusted retouching assistant had quietly set up shop down the road and started undercutting them.

Why Evoto's role made this sting

Evoto built its identity as an AI-powered retouching and workflow tool aimed squarely at professional photographers, especially those shooting portraits, headshots and weddings. The pitch was straightforward: let the software handle the tedious stuff like skin smoothing, flyaway hairs, glasses glare, background cleanup and batch retouching so photographers can focus on directing and shooting.

That positioning worked. Photographers paid for it, used it on paid client work, recommended it in workshops and videos, and sometimes became ambassadors or power users. The unspoken deal was that Evoto would stay in the background, supporting human photographers rather than trying to replace them. A consumer-facing headshot generator cut straight across that understanding.

What the headshot generator offered

The AI headshot tool followed a familiar pattern: upload a casual selfie, choose a style and receive cleaned-up headshots with flattering lighting, neat clothing and tidy backgrounds, ready for LinkedIn or company profiles. The examples looked very similar to the kind of "studio-style" work many Evoto customers already produce for corporate clients.

*Simulation only; NOT the Evoto interface*

The wording is what really set people off. The marketing leaned heavily into cost savings, avoiding studio bookings, quick turnaround and "professional-looking" results without needing a photographer. Coming from a faceless tech startup, that would already be provocative. Coming from a tool that photographers had trusted with their files and workflows, it felt like a direct invitation for clients to pick AI over them.

For many creatives, this is the line that matters: AI that helps you deliver better work is one thing. AI that presents itself as your replacement is something else entirely.

Why photographers are so angry

Photographers' reactions centre on three main issues.

First is a deep sense of betrayal. People had paid into the Evoto ecosystem, uploaded thousands of client images and publicly championed the product. Learning that the same company had built a consumer brand aimed at undercutting them felt like discovering that their support had funded a tool designed to compete with them.

Second are concerns about training data. Photographers have pointed out that the look of the AI headshots seems very close to the kind of work Evoto users upload. Evoto now says its models are trained only on commercially licensed or purchased imagery, not on customer photos, but those reassurances arrived after the story broke and against a backdrop of widespread anxiety about AI scraping. Without long-standing, transparent policies on data use, many remain sceptical.

Third is the tone of the marketing. Promises of saving money, avoiding bookings and still getting "pro-quality" results read like a direct invitation for clients to choose a cheap AI pipeline instead of hiring a photographer. Photo Stealers captured the mood with a blunt "WTF: Evoto AI Headshot Generator" and reported photographers literally flipping off the Evoto stand at Imaging USA. The Phoblographer went further, calling the service an attempt to replace photographers with "AI slop" and questioning the claim that this was simply an innocent test.

The apology that didn't land

In response, Evoto posted a statement saying the headshot generator had "missed the mark", "crossed a line" and was being discontinued. The company framed it as a test of full image generation that strayed beyond the support role it wants to play, and promised that user images are not used to train its models, describing its protections as "ironclad" and its training data as licensed only.

On the surface, this sounds like the right approach: apology, cancelled feature, clearer explanation of data use. In practice, many photographers point out that a fully branded, public site with examples and a working workflow doesn't look like a small internal trial. Shutting down comments on the apology thread after a wave of criticism made it feel more like damage control than a genuine conversation with paying users.

Commentary from outlets such as The Phoblographer argues that the real problem is the direction Evoto appears to be heading. If a company plans to sell "good enough" AI portraits directly to end clients while also charging photographers for retouching tools, trust will be almost impossible to rebuild.

What photographers can learn from this

The Evoto story lands at a time when photographers are already rethinking their place in an AI-saturated world, from smartphone "moon shots" to generative backdrops and AI profile photos. Beyond the immediate anger, it points to a few practical lessons.

Treat AI tools as business partners, not just clever software. Pay attention to how they talk to end clients and where their roadmap is heading.

Ask clear questions about training data and future plans. You need to know if your uploads can ever be used for model training and whether the company intends to build services that compete with you.

Be careful about attaching your reputation to a brand. Discounts and referral codes matter less than whether the company's long-term vision keeps human photographers at the centre.

For AI companies in imaging, the message is equally direct. You cannot present yourself as a photographer-first platform while quietly testing products that encourage clients to bypass those same photographers. In a climate where trust is already thin, real transparency, clear boundaries and honest dialogue are the only way to stay on the right side of the people whose pictures, workflows and support built your business in the first place.

Why AI Enhancement Isn't Cheating in Wildlife Photography

Wildlife photography is something I'd love to do more of, but at the moment, time doesn't allow it. However, whenever I do get the chance to head out with a long lens and give it a go, it deepens my respect for what it takes to capture the shot.

That's why the debate around AI editing tools fascinates me.

Critics argue that tools like Topaz Gigapixel or AI sharpening "ruin" wildlife photography: if your lens wasn't long enough or your sensor didn't capture fine detail, then using AI to reconstruct it is cheating.

I disagree completely.

The soul of wildlife photography is being there. If you hiked to a remote location, endured harsh weather, and invested hours of patience to witness a specific behaviour, that has real value. That's the foundation of your photograph.

So why should using AI to overcome your gear's physical limitations invalidate your fieldwork?

AI enlargement or texture refinement doesn't fabricate what the animal did. When a predator chases prey, AI doesn't invent the event. It helps your image reflect what you actually witnessed. It bridges the gap between your equipment's constraints and the magnitude of the moment.

We obsess over the technical "purity" of raw files, but we should focus on the effort required to be standing in that field. Cameras are tools, and every tool has limits. If software rescues a once-in-a-lifetime encounter from being a blurry mess, that's a win.

The truth of wildlife photography isn't in the pixels. It's in the person willing to get cold, wet, and tired to document the natural world.

What's your take?

Does AI enhancement cross a line, or does the real work happen in the field?

I'd genuinely love to hear your perspective.

Come on Adobe 🙏🏻 We NEED THIS FEATURE ⚠️

I've put together this short video because I need to ask a favour from anyone who uses Photoshop Camera Raw or Lightroom. There's a fundamental feature that's been missing for years, and it seriously impacts how we edit our images and the results we achieve.

The Missing Piece in AI Masking

The issue centres on masking, specifically the AI-generated masks available in the masking panel. Being able to select a sky or subject with one click is genuinely incredible, but there's a massive gap in functionality. We have no way to soften, blur, or feather those AI masks after they've been created.

Instead, we're left with incredibly sharp, defined outlines that sometimes look like poorly executed cutouts. This makes blending our adjustments naturally into the rest of the image much harder than it needs to be.

Years HAVE PASSED

Adobe introduced the masking panel back in October 2021. It changed the way we work and represented a huge step forward. Yet here we are, years later, still without a simple slider to soften mask edges.

If you want to blend an adjustment now, you're often stuck trying to subtract with a large soft brush, using the intersect command with a gradient, or employing other crude workarounds to hide the transition. It feels like excessive work for what should be a standard function.

The Competition Gets It Right

What makes this even more frustrating is seeing other software solve this problem elegantly. The new Boris FX Optics 2026 release includes AI masking controls where a single slider softens and blurs the mask outline, and it works incredibly well. Luminar has been offering this functionality for quite a while too.

These tools understand that a mask is only as good as its edges. When the competition provides ways to feather and refine AI selections, the absence of this feature in Adobe's ecosystem feels glaringly obvious.
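Under the hood, a feather slider is conceptually simple: blur the mask so its hard black/white boundary becomes a smooth grey ramp. Here's a deliberately tiny, one-dimensional Python sketch of that idea; a real implementation would apply a 2D Gaussian blur to the mask image, and the numbers here are purely illustrative:

```python
def feather_edge(mask, passes=3):
    """Soften a hard 0/1 mask with repeated box blurs, a crude
    stand-in for the Gaussian blur a 'feather' slider applies."""
    values = [float(v) for v in mask]
    for _ in range(passes):
        values = [
            (values[max(i - 1, 0)]
             + values[i]
             + values[min(i + 1, len(values) - 1)]) / 3
            for i in range(len(values))
        ]
    return values


hard = [0, 0, 0, 0, 1, 1, 1, 1]   # abrupt subject/background edge
soft = feather_edge(hard)          # the jump becomes a gradual ramp
```

The abrupt 0-to-1 jump becomes a gradual transition, which is exactly why blurring a mask blends adjustments naturally instead of producing cut-out edges.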

Adobe's Strengths and Opportunities

Don't get me wrong. I appreciate that Adobe constantly pushes boundaries. We've witnessed tremendous growth over recent years, with developments from third-party AI platforms like Google's Gemini, emerging models, and innovations from Black Forest Labs with Flux and Topaz Labs. It's an exciting time to be a creator.

But I wish Adobe would take a moment to polish what we already have. Adding flashy new features is great, but refining the core workflows we use every single day would be a massive leap forward for all of us.

How You Can Help

Rather than simply complaining about this issue, I've created a feature request post in the Adobe forums. It's been merged with an existing thread on the same topic, which actually helps consolidate our voices into one place.

Here's what I need you to do: click the link below to visit the post and give it an upvote by clicking or tapping the counter number in the upper left. If we can get enough visibility on this, Adobe might finally recognise how much the community wants and needs this feature.

( LINK )

I believe refining existing tools is just as important as inventing new ones. Thank you for taking the time to vote. It really does make a difference when we speak up together.

The Smartphone AI Photography Controversy: What's Really Going On?

The smartphone photography world is having a bit of an identity crisis right now, and it's forcing us all to ask an uncomfortable question: when does making a photo look better cross the line into just making stuff up?

Samsung's Moon Photo Fiasco

The whole thing kicked off properly in March 2023 when a Reddit user called ibreakphotos ran a brilliant experiment. They took a high-resolution moon photo, blurred it until you couldn't see any detail at all, stuck it on a monitor, and photographed it from across the room using a Samsung Galaxy phone's Space Zoom. What happened next was pretty shocking: Samsung's camera added crisp crater details that simply weren't there in the blurry image.

This wasn't your typical computational photography where the phone combines multiple frames to pull out hidden detail. Samsung was using a neural network trained on hundreds of moon images to recognise the moon and basically paste texture where none existed. The company more or less admitted this in their technical explanation, saying they apply "a deep learning-based AI detail enhancement engine" to "maximise the details of the moon" because their 100x zoom images "have a lot of noise" and aren't good enough on their own.

The controversy came back round in August 2025 when Samsung's One UI 8 beta revealed they were working to reduce confusion "between the act of taking a picture of the real moon and an image of the moon". In other words, they admitted their AI creates moon photos rather than capturing them.

Other Companies Are At It Too

Samsung isn't the only one playing this game. Huawei faced similar accusations with its P30 Pro back in 2019, using AI to enhance moon photography beyond what the camera actually saw. The pattern is pretty clear: smartphone manufacturers are using AI to make up for physical limitations that no amount of clever software can genuinely overcome.

Google's Approach to Reality

Google went in a slightly different direction with the Pixel 8 and 8 Pro, introducing "Best Take", a feature that swaps facial expressions between different photos in a sequence. If someone blinks or frowns in a group shot, the phone finds a better expression from other photos and drops it into your chosen image.

They also launched "Magic Editor", which lets you erase, move, and resize elements in photos (people, buildings, whatever) with AI filling in the gaps using algorithms trained on millions of images. These tools work on any photo in your Google Photos library, not just ones you've just taken.

Tech commentators called these features "icky", "creepy", and potentially threatening to "people's (already fragile) trust of online content". Google's Isaac Reynolds defended the approach by saying that "people don't want to capture reality, they want to capture beautiful images" and calling the results "representations of a moment" rather than documentary records.

Photographers Are Fighting Back

The controversy has created what some observers call a "perfection paradox". As AI became capable of churning out flawless imagery at industrial scale in 2025, perfection itself lost its appeal. Social feeds filled up with technically immaculate visuals, but the images actually getting attention were the ones showing signs of real human touch.

Professional photographers responded by deliberately embracing film grain, motion blur, quirky colours, accidental flare, and cameras with intentional limitations. The message is clear: authenticity and imperfection have become the things that set you apart in an AI-saturated landscape.

One photographer noted that when clients were offered choices between AI-crafted footage and work shot by humans with clear creative perspectives, they "still gravitated to the latter". Despite AI's technical achievements, there's still a "gap between technological capability and cultural readiness".

The Trust Problem

The fundamental issue is that smartphone manufacturers market these AI enhancements as camera capabilities without clearly telling users when AI is manufacturing details rather than capturing them. Samsung's moon photos showcase this perfectly. Users think they've captured incredible detail through superior hardware and processing, when actually the phone has just overlaid trained data.

Professor Rafal Mantiuk from the University of Cambridge explained that smartphone AI isn't designed to make photographs look like real life: "People don't want to capture reality. They want to capture beautiful images". However, the physical limitations of smartphones mean they rely on machine learning to "fill in" information that doesn't exist in the photo, whether that's for zoom, low-light situations, or adding elements that were never there.

What's Happening Next

There's growing pressure on the industry for what's being called "the year of AI transparency" in 2026. People are demanding that manufacturers like Samsung, Apple, and Google disclose when and how AI is manipulating photos.

Google has started responding with detection tools, rolling out AI detection capabilities through Gemini that can spot artificially generated photos using hidden SynthID watermarks and C2PA metadata. These watermarks stay detectable by machines whilst remaining invisible to human eyes, surviving compression, cropping, and colour adjustments. The system analyses images on-device without sending data to external servers.

Samsung, meanwhile, continues embracing AI integration. They recently published an infographic declaring that future cameras "will only get smarter with AI" and describing their camera as "part of the intuitive interface that turns what users see into understanding and action". This language notably sidesteps the authenticity questions that plagued their moon photography feature.

The Cultural Pushback

Perhaps most tellingly, younger users are increasingly seeking cameras that produce "real" and "raw" photos rather than AI-enhanced imagery, driving a resurgence of early-2000s compact digital cameras. This represents a rebellion against smartphone AI manipulation and a genuine desire for photographic authenticity.

The controversy forces a broader reckoning about what photography means in the AI era. As one analysis noted, 2025's deeper story wasn't simply that AI improved, it was "the confrontation it forced: what counts as real, what counts as ours, and what creativity looks like when machines can mimic almost anything".

The Bottom Line

The core issue is straightforward: smartphone manufacturers are using AI to create photographic details that cameras never actually captured, then marketing these capabilities as camera performance rather than AI fabrication.

Companies haven't clearly disclosed when AI is manufacturing versus enhancing, which is eroding trust in smartphone photography. Real photographers are differentiating themselves by embracing authenticity and imperfection as AI floods the market with technically perfect but soulless imagery.

And 2026 is shaping up as a pivotal year for AI transparency demands and authenticity verification tools.

This controversy represents more than just technical debates. It's fundamentally about trust, authenticity, and what we expect from our photographic tools in an increasingly AI-mediated world.

Picture This - A Musical Gift 🎸

Last Friday I was left completely speechless!

I logged in to a live video chat to join members of The Photography Creative Circle for our weekly coffee hour, and immediately it seemed there were more members present than usual … way more.

Shortly after logging in I found out why, as member and dear friend Jean-François Léger began reading out something he’d prepared …

Glyn, In the spirit of the holiday season, we have a surprise for you today.

About six months ago, you shared a vision with us by creating this Photographic Creative Circle. At first, we all joined to learn from you, to master our cameras and refine our post-processing skills. But very quickly, something much deeper began to take shape.

It has become a place where we share our lives, celebrate our successes, and support one another through difficult times. Photography, in the end, became the beautiful pretext for us to become true friends.

You laid the foundation for this community; now this community wanted to create something for you that gives full meaning to the word 'community.'

Glyn, this is our way of saying a big thank you for the commitment, the generosity and the tremendous work you’ve done for all of us.

So Picture this!

And this is what I was presented with …

Written, recorded and edited by Jean-François, with contributions from other members of the community, including two in particular who have suffered traumatic losses in their families in recent weeks … this blew me away!

Such an incredible gift that I will treasure forever … and be playing over and over again ❤️

What Are Those Mystery * and # Symbols in Photoshop??? 🤔

If you spend any amount of time in Adobe Photoshop, you become very familiar with the document tab at the top of your workspace. It tells you the filename and the current zoom level.

But sometimes, little cryptic symbols appear next to that information. Have you ever looked up and wondered, "Why is there a random hashtag next to my image name?" or "What does that little star mean?"

Nothing is broken. These symbols are just Photoshop's way of giving you a quick status update on your file and its colour management, without you needing to dig through menus.

What These Symbols Tell You

The symbols represent:

  • The save state of your document

  • Whether it has a colour profile attached

  • Whether the document's profile differs from your working space

Here is a quick guide to decoding those little tab hieroglyphics.

1. The Asterisk After the Filename ("Save Me!" Star)

What it looks like: … (RGB/8) *

What it means: An asterisk hanging right off the end of your actual filename means you have unsaved changes.

When it appears: Photoshop is hypersensitive here. The star will appear if you:

  • Move a layer one pixel

  • Brush a single dot onto a mask

  • Simply toggle a layer's visibility

  • Do pretty much anything

It's a gentle reminder that the version on screen is different from the version saved on your hard drive. If the computer crashed right now, you would lose that work.

The fix: Press Cmd+S (Mac) or Ctrl+S (Windows). The moment you successfully save the file, that little star will disappear because Photoshop now considers the document "clean" again.

2. The Asterisk ("Profile Difference" Star)

What it looks like: … (RGB/8*)

What it means: This is a different symbol in a different spot. If the star is tucked inside the parentheses next to the bit depth (the 8 or 16), it's no longer talking about unsaved work but about colour management.

In current Photoshop versions, an asterisk here generally means the file's colour profile situation does not match your working RGB setup. For example, you're working in sRGB as your default, but the image you opened is tagged with Adobe RGB (1998). In other words, the document is "speaking" a slightly different colour language than your default workspace.

Should you worry?

  • Usually, no. As long as you keep the embedded profile and your Colour Settings are sensible, Photoshop can still display the colours accurately even if the document profile and working space are different.

  • It's worth paying attention, though, if you're planning to combine several images into one document. You'll want a consistent profile for predictable colour when you paste, convert or export.

3. The Hash Symbol # ("Untagged" Image)

What it looks like: … (RGB/8#)

What it means: If you see the hash/pound/hashtag symbol inside the parentheses, it means the image is Untagged RGB. There's no embedded colour profile at all, so Photoshop has no explicit instructions telling it how those RGB numbers are supposed to be interpreted.

Why this happens: This is very common with:

  • Screenshots

  • Many web images

  • Older files where metadata was stripped out

When Photoshop opens an untagged image, it has to assume a profile based on your Colour Settings (typically your RGB working space, often sRGB by default), which may or may not match how the file was originally created.

Should you worry?

  • If colour accuracy is critical (printing, branding, matching other assets), yes, you should pay attention to that #. Different assumptions about the profile can easily lead to differences in appearance between systems.

  • You can fix this by going to Edit > Assign Profile and choosing the correct profile. For many web-style images, assigning sRGB is a sensible starting point, but be aware that assigning the wrong profile will change how the image looks, so use it when you have a good idea of the original intent.
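As an aside, whether a file is tagged at all is easy to check outside Photoshop too. Here's a minimal, standard-library-only Python sketch that scans a PNG byte stream for the iCCP chunk, which carries an embedded ICC profile; an untagged file (the "#" case above) simply has no such chunk. The byte strings in the demo are hand-built stand-ins rather than real image data, and note that real-world PNGs can also signal colour via an sRGB chunk, which this sketch deliberately ignores:

```python
import struct
import zlib

PNG_SIG = b"\x89PNG\r\n\x1a\n"


def has_icc_profile(png_bytes: bytes) -> bool:
    """True if the PNG stream contains an iCCP chunk (embedded ICC profile)."""
    if not png_bytes.startswith(PNG_SIG):
        raise ValueError("not a PNG stream")
    pos = len(PNG_SIG)
    while pos + 8 <= len(png_bytes):
        length, ctype = struct.unpack(">I4s", png_bytes[pos:pos + 8])
        if ctype == b"iCCP":
            return True
        if ctype == b"IEND":
            break
        pos += 8 + length + 4   # length/type header + data + CRC
    return False


def chunk(ctype: bytes, data: bytes = b"") -> bytes:
    """Build a syntactically valid PNG chunk (for the demo below)."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))


untagged = PNG_SIG + chunk(b"IHDR", b"\x00" * 13) + chunk(b"IEND")
tagged = (PNG_SIG + chunk(b"IHDR", b"\x00" * 13)
          + chunk(b"iCCP", b"fake profile") + chunk(b"IEND"))

print(has_icc_profile(untagged))  # the "#" situation: no embedded profile
print(has_icc_profile(tagged))
```

The same idea (look for the container's profile slot) applies to JPEG and TIFF, just with different markers.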

Summary Cheat Sheet

(RGB/8) *

  • This document has unsaved changes

  • Save the file and the star will disappear

(RGB/8*)

  • There's a colour-profile difference or related colour-management status

  • Typically means the document's profile is not the same as your current working RGB space

(RGB/8#)

  • The image is Untagged RGB, with no embedded colour profile

  • Photoshop has to assume a profile based on your settings

Catching the New Year's Day Sunrise 2026 ☀️

Got up early and popped down to the local beach to photograph the sunrise, and Mother Nature did not disappoint 😍

Happy New Year 🎉

Fuji X-T5
Fuji 18mm f/1.4 @ f/11
2.5 sec, f/11, ISO 125

NiSi 3 Stop JetMag Pro ND Filter

Benro Rhino Carbon Fibre Tripod

Images below captured on my iPhone 17 Pro Max using the Leica Camera App and the Greg WLM B&W setting…