
⛔️ Stop Policing Creativity

I don’t normally write posts like this, but I’ve seen a fair bit of ‘this’ lately, so I felt the need to put pen to paper, so to speak.

I’m tired of seeing people tell others what they should or shouldn’t be doing with their photography and editing.

We see it all the time in comments and forums: people acting like there is a "correct" way to be creative.

It’s tedious. It’s exhausting.

The escape is the point

Photography and editing are personal.

For loads of us, picking up a camera is a break from all the rules, deadlines and stress that come with modern life.

When someone sits down to create, that might be the only hour in their day where they actually feel in control of something. If they want to use a tool that makes things easier or more enjoyable, that's up to them.

The minute we start slapping "rules" on creativity, we turn what should be a release valve into just another chore; we make people second-guess themselves before they share their work, or even worse, they stop creating altogether because they're worried about being judged by the purists.

Use the tools you want

This goes for the tools we choose too.

If someone wants to use a particular bit of software or decides to use AI, so what? That's their choice.

If what someone else is doing has absolutely no impact on you, your life, or your own creativity, then why let it concern you?

As long as they're not trying to deceive people or claim credit for something they didn't actually do, let them get on with it. And even if someone does try to be deceptive, they'll get found out eventually. We'd probably do better spending our time keeping our own house in order before we start telling everyone else how to run theirs.

The elitism of the "right" way

Then you've got the phrases that always come up, like "getting it right in camera" or "we should all go back to basics."

Every time I see or hear this, it comes across as elitist. It feels like they're saying "I'm better than you."

Do the people who say this honestly think everyone else is deliberately trying to get things "wrong" in camera?

We all try to do our best at the point of capture, but for many people, that's just the start of the process.

And as for going back to basics, who are we to say that?

Just because one person finds joy in the traditional way of doing things doesn't mean everyone else has to. Why should someone else do what you reckon they should do?

Leave them be

Life's tough enough as it is. We're all different, and thank goodness for that; the world would be a boring place if we all worked the same way.

If someone's getting enjoyment out of what they're doing, leave them be. The world doesn't need more critics, it needs more people finding a way to enjoy themselves.

If their process made their day a bit better, they didn't break a rule, they won.

Evoto's AI Headshots: When Your Favourite Tool Turns Against You

Evoto's AI headshot generator has become a cautionary tale about how quickly an AI company can burn through the trust of the very professionals who helped build its reputation.

When your retouching app becomes a rival

At Imaging USA 2026 in Nashville, portrait and headshot photographers discovered that Evoto had been quietly running a separate "Online AI Headshot Generator" site. The service let anyone upload a selfie and receive polished, corporate-style portraits, with marketing that openly pitched it as a cheaper, easier alternative to booking a photographer.

This wasn't a hidden experiment tucked away behind a login. The headshot generator had a public URL, example images, an FAQ and a clear path from upload to final "professional" headshot. For photographers who had built Evoto into their workflow, it felt like discovering that a trusted retouching assistant had quietly set up shop down the road and started undercutting them.

Why Evoto's role made this sting

Evoto built its identity as an AI-powered retouching and workflow tool aimed squarely at professional photographers, especially those shooting portraits, headshots and weddings. The pitch was straightforward: let the software handle the tedious stuff like skin smoothing, flyaway hairs, glasses glare, background cleanup and batch retouching so photographers can focus on directing and shooting.

That positioning worked. Photographers paid for it, used it on paid client work, recommended it in workshops and videos, and sometimes became ambassadors or power users. The unspoken deal was that Evoto would stay in the background, supporting human photographers rather than trying to replace them. A consumer-facing headshot generator cut straight across that understanding.

What the headshot generator offered

The AI headshot tool followed a familiar pattern: upload a casual selfie, choose a style and receive cleaned-up headshots with flattering lighting, neat clothing and tidy backgrounds, ready for LinkedIn or company profiles. The examples looked very similar to the kind of "studio-style" work many Evoto customers already produce for corporate clients.

*Simulation Only; NOT the Evoto Interface*

The wording is what really set people off. The marketing leaned heavily into cost savings, avoiding studio bookings, quick turnaround and "professional-looking" results without needing a photographer. Coming from a faceless tech startup, that would already be provocative. Coming from a tool that photographers had trusted with their files and workflows, it felt like a direct invitation for clients to pick AI over them.

For many creatives, this is the line that matters: AI that helps you deliver better work is one thing. AI that presents itself as your replacement is something else entirely.

Why photographers are so angry

Photographers' reactions centre on three main issues.

First is a deep sense of betrayal. People had paid into the Evoto ecosystem, uploaded thousands of client images and publicly championed the product. Learning that the same company had built a consumer brand aimed at undercutting them felt like discovering that their support had funded a tool designed to compete with them.

Second are concerns about training data. Photographers have pointed out that the look of the AI headshots seems very close to the kind of work Evoto users upload. Evoto now says its models are trained only on commercially licensed or purchased imagery, not on customer photos, but those reassurances arrived after the story broke and against a backdrop of widespread anxiety about AI scraping. Without long-standing, transparent policies on data use, many remain sceptical.

Third is the tone of the marketing. Promises of saving money, avoiding bookings and still getting "pro-quality" results read like a direct invitation for clients to choose a cheap AI pipeline instead of hiring a photographer. Photo Stealers captured the mood with a blunt "WTF: Evoto AI Headshot Generator" and reported photographers literally flipping off the Evoto stand at Imaging USA. The Phoblographer went further, calling the service an attempt to replace photographers with "AI slop" and questioning the claim that this was simply an innocent test.

The apology that didn't land

In response, Evoto posted a statement saying the headshot generator had "missed the mark", "crossed a line" and was being discontinued. The company framed it as a test of full image generation that strayed beyond the support role it wants to play, and promised that user images are not used to train its models, describing its protections as "ironclad" and its training data as licensed only.

On the surface, this sounds like the right approach: apology, cancelled feature, clearer explanation of data use. In practice, many photographers point out that a fully branded, public site with examples and a working workflow doesn't look like a small internal trial. Shutting down comments on the apology thread after a wave of criticism made it feel more like damage control than a genuine conversation with paying users.

Commentary from outlets such as The Phoblographer argues that the real problem is the direction Evoto appears to be heading. If a company plans to sell "good enough" AI portraits directly to end clients while also charging photographers for retouching tools, trust will be almost impossible to rebuild.

What photographers can learn from this

The Evoto story lands at a time when photographers are already rethinking their place in an AI-saturated world, from smartphone "moon shots" to generative backdrops and AI profile photos. Beyond the immediate anger, it points to a few practical lessons.

Treat AI tools as business partners, not just clever software. Pay attention to how they talk to end clients and where their roadmap is heading.

Ask clear questions about training data and future plans. You need to know if your uploads can ever be used for model training and whether the company intends to build services that compete with you.

Be careful about attaching your reputation to a brand. Discounts and referral codes matter less than whether the company's long-term vision keeps human photographers at the centre.

For AI companies in imaging, the message is equally direct. You cannot present yourself as a photographer-first platform while quietly testing products that encourage clients to bypass those same photographers. In a climate where trust is already thin, real transparency, clear boundaries and honest dialogue are the only way to stay on the right side of the people whose pictures, workflows and support built your business in the first place.

Why AI Enhancement Isn't Cheating in Wildlife Photography

Wildlife photography is something I'd love to do more of, but time doesn't currently allow it. When I do get the chance to head out with a long lens and give it a go, though, I gain a deep respect for what it takes to capture the shot.

That's why the debate around AI editing tools fascinates me.

Critics argue that tools like Topaz Gigapixel or AI sharpening "ruin" wildlife photography: if your lens wasn't long enough or your sensor didn't capture fine detail, they say, using AI to reconstruct it is cheating.

I disagree completely.

The soul of wildlife photography is being there. If you hiked to a remote location, endured harsh weather, and invested hours of patience to witness a specific behaviour, that has real value. That's the foundation of your photograph.

So why should using AI to overcome your gear's physical limitations invalidate your fieldwork?

AI enlargement or texture refinement doesn't fabricate what the animal did. When a predator chases prey, AI doesn't invent the event. It helps your image reflect what you actually witnessed. It bridges the gap between your equipment's constraints and the magnitude of the moment.

We obsess over the technical "purity" of raw files, but we should focus on the effort required to be standing in that field. Cameras are tools, and every tool has limits. If software rescues a once-in-a-lifetime encounter from being a blurry mess, that's a win.

The truth of wildlife photography isn't in the pixels. It's in the person willing to get cold, wet, and tired to document the natural world.

What's your take?

Does AI enhancement cross a line, or does the real work happen in the field?

I'd genuinely love to hear your perspective.

The Smartphone AI Photography Controversy: What's Really Going On?

The smartphone photography world is having a bit of an identity crisis right now, and it's forcing us all to ask an uncomfortable question: when does making a photo look better cross the line into just making stuff up?

Samsung's Moon Photo Fiasco

The whole thing kicked off properly in March 2023 when a Reddit user called ibreakphotos ran a brilliant experiment. They took a high-resolution moon photo, blurred it until you couldn't see any detail at all, stuck it on a monitor, and photographed it from across the room using a Samsung Galaxy phone's Space Zoom. What happened next was pretty shocking: Samsung's camera added crisp crater details that simply weren't there in the blurry image.

This wasn't your typical computational photography where the phone combines multiple frames to pull out hidden detail. Samsung was using a neural network trained on hundreds of moon images to recognise the moon and basically paste texture where none existed. The company more or less admitted this in their technical explanation, saying they apply "a deep learning-based AI detail enhancement engine" to "maximise the details of the moon" because their 100x zoom images "have a lot of noise" and aren't good enough on their own.

The controversy came back round in August 2025 when Samsung's One UI 8 beta revealed they were working to reduce confusion "between the act of taking a picture of the real moon and an image of the moon". In other words, they admitted their AI creates moon photos rather than capturing them.

Other Companies Are At It Too

Samsung isn't the only one playing this game. Huawei faced similar accusations with its P30 Pro back in 2019, using AI to enhance moon photography beyond what the camera actually saw. The pattern is pretty clear: smartphone manufacturers are using AI to make up for physical limitations that no amount of clever software can genuinely overcome.

Google's Approach to Reality

Google went in a slightly different direction with the Pixel 8 and 8 Pro, introducing "Best Take", a feature that swaps facial expressions between different photos in a sequence. If someone blinks or frowns in a group shot, the phone finds a better expression from other photos and drops it into your chosen image.

They also launched "Magic Editor", which lets you erase, move, and resize elements in photos (people, buildings, whatever) with AI filling in the gaps using algorithms trained on millions of images. These tools work on any photo in your Google Photos library, not just ones you've just taken.

Tech commentators called these features "icky", "creepy", and potentially threatening to "people's (already fragile) trust of online content". Google's Isaac Reynolds defended the approach by saying that "people don't want to capture reality, they want to capture beautiful images" and calling the results "representations of a moment" rather than documentary records.

Photographers Are Fighting Back

The controversy has created what some observers call a "perfection paradox". As AI became capable of churning out flawless imagery at industrial scale in 2025, perfection itself lost its appeal. Social feeds filled up with technically immaculate visuals, but the images actually getting attention were the ones showing signs of real human touch.

Professional photographers responded by deliberately embracing film grain, motion blur, quirky colours, accidental flare, and cameras with intentional limitations. The message is clear: authenticity and imperfection have become the things that set you apart in an AI-saturated landscape.

One photographer noted that when clients were offered choices between AI-crafted footage and work shot by humans with clear creative perspectives, they "still gravitated to the latter". Despite AI's technical achievements, there's still a "gap between technological capability and cultural readiness".

The Trust Problem

The fundamental issue is that smartphone manufacturers market these AI enhancements as camera capabilities without clearly telling users when AI is manufacturing details rather than capturing them. Samsung's moon photos showcase this perfectly. Users think they've captured incredible detail through superior hardware and processing, when actually the phone has just overlaid trained data.

Professor Rafal Mantiuk from the University of Cambridge explained that smartphone AI isn't designed to make photographs look like real life: "People don't want to capture reality. They want to capture beautiful images". However, the physical limitations of smartphones mean they rely on machine learning to "fill in" information that doesn't exist in the photo, whether that's for zoom, low-light situations, or adding elements that were never there.

What's Happening Next

There's growing pressure on the industry for what's being called "the year of AI transparency" in 2026. People are demanding that manufacturers like Samsung, Apple, and Google disclose when and how AI is manipulating photos.

Google has started responding with detection tools, rolling out AI detection capabilities through Gemini that can spot artificially generated photos using hidden SynthID watermarks and C2PA metadata. These watermarks stay detectable by machines whilst remaining invisible to human eyes, surviving compression, cropping, and colour adjustments. The system analyses images on-device without sending data to external servers.

Samsung, meanwhile, continues embracing AI integration. They recently published an infographic declaring that future cameras "will only get smarter with AI" and describing their camera as "part of the intuitive interface that turns what users see into understanding and action". This language notably sidesteps the authenticity questions that plagued their moon photography feature.

The Cultural Pushback

Perhaps most tellingly, younger users are increasingly seeking cameras that produce "real" and "raw" photos rather than AI-enhanced imagery, driving a resurgence of early-2000s compact digital cameras. This represents a rebellion against smartphone AI manipulation and a genuine desire for photographic authenticity.

The controversy forces a broader reckoning about what photography means in the AI era. As one analysis noted, 2025's deeper story wasn't simply that AI improved, it was "the confrontation it forced: what counts as real, what counts as ours, and what creativity looks like when machines can mimic almost anything".

The Bottom Line

The core issue is straightforward: smartphone manufacturers are using AI to create photographic details that cameras never actually captured, then marketing these capabilities as camera performance rather than AI fabrication.

Companies haven't clearly disclosed when AI is manufacturing versus enhancing, which is eroding trust in smartphone photography. Real photographers are differentiating themselves by embracing authenticity and imperfection as AI floods the market with technically perfect but soulless imagery.

And 2026 is shaping up as a pivotal year for AI transparency demands and authenticity verification tools.

This controversy represents more than just technical debates. It's fundamentally about trust, authenticity, and what we expect from our photographic tools in an increasingly AI-mediated world.

Picture This - A Musical Gift 🎸

Last Friday I was left completely speechless!

I logged in to a live video chat to join members of The Photography Creative Circle for our weekly coffee hour, and immediately noticed there seemed to be more members present than usual … way more.

Shortly after logging in I found out why, as member and dear friend Jean-François Léger began reading out something he’d prepared …

Glyn, In the spirit of the holiday season, we have a surprise for you today.

About six months ago, you shared a vision with us by creating this Photographic Creative Circle. At first, we all joined to learn from you, to master our cameras and refine our post-processing skills. But very quickly, something much deeper began to take shape.

It has become a place where we share our lives, celebrate our successes, and support one another through difficult times. Photography, in the end, became the beautiful pretext for us to become true friends.

You laid the foundation for this community, now this community wanted to create something for you that gives full meaning to the word 'community.'

Glyn, this is our way of saying a big thank you for the commitment, the generosity and the tremendous work you’ve done for all of us.

So Picture this!

And this is what I was presented with …

Written, recorded and edited by Jean-François, with contributions from other members of the community, including two in particular who have suffered traumatic losses in their families in recent weeks … this blew me away!

Such an incredible gift that I will treasure forever … and be playing over and over again ❤️

Catching the New Year’s Day Sunrise 2026 ☀️

Got up early and popped down to the local beach to photograph the sunrise, and Mother Nature did not disappoint 😍

Happy New Year 🎉

Fuji X-T5
Fuji 18mm f/1.4
2.5 sec, f/11, ISO 125

NiSi 3 Stop JetMag Pro ND Filter

Benro Rhino Carbon Fibre Tripod
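As a side note on those settings, the stop arithmetic behind an ND filter is simple: each stop halves the light reaching the sensor, so a 3-stop ND lengthens the required shutter time by 2³ = 8×, which is how a metered exposure of roughly 1/3 sec stretches to the 2.5 sec above. Here's a minimal sketch of that calculation (the function name and example values are my own, purely for illustration):

```python
# Stop arithmetic for a neutral density filter: each stop of ND halves
# the incoming light, so the shutter time doubles per stop.
# (Illustrative sketch only; the function name is mine, not from any app.)

def shutter_with_nd(base_shutter_s: float, nd_stops: int) -> float:
    """Shutter time needed behind an ND filter to match the unfiltered exposure."""
    return base_shutter_s * (2 ** nd_stops)

# An unfiltered metered exposure of ~1/3 sec (0.3125 s) becomes
# 2.5 sec behind a 3-stop ND:
print(shutter_with_nd(0.3125, 3))  # 2.5
```

The same doubling-per-stop logic works in reverse if you want to work out what a scene was metering at before the filter went on.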

Images below captured on my iPhone 17 Pro Max using the Leica Camera App and the Greg WLM B&W setting…

Why "Digital Infinity" is Killing Your Creativity (and How to Fix It)

We often see videos on YouTube claiming that one "magic trick" will change your life, but they usually fall a little bit flat. However, I recently ran an experiment in our creative community that I don't just believe will transform your photography, I know it will.

We live in an age of "digital infinity." Our phones can hold thousands of images, and it costs us absolutely nothing to press the shutter button. But this unlimited choice has a hidden downside: it can make us lazy.

To combat this, I set a challenge for our photographers that was brutally simple, and the results were completely unexpected.

The 10-Exposure Challenge

The rules were designed to strip away the safety nets we've become so reliant on:

  1. Only 10 exposures. That's it.

  2. No fixing it in post. What you shoot is what you get.

  3. No do-overs. If you click it, it counts, even if it's an accidental selfie.

The "Maddening" First Step

For many, the first reaction wasn't creative bliss; it was pure frustration. We had a studio photographer, Sarah, who is used to total control over lighting and props. Suddenly, out in the real world with only 10 frames, that control vanished. She described the experience as "maddening."

Another photographer, Francois, usually shoots a hundred frames just to get one perfect food shot. Having to tell the entire story of a meal in just 10 frames was a massive mental shift.

The Turning Point: Slowing Way Down

Once the frustration settled, something powerful happened. The photographers started to see this limitation as a lens that focused their attention.

They were forced to stop, look, and truly see what was in front of them. One member, Brian, took the challenge on his usual 90-minute walk. It ended up taking him three hours to take just 10 photos. That is the pace of deliberate creation.

What We Learnt

This challenge acted like a time machine, throwing us back to the discipline of the film era where every shot cost money. Here are the big takeaways:

  • Visualise first: We rediscovered the importance of walking around and using our eyes to find the angle before ever lifting the camera.

  • Embrace imperfection: Francois realised that his industry's obsession with "perfection" wasn't authentic. By embracing little imperfections, his photos felt more real and more appetising.

  • Constraint is liberating: Without the pressure of endless choices and editing, the simple act of taking a picture became joyful again.

The Final Verdict

Would they do it again? It was a resounding yes across the board. One member was so inspired he actually went back to shooting on real film.

The value wasn't really in the final 10 images; it was about rediscovering a mindful, deliberate way of working.

So, I have a question for you. In a world of unlimited options, what's one constraint you could impose on yourself to unlock a new level of creativity?

Give this challenge a go. I guarantee you'll see a difference and feel like an artist again.

Photographing Cars with my iPhone 15 Pro Max

Now, primarily I’m a Portrait Photographer; however, these past few weeks, for the fun of it, I’ve been experimenting with my iPhone to see how it would fare when taking pictures of my car … both still and moving.

So a few weekends back I met up with friends in South Wales and headed to Crickhowell; an area I’ve photographed in before when working on this Harley image …

kit used

For these car photographs I was using my iPhone 15 Pro Max along with …

  • Polar Pro iPhone Case

  • 67mm Filter Adaptor + 67mm Circular Polarising Filter

  • ReeFlex Pro Camera App

CHECK OUT THESE ITEMS ON MY GEAR PAGE

The circular polariser is the car photographer’s secret weapon because of how effective it is in reducing or removing reflections in the car windows and on the bodywork …

I used the ReeFlex Pro Camera App purely so that I could use shutter speed and ISO to lock in the exposure that I wanted.

Here’s the ‘out of camera’ result …

Here’s the final retouched image which was done using Lightroom Desktop.

I also added a long exposure effect to the clouds using a technique in Photoshop, which works really well. However, I now want to do the shoot again and capture it all ‘in camera’ using a long exposure app like ReeXpose or EvenLonger 😃

I also tried some panning shots and did get some results, however the ratio of keepers to rejects was very low.

I was trying to do this using the ReeFlex App and a Neutral Density Filter on the front of the iPhone so that I could lock in an exposure of 1/30 second. However, at the time, the ReeFlex app did NOT have burst mode. This meant I was having to try to capture a single frame as I panned with the car passing at no more than 30 mph.

Here’s an edited result of what I managed to capture …

Later, having shared this image online, I spoke with a friend of mine who is a Professional Car Photographer. It was great to hear that he was impressed with the results, especially considering they were from an iPhone, but he also gave me some settings that he and other pro car photographers use to capture a moving car.

My friend told me that moving car shots often see the car travelling at no more than 20 mph and that they also use a shutter speed of around 1/15 sec or 1/20 sec … but ALWAYS shoot in burst mode.
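To put those numbers in context, here's a quick back-of-the-envelope calculation (my own sketch, not my friend's settings) of how far a car travels while the shutter is open. When you pan with the car, the car stays sharp and the background smears by roughly this distance, which is why 20 mph at 1/15 sec gives such pronounced streaks:

```python
# How far a car moves during the exposure. When panning with the car,
# the car stays sharp and the background smears by roughly this distance.
# (Back-of-the-envelope sketch; the function name and values are illustrative.)

MPH_TO_MS = 0.44704  # 1 mph in metres per second

def travel_during_exposure(speed_mph: float, shutter_s: float) -> float:
    """Distance in metres the car covers while the shutter is open."""
    return speed_mph * MPH_TO_MS * shutter_s

# 20 mph at 1/15 sec: about 0.6 m of movement per frame
print(round(travel_during_exposure(20, 1 / 15), 2))
```

Double the speed or halve the shutter and the streak length scales in direct proportion, which is why burst mode matters so much: only a fraction of frames will have the car perfectly tracked over that distance.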

So armed with these settings, I set about finding a pro camera app that would allow me to manually dial in the shutter speed but that also had burst mode, and the app I turned to was MOMENT.

I’ve yet to use the app in a planned car shoot. However, when in Germany a couple of weeks back, I was waiting near a fuel station forecourt for a friend to pick me up, so I attached the Variable Neutral Density Filter to my iPhone using the Polar Pro Case and Adaptor, and set the shutter speed using the MOMENT app.

I then simply took a burst of panning shots of passing cars as they negotiated a roundabout and drove away, which from my position would have meant them travelling around 20 mph.

Here’s some of the results …

So, yeah I’ve been very impressed with how the iPhone has dealt with this and the results I’ve managed to get.

Lots more experimenting and fun to be had!

The Photographer’s Guide to Lifelong Learning and Creative Growth | Marco Ter Beek

Watch the recording of this LIVE Broadcast as I chat with Professional Photographer Marco Ter Beek and discuss the importance of continually being a student in order to grow, find inspiration, be creative and ultimately create work to be proud of.

links mentioned in the video: