Adobe doesn't always make noise about the right things. The January 2026 Photoshop update landed with a feature list that reads like incremental improvement — new adjustment layers, better AI output, a text warping beta. But if you actually work in Photoshop daily, two of these changes address long-standing frustrations that have pushed photographers toward workarounds, plugins, and competing software for years.
The headline features are Clarity, Dehaze, and Grain as proper non-destructive, maskable adjustment layers — and a significant quality jump in Firefly-powered generative tools. Let's talk about why both matter more than the release notes suggest.
Non-destructive adjustment layers
What changed
Clarity, Dehaze, and Grain are now available as standalone adjustment layers in Photoshop. That means you can add them to your layer stack, mask them selectively, blend them with opacity and blend modes, and adjust or remove them at any point without touching the underlying image data.
If you're thinking "wait, couldn't I already do that?" — you could, sort of. The previous method involved converting a layer to a Smart Object and applying the Camera Raw Filter as a smart filter, which worked but was clunky. You couldn't easily mask individual adjustments independently, the preview workflow was slower, and stacking multiple Camera Raw Filter instances for different zones of an image was tedious. Most photographers either committed to destructive adjustments or built complex workaround pipelines involving multiple smart objects.
Why it matters
Clarity and Dehaze are two of the most commonly used adjustments in photography post-production. Clarity adds midtone contrast that reveals texture and structure — essential for landscape, architecture, and product photography. Dehaze cuts through atmospheric haze and is critical for outdoor and aerial work. Having both as proper maskable layers means you can apply Clarity to a building's facade while leaving the sky untouched, or Dehaze a distant mountain range without affecting the foreground — all without destructive edits.
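To make "midtone contrast" concrete: Photoshop's actual Clarity implementation is proprietary, but the effect is commonly approximated as a large-radius unsharp mask whose strength is weighted toward the midtones, so shadows and highlights are left mostly alone. The sketch below is purely illustrative — the function and parameter names are hypothetical, not Photoshop's — and it also shows the non-destructive idea: the adjustment is computed as a separate delta and composited at render time, so the base pixels are never overwritten.

```python
# Illustrative clarity-style adjustment: a large-radius unsharp mask
# weighted toward the midtones. Assumes a float image in [0, 1].
# This is a sketch of the general technique, not Adobe's algorithm.
import numpy as np
from scipy.ndimage import gaussian_filter

def clarity_layer(base, amount=0.5, radius=25):
    """Return an adjustment delta for `base` rather than editing it in
    place -- analogous to a non-destructive adjustment layer."""
    low_pass = gaussian_filter(base, sigma=radius)
    detail = base - low_pass                       # local contrast signal
    # Full effect at 0.5 luminance, fading to zero at pure black/white.
    midtone_weight = 1.0 - np.abs(base - 0.5) * 2.0
    return amount * detail * midtone_weight

base = np.random.default_rng(0).random((64, 64))   # stand-in image
delta = clarity_layer(base)
result = np.clip(base + delta, 0.0, 1.0)           # composite at render time
```

Because the delta lives apart from the base image, it can be masked, re-weighted, or discarded at any point — which is exactly what an adjustment layer buys you over a baked-in edit.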
Grain as an adjustment layer is equally significant for a different reason. Film grain has become one of the dominant visual trends in photography, and applying it non-destructively means you can dial it in at the very end of your editing process, adjust it after client feedback, or vary it across print sizes without re-editing the image. That's a real workflow improvement for anyone delivering both digital and print files from the same master.
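The print-size point is worth spelling out. Grain that is baked into a master file scales with the image when you resize it, which changes its apparent character at each print size; grain applied as a live layer can instead be synthesized per output. A minimal sketch of that idea, with hypothetical names and nearest-neighbor resizing for brevity:

```python
# Illustrative sketch: generate grain at export time, per output size,
# so the master is never re-edited. Names are hypothetical, not
# Photoshop's; a real pipeline would use a proper resampling filter.
import numpy as np

def export_with_grain(master, out_shape, strength=0.06, seed=42):
    """Nearest-neighbor resize `master` (float image in [0, 1]) to
    `out_shape`, then add fresh monochrome grain at that resolution."""
    rows = np.arange(out_shape[0]) * master.shape[0] // out_shape[0]
    cols = np.arange(out_shape[1]) * master.shape[1] // out_shape[1]
    resized = master[np.ix_(rows, cols)]
    grain = np.random.default_rng(seed).normal(0.0, strength, out_shape)
    return np.clip(resized + grain, 0.0, 1.0)

master = np.linspace(0.0, 1.0, 32 * 32).reshape(32, 32)
web = export_with_grain(master, (32, 32), strength=0.04)
print_file = export_with_grain(master, (128, 128), strength=0.08)
```

Two deliverables, two grain settings, one untouched master — the same workflow the new adjustment layer enables inside Photoshop without re-editing anything.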
Firefly generative improvements
The quality jump
Photoshop's Generative Fill, Generative Expand, and Remove tool all received a quality upgrade in this release. The key improvements: output resolution now reaches 2K, detail is noticeably sharper, artifacts are reduced, and the tools do a better job of matching lighting and perspective in the surrounding image.
The Remove tool, in particular, has been a pain point. Previous versions left visible seams, inconsistent lighting at boundaries, and occasionally introduced objects that weren't there. Adobe acknowledges these issues in the release notes and claims the updated model addresses them directly. Early testing suggests the improvement is real, though not perfect — complex scenes with strong directional light still occasionally produce artifacts.
Better subject preservation
Generative Fill now does a better job of preserving the original subject when generating backgrounds around it. In previous versions, generating a background around a selected object could expand the object's boundaries or introduce unwanted edge effects. The updated model maintains sharper separation between the original element and generated content.
For product photographers and compositors, this is a practical improvement. Generating clean background extensions around a product shot — something that's become a common e-commerce workflow — is now more reliable without manual cleanup.
What's in beta
Dynamic Text
A beta feature called Dynamic Text allows you to select a text layer and transform it into circular, arched, or bowed shapes. This is less relevant to pure photography but matters for anyone who designs marketing materials, social media graphics, or album layouts inside Photoshop. It's a beta, so expect rough edges — but the functionality has been requested for years.
What this update means for photographers
Taken individually, non-destructive Clarity and better AI fill don't sound revolutionary. Taken together, they represent Adobe responding to a pattern of feedback that's been consistent for several years: photographers want Photoshop to be less destructive, more flexible, and more reliable when it does things automatically.
The adjustment layer additions also hint at a direction. If Clarity, Dehaze, and Grain are now proper layers, it's reasonable to expect other Camera Raw adjustments — Texture, Vignette, maybe even color grading — to follow the same path in future updates. That would gradually reduce the need to round-trip through Camera Raw Filter or Lightroom for adjustments that logically belong in Photoshop's layer-based workflow.
For photographers who left Photoshop for alternatives like Capture One, ON1, or Affinity Photo partly because of non-destructive workflow limitations, this update doesn't single-handedly change the calculus. But it narrows the gap in one specific area that mattered enough for people to leave.
The Firefly improvements are harder to evaluate without extensive real-world use. Adobe's generative AI has improved steadily with each update, and the gap between generated content and real imagery continues to shrink. Whether that's encouraging or concerning depends on your perspective — but from a pure tool standpoint, having more reliable content-aware removal and background extension is unambiguously useful for working photographers.
It's a solid update. Not the one Adobe marketed the loudest. But the one photographers will actually feel in their daily work.