Last week, Adobe rolled out a few updates for Photoshop, including a new addition to its array of Neural Filters: Depth Blur. Right now, this tool produces some pretty bad results, but given time it could affect how much we crave expensive lenses with huge apertures.
Adobe introduced Neural Filters last year, the most headline-grabbing of which was the ability to add fairly realistic aging or slightly disturbing smiles. Tucked away were a few filters built on Photoshop’s ability to determine depth within a scene, drawing on Adobe Sensei. Sensei is Adobe’s machine learning platform: it takes your image, uploads it to the cloud, and then attempts to make intelligent deductions based on a library of millions of images.
The original release of Neural Filters included the option to add haze, and being a big fan of photographing fog in a forest, I had a brief play before quickly realizing that the results were shoddy, and I didn’t give it any further thought. Haze is now included as one of the sliders in Depth Blur and sits alongside some fairly major changes. Photoshop gives you the option to quickly (sort of — my average turnaround was about a minute) knock the background out of focus and give the impression that you have created a photograph with a much shallower depth of field. If you’re a fan of bokeh, don’t get too excited just yet: the results are not great, but there’s a reason for that.
Can Photoshop Catch Up With Smartphones?
Phones have been using AI to replicate a shallow depth of field for a few years, and the results you see on a tiny screen are passable for social media but tend to fare poorly when you zoom in. Edges can be smudgy, and complex areas such as hair can be hit and miss. Fortunately, the overwhelming majority of users don’t notice and are simply happy that a portrait suddenly looks a little less like it was shot on a phone and is a bit more cinematic. It’s fun.
Photoshop, however, is a serious tool for manipulating high-resolution images, so you’d expect that when Adobe decides to launch a similar feature, it would have it nailed. On the contrary: this is a beta version, and it does a pretty poor job. Just like smartphones, edges can be confused, and hair is a problem. Right now, high-resolution results are a long way from being acceptable, which feels a bit odd given that Photoshop’s selection tools are incredibly sophisticated. Why? Because despite both being in Photoshop, these are two very different and separate technologies, and AI has a lot of catching up to do.
The Magic of the Depth Map — or Not
Generating a depth map from a two-dimensional image is no easy task, even with Adobe’s computing power. Images with distinct layers (e.g., a tightly cropped person in the foreground with a mountain and sky in the background) are relatively straightforward, but teaching a machine to understand how a surface gradually recedes into the distance is challenging, and if that surface has a complex texture, the results will often be jarring.
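Adobe hasn’t published how Depth Blur works internally, but the basic idea — once you have a depth map, blur each pixel in proportion to its estimated distance — can be sketched in a few lines. The following is a toy illustration, not Adobe’s implementation; the function names and the simple box blur are my own assumptions, and a real pipeline would use a learned depth estimator and a proper lens-blur kernel.

```python
import numpy as np

def box_blur(img: np.ndarray, radius: int) -> np.ndarray:
    """Naive box blur on a 2D grayscale array: average over a
    (2*radius + 1)^2 window, clamped at the image edges."""
    if radius == 0:
        return img.copy()
    h, w = img.shape
    out = np.empty_like(img, dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()
    return out

def depth_blur(img: np.ndarray, depth: np.ndarray,
               max_radius: int = 4, levels: int = 4) -> np.ndarray:
    """Toy depth-of-field effect: depth is a per-pixel map in [0, 1],
    where 0 = in focus and 1 = farthest. Pixels in deeper bands are
    replaced with progressively stronger blurs."""
    out = img.astype(np.float64).copy()
    for i in range(1, levels + 1):
        radius = round(max_radius * i / levels)
        blurred = box_blur(img.astype(np.float64), radius)
        # Select the pixels whose depth falls into this blur band.
        lo, hi = (i - 1) / levels, i / levels
        band = (depth > lo) & (depth <= hi)
        out[band] = blurred[band]
    return out
```

The hard part, as the article notes, is not the blurring itself but producing a trustworthy `depth` map from a flat image — when the map is wrong at an edge or across a textured surface, every error is amplified by the blur.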
The two images below demonstrate the issue. On the left is an image shot at f/1.8; on the right is the same scene at f/5.6 (Stefan almost managed not to move, bless him) and with the Depth Blur filter applied.
The image on the left was shot at f/1.8. The image on the right was shot at f/5.6 and then run through Photoshop's Depth Blur Neural Filter.
You can see a few more examples in this excellent video from Unmesh Dinda of PiXimperfect, who has provided an insight into the tool’s various shortcomings. Creating examples where Depth Blur does a poor job is not difficult.
If It's So Bad, Why Has Adobe Released It?
So, why is Adobe publicly beta-testing a feature that performs so poorly? There's a clue in that question. My assumption is that Adobe needs to give its machine learning the opportunity to do exactly that — some machine learning. Each time you use one of these filters, Photoshop asks you if you’re happy with the results, and that all gets fed back into the system. In time, it will improve, and as discussed below, that’s when things will get interesting.
Where It Works
In theory, converting an image with a fairly shallow depth of field into something even more shallow will be comparatively simple, as many of the edges that it struggles with will be sharply defined against the background and therefore easy for it to identify, while areas that are in gentle transition won’t feel so jarring when more blurring is applied.
That matches my experience from some early testing. If the depth is not too complex and there is already some out-of-focus drop-off around some of the edges (such as hair), you might be able to put this filter to good use.
On the left: 35mm at f/2.8. On the right, the same image with the Depth Blur filter applied at 100%. Ignoring the fringing over her right shoulder, does it feel a bit more like f/1.4?
A second example:
On the right: 85mm at f/1.8. On the left: the same image with the Depth Blur filter applied.
Here's the layer that the Depth Blur filter generated:
Here's a 100% crop, with and without the filter:
No doubt there are parts of the image where Photoshop has struggled (again, shoulders seem to be problematic!), but given how bad some of the other images have looked (again, see the PiXimperfect video), this is impressive.
The world wants bokeh. Phones are trying their best to create it, lens manufacturers promise creamy backgrounds and smooth balls, and photographers frequently save their pennies to bump themselves from a more affordable f/1.8 version to the ludicrously expensive f/1.4 version, if not f/1.2. In a few years, Depth Blur might be at the stage where a chunk of prospective buyers will settle for the cheaper option or simply choose to shoot with a greater depth of field to ensure accuracy of focus, safe in the knowledge that Photoshop can work its magic later with a few clicks.
A situation where wide-aperture lenses are no longer so coveted might be hard to imagine given that Depth Blur can’t yet figure out where someone’s hand stops and a mountain begins, but technology moves fast. Right now, it's easy to get bad results from this early version, but used on the right image, it's not too disastrous, and there are signs that those huge, super-fast lenses might not be quite so essential in the not-too-distant future.
Have you played with Photoshop's latest feature? Let us know your experiences in the comments below, along with whether you think Depth Blur might eventually change the way that we shoot.