Why DxO Mark's lens “tests” are utter nonsense
While I give them considerable credit for their thorough camera sensor testing and ranking, those of us who look to DxO Mark to help us decide which lenses are the “best” for our particular cameras have been led astray for far too long already.
One particular testing item that has always caught my attention, but that I had never bothered to research – until now – is the infamous DxO Mark “Best at” Score, which, without exception, suggests that lenses ought to be shot wide open for best performance.
Shooting lenses wide open can be useful in certain situations, but in general it does not make much sense, because most lenses are at their optical best around two stops down from their maximum aperture.
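To put numbers on that rule of thumb, here is a quick back-of-the-envelope sketch (in Python, assuming nothing beyond the usual convention that each full stop multiplies the f-number by √2, so two stops double it):

    # Illustrative rule of thumb only, not anything DxO publishes:
    # each full stop multiplies the f-number by sqrt(2), so two stops double it.
    from math import sqrt

    def stopped_down(max_f_number: float, stops: int = 2) -> float:
        """f-number reached after closing down the given number of full stops."""
        return max_f_number * sqrt(2) ** stops

    for f_max in (1.4, 1.8, 2.8, 4.0):
        print(f"f/{f_max} lens -> optical sweet spot around f/{stopped_down(f_max):.1f}")

    # Prints roughly: f/1.4 -> f/2.8, f/1.8 -> f/3.6, f/2.8 -> f/5.6, f/4 -> f/8

In other words, an f/1.8 prime is generally expected to peak somewhere around f/3.5–f/4, not wide open.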
It also caught my attention that lenses we all know to be average keep picking up DxO score points on newer, higher-resolution and/or full-frame cameras, while some legendary ones keep getting below-average scores, regardless of the camera they are “tested” on.
As it turns out, and I quote:
“DxO Mark Score is measured for low-light conditions: 150 lux and 1/60s exposure time. Such conditions correspond to a correctly lit living room (with no daylight).
It is a difficult, but rather typical photographic use case.”
These “testing” conditions are absolutely useless when it comes to determining a lens’ optical performance or IQ.
There is no such thing as a dim-lit living room photographic category, and can the DxO folks also please explain what they mean by “correctly lit”?
Correctly lit for what? For 1/60 s, wide open? With floods, incandescent light, candles or moonlight?
Moreover, claiming that such a scene is a typical photographic use case seems like a bit of a stretch; I don’t see a lot of Flickr or Instagram users posting pic after pic of dim-lit living rooms, if any.
Putting slow lenses at a disadvantage from the get-go can only be explained from a commercial point of view or – more forgivingly – by ignorance. And suggesting that lenses are “best” wide open only leads less experienced photographers to believe that lenses are designed to be shot that way, which is not only silly, but damaging.
DxO’s rather weird choice of a dim-lit scene for lens testing explains why slower lenses are always at a disadvantage, even when they are excellent optically.
These so-called lens-tests are really about ISO 100 vs. ISO 800, not about the appropriate f-stop for optimal optical performance.
The extra two or three stops of ISO that a slower lens requires to achieve proper exposure in the test scene generate noise that erases some (if not a lot) of fine detail, thus pushing down what is supposed to be an optical performance score, by an amount that depends largely on the camera used.
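To see where those extra stops come from, here is a rough sketch of the ISO that DxO’s stated scene forces on each aperture (in Python; the incident-light metering constant of about 250 lux·s and the “nominally correct exposure” criterion are my assumptions, not DxO’s published protocol):

    # Rough exposure math for the stated test scene: 150 lux, 1/60 s.
    # Uses the standard incident-light relation  N^2 / t = E * S / C,
    # with N = f-number, t = shutter time (s), E = illuminance (lux),
    # S = ISO speed, and C ~ 250 lux*s (assumed calibration constant).
    E = 150      # lux, per DxO's description of the scene
    t = 1 / 60   # seconds, per DxO's description of the scene
    C = 250      # incident-light meter calibration constant (assumption)

    def required_iso(f_number: float) -> float:
        """ISO needed for a nominal exposure of the test scene at this aperture."""
        return C * f_number ** 2 / (E * t)

    for f in (1.4, 2.0, 2.8, 4.0, 5.6):
        print(f"f/{f}: about ISO {required_iso(f):.0f}")

    # Prints roughly ISO 200, 400, 800, 1600 and 3200: an f/4-5.6 zoom starts
    # three to four stops of ISO behind an f/1.4 prime before any optics are measured.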
This also explains why the scores are so much better on newer and on FX cameras: low-light performance has advanced by leaps and bounds ever since the introduction of the D700.
I had always been curious why a lens we all know to be average could earn a superior DxO score on larger sensors, rather than losing points because of the increased optical demands that larger and higher-resolution sensors are supposed to impose on lenses.
So now we know. While DxO Lens Ratings might give us some indication of which camera-lens combination produces the best low-light performance, they are absolutely useless for determining the true IQ of any tested lens…
If you really want to know whether a lens is worth its salt, better check this: Photozone.de