6.23.2017

Why I have stopped believing in test and review sites for cameras.

Tight crop from the Panasonic FZ2500
The original, uncropped frame

I'm not a big fan of gobbledygook, jargon, and half-understood scientific word constructs meant to justify a visceral opinion in the service of marketing (and don't get me started on the satanic nature of acronyms). By this I mean that having a rationale for why something should be better or worse is not the same as a camera or lens actually being better or worse. So much of testing is still very subjective and, when it comes to issues such as focus, current cameras have far too much complexity for most users, which seems to exponentially (see what I did just there?) increase the things that can go wrong, or be mis-set.

Two recent things affected my ability to believe, without question, the results of one of the most famous camera review sites on the web. The first was their declaration that the Leica lens on the front of the Panasonic FZ2500 was mediocre. I was able to prove (at least to myself) that much, if not all, of the softness some people were experiencing with that lens had to do with the automatic focusing modes and their interface with the touchscreen, and the tenuous software that binds them together. If the camera is set up correctly for your individual use targets, its lens is capable of performance that rivals its closest competitors.

Some tinkering with focus modes should have given the wayward reviewers more insight, at least into the quality of the lens itself, so they could re-focus their attention on the vagaries (not faults?) of the focusing system. The bottom line is that the Panasonic bridge camera is capable of making wonderfully sharp images, in the right hands.

But the final, jarring, sledgehammer blow to the credibility of this corporate band of reviewers has been the ongoing exuberant praise, and alternating active rehabilitation, of the Sony a9: a camera which sets the record for the most lines of text ever written in the service of naked marketing in the hyperbolic history of camera reviewing.

The coup de grâce to the credibility of the site in question was their re-re-testing of the a9's sharpness via a series of tests whose methods diverged from the parameters of tests done with hundreds of other cameras, for no other reason than to increase the sharpness score for that particular camera. Of course, a new testing procedure means that none of the previously tested cameras can be objectively compared, on that site, with the a9, because they were not given the endless chances to finally excel which have been lavished on the Sony product. Nor were test procedures previously modified to compensate for the shortcomings of other products. If you want objectivity and also want to believe in the scientific method, you can't have it both ways.

Just jotting down "Fibonacci sequence" doesn't validate a method. (They never mentioned the Fibonacci sequence, but I'm making a point about trying to intimidate readers by trotting out phrases or arcane procedures that just don't match the situation...)

I sympathize with the review site. It's a tough way to make a living in the post-camera-buying era. Click-throughs become absolutely critical. But I find there's no substitute for living with a camera for a sustained period of time in order to understand it on a more holistic, even visceral, level. Most current cameras can only be assessed as part of a system. I prefer "hands-on" shooting to chart tests. This is not "String Theory" and the reviewers are not all Ph.D. researchers at Caltech.

Just to be clear: objective testing should mean all cameras get tested the same way.

Now, if the reviewers want some non-Sony-a9 work that would actually be continuously helpful to real photographers, who want to know whether they should buy a certain piece of gear, they should consider re-reviewing cameras that have already been reviewed each time a big firmware fix is unveiled. There is broad consensus that some cameras have been made amazingly better by new firmware, and yet the old reviews stand as fact. The world iterates. Reviews should too. Right up until the camera in question is retired from the market.

(no ad for the Sony a9 here...).

Added 6-27: An interesting article by Erwin Puts about testing and manufacturing tolerances: http://www.imx.nl/photo/optics/optics/page62.html

19 comments:

  1. From my reading of the A9 sharpness article, you may have misunderstood what happened. DPR said that they used the same method (macro rail for fine manual adjustment) for the A9 as for all cameras, both the first time, when they manually mis-focused, and the second go where they focused correctly. The DPR tester explains: "...we re-test/shoot many cameras, most of the time behind the scenes before we actually publish. Also, we manually focus every camera, because the precision of even CDAF on many cameras is not enough every time to nail peak sharpness exactly on the plane of our scene."

    The re-test applied only to the flat test chart scene, for the simple reason that this chart needs to be as absolutely accurately focused as possible if you want to use it to assess or even numerically calculate the limits of the sensor sharpness/resolution/aliasing. For the A9 they actually published a manually-misfocused chart, which is entirely their own mistake and nothing to do with the camera's focusing performance. "...we take full responsibility for a non-optimally-focused set of shots."

    You, however, have given the impression that the *camera* did something wrong, and they 'gave it another chance'. That really is the opposite of what happened.

  2. As a Fuji X-T2 owner who has seen many of the shortcomings listed on the DPReview site corrected by firmware, I agree with you wholeheartedly, Kirk.

    Keep up the great work,

  3. Also of interest is what can happen when you put the wrong camera into the right hands. Give it extensive use under varied conditions, live with it, and sometimes beautiful images come pouring out. I refer to the Samsung cameras which you found to be far short of the mark after a long-form test involving at least a couple of iterations. But have you counted the number of Samsung photos which appear in your new website's portrait gallery? Even the lead-off shot dominating the page when the site opens came out of one of the Samsungs. There were other blog entries at the time you were using the cameras which were just drop-dead gorgeous -- architectural detail photos which don't find any place in the website's categories. I guess it comes down to talent and skill, neither of which can be purchased off the shelf from B&H or Amazon.

    Dropping what reads like a tribute to Samsung (which abandoned the market not too long after you abandoned their cameras), take a look at the fairly recent portrait of Lou Lofton. It was made with a Nikon 3XXX entry-level DSLR, which was categorized as a throwaway camera -- something to be left on the floor of the car in case of emergency. True, given the subject matter and studio lighting control you could have shot that with a Brownie Hawkeye and produced the same result. The point holds up, though. Given the huge number of products in the marketplace and the desperate need to grab revenue in a declining market, gear reviews on photo interest websites should be discounted as 80% bunk.

  4. So, tnargs, let me get this straight. They test and re-test other cameras many times before they publish the reviews. And I have never seen them come back after they've published a review and revise their focus findings... until now. So, after supposedly extensive testing of the camera (the a9), they were confident enough in their testing to put up their initial findings in a finished, globally published review. But then, after the extensive review (and many other articles) was published, something compelled them to go back to only the a9, re-test for test chart sharpness, and issue a long-winded explanation of how their test methodology failed. So, with their only job, really, being to scientifically test these cameras, and constantly telling their readers about the hundreds and hundreds of test points they live through, they "take full responsibility for a non-optimally-focused set of shots," but we are to believe that every camera we compare against their a9 results had its test done perfectly by the time of review.

    I have given the impression, I think, that they are sloppy in their testing and evaluations. And that if they re-test some cameras, why not others? After all, their readers are comparing various brands and models with side-by-side test examples. If I found my test method to be sloppy, or to have the potential for unintended variations, I think I would be duty bound to my readers to correct the problem systemically and to go back and re-test all currently featured products to ensure that they too were tested fairly. Right? Or am I missing something?

    And here is my central question: the new article mentioned that the image magnification of the a9 was not sufficient to ensure focusing on the correct chart details; it yielded only 9x instead of 16x. If the discrimination of the screen magnification remained at 9x, would they ever be able to get repeatable results above that level? So, they can't get the focusing discrimination they need via the lens focusing ring and are now using a macro rail. So, how have they tested the dozens and dozens of cameras that lack live view? Or do they just nudge the macro rail in tiny discrete steps and then arbitrarily decide they've achieved the right amount of discrimination to assess the optimal focus point?

    Then, I guess the next question is, given this test situation, can any consumer expect repeatable results in actual use, or will they all be condemned to running around with a rail system on their cameras to eke out the last percentage of accuracy?

    And, if they focus by moving the camera back and forth, they forfeit finding out whether the lens flange is set correctly for back focus, etc. Right? Which could have been the origin of their initial issue.

    I'm not sure whether the camera was at fault or the test, but the change to their test parameters has hardly given us the ability to understand whether the focusing hardware or the focusing software is at fault.

    To simplify: we know the sensor is sharp; that's just physics. We have a good understanding that the lens is quite sharp (from other independent tests, including DxO's); ergo, the testing is faulty. But how was the testing fixed? Two tests done the same way yielding two different results. Baffling.

    And please remember, I am a big fan of Sony. I own lots of their cameras. I am almost certain there are no problems with the a9's focusing. To end, I hope what I really did was to call into question the methodology of camera testers, and our supposed reliance on their findings.

  5. Matthew M. Cameras are like forks. Used even moderately well, they serve your purpose, and the food may be delicious. I've yet to find a miracle fork that actually makes my enchiladas taste better.

  6. Years ago I think a quick survey of the camera review sites was the first thing I did each morning. These days I might take a look 2-3 times a month. I pretty much read all of your blog posts, as yours is one of the few that has a sentient being with critical thinking behind the keyboard. Years ago the reviews that Phil did on DPR were really, truly helpful. Now they're OK, but the affiliation with Amazon seems to have watered the site down. Wow, do I miss Michael Reichmann's occasional stuff on Luminous! I could never afford the stuff he played with, but loved seeing his reviews and thoughts.

  7. Hi Kirk, thanks for replying to my first comment. I think I am a lot less worked up about what happened than you are, and a lot more willing to take things at face value. That is, DPR posted a chart that was spotted as soft by a lot of people (some cheering, some booing, but put that aside for the moment), one of whom posted scientific proof that it definitely was mis-focused, so they realised it was indeed mis-focused and re-tested until their usual standards were met.

    As for sloppy testing, you might be right, but my simple trusting mind knows that scientists can have a testing technique that is known to be reliable, and they use the same technique every time, but once in a while they make what is called a human error, not due to sloppiness, but due to humanness. If DPR have used the same technique on 200 cameras and made a human error on one, which an external reviewer proved and they then re-tested and corrected, one needs more evidence than that to say their technique is sloppy and all 200 are utterly inconsistent and rubbish. But I encourage anyone with such a claim to back it up. It would be very interesting.

    P.S. Sensors are not "sharp, that's just physics". Sensors are naturally unsharp, and to varying degrees, due to varying combinations of mosaic arrangements, colour filter array technologies, anti-aliasing and infra-red filtering technologies, and demosaicing techniques. The only sensor I would describe as "sharp, that's just physics" is my Sigma camera's Foveon sensor.

  8. Yep, DPReview has totally slid in the whole review department. Now it's a substitute for PetaPixel, with headlines. The old thorough review site, the one that went through each menu, tested card speeds, and actually compared cameras, is gone.
    Now you get a rehash of what is in the manual and a test chart shot. This test chart shot is actually useless for comparisons, as different lenses are used. You can really only use it to get an idea of the noise levels at higher ISOs.
    I am surprised they didn't buy five copies of the Tamron Adaptall 90mm f/2.8 macro, and a set of professional (non-eBay) adapters for all the camera brands. You would have one lens, with backups, that could be tested and correctly manually focused on a test chart for all cameras. Only the Phase One and Pentax 645 cameras cannot use that lens, but the GFX and Hasselblad mirrorless can. They then could compare camera settings to see the response, and get a true sense of camera sharpness across brands. And this could have started when they switched the review chart just a couple of years ago.
    But sadly no, and their tests are partly useless.

  9. Hi Kirk, thanks for the vote of confidence in "Ph.D. researchers at Caltech." We're working hard to make sure the next camera we put into the world meets expectations.

  10. A few years ago the review site did a review of a Nikon high-end P&S that had a (zooming) optical eye-level viewfinder in addition to a flipping screen.

    The review "deducted points" because the optical viewfinder was somewhat smallish and tunnel-like.

    When listing the competition, the site featured competitive models that had no eye-level viewing at all, only a back-of-camera screen.

    The point is, instead of "adding points" to the Nikon's rating because it had a feature the competition did not, the review "deducted points" because the feature didn't meet their standards. Nor did reviews of the other models note the lack of a desirable feature.

    My takeaway was that the reviews are arbitrary and inconsistent, with pluses and minuses selectively chosen and randomly applied.

  11. The whole system has become rotten: "reviewers" repeatedly taken on junkets by the manufacturers (see how many repeat invites you get for a less-than-glowing review), cross-posting and cross-podcasting within the same groups, "elite" Exploders of Light, etc., repeatedly posting how great the new baubles are, social media postings bought for cash payments, and so on. The emperor has no clothes.

    Rick

  12. This comment has been removed by the author.

  13. tnargs, if the Sony is the only one out of 200 cameras that had a faulty test, that would be a lot more difficult to believe than if every ten or so tests had a human error. With the precedent of 199 accurate tests pre-dating the one faulty test, I'd say we are statistically far into the rough. Now we're looking for why the glitch occurred.

  14. It's not the case that 199 cameras are perfect and the 200th was wrong. All processes have error inherent to them (not as in 'mistakes', but imprecision that cannot be wholly removed). We've designed our chart and our processes to minimize these sources of error but it's impossible to eliminate them entirely.

    In this instance, the camera's magnified live view does not give sufficient detail to focus the camera to the precision we require (side-by-side pixel-level comparison is demanding, way beyond most 'real world' requirements) - there was no visual distinction on screen between results that varied in sharpness when you compared the captured shots. This adds significantly to the error of the test. After publishing the initial shots (but being aware that slightly sharper shots were possible, but not practically attainable), we found that we could reduce this imprecision by shooting tethered, and did so.

    We've done this several times over the years (re-shooting with different lenses, re-shooting to further fine-tune focus, etc), but we always try to achieve the best results we can, before publication. For instance, we requested a second copy of the FZ2500 and re-shot our tests to be certain that we weren't about to criticise the camera based on a single, sub-standard copy.

    No test is perfect (though Rishi has ambitious plans aimed at further reducing the error in this particular test), but we do our best, irrespective of camera or brand.

  15. To clarify, Richard Butler is one of the writer/editors at DPReview.com.

    Nice to hear their take on the issue.

  16. I'm glad you've brought up this issue. Most review sites, but especially DPR, lost credibility in my eyes several years ago. DPR was bought by Amazon, and as a portal for sales, DPR can't bite the hand that feeds it. And neither can professional camera reviewers, who depend on being first to the web with reviews in order to generate buzz, click-throughs, and ultimately referral fees for income.

  17. It was interesting to read the DPR review of the FZ2500/2000, in which they conclude it has a "mediocre lens." My FZ2000's lens is super sharp, and others report that they have sharp lenses as well. Imaging Resource made it their Superzoom of the Year. An interesting fact is that DPReview's studio test shots use a very low shutter speed for the FZ2500 (1/50 sec.), too low in my opinion to exclude other factors like shutter shock, floor vibrations, etc. Also, the Sony RX10 III was tested at higher speeds. Make of that what you will.

    Allan.

  18. To be fair, Allan, the guys at DPReview do get it right about 50% of the time. Just in time for new firmware to make their reviews obsolete...

