Tony Northrup Blows Up the Big Sensor/Small Sensor Debate – and With it, Implications for Their Lenses

by Hugh Brownstone

I really like this guy.

He knows math; he knows images; and he knows gear.

With this one video, I believe he is going to fundamentally change people’s buying decision process when it comes to crop sensor vs. full frame sensor cameras and their lenses.

It’s going to change the industry.


And the reason is that he demonstrates, in a very compelling way, how much of the industry – and most consumers – have gotten the whole discussion around crop sensor dynamics – and their ramifications for lenses (ESPECIALLY lenses)…


Dead wrong.

Watch for yourself. It’s a long vid at 37+ minutes, but well worth it.

I can tell you that his math and images match my own expectations and experience as I downsized from a 5D Mk II to the Rebel SL1. But I definitely learned a few things too, especially around equivalent ISO.

Oh, one other thing: I think he draws one very wrong conclusion about the irrelevance of ISO (Tony, saying that all we need is the same number of photons hitting a smaller sensor may be true, but it's irrelevant when you can't change the light – e.g., at a sporting event, concert, or on the street – something you yourself allude to).

But this is not intended as criticism. It is, rather, kudos and, hopefully, a contribution to a productive dialogue (something we have far too little of these days): he put his opinion out there, along with his rationale for it — and challenged us to think differently.

Outstanding piece!

Are Camera Manufacturers Misleading Us by Not Calculating Sensor Size Into Specs?

Tony Northrup, an award-winning author and well-known reviewer of camera gear, recently put out a video that takes an interesting, in-depth look at how mirrorless camera companies might be fudging the specifications of mirrorless cameras to make them seem better than they are.

The video (which is basically an expansion on this shorter, very controversial demonstration) is an extremely long watch, coming in at almost 40 minutes, but it goes into great detail regarding how Northrup believes camera manufacturers are cheating and misleading us. He claims that stats like ISO, focal length, and aperture are untrue as presented due to variations in hardware — specifically the sensor size.

Read full details and Mr. Northrup’s response to the video comments on Petapixel’s article “Are Camera Manufacturers Misleading Us by Not Calculating Sensor Size Into Specs?”

Note: it is our policy to give credit, as well as deserved traffic, to our news sources, so we don't repost the entire article. Sorry – I know you want the juicy bits, but I feel it's only fair that their site get the traffic. Besides, you just might make a new friend and find an advertiser that has something you've never seen before.

(cover photo credit: snap from Petapixel)


  1. William Sommerwerck

    Some years ago, Bob Carver announced his discovery that video signals’ dynamic range  was compressed by a factor of more than 2:1. He announced that he would soon be introducing a monitor that corrected this ghastly flaw.
    Needless to say, it never appeared.
    Mr Northrup’s views are controversial, because virtually everything he says is either confusing or simply wrong. I’ve been involved in photography for over 45 years, and thoroughly understand film speed, f/stops, exposure, etc, etc, etc.
    Digital sensors do not change the fundamental principles of photographic imaging. His attempt to view things from a different (and, to his mind, more useful) angle has resulted in “explanations” that are sure to thoroughly confuse photographers who’ve had little experience with silver-based 35mm photography.
    “The bigger sensor has a cleaner image because it is able to gather more light.”
    Wrong. Wrong, wrong, wrong, wrong, wrong. Wrong.
    f/2.8 is f/2.8, regardless of focal length or field of view. Any f/2.8 lens delivers the same illumination per unit area as any other f/2.8 — or it wouldn’t be possible to switch lenses and get the same exposure. And hand-held exposure meters would be useless.
    I won’t spend any more time on this, except to say that Mr Northrup should be explaining things in terms of basic principles — not accusing manufacturers of dishonesty.
    Please give me a call, Mr Northrup. I will patiently try to untangle the Gordian knot of your explanations.

  2. TonyNorthrup

    Hey, thanks, Mitch! I’ve been following your blog since whenever it is you started, so I was excited to see your kind words. If you’re ever in CT or Providence, I owe you a beer :).
    Re: the irrelevance of ISO, all I meant is that it’s irrelevant to compare ISO 100 on different sized sensors, since they’re gathering different amounts of light and thus the final images will have different total noise levels. If we were to design a system for measuring sensor sensitivity that would allow direct comparisons between different sensor sizes, I’d base it on total light gathered, rather than light intensity. Of course, I’d do the same for the f/stop replacement.
    That won’t happen, but if you remember to multiply the ISO by the crop factor squared, it’s useful for comparison between sensor sizes. So, I know my MFT GH4 at its base ISO of 200 will have about the same total noise as my 5D Mark III at ISO 800. I already have a sense for how much cleaning up I’m going to have to do at ISO 800, so making that conversion is helpful… and it makes me wish the GH4 offered ISO 25.
    BTW, I see you’re also moving towards the GH4s (we have two of them). Mixing MFT and full-frame for my work is what spawned this research, experimentation, and conclusions. Here’s our latest lens review, starring the 5D Mark III, but filmed mostly with GH4s: And if you want to webcam in as a guest on one of our shows, we’d love to have you–just email me at

  3. TonyNorthrup

    William Sommerwerck Heya, William.

    It’s OK to disagree, but I provided third-party sources, recorded my experiments on video, and showed mathematical proofs for my conclusions. If you want to say I’m wrong seven times, you’ll need to provide some real evidence, because physics doesn’t care how long we’ve been involved in photography. Why not just repeat my experiments and see if you get different results?
    Here’s a quote with more context: “Given similar sensor technology, the sensor that gathers the most total light will produce the cleanest image, regardless of sensor size.”
    To quote you: “Any f/2.8 lens delivers the same illumination per unit area as any other f/2.8.” Yes, this is a principle that I explain, demonstrate, and prove with examples in the video. That’s important prerequisite knowledge that you must understand before digging into the rest of the video. We totally agree on this, and if you think I said anything different, then you didn’t understand my videos.
    The point of the videos is that both f/stop and ISO measure light based on illumination per unit area/light intensity, but smaller sensors have fewer unit areas, and thus they gather less total light. My experiments showed that it’s total light that determines total image noise and depth of field, not light intensity.
    I then provided simple formulas based on crop factor that allow you to convert our existing light intensity-based measurements to a total light gathered-based measurement, providing meaningful comparisons across sensor sizes: Focal length x crop factor, f/stop x crop factor, and ISO x crop factor x crop factor.
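These three conversions reduce to a few lines of arithmetic. Here is a minimal sketch in Python (illustrative only, not code from the video; the 2.0 crop factor for Micro Four Thirds is the commonly quoted value):

```python
# A sketch of the three crop-factor conversions described above
# (illustrative only, not code from the video).

def full_frame_equivalent(focal_mm, f_stop, iso, crop):
    """Convert a lens/sensor combo to its full-frame equivalent."""
    return {
        "focal_mm": focal_mm * crop,  # equivalent field of view
        "f_stop": f_stop * crop,      # equivalent depth of field / total light
        "iso": iso * crop ** 2,       # equivalent total-noise level (area scales as crop^2)
    }

# Example: a 25mm f/1.4 lens at ISO 200 on Micro Four Thirds (crop 2.0)
# behaves roughly like a 50mm f/2.8 at ISO 800 on full frame.
print(full_frame_equivalent(25, 1.4, 200, 2.0))
# -> {'focal_mm': 50.0, 'f_stop': 2.8, 'iso': 800.0}
```

The ISO term is the only one that squares the crop factor, because sensor area (and thus total light gathered) scales with the square of the linear dimensions.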

  4. William Sommerwerck

    TonyNorthrup William Sommerwerck  Rather than get drawn into a drawn-out argument, I will say that if two sensors — of the same area, and built with the same technology — have different-sized pixels, the one with smaller pixels will display greater noise.
    To put it the other way around… A full-frame 16Mpx sensor will have less noise than a C-format 16Mpx sensor, because its pixels are larger — not because the sensor is larger. The additional photons needed for the better S/N are provided by a lens with a physically wider aperture (for a given speed).
    If this isn’t true, then I have somehow missed something somewhere.
    I urge you to rethink what you’re claiming. Electronic sensors do not change the rules of exposure and imaging that have been understood for at least 100 years. If you wish to discuss this, I will provide my phone number.

  5. pixelop

    This is just a ripoff of several photographic equivalence essays floating around… like this video, most of the findings are non-scientific and highly subjective, with formulas being drawn up from thin air. There’s no point in dwelling on this too much; here’s an interesting chart debunking your ISO equivalence formula:

    Unless anyone is interested in the field of photogrammetry, just get out there and enjoy yourself shooting great pics!

  6. petermkent

    This video is a little misleading because he doesn’t cover the whole story; but his heart seems to be in the right place, wanting manufacturers to make faster native crop lenses and getting casual consumers to fight for it by arming them with math. It does seem like he’s trying very hard to be controversial by only telling the side that pisses people off enough to write angry, half-educated letters about false adverts, but hey, I’ve been lobbying Panasonic to make 1.4 zooms for years, so I enjoy the company :)

  7. petermkent

    While I agree his ISO equation was inaccurate, the rest seemed OK, and I do think the “photogrammetry” naming conventions need reform; they are currently dictated by marketing departments. We need to start with renaming sensor sizes based on their diameters in mm, then move focal lengths to angle of view based on their image circles or mounts, and use T-stops rather than f-stops. Marketing is afraid of numbers because numbers encourage buyers to make accurate comparisons, which leads to realizations of a product’s real price-to-performance ratio.

  8. pixelop

    William Sommerwerck TonyNorthrup William, you’re correct… lens coatings on photosites and advanced signal processing also need to be considered when determining SNR. Try comparing an original 5D with a new GH4 and you will find very comparable SNRs, even though there’s quite a bit of difference in sensor size…

  9. TonyNorthrup

    petermkent Don’t leave us hanging, what’s the rest of the story that I left out?

  10. TonyNorthrup

    pixelop I address the Admiring Light article in detail in this video: I also follow up on that exact chart.
    What makes you think the formulas are drawn up from thin air? I provide mathematical proofs for everything. And how is it non-scientific, exactly? I created experiments, documented them, showed the evidence, gathered third-party sources, and invited people to repeat my experiments and critique my results. That’s about as much science as one person can do; from here, I need peers who are willing to actually perform further experiments to either validate or invalidate my findings. You’re welcome to do that!

  11. TonyNorthrup

    William Sommerwerck I cover the pixel density concern here, with new findings that supported my original theory: If you still have concerns after digesting the pixel density part of the video, I’m happy to hear them.

  12. pixelop

    TonyNorthrup pixelop This is what math looks like  and even this is somewhat simplified.. please do elaborate how you went from the SNR equation to the ISO Crop Factor^2 hypothesis. Please do comment on the methodology for your experiment, sample size, validity, etc.

  13. pfhix

    Good god! It wasn’t so long ago I was shooting on 8″x10″, 4″x5″, 6cmx7cm and 35mm for stills, and Academy 35 and Super 16mm on movie film, not to mention very large formats on repro cameras as well as various video camera formats. Conversion factors? Forget it! Any working pro thoroughly understood what gave wide, “normal” and tele on which format, and that the exact same 300mm “normal” lens on his 8″x10″ camera was a long lens on his 4″x5″, and a 300mm on his Super 16 Arri SR3 was a very long lens indeed.

    One thing that was constant was exposure. f/8 on a large format camera was the same as f/8 on 16mm (as long as the camera wasn’t working in close-up, as in product shots, since the bellows factor in large format can be quite significant). When I was doing studio work for car manufacturers we would often be shooting various format stills and movie footage of the same set-up, and I used the same meter across all the cameras. When digital SLRs became available I often shot alongside 16mm cameras and, as often as not, used the digital camera as a second light meter to quickly check large lighting set-ups. I’d match the ISOs and it always worked fine.

    As to exposure, it’s all about how many photons hit each silver halide crystal or pixel on a digital sensor, and the inherent sensitivity of that crystal or pixel. This is why larger pixels are desirable: each gets more photons, giving more signal, needing less gain, and producing less noise. So obviously, the larger the sensor for any given pixel count, the better, as the pixels can be bigger.

    Format does indeed affect bokeh and, more importantly, the ability to separate subject and background, as well as the “look”. I used to put up with the inconvenience of shooting portraits and product work on large format stills partially for these reasons.

    I think Tony is confused and needs to get a bit more practical, empirical, experience. 

    For the record, I have 40 years’ experience and have shot on virtually every format going, barring IMAX. I’ve built specialist cameras and have adapted many lenses for uses such as animation, special effects, 3D, etc. I’ve used film formats from 20″x24″ down to Super 8, as well as many digital still and moving image cameras.

  14. James 9

    For a website that focuses on video rather than still photography it is important to note the errors that the author commits in this video.  The author consistently states that F stop equals light transmission.  This is not true.  F stop is the ratio of the lens aperture to the focal length.  T stop is the light transmission.  The two are not the same and it is for that reason that cinema lenses measure T stop rather than F stop.  If you watch the video carefully you will see that the author refers to F stop with the definition of T stop repeatedly.  See here:

    and here:

    I also found that some of the examples presented in the video are erroneous. For example, at the 11:00 mark the ISO 400 2x example is visibly darker than the others. Therefore, I have to question the methodology in this video. How do we know the two exposures are equivalent? How were they measured? We aren’t told, so I can only guess.

    In fact, I have to question the methodology of all the examples. What lenses were used? Was the same lens applied to the full frame, 1.6 crop, and 2x crop? Because it makes a difference. This sort of methodological problem has even led some to speculate that the ISO is being boosted in camera at certain f-stops because it doesn’t follow the expected value, not realizing that F stop != T stop.

    These are important questions to ask for DSLR cinematographers because we rely on F stops rather than T stops.  This is why it is so important to understand the differences between the two when metering.  And why, with DSLR, to know and understand your histogram.

    I suspect people are upset when the author says things, like at 17:00 minutes, that depth of field is determined by f-stop and sensor size.  But in reality it is f-stop and distance from the camera.  By recomposing to equivalent field of view, the full frame camera is positioned closer to the subject, which decreases depth of field.  This may be considered pedantic or splitting hairs but photographers have been discussing this for over a decade.

    As far as sensor noise goes, what generation camera you have seems to be more important than crop factor, and the role that image processor has in that seems quite important, and is worth note as well.

  15. petermkent

    Hey Tony, thanks for the reply. By “whole story” I’m mostly referring to your gripe with f-stop markings on “crop lenses”. While I agree Panasonic and the like should be releasing faster zooms for m4/3 (especially for $1000), I do not agree that “2.8 should be considered 5.6”. While that is true for depth of field, the other side of the story is that it isn’t true for light transmission, which is arguably far more important. If I buy a 2.8 lens and use it on a m4/3 sensor, I wouldn’t be pissed if the DoF was wider compared to a 2.8 lens on “full frame” (in fact I prefer the wider range, so it’s really a subjective drawback). On the other hand, if I bought the same lens marketed as a 5.6 for m4/3 and tried to match it to a 5.6 on “full frame”, only to find my “full frame” is 2 stops darker, I would be pissed, and upping exposure isn’t a good option as it usually hurts dynamic range and color depth.

    Another reason light transmission is more important than depth of field for a given f-stop value is the common practice of setting your “key light” by f-stop with a light meter, especially when lighting a scene for two cameras of varied sensor sizes; for example, if I ask for a “5.6 key for camera A” and “2.8 for camera B” but really mean the same light, you can see how that can be confusing. I’m sure marketing departments don’t mind misleading buyers, but in this case I think it’s more about keeping to standard measurements that translate easily across formats. As for focal length multipliers, I say we move to angles of view, but since most don’t know them, “FF equivalents” are the next best thing.

  16. petermkent

    My second issue with the video is that your equation for signal to noise is missing pixel size, which is actually the only part that matters in digital: you can have a huge, high-megapixel sensor with tiny pixels, or a small, low-megapixel sensor with large pixels, and the smaller sensor will have a better signal-to-noise ratio. Your equation might be true for film, but it isn’t for digital. You’re close though :D

  17. TonyNorthrup

    petermkent This video covers pixel size/pixel density, which doesn’t actually impact total/visible image noise the way we typically view pictures:

  18. TonyNorthrup

    petermkent I totally agree that the light transmission is more important than the depth of field, but the two are inseparable… and what I demonstrated in the ISO segments is that light transmission is impacted. 
    f/2.8 produces the same light intensity/light per square inch, regardless of the attached sensor. But if a sensor has fewer square inches, then it’s gathering less light. A small bucket gathers less rain, and a small solar panel gathers less energy. Size matters.
    Gathering less total light doesn’t produce a darker exposure because the camera’s software brightens it… I demonstrate this in the first video of the series:

    But it’s the total number of photons gathered that determines the signal to noise ratio, and thus gathering less light produces more noise. 
    I went on to prove with examples that *total light gathered* controls both depth-of-field and total visible image noise, and keeping the light per square inch constant produces very different results when you change the sensor size.
    But yes, many people think that having the same image brightness means the sensor has gathered the same amount of light… but that’s definitely not the case.
    As I probably mentioned in the videos, it would be much simpler to discuss angle of view than focal length equivalents, but old habits are hard to break.

  19. TonyNorthrup

    pixelop The math for squaring the crop factor for ISO is pretty straightforward. Once you acknowledge that you want to keep the total light gathered constant, you just need to convert the one-dimensional crop factor into a two-dimensional factor so it can scale the area of the sensor. So you square it.
    For example, MFT sensors have a 2X crop factor that you apply to the focal length. Their sensors are 1/4 the area of a full frame sensor. 2 squared is 4, and 4 * 1/4 = 1, allowing us to easily compare total light gathering ability when keeping the light intensity constant.
    In that third crop factor video, I use numbers from DXOMark and work backwards to calculate the one-dimensional crop factors based on DXOMark’s measurements of total sensor noise, and you can see that the formula was 99.2% accurate for MFT to full frame conversions.
    Re: sample size, I think I tested maybe 6 or 7 cameras from the modern generation. Using DXOMark’s numbers makes it pretty easy to compare against larger sample sizes. I’d love someone else to repeat my experiments, but so far, nobody has bothered (despite so much passionate response).

  20. TonyNorthrup

    pixelop How so? It’s not cool to throw out negative labels without some evidence.

  21. TonyNorthrup

    pixelop William Sommerwerck I do say “given similar sensor technology” throughout the videos… but all current generation sensors are amazingly similar. As I mentioned in another comment, calculating based on DXOMark’s measurements for top-rated sensors at different sizes showed that 99.2% of the sensors performance was attributable to total light gathered, and only 0.8% was attributable to differing technologies (including lens coatings on photosites and advanced signal processing).
    I do have both the original 5D and the GH4, and though I haven’t compared them, the 5D’s sensor just doesn’t hold up against modern sensors at all… but it’s old, and definitely not “similar sensor technology.”
    But whenever a new sensor technology does come out, it’ll be applied to all different sensor sizes, so that’s pretty much a wash.

  22. TonyNorthrup

    pfhix I still shoot film, and in a variety of different formats. I see you have plenty of experience, but I don’t see what you disagree with in my video.
    You do mention that, “it’s all about how many photons hit each silver halide crystal or pixel on a digital sensor and the inherent sensitivity of that crystal or pixel.”

    That’s not what I found in my third crop factor video–pixel density doesn’t impact total image noise. It does impact noise per pixel, but nobody looks at single pixels.
    To put it in film terms, it’s really not about how many photons hit each silver halide crystal. It’s about how many photons hit your entire piece of film, and what you have the enlarger set to when making your print. 
    As you know, you’ll see the least grain in a 1:1 contact print. The more you have to enlarge your negative, the more grain you’ll see. This was always obvious to those of us who shoot film and do our own printing.
    In the digital world, though, we don’t physically handle our sensors, and so we’ve become distanced from the concept of enlargement. Obviously, showing a picture from an MFT sensor full screen is more of an enlargement than showing a picture from a full frame sensor. People have figured this out, but they oversimplify it by saying, “small sensors are noisier.”
    All I did was to provide proof that giving a small sensor the same amount of light produces images with the same amount of total noise and the same depth of field. Thus, “small sensors are noisier when you keep the light intensity constant” is a true statement, but so is, “small sensors are not noisier when you keep the total light gathered constant.”
    I then provided some formulas to make it easy to calculate how to keep the total light gathered constant.

  23. pixelop

    TonyNorthrup There’s over 40min worth of video to support my claim…

  24. pixelop

    TonyNorthrup This is flawed… you’re assuming the same pixel pitch, well capacity, SNR, etc. across two differently sized sensors from different manufacturers, which is rarely the case. There are numerous cases on DXOMark, when comparing SNR from differently sized sensors, where your golden rule doesn’t apply… sorry, but 6 or 7 samples doesn’t cut it. That will give you an extremely low confidence level in your findings… here’s some help to get you started figuring out the sample size

  25. pixelop

    TonyNorthrup Take a look at this:  &  this is what an honest attempt looks like

  26. TonyNorthrup

    pixelop Good sources! They all seem to support my findings, so I’m not sure why you’re showing them to me.
    I’m totally open to criticism supported by evidence, and in fact I’ve already corrected several oversights that other viewers noticed, but please be both specific and polite.

  27. TonyNorthrup

    pixelop It’s still far more data than you have to the contrary, and there’s not much variation between modern sensors. For example, all MFT cameras have basically the same performance, as do all modern Sony/Nikon sensors. To quote one of the articles you linked to, “It turns out that the noise in good modern digital cameras is dominated by photon counting statistics, not other sources.”
    I welcome different points of view, and I invite you or anyone else to do further research and present your findings. Take some sample pictures that prove your point, and send them over to me at I’ll happily present your findings to the same audience.

  28. pixelop

    TonyNorthrup They do NOT support your findings, and I stand by my statements

  29. pixelop

    TonyNorthrup It only takes 1 sample showing the contrary to debunk your charlatanry

  30. TonyNorthrup

    pixelop “charlatanry” — see, that’s name calling and it’s not polite. If you can produce that one sample, I’m happy to look at it, but I haven’t seen anything from you.
    To expand on my samples, I tested: Canon full-frame (x2), Canon APS-C (x2), Nikon full-frame, Nikon APS-C, Sony APS-C, Panasonic MFT (x2), Olympus MFT (x2), Fujifilm APS-C. That’s literally every sensor format from the biggest manufacturers. 
    But regardless, I say, “given similar sensor technology” throughout my videos, because when I started this process I assumed the sensor technology would have a bigger impact. Through my testing and research I discovered that it doesn’t much matter. Canon’s about 20% behind the other manufacturers, and that amount is visible to the eye. Fuji’s rather difficult to assess because they use a different filter pattern and they fudge their ISOs.
    But given similar sensor technology, you can use the formulas I provided to quite accurately predict lens/sensor performance for the purposes of comparing different cameras. I hope that helps simplify your shopping and shooting.

  31. William Sommerwerck

    I didn’t intend to post again, but I feel it necessary to separate the wheat from the chaff, so that the chaff might be more-closely analyzed (prior to combustion).
    What Mr Northrup says about lens diameter and focal length with respect to DOF is correct. But any technically knowledgeable photographer knows this stuff. (I worked through it many years ago, and can explain the math behind the reason for small formats having greater DOF.) Similarly, his point about ISO “speeds” for digital cameras not being truly equivalent to ISO speeds for silver-halide film is valid.
    But neither of these has any connection with the gross misunderstanding at the center of his argument. To wit:
    “Image S/N is determined by the total amount of light falling on the sensor. Therefore, as large sensors collect more light (at the same f/stop and image illumination) it follows that they will produce cleaner (less-noisy) images.”
    Wrong. An image comprises individual pixels whose behavior is autonomous. The S/N of a given pixel depends on the amount of light striking it — and nothing else. The number of pixels or the size of the sensor has nothing to do with it.

    1. Stephen Cole

      William Sommerwerck – “An image comprises individual pixels whose behavior is autonomous. The S/N of a given pixel depends on the amount of light striking it — and nothing else. The number of pixels or the size of the sensor has nothing to do with it.”

      Wrong. Size matters; pixel density doesn’t. An image comprises pixels which the retina averages, and the perceived S/N ratio is the same regardless of pixel density for same-size, similar-technology sensors. Nobody looks at individual pixels.

      INDIVIDUAL PIXEL NOISE IS IRRELEVANT for high-density sensors because you have more pixels to average, thus eliminating noise. Think of it this way: if you compare two full frame sensors, one with 40MP and the other with 10MP, and you take the 40MP image and downscale it to 10MP by averaging every block of 4 pixels into one pixel, the random noise would cancel to the point where both images look the same in terms of noise. Tony is correct and you owe him an apology.
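The averaging argument above is easy to check numerically. Here is a minimal sketch using purely synthetic Gaussian noise and only the Python standard library (an illustration of the statistics, not real sensor data):

```python
import random
import statistics

# Simulate per-pixel random noise on a high-resolution sensor, then
# downscale by averaging each group of 4 pixels into one -- the
# 40MP -> 10MP thought experiment, with purely synthetic noise.
random.seed(0)
hi_res = [random.gauss(0.0, 1.0) for _ in range(1_000_000)]
lo_res = [sum(hi_res[i:i + 4]) / 4 for i in range(0, len(hi_res), 4)]

# Averaging 4 independent noise samples cuts the standard deviation
# by sqrt(4) = 2, so the downscaled image shows half the per-pixel noise.
print(statistics.pstdev(hi_res))  # ~1.0
print(statistics.pstdev(lo_res))  # ~0.5
```

The factor-of-2 reduction is exactly what makes per-pixel noise comparisons between sensors of different pixel counts misleading when the images are viewed at the same size.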

      And I have one year of experience with photography – but that’s totally irrelevant. Nobody cares if you have 100 years of experience. This is physics.


  32. pixelop

    William Sommerwerck I don’t believe there’s a better way of articulating this. I’m in full agreement with William.

  33. KiM_Sweden

    pixelop TonyNorthrup

    pixelop, you’re just thinking like an old dog – ignorant and unwilling to listen. Tony is trying to share his findings with a lot of effort, and you’re just blocking your ears for whatever reason…

    Instead, if you have evidence that he has it all wrong, discuss like a grown-up and share your opposing knowledge.
    From your side, it looks more like you’re trying to protect your faulty learning curve and are really embarrassed…
    Me, myself, I’m not so clever that I can be in this discussion, but I don’t hesitate to relearn everything from the ground up if the facts have changed. I like being educated correctly, that’s all.


  34. pixelop

    KiM_Sweden “Me myself is not so clever that I can be in this discussion” ahem :) Believe me, you are not being educated through TonyNorthrup’s videos. While there may be some partial truths here and there, the angle is misleading at best, and most, if not all, of the SNR and ISO info is unfounded and incorrect. Kudos for the effort, but it is arrogant to preach new discoveries in digital imaging science with this level of “research”. I’m not trolling; I believe respect is earned, not given, and all the misrepresentations make this video unworthy of it. Casual observation or simple arithmetic doesn’t make you a subject matter expert. If you want to talk about equivalent DOF between sensor sizes, go ahead, but don’t go saying a 12-35mm f/2.8 lens with an image circle for a MFT sensor is not really 2.8. If you’re going to talk about “total image noise”, please define it first. What is the industry-recognized definition for it??? Etc., etc., etc. It’s one thing to make casual observations; it’s another to pretend to teach or establish new rules in such an advanced and complicated area of study.

  35. mchenetz

    pixelop TonyNorthrup I can definitely understand what Tony is saying. If we just take the labels out of it and look at the light coming into the camera, then what he is saying makes sense. If you have a sensor that has fewer pixels and a smaller area then, unless you focus your light on that area, you will obviously not get as much light onto it. A bigger sensor has more area to receive light. Lenses obviously have a lot to do with this too. I don’t think what he is trying to say is out of the realm of possibility. Why is there so much anger and aggression? I am not saying the math is totally accurate, but the tests seem to indicate that there is a relationship. Science is all about repeatable tests and proof. This is what Tony is trying to do. You should not fault someone for trying to prove a theory. If you do not think it is correct, then try it yourself and disprove the theory. That should be the only educated response that gets listened to. I will be glad to listen to yours and other theories too.

  36. William Sommerwerck

    mchenetz pixelop TonyNorthrup 
    “If we just take labels out of it and look at light coming into the camera then what he is saying makes sense.”
    It makes no sense at all. The total amount of light reaching a sensor has nothing to do with the sensor’s noise level, because the sensor comprises individual pixels that respond independently to the light striking them. I don’t understand why Mr Northrup cannot grasp this trivially simple idea. When he presents experimental evidence that contradicts this, I will give it my full attention.
    Here’s a concrete example. Imagine a 4Mpx sensor in a camera with a 50mm f/2.8 lens. Now imagine a sensor with twice the dimensions (and thus four times as many pixels of the same size) in a camera with a 100mm f/2.8 lens.
    Clearly, the second sensor receives four times as much light energy. But it also has four times as many pixels. Therefore, the amount of light each pixel receives is the same, so the S/N is the same. QED.
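    The arithmetic of this example can be sketched in a few lines of Python. The photon count T is an assumed, purely illustrative number, not a measurement:

```python
# Sketch of the example above: same pixel size and same f-stop means the
# same light per pixel, regardless of sensor size.

T = 4_000_000.0  # assumed total photons on the small sensor (illustrative)

def light_per_pixel(total_photons, pixel_count):
    """Photons per pixel, assuming uniform illumination across the sensor."""
    return total_photons / pixel_count

small = light_per_pixel(T, 4_000_000)            # 4 Mpx sensor, 50mm f/2.8
large = light_per_pixel(4 * T, 4 * 4_000_000)    # doubled dimensions: 4x light, 4x pixels

assert small == large  # identical photons per pixel, so identical per-pixel S/N
```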

  37. murhaaya

    Well, if the “total light that hits the film/sensor” mattered, that would mean there is a difference between exposing one sheet of 5×4 film and exposing one sheet of 5×4 that has been cut in half. How the size of the film matters is still unknown to me. How does exposing a well-lit face in a sea of darkness with a 50 mm lens on 36×24 differ from doing the same on 16 mm film, where there would be only the face and no darkness around it? The 16 mm frame has to be enlarged more to get the same size print, but that does not mean it is more grainy; if it’s the same material developed in the same chemistry, then it’s the same image, just smaller. Also, overexposing a face in a sea of darkness won’t give you a better overall exposure. It will just be a blown-out face in a sea of darkness.
    If this is a general physical concept, it should apply, as is common in physics, in a simplified scenario. So let’s consider a pinhole camera with film instead of a sensor. This way we can eliminate things like coatings, pixel size, microlenses, and telecentric designs; the film will be the same and processed the same way, so there is no need to speculate about similar technology, and no in-camera brightness adjustments can occur.

    Let’s consider a pinhole with diameter A (the exact size does not matter). It lets some amount of light through, and the intensity of that light depends on the distance from the pinhole, decreasing with the square of the distance. Let’s take a sheet of 6×6 cm film and place it 50 mm behind the pinhole. This gives a horizontal angle of view of about 62° on the film, a fairly wide angle, as anybody who has shot 6×6 with a 50 mm lens would tell you. The aperture f-number of the pinhole is thus 50/A.

    Now I crop the 6×6 sheet of film to 36×24. What do I get? I did not move the film away from the pinhole, so the intensity is still the same; I did not change the pinhole itself, so the light it lets through is still the same. I merely cropped my picture, and the result is the same as if I had cropped a print of that picture. I get a different angle of view, around 40°, that of a normal lens on full frame.

    But what if I want the same 62° horizontal angle of view, the equivalent focal length that gives me the same view of the world? That works out to a 30 mm focal length, so I have only one option left: move my film, now 36×24 mm, closer to the pinhole. The pinhole did not change, so the amount of light it passes did not change, but because I moved the film closer, the intensity of light at the new film plane is higher. The aperture f-number is now 30/A (remember, this is the f-number, as in f/1.4 and the like). To get the same intensity of light at the 30 mm distance, we have to shrink the diameter A to a diameter B such that:
    50/A = 30/B
    The physical diameter of the pinhole is different, but the intensity of light in the plane of the film (which sits at a different distance in the two cases) stays the same.

    Now, how DOF fits into this I have no idea. There is no DOF with a pinhole (there is diffraction, and an optimum object distance for a given pinhole).
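    The pinhole arithmetic above can be checked with a quick Python sketch. The pinhole diameter A of 0.5 mm is an assumed value chosen purely for illustration:

```python
import math

def horizontal_aov(film_width_mm, distance_mm):
    """Horizontal angle of view of a pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(film_width_mm / 2 / distance_mm))

A = 0.5  # assumed pinhole diameter in mm (illustrative)

aov_66 = horizontal_aov(60, 50)    # 6x6 cm film, 50 mm behind the pinhole
aov_crop = horizontal_aov(36, 30)  # 36x24 mm film moved in to 30 mm

# Moving the smaller film closer restores the same horizontal angle of view...
assert abs(aov_66 - aov_crop) < 1e-9

# ...and keeping the f-number equal (50/A = 30/B) keeps the film-plane
# intensity the same, via a smaller physical pinhole B:
B = 30 * A / 50
assert abs(50 / A - 30 / B) < 1e-9
```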

  38. murhaaya

    Another way to approach this with physical simplification is to consider just one pixel. Then the light hitting the pixel and the total light hitting the one-pixel sensor are the same, and the noise for the one pixel and for the sensor is the same. This imaginary pixel can replace the film in my previous example: it has a nonzero (although small) size, it has an (although small) angle of view, and the amount of light it receives depends on its distance from the pinhole and the opening of the pinhole.

    Also, that would imply that 150+ years of photography had it all wrong... hard to believe.

  39. William Sommerwerck

    mchenetz William Sommerwerck pixelop TonyNorthrup 
    This is the only material that had any connection with what we were discussing. It raises an issue that had occurred to me, but I didn’t want to bring it up, as I was afraid the discussion would veer off in the wrong direction.
    “Theoretically, a larger sensor with smaller pixels will still have lower apparent noise (for a given print size) than a smaller sensor with larger pixels (and a resulting much lower total pixel count). This is because noise in the higher resolution camera gets enlarged less, even if it may look noisier at 100% on your computer screen.”
    This is a good point for investigation — but it doesn’t address the point of view Mr Northrup is promoting.
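    The quoted point can be illustrated numerically: when downsampling averages k noisy pixels into one, the per-output-pixel noise standard deviation shrinks by roughly sqrt(k). This toy sketch uses simulated Gaussian noise with a fixed seed, not real sensor data:

```python
import math
import random

random.seed(0)  # deterministic simulated noise, not real sensor data

def stddev(xs):
    """Population standard deviation of a list of numbers."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

# 100,000 "pixels" of pure unit-variance noise.
noise = [random.gauss(0, 1) for _ in range(100_000)]

# Downsample by averaging groups of 4 (like 2x2 binning).
binned = [sum(noise[i:i + 4]) / 4 for i in range(0, len(noise), 4)]

# Averaging 4 pixels cuts the noise standard deviation roughly in half.
assert abs(stddev(binned) / stddev(noise) - 0.5) < 0.05
```

So the same per-pixel noise looks less noisy in a smaller print or a downsampled view, which is the "enlarged less" effect the quote describes.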

  40. petermkent

    Why is this still being discussed? Stop filling my inbox with this useless jibber jabber, lol.
    It’s a simple conversation: the guy in the video thinks signal-to-noise ratios are calculated at the sensor level, by combining the light gathered by all the pixels, but they are actually calculated at the pixel level, from the light gathered by a set number of “sensels.” Calculating at the sensor level might be valid if the goal were to produce a single solid color, but since we are mostly making photos with an array of colors and intensities, we need to factor in the pixel size.
    “… larger pixels receive a greater flux of photons during a given exposure time (at the same f-stop), so their light signal is much stronger. For a given amount of background noise, this produces a higher signal to noise ratio…”.
    If he refuses to understand this, then leave him to his ignorance; let him save a little face and he’ll come around on his own (eventually).

  41. petermkent

    Right, I think I see where you’re confused.
    “But if a sensor has fewer square inches, than it’s gathering less light. A small bucket gathers less rain, and a small solar panel gathers less energy. Size matters.”
    You’re right, size matters, and your equation would be correct, but “the bucket” isn’t the sensor; it’s the pixel. Sensels are exposed on the sensor and combined into individual color pixels, and a larger (in square microns) “sensel group” yields a higher signal-to-noise ratio for the pixel.
    While it may be possible to hide some noise by downsampling the resolution for monitor viewing (which monitors do automatically), that is just one way of manipulating the bucket, which is still at the pixel level.
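    The bucket analogy can be made concrete with photon shot noise, which follows Poisson statistics: for N photons, the noise is sqrt(N), so the per-pixel SNR is sqrt(N). The photon counts below are assumed, illustrative values:

```python
import math

def shot_noise_snr(photons):
    """Per-pixel SNR under photon shot noise: N / sqrt(N) = sqrt(N)."""
    return photons / math.sqrt(photons)

small_pixel = shot_noise_snr(1_000)  # assumed photons in a small sensel
big_pixel = shot_noise_snr(4_000)    # 4x the area gathers 4x the photons

# Four times the light per pixel yields twice the per-pixel SNR.
assert abs(big_pixel / small_pixel - 2.0) < 1e-9
```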
