
Are "standard" lenses really standard?



Not specifically Alpha lenses,....but...

For a long time I've been told that 50mm (or thereabouts) is the standard focal length for full frame (or a 35mm SLR) because it gives a view very close to that of the human eye.

I've just been playing with my 50mm, lining it up on my computer screen, then taking the camera away and finding that my eye sees a much wider field of view (other eye closed).

I then tried again with other lenses and found my field of view is somewhere between 15 and 24mm (probably about 20mm, but I don't have that focal length for FF).

My optician has never told me I have unusually wide vision, so how am I misinterpreting the term "Standard Lens"?

This question is purely out of interest - I use 50mm when I feel it's right for the subject I am photographing.

Edited by thebeardedgroundsman

Humans have a binocular viewing angle of about 120°, which is roughly comparable to a 10mm lens on a full-frame camera. Most of that field of view is peripheral vision, though, hardly usable for recognizing subjects. Only the central ~7° cone is usable for recognizing faces, comparable to ~250mm on full frame. So saying that 50mm on FF most closely resembles the human field of view is indeed nonsensical. Just like people claiming the human eye can discern 300 megapixels: it's our brain's incredible post-processing abilities where most of the magic happens.
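Those angle-to-focal-length comparisons follow from the standard rectilinear angle-of-view formula, FOV = 2·atan(sensor width / 2f). A quick sketch using the 36mm horizontal width of a full-frame sensor (the numbers are rough approximations, not exact equivalents):

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal angle of view of a rectilinear lens on a full-frame sensor."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# A ~10mm lens roughly covers the 120-degree binocular field,
# while ~250mm corresponds to the narrow central cone of sharp vision.
print(round(horizontal_fov_deg(10)))   # ~122 degrees
print(round(horizontal_fov_deg(50)))   # ~40 degrees
print(round(horizontal_fov_deg(250)))  # ~8 degrees
```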

As to your original question, I found an interesting read here. To quote from this source:

"... the 50-mm anecdote persists—in part because of the history of lens manufacturing, but also because it taps into the latent fears, anxieties, and imaginations that surround the use of technology for seeing. It’s comforting to believe that there is a standard view, and that photographic apparatuses can reproduce it."

Edited by Pieter

The human eye's FOV is a lot wider than any standard lens, so I don't think that's the definition of a standard lens.  I have read a bit about standard lenses and, more often than not, I come away more confused.  What I did find with a standard (50mm) lens mounted on my Pentax K-5, an APS-C DSLR (0.92x magnification, 100% viewfinder coverage), is that when I open my left eye to look at the subject while the right eye is glued to the viewfinder, the two images more or less merge together.  A standard lens is usually defined as between 40-60mm, and I can confirm a 40mm lens does about as well as a 50mm.  Wide-angle and telephoto lenses do not work this way.

This doesn't work with my A7II (0.71x magnification, 100% viewfinder coverage): the almost 1/3 reduction in subject magnification is too much for the brain to compensate for the difference in size between the left and right eyes' views.  Looking at DPReview's specs for Sony APS-C E-mount cameras, the A6xxx cameras have 1.05x viewfinder magnification with 100% coverage.  I would like to ask the members of this forum to try the experiment of looking through the viewfinder while keeping the naked eye on the subject, shooting with a 'standard' lens (35-70mm focal length).
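A sketch of the arithmetic behind this experiment (my own framing, not from any camera manual): viewfinder magnification is quoted with a 50mm lens at infinity, and the apparent size of the finder image scales linearly with focal length, so the two eyes can fuse the views when the apparent magnification is close to 1.0 (life-size):

```python
def apparent_magnification(finder_mag, focal_mm):
    """Apparent size of the viewfinder image relative to the naked eye.
    finder_mag is the spec value, quoted with a 50mm lens at infinity."""
    return finder_mag * focal_mm / 50.0

print(apparent_magnification(0.92, 50))  # K-5 + 50mm: 0.92, near life-size
print(apparent_magnification(0.71, 50))  # A7II + 50mm: 0.71, too small to fuse
print(apparent_magnification(0.71, 70))  # A7II + 70mm: ~0.99, should fuse again
```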


I just tried with my a6500 and 16-55mm zoom and, sure enough, both eyes lined up perfectly at about 50mm focal length. Hardly scientific, but this sounds like a plausible explanation.

Still, this is only relevant when looking through the viewfinder. After taking the shot you lose that context and the relation to your actual human view: the captured image becomes just a very tight crop of it.


Yep, I've just tried it and sure enough it works. I also take the other points re: peripheral vision (that was my original thought for an explanation), and of course the captured image will be viewed differently - size of print/screen etc.

I like the explanation in the link Pieter posted, suggesting it's the perspective that is similar to the human eye.

This makes sense, given that wide-angle lenses appear to stretch perspective, making the foreground, midground and background look further apart, while telephoto lenses make them look closer together.

 

Edited by thebeardedgroundsman

Don't forget that all of this standardization was established in the film era, when the SLR's pentaprism viewfinder was the state of the art.  Before that you had the view camera with its upside-down, left/right-reversed image, then came the waist-level finder, which did not flip the subject upside down but still reversed the image left to right.  It sure drove me bonkers when shooting with a Kowa Six medium-format camera and its waist-level finder - a Pentax 645 with a pentaprism finder was a huge step forward for me back then.

Anyway, standard lens or not, I don't think about it much these days.  I use whatever focal length I deem necessary for a particular shot.  I tend to use 50 and 85mm lenses because they are the fastest lenses I have - mainly for isolating the subject from the background.

 


  • 2 weeks later...

Interesting topic.

Just to add a few thoughts here.

Human eyes are, of course, very different from lenses: they can move and change their angle of view using muscles, so focus can be found without moving the head - a bit like a fixed camera whose lens could swivel to find the object of focus. For the eyes the focused object is always at the center, and the f-number is achieved much as a camera lens does it, by narrowing the pupil to limit the light reaching the macula (the "yellow spot") at the back of the eye, or widening it in low light. Nearsighted people can see objects sharply by looking through a very small hole in, for example, a piece of paper, which is similar to gaining sharpness by stopping a lens down to f/8 or beyond.

Human eyes don't use the complete sight angle, just the central part of the range (even though overall visibility is really very wide). Because they move easily, they bring the object of interest to the center of the view. The two eyes together create a 3D perspective, important for orientation in space, recalculating the view all the time.

Fujifilm developed the 3D W3 camera, with a nice 3D "stereo" view, including a very interesting rear display that shows a 3D image by combining two separate "normal" photos (or two video streams for 3D video). Since the image on the rear screen is processed artificially, human eyes are not prepared to watch it for long; many people get a headache after just a few minutes. Even so, the results are amazing.

Again, the topic is very, very interesting; it even touches on the "technical" aspects of human vision.


Glad you find the topic interesting, Aldowski.

Of course the camera also takes away the "filter" that the brain applies. E.g. once you've had "floaters" in your eye for a while, the brain ignores them - but a dirty camera sensor WILL show up in your photos.

I recently heard on the radio that human eyes always have the nose in their field of view, but the brain is very good at filtering it out of the image!

Imagine a camera that automatically erases the bits you don't want in your image!


Yes, the brain is the ultimate controller, developed over thousands of years. Unfortunately, eyes develop more issues with age: the "floaters" you mentioned, a changing distance between the front surface of the eye and the macula at the back (as the distance gets smaller, eyes become far-sighted), and reduced flexibility in focusing.

About pixels: I tried to explain this in another topic. I can't speak for Sony cameras (I have two, but neither shows the issue), but the Fujifilm 3D W3 definitely has a "broken"-pixel fixing process in the camera's own firmware. In videos I created with it, completely white pixels are visible, most clearly in low-light footage (I also tried with no light at all). At those same sensor positions, photos look fine, most probably because the camera interpolates the surrounding pixels to "recreate" the broken one and hide the issue. I returned a few W3 3D cameras to the shop over this, but it seems many of the 10MP sensors in the W3 have the same problem. Again, in photos the issue is corrected; in video it remains (though it can be fixed in post-processing; I use a VirtualDub plugin for it).

Ears are another good example of our human "sensors" changing. In some primary schools, kids signal things to each other (phone ringtones, for example) using high-frequency sounds that they can hear but their teachers cannot: the "Mosquito" ringtone. There are even videos that "recognize" your age by playing a tone that sweeps down from 20 kHz; the frequency at which you first hear the sound indicates your age on an included table.


It's not about the FOV in an absolute sense - what the eye sees/registers all around - but rather about the "natural magnification" of the area we focus on with the eye+brain.

My A7R IV has 0.78x viewfinder magnification: when I zoom to ~61mm, both eyes (the naked one and the one looking into the viewfinder) see the subjects at the same size/proportions. Not the same frame, but the same magnification. 61mm x 0.78 gives around 48mm as the "natural view" (not field of view). And I remember on my old D750 it happened at the end of the 24-70/2.8, at 70mm; with 0.7x magnification that corresponds to ~49mm.
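Written out explicitly (just a restatement of the arithmetic above), the "natural" focal length is the focal length at which both eyes match, multiplied by the finder magnification:

```python
def natural_focal_mm(finder_mag, matching_focal_mm):
    """Focal length whose view matches the naked eye: the focal length
    where the two eyes fused, times the viewfinder magnification."""
    return finder_mag * matching_focal_mm

print(natural_focal_mm(0.78, 61))  # A7R IV: ~47.6mm
print(natural_focal_mm(0.70, 70))  # D750:   ~49mm
```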

