Why is the APS-C image from the A7R3 smaller than an M4/3 20 MP image?


Recommended Posts

I have been shooting with an M4/3 camera that yields a 20 MP raw file. I recently acquired a Sony A7R3 and thought the APS-C file size would be about the same as the full M4/3 image file. What I find is that the Sony APS-C images are significantly smaller than the M4/3 images. For example, my G9 yields an image file of 5184x3888, or 20.15 MP, while the A7R3 in APS-C mode yields a file of 5168x3448, or 17.8 MP. What I do not understand is how half of a FF sensor of 40+ MP can yield an image that is so much smaller than the M4/3 image, when M4/3 is said to be about half the size of a FF sensor. I bought the Sony camera system thinking I could have a single system that permitted both high-resolution imaging for landscapes and an APS-C crop for use with my long lens for wildlife, without losing any image quality compared to what I got with the M4/3 system. I am looking for a good, understandable explanation of why APS-C is not yielding half the pixel density of the full-frame sensor.


You're making quite a lot of errors there in your reasoning, which leads you to false conclusions. Let's set things straight:

4 hours ago, pcr1040 said:

What I do not understand is how half of a FF sensor of 40+ MP can yield an image that is so much smaller than the M4/3 image

You should really differentiate between two critical things here: sensor area and megapixel count.

  • FF sensor area is 35.9 mm × 24.0 mm = 862 mm²
  • APS-C sensor area is 23.6 mm × 15.7 mm = 371 mm²
  • M4/3 sensor area is 17.3 mm × 13.0 mm = 225 mm²

From this you can see that APS-C is 43.0% of FF sensor area, and M4/3 is 26.1%.

The A7R3 has 42.18 effective MP. If you put it in APS-C mode, that should yield 42.18 × 43.0% = 18.1 MP. Not sure where those missing 0.3 MP went (the files are actually 17.8 MP); possibly they are reserved for image stabilisation.

If the A7R3 had an M4/3 crop mode, it would yield about 42.18 × 26.1% = 11 MP. That's a whole lot less than the APS-C sized crop image.
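A quick sketch of the arithmetic above, using the sensor dimensions and MP counts from this post:

```python
# Sensor areas in mm^2 (width x height)
ff_area = 35.9 * 24.0    # full frame: ~862 mm^2
apsc_area = 23.6 * 15.7  # APS-C: ~371 mm^2
m43_area = 17.3 * 13.0   # Micro Four Thirds: ~225 mm^2

print(f"APS-C is {apsc_area / ff_area:.1%} of FF area")  # ~43.0%
print(f"M4/3 is {m43_area / ff_area:.1%} of FF area")    # ~26.1%

# Effective MP of the A7R3 sensor, scaled by the crop's share of the area
ff_mp = 42.18
print(f"APS-C crop: {ff_mp * apsc_area / ff_area:.1f} MP")  # ~18.1 MP
print(f"M4/3 crop:  {ff_mp * m43_area / ff_area:.1f} MP")   # ~11.0 MP
```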

4 hours ago, pcr1040 said:

and an APS-C crop for use with my long lens for wildlife, without losing any image quality compared to what I got with the M4/3 system.

You don't. You get fewer megapixels, yes, but those pixels are much larger than those on your M4/3 camera. In practical terms this means they are less noisy when you crank up the ISO. Likely the images from the A7R3 in crop mode will look nicer than those coming out of a G9 at anything but base ISO.

4 hours ago, pcr1040 said:

why APS-C is not yielding half the pixel density of the full-frame sensor.

Again, you should differentiate two things here:

  • Pixel density, which is the number of pixels per unit area;
  • Resolution, which is the total number of pixels in an image.

The A7R3, be it in FF or APS-C mode, has a pixel density of 42.18 MP / 862 mm² = 49k pixels/mm². Crop mode has nothing to do with this: it's a property of the sensor.

The G9 has a pixel density of 20.16 MP / 225 mm² = 90k pixels/mm².

From this you can see that the A7R3 has about half the pixel density of the G9, which means that its pixels are almost twice as large, gather almost twice as much light per pixel, and consequently produce much cleaner images.
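The same pixel-density comparison in code, using the areas and MP counts from above:

```python
ff_area = 35.9 * 24.0   # A7R3 sensor area: ~862 mm^2
m43_area = 17.3 * 13.0  # G9 sensor area: ~225 mm^2

a7r3_density = 42.18e6 / ff_area  # ~49k pixels/mm^2
g9_density = 20.16e6 / m43_area   # ~90k pixels/mm^2

print(f"A7R3: {a7r3_density / 1000:.0f}k pixels/mm^2")
print(f"G9:   {g9_density / 1000:.0f}k pixels/mm^2")

# Each A7R3 pixel covers roughly 1.8x the area of a G9 pixel
print(f"Pixel area ratio: {g9_density / a7r3_density:.2f}x")
```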

To sum things up

  • APS-C is not half the area of FF (it's a little less), and neither is M4/3 (which is about ¼ of FF area)
  • Image quality is not determined by MP count alone (there are compact telezoom cameras with a higher MP count than the G9; you might as well buy one of those if MP count were all that mattered)
  • Yes, the A7R3 in APS-C mode produces a lower MP count than the G9, but it still produces better images if you value decent performance at high ISO.
Edited by Pieter

Pieter

Did you ever read something, then smack yourself in the head and say to yourself, "DUMMY"? Well, after reading your concise and clear response to my dumb question, I did just that. I got hung up on pixel density and forgot about pixel size. Thank you for your excellent response. But I still deserve the title "Dummy".

Paul

