I'm trying to get the quality value from a Bitmap that was loaded from a JPEG image. This is what I have so far:
int GetQuality(Bitmap bitmap)
{
    if (!bitmap.RawFormat.Equals(ImageFormat.Jpeg)) return -1;

    ImageCodecInfo jpegEncoder = ImageCodecInfo.GetImageEncoders()
        .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
    var jpegGuid = jpegEncoder.Clsid;
    var jpegParams = bitmap.GetEncoderParameterList(jpegGuid);
    EncoderParameter qualityParam = jpegParams?.Param[1]!;

    BindingFlags bindFlags = BindingFlags.Instance | BindingFlags.Public |
                             BindingFlags.NonPublic | BindingFlags.Static;
    FieldInfo? field = typeof(EncoderParameter).GetField("_parameterValue", bindFlags);
    object obj = field!.GetValue(qualityParam)!;
    IntPtr qualityParamValue = (IntPtr)obj;

    unsafe
    {
        var quality = *(byte*)qualityParamValue; // val is type byte with a value of 0
        return quality;
    }
}
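For context, this is roughly how I'm calling it (a minimal sketch; the file path is just an example, and the printed value matches the zero result described below):

```csharp
using System;
using System.Drawing;

// Hypothetical caller: load a JPEG from disk and query its quality.
var bitmap = (Bitmap)Image.FromFile("photo.jpg"); // example path
int quality = GetQuality(bitmap);
Console.WriteLine(quality); // prints 0, not the expected 50
```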
But the quality value comes back as zero for a JPEG bitmap that was saved at 50% quality (via EncoderParameter(Encoder.Quality, 50L)). Either my reflection/unsafe code is wrong, or the quality value is not saved in the JPEG bitmap. I derived this code by looking at the EncoderParameter constructor, which looks like this:
public EncoderParameter(Encoder encoder, byte value)
{
    _parameterGuid = encoder.Guid;
    _parameterValueType = EncoderParameterValueType.ValueTypeByte;
    _numberOfValues = 1;
    _parameterValue = Marshal.AllocHGlobal(sizeof(byte));
    *(byte*)_parameterValue = value;
    GC.KeepAlive(this);
}
where byte value is the quality (0 to 100). Does my unsafe code correctly perform the inverse of *(byte*)_parameterValue = value;? If not, what am I doing wrong? If so, does that mean the quality is not actually stored in the compressed bitmap?
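For completeness, the 50%-quality test image was produced roughly like this (a sketch; the file names are placeholders):

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.Linq;

// Look up the JPEG encoder and save a bitmap with a quality setting of 50.
ImageCodecInfo jpegEncoder = ImageCodecInfo.GetImageEncoders()
    .First(c => c.FormatID == ImageFormat.Jpeg.Guid);

var encoderParams = new EncoderParameters(1);
encoderParams.Param[0] = new EncoderParameter(Encoder.Quality, 50L);

using var source = new Bitmap("source.png"); // placeholder input
source.Save("output.jpg", jpegEncoder, encoderParams);
```

(Note that Encoder here is System.Drawing.Imaging.Encoder.)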