objective c - NSImage Image Size With Multiple Layers
I have a Mac (not iOS) application that lets the user select one or more images with NSOpenPanel. I'm having trouble getting the correct dimensions of multi-layered images. If the image contains one layer or is flattened, the following gives me the correct image dimensions from a file path.
    NSImage *image0 = [[NSImage alloc] initWithContentsOfFile:path];
    CGFloat w = image0.size.width;
    CGFloat h = image0.size.height;
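One thing worth checking first (a side note, not necessarily the cause of your problem): -[NSImage size] is measured in points, not pixels, so a file carrying DPI metadata can report a size that differs from its actual pixel dimensions. A minimal sketch for comparing the two, using the first representation of the file:

```objective-c
#import <Cocoa/Cocoa.h>

// Sketch: compare point size vs. pixel size of an image file.
// image.size is in points and honors the file's DPI metadata;
// pixelsWide / pixelsHigh report actual pixel counts.
NSImage *image = [[NSImage alloc] initWithContentsOfFile:path];
NSImageRep *rep = image.representations.firstObject;
NSLog(@"points: %@  pixels: %ld x %ld",
      NSStringFromSize(image.size),
      (long)rep.pixelsWide,
      (long)rep.pixelsHigh);
```

If the two disagree, the discrepancy is a DPI issue rather than a layering issue.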
But if I select an image that has multiple layers, I get strange numbers. For example, I have a single-layer image whose dimensions are 1,440 x 900 px according to Fireworks. If I add a small layer containing a circle, save the image as PNG, and read it back, I get 1,458 x 911 px. This topic and this topic suggest reading the largest layer. Okay. So I've created a function as follows.
    - (CGSize)getImageSize:(NSString *)filePath {
        NSArray *imageReps = [NSBitmapImageRep imageRepsWithContentsOfFile:filePath];
        NSInteger width = 0;
        NSInteger height = 0;
        for (NSImageRep *imageRep in imageReps) {
            if ([imageRep pixelsWide] > width) width = [imageRep pixelsWide];
            if ([imageRep pixelsHigh] > height) height = [imageRep pixelsHigh];
        }
        NSSize size = CGSizeMake((CGFloat)width, (CGFloat)height);
        return size;
    }
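As an alternative you could try reading the file's metadata through ImageIO instead of decoding representations with AppKit. This is only a sketch, not a verified fix for the Fireworks case: CGImageSourceCopyPropertiesAtIndex reports the pixel width and height recorded for each image in the file, and whether index 0 corresponds to the 1,440 x 900 canvas or to the enlarged layered bounds is something you would need to test against your PNGs.

```objective-c
#import <Foundation/Foundation.h>
#import <ImageIO/ImageIO.h>

// Sketch: pixel dimensions of the first image in a file, read from
// its metadata via ImageIO rather than from decoded bitmap reps.
static CGSize FirstImagePixelSize(NSString *filePath) {
    NSURL *url = [NSURL fileURLWithPath:filePath];
    CGImageSourceRef source =
        CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
    if (source == NULL) return CGSizeZero;

    CGSize size = CGSizeZero;
    CFDictionaryRef props =
        CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
    if (props != NULL) {
        NSDictionary *d = (__bridge NSDictionary *)props;
        size.width  = [d[(__bridge NSString *)kCGImagePropertyPixelWidth] doubleValue];
        size.height = [d[(__bridge NSString *)kCGImagePropertyPixelHeight] doubleValue];
        CFRelease(props);
    }
    CFRelease(source);
    return size;
}
```

CGImageSourceGetCount on the same source also tells you how many images ImageIO sees in the file, which may help you understand what the Fireworks PNG actually contains.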
Using the function above, I still get the wrong dimensions (1,458 x 911 px) instead of 1,440 x 900 px. I actually ran into the same problem when I was developing Mac applications a few years ago and never solved it. How can I get the correct dimensions when an image contains multiple layers?
Thank you for any advice.