What is the size of a UIFont point?

I'm struggling to figure out what the point size in UIFont actually means. It isn't pixels, and it doesn't appear to be the standard typographic definition of a point, which is 1/72 of an inch.

I measured the pixel size with -[NSString sizeWithFont:] for fonts of different sizes and got the following:

| Point Size | Pixel Size |
| ---------- | ---------- |
| 10.0 | 13.0 |
| 20.0 | 24.0 |
| 30.0 | 36.0 |
| 40.0 | 47.0 |
| 50.0 | 59.0 |
| 72.0 | 84.0 |
| 99.0 | 115.0 |
| 100.0 | 116.0 |

(I made [@"A" sizeWithFont:[UIFont systemFontOfSize:theSize]] )

And look at the 72.0 point size: it is clearly not one inch, since this is on a device with 163 DPI, so one inch should be 163.0 pixels, right?

Can anyone explain what a point means in terms of UIFont? That is, is my method above wrong, and if I measured some other way, would I see that some aspect of the font is 163 pixels at 72 points? Or is the point simply defined relative to something else?

+44
ios objective-c fonts uikit uifont
5 answers

A font has an internal coordinate system, thought of as a unit square, within which a glyph's vector coordinates are specified at whatever arbitrary size encloses all the glyphs in the font, plus or minus whatever margin the font designer chooses.

At 72.0 points, the font's unit square is one inch. Glyph x of font y has an arbitrary size relative to this square inch. In this way, a font designer can make a font that appears large or small relative to other fonts. This is part of the font's character.

So drawing an "A" at 72 points tells you that it will be twice as tall as an "A" drawn at 36 points in the same font, and absolutely nothing about the actual size of the rendered bitmap.

i.e. for a given font, the only way to determine the relationship between point size and pixels is to measure it.
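To illustrate how arbitrary that relationship is per font, here is a hedged sketch that prints a few of UIFont's standard metric properties for two fonts at the same point size; the second font name is just an example and may not exist on every iOS version, hence the fallback:

```objc
#import <UIKit/UIKit.h>

// Compare how two fonts fill their em square at the same point size.
static void compareFontMetrics(CGFloat pointSize) {
    UIFont *system = [UIFont systemFontOfSize:pointSize];
    UIFont *times  = [UIFont fontWithName:@"TimesNewRomanPSMT" size:pointSize] ?: system;
    NSArray *fonts = @[ system, times ];
    for (UIFont *font in fonts) {
        NSLog(@"%@ @ %.1fpt: ascender %.1f, descender %.1f, capHeight %.1f, lineHeight %.1f",
              font.fontName, pointSize,
              font.ascender, font.descender, font.capHeight, font.lineHeight);
    }
}
```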

+10

I'm not sure how -[NSString sizeWithFont:] measures height. Does it use the line height, or the distance between the extreme points of the beziers? What text did you use?

I believe -[UIFont lineHeight] would be a better way to measure the height.

Edit: Also note that none of the measurement methods return a size in pixels. They return sizes in points. You must multiply the result by [UIScreen mainScreen].scale to get pixels.
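A small sketch of that conversion, assuming you want the system font's line height in device pixels:

```objc
#import <UIKit/UIKit.h>

// lineHeight is in points (logical units); multiply by the screen scale
// to get device pixels.
static CGFloat lineHeightInPixels(CGFloat pointSize) {
    UIFont *font = [UIFont systemFontOfSize:pointSize];
    CGFloat scale = [UIScreen mainScreen].scale;   // 1.0, 2.0, or 3.0
    return font.lineHeight * scale;
}
```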

Note the difference between the typographic points used when constructing the font and the points of iOS's default logical coordinate space. Unfortunately, this difference is not explained very clearly in the documentation.

+5

At first I wondered whether this was related to the way CSS pixels are defined at 96 per inch, while UI layout points are defined at 72 per inch. (Where, of course, an "inch" has nothing to do with a physical inch.) Why would web standards have anything to do with UIKit? Well, if you inspect stack traces in the debugger or in crash reports, you may notice that there is WebKit code underlying a lot of UIKit, even when you're not using UIWebView. Actually, though, the explanation is simpler.

First, font size is measured from the lowest descender to the highest ascender in regular Latin text — for example, from the bottom of "j" to the top of "k", or, for the convenience of measuring a single character, the height of "ƒ". (That's U+0192 "LATIN SMALL LETTER F WITH HOOK", easily typed with Option-F on a US Mac keyboard. People used it to abbreviate "folder" back in the day.) You'll notice that, measured by this metric, the height in pixels (on a 1x display) matches the specified font size — for example, with [UIFont systemFontOfSize:14], "ƒ" will be 14 pixels tall. (Measuring a capital "A" covers only an arbitrary portion of the space measured by the font size. That portion can change at smaller font sizes: when rendering the font's vectors to pixels, hinting tweaks the results to produce crisper onscreen text.)

However, fonts contain all kinds of glyphs that don't fit into the space defined by this metric. There are letters with diacritics above the ascender height in Eastern European languages, and all kinds of punctuation marks and special characters that extend into a much larger "layout box". (See the "Math Symbols" section of Mac OS X's Special Characters window for plenty of examples.)

In the CGSize returned by -[NSString sizeWithFont:], the width accounts for the specific characters in the string, but the height reflects only the number of lines. Line height is a metric specified by the font, tied to that layout box so that it encompasses the font's largest glyphs.
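As a quick sanity check of that last point, here is a hedged sketch using the same deprecated sizeWithFont: API as the question: the widths vary with the characters you pass, but the single-line heights should all come out the same, tracking the font's line height rather than the glyph outlines:

```objc
#import <UIKit/UIKit.h>

// The returned height tracks the line, not the individual glyph outlines.
static void showLineBasedHeight(CGFloat pointSize) {
    UIFont *font = [UIFont systemFontOfSize:pointSize];
    for (NSString *sample in @[ @"A", @"j", @".", @"\u0192" ]) {
        CGSize size = [sample sizeWithFont:font];
        NSLog(@"'%@' -> width %.1f, height %.1f (lineHeight %.1f)",
              sample, size.width, size.height, font.lineHeight);
    }
}
```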

+3

I agree that this is very confusing. I'll try to give you some basic explanation here to make things clearer.

First, DPI (dots per inch) comes from printing on physical paper, and so do fonts. The point was invented to describe the physical printed size of text, simply because an inch is too large a unit for normal text sizes. People settled on the point, a length of 1/72 of an inch (it evolved through history), as a convenient way to describe text size. So yes, if you write a document in Word or other word-processing software for printing, you will get one-inch-tall text if you use a 72pt font.

Second, the theoretical height of text is usually different from the rendered strokes you can see with your own eyes. The original idea of text height came from the physical type blocks used for printing: all letters are engraved on blocks of the same height, and that block height corresponds to the font's point size. However, depending on the letters and the font design, the actual visible part of the text may be slightly shorter than this theoretical height. Helvetica Neue is actually quite standard: if you measure from the top of the letter "k" to the bottom of the letter "p", it matches the font's point size.

Third, computer displays fake the DPI, and with it the definition of a point. The resolution of a computer display is described in its own pixels, such as 1024×768 or 1920×1080. Software doesn't really care about the physical dimensions of your monitor, because everything would look very blurry if it scaled screen content the way it does for paper — the physical resolution simply isn't high enough to make everything smooth and legible. Instead, software takes a very simple, dead-reckoning approach: a fixed DPI for whatever monitor you use. For Windows it's 96 DPI; for Mac it's 72 DPI. No matter how many physical pixels per inch your monitor actually has, the software simply ignores it. When the operating system renders text at 72pt, it will always be 96 pixels tall on Windows and 72 pixels tall on Mac. (That's why Microsoft Word documents always look smaller on a Mac, and you usually need to zoom to 125%.)

Finally, on iOS it's very similar. Whether it's an iPhone, iPod touch, iPad, or Apple Watch, iOS uses a fixed 72 DPI for non-Retina screens, 144 DPI for @2x Retina screens, and 216 DPI for the @3x Retina screen of the iPhone 6 Plus.
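To make the arithmetic concrete, here is a hedged sketch of the points-to-pixels conversion described above. The physical PPI is a hard-coded assumption (163 for an original iPhone-class screen, scaled for Retina), since iOS has no public API that exposes it:

```objc
#import <UIKit/UIKit.h>

// Logical points -> device pixels -> (approximate) physical inches.
static void describeTextSize(CGFloat pointSize) {
    CGFloat scale = [UIScreen mainScreen].scale;   // 1x, 2x, or 3x
    CGFloat pixels = pointSize * scale;            // e.g. 72pt -> 144px at @2x
    CGFloat assumedPhysicalPPI = 163.0 * scale;    // assumption, not an API value
    NSLog(@"%.0fpt ≈ %.0f px ≈ %.2f physical inches",
          pointSize, pixels, pixels / assumedPhysicalPPI);
}
```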

Forget about the real inch. It exists only for actual printing, not for display. For software that draws text on screen, it is simply an artificial ratio to physical pixels.

+2

The truth, as far as I can tell, is that UIFont lies. All of UIKit takes liberties with fonts. If you want the truth, you need to use CoreText, but in many cases it will be slower! (So in the case of your pixel-height table, I think it's applying something like a + bx, where x is the point size.)

So why does it do this? Speed! UIKit rounds things off and spaces glyphs at fixed intervals so that it can cache bitmaps. Or at least that was my take on it!
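If you do drop down to Core Text, a minimal sketch of asking for a glyph's actual bounding rect (rather than UIKit's line-based sizes); the calls are standard Core Text functions, and the choice of font and character is just an example:

```objc
#import <Foundation/Foundation.h>
#import <CoreText/CoreText.h>

// Get the real bounding rect of a single glyph from Core Text.
// The rect is in points, relative to the glyph's origin on the baseline.
static CGRect glyphBounds(NSString *fontName, CGFloat pointSize, unichar character) {
    CTFontRef font = CTFontCreateWithName((__bridge CFStringRef)fontName, pointSize, NULL);
    CGGlyph glyph = 0;
    CGRect bounds = CGRectZero;
    if (CTFontGetGlyphsForCharacters(font, &character, &glyph, 1)) {
        CTFontGetBoundingRectsForGlyphs(font, kCTFontOrientationDefault,
                                        &glyph, &bounds, 1);
    }
    CFRelease(font);
    return bounds;
}
```

Calling this with, say, @"Helvetica" at 72pt and the "ƒ" character (0x0192) should give a rect whose height is close to the point size, in line with the earlier answer's observation about ascender-to-descender extents.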

0


