You can do something like this in Firefox with the font-size-adjust CSS property. Basically, if you say:
font-size: 14px;
font-size-adjust: 0.5;
…then that ensures lower-case letters will be 0.5×14 = 7px high. That’s not the same as setting the size of upper-case letters, but it’s arguably more useful because we use lower-case letters more often. This is particularly useful for text that intersperses monospaced and proportional fonts (like technical documentation does), because monospaced fonts typically look smaller at the same point size.
Correct. We need to care about the x-height, not the cap height.
Where this can go wrong is fonts that have drastically unusual x-height.
Like, Verdana was such a scourge on the web for years because, thanks to its tight hinting and its unusual x-height, it for some reason looked better a few pixels smaller than any other font did. So people would set Verdana, set a smaller font size, and then everyone who didn’t have Verdana installed would see tiny ant-sized letters. It was a pretty awful time.
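For what it’s worth, font-size-adjust is precisely the fix for that Verdana failure mode. A minimal sketch (the 0.545 aspect value is approximately Verdana’s x-height ratio; treat the exact number as an assumption):
body {
  font-family: Verdana, sans-serif;
  font-size: 12px;
  font-size-adjust: 0.545; /* lower-case height ≈ 0.545 × 12px ≈ 6.5px, even when a fallback font renders */
}
That way the smaller size chosen for Verdana no longer turns into ant-sized letters on machines where the fallback kicks in.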
It can also go wrong when using one of those silly languages that use accent marks. These of course go up above the cap height on capital letters, so if you set everything up to look great with ASCII / English text, it can turn into a mess once you get some uppercase accents poking into the line above.
This is especially annoying when trying to center text vertically — to do it right, the centering has to be done by examining the optical bounds of each glyph, not just the overall font metrics.
As a typographic pedant, I’m compelled to point out that 1pt is not exactly 1/72″: the traditional printer’s point is about 1/72.27″, and the desktop-publishing point was only later rounded to exactly 1/72″. I know, what did you expect from non-metric units…
I wish this article didn’t keep talking about pixels. Dunno about you, but I haven’t seen a pixel in years — I’d have to get out a magnifying glass, since every screen I use (aside from TV sets) runs at 200ppi or higher. Pixels are an implementation detail of the display.
macOS and iOS stopped using device pixels as GUI metrics long ago. AppKit and UIKit use arbitrary units that are decoupled from hardware pixels. I think macOS initially went to 2× scale, but nowadays you can go into System Preferences and choose your scaling. I know Windows has done this for a long time too.
Um, what else… line spacing is not based on the font’s em size; it’s a metric specified in the font (the ascender, descender, and line-gap values in its hhea and OS/2 tables). And in my experience a lot of font designers choose really questionable values for it; not sure why.
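In CSS terms, line-height: normal is what picks those font metrics up; the usual workaround for a font with questionable values is to override them with a unitless multiple. A minimal sketch:
p {
  line-height: 1.4; /* multiple of the font-size; ignores the font's built-in line gap */
}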
My first thought when reading this: how reasonable would it be to just start using this in new apps?
How practical would it be to implement this on top of existing GUI libraries?
When writing a new GUI library, would it make sense to have font size always mean cap height, or would that cause too much confusion for developers who are used to the existing meaning of font size? What about translating designs that use the existing meaning of font size?
In the same vein: when making a new GUI app (like a word processor or vector graphics editor) where font size is an important option, would it be too much of a barrier to adoption to use cap height = font size?
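One way to prototype the idea without new library support: look up the font’s cap-height ratio and divide by it, so the number the designer types really is the cap height. A sketch in CSS, where the 0.7 ratio is a made-up illustrative value you’d replace per font:
:root {
  --cap-ratio: 0.7;                         /* cap height / em size for the chosen font (assumed value) */
}
h1 {
  font-size: calc(24px / var(--cap-ratio)); /* capitals ≈ 24px tall; computed em size ≈ 34.3px */
}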
“I want to see my letters 2 inches tall, I can do that by setting the font size to 144 pt. In practice, nobody does that.” As someone who does exactly that across the stack, I call bullshit. Just code-search for FT_Set_Char_Size and see the numerous toolkits that plug in the proper hdpi/vdpi for the targeted output display.
Microsoft’s own guidelines for new applications also say to do that; the 72/96 thing is legacy. https://docs.microsoft.com/en-us/windows/win32/hidpi/high-dpi-desktop-application-development-on-windows
Output density is a valuable input parameter for the font engine, both for hinting and for variant selection: https://docs.microsoft.com/en-us/typography/opentype/spec/otvaroverview
Em sizing stretches across typography and is not just bound to silly displays with sillier pixels. Nibble on this little gem, for instance: https://typographyforlawyers.com/em-sizing.html
“A 2012 case in the Michigan Supreme Court turned largely on the meaning of point size. Michigan state law requires certain ballot measures to be ‘printed in capital letters in 14-point boldfaced type’.”
Specifying the size in some virtual pixel and fixing it up with a scale factor will not give you a lower error rate across different output densities than following pt/em, and it will do nothing to solve the problem that, once you apply shaping for the script you are rendering, your cap height can be equally chewed up by the shaping rules. It’s a fractal of the Logo turtle on acid, all the way down.
I’m not so sure pixels are the right measure though, since pixels aren’t of uniform size either… like, what if you want to print the page, what then? I suppose you can apply some arbitrary multiplier.
But on the other hand pixels might be the least-bad option.
The CSS px unit is a reference to device-independent pixels (dips), which, though varying in size from one device to another, do at least account for high-pixel-density screens. Generally, it would make sense to start with px-based rules and then add @media print queries to produce printer-friendly CSS rules. It would be great to be able to use absolute units on screens as well. However, paged media is the only context in which they’re rendered consistently:
“Most of these units are more useful when used for print, rather than screen output. For example, we don’t typically use cm (centimeters) on screen. The only value that you will commonly use is px (pixels).”
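That px-first, print-override pattern tends to look something like this (values are illustrative):
body { font-size: 16px; }       /* device-independent px on screen */
@media print {
  body { font-size: 11pt; }     /* absolute units render reliably on paged media */
}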
But this isn’t about browsers, it’s about text display & editing in general. I know some editors are (IMHO stupidly) based on browser engines, but most aren’t.
I assumed from the tags on the post and the footnotes in the article (all of which link to CSS related content) that adam_d_ruppe’s comment was also with regard to CSS. Adam, if you were referring to the general case, I apologize for the confusion.
I actually did not know how exactly CSS did it. I know it did some magic for printing but not the whole dips thing… so yeah TIL lol. I just recall people used to say “omg never use px sizes” and never really updated my brain.
I disagree that pixels are the correct measure: pixels are different sizes on different devices. At the end of the day, every font is registered on an eyeball, and that size is what matters.
But I can’t imagine that specifying fonts in seconds of arc is going to take off.
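Just to put numbers on it, here’s a back-of-envelope sketch assuming a 60 cm viewing distance: a 12pt em is 1/6″ ≈ 4.23 mm, which subtends about 2 × atan(4.23 / (2 × 600)) ≈ 0.40° ≈ 24 arcminutes ≈ 1450 arcseconds. “Please render body text at 1450 arcseconds” is a hard sell.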
As a fallback, I would like fonts to be (roughly) the same size across my monitors, so I think specifying sizes in real points (i.e., in real fractions of an inch) is appropriate.
This was really neat, and a cool bite-sized history lesson too! I had no idea of the connection between em and metal typesetting.