1. 48

  2. 6

    Modern font rendering is so complicated I’m nostalgic for bitmap fonts.

    Intuitively there’s something very wrong with font formats being designed around running the fonts in a vm.

    1. 14

      > Intuitively there’s something very wrong with font formats being designed around running the fonts in a vm.

      One thing I have to remind people of from time to time is that computers are used in more languages than just English. Languages that use the Arabic alphabet (Arabic, Farsi, Urdu, etc.), for example, are extremely hard, arguably impossible, to render properly with only bitmaps. Letters have different sizes and shapes depending on their position in the word and what comes before and after, and justification is done not by adjusting the spaces between words, but by adjusting individual letter widths. Yeah, you can do this as a bitmap with no VM, but then you need a huge number of glyphs, and each program has to natively reimplement Arabic rendering logic. Using a VM for such a script is a huge boon, saving tons of disk space, code points, and duplicated logic. And while Arabic script is extreme, scripts like Hudum, Chinese, and Devanagari have similar issues.
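
      To make that concrete, here is a toy sketch of the per-position logic a shaping engine has to apply (the three-letter joining table and the init/medi/fina/isol form names are illustrative only; real shaping uses Unicode joining data and OpenType lookups):

      ```python
      # Toy Arabic-style contextual shaper. Each base letter can surface as one of
      # up to four glyphs (isolated/initial/medial/final), chosen by its neighbours.

      # Does this (made-up) letter connect to the letter that follows it?
      JOINS_FORWARD = {"beh": True, "lam": True, "alef": False}

      def shape(word):
          """Return one positional glyph name per base letter."""
          shaped = []
          for i, letter in enumerate(word):
              joins_prev = i > 0 and JOINS_FORWARD[word[i - 1]]
              joins_next = i < len(word) - 1 and JOINS_FORWARD[letter]
              if joins_prev and joins_next:
                  form = "medi"
              elif joins_prev:
                  form = "fina"
              elif joins_next:
                  form = "init"
              else:
                  form = "isol"
              shaped.append(f"{letter}.{form}")
          return shaped

      # The same base letters come out as different glyphs depending on context,
      # which is exactly what a pile of static bitmaps cannot express on its own.
      print(shape(["lam", "beh", "beh"]))  # ['lam.init', 'beh.medi', 'beh.fina']
      print(shape(["alef", "beh"]))        # ['alef.isol', 'beh.isol']
      ```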

      1. 5

        One thing I find fascinating but somewhat under-studied is that Latin script has been typeset in print for something like 600 years, and typewriters have existed for over 100. To some extent then, what Latin alphabet text looks like has been influenced by the needs of machines for a long time. Contrast with what pre-print formalized handwriting styles sometimes look like, and ponder trying to come up with bitmap fonts that look like that.

        Text is fundamentally shaped by the tools used to write it. Try writing with a fountain pen instead of a ballpoint pen and, after all the scuffs and smudges as you re-learn the process, your handwriting will come out different. Try writing with a Chinese/Japanese brush or a cuneiform stylus-in-mud and you’ll get different letter shapes again. Scripts throughout European history are often based on Roman Latin script… or rather, on the formal Latin script that people of the time had available, which was generally carved into stone.

        1. 5

          You might be interested in this very deep dive into fonts developed in the Papal States in the 15th century:

          https://ilovetypography.com/2016/04/18/the-first-roman-fonts/

          1. 2

            Whew that is WAY too deep for me, but it’s still absolutely fascinating. Thank you!

        2. 1

          On a related but maybe less important note: what bitmap fonts (if any) support emojis?

          1. 2

            GNU Unifont

            1. 1

              Thanks for the link, I did not know about this!

        3. 11

          I mean, I definitely get the same feeling, but it’s not at all clear to me that there’s a better method for encoding the ideal way to rasterize a given vector image at every possible raster size, without excessively bloating the filesize.

          If you do naive interpolation, you either get jaggies or blur. Embedding explicit hinting data for every raster size is a non-starter for web fonts.
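
          A toy illustration of that tradeoff (just a sketch; real rasterizers hint and rasterize the outline rather than scaling a bitmap): nearest-neighbour sampling keeps hard pixel edges but drops strokes unevenly, while box averaging preserves stroke weight at the cost of grey, blurry edges.

          ```python
          # Toy downscaling of a 1-bit glyph bitmap: nearest-neighbour vs. box averaging.
          # This only shows why the naive approaches give either jaggies/dropouts or blur.

          GLYPH = [  # crude 8x8 "H"
              [1, 0, 0, 0, 0, 0, 0, 1],
              [1, 0, 0, 0, 0, 0, 0, 1],
              [1, 0, 0, 0, 0, 0, 0, 1],
              [1, 1, 1, 1, 1, 1, 1, 1],
              [1, 0, 0, 0, 0, 0, 0, 1],
              [1, 0, 0, 0, 0, 0, 0, 1],
              [1, 0, 0, 0, 0, 0, 0, 1],
              [1, 0, 0, 0, 0, 0, 0, 1],
          ]

          def nearest(src, size):
              """Nearest-neighbour: stays 1-bit sharp, but strokes drop out unevenly."""
              n = len(src)
              return [[src[r * n // size][c * n // size] for c in range(size)]
                      for r in range(size)]

          def box_average(src, size):
              """Box filter: preserves stroke weight as grey coverage, i.e. blur."""
              n = len(src)
              out = []
              for r in range(size):
                  row = []
                  for c in range(size):
                      r0, r1 = r * n // size, (r + 1) * n // size
                      c0, c1 = c * n // size, (c + 1) * n // size
                      cells = [src[i][j] for i in range(r0, r1) for j in range(c0, c1)]
                      row.append(sum(cells) / len(cells))
                  out.append(row)
              return out

          for row in nearest(GLYPH, 5):
              print("".join("#" if v else "." for v in row))  # the right stem vanishes
          print()
          for row in box_average(GLYPH, 5):
              print(" ".join(f"{v:.2f}" for v in row))        # edges turn grey
          ```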

          1. 5

            Typography is a pretty complicated subject, so (no offense) your intuition doesn’t hold much water. (My creds: I was working in font digitization from 1988-90 during the “font wars”, have friends who’ve stayed in the field longer, and have had a pretty strong amateur interest ever since.)

            The virtual machine for font hinting has been around since Apple developed TrueType circa 1988-90. One of the rationales was that there were several pre-existing proprietary hinting systems (such as Adobe’s “Type 1” fonts), and they wanted those vendors to be able to migrate to TrueType, so having lots of flexibility in the hinting system was necessary. Also, a VM allows for new innovations, instead of just supporting one hinting algorithm.

            And I want to second what @gecko said — Roman-alphabet typography is dead easy compared to most other scripts. During some meetings between Apple and Sun about font technology in Java2D (circa 1998), after a lot of back and forth about ligatures, one of the Sun guys said “but really, how many people care about ligatures and contextual forms?” One of the Apple folks gave him a very sharp look and said “How about every literate person in India and Pakistan?”

          2. 4

            The day bitmap fonts stop working on computers is the day computers die.

            1. 8

              So that was about a year ago when Pango (which on Linux is used by almost everything) dropped support for them? Good to know. I’ll keep my dieded PC.

              1. 8

                As I said when I talked about this on another website, it’s complicated:

                True bitmap fonts, like those saved as .bdf files, are on their way out. Pango, for instance, recently dropped support for them.

                But “bitmap fonts” don’t have to be saved as .bdf files. The fonts mentioned above, Scientifica and Curie, can just as easily be built as .ttf or .otb files and work fine.

                So true bitmap fonts are already deprecated, but “bitmap fonts” that map to specific pixel sizes aren’t going anywhere. There’s nothing about them that ensures their doom, even as we approach ridiculously high resolutions like 8K. We can just upscale these fonts and move on with our lives.

                It’s my fault for not being more specific though. I called them “bitmap fonts” because I don’t really know what else to refer to them as. “Pixel fonts” maybe?
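
                For anyone curious, that rebuild can be scripted; a minimal sketch using FontForge’s Python bindings (the input file name is hypothetical, and the exact generate() arguments may differ between FontForge versions):

                ```python
                # Repackage a classic bitmap font (.bdf/.pcf) as an OpenType bitmap-only
                # sfnt (.otb) so Pango/HarfBuzz-era applications will still load it.
                # Requires FontForge's Python module (often packaged as python3-fontforge).
                import fontforge

                font = fontforge.open("curie-12.bdf")  # hypothetical input file
                font.generate("curie-12.otb", "otb")   # "otb" selects the bitmap-only sfnt flavour
                ```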

                1. 2

                  That’s a great explanation. I also wrote a bit about how to convert bdf/pcf to sfnt.

                2. 2

                  It was catastrophic, and it caused a lot of people (myself included) pain as we had to scramble to find ways to keep using our favored fonts, typically by switching to software that doesn’t depend on Pango.

                3. 3

                  • Bitmap fonts -> practical, consequential
                  • TTF fonts -> pretty, intentional

                  Luckily pretty stuff tends not to outlive the practical stuff.

                  I’ll also add one: the day we’re forced to use one of those terminals that feel laggy to type in (GNOME & KDE’s default terminals? It’s been a while; I only use them on other people’s computers these days) is the day computers die.

                4. 4

                  I really don’t like many of the pictures in that article. If you smooth fonts too much they look blurry to me; I really like having my pixels be well defined and my font rendering full of high-frequency content.

                  I suspect everyone has different font preferences. I grew up on Win9X and XP, so that’s the style of font rendering I work with best.

                  Windows 7 font rendering gives me a headache to look at. I discovered this when I started to work in an office with a provided (Win7) computer. There is a dissonance between everything looking slightly blurry and my brain telling me “no, it’s perfectly in focus”. I happily found you could turn font smoothing off in System Properties (which unfortunately made some things ugly as sin), and since then we have Win10, which seems to do a better job for me.

                  A few years back FreeType changed its defaults and suddenly my GTK fonts looked blurry. This made me sad until I discovered FREETYPE_PROPERTIES=truetype:interpreter-version=35. Again, it’s not perfect (some fonts render like sin) and not all applications honour it (I think Firefox recently started noticing it again?), but it makes the world a little bit happier for me.
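
                  For anyone in the same boat, the variable just needs to be in the environment before FreeType is loaded; a small sketch of forcing it for a single application (the application name is only a placeholder):

                  ```python
                  # Launch one program with FreeType's classic (v35) TrueType bytecode
                  # interpreter, without changing the setting system-wide.
                  import os
                  import subprocess

                  env = dict(os.environ, FREETYPE_PROPERTIES="truetype:interpreter-version=35")
                  subprocess.run(["gedit"], env=env)  # "gedit" is just a placeholder GTK app
                  ```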

                  1. 5

                    To each their own. What’s nice is that the settings are there for anyone to test with, whatever their visual preferences, and that’s what I wanted to show.

                    1. 2

                      Thanks for the article, by the way. Please don’t take my criticism of the content of your pictures as criticism of you :)

                      Something else that’s fun: FreeType rendering of white fonts on black backgrounds.

                    2. 3

                      Thankfully 4K screens make this go away. Even “blurry” rendering in 4K looks sharper than the sharpest Win XP font rendering on <100dpi screens.
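
                      Some rough numbers behind that (the panel sizes are just example assumptions):

                      ```python
                      # Pixel density (PPI) of a typical 4K desktop panel vs. an older sub-100dpi one.
                      from math import hypot

                      def ppi(width_px, height_px, diagonal_inches):
                          return hypot(width_px, height_px) / diagonal_inches

                      print(round(ppi(3840, 2160, 28)))  # ~157 ppi: 28" 4K monitor
                      print(round(ppi(1280, 1024, 19)))  # ~86 ppi: 19" 1280x1024 LCD
                      ```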

                      1. 5

                        I live off 2nd hand monitors. In my market (Australia) monitors are almost 2x the price they were 10 years ago :| My rightmost monitor is a 2003-era 1280x1024 LCD, my leftmost a 1920x1080 with some scratches that someone threw out.

                        If I did buy a monitor I would probably prioritise refresh rate over resolution; so 4k is not likely to be a fix for me for a very long time even if I do buy a new monitor.

                        Regardless: buying hardware to fix a software problem (font rendering) doesn’t sit well with me. If I threw 170-250 AUD at all of my individual computer problems, they would get fixed, but I would also be heavily out of pocket. I would much prefer that font rendering get better and more customisable in the software realm.

                        1. 6

                          Unfortunately, 4K screens are not a reasonable solution for me, nor for most people I know, nor for people who can’t afford a new computer or screen, nor for people who use the computers at institutions that can’t afford new screens.

                          1. 5

                            It is getting better. 4K is not unreasonably expensive any more.

                            It’s no longer a premium, just one tradeoff among resolution, color, and refresh rate. If you’re working with text, as opposed to photos or gaming, you can trade other aspects for text sharpness at the same price point.

                            And IMHO the improvement for text is way beyond what you can achieve with software. It’s not a workaround for bad rendering; it’s so much better that the old problems cease to be relevant.

                            1. 2

                              I cannot agree more. As I mentioned at the beginning of the article, 4K monitors are still luxury items for 90% of the world.

                              1. 4

                                Most of the world uses cellphones, not PCs. Pretty much every cellphone I’ve seen in years has a high-resolution screen.

                                Is US$300 a lot for a display? That’s what I paid for my 28” Dell 4K monitor last year. IMHO something you stare at for hours a day and read text from is something that’s worth putting serious money into. It’s your axe. If you want to be a good guitarist, you’re going to need to spend at least $300 on a guitar.

                                If you “prioritise refresh rate over resolution”, that’s basically saying that you prioritize playing games over coding.

                                1. 2

                                  > Is US$300 a lot for a display?

                                  Where I live that’s around a month and a half of the average salary now. And imported stuff is usually inflated in price: if it costs $300 in the USA, it’s going to sell for $500 where I live. People would rather pay $30-40 for a cheap external screen, even used.

                                  1. 1

                                    Agreed, that’s a lot.

                                    But in general, displays are so damn cheap now. Back in the ’80s or early ’90s you’d pay upwards of US$2000 for a color monitor bigger than, say, 15”. (Sorry, I’m an old fogey.)