I feel like this is one (of many) critical realizations you pick up when you start to learn mathematics more seriously. Mathematicians take great care to distinguish representation from object, often because much of their machinery arises out of proving that a representation is (sufficiently) faithful to the genuine object and thus useful, or because so much generality comes from finding new ways to dispense with burdensome representations.
An example people likely have experience with is linear algebra. Honestly, LA is about linear mappings between vector spaces, but we regularly spend our time working with matrices. Matrices are just representations of those mappings, available for a certain (highly useful!) subset of possible linear mappings: the finite-dimensional ones. Matrices enable great algorithms and give insight into properties of linear maps. Remembering that matrices are just representations lets us port knowledge to, say, infinite-dimensional vector spaces where the representations no longer quite make sense.
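To make that concrete, here is a toy example (my own illustration, nothing deeper): the coordinate-swap map on $\mathbb{R}^2$ is a single object with two different matrix representations, depending on the basis you pick:

$$T(x, y) = (y, x), \qquad [T]_{\{e_1, e_2\}} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad [T]_{\{(1,1),\,(1,-1)\}} = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$

The second basis (the eigenvectors of $T$) makes the structure of the map obvious, a reflection across the line $y = x$, while the map itself never changed.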
Question: why is the binary representation “the true data” and not yet another representation in a (probably infinite?) set of possible representations? There are lots of neat properties afforded by representing an IP address as a 32-bit integer, for example, but those don’t seem like they elevate it beyond the category of “representations”.
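For instance, here is a minimal sketch of the kind of property I mean (the ip_to_u32 helper is made up for illustration): once the address is a 32-bit integer, subnet membership becomes a single mask-and-compare.

```python
def ip_to_u32(dotted: str) -> int:
    # Pack a dotted-quad IPv4 string into one 32-bit integer.
    a, b, c, d = (int(part) for part in dotted.split("."))
    return (a << 24) | (b << 16) | (c << 8) | d

addr    = ip_to_u32("192.168.1.17")
network = ip_to_u32("192.168.1.0")
mask    = ip_to_u32("255.255.255.0")  # a /24 netmask

# One AND and one comparison, instead of any string manipulation.
print((addr & mask) == network)  # True
```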
It is a representation, but it is also the way the computer works with it. It’s kind of the same reason we work in decimal when we have to do math manually: it’s the way we are trained to think, so that’s the way we are used to doing the math. You could train yourself to do math in hexadecimal, ternary, base64 or whatever other base system, and it would just be another representation and completely the same. The only difference between us using a new mental model and a computer using the binary system is that, as far as I know, only the computer’s binary model maps 1-to-1 to the physical representation of the math.
Binary is literally the way the computer thinks, which is why it’s useful as a base for understanding what’s going on under the hood.
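As a quick sketch of the “just another representation” point (the value is arbitrary; it happens to be 192.168.1.1 packed into 32 bits):

```python
n = 3232235777   # 192.168.1.1 as a 32-bit integer

# The same value rendered in three bases; none of the renderings
# is any more "the number" than the others.
print(bin(n))    # 0b11000000101010000000000100000001
print(hex(n))    # 0xc0a80101
print(n)         # 3232235777
```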
I think you’re fixating on the “binary” aspect when I meant to highlight the “numerical” aspect. Why is number-as-binary truer than text-as-binary? Everything that runs on computers is implicitly “-as-binary”, so I suppose we can drop that suffix. Why is the numerical representation (and specifically the fixed-width numerical representation) the “true data”? I understand that there are useful properties, but the author seems to be driving at a more categorical distinction. And I guess perhaps a deeper mathematical/philosophical question might be “what, if anything, does ‘true data’ mean?”.
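To make the number-vs-text contrast concrete, a toy sketch (the bytes are arbitrary): the same four bytes read once as a fixed-width integer and once as text.

```python
import struct

raw = bytes([0x31, 0x32, 0x33, 0x34])  # four arbitrary bytes

as_int  = struct.unpack(">I", raw)[0]  # big-endian u32: 825373492
as_text = raw.decode("ascii")          # the string "1234"

print(as_int, as_text)  # same bits, two readings
```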
No, the binary representation is still just a representation of the data. One can represent an IP address as radio frequencies, flag signals, ternary voltages, etc. It’s still encoding the same data, just in different ways. There is no “true” representation, though there can be a canonical one.
This is a really interesting article, but I really struggled to read it because the all-caps hurts my head (I’m not sure whether that’s a visual thing or because I’m internally shouting everything).
@hauleth, I’ve enjoyed past things you’ve written and enjoyed this, but what’s the rationale behind the font decision?
Edit: I’ve just looked at the CSS and I’ve set a style to disable the capitalisation, but I’m still curious as to why you do it in the first place?
I’m not seeing capitals on my end.
I am. I’m using Firefox on Linux, if it helps. I also have web fonts disabled.
I use it everywhere, so I just got used to it; mostly that is the reason. And as I am a Safari user I just use reader mode whenever I need to, so that is again a non-issue for me.
Also, it seems that there is something really specific in your setup, as I haven’t spotted anyone else saying that it is all caps.
In that case I apologise. I’ve just checked on an Android device and don’t get all caps.
I also just downloaded Chromium (I usually use Firefox), and the problem persists, so I’m presuming that it’s an issue with the font configuration on my end. That said, I haven’t done anything to my font configurations and it’s a fairly recent Ubuntu installation, so I don’t know what’s going on. Here’s what I’m seeing.
The issue seems to stem from your font-feature-settings line: removing "case" from the list puts the font back to normal for me. According to CSS-Tricks that value tells the font to use case-sensitive forms, though what that means is beyond me.

Anyway, I’ve got a user style in now so I can enjoy what you’ve written in peace :) thanks for the interesting article.