1. 6

    It really annoys me that the StackOverflow answer from 2017 that was linked in the blog post (which just started getting traffic) was just edited: https://stackoverflow.com/posts/45645766/revisions

    I feel like the edit was just a vanity edit by somebody who could, but it really annoys me because that’s not what the person answering it wrote. Does anybody else get bothered by unnecessary edits on StackOverflow answers?

    1. 2

      Please don’t choose a default typeface for my reset. Thank you!

      1. 2

        This is just changing the default initial value from serif to sans-serif — I end up making more sans-serif websites than serif ones so it’s a sensible default.

        When you use this snippet you’re able to change that value to whatever font-family you like ^_^
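
        For example, overriding it is just one declaration (a sketch; the html selector is an assumption about where your copy of the reset sets it):

        /* after including the reset, set whatever family you prefer */
        html {
          font-family: Georgia, serif;
        }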

      1. 5

        Would be nice to see a before and after…

          1. 2

            Interestingly, Preset omits all the Normalize fixes for MS Edge, which seems surprising to me.

            1. 1

              Ah, I thought the parent meant a before/after comparison of old resets versus this new reset, not a before/after comparison of the reset applied to a page.

            2. 1

              Sure! Here’s a before/after: https://imgur.com/a/QjZl2

              And they’re online at:

            1. 2

              Everybody says it’s fast, and pages certainly do appear to get rendered faster, but CSS variables and JS events run so slowly now that things I have built that used to be silky-smooth animations in older FF (and every other browser) now show up as slideshows in FF Quantum. What a total piece of garbage!

              Here’s a video I recorded a few moments ago of the same page in the new FF and in regular Safari - for the record, it used to look like Safari’s example in pre-Quantum Firefox: http://staticresource.com/new-ff.mp4

              It’s running so slowly that I’m probably going to get bug reports for my software because of how badly the new FF sucks, and what can I tell them? Downgrading would fix performance, but is never recommended for security.

              1. 11

                Interesting. What machine is this running on? I tried it on mine and couldn’t replicate the effect. Runs smoothly.

                1. 1
                  • MacBook Air (13-inch, Mid 2011)
                  • Proc: 1.7 GHz Intel Core i5
                  • Mem: 4 GB 1333 MHz DDR3
                2. 3

                  Looks great here on macOS, I can’t reproduce your issue either. As @skade said, any information on the machine itself would be appreciated, happy to try and help diagnose.

                  1. 3

                    I get the same thing on a 3 GHz i7 Mac mini (Macmini7,1). Not sure how that particular demo performed in pre-Quantum Firefox, though.

                    1. 4

                      Regardless of performance pre-Quantum, you can help us investigate if you swing by on IRC at irc.mozilla.org #developers and ask for help profiling a specific slow website. Folks will show you around if you’re a bit patient with us.

                      N.B.: most devs are in Pacific Time (UTC-8, I think).

                      1. 2

                        Also, couldn’t find anybody in #developers on irc.mozilla.org - are you sure that’s where you meant to send me?

                        [image: IRC channel list]

                        1. 4

                          The channel does not show up on that list, probably due to a channel mode (+s).

                          It’s #developers, it’s there and there’s around 600 people in right now.

                          1. 3

                            For those wondering if anything good ever came out of this thread… innovati dropped by and we ended up with at least a couple of useful bug reports & test cases, and devs are looking into it :-) So thanks a lot.

                            https://bugzilla.mozilla.org/show_bug.cgi?id=1417970

                            https://bugzilla.mozilla.org/show_bug.cgi?id=1417991

                            1. 3

                              Thanks for your help DuClare! Hopefully (if they have time, and if they find the bugs) they can fix these and be on par with previous releases of their own browser, and also join other browsers in their performance ^_^

                      2. 3

                        It ran as smooth as could be previously. None of the demos I’m talking about are particularly computationally expensive, and they run just fine on low-end hardware, but it feels like new FF is throttled or something. It’s not just jumpy, it’s a downright slideshow.

                      3. 1

                        Compare this page in new FF to any other browser: https://s.codepen.io/tomhodgins/debug/egWjBb

                        Should this little amount of JS turn a modern browser into an absolute slideshow?

                        // read the pointer position from the mouse or touch event
                        function update(e){
                          var x = e.clientX || e.touches[0].clientX
                          var y = e.clientY || e.touches[0].clientY
                        
                          // expose the position to CSS as custom properties on <html>
                          document.documentElement.style.setProperty('--cursorX', x + 'px')
                          document.documentElement.style.setProperty('--cursorY', y + 'px')
                        }
                        
                        document.addEventListener('mousemove', update)
                        document.addEventListener('touchmove', update)
                        

                        It updates 2 variables on mousemove and touchmove - that’s it! There’s literally nothing else going on in this demo :/ It worked fine in FF until this new version.

                        1. 3

                          This is working very smoothly for me. Running Firefox 57 on Arch Linux.

                          1. 1

                            Thanks for having a peek :3

                          2. 2

                            What version did you run previously? 56 or something older?

                            I’m being told there’s event throttling in nightly which should reduce excess updates and thus improve performance. This doesn’t explain why performance should have regressed in 57. I honestly can’t see much of a difference between 56, 57, and nightly.
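
                            To illustrate what that kind of throttling buys, a page can approximate it itself by coalescing events into one write per frame (a sketch, not the demo’s actual code):

                            // Sketch: remember the latest pointer position, but only write the
                            // CSS variables once per animation frame instead of on every event
                            var pendingX, pendingY, scheduled = false

                            function update(e) {
                              pendingX = e.clientX || e.touches[0].clientX
                              pendingY = e.clientY || e.touches[0].clientY
                              if (!scheduled) {
                                scheduled = true
                                requestAnimationFrame(function () {
                                  document.documentElement.style.setProperty('--cursorX', pendingX + 'px')
                                  document.documentElement.style.setProperty('--cursorY', pendingY + 'px')
                                  scheduled = false
                                })
                              }
                            }

                            document.addEventListener('mousemove', update)
                            document.addEventListener('touchmove', update)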

                        2. 2

                          I can see a similar-ish slowdown of the animation on my version of FF (58.0b1 (64-bit) dev edition on Windows 8.1). Granted, I’m running on an older bit of hardware (a 4-core i5-2400 running up to 3.1 GHz, 8 GB of memory). Chrome on the same machine runs the animation smoothly.

                          Overall, on older hardware (2010-12 era CPUs), FF 57+ seems to have issues where a heavy page can contend with rendering on other pages, at least that’s my guess. However, I don’t have a stable repro at the moment.

                        1. 1

                          “Container queries” are a tiny subset of all possible “element queries”, which themselves are a subset of all possible “scoped styles”

                          1. 3

                            I believe what you’re looking for is called XPath. CSS was designed for styling documents; XPath was designed for traversing XML trees. Now, if only XPath selectors could be used from inside CSS…

                            1. 2

                              Oh I have got you covered! I use XPath in CSS to select elements and have made two plugins for doing this, depending on whether you’d rather do it from CSS or from JS:

                              [xpath="//*"] {
                                background: lime;
                              }
                              
                              <style process=auto>
                                ${xpath('//*', `
                                  background: lime;
                                `)}
                              </style>
                              

                              (to be honest, I have ~20 plugins that I use to do everything in the wishlist :D)

                              1. 2

                                Oh wow, you came to this thread prepared!

                                CSSplus is a family of CSS reprocessing plugins that give you more power when writing CSS. It’s called “CSSplus” because it’s CSS plus JavaScript, the method most of these plugins use to add power to CSS is by having JavaScript dynamically update CSS variables with information about the browser and elements.

                                Okay, so from what I understand, you’re looking for this stuff to be built into CSS instead of a dynamic plugin. Is that right?

                                1. 2

                                  Yeah, my ‘CSS Wishlist’ is the set of ideas I work with all the time via CSS + JS, written up as what they might look like if they were specced as part of CSS.

                                  I’ve got a CSS reprocessor feature comparison chart that tracks which of 20 plugins does which responsive technique. I’m writing up all of these techniques, and I made this little wishlist as an aside while taking a break from writing, because I had never put all of the ‘ideas’ together in a fake CSS-like syntax in one document before :D

                                  If any of these ideas piqued your interest - they’re all already usable today, just not via the fake CSS syntax in my wishlist!

                            1. 2

                              First off, beautiful site. Love how the gradient continues through the items!

                              Some quick comments about the wishes, which all seem pretty sensible:

                              • the :attribute(partial-name-*) request is pretty completely covered both by [attr*=value] that matches attributes containing value and other related selectors like [attr|=value] (see MDN docs) EDIT: I cede this point to innovati!
                              • :parent is covered by :has, but has not been implemented in any browsers…probably because of the potentially enormous performance hits.
                              1. 3

                                You can match attribute values as strings, but not attribute names themselves. Suppose you have <div data-example1> and <div data-example2>: how can you write a selector that matches both data-example1 and data-example2 without listing out every possibility like [data-example1], [data-example2], etc.?

                                Consider these examples:

                                But this isn’t something CSS is capable of doing presently.

                                Also, I think selecting an element’s direct parent (like parentNode in JavaScript) is a lot simpler than finding the nearest ancestor matching a selector (like closest() in JavaScript), or than selecting all parents of an element that match a CSS selector (like :has() would, if it existed).
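
                                For what it’s worth, here’s a rough JS sketch of that kind of attribute-name matching (the helper name is made up), since CSS selectors can’t express it:

                                // find elements that have any attribute whose *name* starts with a prefix,
                                // e.g. both <div data-example1> and <div data-example2>
                                function byAttributeNamePrefix(prefix) {
                                  return [...document.querySelectorAll('*')].filter(el =>
                                    [...el.attributes].some(attr => attr.name.startsWith(prefix))
                                  )
                                }

                                byAttributeNamePrefix('data-example').forEach(el => {
                                  el.style.background = 'lime'
                                })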

                                1. 2

                                  Love how the gradient continues through the items!

                                  While cool, that effect doesn’t seem to work if I block javascript from cdnjs.cloudflare.com so I’m assuming it’s not pure CSS.

                                  1. 2

                                    Indeed not - it’s using EQCSS via JS to style elements using the knowledge of their index within the total number of tags like it in the document:

                                    @element code {
                                      $this {
                                        background: hsl(eval("([].indexOf.call(document.querySelectorAll('code'), $it) * 5) + 150"), 100%, 90%);
                                      }
                                    }
                                    

                                    This means that each <code> tag gets an HSL() value for its background that uses [].indexOf.call(document.querySelectorAll('code'), $it) * 5 in JavaScript to get the current index of that <code> tag :D

                                1. 5

                                  The full-page link, for mobile: https://codepen.io/tomhodgins/full/oeKOOO/. This should be the actual link – the markup powering this article is irrelevant.

                                  1. 3
                                  1. 1

                                    The source code is here, it’s 25 lines of vanilla JS: https://github.com/tomhodgins/reprocss/blob/master/mixins/closest-selector.js

                                    1. 1

                                      Seems like Qutebrowser can’t even open this page.

                                      Edit: I was mostly wondering why it couldn’t. Is it because of advanced CSS? (I would have thought it would just degrade gracefully.) Or does it have some weird JS?

                                      1. 1

                                        Ah sorry! It could be due to the CSS Variables (CSS custom properties) like var(--something), but the JS plugins are also all written in ES6, so unless that browser has support for things like the JS arrow function notation and template literals you would also need to transpile the plugin down to ES5 before it would run.

                                        I’m not familiar enough with that browser to know which rendering engine it uses or where its CSS and JS support are at :/

                                      1. 2

                                        Very interesting. For the curious, this seems to make use of naturalWidth and naturalHeight which are properties of the img element and represent the actual width and height of the image (source: https://developer.mozilla.org/en-US/docs/Web/API/HTMLImageElement). The 96/300 bit is there, I presume, because the images he chose were saved as 96dpi. I’m guessing you’d want to change this if the dpi was actually different.
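
                                        If I’m reading the demo right, the core of the trick is roughly this (a sketch, not the demo’s exact code):

                                        // sketch: once images have loaded, cap their on-screen width so that at
                                        // 96 CSS px per inch they print at roughly 300 image pixels per inch
                                        window.addEventListener('load', function () {
                                          document.querySelectorAll('img').forEach(function (img) {
                                            img.style.maxWidth = (img.naturalWidth * 96 / 300) + 'px'
                                          })
                                        })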

                                        1. 3

                                          But if you know the dpi upfront surely you also know the image size?

                                          I don’t understand the advantage in that case.

                                          1. 1

                                            Images don’t really have a DPI; they are just a raster of pixels x wide and y tall. This demo is designed as something to show people who want to know how to ensure the images they’ve embedded in HTML can print at 300dpi. I had a handful of people ask a very similar question in the span of one week, so I made a demo to have ready for the next person who asks :D

                                            1. 2

                                              Most image formats have EXIF or similar metadata - much more than just a raster.

                                              It’s common IIRC to include ‘resolution’ which browsers could (no idea if they do) use when printing.

                                          2. 2

                                            dpi != ppi: dpi is dots per inch and means printed resolution of ink dots per inch of paper. It has no immediate relevance for the display of an image on a screen, which depends on pixel per inch, or ppi, of the LCD and the software doing the displaying. So, you can save the dpi information in the file for printing, but it will not be used by browsers. The hack presented here “fixes” that by instructing the browser to scale the image down.

                                            All modern browsers use a 96 ppi resolution as the basis for calculating widths: make an element have a width of 1in, it will be displayed as 96 logical CSS pixels. The 96/300 is there to display the image with a density of 300 ppi. So, if I have an image that has a natural width of 300 px, it will always be as wide as an element with width: 1in.

                                            Personally, I think one should instead use the <picture> element to let the browser choose an appropriate image to download. This way no bandwidth is wasted transferring huge images that are scaled down to a third of their width anyway.
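
                                            Something along these lines (the filenames are made up) lets the browser pick the smallest adequate candidate:

                                            <picture>
                                              <!-- hypothetical files at 1x, 2x and 3x density; only one gets downloaded -->
                                              <source srcset="photo-600.jpg 1x, photo-1200.jpg 2x, photo-1800.jpg 3x">
                                              <img src="photo-600.jpg" alt="Photo">
                                            </picture>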

                                            1. 2

                                              Omg, you’re totally right about dpi vs ppi. I knew I was having a brain fart. Thanks for clearing this up.

                                          1. 2

                                            Isn’t this what the new picture and/or srcset stuff is for? Present the browser with e.g. a uri for a 1x image, a 2x image and a 3x image, and let the browser load the one it needs, based on the device it’s running on?

                                            1. 1

                                              What inspired me to make this demo was a handful of people asking how to ensure 300ppi embeds in HTML for purposes of rendering a PDF for print - they wanted to ensure that all images, regardless of size, were at a high enough resolution for print at the size they appear in the layout :D

                                            1. 3

                                              I used to fiddle with shell configs, SSH terminal colors, and all of these settings to try to cobble together an experience on each machine that now I can have instantly by installing Fish.

                                              I’ve been using Fish for a few years (daily light use) and I couldn’t be happier! It’s the shell I always wanted, but never could figure out how to configure correctly :D

                                              1. 4

                                                Do you know what font is used for code in these slides? It’s very pretty.

                                                1. 4

                                                  The font is Fira Code, a monospaced font with special ligatures for programming: https://github.com/tonsky/FiraCode

                                                  1. 3

                                                    Thanks!

                                                1. 1

                                                  You have powerful automation tools at your disposal, scripting and browser automation at your fingertips, but instead you turn yourself (even more than before) into a factory worker optimizing the ergonomics of your tools. I find this a step backwards.

                                                  1. 1

                                                    Well I’ve done both - I also have quite a bit of automation using command-line scripts, using custom-built tools, and using browser extensions sitting on top of whatever I’m looking at as well.

                                                    Customizing your input methods isn’t a step backwards, it’s just optimizing in a different direction. This custom hardware, when paired with my custom software, makes for a really comfy working setup :D

                                                  1. 2

                                                    Wow! Thanks for sharing, I’ll definitely check this out :D

                                                    1. 4

                                                      I think most of the time a unitless number is great for line-height, and likely what you want. However consider the following example (and many like it):

                                                      h1 {
                                                        font-size: 34pt;
                                                        line-height: 100vh;
                                                      }
                                                      @media (min-width: 500px) {
                                                        h1 {
                                                          font-size: 72pt;
                                                        }
                                                      }
                                                      

                                                      Using a unit-based line-height value keeps the text positioned in the same place while the font-size is free to change independently. Not always, but quite a bit more often than ‘never’ this is in fact what we want to do! Add in the tantalizing idea of element-based units (if they were ever to be added to CSS) and this advice could become very limiting, even bad advice.
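
                                                      For contrast, a unitless value is computed against the element’s own font-size, so in the example above the line-height would jump along with the breakpoint (a quick sketch of the difference):

                                                      /* unitless: computed line-height is 1.2 times the font-size,
                                                         so it grows from ~40.8pt to ~86.4pt once the media query kicks in */
                                                      h1 { line-height: 1.2; }

                                                      /* unit-based: stays pinned to the viewport height regardless of font-size */
                                                      h1 { line-height: 100vh; }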

                                                      You should use whatever units make sense for solving what you’re trying to do, and not try to tell people what they should do when solving problems you’re not looking at. There are valid reasons to use numbers with units for line-height; there’s no benefit to be gained by ignoring them and pretending they don’t exist.

                                                      This article could be improved by being named: “Hmmm. Hmmm. Hmmm… Is a Unit on Line-Height Really What You Need?” and then explaining the cases the author does understand where a unit doesn’t make sense.

                                                      1. 3

                                                        I was about to complain that there are scenarios where, if the user has a sufficiently high text-scaling setting, this would result in cropping the header.

                                                        Then I realized that the line-height you specified is 100vh. Carry on. :)

                                                      1. 4

                                                        This article has some weird information. First this media query is given as an example:

                                                        @media only screen and (-moz-min-device-pixel-ratio: 2),
                                                               only screen and (-o-min-device-pixel-ratio:2/1),
                                                               only screen and (-webkit-min-device-pixel-ratio:2),
                                                               only screen and (min-device-pixel-ratio:2),
                                                               only screen and (min-resolution:2dppx),
                                                               only screen and (min-resolution:192dpi) {
                                                        
                                                          .social-icons {
                                                            background-image:url("data:image/png;base64,...");
                                                            background-size: 165px 276px;
                                                          }
                                                        
                                                          .sprite.weather {
                                                            background-image: url("data:image/png;base64,...");
                                                            background-size: 260px 28px;
                                                          }
                                                        
                                                          .menu-icons {
                                                            background-image: url("data:image/png;base64,...");
                                                            background-size: 200px 276px;
                                                          }
                                                        
                                                        }
                                                        

                                                        The reason this query is so wrong isn’t the Base64, but all those non-standard media features. We have a resolution feature for media queries.

                                                        But that’s not why they consider this monstrosity of a query bad. Instead they say:

                                                        All users, whether on retina devices or not (heck, even users with browsers that don’t even support media queries), will be forced to download that extra 18K of CSS before their browser can even begin putting a page together for them.

                                                        Base64 encoded assets will always be downloaded, even if they’re never used. That’s wasteful at best, but when you consider that it’s waste that actually blocks rendering, it’s even worse.

                                                        Well, of course that’s what happens if you choose to inline your media queries inside your other CSS files! What about splitting the CSS that goes inside that media query into its own file, and putting the responsive condition in the media="" attribute of the <link> element linking to that CSS? Is the problem still present then?

                                                        It seems like this really isn’t a problem if you read the media queries spec: https://drafts.csswg.org/mediaqueries/#media
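
                                                        Roughly what that split could look like (the filenames are made up); as far as I know the non-matching stylesheet may still be fetched, but it isn’t treated as render-blocking:

                                                        <!-- hypothetical split: the hi-res sprite rules live in their own file -->
                                                        <link rel="stylesheet" href="main.css">
                                                        <link rel="stylesheet" href="retina-sprites.css"
                                                              media="only screen and (min-resolution: 2dppx), only screen and (min-resolution: 192dpi)">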

                                                        1. 5

                                                          I mostly agree that plain HTML is a great way to go, and letting the browser display semantic markup however it likes (ideally well, and according to user preferences) is right.

                                                          But setting max-width to 800px? That jumps out as a very bad idea.

                                                          1. 11

                                                            It’s pretty subjective, but I just picked it as a number that’s roughly in the range of what most text-oriented sites seem to use. Medium uses 700px, for example. My linked blog above uses a slightly wider 900px. BBC News uses a bit narrower 600px (not counting the right sidebar).

                                                            1. 9

                                                              I like limiting it to 52em^H 40 em. That way the printed page looks exactly like the web page. See for example: https://www.btbytes.com/notebooks/nim.html

                                                              1. 9

                                                                The BBC News article body width has had a surprising amount of thought and effort put in. The intention is to strike a balance between readability (~80 characters max per line) and aesthetics (too much whitespace on either side can look strange).

                                                                The choice of 645px has worked well for the last several years but is too narrow on high resolution displays. We’re going to be rebuilding the article pages later this year and I hope we will take the approach of using rems instead of pixels, as mentioned by @btbytes.

                                                                1. 3

                                                                  I think it’s better to specify a max-width in ems, because that naturally accommodates people with vision issues who’ve increased their font size to cope (and some older people apparently make fairly drastic font enlargements). How many ems it should be, I don’t know. I’m currently using 45 ems and the result seems okay, but I picked it more or less out of the air.

                                                                  1. 1

                                                                    Hmm, that’s a good point. I didn’t realize it’s common for people to set the font size explicitly. (I know browsers support user-specified CSS, but thought of it as basically a “strictly for programmers” feature.) I personally like a larger font size than most pages have by default, but I use the zoom function for that instead of specifying fonts in user CSS, which is also what the elderly people I know do. From some testing, specifying max-width in ems or pixels behaves the same w.r.t. zoom. But there’s no real reason not to use ems afaict, so I might switch to that if there’s an advantage.

                                                                    1. 1

                                                                      This is interesting. I just did a test in (desktop) Firefox and Chrome, and they behave differently. In Firefox as I have it set (which is to not zoom images, just text), zooming does not change a max-width set in px, so the text on your page doesn’t get wider. In Chrome, zooming does apparently increase the max-width even when it’s set in px, so the text on your page gets wider, eventually going up to a full screen width. Given the difference in behavior it seems worthwhile to set the size in em.

                                                                      1. 1

                                                                        Odd. I was also basing my comment on testing desktop Firefox and Chrome (on OSX), which with the default settings both zoom the width on my blog post for me, with seemingly identical behavior whether you specify px or em. I wonder if it’s your don’t-zoom-images setting for Firefox that turns off resizing of all pixel-specified areas' sizes, not just images per se? I don’t see an option for that in the prefs; is it one of the ones behind about:config?

                                                                2. 5

                                                                  I’ve been persuaded that some max-width is a good idea. Some number of people do browse with their browsers in full-screen mode on wide (or absurdly wide) screens, and if you don’t limit your site’s width you get really long lines of text that are hard to read. It annoys me to have huge amounts of empty space that could be filled with something useful, but so it goes.

                                                                  (I prefer to center the text in an over-wide screen rather than leave it at the left side, but that’s a taste issue. I think it looks better in the full-screen browser scenario, and it puts the text hopefully straight in front of the person, if they’ve centered themselves in front of their screen.)

                                                                  1. 2

                                                                    I like the long lines of text. I would ask site makers to please please find a way to let me have the long lines when I want them. If motherfuckingwebsite can manage to make it work then so can you.

                                                                    1. 4

                                                                      In typography, a traditional line length is 2-3 alphabets long, i.e. around 60-70 characters. This usually falls well below 700px, so web designers are already more generous than, say, magazine layout designers or newspapers, but a line length that’s too long results in reader fatigue from too much left-right motion and not enough vertical.

                                                                      The effect of that fatigue is testable and measurable, so no doubt news websites that profit from people clicking multiple stories will try not to strain the viewer’s eyes after reading their first article.

                                                                1. 13

                                                                  You can balance your additions by dropping the explicit html, head, and body tags. They’re not required, even for spec-valid HTML, as the parser adds the elements automatically.

                                                                  1. 4

                                                                    From what I found, it is best not to omit the charset declaration. Additionally, we could get rid of a few quotes. Then it could look like this (it validates with the W3C validator):

                                                                    <!DOCTYPE html>
                                                                    <meta name=viewport content="width=device-width,initial-scale=1">
                                                                    <meta charset=utf-8>
                                                                    <style type=text/css>body { max-width: 800px; margin: auto; }</style>
                                                                    <title>Mark's Internet Presence</title>
                                                                    <h1>Mark's Internet Presence</h1>
                                                                    <p>
                                                                    At Mark's Internet Presence, we leverage HTML5 technologies to...
                                                                    <p>
                                                                    Next paragraph.
                                                                    

                                                                    title could be the first element after DOCTYPE, but for me it looks good just before h1 element.

                                                                    There could be <html lang=en> (without closing tag) added at front.

                                                                    I like it! For some time I’ve been planning a rehash of my website and was thinking about writing HTML directly, as it doesn’t need any processing. I was also toying with using a very simple SSG. It generates fully static pages, but with a bit more pleasant CSS (although certainly longer).

                                                                    In a way, writing a fully static page directly reminds me of printing: just as an old book or newspaper issue looks the same as when it was printed, every article can have its own character, and that is nice.

                                                                    1. 3

                                                                      title could be the first element after DOCTYPE, but for me it looks good just before h1 element.

                                                                      You always want to put the title after the charset declaration, or else you won’t be able to use special characters in your title ;)

                                                                      1. 2

                                                                        That is false. Once the charset is sniffed from the document itself, the entire page up to that point is re-parsed with the new charset.

                                                                        1. 7

                                                                          As a nitpick, the browser only has to reparse if it finds the charset early in the document, specifically:

                                                                          The element containing the character encoding declaration must be serialized completely within the first 1024 bytes of the document.

                                                                          Seems to occasionally come up in the wild, with people on StackOverflow confused about why their charset declaration isn’t working, who turned out to have had >1024 bytes of stuff before it (e.g. a big HTML comment at the top).
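
                                                                          A sketch of that failure mode (the only thing that matters is how many bytes come before the declaration):

                                                                          <!DOCTYPE html>
                                                                          <!-- imagine ~2 KB of license text or a build banner here ... -->
                                                                          <meta charset=utf-8>  <!-- now past the first 1024 bytes, so it may be ignored -->
                                                                          <title>Declaring UTF-8 too late</title>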

                                                                    2. 2

                                                                      Wow, apparently that’s been valid for a long time, but this is the first I’m hearing of it. Intriguing. Looks like the only case where the explicit tags are really required is if you want certain elements (like a script) to be the first tag in the body, since they’d otherwise be put as the last tag in the head if you leave it implicit. But for everything else it’s unambiguous.

                                                                      1. 5

                                                                        It has been valid since before “valid HTML” was a thing — SGML allowed you (as a schema author) to specify arbitrary tags as optional so long as they could be inferred unambiguously.

                                                                        Some day we will catch up to the markup technology of the 1980s.