1. 20

This page describes something I have been thinking about for quite some time: a simplified set of website building blocks. Please note that the referenced content is displayed through a proxy that presents the original page in a different form.

Many Gemini-related links have been posted here on Lobsters. I believe the Gemini project is meant to enable a different use of the internet for those who are not happy with the current state of the web. I am not aware of any other attempt to approach this issue.

I was wondering what lobsters think about the idea of promoting simplified (or older) versions of existing technologies. Instead of building everything from scratch, it is possible to promote a certain page design with certain limitations: a subset of HTML, CSS, HTTP, and basic JavaScript. All the steps to popularize it and build a community could be done exactly as with the Gemini project: provide a search engine, link users and their pages, build a browser, etc.

  1. 23

    I don’t generally think that going “backward” along the timeline is the answer. I don’t think the web was strictly “better” in the 90s. It was different, for sure, but there were a ton of things that sucked. And I don’t think the web today is strictly “worse”. There are a ton of things that suck, but there are also a lot of things that are awesome.

    For example, I want images and video. I also want JavaScript and web apps. What I don’t want is a simple blog post that has to show a loading spinner while it downloads 50MB of static assets and then autoplays a video ad. But that has nothing to do with the available technologies, people also created obnoxious, unusable websites in the 90s. And taking away JavaScript and video is throwing the baby out with the bath water.

    1. 3

      This is a good description of how I feel as well, but I’m not sure which incentives would help us get there. On the one hand, technical measures like ranking these sites higher in search results might help; on the other, the obnoxious-ads situation is perhaps a matter of finding a compensation model for online content that works for both sides.

      1. 5

        I don’t think we can expect businesses to incentivize this sort of stuff. There isn’t any money in it.

    2. 16

      This is all very arbitrary.

      HTML 4? Why? You’re missing out on more semantic tagging that came with HTML 5.

      CSS 2? Why? Is this solely so you can be compatible with IE5 or Netscape 6 like it mentions? Give me a break. You’re missing out on a ton of really helpful CSS features like variables, calculations, transitions, and media queries that improve usability.

      “Frugal to no embedded images and video”? Again, why?

      “Heavily restricted JavaScript [..] in forms”? What does that even mean? Who restricts it? How do you define what is “too much”?

      “No SSL or TLS encryption” – This is mind-boggling. What is the argument against https?

      You’re free to use all of these things if you want, but I lived in a time of these things (and before them) and I do NOT miss that type of web.

      1. 7

        It’s basically the web for retrocomputing enthusiasts who don’t care for Gopher and want the web, Netscape 4 and all.

        1. 6

          So glad this gets rid of that silly semantic web cruft from HTML5. Who needs it when you’ve got P, B and I tags? Don’t forget to reimplement all the weird bugs in IE 6, for the most realistic Ye Olde Web experience. Oh, and the BLINK tag is vital! 🙄

          1. 5

            These were my thoughts too. Similar to Gemini, this has a very “throw out the baby with the bathwater” feel to it. I would love to see someone do an actual principled redesign of technologies for a document-centric web that doesn’t give up everything we’ve gained in terms of accessibility and general convenience over the last couple of decades.

            1. 4

              You’ll have to contend with the contingent that doesn’t want TLS: half won’t want it at all, and the other half will want some bespoke new encryption system that isn’t widely implemented yet. Of those that do want TLS, half will want self-signed certificates, while the rest will make some halfhearted attempt at a “certificate authority” (or some other way to verify certificates). There’s a sizable group that wants total privacy (DON’T LOG ANY IPs! IT’S ILLEGAL AND AGAINST THE GDPR! WE WANNA BE ANONYMOUS!) and thus no way of tracking anything, server or client (to the point of having every error message be completely standardized). And there will be another sizable group that rebels against any form of extensibility, because that can lead to complexity (or tracking; see the previous group).

              Good luck. We’re all depending on you.

            2. 1

              Mostly because old computers, and by old I mean vintage, have a really hard time rendering the modern web or doing the modern cryptographic dances.

            3. 13

              ITT: Retrocomputing enthusiasts, protocol bikeshedders, and people who don’t mind modern technology but hate adtech walk into a bar. They can’t agree on anything.

              1. 11

                I sometimes think back to a sign I saw on the door to a room in an office that read ‘Nostalgia is not a strategy’.

                I’d love a simpler version of the web, but I don’t think trying to move backwards is going to work. A lot that has been built exists for good reasons and while it’s built on messy underpinnings, that doesn’t mean the reasons for existing have gone away.

                If we want to redesign, perhaps we need hard thinking about how the web, or something similar, could be built with lessons learned, not forgotten. Maybe we could think about what we would build if we had some future tech that doesn’t exist (yet)?

                Take HTTP, for example. Is the best way to browse and retrieve hypertext still to have a client application make formatted requests over a TCP based protocol when the user clicks links - with content served by ‘servers’ which are set up to listen for requests on a port?

                Could we write on our devices and have our writing sync to incorruptible, free, author-verifying (or optionally anonymous) repositories of works?

                I love some computing nostalgia, but aren’t we more likely to improve the web by redesign than by pretending the last couple of decades didn’t happen?

                1. 11

                  All that and it’s not mobile friendly.

                  1. 1

                    Looks great in reader mode on my mobile!

                    1. 4

                      Having to switch to reader mode is ironic, though.

                      1. 1

                        The search box is not available in reader mode. If it’s that hard to use one input, imagine a whole form.

                    2. 9

                      Internet Explorer 5 or Netscape 6

                      This is not a good goal. These browsers are unsafe to use these days — they lack over a decade of client-side security improvements (such as support for HttpOnly and Secure cookies) and aren’t patched. You might as well tell users to install malware directly.
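
                      For what it’s worth, the hardening those old browsers can’t honor is a one-line server-side addition these days. A sketch (cookie name and value are hypothetical):

```javascript
// Build a Set-Cookie header value with the hardening attributes that
// IE5/Netscape 6-era browsers simply don't support:
//   HttpOnly  - scripts can't read the cookie (limits XSS damage)
//   Secure    - cookie is never sent over plain HTTP
//   SameSite  - limits cross-site sends (CSRF mitigation)
function sessionCookie(value) {
  return `session=${value}; HttpOnly; Secure; SameSite=Lax; Path=/`;
}
```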

                      No SSL or TLS encryption

                      This is dangerous advice, and inconsistent with the intent of the rest of the page. TLS is easy now (unlike in the “simpler times”) and not using it is asking for ads to be injected into your pages — the thing you wanted to avoid, right?

                      1. 4

                        Out of all this, the “no encryption” is what baffles me the most. Do they just expect we’ll never use anything with any kind of secrets?

                        1. 2

                          This is probably to accommodate older browsers and hardware running old operating systems.

                          1. 4

                            That’s fair. I still don’t think it’s a worthwhile trade-off, especially since, as other people have pointed out, those older browsers are going to be chock-full of security holes, and it means any script kiddie on your wifi network can steal all your secrets.

                      2. 7

                        I know this doesn’t completely ban them, but I think a web without images and video is much worse off. I remember clearly how the web was in 1995, and the wealth of visual imagery we have now seemed like far-future fantasy then. I’d love for there to be fewer (to no) ads, lighter page weights, and more easily accessible information for more people, but I absolutely do not hanker after Times New Roman on a white background with no images; the ‘90s can keep that.

                        And little to no JS on forms, so we have to round-trip every bit of validation? No thanks.

                        The motivation is totally valid, but this just seems way over the top. Sure, let’s come up with proper standards for a progressive web, and let’s use them. The web will be better off for it. But let’s not throw the baby out with the bathwater. Apart from anything else it comes across as overly zealous and naïve, which reduces the strength of its argument.

                        1. 1

                          If all validation is done client side, then you’re opening your server up to abuse by criminals who will bypass your client-side validation. And there are plenty of standards for the web, even the “progressive” web. It’s just that everybody has a different set of features they want, and wants to exclude all else. I know this because I was involved in the Gemini community for a time, and everybody wants to extend Gemini, just in a different way.

                          1. 6

                            I don’t think the argument was that client side validation should replace server side validation, just that the addition of client side validation (where it makes sense) greatly improves the UX.
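
                            A minimal sketch of that split (the field names and rules are made up for illustration): the same check can run in the browser for instant feedback and again on the server as the actual gatekeeper.

```javascript
// A shared validation rule, usable both client- and server-side.
// Returns an object mapping field names to error messages; an empty
// object means the input passed.
function validateSignup(fields) {
  const errors = {};
  if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(fields.email || "")) {
    errors.email = "Please enter a valid email address";
  }
  if ((fields.username || "").length < 3) {
    errors.username = "Username must be at least 3 characters";
  }
  return errors;
}

// Client side: call this on input/submit events for instant feedback.
// Server side: run the SAME function on the submitted body, because
// anyone can skip the browser entirely with curl.
```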

                            1. 3

                              100%. I dig purity too but I like it served with a thick slice of pragmatism :-)

                              1. 2

                                +1, try submitting a fiddly form that only has server-side validation over a slow/unreliable network. client-side navigation and validation can make a site bearable to use in these circumstances.

                          2. 3

                            I dig this; all the benefits of Gemini without the elitism and intentional exclusivity.

                            I would bring it back to HTML 2.0 if it were up to me (coincidentally the last HTML version to have an RFC), but 4.0 is fine too.

                            The only part I take issue with is the refusal to implement TLS; while that enables access from 1990s devices, it cuts off a lot of other opportunities and gives the whole thing a somewhat reactionary feel.

                            1. 2

                              There’s a subset of people who use Gemini who hate TLS. Some want to remove it entirely; others say to replace it (with their preferred encryption-du-jour).

                            2. 3

                              People commenting here apparently don’t understand the context in which that forum message is being discussed. This is not a “let’s move the whole web backwards”; it is more like “let’s carve out a niche corner and move it back in time so our old Macs can render stuff.” There are lots of retrocomputing enthusiasts running old Macs, Ataris, Acorns, etc. out there. Many of those machines can’t handle modern cryptographic and webby workflows; having those niches adhere to older standards is better for compatibility while still allowing current-day browsers on modern machines to view the same content.

                              The web is backwards-compatible with itself, which means that by following the practices outlined in that post, you can enjoy your weekend projects on your old Quadra running System 7 and still check the same content on weekdays on your modern M1-based Mac.

                              People saying that this gives away all the advantages and security of the modern web don’t realise that it is not done in protest against those features; it is done because those machines can’t handle them. And before people say “you should be using more modern machines!”, let me just say that there is a lot of joy and fun in these old machines, and it is not up to anyone to tell other people what to spend their time on.

                              1. 1

                                This is a great summary. It feels to me that some commenters did not think about the issue, but rather got triggered by a certain topic.

                                What I don’t want is a simple blog post that has to show a loading spinner while it downloads 50MB of static assets and then autoplays a video ad.

                                I think this is another great observation.

                                I also enjoy watching videos and pictures. I do believe there is also a place for canvas, WebSocket, WebGL, etc. What I do not like is the abuse of technology at my cost. Instead of complaining about and punishing fat pages or unprofessional developers, I would much rather promote good content. Use a carrot instead of a stick.

                              2. 3

                                I understand the appeal, and certainly would like more plain HTML/CSS websites around, but this doesn’t resonate with me.

                                Part of what I like about Gemini is that it has aspects of a creative art project, quaint in its artificial limitations.

                                As well, Gemini is new, and shows that we can move forward with the lessons we’ve learned. Restricting yourself to specific outdated versions of HTML etc just strikes me as regressive.

                                1. 2

                                  There are multiple “new” Webs waiting to be weaved, but going all Amish on the current stack will uncover none of them.

                                  In the one quadrant of what is mostly unexplored space we have AR/VR. Facebook is going hard on that one for reasons best left alone, but also because they already have a new web – you just get to see tiny fragments of it. It might not look the part if you are only thinking of Facebook as that human casino of a website, but even that is the whole of the “Internet” and the “web” to a frighteningly large number of people.

                                  Their primary disadvantage is that they mostly need to operate within the current legal framework(s), and that slows them down by a lot. Some leeway is permitted, mostly with copyright infringement (see also: YouTube’s rise to fame because of piracy – I take it that the KLF would have been proud), but censoring breastfeeding, phallic visuals, or anything that might be deemed offensive to the wrong people or culture is much more difficult in glorious 3D++.

                                  VR is dull and pointless without strong freedoms of creation, interactive augmentation, alteration and sharing. Much of its potential hides in the emergent feeling of “presence” when things respond to being poked and can be viewed in their actual proportions and in their right place among other objects, curated and altered by people – not confined to a page or a Flask template. BBS 2.0 is waiting there somewhere, with a striking force much more potent than Minecraft and MMOs.

                                  Along other vectors, you have the chance to address the schizophrenic “web as a document” vs “web as an app” conflict, where the spatial glue, “links”, is quintessential yet so impotent that it can’t even tell you which of the two you’ll get. Then you won’t be able to tell which one you’ll share or what the receiver will see; where it will take them; where it came from; or even what data it wants. The icing on this already mold-infested cake is that any answer you had might change the next second. Even ‘snapshot’ and ‘store offline’ are still at a frustrating “print to PDF” kind of stage.

                                  The “web as an app” angle is even more hilarious when you try to untangle the first burst of GET /index through the lens of traditional executable formats in the PE/ELF/… sense. A good thought exercise is to walk through the process of loading and presenting a web-app in terms of entry points, linking and loading, symbol resolving, resource sections, trampolines, on-demand linking and so on. It takes a lot of hard-working and smart talent with entirely different, often conflicting and unstated, goals and visions to get something this polished yet so very, very bad.

                                  1. 1

                                    The sentiment behind the idea is a good one, though the terminology is too imprecise to be applicable, e.g. “Relatively frugal … images and video”, “Heavily restricted to no use of JavaScript”. Frugality and heaviness are hard concepts to pin down.

                                    Also as others have said, HTML 4 and CSS 2 are arbitrary technology decisions. Those technologies are neither good nor bad; merely “old”. If oldness is the goal, these technologies will indeed achieve it. But limiting web pages to these technologies certainly does not achieve “better”, “more useful”, or “more accessible” web pages.

                                    While well-intentioned, the only thing I see achieved here is a framework for oldness, and oldness for oldness’ sake is not good.

                                    1. 1

                                      While I can agree with the sentiment, I don’t agree with the old versions. HTML5, the latest CSS, and mandatory TLS should be the base. Going without JS is probably the best policy on top of that: by skipping JS you can make fast, performant pages that still use modern tech, images, video, etc.

                                      The problem is bad JS, and JS in general. Browsers have become application platforms. We need to build something that is just a viewer with basic interactions: for docs, basic checkout for shopping, etc.

                                      Especially with CSS and HTML getting more and more functionality that is often handled by JS.

                                      Maybe the only JS that should be allowed is AJAX requests, but even that isn’t necessary and should only be considered with lots of restrictions. Maybe in some htmx-style setup.
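
                                      A rough sketch of that htmx-style idea, under the assumption that the only script on a page is one generic loader (the data-get attribute and helper name are made up):

```javascript
// Hypothetical loader: the single piece of JS on the page. It looks for
// an element declaring a fragment URL, fetches server-rendered HTML,
// and swaps it in. No application logic lives in the browser at all.
// fetchImpl is injectable so the sketch can be exercised outside a browser.
async function hydrate(el, fetchImpl = fetch) {
  const url = el.getAttribute("data-get"); // e.g. <div data-get="/cart">
  if (!url) return;
  const res = await fetchImpl(url);
  el.innerHTML = await res.text();         // the server renders the HTML
}
```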

                                      1. 1

                                        This is a good base to build a community on, as is Gemini, as is indieweb. We all can bikeshed but most of us can’t build communities from scratch. So instead of any technical opinions I might have I’ll just say best of luck. We have room for multiple communities like this, and it’s all a great change from the commercial web.