I’m glad eevee made this post. I wish I could file bugs against twitter’s webapp, but I suspect they have bigger problems on their hands.
I have a lot more complaints against the UI but I should probably cap this to the serious ones…
For example, checking the checksum on a credit card number, so you don’t submit a typo’d number and have to re-enter everything on the page. Also, as far as I can tell, you literally never need to click the “this is a Visa” radio button: the card type can be identified from the number itself.
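For the curious: the checksum being referred to is the Luhn check, and the network really can be inferred from the leading digits. A minimal client-side sketch, assuming made-up function names and a prefix table that covers only a few common networks:

```javascript
// Luhn checksum: starting from the rightmost digit, double every second
// digit, subtract 9 from any result above 9, and sum everything.
// A well-formed card number sums to 0 mod 10.
function luhnValid(cardNumber) {
  const digits = cardNumber.replace(/\D/g, "");
  let sum = 0;
  for (let i = 0; i < digits.length; i++) {
    let d = Number(digits[digits.length - 1 - i]);
    if (i % 2 === 1) {        // every second digit from the right
      d *= 2;
      if (d > 9) d -= 9;
    }
    sum += d;
  }
  return digits.length > 0 && sum % 10 === 0;
}

// Rough card-type detection from the leading digits. Real ranges are more
// involved (e.g. newer MasterCard prefixes); this is just the classic cases.
function cardType(cardNumber) {
  const n = cardNumber.replace(/\D/g, "");
  if (/^4/.test(n)) return "visa";
  if (/^5[1-5]/.test(n)) return "mastercard";
  if (/^3[47]/.test(n)) return "amex";
  return "unknown";
}
```

With these two checks a form can reject typos before submission and preselect the card type, so neither the radio button nor a full-page round trip is needed.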
It was also cool when Twitter let you look at threaded tweets on a timeline/userpage without having to change the URL or open up a horrid, cringey lightbox; it was fast and useful and wasn’t too onerous, unlike the new lightboxes.
I hope whoever invented those lightboxes for tweets has to use Twitter on a computer that doesn’t have the latest CPU, doesn’t have 24 GB of memory, and isn’t connected to twitter.com over a gigabit link. That is sufficient punishment.
I recently used a Gopher site.
It was - hands down - shockingly better at information presentation than, say, 80% of sites out there. And blazingly fast. I feel sad that smart and nerdy engineers built HTML/JS/CSS in the 90s. They had a chance for a better world, and chose to let the current dumpster fire start. Even HTTP has major problems.
I’m so over the Web 2.0 and whatever jibber jabber is post 2.0 at this point. My home projects uniformly don’t use HTML as a presentation layer; I generally avoid using the “web stack” at work, because of the staggering non-design that continually causes problems.
The memory usage of twitter.com brings my 2014 Chromebook to its knees. I can barely tolerate having that tab and an ssh window open at the same time; anything beyond that is not practical. It never reclaims memory from tweets that have scrolled out of sight, so I refresh it every ten minutes or so.
I can actually do quite a bit on the Chromebook, as long as I don’t try to view Twitter.
This is why I have started using Tweetdeck.
TweetDeck is (generally) better than the web version, unless your account is locked.
I used to use an RSS/Atom reader to view tweets, but Twitter removed the ability to do so about a year ago. AFAIK this is now only possible with a registered account and API key.
I have written a small tool called “tscrape” to scrape the Twitter user page and write the fields TAB-separated to stdout. It is written in C and has no external dependencies except make and a C compiler. Using a wrapper script and the standard UNIX utilities, it should be trivial to display tweets any way you like.
curl -s 'https://twitter.com/lobsters' | tscrape
If I was a better writer, I’d make a quip about “those who forget web 1.0 are doomed to reimplement it, badly” or something to that effect. It’s especially frustrating when you consider that many of those “new and improved” web experiences require so much memory and CPU power that they are completely unusable on a laptop that’s a few years old.
I think sites doing everything themselves in WebGL+Canvas because “the dom is too slow” may be the latest addition to the list of possible reasons for “why is my laptop on fire?”.
Still a bit soon to tell. ;)
One solution that preserves both the convenience of infinite scroll and the ability to jump to a given page (as well as saving your place on the page) would be to keep the page number in a URL fragment and increment it on infinite scroll. So far I’ve only seen it done in RES (Reddit Enhancement Suite).
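The idea can be sketched with a couple of small helpers (hypothetical names; the real RES implementation may differ):

```javascript
// Parse a page number out of a "#page=N" URL fragment; default to page 1.
function pageFromFragment(hash) {
  const m = /^#page=(\d+)$/.exec(hash);
  return m ? Number(m[1]) : 1;
}

// Build the fragment for a given page number.
function fragmentForPage(page) {
  return "#page=" + page;
}

// In the browser, the infinite-scroll handler would then do roughly:
//
//   const next = pageFromFragment(location.hash) + 1;
//   history.replaceState(null, "", fragmentForPage(next));
//   // ...append the next page of items to the list...
//
// Using replaceState keeps each scroll step out of the back-button history,
// while the fragment still survives reloads and copy-pasted links, so a
// reader can jump straight back to page N.
```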
I would like to back up what @lmm said and lost a few karma points for:
Note that I believe all sites should be usable as pure HTML for accessibility purposes. This is my exception to the above, and is not addressed in this article. It makes a convenient backstop too, as the aforementioned able-bodied JS-shunners – yes, who make up a tiny fraction of a percent of my users – can fire up /usr/bin/links and have the 1998 experience they were dreaming of.
It’s really disappointing to see downvotes for these opinions, no matter how much some disagree.
I’ve always been impressed at the level of discourse on this site. Trying to squelch a reasonable opinion - and it is reasonable - because it’s not shared is pretty shameful IMO. I just hope it’s a one-off and doesn’t represent a trend of where lobsters is headed.
Ironically enough I do use a custom stylesheet, to turn off the max-widths a lot of sites seem to use these days. But when I break a site by doing that I see that as my problem and I don’t complain about it.
And it sounds like the author hasn’t even stopped using twitter because of it!
This is sad, as it reflects the reality of network effects.
Twitter/FB/everyone else could degrade our experiences significantly more and we’d still feel compelled to stay, because it’d be weird to not have a YouBook account. Essentially, it feels like we’re slowly ceding control of technological matters to non-technological users.
This is not what I signed up for.
I can’t upvote this enough!
I pine for relocatable accounts. I don’t even care if they are distributed; I just want it to be easier to import/export data into/out of them. Why can’t I say, “I don’t like Facebook.com any more, I’m going to move my profile to MySpace2.com”? Ideally, everyone’s links to me in all services would then update to point to me and my old posts on my new provider. I know this is a big logistics problem, and obviously no one but Facebook has the Facebook software, for example. But I think something like this is an interesting dream. We have reasonable standards for viewing web pages and even for servers talking to each other, yet migrating accounts between services is still a sore point. And yes, I do realise that everyone running Diaspora (especially Google, Facebook et al.) is pretty much never going to happen, unfortunately.
The trap is in thinking you can just throw JS at problems and be done with it.
You should be able to, but the browser is an extremely hostile execution environment. Rather than tilt at windmills saying, “well, people shouldn’t do x and y,” accept that JS has its own set of drawbacks, and use it when it genuinely improves the experience without sacrificing functionality for the times when it fails, whether due to browser configuration or other issues beyond your control.
No page is entitled to run JS. The beauty of the web is that the user-agent can make those decisions. The fact that this makes it harder to program for is not the user’s problem. Abdicating your responsibility to progressive enhancement is laziness.
Other platforms (iOS, Android) do not have these issues. I prefer programming them because they are more conceptually sound, move slower, use far less resources, and are easier to optimize for.
The user-agent can do what it likes. But the server can decide what it’s going to support. How much engineering effort is it worth for twitter to improve the experience for the tiny minority for whom JS won’t be running correctly (particularly when it seems like the non-JS experience wasn’t bad enough to drive this person away from twitter)? I don’t think it’s twitter who are tilting at windmills here.
For me, the twitter webapp currently displays any link preview as only 1-3 characters and then an ellipsis for both the title and the body. This makes all that space much less useful than just showing the URL, which is truly hard to do and I applaud twitter for working it out.
100% agree. And it’s actually even worse with GitHub, for example.