1. 5

    One thing that annoys me when upgrading Racket is that I’ll suddenly get errors on my current projects because of the version mismatch between the new Racket version and whatever version generated the contents of the “compiled” folders. Reinstalling every single package when I upgrade is also a chore. I’m sure there might be an easier way, but damn, when upgrading a language I don’t expect this kind of friction.

    If there is a version mismatch between what is in compiled and the currently running runtime, then just recompile. And why can’t packages be installed in a way that multiple Racket versions can access them? Why are they siloed per version?

    1. 4
      read-compiled-linklet: version mismatch  expected: "7.9.0.3"  found: "7.8.0.6"  in: .../afile_rkt.zo
      

      Is this the error you are seeing? The 3480 issue might resolve your problem.

      https://github.com/racket/racket-lang-org/issues/110

      https://github.com/racket/racket/issues/3253

      https://github.com/racket/racket/issues/3480

      1. 1

        will check the bugs, thanks for the links.

      2. 4

        reinstalling every single package when I upgrade is a chore

        raco pkg migrate <previous-version-here> will automatically migrate your old packages to the new version.

        Re. re-compiling modules as needed, I think someone just needs to put in the time/effort to get that working. I don’t think there’s anything inherent that prevents it from working. In the meantime, you should be able to run raco setup after an upgrade to re-compile all your stale packages (assuming you install your projects as packages).
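
        For reference, the whole upgrade dance boils down to two commands. A sketch, where the version number is whatever your previous install was (check raco pkg migrate --help for the exact form on your version):

```shell
# Re-install the packages that were present under the previous version
# (replace 7.8 with the version you are upgrading from):
raco pkg migrate 7.8

# Re-compile installed packages and collections, refreshing any stale
# "compiled" directories along the way:
raco setup
```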

        1. 2

          thanks @bogdan, as usual your content saves my day.

      1. 4

        wow, this is very neat! I’m quite impressed by the progress the author made. Look at a recent post of it running on a Pi.

        1. 3

          I don’t use it for scripting, but I had a couple of scripts done with Deno. The reason for Deno in that case was that it ships as a single binary, so it was easy to move around. These days, when I need scripts I mostly use either bash or AppleScript, to be honest. AppleScript is terribly underrated.

          1. 1

            Nice post! I’ve always been tempted to write a book, but never done it, partly because I know what a time-suck it would be.

            I have to say that copy-editing has really gone down the tubes, even in commercially-published books. I usually notice grammar errors on every page of recent books, and in many cases could probably guess the author’s nationality within a few pages, especially if they’re Russian/Slavic or Indian.

            (Academic presses are the worst. I remember a multi-author anthology on P2P from Springer that cost something like $300, where it was obvious they’d done zero copy-editing and some chapters were almost unreadable.)

            1. 4

              In terms of copy-editing, you kinda get what you pay for. Indie publishers usually don’t have much money to spend on professionals, so they’ll often have no copy editing (that is my case), or maybe they’ll get some basic copy editing done. I’m fully aware that I’m part of the problem here; I just didn’t have the money to hire someone at a fair price. As someone who is not a native speaker, I’m sure that lots of my phrases sound strange, and sometimes they feel like you’re reading Portuguese instead of English.

              The book I’m working on now is a fiction book, and for that one I’m saving enough to get developmental editing, copy-editing, and cover design. The full package, right? :-)

              Regarding sloppy copy-editing and typesetting in traditional publishing, I have some opinions that ring true to me but that I have no evidence for. Many of these publishers are putting out multiple books per month. The development ecosystem, especially the Web, moves too fast, and publishers can’t produce books quickly enough to benefit from reaching the market at the right time, so in the end everything is kinda rushed. Publishing fast and getting to market reduces their production cost and increases the chance of people buying the book before the topic at hand becomes boring or deviates too much from the book’s content.

              For example, about a decade ago I worked with a major publisher on a book that ended up being cancelled for external reasons. They had the editor communicating with me almost weekly, revising and commenting on my drafts as I saved them. It felt great, and I learned a ton even though the book never saw the light of day. On a recent book with another publisher, the editing period felt rushed, and in my subjective opinion it was too short for the kind of editing my non-native-speaker text needs before it is ready. I voiced this to some beta readers, but they were fine with the content, so I guess it might just have been my own insecurities.

              In the end, it is a bit of a trade-off: you can have more books, or fewer books with better quality. You can’t really have both quality and large output unless people start paying more for books, so that authors have the funding to pay decent rates to the professionals they need. A similar problem plagues journalism as well; the fetish of speed doesn’t give journalists enough time to do deep investigation and research. It is all about breaking news and working the wire, publishing a gazillion small articles with no depth. Speed is a deceitful master.

              With all the tools and services available for indie publishing, it is no wonder that many writers focused on tech content prefer the self-publishing route over traditional publishing. I think I’m OK with having books that don’t have perfect copy-editing and typesetting if that gets more content out there. I’d be deeply sad if the only other option for these authors were simply not to publish. Maybe the real culprit behind these problems is that indie tech writers are not aware that they need to hire these professionals at all. Many don’t have any training in publishing and are doing it by the seat of their pants. Blog posts and books about it might help spread awareness of such needs.

              1. 2

                For what it’s worth, I would not have guessed you aren’t a native English speaker if you hadn’t brought it up. You write it better than many Americans I know :)

                1. 1

                  Thanks a lot, this warms my heart. I always think my English sounds kinda broken :-)

                2. 1

                  It can’t completely replace a human editor, but ProWritingAid can help clean up a manuscript before it reaches others and is not terribly expensive. It’s also able to open Scrivener files directly.

                  1. 1

                    I’m a happy customer of ProWritingAid as well :-) I really recommend it to everyone.

                3. 1

                  I’m not saying professional copy-editors are useless, but for some books you ask yourself whether the author let a single person proofread it.

                  Which kinda blows my mind, but maybe I’m the weird one for having offered to proofread several theses for friends.

                  1. 1

                    I guess that for many self-published tech authors, the first time a third party sees the book is at the publication date. Many will be writing in a vacuum, without consulting anyone regarding the book, not even their friends. I’ve been like that too. Sometimes you don’t show it to others for fear of imposing some obligation on them. You are afraid they’d rather not read or check your book, but will do it because it would look bad if they didn’t. In the end it is just insecurity; people are afraid to impose on their friends, and sometimes unaware that there are highly skilled freelancers available for those tasks.

                    Another important aspect, which plays a bigger role than people realise, is the need for speed. This is more observable in traditional publishers. They’re kinda terrified of some other publisher pushing a competing title just before their own reaches the market. I don’t feel this fear is really warranted, but hey, they are the hundred-year-old companies who know how this works; they are probably right. I do know the fear that creeps up on you when you’ve been working for a long time on some title and then competing titles start popping up on the market, and you calculate how many more will appear before you can reach it. To be honest, I don’t see other authors as competitors, we’re all in this together, but if you’re writing a book about a new fancy web framework, and some weeks before you launch eight titles on the exact same topic arrive on the market and start making a splash, you worry whether there will still be demand by the time your book arrives.

                    This fear tends to make publishers cut corners in an attempt to reach the market earlier. Good copy-editing is one of the victims of this frenzy.

                  2. 1

                    I have to say that copy-editing has really gone down the tubes

                    And so has the typesetting. Orphans and widows are the new normal in print. Why even bother to buy a printed book if its quality barely exceeds the result of printing a plaintext file?

                    1. 1

                      Unfortunately the quality of ebooks is really bad too. I’ve tried various readers, and unless you just put a PDF on an eInk display, it looks like you’re reading a Word document without the window chrome :(

                      1. 1

                        Well, ebooks in non-paged formats can’t be good. PDFs can, and I agree that many authors/publishers are neglecting it. Hell, I keep telling people to stop neglecting the basic quality-of-life features of PDFs so often that I even made a reusable guide to it. ;)

                        1. 1

                          ebooks in non-paged formats can’t be good

                          They could, if the currently-visible text were rendered into the desired rectangle with the TeX box-and-glue algorithm rather than whatever Blink/WebKit/khtml fork the industry has settled on.
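
                          The whole-paragraph idea is small enough to sketch. Below is a toy dynamic-programming line breaker that minimises squared end-of-line slack over the entire paragraph, which is the core of the box-and-glue approach (real Knuth-Plass also models stretchable/shrinkable glue, hyphenation points, and penalties, none of which appear in this sketch):

```python
def break_lines(words, width):
    """Break `words` into lines of at most `width` characters by
    minimising the total squared slack of every line except the last,
    considering the paragraph as a whole (dynamic programming) rather
    than greedily filling line by line. Assumes no single word is
    longer than `width`."""
    n = len(words)
    INF = float("inf")
    # best[i] = (cost of typesetting words[i:], index of the next break)
    best = [(INF, n)] * n + [(0.0, n)]
    for i in range(n - 1, -1, -1):
        line_len = -1  # cancels the +1 space added before the first word
        for j in range(i + 1, n + 1):
            line_len += len(words[j - 1]) + 1
            if line_len > width:
                break
            # the last line of a paragraph carries no slack penalty
            slack = 0.0 if j == n else float((width - line_len) ** 2)
            cost = slack + best[j][0]
            if cost < best[i][0]:
                best[i] = (cost, j)
    # walk the chain of optimal break points
    lines, i = [], 0
    while i < n:
        j = best[i][1]
        lines.append(" ".join(words[i:j]))
        i = j
    return lines
```

                          For the words “aaa bb cc ddddd” at width 6, a greedy breaker produces “aaa bb / cc / ddddd”, while the paragraph-at-once version prefers “aaa / bb cc / ddddd”, trading a looser first line for a much tighter second one: exactly the effect the box-and-glue model has on justified text.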

                      2. 1

                        Which is weird, since InDesign, Pages, and (IIRC) Word all have widow/orphan prevention. Not sure if it’s enabled by default, but you’d think whoever designs the stylesheets for a book publisher would know to turn it on.

                        TBH I don’t mind them that much. My typographic pet peeves are typewriter quotation marks and awful line stretching due to missing hyphenation. Oh, and Helvetica/Arial/Verdana as body text.

                    1. 11

                      If you like mazes, there’s an excellent resource: Mazes for Programmers. Usually this is the book I choose to gift to friends of mine who can write code.
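
                      For anyone curious what the book is like: its opening algorithm (“binary tree”) fits in a screenful. The following is my own Python sketch of that algorithm, not code from the book (which uses Ruby):

```python
import random

def binary_tree_maze(rows, cols, rng=None):
    """Carve a rows x cols maze with the "binary tree" algorithm: for
    each cell, link it to its north or east neighbour (picked at
    random) whenever at least one of them exists."""
    rng = rng or random.Random()
    # links[cell] holds the set of neighbouring cells it is carved open to
    links = {(r, c): set() for r in range(rows) for c in range(cols)}
    for (r, c) in links:
        neighbours = []
        if r > 0:
            neighbours.append((r - 1, c))  # north
        if c < cols - 1:
            neighbours.append((r, c + 1))  # east
        if neighbours:
            other = rng.choice(neighbours)
            links[(r, c)].add(other)
            links[other].add((r, c))
    return links

def render(links, rows, cols):
    """ASCII rendering: draw a wall everywhere except across a link."""
    out = ["+" + "---+" * cols]
    for r in range(rows):
        body, south = "|", "+"
        for c in range(cols):
            body += "   " + (" " if (r, c + 1) in links[(r, c)] else "|")
            south += ("   " if (r + 1, c) in links[(r, c)] else "---") + "+"
        out.append(body)
        out.append(south)
    return "\n".join(out)
```

                      Every maze it produces is a “perfect” maze (a spanning tree of the grid), with a strong bias of corridors running toward the north-east corner; much of the rest of the book is about algorithms with less bias.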

                      1. 4

                        That book is wonderful.

                      1. 2

                        I write technical documents in Sphinx. What would you say are the relative benefits of using Scrivener instead?

                        1. 4

                          Not OP, but I have used Scrivener for years, though for fiction. Here’s what I like about it, which may or may not actually be things you care about:

                          1. Long documents are broken up into smaller chunks (called scrivenings), with essentially arbitrary levels of nesting. You can select multiple of these and then have the main document area show all of those chunks together.
                          2. It has a powerful “compile” feature that can take your book and produce ready-to-go ebook and print formats (or Word, if you need to get the doc into someone else’s hands).
                          3. If you’re working on the kind of thing where you’d want to change the flow of the document, there’s a corkboard view that lets you move the individual scrivenings around.
                          4. You can store notes that are not part of the final doc in the same Scrivener doc, just in a separate folder within it. All neatly accessible in the UI.
                          5. Each scrivening can have its own metadata, tracking its status (first draft, second draft, final draft, etc.), or with notes specific to it.
                          6. It can sync with iOS for writing/updating on the go (I sometimes prefer to use my iPad for this).
                          7. Word count targets are nice when you’re trying to get a project done.

                          I’m sure I’m leaving a lot of features out. This is one of those tools that I think will suit some people’s brains and not others. Kind of like todo list apps.

                          1. 1

                            I never used Sphinx so I can’t comment on it, but @dangoor’s comment is spot on. I think that Scrivener is beneficial for those who are writing longer works, as they can keep research, notes, links, and the content all in one place. It feels like a companion or an assistant, always ready to help you write your book.

                          1. 2

                            Is writing/publishing your main source of income?

                            1. 3

                              Unfortunately no; I’ve treated it as a hobby up until this year. You know how people keep saying “I want to be a writer” but never take it seriously? Well, that was me. I did write six books and participated in two fiction anthologies, and it was the most fun experience for me. It is what I actually enjoy doing. Still, I’ve been a software developer for the past twenty years, and software development is still my main source of income, but I’m slowly ramping up my book publishing with the goal of eventually becoming just a writer. There is a long road ahead of me.

                            1. 1

                              Your Roguelike Development with JavaScript book sounds like a lot of fun! Congrats on the release.

                              This blog post, and others like it that I’ve seen, are what have driven me to work on a book about publishing for technical people. I’m guessing that I’m about 75% through the first draft.

                              As a self-published fiction author, I see self-published tech authors leaving a lot of opportunity on the table and making things harder on themselves because they don’t know about some of the great stuff that has been developed for indie authors. Things like Scrivener, Vellum, and Reedsy are what I’m talking about… you’ve clearly explored the indie publishing space a lot more than most. Thanks for sharing!

                              Edit to add: Oh, and I just noticed you’re using Draft2Digital’s Universal Book Links for the links in your post. You’ve definitely come across the wonderful indiepub tools out there :)

                              1. 1

                                Thanks for the kind words :-)

                                I try to keep up to date with everything that wide and indie publishers are using. I’m writing fiction as well (not yet ready to publish, though) and these tools have been invaluable to me.

                                Great idea on the book about publishing for technical people, I think there are a lot of technical people who can benefit from it. Keep pushing!

                                1. 1

                                  Thanks! Good luck with your fiction! It’s definitely a very different market.

                              1. 2

                                I’ll be keeping an eye on this thread. Feel welcome to ask me anything.

                                1. 1

                                  Hey, nice post. I skimmed through it and I liked what I read. Appreciate the various resources you mentioned.

                                  Here are a couple of possible typos:

                                  CON: You need to do everything yourself, or hire people to do it.

                                  This is mentioned under both Self and Traditional publishing. I think it is a copy-paste typo under Traditional.

                                  Leanpub used to be free but some people were not playing fair and now the service requires payment.

                                  The first 100 books/courses are free now. I use Gumroad as well, mainly because the payout terms are much better than Leanpub’s.


                                  I have a question as well. Have you used affiliate marketing? If so, any suggestions on how to go about it?

                                  1. 2

                                    Thanks for the kind words and feedback. Yes, that was a copy-and-paste mistake indeed; I’ve fixed it and also updated the comment on Leanpub to reflect the current pricing. I remember them going full commercial for a while. Glad they managed to keep a free plan floating around.

                                    Instead of Gumroad, I use PayHip (after being a SendOwl user for years). What I like about PayHip is that it can deal with EU VAT automatically, and it has some special features for books. The payout terms of all three are much better than Leanpub’s.

                                    As for affiliate marketing, I’ve never used it, but I plan to use some affiliate links in the future. It is very hard to derive all your income from being an independent author. You kinda need multiple revenue streams, and as your platform grows, such links might keep some extra money dripping into your account. I know that many people are suspicious of them, believing that the content producer will write anything to convince you to buy through the affiliate link, but in my own personal experience as a reader and consumer of many podcasts that use such links, I never felt like that.

                                    Other forms of affiliate marketing, such as guest posts, are OK with me, especially if they are exchange posts between authors. What I don’t like are paid posts presented as original content. Those are just sneaky ads to me.

                                    1. 1

                                      What I like about PayHip is that it can deal with EU VAT automatically

                                      Gumroad also deals with EU VAT automatically for digital products: https://help.gumroad.com/article/10-dealing-with-vat. Haven’t heard of PayHip/SendOwl before, so I’ll keep those in mind if I need alternatives, thanks.

                                      in my own personal experience as a reader and consumer of many podcasts that use such links, I never felt like that

                                      Good to know. And I agree about multiple revenue streams; in my experience it is difficult to bet on author income alone, as it keeps fluctuating.

                                1. 2

                                  ISPs suck here in Central London; all I can get are 10Mb ADSL+ connections, so I opted to buy a 4G home broadband router. It is a Huawei one and it is not bad: it usually gives me a bit more than 10Mb, but not much more. I attach it with ethernet cables to my trusted MikroTik Cloud Router Switch (https://mikrotik.com/product/CRS125-24G-1S-2HnD-IN), which can kinda do anything but which I mostly use as a dumb wifi router, with cables going to my dock and video mixer. The wifi signal from that router is strong enough for me to pick up across the street (it is in front of a large glass window).

                                  1. 2

                                    Surprised you can’t get VDSL in central London.

                                    1. 2

                                      It’s really patchy, it totally depends on what happens to have been installed in the cabinet nearest to where you live, which can be wildly different from street to street. Some places do have VDSL, others just … don’t.

                                      1. 1

                                        Can you get FTTP? Might not be worth the cost to you of course.

                                        1. 5

                                          Haha. Actually I have a story about that. I had it in my last apartment, and it was amazing. I moved into a brand-new block where everything seemed whizzy, but when it came to it, the ISP said, uh, 0.75mbps up and 16 down. No cabinet upgrades planned. No, no idea when or if they will be. Yes we know the next street over has VDSL but you don’t. No, we’re basically not even sorry. Uh huh. So first of all, I looked into a line-of-sight microwave link from the office I was working in at the time, a few hundred feet away, but not only is UK weather really bad, I also found out they were planning the next sister block to be built directly in the way. Yay. So then I got onto an FTTP provider who wires up whole buildings, and basically wangled with the building manager for them to install it in that whole building, and the 2 next-door buildings in the process of being built, which was the treat they were looking for, as when I initially said “40 apartments in the block” they yawned a bit and said something about not getting out of bed for less than 100. Overall it took a year and change and a lot of hassling, but eventually I had a symmetric 1Gbps fiber line that reliably gave me 900mbps each way. With a 1ms ping into LINX. Joy. And I didn’t even have to pay for the install or the first year because I was the one that wangled it in there. It was actually one of the things that kept me in that place, long after I was happy with it for other reasons.

                                          So when I finally decided to move, I even used FTTP availability as a factor in choosing my current place. I saw that another FTTP provider had literally just dug up the road outside, and they offered 900mbps for even less than the other lot. Boom. I kicked off the process before moving in or signing anything, checked with the real estate people, got them to ask the landlord if it was OK, they said it would all be fine, so I got an ISP estimate done for wiring it into the apartment through the window frame, started the ISP’s landlord approval request, everything. Only after I moved in did anyone tell me not only that (a) the building is Grade II listed, so no, you can’t drill any holes, anywhere, and that (b) the cable riser that was put in 20 years ago when everything was converted (under an apparently extremely painful permission process) is 20 feet back into the building from the main wall, so I’d have to go in the basement, into the neighbour’s apartment, up through the riser, open the wall in my apartment to get the cable out, then pull up the antique wooden flooring, run the cable underneath it back to the front wall, break open that wall to get into the other riser and have it come out the socket, etc, etc, none of which would be allowed anyway - but also that (c) the ISP had dug their hole 6 feet away from the vault under the street containing the ingress point for all the existing cables/gas/power etc, where they should have dug it. 
                                          Instead, they dug it above The Other Vault - you know, the one owned by the “weird little old guy” who used to own the building and a load of other ones in the area and who, when he did the deal to sell them all and allow for the conversions, got some crazy feudal lawyer to make changes to the centuries-old land registry documents so that Weird Little Old Guy still owns The Other Vault, along with the identical secondary under-street vaults in a lot of the other nearby properties he used to own, and Uses Them, for Things No-One Knows About, and only he has keys and rights to give entry permission to them, and he famously Never Gives Entry Permission For Them, To Anyone, Ever. Apparently once in a blue moon he’s seen showing up, checking the surroundings, and Going In to the Vault and locking the door behind him, staying for a while, then venturing out and scurrying away. No-one knows his story, or anything about why he wants a series of presumably unconnected under-street vaults, or what he’s doing in there. Maybe there are connecting tunnels. Maybe he’s got a subterranean cache of stolen gold with an attendant dragon retainer guarding it for him. Bondage dungeons. Within a mile of the UK government & intelligence buildings, so it could be anything really, secret service entrances, dimension portals, anything, but … No-One Knows.

                                          So the upshot is that I have to put up with 80/20mbps VDSL. Even then the ISP I could get sold me 180/100+ on the basis that their tool told me I was close enough to the cabinet, but when they installed it, it managed to sync at 160/80 but it had a crazy error rate. The engineer tried to explain, clearly expecting me to glaze over, but then when I made the error of saying “oh you mean a CRC” about one particular bit, he got all excited and then got into way more detail than I could keep up with, but the long and the short of it was that I could either “keep the sync speed and have a 30% error rate” or “lose the errors but also lose more than half the speed”, which was obviously a no-brainer. And then they tried to charge me the same price they quoted for the higher speed, as though this were perfectly natural. Hahaha. It took about 10 days for it to settle at 80/20 but it’s been totally reliable since then. So … obviously that’s a lot better than most DSL, but after living with FTTP for 4 years, anything else seems like purgatory really. Even if this is perfectly fast enough for most stuff I do, it’s just when you have a big push or pull to do it’s just … argh. Hey ho!

                                          1. 3

                                            Arghhh !! I am chewing on my lip out of frustration as I read this. Man, seriously, idk what to say, sorry for your loss?

                                            1. 3

                                              Thanks! Ah, you know, it could be way worse, at least we haven’t been stuck indoors with nothing to do except use internet for a year 😂

                                            2. 2

                                              I’m reading along going yeah, ok, yeah, then suddenly there it is: “Grade II listed”. You have my sympathies, if nothing else!

                                              1. 1

                                                Hehe, thanks. Yeah. I guess I should have checked, but it just didn’t occur to me. Old cities, tsk …

                                    1. 4

                                      For those who want a modern version of HyperCard, I urge you to take a look at LiveCode. It is a modern take on HyperCard that can also distribute standalone applications to macOS, Windows, Linux, Android, and iOS. It runs on macOS, Linux, and Windows, so you’re kinda covered even if you don’t have a Mac. It used to be able to import HyperCard stacks, but I think that feature was removed some time ago. I wrote a post called LiveCode is a modern day HyperCard a couple of years ago that might interest people here.

                                      Oh, and LiveCode has a FOSS GPL version at LiveCode.org for those who want to keep their feet firmly in FOSS.

                                      PS: In that article I say I work for LiveCode; well, I don’t work there anymore, even though we’re still friends.

                                      1. 4

                                        Thanks for the tip. To be truthful, I haven’t tried LiveCode, and while I’m happy that people are trying to offer modern HyperCard alternatives, I haven’t come across any such program with the level of polish that HyperCard had in terms of the user interface.

                                        Another thing about HyperCard that modern incarnations fail to recreate is that HyperCard stacks were something of a mix between documents and applications, whereas SuperCard, LiveCode, and similar software seem entirely focused on the application aspect. That’s a worthwhile aspect, to be sure, but it doesn’t capture the totality of what made HyperCard so unique and useful.

                                        1. 1

                                          If you tell me what document features you’re missing, or that are important to you, I can try to write a post showing if they are still present in LiveCode.

                                          1. 2

                                            Yeah, sure, well, it’s not really the features that I miss. I’ll try to explain it.

                                            What’s unique about HyperCard, in my estimation, is that the interface is paper-like. It naturally encourages users to carry over things that they would do on paper to HyperCard. It bridges the gap between paper and computer in a way that no user interface has done since. Back in the day, companies put their entire catalogs in HyperCard, because paper-based material translated very naturally into HyperCard stacks.

                                            Looking at LiveCode, it seems entirely focused on building graphical applications, but I don’t think HyperCard is primarily a GUI programming environment, it’s more like interactive paper. Sort of like the web, but much more flexible.

                                            I think the following question makes the distinction quite clear: Can it be printed to paper? For most LiveCode applications, I suspect the answer would be no. But the contents of HyperCard stacks are generally a pretty good fit for paper.

                                      1. 4

                                        I remember running Frontier back in the day. I still miss it; it was so much fun to use. It was at the same time a database, a web server, an application development runtime and environment, and a novel outline-based UI. Doing web stuff on the Mac was really different in those early days: I remember doing CGI using AppleEvents, since both the CGI and the server were desktop apps. Frontier unified everything into a cohesive package. If you’re curious, check out their Manila sites and the “edit this page” feature, which allowed in-browser editing with ease.

                                        This is a cool link for those who are curious about it: https://inessential.com/2014/05/24/what_happened_at_userland

                                        1. 1

                                          and I couldn’t get this version to compile on a modern M1-based mac…

                                          1. 2

                                            For anyone else having trouble compiling my Frontier build on macOS, Ted Howard has his own build that is likely to work better on Macs, since that’s the focus of his commits.

                                          1. 1

                                            oops, I didn’t realise that. Maybe a mod can merge it.

                                          1. 7

                                             The author seems very short-sighted, in my opinion. Even though I don’t like that the cryptography library is moving to Rust, I’d rather they created a new crypto library in Rust and donated the old C one to new maintainers. It is, in the end, their call, and Rust will provide a ton of safety features out of the box.

                                             I think that it all boils down to which platforms the cryptography library committed to support. If they want to support all platforms that Python runs on, then RIIR is the wrong option. If they’re going to make it clear that it supports just a subset of those, then it is quite OK. People who are on the orphaned platforms can fork the last known working version and move ahead with a new team.

                                            also, SIGH, the first post linked on his sidebar is a rant where he explains why he refuses to wear a mask and calls businesses that force him to wear one tyrants…

                                            1. 4

                                              Is your concern that they are keeping the “brand” for cryptography? There’s certainly nothing preventing new maintainers from taking over the C version…

                                              1. 1

                                                not the brand from a marketing perspective, but from a build-systems and automated-tooling point of view. Those who pull its sources will now depend on a whole new toolchain to build it, which affects package maintainers and products across a whole range of architectures.

                                                IMO, drastic changes like that should be made in a new library so they don’t break other systems. They could’ve declared the library finished, created cryptography2 or some other name, and proceeded to RIIR. People using the library would then have the option to stay with the unmaintained version and keep their stuff working, or go through the trouble of upgrading to the new library.

                                                1. 5

                                                  They could’ve declared that library finished, created cryptography2 or some other name and proceed to RIIR

                                                  That’s a strange inversion of responsibility. It’s also not viable since they’re doing an incremental rewrite, not a wholesale one. A wholesale rewrite might warrant a new library, but incremental rewrites are reliant on the full existing code and history to do properly.

                                                  The responsibility for maintaining a library falls on the library maintainers. The responsibility for maintaining a distro falls on the distro maintainers. If a library stops supporting an environment, then the distro has to choose: also stop supporting the environment, or fork the library which they have license to do. “Tell the library maintainers to maintain their library differently” is not one of the options. Or, at least, not an option that will work.

                                                  1. 5

                                                    I personally would be okay with just a major version number bump or equivalent. Keep the brand, say the newer version is now the canonical one. Just:

                                                    • keep the old one for a while;
                                                    • mark the change as “breaking”.
                                                2. 2

                                                  Rust will provide a ton of safety features out-of-the-box.

                                                  Those safety features are almost irrelevant in the specific case of cryptographic code. Platform support aside, it won’t hurt, but it won’t do much good either. Don’t get me wrong, I would love for Rust to be supported everywhere. Until then, C is still the better choice for portable cryptographic code.

                                                  My biggest qualm about this change is that they didn’t mark it as “breaking change, let’s bump the major version number”.

                                                  1. 5

                                                    “My biggest qualm about this change is that they didn’t mark it as “breaking change, let’s bump the major version number”.”

                                                    This was misunderstood by others as well. This project does not use semantic versioning, and users were warned beforehand:

                                                    https://cryptography.io/en/latest/api-stability.html#versioning

                                                    https://github.com/pyca/cryptography/issues/5771#issuecomment-775038889

                                                    https://github.com/pyca/cryptography/issues/5771#issuecomment-775041677

                                                    1. 2

                                                      They’ve changed their mind, and from now on the project will use a semver-compatible scheme. The version they released that caused the issues was on their old (weird) scheme, tho.

                                                      1. 0

                                                        Kudos to them, then. While ideally they should probably have reverted the change, switched to semver, and then applied the change again, the switch to semver shows that they are listening to feedback.

                                                  2. 0

                                                    it’s significantly worse than that; see my other response here. this is simply a terrible person.

                                                    1. 6

                                                      Why don’t we stick to discussing the article, and leave the person out of it?

                                                  1. 5

                                                    oh my, I remember getting stuff from that collection from a local BBS using the fancy new ZMODEM which supported resumable transfers.

                                                    1. 3

                                                      I always thought KERMIT was underrated and should have been much more popular.

                                                      1. 3

                                                        I only learned about the versatility of KERMIT way later in my life. ZModem was so convenient for large transfers on my time-limited session at the BBS. I kinda miss those days.

                                                        1. 3

                                                          A lot of the ideas that Kermit implemented as a file transfer protocol were ahead of their time, and many similar features later ended up in TCP. Kermit suffered in comparison to [XYZ]Modem because its default settings were tuned for really dirty lines. If you had the luxury of clean lines, [XYZ]Modem were clear wins due to their much larger buffer sizes and relative simplicity.

                                                          1. 1

                                                            Safe defaults, but easy to make fast (the Kermit FAQ has the settings). I still use it to transfer files back and forth with old PCs.

                                                            Also, sensible choice of CRC16 (CCITT variant, reflected for performance).

                                                          2. 2

                                                            I was lucky enough to attend a university with good network connections in the early 90s, and Zmodem was on its way out even then. Kermit was seen as old (but valued) tech.

                                                        1. 2

                                                          I think that is my favourite FOSDEM talk ever. I’ve run many of the OSes listed there and went through a similar journey of discovering the road not taken. I too wonder what a new OS could be like. I knew I was in for a good talk when Newtons and Smalltalk appeared.

                                                          1. 1

                                                            Excellent! Thank you!

                                                          1. 2

                                                            FWIW, you can shorten the last 2 lines in mode() to the following idiom: return mode_map[m] or m.
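                                                            For readers unfamiliar with the pattern, here is a minimal sketch (the `mode_map` contents and the surrounding function are assumed for illustration, not taken from the original post). The `or m` fallback idiom reads like Lua, where indexing a missing table key yields nil; the Python analogue below needs `dict.get`, since plain indexing would raise KeyError on a missing key:

```python
# Hypothetical lookup table; the real mode_map in the discussed
# code is not shown in the thread.
mode_map = {"r": "read", "w": "write", "a": "append"}

def mode(m):
    # dict.get returns None when the key is absent, so `or m`
    # falls back to the input itself. This replaces the two-line
    # "look up, else return as-is" pattern with one expression.
    return mode_map.get(m) or m

print(mode("r"))  # known mode: translated via the table
print(mode("x"))  # unknown mode: passes through unchanged
```

                                                            One caveat with the `or` form: a mapped value that is falsy (empty string, 0) would also trigger the fallback, so `mode_map.get(m, m)` is the stricter variant.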

                                                            1. 1

                                                              wow, that looks great. I’m surprised by how clean and straightforward it is.

                                                              1. 1

                                                                Heh, neat. Thanks.

                                                              1. 2

                                                                wow, went to check my site and found a single IP requesting my RSS feed every minute. That IP alone has transferred 13 GB of data since December. Damn. Thanks for raising awareness about such bad actors, @cadey.

                                                                1. 1

                                                                  Good post, I learned a bit more about Lua.

                                                                  @pushcx, consider folding this into https://lobste.rs/s/2lpxqj/lua_python

                                                                  1. 2

                                                                    Thanks for the kind words. That post was written as a reply to that Lua vs Python post, folding might be a good idea. I don’t know how folding works here to be honest.

                                                                  1. 11

                                                                    Thank you @soapdog, the HN thread also made me feel tired. Your post helped me remember there are others like me who do understand and appreciate the choices made by the authors of the language. And more people who understand that it’s not a zero-sum game, where one language being good (for some purposes) automatically means others must be “bad”.

                                                                    1. 5

                                                                      Thanks for the kind words. Someone posted that article on HN as a “real entry” instead of a comment reply like I did. Now the amount of energy people devote to 0-based vs 1-based indexes makes me tired. :-)

                                                                      1. 0

                                                                        Spite writes are the best writes!