1. 11

    I’m a little curious how this strikes others who work at this scale, or who use the ~contract/acceptance patterns the post describes?

    When I was reading it, I felt a little bit like E2E was getting unfairly tied to more general problems like “slow test suites create friction”, and “intrinsically flaky tests waste time and create uncertainty”?

    I clicked over to the Pact docs to see if that helped but I only ended up even more skeptical because the introduction promptly attacked what seems like a ~strawman to me:

    Do you set your house on fire to test your smoke alarm? No, you test the contract it holds with your ears by using the testing button. Pact provides that testing button for your code, allowing you to safely confirm that your applications will work together without having to deploy the world first.

    Like, huh? I test my fire alarm to make sure it works after I replace the battery. I guess it’s literally a kind of smoke test. I might do the same if I had been gone for a really long time, or if I lightly burned some pizza and noticed it didn’t go off. If the button test made one of my least-favorite noises and then the alarm still didn’t go off a few weeks later when something spilled in the oven and smoked the kitchen up, you can bet I would indeed go light something on fire to test it out. Not the house, of course.

    Further, there’s a world of difference between how I might test my smoke alarm and how someone who makes smoke alarms should test them. I do, in fact, want them to at least closely approximate lighting something on fire to make sure the alarms work as intended.

    1. 3

      My takeaways from this article are that E2E testing takes longer and longer as your application grows (no surprise there), and that some people are not aware of, or have forgotten about, Design by Contract (Bertrand Meyer, Eiffel’s author). The approach outlined in the article seems quite similar: express constraints on the inputs, the outputs, and the state, which together define the operational contract for the service.
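
      For anyone who hasn’t run into it, a minimal sketch of that style (plain runtime assertions standing in for Eiffel’s built-in support; the function and field names here are just illustrative):

      import assert from 'node:assert';

      function withdraw(account, amount) {
        assert(amount > 0, 'precondition: amount must be positive');           // constraint on the input
        assert(account.balance >= amount, 'precondition: sufficient funds');   // constraint on the state
        account.balance -= amount;
        assert(account.balance >= 0, 'postcondition: balance never negative'); // constraint on the state/output
        return account.balance;
      }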

      1. 7

        Design by Contract is a different kind of contract from the one described here. This is more a technique for breaking integration tests into distinct unit tests and keeping them in sync with each other.
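
        As a rough illustration of the difference (hypothetical helpers, not Pact’s actual API): the consumer test writes down what it expects from the provider as a contract file, and the provider’s own unit tests replay that file against the real handler, so the two sides stay in sync without either one having to deploy the other.

        import fs from 'node:fs';
        import assert from 'node:assert';

        // Consumer test: record the expectation as a shared contract artifact.
        const contract = {
          request:  { method: 'GET', path: '/users/42' },
          response: { status: 200, body: { id: 42, name: 'Ada' } },
        };
        fs.writeFileSync('user-service.contract.json', JSON.stringify(contract));

        // Provider test (run separately): replay the recorded request against the provider's
        // handler and check the reply still matches. handleRequest is a stand-in here.
        const handleRequest = async () => ({ status: 200, body: { id: 42, name: 'Ada' } });
        const saved = JSON.parse(fs.readFileSync('user-service.contract.json', 'utf8'));
        const reply = await handleRequest(saved.request);
        assert.strictEqual(reply.status, saved.response.status);
        assert.deepStrictEqual(reply.body, saved.response.body);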

      2.  

        It’s also worth noting that if you get a smoke detector professionally tested, the person comes along with a little cone that goes over the smoke detector and sprays it with a very small amount of smoke. It literally tests the end-to-end operation of the smoke detector.

      1. 2

        I plan to integrate an image-converting worker I made with a gallery I’m working on; the integration uses an SQS-like queue I wrote in Janet. I find it neat that I can make useful software that runs in under 4 MB of memory.

        While the Cloudflare Images feature supports WebP, PNG, JPEG and GIF, mine also supports AVIF and JXL. The worker uses an Alpine image I packaged with AVIF and JXL binaries. Unfortunately, these newer formats are not well supported in Homebrew on arm64 and are missing from many repositories.

        1. 1

          Maybe try Nix; you may have better luck with support and compact containers.

        1. 2

          A friend of mine had a home automation startup called Calaos, and was using EFL back in the early 2000s. I saw a video of his product and it was very impressive: buttery-smooth animations, gradients, transparency. It was really ahead of the rest and, more importantly, it worked well on an embedded device. I always thought that EFL had a lot of potential, but somehow it failed to reach popularity… Maybe others felt like the author of that post?

          1. 5

            You’re completely right: EFL was jaw-droppingly amazing at the time the project got some initial traction.

            I was looking to make EFL bindings for CHICKEN many years ago, but the docs were so sparse and the library churn was so high (and there were no stable releases that made it into distros for many years, either!) that I abandoned the entire idea.

            There wasn’t a release for some years after they got started, until they put out the “asparagus” development snapshot once they felt the project was getting ready for public use. But then, for many years, development continued while the only “release” available was that very development snapshot.

            Edit: heh, regarding the churn - it looks like they kept rewriting Enlightenment (the WM at least?) over and over (all the while making it slower and slower), so much that the Bodhi Linux guys felt the need to fork it for their distro to keep it stable and fast.

          1. 8

            They’re both interesting (with Pulumi having a few more features available). But I think at this point terraform is so widely used and supports pretty much everything that they probably should have some comparison pages. I can’t find a “why use this rather than terraform?” justification page for either.

            1. 4

              I can’t find a “why use this rather than terraform?” justification page for either.

              https://www.pulumi.com/docs/intro/vs/ https://www.pulumi.com/docs/intro/vs/terraform/

              What I really enjoy about Pulumi is that, unlike Terraform, it defaults to a remote state management service.

              1. 3

                I use Pulumi as I prefer to write (typed) code over YAML files. Thanks to that, I can leverage abstraction, composition and specialization constructs available in the wrapper language (I use Python). It’s a game changer for me.
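
                To make that concrete, a quick sketch (shown with Pulumi’s JavaScript SDK since that may be more familiar; the Python version reads much the same, and the resource names are made up): a plain function becomes a reusable abstraction instead of copy-pasted YAML.

                "use strict";
                const aws = require("@pulumi/aws");

                // One function stamps out the same resource shape for every environment.
                function staticSite(name, indexDocument = "index.html") {
                    return new aws.s3.Bucket(name, { website: { indexDocument } });
                }

                const dev  = staticSite("site-dev");
                const prod = staticSite("site-prod", "home.html");

                exports.devUrl  = dev.websiteEndpoint;
                exports.prodUrl = prod.websiteEndpoint;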

                1. 1

                  Have you tried tf-cdk? cdk.tf

                  1. 1

                    Not yet, I saw it on the TF website but it’s beta, isn’t it?

                    1. 1

                      Terraform wasn’t 1.0 until like this year.

                      It works fine. :D

              1. 5

                It is exciting to see neovim gaining such novel and useful features as first-class Lua scriptability (you can even access the libuv event loop from Lua!), a built-in LSP client, and incremental syntax parsing (treesitter).

                Personally, while I have used neovim since the 0.1.x versions, I’m not eager to take advantage of those new 0.5.0 features yet. I want to wait some time for the dust to settle. There are lots of new Lua plugins popping up for neovim (some brand new, some rewrites of vimscript-based plugins in Lua), and I want to wait and see which become more robust and mature.


                Regarding LSP, I still use ALE, which, I think, has some of the best UX:

                • Support for non-LSP linters and fixers (mostly autoformatters). This means I can use the same UI to see diagnostics from both LSP and non-LSP sources. This is really important for me.

                • A huge collection of integrations with LSP servers and linters. Sometimes I discover a linter or LSP server for a new file type I’m editing just by invoking :ALEInfo and seeing the list of suggested integrations.

                I think all of this is possible with neovim’s new LSP + some configuration on top (like using null-ls for non-LSP linters/formatters). But… it all involves some friction to configure correctly, while ALE gives you all of that out of the box.

                1. 2

                  +1 for ALE, it’s easy and seamless, and comes with impressively wide language and tool support.

                  1. 1

                    Thanks for the recommendation. I tried neovim but gave up after half an hour of failing to configure its LSP support, and after becoming annoyed that they changed the undo file format but didn’t change the extension, so you can’t edit the same file with vim and neovim. I tried ALE and it took about 10 minutes to get it working nicely (well, spread over two days, poking at it a bit while I waited for things to build). It does exactly what I want, and it’s added a total of under 10 lines to my vimrc to have nice clang-format and clangd integration.

                  1. 5

                    I’ve spent too much time on my laptop this week (lockdown week #2), so instead of coding, I’ll be thinking about how build systems and CI/CD pipelines are just the same thing (one is run locally on demand, the other remotely in reaction to events), and yet they typically lead to two separate code assets (e.g. makefiles and YAML files). I hope to have some clearer ideas of what the building blocks for achieving that convergence may be.

                    1. 3

                        I wish we had more articles like this, where technology is examined from different perspectives. In the case of this very well-written article, social, cultural and political lenses are applied. It’s a reminder that the changes we introduce, through technology or practice, have an impact and a meaning beyond the direct goal we’re trying to achieve. Agile was primarily born out of the challenge of ever-changing requirements, but it carried much more than addressing that one issue.

                      1. 2

                        This is a great convergence between notebooks and literate programming. ObservableHQ notebooks are the closest thing to that, and were a game changer for me.

                        1. 4

                            I was using BeOS as my daily environment back in the early 2000s, and have really fond memories of that time. The desktop was way more polished than what you would get on Linux and Windows, while also being super lightweight. Now everything is on the Web, but I do download Haiku’s RC ISOs each time they come out, and fantasize about having that as my daily OS again.

                          1. 12

                              I go the other way. I gave up on lisps because I love my types too much, but I wish more languages had s-expression syntax - they’re such a joy to edit in an editor with appropriate tooling.

                            1. 6

                              Have you tried Carp?

                              https://github.com/carp-lang/Carp

                              1. 1

                                I have not, but it looks pretty cool. I don’t get much time to noodle around with random languages any more =|

                              2. 6

                                they’re such a joy to edit in an editor with appropriate tooling.

                                I’d say this differently: it’s easier to build appropriate tooling for S-expression based languages. But if you have “extend selection”, a bunch of intentions for common editing tasks, typing assists and proper cursor placement, you’ll get a better editing experience for any language. But yeah, building all that is non-trivial.

                                1. 2

                                    Hear hear! I started writing an s-expression syntax for Python once.

                                    It was a sad day when I admitted that such a thing would have a user base of exactly one.

                                  1. 5

                                    Have you seen Hy lang?

                                    1. 2

                                        Surely you mean a user base of two!

                                      :-D

                                      1. 1

                                        There’s https://github.com/hylang/hy in that field

                                        1. 2

                                          Jinx!

                                      2. 2

                                        I think that’s a form of BDSM ;-)

                                        1. 3

                                          Benign Dictator S-expression Mandates?

                                      1. 4

                                          Why would anyone use this? I mean, I’d much rather use Perl than JavaScript on my servers.

                                        1. 20

                                          Because a great many people prefer JavaScript to Perl, or at least know JavaScript better than Perl.

                                          1. 24

                                            Not everyone is you. Hope this helps.

                                            1. 3

                                              A use case would be using Pulumi for provisioning and zx for configuration, deployment and automation, with common libraries between both.

                                              1. 2

                                                  I gotta admit JS leaves me wanting when it comes to really basic stuff like list comprehensions and dictionary syntax, but being able to throw together an async-y tool to run stuff in parallel cleanly, for example, is a pretty great party trick.

                                                  For example, you can easily make your build scripts just follow their dependencies cleanly, without having to go full Bazel to get the proper speed boosts you might want.
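
                                                  A minimal zx sketch of that idea (the npm script names are made up):

                                                  #!/usr/bin/env zx
                                                  // Independent steps run concurrently; the dependent step waits for both.
                                                  await Promise.all([
                                                    $`npm run build:css`,
                                                    $`npm run build:js`,
                                                  ]);
                                                  await $`npm run bundle`;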

                                              1. 4

                                                  The more I read about Zig, the more I like it. I hope there will be a rich stdlib like Go’s, and I think I’ve seen Rust-like compiler safety features on the roadmap. Good to read first-hand experiences with the language!

                                                1. 1

                                                  Rust-like compiler safety features

                                                  Most probably not as iron-clad, but safety is definitely one design goal.

                                                  1. 1

                                                    Just curious, as you’re part of the Zig team, do you have any link to the roadmap items or tickets related to the safety features? I’d like to follow!

                                                      1. 1

                                                        Wow, that was fast, thanks!

                                                1. 5

                                                  How ironic that Xamarin, originally designed for .NET applications on Linux, is now not for Linux anymore. I wonder what De Icaza would think of that.

                                                  1. 5

                                                    Considering how the Linux community treated Mono, who can blame them?

                                                    1. 1

                                                      I’m not sure who would have backed up Mono on Linux. Permanent second class, living in the shadow of an uncooperative giant. Icaza’s long term vision was always unclear to me, but history shows what it was: assimilation, annihilation ;)

                                                      1. 4

                                                        I have a different view of it, informed by rms’ fatwa against it that became the closest the Linux community got to an angry mob (e.g. screaming about Mono-based applications on distro CDs, trying to cancel a Debian developer for packaging it, etc.).

                                                        Microsoft in the 2000s seemed fairly cool towards it (e.g. adding Unix to the OS enum), but they were too Windows-focused to promote such a thing. I understand why Mono had to wither away for MS’ own push for .NET on Linux (Windows users didn’t think it was viable, Linux users had prejudice), but I’m sad that a good project spent years in the weeds because of it.

                                                        1. 3

                                                          I understand why Mono had to wither away for MS’ own push for .NET on Linux (Windows users didn’t think it was viable, Linux users had prejudice), but I’m sad that a good project spent years in the weeds because of it.

                                                          I don’t think this is quite what’s happened. One of the goals of .NET 5 was to merge the Mono and .NET Core codebases. Various bits of Mono infrastructure are used in the Xamarin components.

                                                          This makes me somewhat sad because Mono was very portable, but the .NET Core-derived bits are Linux/Windows/macOS only and often lose the *BSD/whatever support that Mono has had for ages.

                                                    2. 1

                                                      Too much money to care about that

                                                    1. 3

                                                      Isn’t the whole point of unpkg and cdnjs availability and ease of access? The caching bit seems like a cherry on top compared to having modules readily loadable from the browser with a script tag. Sadly, you can’t ES6-import from all of these sites, but that would be great. Deno does that well (https imports).

                                                      1. 18

                                                        I use delta for viewing git diff output. Since I started using it, I find myself reaching for GUI diff viewers much less frequently, which I like because I’ve never met a git GUI that I find generally usable.

                                                        1. 2

                                                          Vimdiff works pretty well for that use case if you’re from that church.

                                                        1. 8

                                                          It’s not true React/JSX but for small things this is also nice:

                                                          import { html, render } from 'https://unpkg.com/htm/preact/standalone.module.js'
                                                          

                                                          https://github.com/developit/htm
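
                                                          A tiny usage sketch (roughly the example from the htm README), building on the import above:

                                                          function App(props) {
                                                            return html`<h1>Hello ${props.name}!</h1>`;
                                                          }

                                                          render(html`<${App} name="World" />`, document.body);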

                                                          1. 2

                                                            Exactly what I was looking for. I was going to use a Gist with a JSX template string processor, but this seems better. Thanks for sharing!

                                                          1. 2

                                                            Company: New Zealand Stock Exchange

                                                            Company site: nzx.com

                                                            Position(s): Platform Engineer, Full-Stack Developer

                                                            Location: Wellington or Auckland, NZ

                                                            Description: The NZX (New Zealand Stock Exchange) is going through a Digital Transformation™, where we’re building the foundation for a modern API-based, streaming data platform, and an equally ambitious content delivery platform to change the way we engage with our institutional and retail clients. Big changes, and big people required :)

                                                            Tech stack: Python (mypy), Deno, Go, React on AWS

                                                            Contact: PM me, the positions are not advertised yet.

                                                            1. 17

                                                              I have yet to hide any tag on Lobsters, but if there were a tag for these stupid OOP vs FP hot takes I would hide them in an instant.

                                                              1. 16

                                                                I’d generalize that to every article “Uncle Bob” writes.

                                                                1. 3

                                                                  But you would miss the discussions on Lobsters, which are often better than the OP ;)

                                                                  1. 2

                                                                    Same; I think I’d hide even an OOP tag by itself, never mind FP.

                                                                  1. 2

                                                                    .. and CI/CD pipelines are distributed build systems

                                                                    1. 2

                                                                      It would be great if this article and the one on Preserves’ data model had more information on the rationale and goals, as it’s not immediately apparent what problem Preserves is addressing and what makes it different from the rest. Looks interesting, though!

                                                                      1. 2

                                                                        Thanks! That’s a comment that keeps coming back; it’s good feedback. I’ll try to write something up on the motivation.