1. 1

    Outside work, I’m trying to reverse-engineer the IR protocol for an AC. Y’all are welcome to help!

    Hitting “power” on the remote yields a message like this:

    110000010110000000100000000000000000000000000000101110011100100000000000000000000000000000000000000000000110000111000000001000000000000000000000000100000000000010010000 
    

    I’ve recorded one message for each temperature level the remote can set; the bits that change are, oddly(?), spread out across the message. H, M, and ? change with the clock in the remote. T changes with temperature. _ indicates the bit has the same value in all messages so far.

    ____________________________TTTT_________________HHHHH___MMMMMM__________________________________________??????TT______________________________T_______________________T
    110000010110000000100000000000000000000000000000101110011100100000000000000000000000000000000000000000000110000111000000001000000000000000000000000100000000000010010000
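
    Collapsing a message down to just the temperature bits can be done by zipping it against the mask; a quick Python sketch (mask and message strings copied from above, local names are mine):

    # keep only the positions the mask marks as T (temperature)
    mask = '____________________________TTTT_________________HHHHH___MMMMMM__________________________________________??????TT______________________________T_______________________T'
    msg  = '110000010110000000100000000000000000000000000000101110011100100000000000000000000000000000000000000000000110000111000000001000000000000000000000000100000000000010010000'
    print(''.join(b for b, m in zip(msg, mask) if m == 'T'))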
    

    If you collapse the message and just look at the bits that change with temperature setting changes, these are the bits for each temp level (in F, on the remote anyway):

    61 00001100
    62 00001111
    63 10001100
    64 01001100
    65 01001111
    66 11001100
    67 11001111
    68 00100100
    69 00100111
    70 10100100
    71 10100111
    72 01100100
    73 11100100
    74 11100111
    75 00011000
    76 00011011
    77 10011000
    78 10011011
    79 01011000
    80 01011011
    

    If anyone has IR experience and/or likes puzzles, let me know if you see the pattern. I may just end up encoding these in a table, though that feels like cheating :p

    1. 5

      I can help with the first four bits: add 0.5°F, convert to °C, subtract 16°C, truncate, and the four bits are the result in reverse (LSB-first) order.

      In Python:

      def bits(temp_F):
          # +0.5 °F fudge, then °F -> °C lands in 16..27; bin() gives '0b1xxxx'
          # for 16..31, so [3:] drops '0b1' (i.e. subtracts 16); reverse to LSB-first
          return bin(int((temp_F + 0.5 - 32) * 5 / 9))[3:].zfill(4)[::-1]
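
      For example, bits(63) returns ‘1000’, matching the first four bits of the 63 row in the table above.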
      
      1. 1

        Oof! Awesome.

        The last four are interesting… there’s definitely a grouping. A list of temp in °C, minus 16, vs. bit pattern seems to say they group incrementally-ish. I wonder if it’s some sort of checksum or mask. As I’ve understood it, IR protocols are prone to include redundancy or similar, given the shoddy transfer medium.

        bits temp_c_minus_16
        1100 [0.39, 1.5, 2.06, 3.17]
        1111 [0.94, 2.61, 3.72]
        0100 [4.28, 5.39, 6.5, 7.06]
        0111 [4.83, 5.94, 7.61]
        1000 [8.17, 9.28, 10.39]
        1011 [8.72, 9.83, 10.94]
        

        edit: staring at this, it’s definitely something about the fraction: it splits cleanly around .5; all the ones that are >.5 end in 11, and the ones that are ≤.5 end in 00
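
        A quick Python check of that hunch (same +0.5 °F fudge as in the parent comment; the exact cutoff is a guess, and ties at .5 seem to land on 00):

        # print each temp's °C fractional part and what the hunch predicts
        for f in range(61, 81):
            c = (f + 0.5 - 32) * 5 / 9
            frac = c - int(c)
            print(f, round(frac, 2), '11' if frac > 0.5 else '00')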

        1. 3

          The next two bits seem to be the bitwise NOT of the two uppermost bits (i.e., division by 4) of (°C − 16). You noticed that the last two bits follow (°C − ⌊°C⌋) > ½, but they also correspond to the 4th bit to the right of the binary point.

          All together in Python:

          for f in range(61, 81):
              print(f, end=' ')
              c = (f + 0.5 - 32)*5/9
              # fixed-point: 4 integer bits (°C - 16, since °C is in 16..27), then 4 fraction bits
              c = 0xff & int(c * 2**4)
              c = list(map(int, bin(c)[2:].zfill(8)))
              c[0:4] = c[0:4][::-1]             # integer part, reversed to LSB-first
              c[4:6] = [1 - b for b in c[2:4]]  # NOT of the two uppermost integer bits
              c[6] = c[7]                       # both trailing bits = 4th fraction bit (1/16s place)
              print(''.join(map(str, c)))
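
          Running that reproduces the collapsed table above for all twenty temperatures.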
          

          I’m not entirely satisfied with this explanation, but it’s the best I can think of.
          To know for certain you could always dump the code off of the remote and reverse engineer it. :P

      2. 3

        Check out the content at AnalysIR. They sell a product, but Chris is also an expert and his blog posts may be helpful.

        https://www.analysir.com/blog/tag/decoding/

        1. 1

          Oh wow, that’s super in depth, thanks for this!

        1. 3

          Care to elaborate?

          1. 3

            The supported languages are Rust, C++, and Node. The website has an extremely tacky corporate feel with web fonts and stock photos. The toolkit heavily features animations and transparency, even beyond what has ill-advisedly been done in existing toolkits. The name even appears to be a reference to this misdesign.

        1. 2

          @gempain this looks great. Frankly I’ve been wanting to write something like this for my own personal use for years.

          It’s a bit of a bummer to see this under the BSL. However, it looks like the change date to Apache 2.0 is extremely close to the release date. Can you talk a bit about what you’re trying to accomplish and what the thinking is there?

          1. 3

            Thanks for the kind words! This is a mistake on our part; the change date should be in 2023. Thanks for pointing this out. The reason behind choosing the BSL is that we’d like to finance the project by running a cloud version of Meli :) Basically, you can do all you want with the tool, but we find this license fair enough that a team like ours can fully dedicate its time to making this project awesome while allowing others to use it for free.

            1. 1

              Hey, really cool project! Honestly, I was looking for something exactly like this, and you’ve got it packaged up in Docker, which saves me from having to do that!

              Looking forward to using it.

              Also, I personally think that the license choice is appropriate, but considering that this is a new type of license, do you think that after some time your team could share some thoughts on how successful it was?

              I don’t want to detract too far, but I think finding a sweet spot between user freedom (open source) and sustainability is very important. I’d rather have a BSL project that is updated and improved over the years than an open-source prototype that can’t reliably be used because the developers had to move on to another project.

              1. 3

                Thank you for this really nice comment! I think exactly the same way as you! I am aware that the debate over the BSL is hot at the moment, and not everyone will agree. We want to focus solely on developing this tool, and in this context, the BSL makes sense as it gives the team a chance to monetize the project fairly. Everyone can use the tool for free, with no limitations except the one mentioned in the license. It’s a fair model, which has been supported by the OS foundation even though they have not recognized it officially yet. We’re among the people who believe in it and would love to see the community supporting this choice - it’s a good way to ensure the healthy evolution of a platform like this. I think the BSL makes sense for platforms, but for libraries we always use MIT or GPL, as I think it’s more suited. We’ll definitely blog about how it goes with this license; it’s a topic close to my heart.

              2. 1

                Basically you can do all you want with the tool

                From the license:

                (…) make non-production use of the Licensed Work

                Am I missing something, or can one do everything except use it?

                1. 2

                  BSL: you can self-host or pay for hosting, but not sell hosting to others.

                  1. 1

                    It’s not that simple from the BSL alone; what I had missed were the concrete parameters to this use of the license:

                    Additional Use Grant: You may make use of the Licensed Work, provided that you may not use the Licensed Work for a Static Site Hosting Service.

                    A “Static Site Hosting Service” is a commercial offering that allows third parties (other than your employees and contractors) to access the functionality of the Licensed Work by creating organizations, teams or sites controlled by such third parties.

                    For an unmodified BSL, any “production use” is outside the grant, but Meli grants “use” outside of running a service as specified (it appears they allow a non-commercial service though).

                    1. 1

                      It may be a good idea to design a “composable” set of shared source licenses like Creative Commons did with their licenses for creative works. E.g. SourceAvailable-ProductionUse-NoSublicensing-NoCloud.

                2. 1

                  Thanks for clarifying. This does make sense!

              1. 6

                Is this the new version of a webring?

                I don’t get the fully anti-JS sentiment. My own personal site has a sprinkling of JS so you can change the theme. It still has a really high score on the GTMetrix scanner they say to use. It also does nothing special other than capturing your t keypress and changing the theme in a cycle. SO DANGEROUS! https://nickjurista.com if you want me to steal your identity with my scary JavaScript.

                1. 4

                  Sure, today you only capture t; tomorrow you capture all scroll events, then you prevent device-native shortcuts.

                  1. 3

                    This seems to be the slippery slope fallacy.

                    “A slippery slope argument, in logic, critical thinking, political rhetoric, and caselaw, is often viewed as a logical fallacy in which a party asserts that a relatively small first step leads to a chain of related events culminating in some significant effect”

                    1. 2

                      Well, the slippery slope fallacy isn’t a fallacy if it actually is a slippery slope… Question is whether it is here.

                      1. 1

                        I’m not sure I understand haha. This fallacy is one I’ve always had trouble understanding.

                        The way I see it, it’s insufficient to say that a happened, so b must be the next step. So in this case: b is bad, so we assume b is going to happen, and therefore we should stop a.

                        From a logical point of view this is purely speculation? You can certainly speculate based on patterns, but I think it weakens the reasoning of the argument?

                        Let me know what your thoughts are

                    2. 2

                      Hey, stop giving away my identity theft secrets!

                    3. 1

                      Users generally have no way of knowing that capturing t and switching the theme is the only thing your site does until it’s too late. Javascript isn’t vetted by distro maintainers either.

                      Your website could instead use a CSS media query to set the user-preferred theme: @media (prefers-color-scheme: dark) {...}. That way, users automatically get their preferred theme without having to execute untrusted code. They could also use the same browser/OS dark/light toggle that works on every other website instead of learning your site’s specific implementation.

                      I’m generally a big advocate of leaving presentation up to the user agent rather than the author when it’s possible; textual websites are the main example that comes to mind. There’s a previous discussion on an article I wrote on the subject; the comments had a lot of good points supporting and opposing the idea.

                      1. 1

                        I think giving a user presentation options is fine for text, but I really don’t care how someone wants to look at my personal site. The themes I use are not light vs. dark; they’re a variety of color schemes.

                        I also don’t care if someone disables JS in their browser. IMO it’s extremist behavior from a very, very small fraction of people. My site works without JS because it has nothing interactive anyway. But many sites I’ve worked on have been entirely JS-based (like live-updating sports and customer dashboards). There isn’t anything inherently wrong with JS.

                        1. 1

                          My site works without JS

                          It’s great that your site works perfectly without JS; thanks for sticking to progressive enhancement!

                          it’s extremist behavior…There isn’t anything inherently wrong with JS.

                          There are many good, non-“extremist” reasons why people don’t run JS:

                          • They use Tor. Running JS on Tor is a bad idea because it opens the floodgates to fingerprinting; frequent users generally set the security slider to “max” and disable all scripting.
                          • They have a high rate of packet loss and didn’t load anything besides HTML. This is common if they’re in a train, on hotel wi-fi, using 3g, switching between networks, etc.
                          • They use a browser that you didn’t test with. Several article-extraction programs and services don’t execute JS, for instance.

                          HTML, CSS, JS, Websockets, WebGL, Web Bluetooth API…there are a lot of features that websites/webapps can use. Each feature you add costs a few edge-cases.

                          It’s unrealistic to expect devs like you and me to test our personal sites in Netsurf, Dillo, braille readers, a browser that won’t be invented until the year 2040, e-readers, a Blackberry 10, and every other edge case under the sun (I try to anyway, but I don’t expect everyone to do the same). But the fewer features a site uses, the more unknown edge cases will be automatically supported. For example, my site worked on lynx, links, elinks, w3m, Readability, Pocket, and even my own custom hacky website-to-markdown-article script without any work, because it just uses simple HTML and (optional) CSS.

                          Not all websites are the same. Customer dashboards probably need to do more things than our blogs. That’s why I like to stick to a rule of thumb: “meet your requirements using the fewest features possible” (i.e., use progressive enhancement). Use JS if it’s the only way to do so.

                          from a very, very small fraction of people.

                          I disagree with the mentality of ignoring small minorities; I try to cater to the largest surface possible without compromising security, and regularly check my access logs for new user-agents that I can test with. Everyone is part of a minority at some point, and spending the extra effort to be inclusive is only going to make the Web a better place.

                          I’d like to add that when making moral arguments, non-adherents tend to feel attacked; please don’t feel like I’m “targeting” you in any way. Your site is great, especially since it works without JS. Don’t let my subjective definition of “perfect” be the enemy of “good”.

                          1. 2

                            I don’t feel attacked at all. This is no different to me than someone who refuses to use an Android or iOS phone because they are afraid of being tracked. I see it as a lot of tinfoil with very little substance.

                            I personally do not see coding as a moral or political stance like so many do (especially here on Lobsters). I see it as a means to an end – and in my case it’s that I couldn’t decide on a theme and wanted to put an easter egg on my site.

                            For professional things, I tend to follow the 80/20 or 90/10 when approaching projects, catering to the low-hanging fruit to get the most stuff done. If I focus on all edge cases, I’ll never finish anything and it’s unreasonable to expect anyone to really do that.

                            Many sites don’t need to use JS and thus shouldn’t, but I think it’s throwing the baby out with the bathwater when people try to go “No JS” because of some sites doing stupid things to try to track users more or get more data out of them - or just turn their whole static site into a client-side app for no discernible reason. If you want total privacy, throw out your electronic devices altogether, start using cash only for purchases, get off the grid altogether.

                            JS itself is amazing and has propelled the web to incredible new uses. What I see from a lot of these No JS people is a really small segment of generally power users who either don’t like JS to begin with or are incredibly paranoid about being tracked for whatever reason. The average user, and most users by a large margin, are not concerned with running some arbitrary scripts (which the sandbox keeps getting tighter over time btw). This club feels like more virtue signaling than anything to me, and I think the No JS argument and “club” is silly altogether.

                            1. 1

                              (Preface: nothing I have said so far applies to software that is, by necessity, a web app)

                              I […] wanted to put an easter egg on my site

                              Easter eggs are fine! Your site is great. You might want to change the trigger, though; people might expect something else to happen when they press “t”. Technically-inclined users are more likely than the average user to use custom keybinds.

                              If I focus on all edge cases, I’ll never finish anything and it’s unreasonable to expect anyone to really do that.

                              I agree that it’s ridiculous to expect people to test every edge case, which is why I advocated for simple sites that use simple technologies. With the “textual websites” I described in my article, you automatically get support for everything from braille readers to HTML-parsing article-extraction programs, without doing any work because you’re just using HTML with progressive, optional CSS/JS. I literally didn’t spend a single moment optimizing my site for w3m, lynx, links, elinks, IE, etc; when I tested my site in them, it just worked.

                              I think it’s throwing the baby out with the bathwater when people try to go “No JS” because of some sites doing stupid things to try to track users more or get more data out of them.

                              Nobody knows what lies on the other side of a hyperlink. We don’t know whether a site will do those bad things, so we disable scripts by default and enable them if we can be convinced. “Minimizing tracking and fingerprinting” and “living in a cabin in the woods” are worlds apart. I don’t think it’s healthy to assume that all privacy advocates are anarcho-primitivists.

                              Disabling scripting for privacy isn’t uncommon; it’s the norm among Tor users. These people aren’t unhinged as you portrayed; they’re…normal people who use Tor. Their use cases aren’t invalid.

                              JS itself is amazing and has propelled the web to incredible new uses

                              Apps are new. Blogs are not new. We should use the right tool for the right job. The mentality of “progress + innovation at full speed” is great when used in the right places, but I don’t think it belongs everywhere. We should be aware of the consequences of using tools and use them appropriately.

                              This club feels like more virtue signaling than anything to me, and I think the No JS argument and “club” is silly altogether.

                              It is virtue signalling. We believe in and follow a virtue, and signal it to others by joining this club. The existence of this “virtue-signalling platform” can help encourage this behavior; I know for a fact that the various “clubs” that cropped up in the past week have encouraged many site authors to optimize their websites so they could be included.

                              1. 1

                                “Minimizing tracking and fingerprinting” and “living in a cabin in the woods” are worlds apart. I don’t think it’s healthy to assume that all privacy advocates are anarcho-primitivists.

                                I never said anything about living in a cabin in the woods or “anarcho-primitivists” - in fact this is the first time I’ve even heard the term.

                                You can minimize tracking and fingerprinting without disallowing JS altogether or starting a webring for sites without JS. That’s why I said “throwing the baby out with the bathwater.” If you remember, companies used to track with a pixel that folks would throw on their page which would then load from that domain and they would scrape whatever info they wanted on you. So when do we get to join the NoImages.club?

                                1. 1

                                  If you want total privacy, throw out your electronic devices altogether, start using cash only for purchases, get off the grid altogether.

                                  I never said anything about living in a cabin in the woods or “anarcho-primitivists” - in fact this is the first time I’ve even heard the term.

                                  Sorry, that’s the vibe I got from living “off the grid” without any electricity. Guess I was a bit hyperbolic.

                                  If you remember, companies used to track with a pixel that folks would throw on their page which would then load from that domain and they would scrape whatever info they wanted on you.

                                  That’s a good reason to test your site without images, in case users disable them. More on this below.

                                  There’s a big difference between logging the loading of a tracking pixel and tracking the canvas fingerprint, window size, scrolling behavior, typing rate, mouse movements, rendering quirks, etc. Defending against every fingerprinting mechanism without blocking JS sounds harder than just blocking it by default. There’s a reason why the Tor browser’s secure mode disables scripting (among other things) and why almost every Tor user does this; they’re not all just collectively holding the same misconception. The loading of an image without JS isn’t enough to make you unique, but executing arbitrary scripts certainly is; the equivalent of a “read” receipt isn’t the same as fingerprinting.

                                  So when do we get to join the NoImages.club?

                                  Unless there isn’t an alternative, images should be treated like CSS: an optional progressive enhancement that may or may not get loaded. That’s why all images should have alt-text and pages should be tested with and without image loading. Writing good alt-text is important not just for screen/braille-readers, but also for people struggling with packet-loss or using unconventional clients.

                                  IMO, text-oriented websites should only inline images (with alt-text, ofc) if they add to the content, and shouldn’t be used simply for decoration. I wouldn’t create a “no-images.club” because the potential for misuse isn’t nearly on the same level.

                    1. 3

                      As a sysadmin, one of the biggest problems with following the recommended course of action in this article is unaware and unintentional data loss.

                      So, let’s say everything is in the DB. You put pics in there, use ImageMagick to shrink them, and store those in the DB. Your DB is getting stupidly large, because it’s not meant for binary blob data.

                      So, you yoink out the binary data. Those images are now saved in /opt/datastore/… instead of the postgres DB dirs.

                      The problem that I’ve come across is that even though the DB is backed up in various ways, the image dir path may not be (or may be backed up at different/bad backup periods).

                      Another problem is syncing. When you save picture/document/binary content, you’re going to hash it and save the hash in a table along with the appropriate metadata. Then you save the binary content in /opt/datastore/$HASH. But… what happens if the hash doesn’t exist? How do you guarantee data consistency between the table and what’s on the FS? (Hint: it’s really hard.)
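
                      One common partial mitigation, as a minimal Python sketch (the `db` handle and `blobs` table are hypothetical, and this isn’t how any particular product does it): write and fsync the blob first, then insert the row, so a crash leaves at worst an orphaned file rather than a hash with nothing behind it.

                      import hashlib, os, tempfile

                      DATASTORE = '/opt/datastore'

                      def store_blob(db, data, metadata):
                          digest = hashlib.sha256(data).hexdigest()
                          path = os.path.join(DATASTORE, digest)
                          if not os.path.exists(path):
                              # write to a temp file and rename, so readers never see a partial blob
                              fd, tmp = tempfile.mkstemp(dir=DATASTORE)
                              with os.fdopen(fd, 'wb') as f:
                                  f.write(data)
                                  f.flush()
                                  os.fsync(f.fileno())
                              os.rename(tmp, path)
                          # record the hash only after the bytes are safely on disk; a crash
                          # before this line leaves an orphaned file, never a dangling hash
                          db.execute('INSERT INTO blobs (hash, meta) VALUES (?, ?)', (digest, metadata))
                          return digest

                      Orphaned files still need an occasional GC sweep, and your backups still have to cover both halves, which is rather the point.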

                      Another issue is if you’re guiding someone else through backup/restore/migration techniques. If everything’s in the DB, then it’s a backup and restore on the destination, then copying over the configs, and you’re pretty much done. It’s effectively a one-stop shop. However, splitting the data across different dirs also adds complexity to backing up and restoring, and missing something is pretty easy.

                      I’ve been more of a fan of using two DBs: a regular DB and a document DB. But there’s no one-size-fits-all.

                      1. 1

                        Pretty sure this was covered in the section talking about how we’ve lost transactions once the database is no longer the sole data system.

                      1. 1

                        By far, I prioritize developer UX. Minimize developer suffering; maximize developer happiness. To that end, my current stack preferences are:

                        • REST in the back (probably Grape (Ruby), but I could tolerate JS-based, like Express)
                        • Vue in the front, without a doubt
                          • including Vuex
                        1. 1

                          I’m trying to decide between React and Vue for a side project. I’m more familiar with React, so I’m leaning towards that.

                          Could you provide some info on what makes Vue your choice? I’d just like to make a more informed decision, so I’m trying to learn more about Vue.

                          1. 2

                            I’m at the point in my career where my primary criterion for choosing tools and tech is minimizing [developer] suffering. Pain, frustration, annoyance, irritation, etc. Out of the major JS frontend frameworks, Vue gives me the least suffering. I just want as direct a path as possible from conceived thought to functional code that realizes it. Vue comes closest to that ideal path.

                            I’ve dabbled a bit in React – though not enough for me to make a fully fair and balanced comparison. React alone doesn’t seem to have enough “batteries included”. Vue has batteries included, but not too many batteries. Angular makes me want to stop being a full stack developer – but I’m willing to admit that perhaps it’s just the codebase(s) I’ve been subjected to that gives me that jaded view.

                            I’ll say this: The same awe and wonderment I felt from discovering the joy of Ruby I feel from discovering and using Vue. No other two technologies have given me that over the course of my entire career. The keyword list you could extract out of my resume makes a tag cloud as big as the next person’s. For all the other keywords in that list: I use them because I’ve had to (inherited codebase; departmental mandate; whatever). I use Ruby and Vue because I want to.

                            I realize what I’m writing here is somewhat anecdotal, intangible and immeasurable. So I propose this: Call to mind just how much frustration and angst you’ve felt when dealing with the tech in your current work. Hold that as a standard to compare against. Then go try Vue on a side project [of sufficient size so the results can’t be dismissed]. I’m pretty sure most people would find using Vue a superior experience.

                        1. 1

                          Really cool!

                          1. 3

                            I plan on doing a bit of sailing and starting a project best described as “just-in-time cloud infrastructure for data pipelines” (working name Jittaform). The idea is to combine Docker/Docker Compose and Terraform into a GitLab-CI-like YAML file that will allow someone to spin servers up and down based on the needs of the tasks they define.

                            I came across Capital One’s “Just-in-time cloud infrastructure” (https://www.capitalone.com/tech/cloud/just-in-time-cloud-infrastructure/), but it seems to want to redefine everything that has already been built/developed, hence my decision to use Terraform and Docker. I am not adept at Terraform, but this seems like a good reason to learn more about it.

                            1. 1

                              Hey, that’s a really neat idea!

                              Take a look at argoproj. It’s a workflow system built around ephemeral containers and Kubernetes. It may be useful to see how they implement workflows.

                              Edit: link https://github.com/argoproj/argo

                              1. 2

                                Thanks so much for the link! Argoproj seems very interesting! They have the workflow system already worked out. I opted for Docker-compose and Docker Swarm because I have more experience with these technologies, but this looks like a good choice as well. I will play around with it to see if Terraform can be tacked on.

                                1. 1

                                  You’re welcome!

                            1. 3

                              After a little more than four years it’s my last week working remotely at $CURRENT_JOB before a week off and starting a new remote job in September. I fear leaving will be bittersweet, as I’ve enjoyed the job and got on well with my colleagues: will probably spend most of the week in goodbye calls! :-)

                              1. 1

                                 Hey, glad to hear that you enjoyed the work. I’m a bit curious about remote work and the atmosphere. I would like to look into remote opportunities, as they would open up a few industries/positions that aren’t possible with my current lifestyle and location.

                                What kind of experience did you have with working remotely? I’m mostly wondering about the social aspect.

                                1. 3

                                  This was my first job working remotely full time, and it has been overwhelmingly positive. I’ve been here for over four years, and my next job is also fully remote. I don’t plan to ever work in an open plan office again :-)

                                   I have a few caveats. For the first 3.5 years (until early this year) I rented a private 10 m² office about a 20-minute drive from my home, so I actually had a commute. The office was cheap, but it was difficult to find parking, and I ended up spending more money on petrol and lunch (I wasn’t disciplined enough to make packed lunches); so since January 2019 I’ve been working from a room in my house.

                                  In some ways working from home is not ideal. My wife home-educates our son, so I’m not normally alone in the house during work. This means I’m prone to distraction by family. My dream situation is a 5-10 minute walking commute to a private office: far enough that I’m not distracted by family and chores stuff, but close enough to walk home for lunch :-)

                                  The lack of a commute means that I’m not finding myself in town, and thus can’t so easily meet up with friends in town after work. However, time not spent commuting is time I can use to go for a run before work, or practice my guitar. Since most of my friends themselves have families and we rarely manage to meet up, this works out in my favour.

                                  Mind you, the above is only true for fully remote – where everyone (or close to it) is remote. I would personally not be interested in working at a job where only some people are remote, as you’re going to find yourself excluded from meetings and decision making. (“We couldn’t find a meeting room with remote facilities, soz.”, “We forgot to dial you in, sorry.”)

                                   In my experience, 100% remote work is more inclusive for people with families. At my previous job, a group of the techies would often go to the pub after work (multiple times a week). While everyone was welcome, I usually went home to my family. I don’t begrudge people going out and enjoying the company of their colleagues, but it felt like some important plans or decisions formed at the pub, and I felt excluded from that.

                                  1. 1

                                    it felt like some important plans or decisions formed at the pub

                                    Absolutely. People who are in the same room end up chatting and making decisions, even if not on the clock. It’s a strength of local work that has yet to be replicated in a remote setting.

                              1. 4

                                Personal: Just moved! Setting up a robust network for my personal office and media centre in the living room (the girlfriend is very happy about this).

                                 Work: Writing out a pros-and-cons list for rewriting this entire backend service as a monolith instead of 45 microservices (AWS Lambdas, specifically) tied to API Gateway. This comes from the fact that Node.js 6.10 is deprecated on AWS Lambda and we need to upgrade to implement new features. If anyone has done something like this and has some advice, please feel free to share it with me! It’ll be greatly appreciated.

                                1. 1

                                  Just curious, is the primary reason for moving away from Lambda the nodejs runtime version?

                                  1. 1

                                     The runtime is what made us realize we need to upgrade each Lambda. There are some other concerning factors, like there being no way to deploy the entire stack on our local development machines to debug or develop new features. The runtime also plays a factor in fixing existing bugs, as we aren’t clear on how to test and debug this one Lambda without updating its runtime. On top of all of this, my colleague and I are unfamiliar with the stack, as it was built and designed by two previous employees who have since left the company.

                                     It’s a single-endpoint RESTful service that requires the maintenance of about 40 Lambdas behind API Gateway. It uses a ridiculous amount of duplicated code (I’ve heard of Layers, but I’m not sure how they actually work), and these are big, heavy functions (often many objects and methods implemented within each Lambda) that don’t seem to fit the use case of the Lambda service.

                                     So the question is: do we rebuild the platform on a new runtime with the same infrastructure and the same roadblocks, or do we rebuild the platform in a way that is comfortable for me and my colleague - as a monolithic application on an EC2 instance proxied into by the API Gateway?

                                1. 3

                                   We got our 200/200 Fibre installed today, so I would really like to get the ethernet run done, at least to my desk.

                                  Otherwise more work on Koalephant packages, and day to day client work.

                                  1. 1

                                     Hey, just curious, what is Koalephant?

                                    1. 1

                                      My company.

                                  1. 2
                                    • Redoing my personal website
                                    • Updating my Resume/CV
                                    • Playing DnD
                                    1. 2

                                      Hey, I’ve been trying to create a personal website as well.

                                      Currently it’s just a blog without any real content.

                                      rafikhan.io

                                      I’d love to take a look at yours as well for inspiration.

                                      1. 1

                                        Totally. I ended up just linking the domain to my blog for now and utilizing that. This way I have something with relatively active content and a way to connect with me.

                                        I’m going to be actively designing a full site in Sketch over the course of the next few weeks. My main concern though for this round was just having more content and a better way to connect with me.

                                        https://sneakycrow.dev/

                                    1. 9

                                      Glad to hear you are doing what you enjoy and getting paid for it. The industry definitely needs more dedicated, self-directed people.

                                       I’m curious to hear about your experience writing for DigitalOcean. Did you just apply and get accepted? What is the process like?

                                       You should learn about algorithms and data structures though: it’s not just helpful, it’s also a very interesting subject.

                                       And so are types: a good type system is really about expressiveness with automated checking, rather than just the compiler preventing you from doing things it thinks make no sense. Languages with modern type systems like PureScript or Elm are gaining popularity in the JS ecosystem, and some classic languages can be cross-compiled to JS: Facebook Messenger is written in OCaml, for example. For a teaser, https://perl.plover.com/yak/typing/notes.html has a live example of a type checker finding an infinite loop.

                                      1. 8

                                        Regarding DigitalOcean, they found some articles I wrote on my own site and asked me if I’d be interested in writing for them, so I agreed to do the “How to Code in JavaScript” tutorial series. As for algorithms and data structures, (and types) I am interested in learning them; they’re on the top of my learning in public list. I’ve been playing around with TypeScript as well, to get familiar with more strict typing.

                                        1. 5

                                          I started learning typed languages with Elm and I found the experience really nice, even though I eventually “outgrew” it, mostly due to lack of interesting features and feeling like it’s kind of a dead end.

                                          OCaml/ReasonML is great but not very beginner-friendly atm, though the community is growing and doing a great job at making it more approachable. TypeScript also has a lot of good points, but I don’t think one can use it to its full potential without “being immersed” in a fully typed language first.

                                           So I definitely recommend Elm if you wanna get started with types and all that, but I wouldn’t invest in it for the long term.

                                          1. 1

                                            I was considering learning Go to get more familiar with a typed language.

                                            1. 13

                                               Go doesn’t really have much of a type system… If you’re mainly looking to learn Go, by all means do it, but to really learn typed languages I suggest you look elsewhere.

                                              1. 4

                                                Go types aren’t really anything to write home about, and there are special rules for builtins that you don’t get to use (and therefore learn about) properly. If you’re comfortable with JS, I’d suggest TypeScript. You can add it one file at a time, and it’s easy to tell it to shut up and trust your word if you’re doing something magic it can’t grok, or you’re just mid-way through converting a module.

                                                1. 2

                                                   Go’s approach to interfaces is pretty uncommon and worth understanding.

                                                  1. 2

                                                    I agree on both counts. Doesn’t make Go a good place to start getting a handle on typed systems in general though, IMO. And to me, the most interesting thing about Go interfaces is the “static duck typing” aspect, which is exactly how TypeScript interfaces work, and is why you can migrate to TS a module at a time, using module-private typedefs even for inter-module communications. TS will happily let you do that, and still tell you when your structures aren’t compatible, which means you don’t have to have a single source-of-truth for any system-wide object shapes until you’re ready to do so.

                                                2. 2

                                                   A quick example: in Go, the usual approach to functions that may fail is to return a pair of the actual value and an error value that can be nil.

                                                   value, err := doThings()
                                                   if err != nil {
                                                   ...
                                                   }
                                                  

                                                  The caller must always remember to actually check the error.

                                                  In ML/Haskell, where you can have a “sum type” that can have multiple variants carrying different values, there are types like this:

                                                  (* 'a and 'b are placeholders for "any type" *)
                                                  type ('a, 'b) result = Ok of 'a | Error of 'b
                                                  

                                                   You cannot unwrap an (Ok x) without explicitly handling the Error case; if you don’t, you get glaring compiler warnings (which you can turn into errors):

                                                  let result = do_things () in
                                                  match result with
                                                  | Ok value -> do_other_things value
                                                  | Error msg -> log_err "Bad things happened"; do_other_things some_default
                                                  

                                                   If you have a bunch of functions that may return errors, you can sequence them with a simple operator that takes a value and a function (the function must also return the Ok|Error type). If the value is an Error it just returns it, but if it’s (Ok x) then it returns (f x).

                                                   let (>>=) x f =
                                                     match x with
                                                     | Error _ as e -> e
                                                     | Ok value -> f value
                                                  
                                                  (* if at least one of these functions returns Error, res will be Error *)
                                                  let res = do_risky_thing foo >>= do_other_thing >>= do_one_more_thing
                                                  

                                                   (Note: OCaml also has exceptions, and they are used just as widely as this approach; Haskell uses this approach exclusively.)

                                              2. 2

                                                That learning in public list is really cool, what an awesome idea. Also, thanks for sharing this article :)

                                              3. 1

                                                I’ve been trying to learn more about expressive type systems as opposed to compilers complaining about nonsensical code.

                                                Could you please recommend some resources for further reading?

                                                1. 2

                                                   “OCaml from the Very Beginning” is a very nice book, and it’s not very expensive. For the non-strict way, http://haskellbook.com is good, but expensive.

                                                   If you want something free of charge, it’s a more difficult question. Stay away from “Learn You a Haskell”; it’s very bad pedagogy. Robert Harper’s “Programming in Standard ML” (https://www.cs.cmu.edu/~rwh/isml/) is great and free, but it’s in, well, Standard ML, the Latin of typed functional languages. You will have no problem switching to another syntax from it, but while you are in the SML land, you are on your own, with essentially no libs, no tooling, and not many people to ask for help. Tutorials on https://ocaml.org are good but not very extensive.

                                              1. 2

                                                Wow. You’ve accomplished and learnt so much! I’m definitely bookmarking this post and I hope to work through and learn a lot from your journey.

                                                Thank you for posting. Very inspirational!

                                                1. 1

                                                   Just finished a blog post about Full Automation; I’ll probably do some more bugfixing on gambe.ro this afternoon.

                                                  1. 1

                                                    Do you have a link? I’d love to read the post!

                                                    1. 1

                                                       I do, but it’s in Italian: https://write.as/chobeat/la-piena-automazione-spiegata-al-mio-microonde

                                                       I might translate it to English though; it’s very short.

                                                  1. 3

                                                    I’m having some difficulty understanding the problem this solves. Can someone give me a use case for this?

                                                    1. 2

                                                      There are a bunch of constructions that are common in dynamic languages which can’t be tidily expressed in most type systems.

                                                      This library implements tools to let many of those be type annotated without being too verbose.

                                                       If you have a background in strongly typed languages, those constructions would seem nonsensical; nevertheless, they are common.

                                                      1. 1

                                                         It allows one to manipulate/compute/change types so that you can have higher type safety, thus making TS more flexible.

                                                      1. 2

                                                        At work I’m adding code to embed some extra meta-data in the portable globe files we “cut” from Google Earth. Currently when our Android plugin imports a portable file it has to scan all of the imagery and terrain data packets looking for some metadata like boundaries, min/max zoom levels, and a few other bits of information. Needless to say, “walking” the whole file structure is a performance problem with larger files, so our solution is to pre-compute the metadata we need and embed that in the file at creation time. It’s been nice to actually write code after weeks of manual testing and bug fixing, and I’m learning a lot about the portable cut file format and the process for creating them, which is a lot of fun.

                                                        It’s a day off today, though, so I’m going for a longer bike ride, and then incorporating some recent upstream changes to Blend2D into my Common Lisp binding. Right now I can’t read or write files, so it’s critical to get the new APIs included. The downside to writing a binding to a pre-release library is that I seem to spend more time tracking changes and tweaking/fixing the binding than I do using it, but the recent changes are an improvement, and I’m learning the API as it evolves, so it’s really not too bad.

                                                        For the rest of the week, I’d like to get back to the animations I was creating with the Blend2D bindings. And I have some bike maintenance to do once some parts arrive.

                                                        I’ve also started going for daily walks with a new neighbor friend of mine. One downside of working from home is that it’s easy to stay in, and it’s nice to have the external motivation to go out and talk to somebody face to face.

                                                        1. 1

                                                           I’m trying to pick up Common Lisp in my free time. I’ve realized that working on bindings is something I may have to do often if I try making a more practical application.

                                                           Are your bindings available online? If so, I’d love to learn from them.

                                                          1. 1

                                                            You may be surprised - after about 5 years of CL I’ve only had to create my own bindings a couple of times. The cl-autowrap and CFFI packages make it relatively simple to write C bindings. C++ is a different story, and I’m not sure there’s a great way to create those, especially for template heavy libraries.

                                                            These Blend2D bindings are using cl-autowrap, which uses c2ffi and clang to generate bindings from a header file. The downside of cl-autowrap is that it creates (and exports) bindings to every function and type found while parsing a header file, including system functions and types.

                                                            I didn’t want to export all of those from the blend2d package, so I created a nested package named blend2d.ll which uses cl-autowrap and exports everything, and then another package, blend2d, which selectively exports functions from blend2d.ll. There may be a better way to do it, but this is working okay for now.

                                                            An alternative to cl-autowrap is to use CFFI directly. It’s easier to use for small libraries, or situations where you only need a handful of foreign functions. I used this technique a while back to write an incomplete binding to ZBar, a barcode scanning library.

                                                        1. 3

                                                          I’m writing my blog with create-react-app. So far I’m having a lot of fun working out the subtleties of the design and what I want my blog to be like. Also brainstormed a bunch of ideas for blog posts.

                                                          I’m also working on putting out a stable release of https://nhooyr.io/websocket

                                                          And some other top secret stuff :)

                                                          1. 3

                                                            I looked through the readme of the project and I just wanted to say I really appreciate that you had an entire section dedicated to justifying why the library is being written and a comparison to existing libraries.

                                                          1. 3

                                                             Trying to juggle multiple projects while squeezing out a minimum viable Mitogen release supporting Ansible 2.8. Azure Pipelines is being an asshole, so I’ve downed tools for the evening.

                                                            1. 2

                                                              Thanks for your work on Mitogen! I’ve started using Ansible at work this month and it’s been a real joy to use partly thanks to Mitogen.

                                                              1. 1

                                                                What is Mitogen? I tried looking through their website but couldn’t really grok it.

                                                                It seems like either an extension to ansible or an alternative runtime?

                                                              1. 16

                                                                If you want a good experience, the laptop needs to be sold to you as a “Linux laptop”, with the explicit promise that it has an OS with drivers that have been tested and pre-installed. Surprisingly few laptop makers are doing this (yet?).

                                                                Linux works a lot better on my ThinkPad x270 than it ever did on the Dell XPS 13 “Developer Edition” (~5 years ago), which was sold with Ubuntu. Lenovo may not support Linux, but a lot of Linux developers use ThinkPads, so they tend to be well supported.

                                                                 The Dell XPS, on the other hand, was just a crappified Ubuntu install with proprietary drivers and applications to make it work. I’m not even sure why they installed all the binary crap, since Arch Linux also seemed to work for everything except bluetooth and audio on HDMI out (which also never worked well on Ubuntu; I did get that working eventually), so I just installed that over Ubuntu after I ran into an apt-get bug that was never fixed in the Ubuntu LTS (“stability”).

                                                                Maybe things have improved in the meanwhile, but “sold as Linux laptop” does not automatically equal “good Linux experience”.

                                                                once, across a reboot, the entire settings panel (which is an app) just… vanished. I had to research what it’s called and reinstall it from apt. It was more funny than annoying.

                                                                apt has the horrible habit of “helpfully” removing packages it thinks you no longer need. Removing package A may also remove vaguely related package B, even though B is not a dependency of A. There are even cases where installing a package can remove other packages, or uninstalling can also install packages. The logic can be really opaque and hard to grok.

                                                                1. 8

                                                                  apt has the horrible habit of “helpfully” removing packages it thinks you no longer need.

                                                                  Weird Debian/Ubuntu stuff like this is why I never recommend apt-based distros anymore.

                                                                  1. 11

                                                                    Hello!

                                                                    For a moment there, after reading your comment, I wanted to fold my keyboard like a taco.

                                                                    I am pretty confident that it wasn’t your intent… :)

                                                                    …Anyway… FYI, there is no such thing as Debian/Ubuntu. Debian is a thing and Ubuntu is a thing and they are distinct.

                                                                     Debian is a very old, well designed, and respected GNU distribution. Dependency management is hard (I mean, NP-complete!), but engineers in the Debian project know that, and they care. They worked out a system of rules for keeping the dependency graph clean. Beyond that, they make efforts to teach new maintainers how to understand those admittedly complex rules. They work hard to tame the chaotic sea of packages as much as possible. May their beards be long and tangle-free forever!

                                                                     Ubuntu is… popular. I’ve used it before. My best friend still does, and asks me for help with his computer frequently. One of my pet peeves is that a lot of individual little packages depend on big meta packages that depend on gigs of desktop environment stuff. It’s almost impossible to run Ubuntu without the default environment installed. See, you don’t have to use it, but Ubuntu puts so little care and feeding into their dependency graph that you end up in nonsense situations…

                                                                    Both Debian and Ubuntu are “dpkg-based”. apt-get and apt and aptitude are front-ends.

                                                                    dpkg does as it is told. Two things tell it what to do: the front-end and the package dependency graph (let’s call that “the repo”).

                                                                    I assume that all contemporary package managers that aren’t broken will “do as they are told”. So, the problem with Ubuntu is their package graph and maybe their front-end tool.

                                                                    My front-end never auto-removes anything, though it does remind me that I can run a command to remove the things that nothing depends on. I use Debian and apt-get. When I manually invoke the auto-remove feature, it has so far never removed anything that was still needed, on account of that artfully human-curated dependency graph in the repos…
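
                                                                    For the curious, the mechanism behind that reminder is a per-package auto/manual flag that the front-ends maintain:

                                                                    $ apt-mark showmanual   # packages you asked for explicitly
                                                                    $ apt-mark showauto     # packages that came along as dependencies
                                                                    $ apt-get autoremove    # offers to remove only “auto” packages nothing depends on, and asks first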

                                                                    1. 2

                                                                      The only time apt has ever proposed removing packages it deemed no longer needed is when I had installed those packages in support of an out-of-distro package, e.g. a downloaded .deb archive or something from a repository which I subsequently removed. It was my experience with apt versus rpm (this was before Red Hat thought up something like yum) that made me settle on Debian and Debian-based distributions. While building rpm packages was (and possibly still is; it has been a while since I last built one) generally easier than building equivalent .deb packages, the robustness of a .deb system managed by apt was far higher than that of an .rpm system.
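
                                                                      That matches how the auto flag behaves: dependencies pulled in for a sideloaded .deb stay marked as automatic, so once the package or its repository goes away they look orphaned. Marking them manual quiets the proposals; a sketch with hypothetical names:

                                                                      $ sudo dpkg -i vendor-tool.deb      # install a downloaded .deb (hypothetical)
                                                                      $ sudo apt-get -f install           # let apt fetch its missing dependencies
                                                                      $ sudo apt-mark manual libvendor0   # keep a dependency even after the vendor repo is gone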

                                                                      1. 1

                                                                        What do you recommend instead?

                                                                        I have a lot of criticism about apt/dpkg-based distributions, but the alternatives seem to be consistently worse.

                                                                        1. 6

                                                                          Void Linux.

                                                                          1. -8

                                                                            Void’s package manager and build system have been written from scratch.

                                                                            … in C.

                                                                            I think I’ll skip trying this. Using C or C++ in 2019 is just incredibly poor judgement.

                                                                            1. 11

                                                                              I would much rather my system have a fast and straightforward package manager written in lean C than something like Java or Python. Calling his choice of C “incredibly poor judgement” comes across as patronising and kind of elitist, and generally in poor taste.

                                                                              1. 4

                                                                                Well, the first commit to it was in 2009.

                                                                                1. 1

                                                                                  I’ll comment that using C in 2019 can be the right choice for certain low-level tasks, like firmware, important system libraries, languages, or operating systems.

                                                                                  As for XBPS, it was written in 2009 by an ex-NetBSD dev; C was the most respectable language for system-level stuff at that point.

                                                                              2. 1

                                                                                Solus.

                                                                            2. 3

                                                                              Maybe things have improved in the meanwhile, but “sold as Linux laptop” does not automatically equal “good Linux experience”.

                                                                              Another thing to note is that “sold as Linux laptop” only tends to imply “good Linux experience” for as long as the company continues to provide support.

                                                                              1. 2

                                                                                apt has the horrible habit of “helpfully” removing packages it thinks you no longer need

                                                                                … but it will never remove any packages you installed…

                                                                                1. 4

                                                                                  That’s the theory, but in practice it will also do really surprising things with packages you explicitly installed; and I’d say this isn’t a good model in the first place. I don’t have any examples at hand, as I no longer run an apt-based machine, but here’s another old example which illustrates the kind of broken behaviour:

                                                                                  $ apt-get install consolekit:i386
                                                                                  
                                                                                  Reading package lists...
                                                                                  Building dependency tree...
                                                                                  Reading state information...
                                                                                  The following packages were automatically installed and are no longer required:
                                                                                    python-mutagen python-mmkeys python-cddb
                                                                                  Use 'apt-get autoremove' to remove them.
                                                                                  The following extra packages will be installed:
                                                                                    docbook-xml libck-connector0:i386 libpam-ck-connector:i386 libpam0g:i386
                                                                                    libpolkit-gobject-1-0:i386 sgml-data synaptic
                                                                                  Suggested packages:
                                                                                    docbook docbook-dsssl docbook-xsl docbook-defguide libpam-doc:i386 perlsgml
                                                                                    doc-html-w3 opensp dwww deborphan
                                                                                  Recommended packages:
                                                                                    rarian-compat
                                                                                  The following packages will be REMOVED
                                                                                    acpi-support aptdaemon apturl colord consolekit dell-recovery
                                                                                    gnome-bluetooth gnome-control-center gnome-power-manager gnome-system-log
                                                                                    gnome-user-share hplip indicator-datetime indicator-power indicator-sound
                                                                                    jockey-common jockey-gtk landscape-client-ui-install language-selector-gnome
                                                                                    libcanberra-pulse libck-connector0 libnm-gtk0 libpam-ck-connector
                                                                                    manage-distro-upgrade nautilus-share network-manager-gnome policykit-1
                                                                                    policykit-1-gnome printer-driver-postscript-hp pulseaudio
                                                                                    pulseaudio-module-bluetooth pulseaudio-module-gconf pulseaudio-module-x11
                                                                                    python-aptdaemon python-aptdaemon.gtk3widgets python-aptdaemon.pkcompat
                                                                                    sessioninstaller software-center software-properties-gtk
                                                                                    ubuntu-system-service ubuntuone-control-panel-common
                                                                                    ubuntuone-control-panel-qt ubuntuone-installer update-manager
                                                                                    update-notifier xul-ext-ubufox
                                                                                    The following NEW packages will be installed
                                                                                    consolekit:i386 docbook-xml libck-connector0:i386 libpam-ck-connector:i386
                                                                                    libpam0g:i386 libpolkit-gobject-1-0:i386 sgml-data synaptic
                                                                                  0 to upgrade, 8 to newly install, 46 to remove and 0 not to upgrade.
                                                                                  Need to get 3,432 kB of archives.
                                                                                  After this operation, 20.6 MB disk space will be freed.
                                                                                  Do you want to continue [Y/n]?
                                                                                  

                                                                                  It removed my wireless drivers like this once :-/ The easiest way to get them back was to run the Dell recovery stuff :-(

                                                                                  I’ve been told that apt is the new apt-get; I don’t know how well it does in comparison, as all of this happened before apt existed.

                                                                                  1. 4

                                                                                    There is a lot to unpack here.

                                                                                    It appears that you’re working with multiarch. That’s more complex than average.

                                                                                    Second, there does not appear to be a package named dell-recovery in the Debian repos. Was it there in the past, or is it a third-party package? I bet it depended on specific versions of some packages, rather than depending on something like “>= version 1.2.3”. This is super-common among third-party packages, because the authors don’t know what to expect from those packages in the future and they fear an upstream upgrade breaking their own package…
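
                                                                                    As a sketch of the difference (the package name and versions here are invented), the control stanza of such a third-party package tends to look like the first Depends line below, where Debian policy prefers the second:

                                                                                    # fragile third-party style: exact pins that break on any upstream bump
                                                                                    Depends: libfoo (= 1.2.3-1)
                                                                                    # Debian-policy style: a lower bound that survives upgrades
                                                                                    Depends: libfoo (>= 1.2.3)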

                                                                                    Finally, that package consolekit has been removed from Debian, so I can’t figure out how to check the reverse dependencies… But I can tell you it was in the admin section. Those packages are all… I dunno, “low level” or “fundamental”. It appears that you asked your 64-bit Debian to install the 32-bit version of a fundamental package.
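
                                                                                    (For anything still in the archive, the reverse dependencies are a one-liner to check, e.g. with a package from the removal list above:)

                                                                                    $ apt-cache rdepends policykit-1   # what depends on this package
                                                                                    $ aptitude why policykit-1         # or: why is it installed at all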

                                                                                    1. 1

                                                                                      What would you prefer happen in this case? Just refuse to install?

                                                                                      1. 6

                                                                                        Refuse to install, for example; that’s what most systems do. Or give me an option asking me what to do. Certainly don’t remove critical packages like pulseaudio, gnome-control-center, etc. No other system I’ve used tries to be “smart” like this.

                                                                                        Computers are really dumb, and algorithms like this doubly so. In attempting to do “the sane thing”, apt-get is more likely to leave a system in an unusable state than the reverse.

                                                                                        1. 2

                                                                                          I understand your point that maybe it should force you to be explicit and remove conflicting packages yourself. It does, however, ask you if you want to continue after it has told you what it plans to do. Also, often if you say no it’ll offer an alternate solution to the conflict that may be more palatable.
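
                                                                                          That back-and-forth is most visible in aptitude, whose resolver proposes fixes one at a time; roughly:

                                                                                          $ aptitude install consolekit:i386
                                                                                          The following actions will resolve these dependencies: …
                                                                                          Accept this solution? [Y/n/q/?]

                                                                                          Answering “n” there makes the resolver propose its next candidate solution.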

                                                                                          1. 7

                                                                                            It does, however, ask you if you want to continue after it has told you what it plans to do

                                                                                            Yeah, but the output is needlessly long and noisy, and has a general “wall of text” feel to it. It’s easy to miss things, especially if only a few packages are being removed (instead of a whole bunch).

                                                                                            Here’s how it could look:

                                                                                            $ apt-get install consolekit:i386
                                                                                            
                                                                                            The following packages will be REMOVED
                                                                                              acpi-support aptdaemon apturl colord consolekit dell-recovery
                                                                                              gnome-bluetooth gnome-control-center gnome-power-manager gnome-system-log
                                                                                              gnome-user-share hplip indicator-datetime indicator-power indicator-sound
                                                                                              jockey-common jockey-gtk landscape-client-ui-install language-selector-gnome
                                                                                              libcanberra-pulse libck-connector0 libnm-gtk0 libpam-ck-connector
                                                                                              manage-distro-upgrade nautilus-share network-manager-gnome policykit-1
                                                                                              policykit-1-gnome printer-driver-postscript-hp pulseaudio
                                                                                              pulseaudio-module-bluetooth pulseaudio-module-gconf pulseaudio-module-x11
                                                                                              python-aptdaemon python-aptdaemon.gtk3widgets python-aptdaemon.pkcompat
                                                                                              sessioninstaller software-center software-properties-gtk
                                                                                              ubuntu-system-service ubuntuone-control-panel-common
                                                                                              ubuntuone-control-panel-qt ubuntuone-installer update-manager
                                                                                              update-notifier xul-ext-ubufox
                                                                                            
                                                                                            The following NEW packages will be installed
                                                                                              consolekit:i386 docbook-xml libck-connector0:i386 libpam-ck-connector:i386
                                                                                              libpam0g:i386 libpolkit-gobject-1-0:i386 sgml-data synaptic
                                                                                            
                                                                                            Need to download 3,432 kB; 20.6 MB disk space will be freed.
                                                                                            8 to install, 46 to remove
                                                                                            
                                                                                            :: WARNING: this operation will REMOVE packages!
                                                                                            
                                                                                            Do you want to continue [y/N]?
                                                                                            

                                                                                            So much cleaner, and the default is now “no”, as it’s an unexpected, dangerous operation. The warning text should probably stand out (bold, standout attr, colour, whatever your taste prefers).

                                                                                            This is getting a bit off-topic, but command-line interfaces are user interfaces every bit as much as graphical desktop and web apps are. They’re something that needs to be thought about and designed; ideally they should be tested, and tweaked based on how people are actually using them.

                                                                                            apt-get is a good example of a terrible user interface in many ways. It’s the command-line version of a chaotic ERP product, or a 2001-era webapp that has only grown since. Sure, it may be powerful and the underpinnings are probably good, but the UX is … not ideal.

                                                                                            apt has since replaced apt-get; I don’t know if it does better, as I haven’t used any of this in a while, but this post suggests it may not :-(

                                                                                            1. 4

                                                                                              They already put REMOVED in all caps. Also, after you do a few thousand apt-get invocations, you certainly notice when the “removed” stanza is present versus when it is not.

                                                                                              Debian was the first OS I knew about that had a reliable, sane package manager.

                                                                                              If the solution to the problem is to reinstall packages, then the system is NOT broken. If the solution to the problem is to reinstall the OS from scratch, then the system is BROKEN.

                                                                                              I hope everybody understands that in this circumstance, Debian was preventing the system from becoming broken. Small price to pay…

                                                                                          2. 2

                                                                                            In attempting to do “the sane thing”, apt-get is more likely to leave a system in an unusable state than the reverse.

                                                                                            It has never personally happened to me, but I know a bunch of people new to Linux who have broken an Ubuntu system with an apt command. This stuff never seems to happen on Arch/Fedora/whatever; apt just seems to have a propensity for breaking things if you aren’t careful.

                                                                                        2. 1

                                                                                          But why are those packages being removed? Why remove packages when installing a package? What’s the rationale?

                                                                                          1. 1

                                                                                            There is a conflict between the new package (or its dependencies) and already-installed packages (or their dependencies), so the package cannot be installed on the current system. Instead of simply refusing, apt suggests a possible system on which the package could be installed (by removing some current packages) and asks you if that is what you want to do.
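
                                                                                            (If you want to inspect that proposed end state without committing to it, the whole transaction can be simulated first:)

                                                                                            $ apt-get -s install consolekit:i386   # -s / --simulate prints the full plan without changing anything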