1. 1

    Just using a level meter app on my phone, with the phone resting right next to the keyboard:

    Ambient noise: 31 dBA

    Typing on Macbook Pro 2015: up to 40 dBA

    Typing on Cherry MX browns with o-rings, metal backplate, bamboo case: up to 50 dBA

    Measurement obviously not professional, but I do type ‘quietly’ (not much finger movement and I just press to the activation point rather than bottoming out the keys)

    1. 2

      They didn’t even mention my bugaboo: USB cables that only carry power, not data. They’re packed in with (some) devices that only support USB for charging, and I seem to have accumulated a number of them.

      So sometimes when I retrieve a cable from my stash to connect something to my computer, I have the frustrating experience of not being able to connect to the device, checking that I’ve installed drivers / put the device in “connect mode”, etc. until I remember that this must be one of those dud cables…

      1. 5

        Ouch, I sympathize. I tend to store these sorts of cables separately, in the trash can.

        1. 1

          I currently carry device chargers with me and avoid plugging my devices into other USB sockets to charge. This is mostly because I’m concerned about security.

          As charging over USB becomes more common and available, I think ‘power only’ cables will be useful, though I might have to buy them in a particular, garish colour to avoid the nightmare you’re describing.

          1. 2

            You can buy the “USB condom” adapters pretty cheaply, and use them to turn any cable into a power-only cable when you’re going to be charging from an untrusted port. I always used to bring a few with me back in the days when traveling was a thing.

          2. 1

            I am dying today trying to plug a monitor in via USB-C, and apparently every single C->C cable I have is power only? Or maybe it’s this busted MBP? Or maybe it’s just the phase of the moon?

            I hate USB more than almost anything else in computing.

            1. 1

              Oh man, have you ever used SCSI? Macs used to use it, pre-USB. It wasn’t hot-pluggable, so you had to shut down everything first. And it sometimes required a “terminator” plug at the end of the chain, and there was weird voodoo about that which honestly I’ve blocked out of my memory. Plus the cables were as big around as a finger, stiff, and quite expensive.

              1. 1

                I used to tend a big SGI (a Power Challenge!) with weird flaky terminator problems. It was the worst. Super fun having a whole academic department go offline because of some dumbass $1,200 hunk of resistors and plastic in that edgy shade of mauve.

          1. 1

            Photo

            Personal desktop on the left: Ryzen Hackintosh. Contains stuff. I’m not ‘into’ hardware and just want it to work, not cost too much money, and be quiet.

            Big LG screen is lovely and close enough to ‘retina’ that it works for me.

            Laptop on the right is MacBook Pro 15” 2015. I have one from work and I also own the same model, so I have a nice consistent setup.

            Big Dell screen is lovely but not at all retina. The huge horizontal space is great for getting stuff done.

            The split keyboard with orange letters and yellow cables is a Redox from Falba Tech with one of their lovely stained bamboo cases.

            Random other stuff that’s in shot:

            • Apple Watch charging stand in the style of an 80s Mac. I’m a pragmatist and not at all a Mac fetishist but these are great stands and very cheap.
            • Bright light panel bodged onto a floor lamp’s stand. Really makes it feel like I’ve had more outdoor time than I actually get during the short winter days.
            • Good hi-fi amp.

            Not in shot:

            • The floor. This is an IKEA standing desk and I almost always keep it at standing height. It took months to get used to standing all day but I’m glad I did.
            • The sky. When standing at the desk I see mostly sky out of the window. This makes me feel good.
            • Good hi-fi floor standing speakers. I love listening to music when I’m doing work that doesn’t require too much concentration.
            • Software. I use lots of it. Nothing particularly unusual. If it works with Vi keys, that’s an advantage, but I try not to get too attached to any particular software.
            1. 2

              There’s another message in the thread about the possibility of adding a garbage collection mechanism to atoms: https://erlang.org/pipermail/erlang-questions/2015-October/086385.html

              It references a lock-free implementation of the atom table in SWI-Prolog, located on GitHub and with plenty of documentation, which I think is an interesting read on its own if you’re interested in the atom table’s internal workings.

              1. 3

                I’m not sure I understand why garbage collecting atoms would be useful in Erlang. The number of them in use shouldn’t ever grow continuously or at a high rate, and should be naturally limited.

                I’m assuming that there is no dynamic creation of atoms going on, of course. Users of Erlang (or Elixir, or Ruby…) know that they aren’t just there for free O(1) comparison and shouldn’t be created dynamically for most uses.

                There are legitimate reasons for creating them dynamically (e.g. easy mapping of dynamic data to static) but all those I can think of would come with an expectation that they’d be limited in the same way as if they’d come from code.

                Perhaps SWI-Prolog (with which I have no experience) has a different definition of what an atom is, and therefore collecting them makes sense in its world?

                1. 4

                  Normally yes, atoms not being garbage collected is not a problem. The usual places you’d find dynamically created atoms are in locally-registered processes and named ETS tables. Both gen_server:start/4 and erlang:register/2 require you to use an atom as the name, and ets:new/2 also requires an atom as the table name.

                  If you’re spawning processes on demand, and registering them, you need to dynamically create new names for them. Of course, you could start reusing names if you unregister them later, but you’d need to keep track of that somehow. For ETS tables, there’s usually no good reason to create them on demand, at least, I haven’t seen it yet.

                  I assume this was done for good reason at the time (efficiency, etc., although the global registry allows any term as the name), but it can be a problem. The obvious solution would be to allow any term as the name, but there are other ways to tackle the problem without creating atoms out of thin air, like using an ETS table or persistent_term as the registry, which allows any term as the key.

              1. 2

                Just a strange feeling: we can’t obtain even the simplest things, like the time and the current weather (degrees), from our advanced technologies. So what good do they serve?

                1. 9

                  It’s not a lie though, you are obtaining the time. It’s just rounded to the nearest second instead of rounded down, which is a pretty intuitive thing.

                  1. 3

                    I hadn’t noticed until it was pointed out and it’s great. It always feels ‘wrong’ when you start a timer at (to use the example) 5s and the first thing you see is 4.something. I can imagine there were arguments about implementing this though.

                    1. 5

                      There could be an argument in favour of rounding up too. It starts with a full second at 5, then the very moment you see 0, it’s over. Very intuitive.

                      1. 9

                        Yeah, I’m pretty sure this is how most people speak a countdown out loud. “Five…four…three…two…one…” and then “zero” or “time” or “go” means it’s over. You wouldn’t say “zero” if there was still any time left.

                        1. 1

                          This makes the most sense to me. If they aren’t showing milliseconds, it ‘ending’ on zero seems far more reasonable, e.g. https://i.imgur.com/Y1AlKks.gif

                        2. 2

                          I’ve always used this rounding-up approach. The article touches on it but dismisses it as not useful for hours and minutes. Of course, in a rounding-up approach you only ever want to round up the smallest unit you are displaying, and continue to round down the others.

                          There is some philosophical argument about showing the rounded up time, however. If the timer shows 1s you might be inclined to believe there is at least a whole second left. With the rounding down approach, this holds true. For the rounding to nearest and rounding up approaches, however, the timer shows something different. Showing a value of 3s in those cases only indicates that you have no more than 3s left before the countdown expires.

                          My intuitive understanding of what a timer means is more in line with the presentation given by rounding down, but it is definitely strange to think that hitting 0 is not the end. I suppose that’s why I prefer the rounding-up approach in the end, even if I find it mildly misleading.
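
                          The three behaviours being debated are just floor, round-to-nearest, and ceiling applied to the remaining time; a quick Python sketch (helper names are mine):

```python
import math

def display_floor(remaining):
    """Round down: a 5 s timer shows 4 almost immediately after starting."""
    return int(remaining)

def display_nearest(remaining):
    """Round to nearest: the timer starts at 5, but shows 0 with time left."""
    return round(remaining)

def display_ceil(remaining):
    """Round up: the display reads 0 exactly when time runs out."""
    return math.ceil(remaining)

# Observing a 5-second timer just after start, mid-run, and near the end:
for remaining in (4.98, 3.2, 0.4):
    print(display_floor(remaining), display_nearest(remaining), display_ceil(remaining))
# 4.98 -> 4 / 5 / 5
# 3.2  -> 3 / 3 / 4
# 0.4  -> 0 / 0 / 1
```

                          Note the last row: only the floor display reads 0 while time remains, and only the ceiling display hits 0 at the moment of expiry.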

                    2. 4

                      I can get the current time and weather from my technology fine. What are you talking about?

                    1. 1

                      No animation here, but a more powerful CPU is generally in use, so it’s easy enough to apply effects reflecting the user’s preference at app / desktop component startup without noticeable delay:

                      https://api.kde.org/frameworks-api/frameworks-apidocs/frameworks/kiconthemes/html/classKIconEffect.html

                      For a while I used desaturated icons which lit up with colour when hovered over - and some semi-transparency, if I remember correctly.

                      It was interesting implementing some of the effects, as it can be surprising what makes them run faster or slower on different CPUs.

                      Also it was interesting to learn that linear operations on the R, G and B channels don’t line up with human perception. It’s not a huge problem for icon effects, though if you’re playing with a photo it would be much more obviously ‘wrong’.
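
                      Concretely, sRGB pixel values are gamma-encoded, so doing linear math directly on channel values operates on the wrong scale; blending correctly means decoding to linear light first. A sketch of the standard sRGB transfer function in Python:

```python
def srgb_to_linear(c):
    """Piecewise sRGB decoding, c in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Inverse: encode linear light back to sRGB."""
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# A naive 50% blend of black and white, done directly on sRGB values:
naive = (0.0 + 1.0) / 2                                                    # 0.5
# The same blend done in linear light, then re-encoded:
correct = linear_to_srgb((srgb_to_linear(0.0) + srgb_to_linear(1.0)) / 2)  # ~0.735
```

                      The two blends land on visibly different greys, which is exactly the kind of ‘wrong’ that shows up when you do this to a photo.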

                      1. 3

                        I have a few difficulties with fluent interfaces:

                        1. It seems like they aren’t as easily ‘discoverable’ as normal ‘set up this complex object and do stuff with it’ interfaces.
                        2. Probably related to the above: I’m never sure if I’ve finished adding all required calls.
                        3. Again probably related: Is ordering important? If so, how should I ensure it’s correct? With a ‘normal’ interface this doesn’t seem hard to be sure about.
                        4. Can / should I call the same method more than once? Sometimes it’s not clear.

                        Perhaps these are all non-issues with careful design (which should be a requirement anyway of course) and the right tooling.

                        Recently I’ve played with SwiftUI, where if you get your calls in the wrong order, your UI goes wrong. In theory you can figure out the right order to make calls from the docs - except the docs are mostly missing, so it’s trial and error until what you have looks right. And hope it doesn’t go wrong again with different inputs.

                        Does anyone have examples of great fluent interfaces? I’d like to find some to use as references on how to do them well.
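
                        One design that sidesteps points 2-4 is to make every call order-independent and accumulative, and to defer all ‘did I set everything?’ validation to a single terminal call. A hypothetical Python sketch (names invented for illustration):

```python
class RequestBuilder:
    """A small fluent builder: order-independent calls, validated at the end."""

    def __init__(self):
        self._url = None
        self._method = "GET"
        self._headers = {}

    def url(self, url):
        self._url = url
        return self            # returning self is what makes the interface fluent

    def method(self, method):
        self._method = method
        return self

    def header(self, name, value):
        self._headers[name] = value   # repeated calls accumulate, by design
        return self

    def build(self):
        # The terminal call is the one place that answers "is this complete?"
        if self._url is None:
            raise ValueError("url() is required before build()")
        return (self._method, self._url, dict(self._headers))

# Calls can come in any order; build() catches anything missing:
req = RequestBuilder().header("Accept", "text/html").url("https://example.org").build()
```

                        It doesn’t solve discoverability, but it at least makes ordering irrelevant and turns “did I finish?” into a single runtime check.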

                        1. 15

                          Bad title, but actually interesting read. Except this part:

                          A good webcam, in my opinion, is a device that provides a decent sound quality

                          Maybe I’m just realistically pessimistic but years of experience with people and their shitty audio setups made me swear to never ever use a room mic myself for a video or audio call.

                          1. 11

                            When using a MacBook Pro for video calls I can get away with using the inbuilt mic as others tell me I come across very clearly and with no background noise / echo etc. I’ve noticed the same with other callers on Apple devices.

                            Those on our company’s (expensive) Dells all need to wear headsets to be heard properly and avoid noise / echo.

                            I don’t know what Apple is doing with their mics and processing of the audio but it works.

                            1. 11

                              The Apple mic array is doing a ton of digital signal processing behind the scenes – identifying voice-like noise and “steering” to it using phased array techniques. That stuff is really cool but expensive to develop.
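
                              The core trick is easy to sketch even if the production version isn’t: delay-and-sum, the simplest beamformer, time-shifts each mic’s signal so that sound from the chosen direction adds coherently while off-axis sound partially cancels. A toy pure-Python illustration with made-up numbers:

```python
import math

RATE = 48_000   # sample rate in Hz (assumed)
FREQ = 1_000    # test tone in Hz

def tone(n, delay):
    """A 1 kHz tone as heard by a mic whose wavefront arrives `delay` samples late."""
    return [math.sin(2 * math.pi * FREQ * (i - delay) / RATE) for i in range(n)]

def delay_and_sum(mic_a, mic_b, steer):
    """Advance mic_b by `steer` samples, then average the two channels."""
    return [(a + b) / 2 for a, b in zip(mic_a, mic_b[steer:])]

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

mic_a = tone(4_800, 0)
mic_b = tone(4_800, 12)           # the wavefront hits mic B 12 samples later

aligned    = rms(delay_and_sum(mic_a, mic_b, 12))  # steered at the source
misaligned = rms(delay_and_sum(mic_a, mic_b, 0))   # not steered
```

                              Steering at the source preserves its full level, while the unsteered sum attenuates it; the expensive part Apple does is deciding, continuously, which delays count as “the voice”.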

                              1. 2

                                It sounds cool, but in reality it is not that hard to develop. And they only have three mics, which gives only very crude beam-steering ability.

                                1. 2

                                  I wonder why other device / OS manufacturers aren’t providing something similar then. Perhaps it’s encumbered by patents or is much harder to implement if you don’t have your hands on both the hardware and OS. Windows drivers should have enough access though, I’d have thought.

                                  1. 1

                                    Crude is good enough to distinguish between the voice directly ahead of the camera and other sources. If two people are directly in front of the camera, the chance is good that both intend to be heard.

                              2. 2

                                I use one and it works (and I’m conscious enough of these things to have spent almost €1k on conference quality improvements, and some of it was my own money). Location is everything. My microphone is far from my keyboard, somewhat directional, and I’m alone in the room.

                                If I wanted to build a good-quality product I’d probably spend a lot of effort on using two or three good microphones, plus driver code and training/calibration tools, to be able to boost the voice and suppress noise sources (typing on the keyboard, construction work in the neighbouring offices, neighbours fighting, whatever). And I’d forget about 4k entirely.

                                I’m sure 4k resolution is useful for something, but being able to count the hairs on people’s chins during video conferences is not likely to help the conference achieve its purpose.

                                1. 2

                                  That and probably no current videoconferencing system allocates users enough bandwidth to transmit 4k anyway, even if their internet connection suffices for it.
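
                                  Back-of-the-envelope numbers support that (the colour depth and compression ratio here are rough assumptions):

```python
# Rough 4k30 bandwidth estimate; all figures are ballpark assumptions.
width, height, fps, bits_per_pixel = 3840, 2160, 30, 24

raw_bps = width * height * bits_per_pixel * fps   # uncompressed 4k30
h264_bps = raw_bps / 200                          # ~200:1, a plausible H.264 ratio

print(f"raw: {raw_bps / 1e9:.1f} Gb/s, compressed: {h264_bps / 1e6:.0f} Mb/s")
# prints "raw: 6.0 Gb/s, compressed: 30 Mb/s"
```

                                  Even at a well-compressed ~30 Mb/s, that’s roughly an order of magnitude more than conferencing services typically budget per stream.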

                                2. 2

                                  My solution to all this is that I bought a good microphone. I have 2 now, a Blue Yeti and a Rode Podcaster. The former is a condenser mic, so it’s a bit finicky about room-noise pickup. The latter is a shotgun mic, which is way better for meetings. I bought an arm for them as well.

                                  My only problem with all this is I kinda want the whole thing in a package, so I’m tempted to buy a new arm with in-arm cable management (I HATE CABLES DIE DIE DIE), then get an RPi 4 with the camera module v2, make the whole setup work off the boom arm, and set up OBS there to act as a USB camera passthrough for, say, 720p. I’m also wondering if I should solder up an LED light array by the camera, powered off USB too. Lighting seems to be the biggest issue with most live meeting setups.

                                  Then I can just use something like xpra to connect to OBS on the Pi, and all the stupid software on any system I plug this crap into will “just work” and think of the entire thing as a mic/camera, but I’ll have sane audio filters.

                                  I’ve started down this path, actually, but I’m not entirely sure I want to run the entire race. It all seems like a ton of silly work for little gain. Depends how bored I get this winter.

                                  1. 1

                                    Is OBS running on a Pi really good for that? I mean, I’ve thought about building a webcam out of a Pi and an HQ Camera module (and at my suggestion a friend did so), but he used Ethernet/RTSP from the Pi, and I thought about just getting a UVC stream from the Pi and using OBS on the computer it’s plugged into.

                                    I guess the real question I have is, how well does OBS run on a Raspberry Pi?

                                    1. 1

                                      Great question. I’m not entirely sure, to be honest; I have a spare 4B 8 GB I can test on. But my fallback option is this: https://www.lattepanda.com/products/3.html

                                      For 1080/720p it should be enough, and my goal is to do all this over USB as a device, not over Ethernet/Wi-Fi, which is a huge PITA. Wi-Fi is the only thing I’ll use, and I’ll use it for having xpra run OBS so I can disconnect/reconnect to things. I might just abandon the Pi as the backbone and use x86 instead, because it’s a lot less of a pain to maybe do something that could be booted/installed off of PXE.

                                      You could run OBS on the host computer as well instead of on the SoC; my goal here was more to have “OBS in a box hooked up to a camera and mic”. It won’t connect super fast if I power it off the host bus and it will have to boot, but the tradeoffs seem worth it.

                                  2. 1

                                    Bad title, but actually interesting read.

                                    I’m open to suggestions on improving the title :)

                                    Maybe I’m just realistically pessimistic but years of experience with people and their shitty audio setups made me swear to never ever use a room mic myself for a video or audio call.

                                    It is actually possible to make a good audio setup with a room mic. Of course, in some cases it is very hard, e.g. if someone is sitting in a cramped open-plan office, but this product is supposed to be used basically at home, where it is much easier to do.

                                    1. 1

                                      I didn’t say it’s impossible, but I seem to have exclusively worked with people who don’t care about others in the past. I’m regularly the only one using a headset with a microphone; some people at least have earbuds with a non-crappy mic, but environmental noise or static is more common than not. And yes, maybe I’m just grumpy because nobody seems to care a bit.

                                      Regarding the title: I think “good” is very subjective here, especially given the many different use cases. Yes, my ThinkPad one is horrible, but for team meetings where I have the people on a 14” laptop screen, the one in e.g. 4-5-year-old MacBooks is totally fine. Also I kinda like the Logitech ones (forgot the model) that were actually just 70-100€ and maybe catered to streamers. No, it’s not 4k, but I honestly don’t see the need for that; many people I know never watch this on big enough screens to even notice.

                                      1. 2

                                        the Logitech ones (forgot the model) that were actually just 70-100€ and maybe catered to streamers

                                        Maybe the C920 HD? They’re excellent, especially for their price. Not sure about the quality of the built-in mic, I always use a headset, but it’s overall a very solid product.

                                      2. 1

                                        I’m open to suggestions on improving the title :)

                                        Buzzfeed it! Top 10 reasons you can’t buy a good webcam, #10 will shock you! I think it’s fine as-is, though, but I am also annoyed that getting good video and audio even for Zoom stuff is so much more effort than I’d expected. I appreciate the people that put in the effort on calls now, though. There’s so much background noise that could be eliminated with a filter through OBS or some other audio processing, which would spare my headphones from delivering every wash cycle of their clothes. (Also, why do people not mute when not talking, or do laundry when they’re not on a meeting? But I digress.)

                                        1. 1

                                          Top 10 reasons you can’t buy a good webcam, #10 will shock you!

                                          Ughh, thanks, I hate it :)

                                          So much background noise that could be eliminated with a filter through OBS or some other audio processing

                                          When in a meeting from a PC - sure, it’s possible. When someone is in a meeting from a phone, it’s both really hard to do anything custom and generally a lot noisier, because someone is walking along a busy street, or standing next to a grinding coffee machine… or doing their laundry, as you say.

                                          Seems like the only solution here is to convince people to buy some noise-cancelling headsets for their phone.

                                      3. 1

                                        At work, I have a fairly expensive VoIP phone with a speakerphone mode that I use exclusively as a microphone. It works very well in my office. In my home office, I’ve been using the microphone built into my Surface Book 2. That also works very well, though it works far better with Teams than Signal. As far as I can tell (having not looked at the code), Teams is doing some dynamic measurement to detect the latency between the sound and the speaker. This is really apparent when you use a wireless setup (for social things, I sometimes use the WiFi display functionality of my Xbox to send video and audio to the living room screen and speakers - this has about half a second latency, which Teams is fine with but Signal can’t handle at all).

                                        My webcam actually does have a microphone but I’ve not tried using it.

                                      1. 6

                                        I use an IDE, which indexes the project and provides instant method/function signatures, jump to definition, find usages, refactoring, etc etc.

                                        1. 1

                                          Yes - and Vi keys in the IDE.

                                          1. 1

                                            Not so much.

                                        1. 1

                                          On the contrary, I want a low-budget, low-res cam for chatting with my developer friends. I bet there is a good market for this, but zero supply in my region.

                                          1. 2

                                            Many years back I bought a webcam based purely on it being advertised as ‘UVC’, and in theory therefore operable without third-party drivers. It was extremely cheap and always worked perfectly - though of course the picture was merely adequate.

                                            Maybe cheap UVC cams still exist and might be an option if you’re looking for something usable but cheap.

                                            1. 2

                                              They might be sold as ‘USB security cameras’ :)

                                                1. 2

                                                  The “Show me webcam” from this link looks interesting. Will try next year, hopefully.

                                                  1. 1

                                                    Nice, I had missed that link. Thanks.

                                              1. 21

                                                Agree that CPU and disk (and maybe RAM) haven’t improved enough to warrant a new laptop, but a 3200x1800 screen really is an amazing upgrade I don’t want to downgrade from.
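
                                                For a sense of the jump, assuming a 13.3-inch panel (a guess on my part), the pixel density works out roughly like this:

```python
import math

def ppi(w_px, h_px, diagonal_inches):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(w_px, h_px) / diagonal_inches

hidpi = ppi(3200, 1800, 13.3)   # ~276 PPI
fhd   = ppi(1920, 1080, 13.3)   # ~166 PPI for a 1080p panel of the same size
```

                                                Well past the point where individual pixels stop being visible at laptop distance, versus clearly below it.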

                                                1. 6

                                                  I love my new 4k screen for text stuff. Sadly, on Linux it seems to be a pain in the ass to scale this appropriately and correctly, even more so with different resolutions between screens. So far Windows does this quite well.

                                                  1. 4

                                                    Wayland can handle it ok, but Xorg doesn’t (and never will) have support for per-display DPI scaling.

                                                    1. 3

                                                      I don’t see myself being able to afford a 4k screen for a few years but if you just scale everything up, what’s the advantage?

                                                      1. 4

                                                        The text looks much crisper, so you can use smaller font sizes without straining your eyes if you want more screen real estate. Or you can just enjoy the increased readability.

                                                        Note: YMMV. Some people love it and report significantly reduced eye strain and increased legibility, some people don’t really notice a difference.

                                                        1. 2

                                                          I use a much nicer font on my terminals now, which I find clearer to read. And I stare at terminals, dunno, 50% of my days.

                                                          This is a Tuxedo laptop (I think it’s the same whitelabel as System76 sells), which doesn’t feel expensive to me.

                                                          1. 1

                                                            Hah, I’m also using a Tuxedo one, but the font is far too tiny on that screen to work with every day.

                                                            1. 1

                                                              Which Tuxedo laptop has 4k?

                                                              1. 1

                                                                I can’t find them anymore either. They used to have an option for the high-res display. I got this one a bit over a year ago:

                                                                1 x TUXEDO InfinityBook Pro 13 v4  1.099,00 EUR
                                                                 - QHD+ IPS matt | silber/silber | Intel Core
                                                                i7-8565U
                                                                ...
                                                                Summe: 1.099,00 EUR
                                                                
                                                                1. 1

                                                                  How was your driver experience? I’ve had to send mine back twice due to problems with the CPU/GPU hybrid stack. Though mine is now 3? years old.

                                                                  1. 2

                                                                    Drivers are fine, it all simply works. Battery could last longer.

                                                                2. 1

                                                                  Yeah, OK. I just ordered a Pulse 15. Also wanted a 4k display but didn’t see it anywhere. Thanks.

                                                              2. 1

                                                                Well, you have a much sharper font and can go nearer if you want (like with books). I get eye strain over time from how pixelated text can appear to me in the evening. Also you can watch higher-res videos, and all in all it looks really crisp. See also your smartphone; mine is already using a 2k screen, and you can see how clean text etc. is.

                                                                You may want to just get a 2k screen (and maybe 144 Hz?) as that may already be enough for you. I just took the gamble and wanted to test it. Note that I probably got a model with inferior backlighting, so it’s not uniform around the edges when I’m less than 50 cm away. I also took the IPS panel for the superior viewing angle, as I’m using it for movie watching too. YMMV.

                                                                My RTX 2070 GPU can’t play games like Destiny at 4k 60 FPS without 100% GPU usage, and FPS drops the moment I’m doing more than walking around. So I’ll definitely have to buy a new one if I want to use that.

                                                              3. 1

                                                                I also just got a new 4k monitor, and that’s bothering me also. It’s only a matter of time before I fix the glitch with a second 4k monitor… Maybe after Christmas

                                                                1. 2

                                                                  I ended up doing that. It sucks, but Linux is just plain bad at HiDPI in a way Windows/macOS is not. I found a mixed DPI environment to be essentially impossible.

                                                              4. 2

                                                                This is where I’m at too. I’m not sure I could go back to a 1024x768 screen, or even a 1440x900 one. I have a 1920x1200 XPS 13 that I really enjoy, which is hooked up to a 3440x1440 ultrawide.

                                                                Might not need all the CPU power, but the screens are so so nice!

                                                                1. 2

                                                                  And the speakers.

                                                                  I love my x230, but I just bought an M1 Macbook Air, and god damn, are those speakers loud and crisp!

                                                                  1. 1

                                                                    For me it’s also screen size and brightness that are important. I just can’t read the text on a small, dim screen.

                                                                    1. 1

                                                                      Oh I’d love to have a 4k laptop. I’m currently using a 12” Xiaomi laptop from 2017 with 4GB of RAM and a 2k display. After adding a Samsung 960 evo NVMe and increasing Linux swappiness this is more than enough for my needs - but a 4k display would just be terrific!

                                                                    1. 6

                                                                      This guy seems to have a couple misapprehensions:

                                                                      As far as I know, refurbished should mean that the computer was used as a sample machine somewhere, e.g. in an Apple Store. When they didn’t need it anymore, it got checked for defects and sold for a cheaper price as a second hand product in a very good shape. My theory is that the computer was almost unused by the time I’ve bought it, but perhaps it was open with the display shining all the time in the shop, so durability of the pixels suffered a lot.

                                                                      No, refurbished just means used but restored to like-new condition, e.g. reformatted, factory reset, physically cleaned, ideally delivered in original box or identical one w/ original discs and manuals, etc.

                                                                      Anyway, in addition to that, the [old] machine’s performance [a 2015 MacBook Air] is a total potato these days. I have always appreciated its portability (size and weight of a magazine) and silence (has no fan)

                                                                      The 2015 MacBook Airs do have a fan. I’m not sure how he came to be confused about that.

                                                                      1. 4

                                                                        Where did you find that? All I see is the opening sentence:

                                                                        It’s MacBook 12” early 2015, which I bought a few years ago, second hand, refurbished.

                                                                        1. 1

                                                                          Huh… I did see that bit, but read it as MacBook Air early 2015 at the time, and I don’t remember the size being mentioned (in fact, I specifically thought, “maybe he has the 11” Air and that doesn’t have a fan” so I checked around online before posting).

                                                                          I guess I misread the “12”” as “Air”.

                                                                        2. 1

                                                                          Maybe it never came on loud enough that they noticed it, or perhaps they made a mistake and meant the MacBook (12”).

                                                                          1. 3

                                                                            Maybe it never came on loud enough that they noticed it,

                                                                            Then they sure didn’t do any serious work on it, because the non-M1 Air’s fan spins up all the time and is loud (we’ve had several Airs in the household over the past decade). The MacBook 12” is indeed fanless.

                                                                            1. 2

                                                                              It’s MacBook 12” early 2015

                                                                              They state that earlier in the article. The commenter must have glossed over it.

                                                                          1. 7

                                                                            FWIW in Oil you can put this at the top of your script and get sane behavior:

                                                                            shopt -s strict:all 
                                                                            

                                                                            Or if you want to also run under bash:

                                                                            shopt -s strict:all 2>/dev/null || true
                                                                            

                                                                            Then keep on writing the way you normally do. It’s sort of like “guard rails”. You’ll get better and earlier errors, and then you can run your script with bash too.

                                                                            http://www.oilshell.org/release/latest/doc/oil-options.html

                                                                            Though the optparse thing is a hole in bash (and currently in Oil), I’m discussing how to plug that hole right now:

                                                                            https://oilshell.zulipchat.com/#narrow/stream/264891-oil-help/topic/Passing.20a.20map.20to.20a.20proc.20as.20reference (requires login)

                                                                            (bash has getopts, but it leaves a lot to be desired)

                                                                            1. 2

                                                                              What’s the reason that sane behaviour isn’t the default? Compatibility? Is it possible to make it the default?

                                                                              Oil shell looks pretty interesting.

                                                                              1. 5

                                                                                Good question, it is the default under bin/oil, but not bin/osh!

                                                                                • bin/oil is for when you’re writing something new and don’t care about compatibility.
                                                                                • bin/osh runs existing shell scripts. But it also gives you an upgrade path into saner behavior. (It’s also much more stable than bin/oil at the moment, although they are technically the same binary)

                                                                                I guess I should write this in the docs somewhere… It’s explained in the blog but with a lot of context.

                                                                            1. 1
                                                                              set ts=2 sw=2 et number
                                                                              syn on
                                                                              
                                                                              1. 5

                                                                                Isn’t this generalized by enumerations?

                                                                                I really liked the contrast of the original API vs the one using chained methods and the bitmasks. Drives the message home right away.

                                                                                1. 2

                                                                                  Yes, and C# has an attribute called ‘Flags’ you can apply so that tooling can know these values are meant to be ORed together and help out.

                                                                                  Example from Microsoft:

                                                                                  [Flags]
                                                                                  public enum Days
                                                                                  {
                                                                                      None      = 0b_0000_0000,
                                                                                      Monday    = 0b_0000_0001,
                                                                                      Tuesday   = 0b_0000_0010,
                                                                                      Wednesday = 0b_0000_0100,
                                                                                      Thursday  = 0b_0000_1000,
                                                                                      Friday    = 0b_0001_0000,
                                                                                      Saturday  = 0b_0010_0000,
                                                                                      Sunday    = 0b_0100_0000,
                                                                                      Weekend   = Saturday | Sunday
                                                                                  }
                                                                                  
                                                                                  1. 1

                                                                                    Is it? Bitmasks can be combined with bitwise OR; I’m not aware of similar enum combinations in the implementations I’m familiar with.

                                                                                    1. 1

                                                                                      C lets you do that with enums; from https://en.cppreference.com/w/c/language/enum :

                                                                                      Enumerated types are integer types, and as such can be used anywhere other integer types can, including in implicit conversions and arithmetic operators.

                                                                                       enum { ONE = 1, TWO } e;
                                                                                       long n = ONE; // promotion
                                                                                       double d = ONE; // conversion
                                                                                       e = 1.2; // conversion, e is now ONE
                                                                                       e = e + 1; // e is now TWO
                                                                                      

                                                                                      This works in clang 12:

                                                                                      #include <stdio.h>
                                                                                      
                                                                                      enum asdf { A = 1, B = 2, C = 4 };
                                                                                      
                                                                                      int main() {
                                                                                      	enum asdf x = A;
                                                                                      	x = B | C;
                                                                                      	printf("%d\n", (int) x);
                                                                                      }
                                                                                      
                                                                                      > clang --version
                                                                                      Apple clang version 12.0.0 (clang-1200.0.26.2)
                                                                                      Target: x86_64-apple-darwin20.1.0
                                                                                      Thread model: posix
                                                                                      InstalledDir: /Library/Developer/CommandLineTools/usr/bin
                                                                                      > clang -o bitmask_enum bitmask_enum.c
                                                                                      > ./bitmask_enum
                                                                                      6
                                                                                      
                                                                                      1. 2

                                                                                        The lack of type- and range-checking can be considered a drawback. B | C isn’t a valid asdf value since the declaration has no item equal to 6; its type is actually int. C++ is stricter and won’t let you assign that back to an asdf variable without a typecast.

                                                                                        This makes using enums as bit-sets annoying in C++-compatible code. Apple’s frameworks work around this by making the bit-set type actually a typedef for unsigned, not the enum type itself.

                                                                                        1. 1

                                                                                          The lack of type- and range-checking can be considered a drawback.

                                                                                          So much of typing in C can and should be considered a drawback, and I wouldn’t shed a tear if software written in C went off into the sunset.

                                                                                          Swapping stdio.h for cstdio and running the same file through clang++ does in fact error, which, yeah, makes sense:

                                                                                          clang: warning: treating 'c' input as 'c++' when in C++ mode, this behavior is deprecated [-Wdeprecated]
                                                                                          bitmask_enum.c:7:8: error: assigning to 'enum asdf' from incompatible type
                                                                                                'int'
                                                                                                  x = B | C;
                                                                                                      ~~^~~
                                                                                          1 error generated.
                                                                                          

                                                                                          And C++ is kind of ridiculous about what you gotta do to use a scoped enumeration (c.f. https://en.cppreference.com/w/cpp/language/enum ) as content for a bitmask:

                                                                                          #include <cstdio>
                                                                                          
                                                                                          enum class jkl { A = 1, B = 2, C = 4};
                                                                                          
                                                                                          int main() {
                                                                                          	enum jkl x = jkl::A;
                                                                                          	int f = (int)jkl::B | (int)jkl::C;
                                                                                          	printf("%d\n", f);
                                                                                          }
                                                                                          

                                                                                          In conclusion, ¯\_(ツ)_/¯

                                                                                          1. 3

                                                                                            It’s another case where C++ gave us something good (enum classes) but failed to add language support for making it pleasant to use. (See also: variants; functors and iterators 2003-2010; and apparently async/await in C++20.)

                                                                                            You can fix this, but it requires writing a bunch of operator overloads on your enum class to add the necessary type-casts. Not rocket science, but why wasn’t this added to the standard library?

                                                                                        2. 1

                                                                                          works in clang 12

                                                                                          It should work in any conformant C compiler.

                                                                                    1. 4

                                                                                      Advent of Code 2020. I forget how much fun it is until it starts again. Sharing solutions and learning from others is great.

                                                                                      1. 2

                                                                                        Question from a noob: are we supposed to upload our work to GitHub? I saw some other lobsters posting links. If so, is there a place to register your repo once you upload it?

                                                                                        1. 3

                                                                                          It’s up to you whether you do or not; I like to, for archiving’s sake. I don’t know of any central place to register your repo, but having one can be useful for posting links to your solutions in discussions.

                                                                                          1. 4

                                                                                            This was a thing a few years ago

                                                                                            https://github.com/a-red-christmas

                                                                                            Not sure who’s responsible for it now. I remember it being pretty clunky.

                                                                                            The simplest way to share code is probably to stick a link in one’s lobste.rs bio and let people know that way.

                                                                                            I’ve personally set up a bunch of scripts that allow me to make short blog posts for each entry and stick them in one place: http://gerikson.com/blog/comp/Advent-of-Code-2020.html

                                                                                          2. 3

                                                                                            There’s no requirement to upload your work. But feel free to share it somewhere if you like!

                                                                                            1. 1

                                                                                              If you log in to adventofcode.com with github, you can choose to let it link to your github profile.

                                                                                              No option for sr.ht :(

                                                                                              1. 1

                                                                                                Well, if they allow for the officially hosted sr.ht login then they might feel obligated to also support all self-hosted instances.

                                                                                                1. 2

                                                                                                  Yes indeed, though I was actually hoping for the ability to use any URL as my link, not to use my sr.ht login on AoC. I understand why this might lead to some ‘interesting’ links from the AoC, though!

                                                                                            1. 3

                                                                                              Since lobsters is publicly viewable, I don’t think that leaderboard is very private.

                                                                                              1. 14

                                                                                                “Private” in this case just means it’s a namespace with a selection of usernames, with internal scoring.

                                                                                                I.e., in the global leaderboard (not visible by default), my buddy and I might be separated by thousands of entries. A private leaderboard just excludes everyone who isn’t on it.

                                                                                                1. 8

                                                                                                  Private, not secret.

                                                                                                1. 5

                                                                                                  Let’s go for my dream desktop then.

                                                                                                  A user interface which doesn’t rely on good luck and processing speed to avoid annoying the user.

                                                                                                  • Don’t steal focus: you took a while to launch, the user got bored, and then you popped up just as they pressed Return/Space/Esc and cancelled / agreed to something / who knows what.
                                                                                                  • Don’t get in the way of the scrolling / task switching / actual work the user was doing just because something else seemed more important than keeping the interface snappy.
                                                                                                  • Don’t cause typing latency to drop below … whatever a good threshold is.

                                                                                                  Design for this system being owned and used by one person. A personal computer. If you don’t have to worry about user A being naughty and accessing user B’s RAM/files/sockets, perhaps that frees up time to work on making the thing pleasant to use.

                                                                                                  1. 4

                                                                                                    Don’t cause typing latency to drop below … whatever a good threshold is.

                                                                                                    I’m not sure that would be a welcome feature. :D - Maybe above a threshold?

                                                                                                    1. 2

                                                                                                      Oh good call!

                                                                                                    2. 3

                                                                                                      I believe BeOS/Haiku at least tries to optimize for “being owned and used by one person”.