1. 1

    Do games that were available on both DOS and Amiga not tend to have much worse music on DOS? The Amiga came with pretty nice hardware for playing music whereas the IBM PC frequently did not.

    1. 3

      Probably depends on the game and the timing (and on the quality of the port).

      When the Amiga 1000 was introduced in 1985, the difference was MASSIVE and its sound capabilities crushed everything else into the ground. Not sure how things stood in 1994-1995, when the PC had generally taken over and the Amiga was at its Deathbed Vigil (1994).

      For example, Sensible World of Soccer 96/97 was released in 1996. Compare that to Need for Speed and FIFA 97, which were also released in 1996 :)

      Not to even mention the 3Dfx revolution: 3Dfx released its Voodoo Graphics chip in 1996 as well.

      1. 2

        This. The IBM PC had a pile of random add-on cards with varying capabilities. The built-in PC speaker is barely worth mentioning, since it is basically a one-bit channel that had to be driven entirely in software, and not many people wanted to try to write a software synthesizer for it. I have seen games that output human voice through it, but it didn’t work well. By the early-to-mid ’90s, though, the tech had more or less settled on the AdLib card, which provided an FM synthesizer, and the Sound Blaster, which provided a PCM sampling interface.
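
        A toy sketch (not taken from any actual game) of why one bit is so limiting: the most naive way to drive the speaker from PCM data is to threshold each sample to on/off, which throws away all amplitude information. The `one_bit` function here is purely illustrative.

```python
import math

RATE = 8000  # samples per second, a typical early-90s voice rate

def one_bit(samples, threshold=128):
    """Crush unsigned 8-bit samples to a 1-bit on/off stream."""
    return [1 if s >= threshold else 0 for s in samples]

# A quiet and a loud sine tone become identical after thresholding:
quiet = [int(128 + 10 * math.sin(2 * math.pi * 440 * t / RATE)) for t in range(80)]
loud = [int(128 + 120 * math.sin(2 * math.pi * 440 * t / RATE)) for t in range(80)]
assert one_bit(quiet) == one_bit(loud)  # all dynamics are lost
```

        Real software went further with pulse-width tricks to fake intermediate levels, which is how sampled speech was possible at all, just not pleasant.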

        1. 3

          The PC speaker was it until 1989, and it took heroic effort to get anything decent out of it: Sentinel Worlds 1 was a childhood delight and even had a technical explanation section in the manual by the sound programmer.

          By 1995 there was just enough almost-but-not-quite-entirely compatible hardware that a DOS game’s sound-setup screen might ask you to pick your card from a long list of options.

          And yes, companies might compose hardware-specific soundtracks. Sound Card Database combined the multiple versions from Dune 2 into one and found some slight differences: https://www.youtube.com/watch?v=o-Q_UO6Hp7U

        2. 1

          Mostly I’m recalling that on the DOS side I had an “Adlib” sound card which could barely do MIDI, and the MIDI voices it did have weren’t exactly amazing. :)

      1. 2

        I took delivery of a mass-production Atreus keyboard this week and it’s the last weekend of housesitting for a friend, so it’s typing practice and petting dogs the whole way down.

        1. 3

          Staying inside with the windows shut, due to the smoke plume entering Seattle, and hoping that the temperature comes down.

          Besides that, working on my first web photo gallery since canceling my Flickr Pro account. This has brought up a NixOS irritation with trying to overlay a newer version of Zola: the interaction between overrideAttrs and buildRustPackage.

          1. 9

            Thanks for the detailed writeup. Seems like the machine still needs some more polish in the audio department. Having a lot of low-level options via ALSA sounds interesting to me, actually. As someone who produces music on Linux, I prefer to give ALSA a good kicking until it works, rather than dealing with Pulseaudio’s latency issues. Is it possible to record the internal stereo mix directly, i.e. can you select it as a recording source without resorting to jackd trickery?

            1. 7

              To be honest, “you have to use ALSA instead of Pulse to get audio to play reliably” is not a Pinebook-specific problem; both my ThinkPads are the same way.

              1. 7

                And I have the opposite experience with both my thinkpads. Audio ‘just works’ with pulseaudio, including multisource concurrent playback, auto-switching from internal mic/speaker to external headset on plug in, bluetooth headsets, etc. None of that works out of the box with alsa on those systems.

                1. 5

                  Agreed, wasn’t trying to suggest otherwise. That said, “reliable” maybe isn’t the right word. Pulseaudio works fine for just listening to music or watching video, and is usually much less of a hassle to set up. When latency matters however (music production, emulation), ALSA performs much better in my experience.

                  1. 5

                    Pulseaudio works fine for just listening to music or watching video

                    This has not been my experience. Of course every machine is different, but I used to have it cutting out constantly until I removed it entirely. Frequently plugging in my headset would reset the volume so that one ear was muted and the other wasn’t. Ever since uninstalling pulse, on my machines ALSA has been 100% reliable.

                    1. 2

                      I haven’t had problems with pa since switching to Arch from Fedora. I think the experience varies a lot based on what distro you use.

                      1. 1

                        On my old black plastic MacBook (3,2) running Arch back in the day, PulseAudio was what made Linux audio start to be nice. ALSA worked, but in an https://xkcd.com/963/ sort of way.

                  2. 3

                    The Linux ecosystem in general needs a cleanup in the audio department.

                    1. 9

                      Charts like that have been making the rounds for ages, and I always feel they’re a bit disingenuous because most people don’t have half that stuff installed, and use even less of the stuff they do have installed.

                      For most people, it’s just PulseAudio → ALSA → Sound card. Some of the abstractions/libraries on top of that – such as SDL, gstreamer, etc. – add useful features like the ability to decode mp3 files and whatnot. In other words, it’s a lot less messy than that chart makes it seem.

                      (I do have plenty of gripes with the ALSA C API, which is … not very good, but that’s a different matter)

                      1. 8

                        Indeed, these charts conflate the audio system with multimedia libraries (and in the case of the first one, even multimedia applications like VLC). That said, I very much agree that the state of audio on Linux is not great. Sadly everybody seems to be betting on Pulseaudio, which to me seems more like an attempt to cover up the mess than to clean it up.

                        1. 3

                          VLC is included because of one of the cursed parts in the chart - VLC can be used as a playback backend by phonon, which is an audio playback API that is mostly used by KDE software.

                        2. 4

                          (I do have plenty of gripes with the ALSA C API, which is … not very good, but that’s a different matter)

                          Out of curiosity, do you have any references for what a good low-level audio API looks like? It’s something I’ve been wondering about for a while, since audio on Linux is so perennially bad, but while I’m decently familiar with the physics of audio I don’t know much about the low-level details. It seems like it should just be “bytes with the proper format go into buffer at a fixed rate”, which is something that GPU and network drivers have been solving forever…
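
                          For what it’s worth, that “bytes at a fixed rate” model really is the core of it. Here is a stdlib-only sketch (the names and parameters are mine, not any particular API’s) of what one second of 16-bit little-endian mono PCM looks like as a buffer a driver would consume:

```python
import math
import struct

RATE = 48000      # frames per second
FREQ = 440.0      # a concert-A sine tone
AMPLITUDE = 0.3   # keep headroom below full scale

frames = [int(32767 * AMPLITUDE * math.sin(2 * math.pi * FREQ * i / RATE))
          for i in range(RATE)]
buf = struct.pack("<%dh" % len(frames), *frames)

# One second of S16_LE mono is exactly rate * 2 bytes; a consumer that
# drains this buffer too slowly or too quickly produces underruns or
# pitch shift.
assert len(buf) == RATE * 2
```

                          Everything above that layer, i.e. mixing, resampling, and routing, is where the API complexity creeps in.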

                          1. 2

                            It’s not perfect, but in contrast to Linux, I’ve never had audio issues under FreeBSD: https://www.freebsd.org/cgi/man.cgi?query=sound&sektion=4

                            1. 2

                              Thanks, I’ll take a look at it!

                            2. 1

                              Coming in late, but sndio is also worth a look (it’s the default sound system on OpenBSD).

                        3. 2

                          Is it possible to record the internal stereo mix directly, ie. can you select it as a recording source without resorting to jackd trickery?

                          I have no idea, but if you can give me some pointers on how to find out I can try. If it’s worth anything, the full dump from the alsa-info script is here: http://alsa-project.org/db/?f=5363202956bb52364b8f209683f813c662079d84

                          1. 2

                            Thanks. Judging from this output, it is not possible to record the internal mix out of the box. That’s somewhat disappointing. It’s not surprising, considering this has been the norm for off-the-shelf laptops for several years now, but I would have expected an open hardware platform like the Pinebook Pro to fare better.

                          2. 1

                            Just to satisfy my curiosity, why do you want to record the internal mix on the built-in sound card? It’s surely handy in some situations (when you want to record your whole screen, for example) but… personally, for serious music stuff I’ve always used USB sound cards. Playback is probably okay on most decent laptops these days, but recording is something entirely different, even on a super high-end machine. So if I bought a Pinebook Pro I would expect an awfully noisy sound card even for playback (even if I wouldn’t really care).

                            There’s another clue: the issue with the speakers described in the article sounds like noise coming from a bad sound card or amplifier. Broken speakers don’t produce noise like that.

                          1. 2

                            Continuing the job search and trying to convince myself that it’s worthwhile to follow through and finish small personal projects.

                            1. 3

                              I’m in that boat as well. One of my secrets is to have my personal projects in widely-varying fields so that I don’t have to force myself if I’m not feeling it at the time.

                            1. 8

                                Bell Labs has been merged into Nokia; I wonder if there are any p9 devs left there.

                              1. 13

                                None that I’m aware of.

                                That said, 9front is still fairly active. Commits this year:

                                cpu% hg log -d '2020-1-1 to 2020-05-11' | grep '^changeset:' | wc -l
                                242
                                

                                And the last month:

                                cpu% hg log -d '2020-04-11 to 2020-05-11' | grep '^changeset:' | wc -l
                                88
                                
                                1. 5

                                  Could you imagine a Plan 9 Nokia phone?! Actually, I wonder how hard it would be to create a mobile version of Plan 9… the very opinionated GUI that insists on e.g. a middle mouse button doesn’t give me hope.

                                  1. 7

                                    If it hadn’t been a few years too early, Plan 9’s successor Inferno might have been a contender on phones.

                                      1. 8

                                        In addition to hellaphone, there are already people working on running Inferno on the PinePhone.

                                      2. 5

                                        There was an iPAQ version[1]. It apparently used the side buttons for chording, and Plan 9 ships with an on-screen keyboard with handwriting recognition that was written as part of the port.

                                        That said, what exists is not a very good fit for modern touch UIs. The underlying GUI system should work fine, but /dev/gesture would need to be created (could be done easily as a userspace file server), and applications would need to be purpose-written or heavily modified.

                                        [1] ipaq: https://en.wikipedia.org/wiki/IPAQ#/media/File:PocketPC-HP-iPAQ-h2210.jpg

                                    1. 2

                                      “Work”: Because I committed the unforgivable sin of entering my SSN with dashes into my state’s unemployment system, I have to call a helpdesk at random times to get it fixed and hope I get a connection that doesn’t automatically ring through to a “too many calls, goodbye click” message.

                                      So I’ve made my animation tools project my “work” in lieu of any other solid job connections. This now involves:

                                      • relearning Windows configuration to run and study tools like Source Filmmaker in a stable environment.
                                      • writing a Nix derivation for arcan to use it in my daily-driver OS.
                                      • mapping out the Blender 2.8 source with doxygen and other introspection tools.
                                      1. 2

                                        If you are not on the IRC already, poke me there if you run into any issues. Finishing up the next networking transparency bits then the test/record/write/release process.

                                      1. 1

                                        I worked on the Disney Playmation toy-line as the build engineer for the main device firmware. I had a front-row seat to see an entire family of consumer electronic devices being designed, a semi-custom RTOS brought-up on the master unit, and an entire supply-chain worked up.

                                         It’s still a weird feeling that code I wrote was burned to ROMs on a toy that was sold at major stores in the US. Plus I can prove it by knowing the hidden button presses to bring up the toy’s test mode.

                                        1. 2

                                           For quite a while now, I’ve wondered whether we’re going to see CPU designs (or even just extra support) aimed at running WASM bytecode straight on the CPU.

                                          1. 6

                                            Ah, the eternal cycle continues.

                                            1. 3

                                              Yep, it’s feeling almost crispy at this point.

                                          1. 3

                                             Getting more to grips with NixOS on my new desktop: I have my vim plugins+config mostly working, and now I’m moving on to exploring lorri+direnv for project management. Next up is writing a custom derivation for a missing Python library.

                                            I also need to MacGyver a Pi0 into an AP/extender, as the major US ISPs “dropping” bandwidth caps means that wifi is now borderline unusable across my apartment during the day.

                                            1. 1

                                              I recently used lorri and direnv – it’s pretty nice, because most editors have a direnv plugin, so you can sync the state of say, emacs or VSCode with your dev shell.nix.

                                            1. 9

                                               MiniDisc was such a missed opportunity. Sony launched MD Data in 1993, but didn’t let MD Data drives play audio MDs, never managed to get third parties to buy in (and didn’t license the tech to allow third-party drives), so they lost a small market to Zip and LS-120 instead of entirely replacing the floppy disk. When I looked at them, Zip disks (100MB) and LS-120 (120MB) were both about £10/disk, whereas MiniDiscs were about £1.50. At that price, they were still quite a bit more expensive than a floppy (20-30p, though closer to £1 for reliable ones), but in a close enough ballpark that the transition would have been okay. And they were smaller than floppy disks, whereas Zip and LS-120 were similar sizes (a bit bigger).

                                              MD later went to 650MB and 1GB. We could have had 650MB floppy disks in the late ’90s if Sony had valued the computer ecosystem as much as their music business and not been so obsessed with being first-party supplier of everything.

                                              1. 5

                                                A college buddy was an exchange student in Japan around the turn of the century and brought back a MD Walkman and discs which I lusted over just in terms of physical design. Having owned a “portable” Zip drive, MD in the computer ecosystem at that point would have killed Zip dead for me.

                                                More examples of Sony’s NIH habit: Memory Stick and UMD.

                                                1. 1

                                                   MiniDisc was pretty much the Betamax of the ’90s.

                                                1. 6

                                                  I went with the HP Z27, based on the Wirecutter recommendation. It can handle my personal laptop, my work laptop, and my Nintendo Switch without moving cables around.

                                                  1. 2

                                                    Do you use macOS? If so, can you confirm that it allows you to scale it to 1440px retina?

                                                    1. 5

                                                      Checking, option-clicking the scaled resolutions in Preferences shows 2560x1440 as available. Setting it, the monitor stays at 3840x2160. (Overcommunicating here just in case.)

                                                    2. 1

                                                      Do you connect all your devices via USB-C or what do you mean by “without moving cables around”?

                                                      1. 1

                                                         (I went with the Wirecutter’s recommendation for a Z27 as well.) The built-in hub’s USB-C connector is DisplayPort-capable, so between the HDMI port, two DP ports (Mini and full-size), and two USB 3 Type-A connectors, it’s easy to leave cables connected and just hotplug when necessary. The stereo jack is even connected to the audio portion of the hub, so my music is coming through my new desktop’s Radeon.

                                                        1. 1

                                                          That’s doable, but my laptop is too old! I used to move the DVI plug for my monitor from one sort of dongle to another.

                                                      1. 2

                                                        Historical context: this would form the core of SGI’s early systems, as foreshadowed by the “scales to a single chip” comment.

                                                        1. 2

                                                           $work: keeping several interview pipelines open, assuming they can actually move to video interviewing. With the prospect of WFH even if I get a contract, I finally committed to a new desktop system for dev/video-editing/rendering last week, where the “need to run VMs” and “need to compile faster” are fig leaves for the 32GB of RAM and the Ryzen 7 3700. Now to hope the parts get here before Seattle locks the city down even further. Practicing with Execute Program in the meantime.

                                                          $notwork: I have a Switch but it’s impossible to get a copy of Ring Fit Adventure at this point, so Breath of the Wild in between stealthy walks outside. I emptied my closets to find all of the video discs I’d gotten from sales or used-stores, so I think I have enough for at least the next six months.

                                                          1. 8

                                                            My contract was abruptly ended about two weeks ago so I’m already staying home. I haven’t put nearly as much effort into jobhunting though, because I’m pretty sure interviewing is going to grind to a halt.

                                                            1. 2

                                                              Had that happen recently as well. It’s a bit of a pain. Good luck when it comes to looking for a new contract!

                                                            1. 2

                                                               At high school in 1996 or so, the school library had a bunch of DEC420 terminals for the library catalog, along with options for looking up info from WAIS or the Library of Congress. Someone figured out that when the “telnet to LOC” option was chosen, you could press ctrl-] during the connection to drop to the bare telnet prompt, and from there you could go anywhere you wanted. The info quickly spread around, and before long there would be lines of students waiting to use the terminals during lunch hour.

                                                               I’d been reading paper books on the “best places to go on the Internet”. Of these, the section on MU*s appealed to me, since they were “multiplayer text adventures” like the ones I’d played on the C64. I would spend an hour or two every evening using guest accounts to try various ones out. Eventually I wanted to get an account on HoloMUCK, since I was more into building than combat. But a permanent account required an email address. I found out about the Seattle Community Network, and that remained my email address until I entered college (and remained an active account until I lost the password in the early ’00s).

                                                              Of course, it was only that one year that the library terminals remained accessible: the librarians got suspicious of the increased activity and next year the interrupt command was changed to something untypable like “^^”. By that point the family had gotten an account with a local dial-up provider and I could play at home.

                                                              But that year in high school was a formative time for me with the Internet.

                                                              1. 1

                                                                Y2K is fascinating to me as someone born afterward. Such a silly thing looking back, but I’m curious as to why many people found the mythical bug a serious issue. Perhaps it was that knowledge of computers was not yet ‘mainstream,’ and people just didn’t understand computer systems in general?

                                                                1. 27

                                                                  It was a serious issue, and we fixed it.

                                                                  A lot of folks look at the fact pattern as: people said Y2K was a problem; we took them seriously and spent a lot of money addressing it; nothing happened — and therefore there was no problem to begin with. That’s just not the case: there was a problem, and those sums of money solved it.

                                                                   It’s a bit as though the boy cried “wolf!” and the villagers banded together and drove the wolf off, successfully defending their flocks, and then got angry at the boy because the wolf didn’t eat any sheep.

                                                                  What really worries me is that the next Y2K issue won’t be fixed, and will result in death and destruction, precisely because folks think that the first one was a hoax.

                                                                  1. 6

                                                                    This, exactly. A couple weeks ago, my daughter was raving about the cleanliness of the floors in our house, as if this sort of thing happened naturally. I had to remind her that she’s just absent when I spend a lot of time taking care of home things like cleaning the floors. Not so different.

                                                                    1. 4

                                                                      What really worries me is that the next Y2K issue won’t be fixed, and will result in death and destruction, precisely because folks think that the first one was a hoax.

                                                                       I’m not really that worried about that; anyone who knows how computers work would find an argument like “these old machines count time as a 32-bit number of seconds, which overflows in a few years” convincing. When the entire IT department takes the issue seriously, I can only assume the people above them would let them do what they deem necessary to keep the critical systems running. This isn’t really something the general public needs to believe in to fix.
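
                                                                       For the record, that overflow can be pinned down precisely; a quick stdlib check of where a signed 32-bit count of seconds since the Unix epoch runs out:

```python
from datetime import datetime, timedelta, timezone

# The largest value a signed 32-bit time_t can hold, added to the epoch:
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
overflow = epoch + timedelta(seconds=2**31 - 1)
assert overflow == datetime(2038, 1, 19, 3, 14, 7, tzinfo=timezone.utc)
```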

                                                                      Or maybe I’m just naive and people in charge don’t trust their IT staff to know what’s best for IT infrastructure.

                                                                       I’m worried about sporadic failures going forward due to the hacks intended to fix Y2K, though. Some people’s solution to Y2K was to read all two-digit years below 20 as 20xx and all years above it as 19xx, on the theory that 20 years ought to be enough time to fix the issue properly…
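
                                                                       That hack is usually called pivot-year windowing; a minimal sketch (the function name and pivot value here are illustrative, not from any specific system):

```python
def expand_year(yy, pivot=20):
    """Expand a two-digit year: below the pivot reads as 20xx, else 19xx."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_year(5) == 2005
assert expand_year(99) == 1999
# The deferred bug: once real years reach the pivot, they jump backwards.
assert expand_year(21) == 1921
```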

                                                                      1. 3

                                                                        This is why good(and public) history sources are something I will always champion. A post this week comparing these attitudes on Y2K to the 1987 treaty banning CFCs really resonated with me, having grown up witnessing both events first-hand.

                                                                        Even here in Seattle where newcomers love the views: Metro didn’t start as a bus service and many of our beaches were unsafe for swimming until the 80s.

                                                                        1. 2

                                                                          I had no idea it wasn’t a hoax. Thanks for filling me in; I’ll go research it more for myself.

                                                                        2. 6

                                                                          There were some people who were scared that old computers running deep down in cold war nuclear silos might go haywire and launch missiles (I kid you not). I guess it’s the unpredictability of the whole thing that scared people, mostly. Nobody was able to tell exactly what would happen when these date counters would overflow, which kind of makes sense because overflow bugs can cause really strange effects.

                                                                        1. 2

                                                                           Instead of posting a snarky comment about a blog post devoted to “(SMS) MFA is insecure and security people are lazy, just use machine-learning to identify users’ behavior”, I dug out this paper that I cited to an executive at my last job who wanted my security team to “start learning ML!”

                                                                           For the tl;dr crowd, there’s a good high-level presentation from the authors [PDF].

                                                                          1. 12

                                                                             Everyone who is interested in make(1) should check out the (superior) mk(1), a successor written at Bell Labs by Andrew Hume after 20 years of experience and use.

                                                                            There’s a good paper on it here http://www.vitanuova.com/inferno/papers/mk.html

                                                                            And there’s more on the cat-v website

                                                                            1. 7

                                                                               I’ve been using mk for my personal publishing workflows (and some JavaScript) for the past year or so. There’s a nice Go port of mk if it’s not in your system packages: https://github.com/dcjones/mk

                                                                              1. 2

                                                                                There’s also remake which has some nice extras, e.g.

                                                                                $ remake --tasks
                                                                                build
                                                                                fmt
                                                                                lint
                                                                                releases
                                                                                test
                                                                                test-integration
                                                                                test-memory
                                                                                todos
                                                                                update-version
                                                                                upload-releases
                                                                                vet
                                                                                

                                                                                Update: typo

                                                                                1. 2

                                                                                  Actually there are other more superior successors, called GNU Make and BSD Make. Both are pretty capable, and portable!

                                                                                  1. 9

                                                                                    Actually there are other more superior successors

                                                                                    [citation needed]

                                                                                     GNU Make is what I was referring to as the older version; it outright imports many of the problems that make(1) has, including flaws that encourage bad Makefile usage.

                                                                                1. 1

                                                                                   Good write-up. Somewhat corroborating bityard: storage is cheap now, and designs that build on that should be considered. For instance, one could have a few HDs from different manufacturers that store the same data. The software periodically loads, hashes, and compares them, auto-correcting based on 2 out of 3. Alternatively, it doesn’t auto-correct, and the user manually checks the other copies in the rare event a file on the main disk is corrupt. I’d diversify their interfaces, too, based on past experience with USB drivers corrupting my stuff.
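
                                                                                   The 2-out-of-3 scheme above can be sketched in a few lines (the function name and choice of SHA-256 are mine, purely for illustration):

```python
import hashlib
from collections import Counter

def majority_repair(copies):
    """Return (good_data, indices_of_bad_copies) by a 2-of-3 hash vote."""
    digests = [hashlib.sha256(c).hexdigest() for c in copies]
    winner, votes = Counter(digests).most_common(1)[0]
    if votes < 2:
        raise ValueError("no majority; cannot auto-correct")
    good = copies[digests.index(winner)]
    bad = [i for i, d in enumerate(digests) if d != winner]
    return good, bad

# One of three disks has flipped bits; the vote identifies and outvotes it.
data = b"family photos"
good, bad = majority_repair([data, b"bit rot!", data])
assert good == data and bad == [1]
```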

                                                                                   Also, optical media will likely fail differently from the disks, and DVDs are cheap. So I back up the most critical stuff to both HDs and DVDs. The DVDs are also good for off-loading non-critical, less-used stuff that takes up lots of HD space.

                                                                                  1. 2

                                                                                    When I have a chance to build a new backup server I’m morbidly considering an LTO drive, as bulk older LTO cartridges show up regularly in recycle-PC stores around here.

                                                                                    1. 2

                                                                                      Are we to the point of writable DVDs being suitable for archival use? I have a handful of old burnt DVDs that have been kept in jewel cases in a dark, low humidity area and some of them have observable pitting.

                                                                                      Maybe it was a bad batch, but it made me nervous.

                                                                                      1. 2

                                                                                         Good point. Mine tend to last a few years. If you make regular backups, you’ll create new ones long before the old ones go bad. For long-term archiving, you’ll need to periodically move the data to new media. That’s also a good time to compare the hashes on the HDs against the DVDs.