1. 1

    A friend gave me a few surplus ones; they’re great, and it’s easy to flash stock OpenWRT onto them. I’m using one in passthrough mode as a Wi-Fi AP extender to my main OpenWRT router.

    1. 2

      First of all, most of our software technology was built in companies (or corporate labs) outside of academic Computer Science.

      Is this true? This seems like a pretty wild claim.

      The risk-aversion and hyper-professionalization of Computer Science is part of a larger worrisome trend throughout Science and indeed all of Western Civilization

      Why specifically western civilization? I would say this is more of an issue of how capital functions, not because of Western Culture™️

      1. 4

        Bell Labs, Xerox PARC, SRI, BBN invented a pretty jaw-dropping swath of tech in the 60s and 70s: packet switched networks and the ARPAnet (together with MIT), Unix, the mouse and the GUI, online collaboration, text editing as we know it, bitmap displays, Ethernet, the laser printer, the word processor, file servers, (much of) object-oriented programming, digital audio and computer music (Max Matthews)…

        On the hardware side, it was nearly all corporate — Bell Labs, TI, Fairchild, Intel… (though IIRC, Carver Mead at Caltech is the father of VLSI, and RISC was invented at UC Berkeley.)

        1. 2

          To back up snej’s comment, there are several great history books on this period:

          Gertner’s “The Idea Factory” is also a good read, but focuses more on the overall history of Bell Labs.

        1. 3

          Whole lot of nostalgia this week. I was more into the Unreal Tournament mod scene, but it was the same process of discovery and a wide variety of tools. I’ve mentioned time and again that I learned Ruby in order to understand UnrealScript coroutines, which led to a professional Rails job years later.

          I’d also concur that keeping a project silent “until it’s done” is the quickest way to ensure it never gets released (he says, while quietly looking at recently unearthed HDD archives).

          1. 7

            Well, this is indeed a nice blast from the past I wasn’t expecting to see this evening!

            It’s only tangentially mentioned on the third page, but this is a game from 00’s indie darling ABA Games/Kenta Cho, and at the time it convinced college-age me that languages outside of C or C++ were just as capable of making fast graphical games.

            However this was right around the big split between D1/Phobos and D2/Tango, so I never really pursued it any further.

            1. 2

              Most typesetting language efforts seem to fail or be ignored because TeX (and its derivatives) is so pervasive, and so much has been built upon it, that making sense of it all, let alone improving upon it, is a massive undertaking. How does the language work? Where even is the core of TeX in its codebase nowadays? Is it still written in WEB? I remember looking into it a few years ago and I could hardly find my way around.

              I saw a talk by somebody who was reimplementing the core algorithms of TeX in Clojure, and looking at his github profile now I can’t even find the repo anymore.

              1. 2

                I’ve been using Tectonic, which is a reimplementation of the WEB2C version (plus XeTeX) in Rust.

                1. 1

                  It looks like it’s not a reimplementation, but just a wrapper? Does this get you anything compared with something like ConTeXt?

                2. 2

                  There was a reimplementation of TeX in Java; now abandoned, unfortunately. I believe there was also some interest at one point in reimplementing it in OCaml.

                  That said, I believe the modern TeX distributions are written in straight C, not WEB.

                  1. 1

                    I believe there was also some interest at one point in reimplementing it in ocaml.

                    You mean Patoline? It seems to be functional but it doesn’t look very actively developed.

                  2. 2

                    There’s an interesting project called SILE. AFAIU, its author went the crazy hacker way and basically carved a few core libraries out of TeX, glued them together with Lua, and changed whatever else he wanted to his liking, attempting to make it simple and modern. The main reason I think this project hasn’t gotten much more popular (yet?) is that it still lacks math rendering support. I tried to contribute something in this area, but… kinda fizzled away after stumbling over some integration issue that I had no idea how to resolve… I still have this somewhere on my TODO “bucket lists”, but no idea if I’ll ever manage to get back to it…

                  1. 3

                    I’d almost forgotten about the Alf language and extremely pleased to discover from this post that it has a production-capable successor. Kudos to the author for landing a job with it!

                    1. 2

                        Thanks! I thought at first maybe you meant the Algebraic Logic Functional programming language, a predecessor of the Curry language of which I’m also a fan. But then I saw that you linked to the actual Alf in question. Happy to hear you know about it! I’m planning to write more about my experience with bmg and how to use it in production systems, in due time.

                    1. 2

                      The proposal outlined seems hugely over complicated for what is - as the commentary suggests - almost a complete non-issue. I’m surprised it’s generating so much discussion, unless I’m missing something important.

                      1. 3

                        It seems like a real issue to me. How are people supposed to connect to devices on their local network?

                          The past solution has been to use their desktop’s, laptop’s, or smartphone’s web browser to make an HTTP connection, via mDNS or the device’s IP address. This had the great advantages of being simple, universal, and non-expiring. Unfortunately, browsers are increasingly restricting HTTP. Currently, many browsers will show scary warnings every time you try to make such a connection. This is annoying to people who know what’s going on, and a total deterrent to people who don’t. In the near future, browsers may disallow HTTP altogether.

                        I know of 3 other solutions which are compatible with current and near-future browser security requirements:

                        1. Devices can create self-signed certificates, but those are going the same way as HTTP – scary warnings in the web browser, which may soon become total stonewalls. It also requires more software and hardware than plain HTTP, and might one day become obsolete when browsers stop trusting the device’s SSL version.

                        2. The connection can go through infrastructure on the internet, which allows for servers with real domain names to obtain real, normal SSL certificates and connect to the user’s computer and to their network gadget. This has the downside of requiring that the networked gadget be permanently connected (exposed) to the internet, and that someone has to maintain infrastructure in order for the devices to work. In order to avoid creating a botnet or a pile of e-waste, someone has to own and maintain the software for the devices and the central server, and keep the update mechanism working nearly perfectly.

                        3. Network-gadget creators (including open-source communities) can produce special-purpose software for connecting to the device, and require users to download and run that. This may or may not actually be secure, but it can be made to work (on supported devices with supported operating systems that the user has permission to install/run software on) without having to jump through hoops like messing with DNS settings or installing certificates.

                          OpenWRT mailing list user abnoeh has proposed a new, hybrid solution. The OpenWRT organization would be granted a limited certificate authority role by a sponsoring authority, like Let’s Encrypt. The scheme takes advantage of the fact that many (most?) OpenWRT devices are home routers: connected to the internet, and located between the open internet and the user, who probably wants to connect to/configure their router. The OpenWRT router is verifiably assigned a certificate for a specific subdomain of openwrt.org, and intercepts and responds to outgoing DNS and HTTP(S) requests to that domain, basically acting as an authorized man-in-the-middle attack on openwrt.org.

                          This scheme has the upsides of being currently technically possible, and of working on many users’ devices without tweaking settings, installing custom software, or clicking through security scarewalls. It has the major downsides that it needs support from a certificate authority, that the CA or the browser vendors could invalidate the whole scheme at any time, and that openwrt.org needs to take on the responsibilities of a certificate authority. It is also not a general solution to the problem of interacting with devices on the local network, many of which also run OpenWRT. As the world is populated by more and more local devices, it would be great if there were an agreed-upon solution that wasn’t “connect to a central server on the internet”.
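
                          As I understand the proposal, the interception step could be sketched with a single dnsmasq rule on the router. This is purely illustrative: the subdomain and address below are made up.

                          ```ini
                          # Hypothetical sketch: answer DNS queries for the router's assigned
                          # openwrt.org subdomain with the router's own LAN address, so the
                          # browser's HTTPS request lands on the router itself.
                          address=/abc123.routers.openwrt.org/192.168.1.1
                          ```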

                        1. 1

                          Your solution 1, the current situation, really isn’t a bad one in my opinion. Having to affirm that you trust a connection the first time you connect to a machine is a perfectly okay flow.

                            It would be nice if HTTPS didn’t tightly couple authentication and integrity. In this particular situation, where we’re probably navigating directly to a local IP address, we want the latter much more than the former, but nevertheless have to drag in the whole CA system.

                          1. 1

                            Devices can create self-signed certificates…scary warnings…

                            People might be doing this wrong. The way I’ve found to make this work is to:

                              1. Create your own cert authority (CA).
                              2. Create a cert signed by that CA, using a separate key.
                              3. Add your CA to your keychain or certificates directory.
                              4. Add the cert and key to your server.
                              5. Add the IP address of the device to /etc/hosts on all your devices, or do some DNS stuff.

                              There is no scary warning when you do it this way, unlike with a plain self-signed cert. Please note that des3 is deprecated (it still works), so look into current openssl practice if you’re going to use this for anything beyond development purposes.

                            Here are some things you can use:

                            root_certificate_authority_creation.sh

                              #!/usr/bin/env bash
                              # Work in a dedicated directory so the later scripts find rootCA.key/rootCA.pem
                              mkdir -p ~/ssl/
                              cd ~/ssl/
                              openssl genrsa -des3 -out rootCA.key 2048
                              openssl req -x509 -new -nodes -key rootCA.key -sha256 -days 1024 -out rootCA.pem
                            

                            create_root_self_signed_certificate.sh

                            #!/usr/bin/env bash
                            openssl req -new -sha256 -nodes -out coolbox.local.csr -newkey rsa:2048 -keyout coolbox.local.key -config <( cat coolbox.local.cnf )
                            
                            openssl x509 -req -in coolbox.local.csr -CA rootCA.pem -CAkey rootCA.key -CAcreateserial -out coolbox.local.crt -days 500 -sha256 -extfile v3.ext
                            

                            coolbox.local.cnf

                            [req]
                            default_bits = 2048
                            prompt = no
                            default_md = sha256
                            distinguished_name = dn
                            
                            [dn]
                            C=US
                            ST=California
                            L=SF
                            O=End Point
                            OU=Testing Domain
                            emailAddress=admin@coolboxbro.com
                            CN = coolbox.local
                            

                            v3.ext

                            authorityKeyIdentifier=keyid,issuer
                            basicConstraints=CA:FALSE
                            keyUsage = digitalSignature, nonRepudiation, keyEncipherment, dataEncipherment
                            subjectAltName = @alt_names
                            
                            [alt_names]
                            DNS.1 = coolbox.local
                            DNS.2 = *.coolbox.local
                            

                            add_root_to_KeyChain.sh

                            #!/usr/bin/env bash 
                            # Uncomment the one you need and check the internet to make sure it's somewhat right
                            # MacOS
                            #security add-trusted-cert -k /Library/Keychains/System.keychain -d rootCA.pem
                            #Debian
                            #sudo cp foo.crt /usr/local/share/ca-certificates/foo.crt
                            #sudo update-ca-certificates
                            #windows, i think this works but I'm not on windows anymore.
                            #certutil -addstore -f "ROOT" new-root-certificate.crt
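
                              For what it’s worth, the whole flow can be sanity-checked non-interactively with throwaway names (no des3 passphrase, -subj instead of prompts, and no SANs, so this only demonstrates the signing/verify mechanics, not a browser-ready cert):

                              ```shell
                              #!/usr/bin/env bash
                              # Condensed, illustrative version of the CA + leaf certificate flow above.
                              set -e
                              openssl genrsa -out rootCA.key 2048
                              openssl req -x509 -new -nodes -key rootCA.key -sha256 -days 30 \
                                -subj "/CN=Throwaway Test Root" -out rootCA.pem
                              openssl req -new -nodes -newkey rsa:2048 -keyout leaf.key \
                                -subj "/CN=coolbox.local" -out leaf.csr
                              openssl x509 -req -in leaf.csr -CA rootCA.pem -CAkey rootCA.key \
                                -CAcreateserial -days 30 -sha256 -out leaf.crt
                              # Prints "leaf.crt: OK" when the leaf verifies against the root.
                              openssl verify -CAfile rootCA.pem leaf.crt
                              ```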
                            
                            1. 5

                              If I gave this list to my non-programmer friends, they would simply walk away, because this doesn’t mean anything to them.

                              Next step, once you get the CA going, is to generate a request, generate a certificate, then install said certificate on the IoT device you have, so you can then manage it via a web browser …

                              Look, I have set up my own CA to play around with this stuff. And so far, I have 8 scripts (totaling over 250 lines) to maintain it, and I don’t think I have it fully right.

                              1. 2

                                Seconding spc476’s comment: even with something like smallstep to streamline a local CA, it’s going to be beyond most non-programmers. And that’s before trying to roll it out to off-the-shelf IoT devices: see the Plex link in this thread.

                          1. 1

                            This is part of General Magic’s pre-smartphone technology, which has been mentioned on lobsters here.

                            Fun fact: when I saw the GM documentary premiere at a local indie theater, Marc Porat himself was in attendance and did a Q&A afterwards.

                              He’s still convinced that agent-based programming should be researched further, but candidly admitted that the full implementation described here would be considered a virus (agents try to replicate themselves to the servers and run code).

                            1. 1

                                The recent pushes in server-side WebAssembly remind me a little of this; I hope they take better notice of security research, as well as of the ability to restrict resources beyond security concerns (probably not solving the halting problem, though). I don’t want to risk an inverse of the JS model, where clients push untrusted code to servers.

                            1. 1

                                Don’t games that were available on both DOS and Amiga tend to have much worse music on DOS? The Amiga came with pretty nice hardware for playing music, whereas the IBM PC frequently did not.

                              1. 3

                                  Probably depends on the game and the timing (and the quality of the port).

                                  When the AMIGA 1000 was introduced in 1984 (or was it 1985?) it made a MASSIVE difference, and its sound capabilities crushed anything else into the ground. Not sure how things stood by 1994-1995, when the PC generally took over and AMIGA was at the Deathbed Vigil (1994).

                                For example Sensible World of Soccer 96/97 was released in 1996. Compare that to Need for Speed and FIFA 97 which were also released in 1996 :)

                                  Not to even mention the 3Dfx revolution, when 3Dfx released its Voodoo Graphics chip, also in 1996.

                                1. 2

                                  This. IBM PC had a pile of random add-on cards with varying capabilities. The built-in PC speaker is barely worth mentioning, since it is basically a one-bit channel that had to be driven entirely in software, and not many people wanted to try to write a software synthesizer for it. I have seen games that output human voice through it, but it didn’t work well. By the early/mid 90’s though the tech had more or less settled on the Adlib music card, which provided an FM synthesizer, and the Soundblaster, which provided a PCM sampling interface.

                                  1. 3

                                    The PC speaker was it until 1989, and it took heroic effort to get anything decent out of it: Sentinel Worlds 1 was a childhood delight and even had a technical explanation section in the manual by the sound programmer.

                                    By 1995 there was just enough almost-but-not-quite-entirely compatible hardware that a DOS game’s sound-setup might ask you to select from:

                                    And yes, companies might compose hardware-specific soundtracks. Sound Card Database combined the multiple versions from Dune 2 into one and found some slight differences: https://www.youtube.com/watch?v=o-Q_UO6Hp7U

                                  2. 1

                                    Mostly I’m recalling that on the DOS side I had an “Adlib” sound card which could barely do MIDI, and not entirely amazing MIDI voices at that. :)

                                1. 2

                                  I took delivery of a mass-production Atreus keyboard this week and it’s the last weekend of housesitting for a friend, so it’s typing practice and petting dogs the whole way down.

                                  1. 3

                                    Staying inside with windows shut and hoping that the temperature comes down, due to the smoke plume entering Seattle.

                                    Besides that, working on my first web photo gallery since canceling my Flickr Pro account. This has brought up a NixOS irritation with trying to overlay a newer version of Zola: the interaction between overrideAttrs and buildRustPackage.

                                    1. 9

                                      Thanks for the detailed writeup. Seems like the machine still needs some more polish in the audio department. Having a lot of low-level options via ALSA sounds interesting to me, actually. As someone who produces music on Linux, I prefer to give ALSA a good kicking until it works, rather than dealing with Pulseaudio’s latency issues. Is it possible to record the internal stereo mix directly, i.e. can you select it as a recording source without resorting to jackd trickery?

                                      1. 7

                                        To be honest, “you have to use ALSA instead of pulse to get audio to play reliably” is not a pinebook-specific problem; both my thinkpads are the same way.

                                        1. 7

                                          And I have the opposite experience with both my thinkpads. Audio ‘just works’ with pulseaudio, including multisource concurrent playback, auto-switching from internal mic/speaker to external headset on plug in, bluetooth headsets, etc. None of that works out of the box with alsa on those systems.

                                          1. 5

                                            Agreed, wasn’t trying to suggest otherwise. That said, “reliable” maybe isn’t the right word. Pulseaudio works fine for just listening to music or watching video, and is usually much less of a hassle to set up. When latency matters however (music production, emulation), ALSA performs much better in my experience.

                                            1. 5

                                              Pulseaudio works fine for just listening to music or watching video

                                              This has not been my experience. Of course every machine is different, but I used to have it cutting out constantly until I removed it entirely. Frequently plugging in my headset would reset the volume so that one ear was muted and the other wasn’t. Ever since uninstalling pulse, on my machines ALSA has been 100% reliable.

                                              1. 2

                                                I haven’t had problems with pa since switching to Arch from Fedora. I think the experience varies a lot based on what distro you use.

                                                1. 1

                                                  On my old black plastic MacBook (3,2) running Arch back in the day, PulseAudio was what made Linux audio start to be nice. ALSA worked, but in an https://xkcd.com/963/ sort of way.

                                            2. 3

                                              The Linux ecosystem in general needs a cleanup in the audio department.

                                              1. 9

                                                Charts like that have been making the rounds for ages, and they always feel a bit disingenuous because most people don’t have half that stuff installed, and use even less of the stuff they do have installed.

                                                For most people, it’s just PulseAudio → ALSA → Sound card. Some of the abstractions/libraries on top of that – such as SDL, gstreamer, etc. – add useful features like the ability to decode mp3 files and whatnot. In other words, it’s a lot less messy than that chart makes it seem.

                                                (I do have plenty of gripes with the ALSA C API, which is … not very good, but that’s a different matter)

                                                1. 8

                                                  Indeed, these charts conflate the audio system with multimedia libraries (and in the case of the first one, even multimedia applications like VLC). That said, I very much agree that the state of audio on Linux is not great. Sadly everybody seems to be betting on Pulseaudio, which to me seems more like an attempt to cover up the mess rather than clean it up.

                                                  1. 3

                                                    VLC is included because of one of the cursed parts in the chart - VLC can be used as a playback backend by phonon, which is an audio playback API that is mostly used by KDE software.

                                                  2. 4

                                                    (I do have plenty of gripes with the ALSA C API, which is … not very good, but that’s a different matter)

                                                    Out of curiosity, do you have any references for what a good low-level audio API looks like? It’s something I’ve been wondering about for a while, since audio on Linux is so perennially bad, but while I’m decently familiar with the physics of audio I don’t know much about the low-level details. It seems like it should just be “bytes with the proper format go into buffer at a fixed rate”, which is something that GPU and network drivers have been solving forever…
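
                                                    For the sake of argument, the “bytes go into a buffer at a fixed rate” model could be as small as the sketch below: a hypothetical API with a mock device that only counts frames (this is not ALSA and not a real driver; all the names are invented).

                                                    ```c
                                                    #include <assert.h>
                                                    #include <stddef.h>
                                                    #include <stdint.h>
                                                    #include <stdio.h>

                                                    /* Hypothetical minimal push-style API: open, write interleaved frames. */
                                                    typedef struct {
                                                        unsigned rate_hz, channels;
                                                        size_t frames_written;
                                                    } audio_dev;

                                                    static void audio_open(audio_dev *d, unsigned rate_hz, unsigned channels) {
                                                        d->rate_hz = rate_hz;
                                                        d->channels = channels;
                                                        d->frames_written = 0;
                                                    }

                                                    /* A real driver would block here until the hardware consumed the frames,
                                                       which is what paces the application; the mock just tallies them. */
                                                    static void audio_write(audio_dev *d, const int16_t *frames, size_t nframes) {
                                                        (void)frames;
                                                        d->frames_written += nframes;
                                                    }

                                                    int main(void) {
                                                        audio_dev d;
                                                        int16_t chunk[441 * 2] = {0}; /* 10 ms of stereo silence at 44.1 kHz */
                                                        audio_open(&d, 44100, 2);
                                                        for (int i = 0; i < 100; i++) /* one second, pushed in 10 ms chunks */
                                                            audio_write(&d, chunk, 441);
                                                        assert(d.frames_written == 44100);
                                                        printf("wrote %zu frames\n", d.frames_written);
                                                        return 0;
                                                    }
                                                    ```

                                                    All the hard parts of real audio APIs (period sizes, mmap rings, rate negotiation, underrun recovery) exist to avoid or manage the blocking in that write call, which is where the complexity creeps back in.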

                                                    1. 2

                                                      It’s not perfect, but in contrast to Linux, I’ve never had audio issues under FreeBSD: https://www.freebsd.org/cgi/man.cgi?query=sound&sektion=4

                                                      1. 2

                                                        Thanks, I’ll take a look at it!

                                                      2. 1

                                                        Coming in late, but sndio is also worth a look (the default sound system under OpenBSD).

                                                  3. 2

                                                    Is it possible to record the internal stereo mix directly, ie. can you select it as a recording source without resorting to jackd trickery?

                                                    I have no idea, but if you can give me some pointers on how to find out I can try. If it’s worth anything, the full dump from the alsa-info script is here: http://alsa-project.org/db/?f=5363202956bb52364b8f209683f813c662079d84

                                                    1. 2

                                                      Thanks. Judging from this output, it is not possible to record the internal mix out of the box. That’s somewhat disappointing. It’s not surprising, considering this has been the norm for off-the-shelf laptops for several years now, but I would have expected open hardware like the Pinebook Pro to fare better.

                                                    2. 1

                                                      Just to satisfy my curiosity, why do you want to record the internal mix on the built-in sound card? This is surely handy in some situations (when you want to record your whole screen, for example), but… personally, for serious music stuff I’ve always used USB sound cards. Playback is probably okay on most decent laptops these days, but recording is something entirely different, even on a super high-end machine. So if I bought a Pinebook Pro I would expect an awfully noisy sound card even for playback (even if I wouldn’t really care).

                                                      There’s another clue: the issue with the speakers described in the article feels like a noise coming from a bad sound card or amplifier. Broken speakers don’t produce noise like that.

                                                    1. 2

                                                      Continuing the job search and trying to convince myself that it’s worthwhile to follow through and finish small personal projects.

                                                      1. 3

                                                        I’m in that boat as well. One of my secrets is to have my personal projects in widely-varying fields so that I don’t have to force myself if I’m not feeling it at the time.

                                                      1. 8

                                                        Bell Labs has been merged into Nokia, I wonder if there are any p9 devs left there.

                                                        1. 13

                                                          None that I’m aware of.

                                                          That said, 9front is still fairly active. Commits this year:

                                                          cpu% hg log -d '2020-1-1 to 2020-05-11' | grep '^changeset:' | wc -l
                                                          242
                                                          

                                                          And the last month:

                                                          cpu% hg log -d '2020-04-11 to 2020-05-11' | grep '^changeset:' | wc -l
                                                          88
                                                          
                                                          1. 5

                                                            Could you imagine a Plan 9 Nokia phone?! Actually, I wonder how hard it would be to create a mobile version of Plan 9… the very opinionated GUI that insists on e.g. a middle mouse button doesn’t give me hope.

                                                            1. 7

                                                              If it hadn’t been a few years too early, Plan 9’s successor Inferno might have been a contender on phones.

                                                                1. 8

                                                                  In addition to hellaphone, there are already people working on running Inferno on the PinePhone.

                                                                2. 5

                                                                There was an ipaq version[1]. It apparently used the side buttons for chording, and Plan 9 ships with an on-screen keyboard with handwriting recognition that was written as part of the port.

                                                                  That said, what exists is not a very good fit for modern touch UIs. The underlying GUI system should work fine, but /dev/gesture would need to be created (could be done easily as a userspace file server), and applications would need to be purpose-written or heavily modified.

                                                                  [1] ipaq: https://en.wikipedia.org/wiki/IPAQ#/media/File:PocketPC-HP-iPAQ-h2210.jpg

                                                              1. 2

                                                                “Work”: Because I committed one of the unforgivable sins of entering dashes with my SSN into my state’s unemployment system, I have to call a helpdesk to fix it at random times and hope I get a connection that doesn’t automatically ring to a “too many calls, goodbye click” message.

                                                                So I’ve made my animation tools project my “work” in lieu of any other solid job connections. This now involves:

                                                                • relearning Windows configuration to run and study tools like Source Filmmaker in a stable environment.
                                                                • writing a Nix derivation for arcan to use it in my daily-driver OS.
                                                                • mapping out the Blender 2.8 source with doxygen and other introspection tools.
                                                                1. 2

                                                                  If you are not on the IRC already, poke me there if you run into any issues. Finishing up the next networking transparency bits then the test/record/write/release process.

                                                                1. 1

                                                                I worked on the Disney Playmation toy line as the build engineer for the main device firmware. I had a front-row seat to see an entire family of consumer electronic devices being designed, a semi-custom RTOS brought up on the master unit, and an entire supply chain worked up.

                                                                It’s still a weird feeling that code I wrote was burned to ROMs on a toy that was sold at major stores in the US. Plus I can prove it by knowing the hidden button presses to bring up the toy’s test mode.

                                                                  1. 2

                                                                  For quite a long time now, I’ve wondered if we’re going to see CPU designs (or even just extra support) aimed at running WASM bytecode straight on the CPU.

                                                                    1. 6

                                                                      Ah, the eternal cycle continues.

                                                                      1. 3

                                                                        Yep, it’s feeling almost crispy at this point.

                                                                    1. 3

                                                                      Getting to grips with NixOS on my new desktop: I have my vim plugins and config mostly working, and now I’m moving on to exploring lorri+direnv for project management. Next is writing a custom derivation for a missing Python library.

                                                                      I also need to MacGyver a Pi0 into an AP/extender, as the major US ISPs “dropping” bandwidth caps means that wifi is now borderline unusable across my apartment during the day.
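
                                                                      On the custom-derivation step: a Python library missing from nixpkgs can usually be packaged with the `buildPythonPackage` helper. A minimal sketch, where the package name, version, and dependency are placeholders for whatever library is actually missing, and `fakeSha256` gets swapped for the real hash that the first failed build reports:

                                                                      ```nix
                                                                      # default.nix — hypothetical derivation for a Python library not in nixpkgs
                                                                      { pkgs ? import <nixpkgs> {} }:

                                                                      pkgs.python3Packages.buildPythonPackage rec {
                                                                        pname = "some-missing-lib";   # placeholder name
                                                                        version = "1.0.0";            # placeholder version

                                                                        src = pkgs.fetchPypi {
                                                                          inherit pname version;
                                                                          # Build once with the fake hash; Nix will print the real one to use.
                                                                          sha256 = pkgs.lib.fakeSha256;
                                                                        };

                                                                        # Runtime Python dependencies (placeholder example):
                                                                        propagatedBuildInputs = with pkgs.python3Packages; [ requests ];

                                                                        doCheck = false;  # skip the test suite for this sketch
                                                                      }
                                                                      ```

                                                                      From there it can be built with `nix-build` or pulled into a `shell.nix` via `callPackage`.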

                                                                      1. 1

                                                                        I recently used lorri and direnv – it’s pretty nice, because most editors have a direnv plugin, so you can sync the state of, say, emacs or VSCode with your dev shell.nix.
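
                                                                        For anyone wiring this up: the glue is a one-line `.envrc` in the project root (assuming lorri is installed and its daemon is running); direnv, and any editor direnv plugin, then picks up whatever the project’s shell.nix evaluates to.

                                                                        ```shell
                                                                        # .envrc — hand environment loading off to the lorri daemon
                                                                        eval "$(lorri direnv)"
                                                                        ```

                                                                        After creating it, run `direnv allow` once in that directory to authorize it.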

                                                                      1. 9

                                                                        MiniDisc was such a missed opportunity. Sony launched MD Data in 1993, but didn’t let the drives play audio MDs, never managed to get third parties to buy them, and didn’t license the tech to allow third-party drives, so they lost a small market to Zip and LS-120 instead of entirely replacing the floppy disk. When I looked at them, Zip disks (100MB) and LS-120 disks (120MB) were both about £10/disk, whereas MiniDiscs were about £1.50. At that price they were still quite a bit more expensive than a floppy (20-30p, though closer to £1 for reliable ones), but in a close enough ballpark that the transition would have been okay. And they were smaller than floppy disks, whereas Zip and LS-120 were a similar size (a bit bigger).

                                                                        MD later went to 650MB and 1GB. We could have had 650MB floppy disks in the late ’90s if Sony had valued the computer ecosystem as much as their music business and not been so obsessed with being first-party supplier of everything.

                                                                        1. 5

                                                                          A college buddy was an exchange student in Japan around the turn of the century and brought back a MD Walkman and discs which I lusted over just in terms of physical design. Having owned a “portable” Zip drive, MD in the computer ecosystem at that point would have killed Zip dead for me.

                                                                          More examples of Sony’s NIH habit: Memory Stick and UMD.

                                                                          1. 1

                                                                            MiniDisc was pretty much the BetaMax of the 90s.

                                                                          1. 6

                                                                            I went with the HP Z27, based on the Wirecutter recommendation. It can handle my personal laptop, my work laptop, and my Nintendo Switch without moving cables around.

                                                                            1. 2

                                                                              Do you use macOS? If so, can you confirm that it allows you to scale it to 1440px retina?

                                                                              1. 5

                                                                                To check, I option-clicked the scaled resolutions in Preferences, which shows 2560x1440 as available. After setting it, the monitor stays at 3840x2160. (Overcommunicating here just in case.)

                                                                              2. 1

                                                                                Do you connect all your devices via USB-C or what do you mean by “without moving cables around”?

                                                                                1. 1

                                                                                  That’s doable, but my laptop is too old! I used to move the DVI plug for my monitor from one sort of dongle to another.

                                                                                  1. 1

                                                                                    (I went with the Wirecutter’s recommendation for a Z27 as well.) The built-in hub’s USB-C connector is DisplayPort-capable, so between the HDMI port, two DP ports (Mini and full-size), and two USB 3 Type-A connectors, it’s easy to leave cables connected and just hotplug when necessary. The stereo jack is even wired to the audio portion of the hub, so my music is coming through my new desktop’s Radeon.