1. 1

    Ksh on OpenBSD

    Bash or KSH on most platforms

    CSH on Solaris

    VINCE on AmigaOS

    1. 3

      A large factor for some people is who can access the content being self-hosted. There’s an argument that if you own the dedicated hardware in the colo, only you can access the content. I’m not sure that argument holds water in many, or even more than a minority of, circumstances.

      However, with a VPS or cloud system, there’s usually some way for someone other than you to gain access to data and/or root on the system. For that reason I personally don’t class VPSes or cloud systems as self-hosting, but I can see how some people would.

      I have dedicated servers; I consider those self-hosted. I have boxes at home and at work; I’m not sure I consider them self-hosting, just home and work servers. I have VPSes dotted here, there and everywhere, along with the odd ephemeral cloud system. I don’t consider anything there self-hosted because of the control aspect.

      But to be honest, I feel like any argument about it is splitting hairs. If you’re running your own software instance on something you control and are comfortable with the data storage, more power to you.

      1. 20

        This is not an apology for Comcast, but my gut tells me that wrapping yet another protocol in HTTPS is maybe not the best idea. To be more technical, TCP overhead and SNI loopholes make DoH seem like a half-solution, which could be worse than no solution at all.

        Also, I think DoH is yet another Google power play, just like AMP, to build yet another moat around the castle.

        1. 16

          Yeah .. I mean, the slides aren’t wrong. And once Firefox is DoH->Cloudflare and Chrome is DoH->Google, who is to say either one wouldn’t just decide to delist a DNS entry they don’t like, claiming it’s hate speech? Keep in mind, both companies have already done this to varying extents, and it should be deeply troubling.

          I run a local DNS server on my router that I control. Still, it queries the root servers in plain text and my ISP could see that (even though I don’t use my ISP’s DNS .. not sure if they’re set to monitor raw DNS traffic or not). I could also pump that through one of my hosting providers (Vultr or DigitalOcean), and it’s less likely they’d be monitoring and selling DNS data (but they still could if they wanted).

          Ultimately, the right legal argument to lobby for is banning ISPs from collecting DNS data or altering DNS requests at all (no more redirects to a Comcast search page for non-existent domains!). That feels like a more correct solution than centralizing control in Google/CloudFlare’s DNS.

          1. 12

            I also run a local resolver (a Pi-hole, for DNS-based ad filtering), but use DoT (DNS over TLS) between my resolver and an upstream resolver.

            It seems like host OS resolvers natively (and opportunistically) supporting DoT would solve a lot of problems, vs. this weird Frankenstein per-app DoH thing we seem to be moving towards.

            1. 4

              not sure if they’re set to monitor raw DNS traffic or not

              They most certainly do, and a few less scrupulous ISPs have been shown to be MITM’ing DNS responses for various reasons but usually $$$.

              1. 4

                Isn’t the real problem here the user’s choice of ISP? Or has so much of the internet become extremely monopolized around the world?

                1. 9

                  In the USA, there is basically zero choice in who your ISP can be; many areas, even big urban ones, have only one ISP. Perhaps if SpaceX can get their Starlink stuff commercialized next year, the effective number will grow to two… maybe. I can’t speak for other countries, but in my experience they aren’t generally better in terms of options, though they do tend to be better on price. US ISPs know they are the only game in town and charge accordingly.

                  1. 2

                    In the USA, there is basically zero choice in who your ISP can be

                    That’s understandable, but DoH is not the answer here. Addressing the lack of choice is the answer. If Google and Firefox/CF get a free pass in the US, it affects the rest of the world.

                    1. 1

                      I totally agree with you.

                    2. 2

                      I consider myself lucky, then. I can choose between anything that can run over POTS, cable and fiber, with the POTS and fiber networks being required to open up their networks to other ISPs as well.

                      1. 1

                        In the USA, there is basically zero choice in who your ISP can be; many areas, even big urban ones, have only one ISP.

                        This is not strictly true at all. Most urban areas in the US of A are a duopoly insofar as the internet goes: you usually have a choice between the cableco and the telco. In addition, telcos are often required to provide CLECs with some sort of access to the copper lines as well, so there’s some potential for additional choices like Sonic DSL, although those are becoming more rare because the telco often charges CLECs more for access to this copper than the price of its internet service sold directly to the consumer; Sonic is one of the few remaining independent CLECs out there.

                        Some areas do have a third choice, like PAXIO, Webpass or Google Fiber, as well as local municipal networks.

                        1. 3

                          Only about 10% of the US has more than two providers at any speed. When you get into slower speeds, there are two choices (the telco and the cable company).

                          “At the FCC’s 25Mbps download/3Mbps upload broadband standard, there are no ISPs at all in 30 percent of developed census blocks and only one offering service that fast in 48 percent of the blocks. About 55 percent of census blocks have no 100Mbps/10Mbps providers, and only about 10 percent have multiple options at that speed.” - https://arstechnica.com/information-technology/2016/08/us-broadband-still-no-isp-choice-for-many-especially-at-higher-speeds/

                          Figure 5 in the linked article above pretty much sums it up. So we are both correct, depending on perspective. :) The FCC thinks all is fine and dandy in the world of US internet providers. Something tells me the Cable companies are encouraging that behaviour :)

                  2. 1

                    And once Firefox is DoH->Cloudflare and Chrome is DoH->Google

                    Once the standards are in place for DHCP (et al.) to advertise a default DoH endpoint to use, and OSes can propagate their own idea of it, informed by DHCP or user configuration, to clients (or do the resolving for them via DoH), there’s little reason for Firefox or Chrome not to use that data.

                    That issue is regularly mentioned in the draft RFCs, so there will be some solution to it. But given that there’s hijacking going on, browser vendors seem to be looking for a solution now instead of waiting for that part of the puzzle to propagate through systems they don’t control.

                    Also, web browsers have a culture of “implement first, standardize once you’ve experienced the constraints”, so this is well within their regular modus operandi - just outside their regular field of work.

                    Lobbying work isn’t as effective as just starting to use DoH because you have to do it in each of the nearly 200 jurisdictions around the globe.

                    1. 1

                      Not holding my breath on a legal solution. US gov has not been a friend of privacy, and other governments are far worse.

                      Only thing coming to mind here is some sort of privacy-oriented low-profit/non-profit organization to pool and anonymize queries over many different clients. Even that’s not so great when most setups are 8.8.8.8, admin/password, and absolutely DNGAF.

                      1. 1

                        And once Firefox is DoH->Cloudflare and Chrome is DoH->Google, who is to say either one wouldn’t just decide to delist a DNS entry they don’t like, claiming it’s hate speech? Keep in mind, both companies have already done this to varying extents, and it should be deeply troubling.

                        Like Cloudflare not supporting EDNS.. :/

                      2. 3

                        To be more technical, TCP overhead and SNI loopholes make DoH seem like a half-solution

                        The TCP/TLS overhead can be minimized with keep-alive, which DoT clients like stubby already do. You can simply reuse an established connection for multiple queries. This has worked very well for me in my own setups.
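
                        As a rough illustration of what that reuse looks like on the wire: DoT (RFC 7858) is just ordinary DNS messages, each prefixed with a two-byte length, sent over one TLS connection to port 853. The upstream below (Cloudflare’s 1.1.1.1) and the dnspython dependency are only example choices for this sketch, not anything stubby itself requires:

```python
# Sketch: two DNS queries reusing one TLS connection to a DoT resolver
# (RFC 7858: TCP port 853, each DNS message prefixed with a 2-byte length).
# The resolver (1.1.1.1 / cloudflare-dns.com) is only an example upstream.
import socket
import ssl
import struct

import dns.message  # dnspython, used here just to build/parse wire format


def recv_exact(sock, n):
    """Read exactly n bytes from the TLS socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("connection closed by resolver")
        buf += chunk
    return buf


RESOLVER, PORT = "1.1.1.1", 853

ctx = ssl.create_default_context()
with socket.create_connection((RESOLVER, PORT)) as raw:
    with ctx.wrap_socket(raw, server_hostname="cloudflare-dns.com") as tls:
        # Both lookups travel over the same established TLS session.
        for name in ("example.com", "example.org"):
            query = dns.message.make_query(name, "A").to_wire()
            tls.sendall(struct.pack("!H", len(query)) + query)

            (length,) = struct.unpack("!H", recv_exact(tls, 2))
            reply = dns.message.from_wire(recv_exact(tls, length))
            print(name, reply.answer)
```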

                        As others have probably pointed out, the SNI loophole can be closed with eSNI. Whether and how soon that is going to take hold is anyone’s guess at this point. But I personally see privacy as more of a side effect, since I simply care that my queries are not manipulated by weird networks.

                        This is not an apology for Comcast, but my gut tells me that wrapping yet another protocol in HTTPS is maybe not the best idea.

                        I would love to agree with you here (and I do in principle), but from my own experience with DoT and DoH I can tell you that many networks simply don’t allow a direct DoT port, leaving you with either DoH or plain DNS to an untrusted (and probably non-validating) resolver. The shift to “X over HTTPS” is but a reaction to real-world limitations, where almost everything but HTTP(S) is likely to be unreachable in many networks. I’d love to use DoT, and I do so whenever I can. But I need to disable it more often than I’d like to. :(

                        A minor fun fact regarding DoH: since an HTTP(S) server can serve different endpoints, it’s in principle possible for clients to choose different “offers” - a DoH server may offer standard resolution on /query and filter out ad networks on /pihole or whatever. And using dnsdist, this is easy to set up and operate yourself. DoH doesn’t really mean DNS centralization but the opportunity for quite the opposite: you could now take your own resolver with you wherever you go.
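
                        For a feel of how small the client side is, here is a minimal sketch of a DoH lookup. Cloudflare’s public JSON endpoint is used purely as an example, and the endpoint is just a parameter, so pointing it at a different path or server (in the /query vs /pihole spirit above) is a one-line change; the requests.Session is what gives you the HTTPS connection reuse mentioned earlier:

```python
# Sketch of a DoH lookup against a public JSON endpoint. Cloudflare's
# resolver is used only as an example; the endpoint is a parameter, so a
# different path or server is a one-line change. A Session reuses the
# underlying HTTPS connection across queries (the keep-alive point above).
import requests

session = requests.Session()

def doh_lookup(name, rrtype="A", endpoint="https://cloudflare-dns.com/dns-query"):
    resp = session.get(
        endpoint,
        params={"name": name, "type": rrtype},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    resp.raise_for_status()
    return [rr["data"] for rr in resp.json().get("Answer", [])]

print(doh_lookup("example.com"))
```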

                        1. 1

                          I’m fine with DoH as a configurable system-level feature, but application-level resolvers are bad news, and that seems to be where all of this is headed.

                          If that’s where it goes, many applications will default to their own favorite DoH provider for some kind of kickback. The prospect of having to find the “use system resolver” check box for every single application after every other update does not bring joy.

                        2. 3

                          HTTPS is upgrading to QUIC, so we’ll eventually have DNS back on UDP, but with proper encryption this time.

                        1. 15

                          I’m so glad I stopped using Linux after Systemd took over.

                          1. 11

                            For me, systemd is actually the reason why I cannot stand Free/OpenBSD anymore. I don’t want to see another shell script for controlling important system software that might or might not work. Perhaps one of these days OpenBSD will pull a typical OpenBSD move and some cool hacker will reimplement the best parts of systemd (mostly the service administration, really) into a single cohesive package.

                            1. 4

                              Funny, because systemd and the PulseAudio situation with Firefox are two of the reasons I switched from GNU/Linux to OpenBSD.

                              1. 1

                                PulseAudio situation with Firefox

                                Mind elaborating on that? I use both and haven’t had any problems in years.

                                1. 2

                                  Firefox no longer supports ALSA. ALSA has been fine for me for the past 14 years. I was unwilling to install PulseAudio and unwilling to switch to Chromium, so I moved to OpenBSD.

                                2. 1

                                  There was a while, a while back, when PulseAudio and its GNOME controllers were shit; then it got good and I could just remote-play or remote-source really neatly; then that went away and I can only do normal stuff, and it’s honestly kind of painless now.

                                3. 1

                                  I quite enjoy having my boot process be shell-based. I regularly ^T it for information or ^C if I can’t be bothered to wait for something.

                                4. 4

                                  What do you use now? What are the tradeoffs you had to make?

                                  1. 2

                                    I use Open, Net and FreeBSD as alternatives. For desktop and some server stuff I’ve used MacOS for a while. I’ve been using FreeBSD for a long time, and Open and Net on and off, but Systemd caused me so many problems that I’m almost completely off Linux. I have the odd throwaway VM here and there, and two hosts left running docker, one of which is going to be replaced with FreeBSD, the other retired.

                                    I probably should’ve been clearer - I don’t use Systemd encumbered Linux as a primary OS anymore for anything important, and took steps to reduce the amount of Linux. Of course my phone is Android, so I’m still using Linux in some capacity.

                                    The one tradeoff I’ve had to make with OpenBSD is that a lot of things are written for Linux, and because OpenBSD tends to do things in a more correct manner, things that don’t check to see whether something is there (e.g. /proc) tend to break. FreeBSD has better compatibility. On the other hand, OpenBSD is as rock-solid stable as Debian used to be, which isn’t an indictment of Debian but really kudos to OpenBSD.

                                    I have been surprised at just how good OpenBSD is as a daily-driver laptop OS. I use an X230 ThinkPad with a custom BIOS and an X220 keyboard as a personal laptop, and it’s really nice to use.

                                  2. -2

                                    You’ll be back

                                    1. 6

                                      well, then there are some sane options like Slackware or Gentoo left to use :>

                                      1. 10

                                        Void Linux, too.

                                        1. 1

                                          yes, my list wasn’t complete :)

                                        2. 5

                                          You also have the option of Devuan, Void Linux, Alpine, and GuixSD (all of which I’ve run as daily drivers at various points). I’ve also heard of Artix (which I haven’t attempted to run). The first two have worked in general in my experience. I run Alpine on my laptop (where I don’t need all the extra applications that would require glibc), and Devuan on my desktop machine. Besides regular run-ins with PulseAudio stupidity, I’m pretty happy with it.

                                          I did try OpenBSD in the past, but I have an Nvidia card on my desktop (which isn’t supported), and on my laptop it was uncomfortably slow to run a web browser. While I’m not inclined to run an OpenBSD desktop machine, I do have an OpenBSD web server hosting my website, and I always pick OpenBSD implementations of tools if I have the option. At another point, I gave FreeBSD a shot, but it would occasionally boot without detecting my keyboard/mouse (due to an unsupported motherboard, or a bug in the USB stack). The audio systems in the BSDs are delightfully simple in comparison to Linux, though.

                                          1. 3

                                            When Gentoo looks sane, we’re in a dark dark place ;)

                                            (and yes, Gentoo looks sane)

                                            1. 1

                                              …why?

                                          2. 1

                                            I should’ve been clearer when I wrote the comment: I’m not fully gone. I still have two boxes running Docker, one on Ubuntu, one on Debian. The Ubuntu one is going to be replaced by FreeBSD; the Debian one will be retired when I get round to it. I also use Android, so I guess that counts too.

                                        1. 4

                                          I’m happy to see FTP die. But aren’t some websites still providing download links over FTP? I think it was just a year ago that I noticed I was downloading an ISO file from an FTP server…

                                          1. 9

                                            There’s nothing wrong with downloading an ISO from an FTP server. You can verify the integrity of a download (as you should) independently of the mechanism (as many package managers do).
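
                                            For instance, the usual out-of-band check is to hash the downloaded image and compare it against a checksum published somewhere you trust. A minimal sketch (the file name and expected digest below are placeholders):

```python
# Sketch of verifying a downloaded ISO against a published SHA-256 sum.
# The file name and expected digest are placeholders; the trusted value
# should come from somewhere other than the (possibly tampered) mirror.
import hashlib

def sha256sum(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "0000...placeholder..."   # digest published by the project
actual = sha256sum("downloaded.iso")
print("OK" if actual == expected else "MISMATCH - do not use this image")
```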

                                            1. 4

                                              I agree! The same goes for downloading files over plain HTTP: as long as you verify the download, you know the file is okay.

                                              The reason I don’t like FTP has to do with the mode of operation: port 21 as the control channel and then a high port for the actual data transfer. There’s also the fact that there is no standard format for directory listings (I think DOS-style listings are the most common?).
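
                                              Both quirks are easy to see from a client. A small sketch with Python’s ftplib (the host name is a placeholder): LIST comes back as free-form, server-dependent text, while MLSD (RFC 3659), where supported, returns machine-readable facts, and the file data itself always travels over the separate data connection.

```python
# Sketch of FTP's split control/data channels and listing formats.
# The host is a placeholder; real servers vary in what they support.
from ftplib import FTP, error_perm

with FTP("ftp.example.org") as ftp:   # control channel on port 21
    ftp.login()                       # anonymous login
    ftp.set_pasv(True)                # data travels over a separate, high port

    ftp.retrlines("LIST")             # free-form listing, format varies by server
    try:
        for name, facts in ftp.mlsd():    # standardized listing (RFC 3659)
            print(name, facts.get("type"))
    except error_perm:
        print("server does not support MLSD")
```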

                                              1. 2

                                                The reason there’s no standard for directory listings possibly has more to do with the lack of convention on filesystem representation as FTP took off. Not everything uses the same delimiter, and not everything with a filesystem has files behind it (e.g. Z-Series).

                                                I absolutely think that in the modern world we should use modern tools, but FTP’s a lot like ed(1): it’s on everything and works pretty much anywhere as a fallback.

                                                1. 1

                                                  If you compare FTP to ed(1), I’d compare HTTP and SSH to vi(1). Those are also available virtually anywhere.

                                                  1. 1

                                                    According to a tweet by Steven D. Brewer, it seems that at least modern Ubuntu rescue disks ship only nano, not ed(1) or vi(1)/vim(1).

                                                    1. 1

                                                      Rescue disks are a special case. Space is at a premium.

                                                      My VPS running some Ubuntu version does return output from man ed. (I’m not foolish enough to try to run ed itself; I quite like having a usable terminal.)

                                                2. 1

                                                  Yes, FTP is a vestige of a time when there was no NAT. It was good until the ’90s and has been terrible ever since.

                                                3. 1

                                                  Most people downloading files over FTP using Chrome don’t even know what a hash is, let alone how to verify one.

                                                  1. 1

                                                    That’s not really an argument for disabling FTP support. That’s more of an argument for implementing some form of file hash verification standard tbh.

                                                  2. 1

                                                    There is everything wrong with downloading an ISO over FTP.

                                                    Yeah, you can verify the integrity independently. But it goes against all security best practice to expect that users will do something extra to get security.

                                                    Security should happen automatically whenever possible. I’m not saying that HTTPS is the perfect way to guarantee secure downloads. But at the very least a) it works without requiring the user to do anything special and b) it protects against trivial man-in-the-middle attacks.

                                                    1. 1

                                                      But it goes against all security best practice to expect that users will do something extra to get security.

                                                      Please don’t use the term “best practice”; it’s a weasel term that makes me feel ill. I can get behind the idea that expecting users to independently verify integrity is downright terrible UX. It’s not an unrealistic expectation that the user is aware of an integrity failure. It’s also not unrealistic that it requires the user to act specifically to gain some demonstrable level of security (in this case, integrity).

                                                      To go further, examples that expect users to do something extra to get security (for some values of security) include:

                                                      1. PGP
                                                      2. SSH
                                                      3. 2FA

                                                      Security should happen automatically whenever possible.

                                                      And indeed, it does. Even over FTP.

                                                      Not saying that HTTPS is the perfect way to guarantee secure downloads

                                                      That’s good, because HTTPS doesn’t guarantee secure downloads at all. That’s not what HTTPS is designed for.

                                                      You’ve confused TLS (a transport security mechanism) with an application protocol built on top of TLS (HTTPS), and confused what it does with the act of verifying a download (which it doesn’t do). The integrity check in TLS exists for the connection, not the file. It’s a subtle but important difference. If the file being transferred is compromised (e.g. through the web of trust, or through just being a malicious file), then TLS won’t help you. When integrity is important, that integrity check needs to occur on the thing requiring integrity.

                                                  3. 7

                                                    You got it backwards.

                                                    Yeah, some sites still offer FTP downloads, even for software, aka code that you’re gonna execute. So it’s a good thing to create some pressure so they change to a more secure download method.

                                                    1. 9

                                                      Secure against what? Let’s consider the possibilities.

                                                      Compromised server. Transport protocol security is irrelevant in that case. Most (all?) known compromised download incidents are of this type.

                                                      Domain hijacking. In that case nothing prevents the attacker from also generating a cert that matches the domain; the user would have to verify the cert visually and know what the correct cert is supposed to be. In practice that attack is undetectable.

                                                      MitM attack that directs you to the wrong server. If that’s possible in your network or you are using a malicious ISP, you are already in trouble.

                                                      I would rather see Chrome stop sending your requests to Google if it thinks it’s not a real hostname. The immense effort required to support FTP drains all their resources and keeps them from making this simple improvement, I guess.

                                                      1. 1

                                                        MitM attack that directs you to a wrong server. If it’s possible in your network or you are using a malicious ISP, you are already in trouble.

                                                        How so? (Assuming you mostly use services that have basic security, aka HTTPS.)

                                                        What you call a “malicious ISP” can also be called “open Wi-Fi”, and that’s a very common way for people to get online.

                                                        1. 1

                                                          The ISP must be sufficiently malicious to know exactly what you are going to download and to set up a fake server with modified but plausible-looking versions of the files you want. An attacker with a laptop on an open Wi-Fi network doesn’t have the resources to do that.

                                                          Package managers already have signature verification built in, so the attack is limited to manual downloads. Even with the resources to set up fake servers for a wide range of projects, one could wait a long time for the attack to succeed.

                                                  1. 2

                                                      I have an old box running a ton of VMs, all with wasted resources and things that aren’t being well maintained. I’m backing it up, rebuilding it with Alpine and going to run a Docker cluster on it to replace the services, so they’re kept updated with Ouroboros.

                                                    1. 13

                                                        It’s not censorship if it’s a private service revoking service. It’s reasonable for Cloudflare to decide who it does and doesn’t want as customers.

                                                      What’s not reasonable is for Cloudflare to become a fundamental gatekeeper to infrastructure. As long as 8chan aren’t dependent upon Cloudflare to be able to operate, it’s not a problem. The moment they are, it is.

                                                      1. 10

                                                        What’s not reasonable is for Cloudflare to become a fundamental gatekeeper to infrastructure. As long as 8chan aren’t dependent upon Cloudflare to be able to operate, it’s not a problem. The moment they are, it is.

                                                          They aren’t. There are multiple other options, including building a CDN yourself.

                                                        1. 3

                                                            It’s not that one needs to have a CDN to provide a website, however much the CDN providers want you to believe that, but including building a CDN yourself as a realistic[1] option is laughable.

                                                          [1] Yes, I know, you didn’t use that word.

                                                          1. 4

                                                              I don’t agree that building a CDN setup yourself isn’t feasible. It was being done before CloudFlare was on the market. As an example, major FOSS projects do binary distribution, self-built on volunteer time.

                                                              It’s just expensive compared to just buying CF’s services.

                                                            1. 2

                                                                A website in general: yes, you can build one without a CDN.

                                                                Imageboards serve a lot of images (it’s in their name), which uses a lot of bandwidth. You really need a CDN for even a medium-sized imageboard. 8chan is an imageboard.

                                                              1. 4

                                                                  Imageboards serve a lot of images (it’s in their name), which uses a lot of bandwidth. You really need a CDN for even a medium-sized imageboard. 8chan is an imageboard.

                                                                  Yes, you really need a CDN. But images are also relatively easy to distribute and extremely disk-cache friendly. You can build a special-cased CDN for an imageboard. I don’t want to say it is cheap, or as high-quality, or can just be done on the side, but it is a relatively well-understood problem.

                                                                  (I used to build image and video CDNs, FWIW)

                                                                1. 2

                                                                    Yes, but there are still price problems, and 8chan would rather not have those. Also, they probably want DDoS protection, as they host controversial content, and building your own CDN to handle DDoS attacks adds even more cost. Needing to build your own CDN is not exactly a nice problem to have; you’d rather just use somebody else’s CDN.

                                                                  1. 4

                                                                      If 8chan’s business model is only cost-effective because they are subsidized by CloudFlare, that’s a problem with 8chan’s business model, not CloudFlare.

                                                                    Although I guess it is kind of a problem with CloudFlare as well.

                                                                    1. 3

                                                                        There’s no moral right that every cheap option be available to you unless you are a protected class, just as there is no moral right to your business model.

                                                                      1. 2

                                                                        Freedom of speech means the government can’t interfere with speech, not that uttering that speech should be as cost-effective as possible.

                                                                        1. -1

                                                                            Yes, but if there were a $1000 tax on anything that you want to say publicly, it wouldn’t be free speech, would it?

                                                                          1. 4

                                                                              This is a non sequitur. No such tax exists, and if it did in a country with freedom of expression, it would rightfully be challenged in court.

                                                                            Before the internet, if you wanted to get your views out there, you had to pay to publish a newspaper, or a pamphlet, or a book. There was no expectation that you could do this for free.

                                                            1. 2

                                                                Fing is a decent Android/iOS app that does most of this. You can see which devices on a network respond and then run a port scan, all through a fairly decent GUI.

                                                              1. 4

                                                                I’m a bit confused about the date thing. My Mac has the date and time in the menu bar on the right.

                                                                1. 1

                                                                  Dato can show more information like the calendar and multiple time zones.

                                                                1. 5

                                                                    Today I set up a PeerTube instance at https://watch.44con.com/ for security conferences and researchers.

                                                                  We won’t have open registration, but for any existing security conferences worried about their talks being taken down, they can use our site as a backup. They just need to get in touch with us (details on the about page).

                                                                  We’re also open to security researchers of note.

                                                                  1. 3

                                                                     I take it you have experience with PeerTube, so just a quick question. Feel free to disregard it.

                                                                     Where do I find interesting things to watch on PeerTube? If I go to peertube.social, I don’t find anything of interest to me. I try the names of channels I know from YouTube, but no luck. Do I have to find specific instances, just like watch.44con.com, or are there other ways to find content besides “just browsing” page after page? And where do I find other instances?

                                                                     It’s kinda the same problem as with Mastodon. It’s not easy to find things to follow. Not as easy as Twitter and YouTube anyway. Sigh.

                                                                  1. 21

                                                                    This week I have two C64s coming for use in a project. We’re building an alternate reality game set in the Blade Runner universe for 44CON. Attendees can register as blade runners, build their own portable Voight-Kampff machines, and by connecting attendee badges and interviewing other attendees determine whether or not the attendee is a replicant.

                                                                    Their devices give the blade runners a code they can enter into C64-based Citizen Database terminals to determine whether or not the person interviewed is a human or a replicant. We’ll have a mix of Nexus 6s (who’ll know they’re replicants), and Nexus 7s (who won’t).

                                                                    It is distinctly possible some replicants will also register as blade runner units.

                                                                    We’ve got some lovely people in California working on the badge and PVK units. I’m working on the C64 terminal code, so lots of 6510 assembly for me this week.

                                                                    1. 3

                                                                      Epic. Too bad I am on the other side of the planet.

                                                                      1. 1

                                                                        That is incredibly cool!

                                                                          I sometimes wish I’d bought a C64 rather than an Atari as my initial 8-bit computer back in 1980.

                                                                          But then if I had, I’d never have fallen in love with Atari LOGO, and would never have had the experience of having my mind blown and feeling the fireworks someone like me, who loves high levels of abstraction, can get from a really finely crafted programming environment :)

                                                                        1. 1

                                                                          My 8-bit experience is almost all Z80 and 8086 (which is almost the same) aside from microcontrollers. 6502/10 is a weird beast for me. I’m finding C64 asm architecture incredibly obtuse thanks to the custom chips and kernal functions, but it’s all part of the fun.

                                                                          1. 1

                                                                              Yeah, there’s certainly a lot of quirkiness in the C64 and Atari 8-bit lines! It’s one of the reasons I still have a soft spot for them :)

                                                                      1. 4

                                                                        After a long week doing stuff at BSides London, 44CONnect and two days of training, I’m going to sleep properly.

                                                                        I’m also hoping to catch up on some house stuff, do some Amiga tinkering and learn about text manipulation on the C64 for an ARG I’m working on.

                                                                        1. 1

                                                                          Interesting. I had the chance to sit down with the safepass.me guys and go through their approach, which is equally about optimal coverage rather than mindlessly comparing against the DB itself (safepass asserts coverage higher than HIBP due to the way their algorithm works).

                                                                          It’s good to see innovation in this space. With rotating passwords finally being accepted as a suboptimal idea, it’s even more important that the passwords chosen are good enough to withstand cracking.

                                                                          1. 14

                                                                            I like how the bottom of the post has a link to ESR’s (now defunct) Google Plus profile.

                                                                            1. 0

                                                                              What irony, lol

                                                                            1. 2

                                                                              Nice work. My only concern is that one centralized actor is being swapped out for two here (Netlify and MS/GitHub in this case).

                                                                              Not for me, but anything that gets people out of Medium is a good thing in my book.

                                                                              1. 2

                                                                                I’ve just finished writing feedback emails for every 44CON submission this year, so I’m having a quieter day today.

                                                                                This week I’m working on getting ready for our training next week, which is in the same week as BSides London. It’s going to be a hectic week. We’re also preparing speaker announcements for next week at BSides London.

                                                                                Finally, I’m spending a few hours getting to grips with Pagestream on the Amiga. I’ve been batting around the idea of some sort of annual 44CON zine for a while, but I’m unsure of the format yet. If I can do something reasonable on an Amiga and fill it with things from the period as well as research, then I might give it a go.

                                                                                1. 4

                                                                                  While most of the recommendations are sensible alternatives, the inclusion in this list of systems with a clear network effect (Mastodon in place of Twitter, PeerTube in place of YouTube) detracts from its effectiveness. The alternatives are software alternatives, but the software is an incredibly tiny part of the value of the system.

                                                                                  1. 1

                                                                                    I’m not sure they’re all entirely sensible alternatives. In particular, Mastodon is pretty full-on to deploy and manage for most use cases. Perhaps the piece could’ve been improved with some backup alternatives.

                                                                                  1. 5

                                                                                    I’ve contacted them about dropping Amiga support, offering to try and get something up and running for them. It’ll mean the 4000 gets put to good use, and once I have a stable build setup I can try to recreate it using the 68k AROS kickstart and runtime in UAE so they can have an automated checkout->build->submit process.

                                                                                    Hopefully they’ll get back to me. Anyone want to take on any of the other OSes?

                                                                                    1. 3

                                                                                      This is SO COOL!

                                                                                      Just curious, you cite using it for writing and other ‘creative tasks’ in the article, but other than writing what are you using it for?

                                                                                      I have a deep abiding love for DeluxePaint :) I could play with multicycle brushes like, forever :)

                                                                                      1. 9

                                                                                        Ok, I have a ton of stuff lined up for this, but phase 1 is backing up all of my still-working disks with old code, music, art and writing and possibly finishing stuff off. I’ve learned a lot about those things in the last couple of decades, so remixing some of the content and putting it out there in the modern world is high on my agenda.

                                                                                        I’m still building out the box. I have coming down the pipeline for hardware:

                                                                                        • A 16-bit soundcard
                                                                                        • A combo Graphics card, Ethernet card, Coprocessor and Memory expansion
                                                                                        • A 68060/50 to replace the 040
                                                                                        • A CompactFlash card to replace the hard drive, as heat will start to become an issue once the 060 is in.

                                                                                        All of the older stuff I’m buying needs to be recapped, so I’m going to look into doing that myself. The Amiga doesn’t do APM or ACPI, so I’m going to build my own device to monitor temperatures and shut the Amiga down if it gets too hot.
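
                                                                                        Purely as an illustration of the sort of watchdog that could be (none of this is the actual build): a small single-board computer reading a cheap 1-Wire temperature sensor and dropping a relay on the Amiga’s power feed past a threshold. The sensor path, GPIO pin and cut-off below are made-up example values.

```python
# Hypothetical sketch of the temperature watchdog described above:
# a Raspberry Pi reads a DS18B20 1-Wire sensor placed near the accelerator
# and cuts a relay on the Amiga's power feed if it gets too hot.
# Sensor path, GPIO pin and threshold are example values, not a real build.
import glob
import time

import RPi.GPIO as GPIO   # assumes a Raspberry Pi with the RPi.GPIO package

RELAY_PIN = 17            # example pin wired to the power relay
SHUTDOWN_AT_C = 60.0      # example cut-off temperature

def read_celsius():
    # DS18B20 readings appear in sysfs as "... t=23456" (milli-degrees C).
    sensor = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
    with open(sensor) as f:
        return int(f.read().rsplit("t=", 1)[1]) / 1000.0

GPIO.setmode(GPIO.BCM)
GPIO.setup(RELAY_PIN, GPIO.OUT, initial=GPIO.HIGH)   # HIGH = power on

while True:
    if read_celsius() >= SHUTDOWN_AT_C:
        GPIO.output(RELAY_PIN, GPIO.LOW)              # cut power to the Amiga
        break
    time.sleep(10)
```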

                                                                                        I’m going to use it for:

                                                                                        • 3D Modelling and Fractal animation generation
                                                                                        • Pixel art and photo editing (once the memory expansion is in)
                                                                                        • Setting up a modtunes radio station that records modtune mixes and releases them.
                                                                                        • Writing an intro/demo for next year’s 44CON
                                                                                        • Writing short form fiction with Final Writer
                                                                                        • Managing my finances with Turbo Calc
                                                                                        • Possibly doing an online zine with Pagestream
                                                                                        • Trying to edit a podcast once I have the 16-bit soundcard, network card and extended drive set up
                                                                                        • Remixing old mod tunes I wrote, and writing new ones
                                                                                        • Adding Amiga hunk binary support to Radare

                                                                                        One of the 3D world-generation tools I want to use, Vista Pro, has problems opening up on my RTG Workbench, so I’m using that as an excuse to learn a disassembler/debugger called ReSource. I know how to fix the binary, but I want to understand why the fix works.

                                                                                        Basically I’ve spent nearly 20 years away from the Amiga, in which time I’ve developed (relatively) god-like reverse engineering and hardware hacking powers compared to my teenage self, so I want to put them to good use and have a go at all of the things. Hopefully it’ll give me something fun to do for the next 20 years.

                                                                                        1. 4

                                                                                          I remember Vista Pro!!! I could never figure out how to use it fully but it was amazing for its time.

                                                                                          Remember Director? The 2D animation DSL? I used that a lot and did a Media internship project in it for college.

                                                                                          Thinking about this stuff makes me realize how much of the software that made the Amiga great really was way ahead of its time and still has things to teach us today. There are many lessons that breakthrough software can teach us.

                                                                                          You should totally document this project to the nines as you go!

                                                                                          1. 3

                                                                                            Remember Director?

                                                                                            Sadly, no. I never used it. I used Scala, but not Director.

                                                                                            Having said that, if you’re interested, there’s an ADF for use in WinUAE, along with the manual in case you’re feeling rusty.

                                                                                            I’ve made a note to check it out and spend some time with it though. Might be a while before I get to it.

                                                                                            1. 3

                                                                                              The main thing I took away from reading about the Amiga was that it (IIRC) used a mix of software and hardware offloading. Our smartphones are doing that now. You could say its legacy lives on in that way. Just too ahead of its time.

                                                                                              1. 2

                                                                                                That’s one piece of it, but it’s far, far more than that. AmigaOS had pre-emptive multitasking way before any other non-UNIX desktop OS did, and it had a message-passing “exec” (most would call it a microkernel these days).

                                                                                                And yes, it had an awesome graphics coprocessor (the Copper) and a bit-blit transfer coprocessor (the Blitter), which all had rich support in the API (Intuition).

                                                                                                The whole thing was written with a sense of humor and had an… elegance? to it that’s hard to describe in the here and now.

                                                                                                It also had a full user/application scripting environment, ARexx, so you could have scripts that ADDRESSed running applications and sent them commands that they exported.

                                                                                                So you could have a script that had your Telecom program download a ZOO file full of images, tell your unarchiver to unarchive them, and then tell DeluxePaint to load and transform them, saving them back out, and then have your mail program mail them to you.

                                                                                                The other thing to know about Amiga is that a TON of incredibly ground breaking software was originally developed on that platform. Lightwave 3D started out there for example.

                                                                                                Also - the games were amazeballs for their time. So yeah, if you were into computing at that time and didn’t have access to super high end workstations, it was basically magic :)

                                                                                                1. 2

                                                                                                  Thanks for the details! I’m slowly trying to piece the picture together one article and conversation at a time. You’re the first to tell me about the scripting stuff. It definitely sounds better than my DOS-with-a-graphical-shell experience. ;) I think modern audiences could get an appreciation for it today if it were presented comparatively against a system, apps, and games of that time. Not a rigged demo by zealots: someone highlighting realistic use of good apps on both platforms in a way that shows the Amiga’s advantage as a side effect.

                                                                                                  I heard about Lightwave. Closer to home was that the Prevue Channel ran on Amigas. Means I used an Amiga without knowing it for a decent chunk of my life.

                                                                                                  1. 2

                                                                                                    Kinda not surprised. They’re tucked away in some surprising places. There was one that was still running a school’s HVAC in a closet for YEARS.

                                                                                                    1. 2

                                                                                                      Amazing. Reminds me of this advertisement about an AS/400 doing something similar. People used to lose VMS servers, esp. pizza boxes, too. This kind of thing probably happens way more than we hear about. The ones that ran for years seem to be on specific OSes and hardware that aren’t mainstream, though. I still think high-reliability deployments that don’t need raw speed should consider leveraging such technologies where possible.

                                                                                                      I also speculate that the physics of modern process nodes that breaks chips means using the oldest ones available will always have advantages. The used Amiga you bought on eBay might outlast your brand-new, high-reliability chip from a 28nm fab. There’s your business justification for loading up on them for critical services. :)

                                                                                              2. 2

                                                                                                I used to love Vista as a teen and had totally forgotten about it until this comment :) Deluxe Paint IV, too.

                                                                                                How’s emulation lately? I guess ROMs are difficult to find. Presumably the hardware can be emulated at native speed though?

                                                                                                1. 1

                                                                                                  You can purchase the full ROM set as well as super easy to use software at http://www.amigaforever.com - emulation is startlingly good on quite a number of platforms.

                                                                                                  I’ve been playing with getting UAE running on my Clockwork Pi - handheld Shadow of the Beast!!! :)

                                                                                          1. 3

                                                                                          Instead of having a bulky A4000 for the tasks you outlined in the comment, couldn’t you just get a cute little A600 with a Vampire 600 V2 accelerator (a 68080 CPU on a gate array, which is a 68060 with fixed bugs and added pipelines, plus an RTG graphics card beating any MNT product (or any Zorro III card, but I’m not trying to advocate it over Mediator PCI + Radeon/Voodoo))?

                                                                                          A sound card can be added on the “clock port”, and these are cheaper (and newer, as the clock port got developers’ attention pretty recently).

                                                                                          Of course the A4000 is a great-looking “desktop” machine and I really appreciate it, but currently the only case I would find for it, except for some VERY SERIOUS stuff like plugging in PowerPC boards, Mediators, TV cards and so on, is to have a VideoToaster in it, or some other “DraCo-style” setup with a few TV/encoding/processing cards, Scala and other video editing software.

                                                                                          Not to mention you can simply plug a cheap RTL8319/3C589/Prism2 network card into the PCMCIA port on the A600, add Roadshow to S:Startup-Sequence and free yourself from the need for any other x86 machine, also relieving your CPU of a lot of TCP/IP processing over a raw serial port.

                                                                                            1. 2

                                                                                            I have a bunch of other Amigas; I think I have an A600 in the loft, and my A1200, possibly with two A500s, is somewhere in my conservatory. Of course, I can do things even faster still with a Raspberry Pi, or with WinUAE on my i7 beast.

                                                                                            This isn’t about performance, though; it’s about youth and love. The A4000 was my dream machine as a kid. It’s not really something for me to own; it’s something for me to take care of, to look after until its next owner, probably a museum.

                                                                                            I do miss having a clock port on the A4000 - I was hoping to build an I2C interface that I could use with some temperature sensors to do some kind of power management. That’s a project for down the line, though.