1. 3

    While I like this project, whenever PeerTube comes up I wonder how they want to keep bad actors (conspiracy theorists, nazis, shills, etc.) off their platform. I never managed to find out. I was hoping that maybe somebody here could help me out?

    1. 7

      They can’t. Anyone can start an instance and post whatever they want. Instances on the fediverse handle this problem by curating who they federate with. The worst instances end up isolated, so it’s barely a problem in practice.

      1. 1

        Thank you, I see that I misunderstood the structure of PeerTube. I thought it was a service like YouTube built on a distributed filesystem like IPFS, not decentralized like Mastodon, for example.

      2. 1

        If a tree falls in the forest and there is nobody there to hear it, did it make a sound?

        The same way you filter out unwanted voices in real life: by ignoring them. I can easily ignore those who proclaim the virtues of some *-ism, those who try to inject a $pet_issue perspective into anything they touch, those who insist on calling anyone who does not agree with them some form of *-ist or *-phobe and so can others.

      1. 1

        Could anybody explain what socket activation is?

        1. 3

          Basically, socket activation has a service manager keep a network socket open and start the service behind it (if it’s not already running) when a new client connects. In systemd, it can be used so that the SSH daemon has a separate instance of sshd per user connection.

          Here’s more words: http://0pointer.de/blog/projects/socket-activation.html
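          For anyone who wants to see it concretely, here’s a minimal sketch of a socket-activated unit pair (hypothetical service name and port; Accept=yes is what makes systemd spawn one templated instance per connection, as it can for sshd):

```ini
# echod.socket -- systemd keeps this listener open at all times
[Socket]
ListenStream=7777
Accept=yes          ; spawn one echod@<n>.service per incoming connection

[Install]
WantedBy=sockets.target

# echod@.service -- templated unit started on demand for each client
[Unit]
Description=Per-connection echo service

[Service]
ExecStart=/usr/bin/cat   ; stdin/stdout are the client socket
StandardInput=socket
```

          With both installed, systemctl enable --now echod.socket is enough; nothing actually runs until the first client connects.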

          1. 1

            Am I wrong, or does this sound a bit like message passing for services?

        1. 6

          u/ddevault I’m not particularly fond of using a circle as sourcehut’s icon, but even ignoring that, the fact that the text is not centered with the circle is triggering the OCD in me really hard. In this image, we can see that the top partition is larger than the bottom one.

          1. 5

            I have a really bad eye for this… would you mind terribly if I imposed on you for a patch? The code is here:

            https://git.sr.ht/~sircmpwn/sourcehut.org

            hugo serve -w will get it running. The stylesheet is in assets/main.scss, and the HTML is in layouts/partials/nav.html. I can backport the fix to *.sr.ht after that. I spent a few minutes messing with it but I can’t really tell when it’s right.

            1. 3

              Would you mind if I also gave it a go?

              1. 4

                Scyphozoa already sent a patch - and it made it upstream. Maybe you’d like to try it for *.sr.ht, rather than sourcehut.org?

                https://git.sr.ht/~sircmpwn/core.sr.ht

                See srht/templates/nav.html

                1. 3

                  Sure. Would you mind if I maybe tried something with the logo?

                  1. 4

                    By all means.

              2. 1

                Thanks for replying. I might try, but I have never written a single line of either html or css in my life :(

              3. 1

                fwiw, I liked it… encompassing

                1. 4

                  I think it would be cool if he did a pun on the word “hut”, maybe just using a caret symbol on top of the circle. I don’t know, just a circle seems bland, personally. And because it is a circle, at a glance it might trick you into reading “Osourcehut”.

                  1. 3

                    sôurcehut (:

              1. 7

                This is a half-truth; almost every web application should be both. The fact that this is currently too much work is a huge indictment of current development tools.

                1. 1

                  Have you seen Phoenix LiveView? I haven’t had the opportunity to use it yet.

                  1. 1

                    No I’ve never heard of it. I’ve heard of Elixir though, not really something that interests me.

                  2. 1

                    Right now I use both for my current project. The server side is Jinja2 and the client side is handled by some Vue.js. So far, it has been a great mix because the app mixes “static” content and dashboards with multiple graphics that are updated every minute (each graphic has its own API endpoint).

                  1. 1

                    In the end I went with the LG 27UL600 instead of the HP Z27 because the LG has 99% sRGB Colour space coverage (the HP has around 70%) and it was over 200 CHF cheaper. Thanks for all the feedback!

                    1. 2

                      Good choice! I have the 27UL600-W as well. Great monitor.

                    1. 6

                      I went with the HP Z27, based on the Wirecutter recommendation. It can handle my personal laptop, my work laptop, and my Nintendo Switch without moving cables around.

                      1. 2

                        Do you use macOS? If so, can you confirm that it allows you to scale it to 1440px retina?

                        1. 5

                          Checking, option-clicking the scaled resolutions in Preferences shows 2560x1440 as available. Setting it, the monitor stays at 3840x2160. (Overcommunicating here just in case.)

                        2. 1

                          Do you connect all your devices via USB-C or what do you mean by “without moving cables around”?

                          1. 1

                            (I went with the Wirecutter’s recommendation for a Z27 as well.) The built-in hub’s USB-C connector is DisplayPort-capable, so between the HDMI port, two DP ports (Mini and full), and two USB3 Type-A connectors, it’s easy to leave cables connected and just hotplug when necessary. The stereo jack is even connected to the audio portion of the hub, so my music is coming through my new desktop’s Radeon.

                            1. 1

                              That’s doable, but my laptop is too old! I used to move the DVI plug for my monitor from one sort of dongle to another.

                          1. 5

                            Cool, but the next wave of ad blockers will need a completely novel approach once SSAI (server-side ad insertion) takes off, unless we all just collectively reject ad-monetized video content.

                            DAI (Google’s SSAI solution) is already in what amounts to a prerelease for larger customers.

                            1. 5

                              Could you explain quickly what SSAI is?

                              1. 6

                                Sure! I will limit my explanation to the bounds of HLS (HTTP Live Streaming) since the concept is the same for both HLS & DASH (Dynamic Adaptive Streaming over HTTP) and these are the two most important ABR (Adaptive Bitrate) content types.

                                1. You have a manifest file, example.m3u8, that declares where your video fragment files (<N>.ts) are. This file usually sits somewhere “private”, maybe even encrypted with a key that only the SSAI server knows, if the company has enough technical expertise to run the infra for it.
                                2. The browser asks for example.m3u8 from some URL that the SSAI server sits in front of. The server fetches the actual manifest (or maybe has a locally cached version already available), looks for the special places where the manifest declares an ad can be inserted, fetches the ad (bidding/etc.), and inserts the resulting .ts files into example.m3u8.
                                3. The SSAI server sends the resulting spliced example.m3u8 back to the client with a few extra .ts files in it, plus updated metadata (this is a big thing I’m glossing over) so it doesn’t break metadata in the browser about video duration, etc.
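                                A toy sketch of the splicing in steps 2 and 3 (the tag names are real HLS; the file names, durations, and cue handling are invented for illustration, and real SSAI also rewrites discontinuity sequence numbers, DRM keys, and so on):

```python
def splice_ads(manifest: str, ad_uris) -> str:
    """Insert ad segments into an HLS media playlist wherever the
    upstream manifest marks an ad opportunity (#EXT-X-CUE-OUT)."""
    out = []
    for line in manifest.splitlines():
        out.append(line)
        if line.startswith("#EXT-X-CUE-OUT"):
            out.append("#EXT-X-DISCONTINUITY")   # encoding params change...
            for uri in ad_uris:
                out.append("#EXTINF:10.0,")      # fake 10-second ad segment
                out.append(uri)
            out.append("#EXT-X-DISCONTINUITY")   # ...and change back
    return "\n".join(out)

source = "\n".join([
    "#EXTM3U",
    "#EXTINF:10.0,", "0.ts",
    "#EXT-X-CUE-OUT:DURATION=20",
    "#EXTINF:10.0,", "1.ts",
])
spliced = splice_ads(source, ["ad0.ts", "ad1.ts"])
lines = spliced.splitlines()
print("ad0.ts" in spliced)                        # → True
print(lines.index("ad1.ts") < lines.index("1.ts"))  # → True
```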

                                Here are some more resources:

                                1. 3

                                  Who controls the SSAI server? Would that be Google in this case, with the content made available to them by the company that owns the page where the video will be displayed? So does that mean that in order to host an ad network that uses SSAI you basically have to proxy all traffic for your customers?

                                  It seems weird to do the ads on the server since (as I understand it) advertisers don’t trust content providers not to cheat, and that’s why ads are fetched on the client from separate servers (which can then be blocked with relative ease).

                                  Maybe I just totally don’t understand what’s happening here.

                                  1. 3

                                    Advertisers don’t trust content providers in general not to cheat.

                                    However, Google have been caught ‘can’t-believe-it’s-not-cheating’ multiple times with no impact, and it took years for it (e.g. putting brands next to KKK vids) to catch up with them on YouTube.

                                    I suspect YT could pull it off and tell advertisers that’s the new deal.

                              2. 4

                                I think the next wave, already here, really, are service-specific user agents. Instead of cutting out the advertising, they cut out the content and make a new frame for it.

                                These take many different forms including websites (archive.is, youtube downloader sites), scripts (youtube-dl), binary apps (Frost, AlienBlue, NewPipe).

                                1. 2

                                  As @whjms noted, unless they are patching the manifest files on the fly to undo the SSAI (possible, but that would lead to another type of whack-a-mole), it doesn’t matter how you are showing the content.

                                  1. 1

                                    Wouldn’t newpipe still have to display the SSAI ads, since the ads are dynamically inserted into the video?

                                  2. 2

                                    It’s already taken off. Quite a few of the youtube videos I watch–maybe as many as 50%–are sponsored by an audiobook company or a learning-video company.

                                    The only solution I can think of to this is a crowd-sourced database of video timestamps to skip between; this is an impossible-to-complete task which grows ever larger, and it’s open to abuse.

                                    1. 1

                                      There’s a machine learning model that was trained to skip sponsorship sections too, though personally I’m not so bothered if they were picked by the creator and the creator is getting paid directly and reasonably well for it.

                                      1. 1

                                        The leading extension that blocks sponsorships relies on user-submitted times. What’s this machine-learning-driven one you’ve mentioned? I’m actually pretty curious about this, since I’ve been planning to build an ad-blocker for the TV!

                                        1. 1

                                          It was a recurrent neural net trained on the automatic video transcriptions: Reddit thread (and very good intro video); repo.

                                    2. 2

                                      My old employer, a big player in the video space, has been doing SSAI for a few years now.

                                      I never worked in that directly, because I find it gross, but I suspect you could detect differences in encoding between the “content” and “ad” segments.

                                      1. 2

                                        That sounds like it would be fun to make. I suspect you’re right, and I would not be surprised if the differences are huge and glaring. On podcasts, which I listen to much more frequently than I watch online video, the differences are often audible. I can detect the ad spots by ear in many cases, just because the artifacts change when they cut over.

                                        1. 2

                                          I bet that you don’t even need to look at the data, per se. My guess is that the primary method for all of this is HLS, where you have a top-level (text) manifest file that lists the different renditions, and each of those URLs points to another manifest that lists the actual video segment URLs. If I were building SSAI without an eye towards adblockers, I would splice the content and the ads at that second manifest level, so the URLs would suddenly switch over from one URL pattern to another. I believe the manifest also includes the timestamps and segment lengths, so you should be able to detect a partial segment just before you switch from content to ad.

                                          It’s possible that they’re instead delivering it all as one MP4 stream, but that seems out of favor these days. Or they could do HLS but have segments that bridge the gap from content to ad, but that might involve re-transcoding, and if it didn’t… well, you might see something interesting with keyframes or something, I suppose? I don’t think they’d bother with that anyhow, since it sounds more complicated.
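                                          A sketch of that detection heuristic (the URLs are invented; a real tool would also need the #EXTINF/timestamp math mentioned above, but discontinuity tags and host switches already go a long way):

```python
from urllib.parse import urlparse

def find_splice_points(manifest: str):
    """Return indices of segment URIs where a discontinuity tag or a
    sudden host change suggests a content<->ad boundary."""
    points, prev_host, seg_index, pending_disc = [], None, 0, False
    for line in manifest.splitlines():
        line = line.strip()
        if line == "#EXT-X-DISCONTINUITY":
            pending_disc = True
        elif line and not line.startswith("#"):   # a segment URI
            host = urlparse(line).netloc
            if pending_disc or (prev_host is not None and host != prev_host):
                points.append(seg_index)
            prev_host, pending_disc = host, False
            seg_index += 1
    return points

playlist = "\n".join([
    "#EXTM3U",
    "#EXTINF:6.0,", "https://cdn.example.com/v/001.ts",
    "#EXTINF:6.0,", "https://cdn.example.com/v/002.ts",
    "#EXT-X-DISCONTINUITY",
    "#EXTINF:6.0,", "https://ads.example.net/a/xyz0.ts",  # ad starts here
])
print(find_splice_points(playlist))  # → [2]
```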

                                          1. 1

                                            I think most of it is currently based around #EXT-X-DISCONTINUITY declarations

                                      2. 2

                                        Does SSAI get to track you across the web? TBH, I don’t care about ads themselves, especially in video (that last bit may be because I just don’t watch all that much video). What aggravates me is the whole surveillance aspect of most current online advertising. By my read, SSAI should neuter the ability to track you across different sites. I’m set to call that flawless victory, if ad supported content is forced to resort to something that can’t track me.

                                        1. 1

                                          They still build it to involve tracking, with JS and cookies and whatnot that all happens before the video stream is requested. I believe if all of that is blocked, you still get ads, just not “retargeted” ones.

                                      1. 5

                                        Do you also have an OTF or TTF version available? I am working on a project where this font could be really useful :)

                                        1. 2

                                          It’s a bitmap.

                                          1. 6

                                            You can wrap bitmap fonts in a TTF, so the actual bitmap is used at specific point sizes, and the other sizes are naively scaled. I don’t know, myself, how to do this, but I’ve seen it done.

                                            1. 3

                                              The tools Tilman made for Terminus TTF might be useful.

                                        1. 1

                                          Does anybody have experience using riemann (http://riemann.io/) in production?

                                          1. 9

                                            I’ve been considering taking on a project in this line for a while but haven’t had the need for it recently. I’ve got a Kodak Pakon, which is a bulk film scanner that drug stores used back in the days of 1-hour photo development. Unfortunately the driver is Windows XP only, and I’d really love to use it without running a VM.

                                            1. 1

                                              Can I ask where you got that device? Is the quality of the scans good?

                                              1. 3

                                                I got mine on the second hand market from someone who got it from whichever drug store liquidated it - Walgreens, I think? - and yeah! It scans at 6MP, which is a little low by modern standards but meets all my requirements from film, and it has really excellent color reproduction. The real killer feature is that it can scan an entire roll of 35mm film without any interaction. You see them on ebay pretty frequently, there’s a couple different versions but the most common is the F135 that I have.

                                              2. 1

                                                The lack of drivers is exactly what has kept me away from acquiring a Pakon, hoping someone will write a FOSS driver of some kind.

                                              1. 3

                                                I am currently building an ARM cluster of my own, though for slightly different purposes than self-hosting. If you want more hardware to extend your research with, I would suggest you take a look at http://wordpress.supersafesafe.com/clusterboard (that URL though..). It is a bit of a gamble, though a cheap one, as the packaging and shipping was some of the worst I have ever seen.

                                                The hope was to get the nodes working from ramdisk only and netboot Alpine Linux, but I’m struggling with u-boot for the time being.

                                                1. 1

                                                  What is your cluster for? I am toying with building a cluster myself with ZFS, but I have not found a good solution for attaching a bunch of hard drives to the SBCs. Mostly they have USB ports to attach drives, but I have neither the money nor the time to test this approach.

                                                  1. 2

                                                    A few projects, mainly fuzzing for vulnerabilities. The perhaps more interesting experiment is “volatile, single-use, throwaway” thin clients, be it android apps or mapping browser “tabs” on my desktop to short-lived “one device, one page, kill and reboot on close” chrome instances.

                                                1. 13

                                                  I may as well join in.

                                                  I’ve had a light conversation with SirCmpwn before and he doesn’t care for macros either, which I find foolhardy, but I’ll focus on just this article.

                                                  The inertia of “what I’m used to” comes to a violent stop when they try to use Go. People affected by this frustration interpret it as a problem with Go, that Go is missing some crucial feature - such as generics. But this lack of features is itself a feature, not a bug.

                                                  I use a number of wildly different languages, including Common Lisp, APL, and most recently Ada; each of these languages is lacking things the others have, but each is also vastly more suited to certain tasks than the rest. I’ve never used Go. Unlike these three languages I’ve mentioned, which have perfectly good reasons for lacking whatever it is they lack, Go very often has poor reasons or perhaps even no reasons, although I don’t skulk around the mailing lists or whatnot.

                                                  For a good example, take a look at this; it’s my understanding Go lacked a proper mechanism for determining time and many people critiqued this, but daddy Google didn’t care until someone important was hit by it. This is a good example of the problems caused by a language that is not only uncustomizable by the users, but is designed by people who don’t care and won’t care. Unless you’re someone, Google doesn’t care about what you think and the language certainly doesn’t, considering it is designed at every point to take away programmer choice.

                                                  Go strikes me as one of the most conservative programming languages available today. It’s small and simple, and every detail is carefully thought out. There are very few dusty corners of Go - in large part because Go has fewer corners in general than most programming languages.

                                                  This isn’t equivalent to a language that is good for writing programs in. Ofttimes, a lack of edge cases in the world of the language doesn’t correspond to a lack of edge cases in real use. Take a look at Ada for a counterexample; the rules may not have a nice technical explanation, but the corresponding real world explanation is very simple, because it’s usually to prevent some manner of error.

                                                  I feel that this applies to generics. In my opinion, generics are an imperfect solution to an unsolved problem in computer science.

                                                  Dynamic typing as in Lisp is one solution. Ada has a nice generic system, but again, Ada was designed not for theoretical prettiness but to actually make large systems easier to write without flaws, so generics were of course there; otherwise you get people copying and pasting code, which makes maintenance and everything else harder, because you can’t easily or quickly tell whether one of the copies is wrong or otherwise changed.

                                                  I used to sneer at the Go maintainers alongside everyone else whenever they’d punt on generics. With so many people pining after it, why haven’t they seen sense yet? How can they know better than all of these people?

                                                  Have you ever considered that these people don’t know better than anyone else? Have you considered that Go is just an extension of the UNIX and C religion, and people like Rob Pike are just playing their part as priests over scared people who don’t know any better and want a panacea and a movement to join?

                                                  I don’t think programming languages should compete with each other in an attempt to become the perfect solution to every problem. This is impossible, and attempts will just create a messy kitchen sink that solves every problem poorly.

                                                  I’d prefer to think that’s common sense. APL and its family is the clear choice for array problems, but will fall flat against many other types of problems. What is Go actually good for? I find that poor languages, typically ALGOL clones, tend to differentiate themselves on purpose rather than anything intrinsic. You see this with Perl being a “scripting” language, Ruby being for “web services”, Python being “glue code”, and, what, Go being for “scalable programs with a focus on internet-connected services”? The key detail to observe is these languages are all rather the same and, utterly lacking originality, attempt to dominate in a particular usage, because that’s the only way they can really be differentiated.

                                                  If you disagree with this, compare Perl to PHP to Go to Python and compare those differences to those between comparing Common Lisp to Forth to APL to Ada.

                                                  If you’re fighting Go’s lack of generics trying to do something Your Way, you might want to step back and consider a solution to the problem which embraces the limitations of Go instead. Often when I do this the new solution is a much better design.

                                                  I felt something similar when I was writing an Ada program and, wanting to use the package system properly, was forced to structure my program in a different, albeit natural and better, way. Tell me if there’s a document that lists all of Go’s design decisions and why they were taken, or am I only going to find the typical UNIX and C response of “We know better. It’s better this way. Don’t consider other ways. Our way is the one true way.”?

                                                  So it’s my hope that Go will hold out until the right solution presents itself, and it hasn’t yet. Rushing into it to appease the unwashed masses is a bad idea.

                                                  Go was designed for the “unwashed masses”, I mean those “not capable of understanding a brilliant language, but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt.”, straight from Rob Pike’s mouth. Go is designed to use programmers as unintelligent implementing machines, which is why it’s so opinionated. Its opinions have little or nothing to do with good programs and apparently everything to do with preventing the damage any single fool they want to use can cause or, worse, preventing a new hire who isn’t a fool from writing a good program that makes him more valuable than his peers and harder to fire. There are no macros in Go, only the same thing, everywhere, no matter how poorly suited it is to the problem. If everyone writes Go the same, it’s easy to fire and interchange employees without friction.

                                                  I could keep going on about how Go is just a continuation of UNIX, C, and so also Plan9, UTF-8, and whatever else those malign idiots create, but I believe this gets the point across well enough. The only “philosophy” these things espouse is that the computer isn’t a tool for leveraging the mind.

                                                  1. 10

                                                    “We know better. It’s better this way. Don’t consider other ways. Our way is the one true way.”

                                                    Hilariously, a pro-Go commenter just said to me that Go is an anti-“we know better” language.

                                                    Go is just an extension of the UNIX and C religion

                                                    And yet it goes against everything in the actual modern Unix world. Go likes static linking (because Linux distros), has custom syscall wrappers, a custom assembler (!), custom calling convention and weird stack setup… As a result, calling non-Go code requires either overhead (cgo) or ridiculous hacks (c2goasm), LD_PRELOAD hooks don’t work, and porting the main official Go implementation to a new OS/CPUarch combo is utter hell.

                                                    1. 9

                                                      Go being for “scalable programs with a focus on internet-connected services”?

                                                      Two comments: the two wins Go has over other languages are

                                                      • (1) build/link: its build system is fast, and it produces reasonably small static binaries, suitable for deploying into containers. This requires a fair bit of fiddling in other languages, with fairly large binaries as the outcome. Not infeasible, but certainly more than plug and play.

                                                      • (2) it aligns with the sensibilities of Python and Ruby programmers in general, but in a typed manner, so you get improved maintainability with fairly simple semantics.

                                                      I’m not a go fan, but these are key good things for go.

                                                      I’d rather write in something like Haskell or Common Lisp, but c’est la vie…

                                                      1. 9

                                                        I had you until the last paragraph. What the heck do you find bad in UTF-8?

                                                        1. 1

                                                          I don’t want that to turn into its own discussion, but I have reasons aplenty and I’ll list all those that currently come to mind.

                                                          Firstly, I have my own thoughts about machine text. I find the goal of Unicode, being able to have all languages in one character set, to be fundamentally misguided. It’s similar to the general UNIX attitude: “Should we have the ability to support multiple standards and have rich facilities for doing so transparently? No, we should adopt a single, universal standard and solve the problem that way. The universal standard is the one true way and you’re holding back progress if you disagree!”

                                                          Operating systems can support multiple newline conventions, as VMS did, and it would be trivial to have a format for incorporating multiple character sets into a single document without issue, but that’s not what is done. Instead, Unicode is forced and there are multiple Unicode encodings. Unicode is also filled with dead languages, emojis, and graphics-building characters, the latter being there, I think in part, because GUIs under UNIX are so poor and so turning the character set into the GUI toolkit is such an easy “solution”. I’m fully aware the other reasoning is likely to encompass graphics from other character sets, however.

                                                          UTF-8 is a large, variable-length character set that can have parsing errors, which I find unacceptable. It’s backwards compatible with ASCII, which I also dislike, but at least ASCII has the advantage of being small. UTF-8 takes pains to avoid containing the zeroth character, so as to avoid offending C’s delicate sensibilities, since C is similarly designed not to accommodate anything and to expect everything to accommodate it instead. It is as if Ken Thompson thought: “I haven’t done enough damage.”

                                                          UTF-8 disadvantages other languages, such as Japanese and Chinese (This isn’t even mentioning the Eastern character controversy.), by being larger than a single-minded encoding, leading several such peoples to prefer their own custom encodings, anyway. You can only add UTF-8 support to a program transparently in trivial cases, as anything more such as a text editor will break in subtle ways.

                                                          There’s also that Unicode makes the distinction between characters, graphemes, and other such things that turn a simple problem into an unmanageable one. I use Common Lisp implementations that support Unicode characters, but don’t actually support Unicode, because there are so many combining characters and other such things that have no meaning to Common Lisp and so can’t be implemented “correctly”, as they would violate the semantics of the language.

                                                          There are other reasons I can list, but this is sufficient.

                                                          1. 10

                                                            multiple standards and have rich facilities for doing so transparently

                                                            Well, looks like getting everyone to agree on a way of selecting encodings turned out to be way harder than getting everyone to agree on one encoding :)

                                                            And sure — we have Content-Type: what/ever;charset=MyAwesomeEncoding on the web, we can have file formats with specified encodings inside, but there’s nothing you can do about something as fundamental as plain text files. You could never get everyone to agree to use something like extended FS attributes for this, and to make it work when moving a file across filesystems… that’s just not happening.

                                                            format for incorporating multiple character sets into a single document without issue

                                                            Again, some format that software has to agree on. Plain, zero-metadata text fields and files are a thing that’s not going away, as much as you’d like it to.
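
                                                            A quick sketch of that zero-metadata problem in Python (the byte value is made up for illustration): the same bytes decode to different strings depending on which encoding the reader assumes, and nothing in a plain text file says which one is right.

```python
# One byte, no metadata attached: which character is it?
data = bytes([0xE9])

print(data.decode("latin-1"))   # 'é'
print(data.decode("cp1252"))    # 'é' here too
print(data.decode("koi8-r"))    # 'И' (Cyrillic capital I)
# data.decode("utf-8") would raise UnicodeDecodeError: a lone 0xE9
# is not valid UTF-8, which is the only hint a reader ever gets.
```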

                                                            UTF-8 disadvantages other languages, such as Japanese and Chinese

                                                            They often include ASCII pieces like HTML tags, brand names, whatever; you should use an actual compressor if you care about the size so much; and every character in these languages conveys more information than a Latin/Greek/Cyrillic/etc character anyway.

                                                            1. 7

                                                              It seems like you don’t actually know what UTF-8 is. UTF-8 is not Unicode. Rob Pike did not design Unicode, or really have anything to do with Unicode. Those guys designed UTF-8, which is an encoding for Unicode, and it’s an encoding that has many wonderful properties.

                                                              One of those properties is backwards compatibility. It’s compatible with ASCII. You ‘dislike’ this, apparently. Why? It’s one of the most important features of UTF-8! It’s why UTF-8 has been adopted into network protocols and operating systems seamlessly and UTF-16 hasn’t.
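
                                                              That backwards compatibility is easy to demonstrate; a minimal sketch in Python:

```python
# ASCII text is byte-for-byte identical in UTF-8, so ASCII-era tools
# and protocols keep working on UTF-8 data without modification.
s = "GET /index.html HTTP/1.1"
assert s.encode("ascii") == s.encode("utf-8")

# UTF-16 has no such property: every code unit is two bytes, and a
# byte-order mark (or out-of-band endianness) is needed on top.
print(len(s.encode("utf-8")))    # 24
print(len(s.encode("utf-16")))   # 50 (2 BOM bytes + 2 bytes per char)
```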

                                                              UTF-8 doesn’t ‘disadvantage’ other languages either. It doesn’t ‘disadvantage’ Japanese or Chinese at all. Most web pages with Japanese and Chinese text are smaller in UTF-8 than UTF-16, despite the actual Japanese and Chinese text taking up 3 bytes instead of 2, because all the other bytes (metadata, tags, etc.) are smaller.
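
                                                              A rough sketch of that size argument (the HTML fragment is made up for illustration): even with every Japanese character costing 3 bytes in UTF-8 versus 2 in UTF-16, the ASCII markup doubling in UTF-16 usually dominates.

```python
# Tiny made-up page: mostly ASCII markup around a little Japanese text.
page = '<p lang="ja">こんにちは</p>'

utf8 = len(page.encode("utf-8"))
utf16 = len(page.encode("utf-16-le"))  # no BOM, to be generous to UTF-16

print(utf8, utf16)
assert utf8 < utf16
```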

                                                              The fact is that anyone who says that Unicode ‘makes the distinction between characters, graphemes, and other such things that turn a simple problem into an unmanageable one’ doesn’t know what they’re talking about. Unicode did not create those problems; Unicode simply represents them. Those problems exist regardless of the encoding. Code units, code points, characters, graphemes… they’re all inherently different things.
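
                                                              The code point vs. grapheme distinction in one sketch:

```python
import unicodedata

precomposed = "\u00e9"   # 'é' as one code point
combining = "e\u0301"    # 'e' + COMBINING ACUTE ACCENT: same grapheme

print(len(precomposed), len(combining))   # 1 2 -- code point counts differ
print(precomposed == combining)           # False -- naive comparison fails
# Normalization is needed before the two compare equal:
print(unicodedata.normalize("NFC", combining) == precomposed)   # True
```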

                                                              Unicode does not have any GUI characters.

                                                              1. 2

                                                                Could you maybe elaborate the following quote?

                                                                UTF-8 disadvantages other languages, such as Japanese and Chinese (This isn’t even mentioning the Eastern character controversy.)

                                                                1. 6

                                                                  I reckon it refers to the controversial Han unification, which was in China’s favour.

                                                                2. 1

                                                                  It’s similar to the general UNIX attitude: “Should we have the ability to support multiple standards and have rich facilities for doing so transparently? No, we should adopt a single, universal standard and solve the problem that way. The universal standard is the one true way and you’re holding back progress if you disagree!”

                                                                  What precisely does UNIX force you into? Are you sure this isn’t the Lisp attitude as well? For example, Lispers usually sternly glare over the interwebs if you dare to use anything but Emacs and SLIME.

                                                                  Operating systems can support multiple newline conventions, as VMS did, and it would be trivial to have a format for incorporating multiple character sets into a single document without issue, but that’s not what is done.

                                                                  You’re confusing multiple newlines in a single character encoding with multiple newlines across character encodings. You say that it would be trivial to have multiple character sets in a single document, but you clearly have not tried your hand at the problem, or you would know it to be false.

                                                                  Give me twenty individual sequences of bytes that are all ‘invalid’ in twenty different character encodings, and then give me 200 individual sequences of bytes that are all ‘invalid’ in 200 different character encodings. Otherwise there is ambiguity on how to interpret the text and what encoding is used.

                                                                  This problem can be seen by the people who are trying to revamp the c2 wiki. Reworking it has stalled because there are around 150 files with multiple different character encodings, and they cannot be identified, separated, and unified by the machine.
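
                                                                  A sketch of why that detection problem is unsolvable in general (byte values chosen for illustration): many byte sequences are valid in several encodings at once, so there is no error for a tool to catch, only guesses.

```python
mystery = bytes([0xC3, 0xA9])   # two bytes from a metadata-free file

# All of these decodes succeed; the machine has no way to pick one.
for enc in ("utf-8", "latin-1", "cp1251"):
    print(enc, mystery.decode(enc))   # 'é', 'Ã©', 'Г©' respectively
```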

                                                                  Unicode is also filled with dead languages, […]

                                                                  Right, because Unicode is supposed to be a superset of all encodings. The fact it supports languages that are not used anymore is a feature, not a bug. It is important to people working in linguistics (you know, that field outside of computer science…) that any computer encoding format has a method of displaying the text that they are working with. This is important to language archival efforts.
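
                                                                  For instance, Linear B (a Bronze Age Greek script deciphered in the 1950s) has had code points since Unicode 4.0:

```python
import unicodedata

# The first code point of the Supplementary Multilingual Plane is a
# Linear B syllabogram -- a script that died out some 3000 years ago.
print(unicodedata.name("\U00010000"))   # LINEAR B SYLLABLE B008 A
```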

                                                                  UTF-8 disadvantages other languages, such as Japanese and Chinese (This isn’t even mentioning the Eastern character controversy.

                                                                  This is outright false, but someone else has already mentioned that.

                                                                  I use Common Lisp implementations that support Unicode characters, but don’t actually support Unicode, because there are so many combining characters and other such things that have no meaning to Common Lisp and so can’t be implemented “correctly”, as they would violate the semantics of the language.

                                                                  Unicode allows language implementations to disallow some sets of characters for ‘security’ reasons: http://www.unicode.org/reports/tr31/

                                                                  This entire rant reminded me of Steve Yegge’s post “Lisp is Not an Acceptable Lisp”:

                                                                  But what’s wrong with Common Lisp? Do I really need to say it? Every single non-standard extension, everything not in the spec, is “wrong” with Common Lisp. This includes any support for threads, filesystem access, processes and IPC, operating system interoperability, a GUI, Unicode, and the long list of other features missing from the latest hyperspec.

                                                                  Effectively, everything that can’t be solved from within Lisp is a target. Lisp is really powerful, sure, but some features can only be effective if they’re handled by the implementation.

                                                            1. 6

                                                              I still don’t really understand Urbit. Could a kind person explain the project to me like I’m five?

                                                              1. 9

                                                                Based on reading the above comments, there’s a reason for that. I don’t understand it either, and it would seem that its author didn’t optimize for accessibility :)

                                                                1. 5

                                                                  This is the best explanation: https://urbit.org/primer/

                                                                  1. 4

                                                                    See the new primer they just released. The video in it gives an overview. https://urbit.org/primer/

                                                                  1. 7

                                                                    Nice, but it’s unclear to me whether the algorithm it uses is better than simple Markov Chains for entertainment value.
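
                                                                    For reference, the baseline being compared against is tiny; a minimal word-level Markov chain generator (function names and corpus are made up for illustration):

```python
import random

def train(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = {}
    for prev, nxt in zip(words, words[1:]):
        chain.setdefault(prev, []).append(nxt)
    return chain

def generate(chain, start, length=10):
    """Walk the chain, picking a random observed successor each step."""
    out = [start]
    while len(out) < length and out[-1] in chain:
        out.append(random.choice(chain[out[-1]]))
    return " ".join(out)

corpus = "the cat sat on the mat and the dog sat on the rug"
print(generate(train(corpus), "the"))
```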

                                                                    1. 1

                                                                      Recently I actually did a project like this. You can find it here: https://tofu.wtf/buzzwords

                                                                      1. 1

                                                                        It does seem a bit less repetitive, but that’s probably just the large amount of data it’s been fed.

                                                                        // My favorite generators are the Conservative Book Title Generator and the Startup Generator :D

                                                                      1. 1

                                                                        The link doesn’t seem to work

                                                                        1. 2

                                                                          Does anyone have any information on what actually happened? It seems to have been all scrubbed. Hard to have feelings about it without knowing. Although violations of the CoC should of course be enforced.

                                                                          1. 6

                                                                            Here is a screenshot of some of the scrubbed issue: https://twitter.com/maybekatz/status/899760806551666690

                                                                            1. 6

                                                                              Thanks a lot. “anti-Code-of-Conduct article” linked is The Neurodiversity Case for Free Speech. I support CoC, and I also support this article. I don’t see how the article can be construed as anti-CoC. (It’s also written by Geoffrey Miller, a scientist I highly respect.)

                                                                              1. 6

                                                                                As a perceived leader in the project, it can be difficult for outsiders to separate Rod’s opinions from that [sic] of the project.

                                                                                This is a pretty cancerous attitude. Rod can’t control what you perceive. Without coming right out and saying it, this reads to me as The Community Shall Dictate/Censor Rod’s Personal Views. What a bunch of crap. If I perceive The Darn Kat to be a ‘perceived leader’, can I tell her what to tweet?

                                                                                1. 5

                                                                                  I think the problem manifests itself differently in that case: to me it seems that Rod has a rather important role in the Node project, and at one point he accepted the project’s CoC. As the perceived leader of the project, you cannot decide to apply the CoC sometimes and sometimes not.

                                                                                  Please bear in mind that nobody in this thread knows the full context of this problem.

                                                                              2. 5

                                                                                The guy, Rod, posted an article that was critical of codes of conduct.

                                                                              1. 2

                                                                                Wow, that is crazy. Do you know who is behind this?

                                                                                1. 2

                                                                                  ICANN says the registry agreement is with a company named Top Level Spectrum. The state of Delaware says they are a corporation registered 2011-12-21. CrunchBase says they are owned by Jay Westerdal, who owned DomainTools (formerly whois.sc) and NameIntelligence.

                                                                                1. 3

                                                                                  Has some Linux distro already changed to Wayland by default?

                                                                                  1. 4

                                                                                    Yes, Fedora 25 uses Wayland by default, although it falls back to X11 for certain display chipsets (some NVIDIA chipsets, for one).

                                                                                    1. 2

                                                                                      I’m running GNOME on Arch Linux with Wayland, which became the default this year.

                                                                                    1. 2

                                                                                      I’m thinking of moving from GNOME to i3/sway. How did you make the switch, and why?

                                                                                      1. 6

                                                                                        If you’re moving from GNOME to i3, you may be interested in reading about running gnome-session with i3 so you still have access to GNOME features like auto-mounting removable devices, media keys, screen-locking, etc.

                                                                                        1. 1

                                                                                          Your reply touches on something that I don’t really understand, because I’ve always used GNOME, so for me it’s hard to tell where GNOME ends and Linux starts. Can you quickly explain why I would want to run a gnome-session inside i3?

                                                                                          1. 5

                                                                                            It’s not about running gnome-session inside i3; it’s about replacing GNOME’s window manager with i3 in a GNOME session.
                                                                                            GNOME is much more than just a window manager; it’s a Desktop Environment, providing a whole suite of software to manage your desktop. i3, along with most other dynamic/tiling window managers, is just a window manager.
                                                                                            Meaning: it will only provide you with a means of managing X windows. On its own, it will not provide a dock, menu/status bar, application launcher, notification system, etc. However, some (like i3) have ready-made solutions to replace some of these components, and if one doesn’t come directly from the project, you just build up your suite of components yourself :)

                                                                                            It’s very much a more modular approach: GNOME is kinda like a flat-packed Desktop Environment, whereas if you go down the dynamic/tiling window manager route, it’ll be more like building your own thing with LEGO - which can be really fun, and beneficial in some ways.

                                                                                            I think Screwtape’s suggestion to try out i3 inside of a GNOME session is so that you could try i3 for what it is: a window manager - but still have the comfort of GNOME (the menu/status bar, notification system, application launcher, workspace manager).

                                                                                            1. 2

                                                                                              I think he already did: to get the listed features that gnome implements.

                                                                                              That said, there are other implementations too.

                                                                                          2. 1

                                                                                            i3 was pretty easy to get used to, but I had to spend an hour or so practicing after reading https://i3wm.org/docs/userguide.html

                                                                                          1. 25

                                                                                            This is the office I built into the top corner of the roofline in my loft; the machine itself is old-ish (most parts from 2011) but still works great and can play Overwatch at 60 FPS, which is all I care about :) Monitor spins between portrait and landscape, although portrait mode only works in Linux. http://static.haldean.org/battlestation.jpg

                                                                                            awesome3, tmux, PragmataPro (+ Konsole to get the ligatures), vim. http://static.haldean.org/screenshot.png

                                                                                            1. 4

                                                                                              tenkeyless mech keyboard, gaming mouse, portrait display, case with side open, oversized clamp to hang headphones.

                                                                                              Wow, you are all over the place!

                                                                                              1. 7

                                                                                                Balance is achieved through tension :)

                                                                                              2. 1

                                                                                                That is a nice keyboard. What model is it?

                                                                                                1. 5

                                                                                                  It’s this one. I split my time between that and a Kinesis Advantage; the Kinesis is currently at my desk at work.

                                                                                                2. 1

                                                                                                  Wow, I really like that portrait monitor.