The example given by the author sounds hacky more than anything, as a way to “discover” source files.
I’ve been using makefiles of this kind for years now, and it works remarkably well. I do that so they’re easy to package, regardless of the distro. I must admit my projects are fairly simple, but I never missed any “feature” with this makefile.
I’m probably missing something here though so feel free to tell me!
I use a very similar makefile for my own projects. It doesn’t support header dependencies, though, and separate (out‐of‐tree) builds would be nice. I’ve been considering adding a (non‐autotools) configure script to the mix for that reason.
The example I gave doesn’t have header files, but you can easily add them to the mix (example).
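For the header-dependency question upthread, here is a minimal sketch of how such a makefile could pick them up automatically, assuming GNU make and a compiler that understands -MMD (the target name `prog` is made up):

```make
# Discover sources, derive objects; -MMD makes the compiler emit a .d file
# per object listing the headers it actually includes, -MP adds dummy
# targets so deleting a header doesn't break the build.
SRC = $(wildcard *.c)
OBJ = $(SRC:.c=.o)
CFLAGS += -MMD -MP

prog: $(OBJ)
	$(CC) $(LDFLAGS) -o $@ $(OBJ)

# Pull in the generated dependency files (silently absent on first build);
# editing a header now rebuilds every object that includes it.
-include $(OBJ:.o=.d)
```

With BSD make you would spell the wildcard and include differently, but the -MMD idea carries over.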
As for out-of-tree builds, I don’t get the point. Is it only to keep the source tree clean?
I’ve seen some handmade configure script in the wild as well, pretty short ones (some only included the line “echo do not use autotools!”).
In my case, they would generate the config.mk file, which includes all the customizable bits. IMO, customization should be done at the environment level, and with make -e. That is the reason why I like mk a lot, which does that by default (but that’s another topic!)
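To illustrate the environment-level customization with make -e: with -e, variables from the environment override assignments in the makefile (and in an included config.mk). A tiny throwaway demonstration, using made-up /tmp paths:

```shell
# config.mk holds the customizable defaults
printf 'CC = cc\nPREFIX = /usr/local\n' > /tmp/config.mk

# a makefile that includes it (single quotes keep $(CC) literal for make)
printf 'include /tmp/config.mk\nshow:\n\t@echo CC=$(CC) PREFIX=$(PREFIX)\n' \
    > /tmp/Makefile.demo

# -e lets the environment win over the config.mk defaults
CC=clang make -e -f /tmp/Makefile.demo show
# prints: CC=clang PREFIX=/usr/local
```

Without -e, the `CC = cc` assignment in config.mk would silently beat the environment, which is exactly the surprise this style avoids.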
echo do not use autotools!
I don’t understand all the Xvfb/Xephyr stuff. I do simple screencasts just by enabling audio monitoring and:
ffmpeg -f sndio -i snd/0.mon -video_size 1680x1050 -framerate 30 -f x11grab -i :0.0 -c:v libx264 -qp 0 -preset ultrafast output.mkv
“Maybe we can finally get past our nemesis: the lady in the red dress! … Nope, Well, it was worth a shot.”
someone help me out here, what is this referencing?
Just a guess, but the game probably crashes early on, enough to reach that character but no further.
It’s described in this article.
Is the website failing HTTPS cert verification for anyone else?
I keep seeing this as a reply, but I’m not sure what purpose it serves: you still can’t read the site. The only thing you can get from the comments is that yes, the site is using a self-signed certificate, meaning that the breakage is intentional.
It is not broken - it is simply a different approach to CAs.
It is not broken
It is not broken
Broken has a couple of different meanings in this context. The relevant ones being (a) “according to design” and (b) “according to reasonable expectations of users.” It can be broken(b) while also not-broken(a). Or in other words it can be “broken by design.”
The user process is broken. The browser tries its best to give a very technical workaround, but the fact is that all other sites I read on the web don’t require me to trade my own sense of security for that of the author.
I do respect his choice, to be sure, but I ask people here to stop just silently referring to that original comment thread as if it explains anything. It doesn’t.
In Chrome at least you can certainly read the site, you just have to click “Advanced” and “proceed to tedunangst.com”
The idea is to add the CA to the browser store. The CA is constrained to creating certs for tedunangst.com, which is nice. The weakness here is acquiring the CA in a secure way in the first place; the model is similar to SSH or signify.
Ideally you would acquire the CA out of band, like by meeting Ted in person. Good luck with that.
Unfortunately clicking through like you described loses any benefit: you’re obviously not checking the cert every time, so you’re prone to being MITMed each time you visit the site, as opposed to just the first time. (Firefox lets you save the exception, but Chrome doesn’t.)
The benefit of this over Let’s Encrypt is that if you add Ted’s CA and remove all the other CAs (that don’t have their own name constraints) from your cert store, you know that any valid HTTPS cert for tedunangst.com came from Ted and not from another compromised CA. I doubt even people who have added Ted’s CA have removed those other CAs, though, so it doesn’t seem like a real benefit to me.
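The “checking the cert every time” part can be approximated without touching the CA store at all, by pinning the certificate fingerprint trust-on-first-use style, like SSH known_hosts. A hedged sketch (this is not what the site asks you to do, just an illustration; it generates a throwaway local cert so the commands are self-contained):

```shell
# Stand-in for the site's cert; against the real site you'd fetch it with:
#   openssl s_client -connect tedunangst.com:443 </dev/null 2>/dev/null \
#       | openssl x509 > site.crt
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=tedunangst.com" \
    -keyout /tmp/pin-demo.key -out /tmp/pin-demo.crt 2>/dev/null

# First contact: record the SHA-256 fingerprint ("trust on first use")
openssl x509 -in /tmp/pin-demo.crt -noout -fingerprint -sha256 \
    | cut -d= -f2 > /tmp/pin-demo.saved

# Later visit: recompute and compare; a mismatch suggests a MITM
current=$(openssl x509 -in /tmp/pin-demo.crt -noout -fingerprint -sha256 \
    | cut -d= -f2)
if [ "$current" = "$(cat /tmp/pin-demo.saved)" ]; then
    echo "fingerprint matches"
else
    echo "FINGERPRINT CHANGED - possible MITM" >&2
fi
```

Which is basically what saving the exception in Firefox does for you, and what clicking through in Chrome every visit does not.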
Indeed. I don’t understand this at all.
seems it’s my turn to direct you to: https://lobste.rs/s/qeqqge/moving_https ;-)
[Comment removed by author]
It’s not just a self-signed cert, it’s a custom CA cert. If it were a self-signed cert, great, trust the site or don’t and move on. As a CA, the question is whether you trust @tedu to sign certs for your email, bank, and every other site.
Thanks for digging in. I guess we’re getting to the point where someone should roll all this up in a FAQ to be linked every time a “hey, the site’s SSL config is broken” comment is posted on a tedunangst.com story, which is going to happen regularly for the foreseeable future.
Or, you know, he could just use a trusted CA, like everyone else. ;)
An alternative that would not violate his conviction would be to still provide a non-HTTPS service on a different port, such as 8080. This allows proper use of HSTS and all the modern trimmings - while still allowing people to use software that doesn’t understand this CA/cert without additional hackery. It’s a solution that works for me.
I use it when my overly strict TLS/SSL configuration fails on an older device, for example.
Oh, that’s something I hadn’t considered. Bit of a discovery problem, and then the question of which link people pass around, and duplicate detection, and oh my, but it’s a good addition to the list of alternative plans.
Oh! I thought HSTS would enforce HTTPS for all ports. Are you sure this works in all browsers? :O
The certificate business is a protection racket, plain and simple, so be sure to read “Or, you know, he could just use a trusted CA, like everyone else” in your very best mobster-movie voice. It’ll make a lot more sense that way.
If mob, I was thinking along the lines of (mafia voice): “it would be a shame what might happen to your site if your users saw it without our protection and quality assurances and that sort of thing.”
And now they’re trying to persuade every project they use to switch to Apache License 2:
I wish the ASF were still using APLv1. It’s sad that the US legal system and patent situation caused this mess. The ASF is a very US-centric organisation (even though they don’t tend to view themselves as such), and from the perspective of a country where software patents are not (yet) a thing, the differences between APLv1 and APLv2 appear to be a solution looking for a problem.
Even in the US, this feels like a solution looking for a problem. BSD licenses have long been considered to provide an implicit patent grant (by the very wording: “Permission is hereby granted to use, copy, modify and distribute for any purpose…”). http://en.swpat.org/wiki/Implicit_patent_licence
And now they’re trying to persuade every project they use to switch to Apache License 2
And now they’re trying to persuade every project they use to switch to Apache License 2
No, they are asking politely if the projects might be willing to consider changing their licensing to be compatible. There is no persuasion going on by ASF people (which I assume you mean by “they”).
Maybe I used the word incorrectly, but to me a polite request to change the license, or else an organization influential in the open-source world will stop using the product, feels pretty close to persuasion.
I don’t see any major problem with them trying to persuade React and RocksDB to use a different license (in fact, I welcome it, personally). What they aren’t trying to do is coerce RocksDB and React to use the APL2. That would be a very different situation.
Is the current source available? I can only find the original Undeadly tarball (http://undeadly.org/undeadly-src.tar.gz).
What’s unclear about MIT/ISC and patents? I always assumed the answer was a simple no.
“Unclear” probably just means “would have to be decided in court”.
US-based lawyers are super happy with an explicit patent grant they can use to defend their client in court, should someone sue for patent infringement.
The author has a full article on MIT. It comes down to “Neither copyright law nor patent law uses “to deal in” as a term of art; it has no specific meaning in court.” and refers to the following part of MIT:
to deal in the Software without restriction,
ISC does not use this terminology. So why did he throw it in one bucket with MIT?
EDIT: See https://www.openbsd.org/policy.html for arguments in favour of ISC.
I think that’s because MIT doesn’t mention patents explicitly while Apache has this:
Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.
There’s an implicit patent grant in these licenses. Given the statement “Permission to use, copy, modify, and distribute this software … is hereby granted” I think it would be hard to argue that the recipient is not given a license to use the patent.
This only works if the copyright holder also holds the patent. But I (an eminently unqualified non‐lawyer) don’t see what the Apache 2.0 text provides that the ISC text doesn’t. “Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual … patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted.”
What’s really annoying about Apache, besides the deluge of verbiage, is the next sentence: “If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed.”
What’s annoying about the patent pooling? It discourages a sue fest by revoking any patents granted to you by other contributors if you sue users of the software for patents you have granted to the project.
gunzip -c SRCFILE.tar.gz
Why not just use zcat? :)
zcat is a shell script that calls gunzip
A month late, but that depends on the system ;):
~ % file /usr/bin/zcat
/usr/bin/zcat: Mach-O 64-bit executable x86_64
You don’t need the -f if the argument is stdin.
You do, because tar’s behavior is not portable. On OpenBSD, for example, if -f is omitted then tar defaults to /dev/rst0 (a tape drive).
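A self-contained illustration of why spelling out `-f -` matters (throwaway /tmp paths; on GNU tar omitting -f happens to default to stdin when reading a pipe, but on OpenBSD it would reach for the /dev/rst0 tape drive instead):

```shell
# Build a small tarball to play with
mkdir -p /tmp/tar-demo /tmp/tar-demo-out
echo hello > /tmp/tar-demo/file.txt
tar -czf /tmp/tar-demo.tar.gz -C /tmp tar-demo/file.txt

# Portable extraction from stdin: "-f -" says "the archive is stdin"
# explicitly, so no implementation guesses a default device.
gunzip -c /tmp/tar-demo.tar.gz | tar -xf - -C /tmp/tar-demo-out
```

The same reasoning applies to writing: `tar -cf - dir | gzip` rather than relying on whatever the local tar considers its default archive.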
It’s been about ten years, but I had an idea for a site called “listen to my CDs”. I’d upload a CD, and then allow a visitor (one, singular) to listen to it, streaming data in real time, but no faster. Surely I’m allowed to do that?
I thought it would be a really good test case for copyright law, but never quite got around to it. Never decided to give it up, maybe it’ll still work.
the aereo case seems relevant.
Indeed. By the way, Scalia (RIP) dissented in Aereo:
We came within one vote of declaring the VCR contraband 30 years ago… The dissent in that case was driven in part by the plaintiffs’ prediction that VCR technology would wreak all manner of havoc in the television and movie industries. … We are in no position to judge the validity of those self‐interested claims or to foresee the path of future technological development.
surprisingly enough, I was in favour of the aereo ruling. I absolutely believe rebroadcasting should not be illegal, but if it is, the sophistry of having one microantenna per customer to make it technically okay is a clear case of evading the law
Maybe also of interest:
Document Formatting and Typesetting on the UNIX System (ISBN: 9780961533625)
Document Formatting and Typesetting on the UNIX System: GRAP, MV, MS and TROFF (ISBN: 9780961533632)
Some parts are also available at google books.
there are some more titles on http://www.troff.org/books.html
A free ebook for writing manpages, Practical Unix Manuals, by Kristaps Dzonsons (the author of mandoc): https://manpages.bsd.lv/
It also contains “The History of UNIX Manpages”, another nice read.
This is a scan of the original book. The Groff community transcribed the scans into troff source and generated a new PDF from that, which is much nicer to read: http://home.windstream.net/kollar/utp/
yes, see the last link on the O'Reilly site
I saw, but it was worth pointing out because people are more likely to click “A single PDF file via HTTP” than “groff and PostScript files‐‐Beta”…
I really love this talk. Leveraging userland exploits and the GPU to gain full kernel access, and finally total control of the system, within moments of boot.
And all on a common piece of consumer electronics with over 60 million units shipped. Makes you wonder how much other hardware/software is capable of being so completely compromised.