This article is especially interesting to me, as it shows how VS Code still doesn’t have the “Emacs nature”. Even though I’m a 30-year Emacs user, I hesitate to recommend Emacs to younger programmers because it’s so alien, and VS Code does have one of the essential characteristics of Emacs: the extension language and the implementation language are the same. But this article is a great example of how that only goes so far: extensions are limited to an extension API, rather than having full access to the application’s internals. Maybe that’s a good thing if you’re a mass-market product worried about malicious extensions. But I’ll note that rainbow-delimiters-mode dates back to 2010 and has never noticeably slowed down the loading or display of source files, even in languages with lots of delimiters, like Lisp.
It may be partly about malicious extensions, but there are also potential performance wins to having extensions in a separate process (as noted in this post, and also seen in Sublime Text). Maybe “performance win” is the wrong phrase; it’s more about consistency of performance: a poorly written extension is not going to slow down the core editor experience. I’ve worked on a couple of editors in the past, and it is a real concern that people will complain “your editor is slow and crappy” when the real issue is the extensions in use. There’s also a compatibility benefit to having a limited, supported API for extensions, versus extensions being able to reach in and change anything.
This is not to say the VS Code/Sublime Text way is better than Emacs, to be clear, just that there are tradeoffs either way. Emacs has clearly been super successful with the tradeoffs it made, but it’s possible that a different audience has different expectations.
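To make the sandboxing concrete, here is a minimal sketch (my own illustration, not the extension from the article) of what a bracket colorizer looks like when written against the public VS Code extension API. Note that everything goes through `vscode.*` calls in the extension host process, and that this naive version ignores strings and comments:

```typescript
// Illustrative sketch only: a naive bracket colorizer using the public
// VS Code extension API. The extension runs in the extension host process
// and can only touch the editor through vscode.* calls, never its internals.
import * as vscode from "vscode";

// One decoration type per nesting depth, cycled; the core editor applies them.
const depthColors = ["#d19a66", "#61afef", "#c678dd"].map((color) =>
  vscode.window.createTextEditorDecorationType({ color })
);

function colorize(editor: vscode.TextEditor) {
  const doc = editor.document;
  const text = doc.getText();
  const buckets: vscode.Range[][] = depthColors.map(() => []);
  let depth = 0;
  // Naive scan: unlike rainbow-delimiters-mode, this ignores strings/comments.
  for (let i = 0; i < text.length; i++) {
    if (text[i] === "(") {
      buckets[depth % depthColors.length].push(
        new vscode.Range(doc.positionAt(i), doc.positionAt(i + 1))
      );
      depth++;
    } else if (text[i] === ")") {
      depth = Math.max(0, depth - 1);
      buckets[depth % depthColors.length].push(
        new vscode.Range(doc.positionAt(i), doc.positionAt(i + 1))
      );
    }
  }
  depthColors.forEach((deco, i) => editor.setDecorations(deco, buckets[i]));
}

export function activate(context: vscode.ExtensionContext) {
  if (vscode.window.activeTextEditor) colorize(vscode.window.activeTextEditor);
  vscode.window.onDidChangeTextEditorSelection(
    (e) => colorize(e.textEditor),
    null,
    context.subscriptions
  );
}
```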
Can you verify on checker.ts, please?
I just tried it, and while it is not 100% instant, it takes less than a tenth of a second, and I don’t see a noticeable spike in CPU usage when editing the file with rainbow-delimiters-mode enabled.
Is it accurate? Comments/strings excluded?
Yes, it’s accurate and excludes comments and strings; I just tested this. On my machine (i5-3360M circa 2012, 8GB RAM, Fedora Linux, Emacs 28 with native compilation), I could not notice any slowdown loading the file or editing near the top of the document. Near the bottom of the document, I do see a little slowdown when inserting or deleting delimiters, but I would put it at around 100 ms. Even at the bottom of the document, scrolling or jumping around isn’t slowed down at all.
Will do when I’m on a better connection.
I hope the AC or space heating wasn’t on while the author was writing this piece because that would completely negate the few Wh of carbon saved a thousand-fold.
PS: using solar power for a minuscule amount of carbon saving is mental masturbation and would solve no problem.
It sounds like you’re unironically criticizing people for participating in society while also trying to improve it. Any serious attempt to address the climate crisis will require the coordinated participation of our whole civilization; individual action is insufficient, but it is still valuable for individuals to explore what steps our society could take to address the crisis. People do not have to be ideologically pure in order to try to make the world a better place.
That said, the author has made heroic efforts to take his apartment off-grid, to the point where “These conditions allow me to get through the winter without a heating system, relying only on solar heat and thermal underclothing.” So this writer definitely did not have space heating on. (He acknowledges this is only possible because of the climate where he is.) I don’t see him mentioning in that piece whether he has air conditioning, but given that he has written extensively about more efficient cooling using fans instead of air conditioning, I suspect he uses less of it than the average person. Purity should not be required for criticizing our society, but Low Tech Magazine does pretty well in the purity department.
It’s also a little disingenuous to call this a minuscule amount of carbon saving. Yes, his one relatively low-traffic website would not produce massive amounts of carbon emissions either way. But the point is that if the tech industry used these techniques to cut website emissions to a similar degree, it absolutely would have a substantial effect on global emissions. In 2015, data centers alone already contributed 2% of global greenhouse gas emissions, a similar share to global air travel. Worst-case scenarios could lead to the internet contributing up to 23% of global greenhouse gases by 2030. As we do more and more online, turning everything from cars to refrigerators into computers and connecting them to the internet, the internet will be an increasingly large part of our carbon footprint, unless we take action to limit the internet’s emissions. EDIT: I also expect the pandemic to accelerate the movement of everything online, making the sustainability of our web infrastructure even more vitally important.
I believe you replied to me by mistake, but I appreciate your argument and fully agree with you.
I feel that these types of dismissive comments are often left on projects addressing components of the climate crisis and are not very productive. You are implying that it is useless to work on this component because other parts of the problem are even bigger and more daunting. As mentioned on the about page, internet usage is already responsible for 10% of global energy usage, so even if we solve the AC and space heating issues first, we will also still need to solve the internet problem.
And even apart from that, just because a project doesn’t solve the whole problem at once doesn’t mean that it is useless. There is real value in demonstrating solutions and showing how small things can be improved, if only to get people thinking about the energy use of the websites they visit each day. We will have to transition to a carbon-neutral or even carbon-negative global society in the coming decades, and it can only be done in steps, so it would be best if every initiative towards that goal were received constructively, without dismissive comments that divert attention by pointing to the scale of the overall problem.
Edited to add: I also just want to say that this whole idea of good efforts being ‘negated’ by some other appliance running is nonsensical. Whether an AC or space heater is on is completely independent of whether the server for this site is sustainable. In the hypothetical scenario where the author is running an AC or space heater: if the site were not sustainable, the total emissions would be those of the AC/space heater plus those of the server; with a sustainable site, they are only those of the AC/space heater. Nothing is being ‘negated’, and it is misleading to state it as such.
I agree. Apart from the fact that many frankly act and speak in bad faith, we must never forget the scale of climate change. Even if one wants to dismiss this solar server as a toy experiment, it is at least an experiment. To any and all pot-shotters, MacKaye’s corollary holds: ‘what the f*ck have you done?’
Why must a person have done something for his opinion on something to be valid?
Can I not criticise the pointlessness of something if I don’t also participate in said pointless something?
Not to mention that when it comes to carbon emission, not doing anything is actually better than the alternative, a lot of the time.
> You are implying that it is useless to work on this component because other parts of the problem are even bigger and more daunting.

No. I’m implying that it is not very useful, and that it would be more useful to solve a bigger part of the problem.
> And even apart from that, just because a project doesn’t solve the whole problem at once, doesn’t mean that it is useless.

It does mean that the effort could be more effectively spent elsewhere. That it is not being spent there raises the question of why. If the purpose is to reduce carbon emissions, there are much better places to spend effort. Therefore, one can conclude that the purpose is not to reduce emissions.
> without dismissive comments that divert attention by referring to the scale of the overall problem.

Diverting attention to the scale of the overall problem is a good thing in my book, lest attention be spent on micro-solutions.
> Whether an AC or space heating is on is completely independent from whether the server for this site is sustainable.

Opportunity cost of effort, etc. It might well save more carbon to simply stay home for a day than to come in to work and write a piece about how you can save a few Wh of carbon by giving up 10% uptime.
Office work is itself a big problem, and one Low Tech Magazine has engaged with, for example here: https://solar.lowtechmagazine.com/2016/11/the-curse-of-the-modern-office.html
And if you read the article you would know the whole setup is based on the author’s apartment balcony :)
My comment is about the writing, not the physical setup of the machine.
And one could contrive a few scenarios where doing something else would end up saving more emissions. Because at the end of the day, beating a few Wh of carbon savings isn’t hard.
More details on how the system is built and power usage here: https://solar.lowtechmagazine.com/2018/09/how-to-build-a-lowtech-website.html
> Connectivity will be the great equalizer in the future.

Equalizer of what, exactly? This statement and the ensuing paragraph betray the misguided belief that technology is the prescription for social problems. Starlink feels like nothing more than a giant ego trip. If this were truly an egalitarian effort, the people behind it would have consulted the rest of the world.
The author also seems to assume that somehow Starlink will provide a cheap and high-quality service, while complaining about the “greedy last-mile monopolists” in another paragraph. Would it not make more sense to assume that this company will be just as greedy once it has established its own monopoly, to the detriment of us all?
You can’t create a monopoly by adding another competitor.
Existing telcos are a natural monopoly because trenches, poles and wires are astonishingly expensive and it doesn’t make economic sense to build a duplicate set of them in the same location.
While Starlink is also astonishingly expensive (more expensive per unit bandwidth for all but the lowest-density regions), it’s not locked to a single physical location. Being able to rearrange the fleet to serve different regions at different densities is a huge deal because it means every monopoly ISP on the planet now has plausible competition.
Monopoly ISPs (e.g. Comcast in many US cities) will be forced to adapt and offer a reasonable level of service to fight off this competition (as they did when Google Fiber came out).
I don’t think Starlink will offer particularly great value for money, but a capitalist market cannot operate well without competition, and Starlink will provide that.
https://invulns.nl/ - All the static site generators I could find didn’t do it exactly how I liked, so it’s generated statically by a Racket script I threw together, which takes markdown files with some custom properties at the top and generates the pages.
I really like making ASCII art, so I added an ASCII animation in the header. I had a lot of fun making that :)
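As a rough illustration of the approach described above: the author’s actual script is in Racket, but a TypeScript sketch of the same shape might look like this, assuming hypothetical `pages/` and `out/` directories and leaving the markdown-to-HTML step as a stub:

```typescript
// Sketch of a minimal static site generator: markdown files with a few
// "key: value" properties at the top, one HTML page out per file.
import { readdirSync, readFileSync, writeFileSync, mkdirSync } from "node:fs";
import { join, basename } from "node:path";

// Split the leading "key: value" property lines from the markdown body.
function parsePage(src: string): { props: Record<string, string>; body: string } {
  const lines = src.split("\n");
  const props: Record<string, string> = {};
  let i = 0;
  while (i < lines.length && /^\w+:\s/.test(lines[i])) {
    const [key, ...rest] = lines[i].split(":");
    props[key.trim()] = rest.join(":").trim();
    i++;
  }
  return { props, body: lines.slice(i).join("\n") };
}

mkdirSync("out", { recursive: true });
for (const file of readdirSync("pages").filter((f) => f.endsWith(".md"))) {
  const { props, body } = parsePage(readFileSync(join("pages", file), "utf8"));
  // Markdown-to-HTML rendering elided; any renderer would slot in here.
  const html = `<!doctype html><title>${props["title"] ?? file}</title>\n<pre>${body}</pre>`;
  writeFileSync(join("out", basename(file, ".md") + ".html"), html);
}
```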
> I really hate this trend of employees of big companies using Medium as a pseudo-anonymous complaints board.

But they are not anonymous at all? At the bottom of the post is an extensive list of employees with their names and titles.
> If six zillion Google employees feel this strongly about this issue, they should get together, draft a letter, sign it, and deliver it to management. This is what people of integrity do when they feel strongly about a given issue. Put your money where your mouth is, people. Yes, that means risking your job. Do you feel strongly, or not?

The only difference is that they also posted the letter they drafted and signed online, to spread awareness of their position and gather additional signatures. I’m sure they have sent the letter to management as well, so I don’t see what is wrong with also putting it online.
Quite right. I stand corrected. Good on them!
Urgh, damn it. I guess I should download Wikipedia while Europeans like me are still allowed to access all of it… It’s only 80 GB (wtf?) anyway.
That and the Internet Archive. ;)
Regarding Wikipedia, do they sell offline copies of it so we don’t have to download 80 GB? Seems like it would be a nice fundraising and sharing strategy combined.
I second this. While I know the content might change in the near future, it would be fun to have memorabilia of a digital knowledge base. I regret throwing my Solaris 10 DVDs in the garbage; Sun sent them to me for free back in 2009. I was too dumb back then.
It’s a bit out of date, but there’s wikipediaondvd.com, and lots more options at dumps.wikimedia.org.
I wonder how much traffic setting up a local mirror would entail; it might be useful. Probably the type of thing that serious preppers do.
You can help with seeding, too.
Actually Wikipedia is exempt from this directive, as is also mentioned in the linked article. While I agree that this directive will have a severely negative impact on the internet in Europe, we should be careful not to rely on false arguments.
Do you remember the encyclopedias of the 90s? They came on a single CD. 650MB.
To be explicit, this is not a “modern systems are bloated” thing. The English Wikipedia has an estimated 3.5 billion words. If you took out every bit of multimedia, every talk page, all the metadata, and the edit history, it’d still be 30 GB of raw text uncompressed.
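As a quick sanity check that those two figures are consistent with each other (my arithmetic, not from the thread):

```typescript
// ~30 GB over ~3.5 billion words works out to a plausible bytes-per-word.
const bytesPerWord = 30e9 / 3.5e9;
console.log(`${bytesPerWord.toFixed(1)} bytes/word`); // ≈ 8.6: ~5-letter words plus spaces and markup
```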
Oh that’s not what I was implying. The commenter said “It’s only 80 GB (wtf?)”
I too was surprised at how small it was, but then remembered the old encyclopedias and realized that you can put a lot of pure text data in a fairly small amount of space.
Remember that they had a very limited selection, with low-quality images, at least on the ones I had. So it makes sense that there’s a big difference. I feel you, though, on how we used to get a good pile of learning in a small package.
That sounds like a fun text-encoding challenge: try to get that 30 GB of wiki text onto a single-layer DVD (about 4.7 GB).
I bet it’s technically possible with enough work. AFAIK Claude Shannon experimentally showed that human-readable English text carries only about one bit of information per character. Of course there are lots of languages, but they must each have some near-optimal encoding. ;)
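Back-of-the-envelope, under that Shannon estimate (my arithmetic, not from the thread):

```typescript
// If English text really carries ~1 bit of information per character,
// 30 GB of ~1-byte characters should squeeze into roughly 1/8 the space.
const chars = 30e9;        // ~30 GB of text, about one byte per character
const bitsPerChar = 1.0;   // Shannon's experiments put it around 0.6-1.3
const idealGB = (chars * bitsPerChar) / 8 / 1e9;
console.log(`~${idealGB.toFixed(2)} GB`); // ≈ 3.75 GB: fits a 4.7 GB DVD, misses a 700 MB CD
```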
Not even sure it’d be a lot of work. Text packs extremely well; IIRC compression ratios over 20x are not uncommon.
Huh! I think gzip usually achieves about 2:1 on ASCII text, and lzma is up to roughly twice as good. At least one of those beliefs has to be incorrect, then.
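Beliefs like these are easy to sanity-check with Node’s built-in zlib; `sample.txt` here is just a stand-in for any large plain-text file:

```typescript
// Measure gzip's actual ratio on a plain-text file.
import { readFileSync } from "node:fs";
import { gzipSync, constants } from "node:zlib";

const text = readFileSync("sample.txt"); // hypothetical input file
const packed = gzipSync(text, { level: constants.Z_BEST_COMPRESSION });
console.log(`gzip: ${(text.length / packed.length).toFixed(2)}x`);
```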
Okay so, make it challenging: same problem, but this time a 700 MB CD-R. :)
There is actually a well-known text compression benchmark based on Wikipedia; the best compressor manages about 8.5x while taking just under 10 days to decompress. Slightly more practical is lpaq9m at about 15 minutes, but with “only” about 6.9x compression.
What does 6.9x compression mean? Is it just 30 GB / 6.9 ≈ 4.3 GB compressed? That doesn’t match up with the page you linked, which (assuming it’s in bytes) is around 143 MB (much smaller than 4.3 GB).
From the page, the benchmark input is the first 10^9 bytes of the English Wikipedia dump (enwik9), about 0.93 GiB, not the full 30 GB. lpaq9m lists 144,054,338 bytes as the compressed output size + compressor (10^9 / 144,054,338 ≈ 6.94), and 898 nsec/byte decompression throughput, so 10^9 × 898 ns ≈ 898 s, about 15 minutes to decompress that 0.93 GiB.
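The same arithmetic as a runnable check, using the figures listed above:

```typescript
// Large Text Compression Benchmark figures for lpaq9m on enwik9.
const inputBytes = 1e9;          // enwik9: first 10^9 bytes of English Wikipedia
const compressed = 144_054_338;  // compressed output + compressor, in bytes
const nsPerByte = 898;           // listed decompression throughput

console.log(`${(inputBytes / compressed).toFixed(2)}x`);               // ≈ 6.94x
console.log(`${((inputBytes * nsPerByte) / 1e9 / 60).toFixed(0)} min`); // ≈ 15 min
```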
Nice! Thanks.
https://juliareda.eu/2018/09/ep-endorses-upload-filters/
Hmm, I think this actually makes it mandatory for Wikipedia to install an upload filter.
There is actually an exception for websites like Wikipedia in this version of the directive:
> ‘online content sharing service provider’ means a provider of an information society service one of the main purposes of which is to store and give access to the public to copyright protected works or other protected subject-matter uploaded by its users, which the service optimises. Services acting in a non-commercial purpose capacity such as online encyclopaedia, and providers of online services where the content is uploaded with the authorisation of all rightholders concerned, such as educational or scientific repositories, should not be considered online content sharing service providers within the meaning of this Directive. Providers of cloud services for individual use which do not provide direct access to the public, open source software developing platforms, and online market places whose main activity is online retail of physical goods, should not be considered online content sharing service providers within the meaning of this Directive;

(Emphasis mine)
On the other hand:

> Reda says Voss misrepresents the true scope of the upload filtering obligation and at no point does the definition exclude platforms that don’t make money off their users’ sharing of copyrighted content. She concedes that “completely non-commercial platforms” are excluded, but points out that experience has shown that even a call for donations or the use of an advertising banner can be considered commercial activity.

(Emphasis mine, https://thenextweb.com/eu/2018/06/19/the-eus-disastrous-copyright-reform-explained/)
Also, I am not sure that this is the exact wording that has passed. I am, to be honest, not well-versed in the EU legislative procedure.
Does an American organization have to care about exceptions in stupid European laws?
They only do if they have enough presence in a European country willing to enforce those laws that they could be hurt in court.
If a company has no presence in any EU country, it can ignore those laws, just like it ignores the laws against insulting the Thai king and the laws against telling the truth about the Tiananmen Square Massacre.
Until some European countries order their ISPs to block all traffic towards those companies.
This has already happened with major torrent sites like ThePirateBay.org, which serves up this page to everyone in the Netherlands with this ISP (and they are quite activist about providing everyone unrestricted access to the entire internet). Take note that other European countries have ordered similar filters and takedowns from their ISPs, and those are being actively enforced.
Again, that only hurts the company in proportion to how much of their business was coming out of the EU to begin with.
It also isn’t forcing them to abide by the law of any EU member state, any more than West Germany was forced to abide by East German law when the Berlin Wall was up and East Germans were barred from going to West Germany.
> Again, that only hurts the company in proportion to how much of their business was coming out of the EU to begin with.

True, but since most major content platforms in Europe are American companies, I doubt they’d get away with ignoring these laws. Nor do I think they’d like to give up a market of about 510 million people; note that the United States is a market of only 325 million. So in terms of numbers, you have to care if you intend to grow beyond the United States, Canada, and Mexico somewhere in the near future. You also have to keep in mind that Europe is a lot closer to the United States than you might think.
> It also isn’t forcing them to abide by the law of any EU member state, any more than West Germany was forced to abide by East German law when the Berlin Wall was up and East Germans were barred from going to West Germany.

Actually, that isn’t true at all. West Germany still had West Berlin and had to maintain supply lines to that part of Berlin through East German (DDR) territory. Because of this, there were a bunch of DDR laws they had to abide by, despite being separate countries. A scenario like this might happen to US companies as well.
It’s going to be interesting for US firms that use e.g. the Dutch sandwich to avoid US taxes.
Wasn’t this feature initially there in Smalltalk? I distinctly remember playing with a Smalltalk dialect like Pharo a few years ago and being blown away when I stumbled upon this feature in the menu.
You’re right: in this blog post, the author mentions that they were inspired by the implementation of this feature in Pharo.