I enjoyed the author’s previous series of articles on C++, but I found this one pretty vacuous. I think my only advice to readers of this article would be to make up your own mind about which languages to learn and use, or find some other source to help you make up your mind. You very well might wind up agreeing with the OP:
Programmers spend a lot of time fighting the borrow checker and other language rules in order to placate the compiler that their code really is safe.
But it is not true for a lot of people writing Rust, myself included. Don’t take the above as a fact that must be true. Cognitive overheads come in many shapes and sizes, and not all of them are equal for all people.
A better version of this article might have gone out and collected evidence, such as examples of actual work done or experience reports or a real comparison of something. It would have been a lot more work, but it wouldn’t have been vacuous and might have actually helped someone answer the question posed by the OP.
Both Go and Rust decided to special case their map implementations.
Rust did not special case its “map implementation.” Rust, the language, doesn’t have a map.
Hi burntsushi - sorry you did not like it. I spent months before this article asking Rust developers about their experiences, concentrating on people actually shipping code. I found a lot of frustration among the production programmers, less so among the people who enjoy challenging puzzles. The latter mostly like the constraints and in fact find it rewarding to fit their code within them. I did not write this sentence without making sure it at least reflected the experience of a lot of people.
I would expect an article on the experience reports of production users to have quite a bit of nuance, but your article is mostly written in a binary style without much room for nuance at all. This does not reflect my understanding of reality at all—not just with Rust but with anything. So it’s kind of hard for me to trust that your characterizations are actually useful.
I realize we’re probably at an impasse here and there’s nothing to be done. Personally, I think the style of article you were trying to write is incredibly hard to do so successfully. But there are some pretty glaring errors here, of which lack of nuance and actual evidence are the biggest ones. There’s a lot of certainty expressed in this article on your behalf, which makes me extremely skeptical by nature.
(FWIW, I like Rust. I ship Rust code in production, at both my job and in open source. And I am not a huge fan of puzzles, much to the frustration of my wife, who loves them.)
I just wanted to say I thought your article was excellent and well reasoned. A lot of people here seem to find your points controversial but as someone who programs C++ for food, Go for fun and Rust out of interest I thought your assessment was fair.
Lobsters (and Hacker News) seem to be very favourable to Rust at the moment and that’s fine. Rust has a lot to offer. However my experience has been similar to yours: the Rust community can sometimes be tiresome and Rust itself can involve a lot of “wrestling with the compiler” as Jonathan Turner himself said. Rust also provides some amazing memory safety features which I think are a great contribution so there are pluses and minuses.
Language design is all about trade-offs and I think it’s up to us all to decide what we value in a language. The “one language fits all” evangelists seem to be ignoring that every language has strong points and weak points. There’s no one true language and there never can be since each of the hundreds of language design decisions involved in designing a language sacrifices one benefit in favour of another. It’s all about the trade-offs, and that’s why each language has its place in the world.
I found the article unreasonable because I disagree on two facts: that you can write safe C (and C++), and that you can’t write Rust with fun. Interpreted reasonably (so for example, excluding formally verified C in seL4, etc.), it seems to me people are demonstrably incapable of writing safe C (and C++), and people are demonstrably capable of writing Rust with fun. I am curious about your opinion of these two statements.
I think you’re making a straw man argument here: he never said you can’t have fun with Rust. By changing his statement into an absolute you’ve changed the meaning. What he said was “Rust is not a particularly fun language to use (unless you like puzzles).” That’s obviously a subjective statement of his personal experience so it’s not something you can falsify. And he did say up front “I am very biased towards C++” so it’s not like he was pretending to be impartial or express anything other than his opinion here.
Your other point, “people are demonstrably incapable of writing safe C,” is similarly plagued by absolute phrasing. People have demonstrably used unsafe constructs in Rust and created memory-safety bugs, so if we’re living in a world of such absolute statements then you’d have to admit that the exact same statement applies to Rust.
A much more moderate reality is that Rust helps somewhat with one particular class of bugs - which is great. It doesn’t entirely fix the problem because unsafe access is still needed for some things. C++ from C++11 onwards also solves quite a lot (but not all) of the same memory safety issues as long as you choose to avoid the unsafe constructs, just like in Rust.
An alternative statement of “people can choose to write safe Rust by avoiding unsafe constructs” is probably matched these days with “people can choose to write safe C++17 by avoiding unsafe constructs”… And that’s pretty much what any decent C++ shop is doing these days.
somewhat with one particular class of bugs
It helps with several types of bugs that often lead to crashes or code injection in C. We call the collective result of addressing them “memory safety.” The extra ability to prevent classes of temporal errors (easy-to-create, hard-to-find errors in other languages) without a GC was a major development. Saying “one class” makes it seem like Rust is knocking out one type of bug instead of piles of them that regularly hit C programs written by experienced coders.
An alternative statement of “people can choose to write safe Rust by avoiding unsafe constructs” is probably matched these days with “people can choose to write safe C++17 by avoiding unsafe constructs”
Maybe. I’m not familiar enough with C++17 to know. I know C++ was built on top of an unsafe language, with Rust designed from the ground up to be as safe as possible by default. I caution people to look very carefully for ways to do C++17 unsafely before thinking it’s equivalent to what safe Rust is doing.
I agree wholeheartedly. Not sure who the target survey group was for Rust but I’d be interested to better understand the questions posed.
Having written a pretty large amount of Rust that now runs in production on some pretty big systems, I don’t find I’m “fighting” the compiler. You might fight it a bit at the beginning in the sense that you’re learning a new language and a new way of thinking. This is much like learning to use Haskell. It isn’t a good or bad thing, it’s simply a different thing.
For context for the author - I’ve got 10 years of professional C++ experience at a large software engineering company. Unless you have a considerable amount of legacy C++ to integrate with or an esoteric platform to support, I really don’t see a reason to start a new project in C++. The number of times Rust has saved my bacon in catching a subtle cross-thread variable sharing issue or enforcing some strong requirements around the borrow checker have saved me many hours of debugging.
I really don’t see a reason to start a new project in C++.
Here’s one: there’s simply not enough lines of Rust code running in production to convince me to write a big project in it right now. v1.0 was released 3 or 4 years ago; C++ in 1983 or something. I believe you when you tell me Rust solves most memory-safety issues, but there’s a lot more to a language than that. Rust has a lot to prove (and I truly hope that it will, one day).
I got convinced when Rust in Firefox shipped. My use case is a Windows GUI application, and if Firefox is okay with Rust, so is my use case. I agree I too would be uncertain if I were doing, say, embedded development.
That’s fair. To flip that, there’s more than enough lines of C++ running in production and plenty I’ve had to debug that convinces me to never write another line again.
People have different levels of comfort for sure. I’m just done with C++.
I was interested in what he was saying just up until he said
Some may even be lucky enough to find themselves doing Extreme Programming, also known as ‘The Scrum That Actually Works’.
My experience with XP was that it was extremely heavyweight and did not work well at all. It created the greatest developer dissatisfaction of any of the versions of Agile I’ve encountered.
Couldn’t disagree more – the most successful team I was on was heavily into XP. When people say it’s heavyweight, they’re usually talking about pair programming. I’m not sure what people have against it; I’ve found it’s a great way to train junior developers, awesome for tricky problems, and generally a great way to avoid the problem of, “Oh this PR looks fine but redo it because you misunderstood the requirements.”
I don’t want to discount your experience, but it sounds like the issues you’ve had with pair programming are more with the odd choices your employer imposed.
Both people have specialized editor configs? Sure, switch off computers or whatever too; no need to work in an unfamiliar environment.
And if one person is significantly less experienced than the other, that person should be at the keyboard more often than not – watching the master at work will largely be useless.
Why I like XP over anything else is the focus on development practices rather than business practices. Pairing, TDD, CI, <10 minute builds, WIP, whole team estimation, etc are all used to produce a better product, faster.
The weekly retrospective offers a way to adjust practices that aren’t working and bolster those that are.
Agreed 100%. It turned my head a bit when he thought Agile was too prescriptive, but then was considering an even more prescriptive methodology.
What was your experience with XP? Also, scrum is heavyweight as well in my experience and doesn’t work excellently in an actually agile environment like a startup. Feels like it could work in corp. though.
Nice article, thanks. I liked your very rational approach to evaluating the language. Way too much programming-language comparison seems more ego-driven than fact-driven, and it’s nice to see it done cheerfully and without name calling.
Thanks for the feedback! I’ll be sure to keep that in mind when I go off to write more posts like this
It’s interesting to see Bjarne Stroustrup complaining that C++ is getting too complex - from its inception it’s always been one of the harder languages to fully comprehend so it seems unsurprising that it’s becoming even more intractable over time. And I say this as a C++ programmer.
I’m always curious about the intended audience with these types of posts. The posts typically paint a straw man picture that there are people unwilling to change the operating model to be more efficient given the option, which is absurd. Should we abandon bitcoin? Is that the thesis here?
Clearly the non-technical people would probably not know PoW is inefficient, but they also have little to no control over the dominance of bitcoin and the way it works. There are strong economic incentives for actors supporting the current structure to keep supporting it as is, and the blog post does not address this problem at all.
The cryptocurrency posts themselves paint a strawman that we can’t do anything better than corrupt, for-profit, centralized tech unless we switch over to blockchains. That’s a lie with many counterexamples. Bitcoin itself also has huge hype and drawbacks in practice.
The author is highlighting that hype and drawbacks. He’s also highlighting a social phenomenon where many proponents try to talk like bad things are good things to downplay them. I’d straight up call that fraud since they’re trying to get people’s money.
I understand that you have a strong opinion on the subject but you’re essentially calling anyone who has an interest in decentralized systems a fraudster. I think it’s disingenuous to say “people who have interests different from my own are by definition fraudsters”.
Decentralized, trustless systems have important applications. Bitcoin was created as a response to the banks being involved in widespread fraud. Calling Bitcoin users frauds seems to miss the point in the largest way possible.
“ Bitcoin was created as a response to the banks being involved in widespread fraud.”
So were credit unions and non-profits in response to earlier fraud. I don’t see a lot of them involved in things like 2008 crises. I thought even Bitcoin had a non-profit/foundation controlling or supporting it.
He’s also highlighting a social phenomenon where many proponents try to talk like bad things are good things to downplay them. I’d straight up call that fraud since they’re trying to get people’s money.
That was the key circumstance that I brought up fraud on. The need to use as much energy as Ireland to avoid unscrupulous parties screwing up a few transactions a second is one such implication. It’s a total lie since the regular banking system prevents or catches lots of stuff like that on a daily basis. From there, I pointed out in another comment that a system using regular databases and protocols with distributed checking might take a $5 VPS or one server per participant. Those don’t take the energy of Ireland, insanely slow transactions, or crypto magic.
That the very-smart proponents of Bitcoin don’t investigate such options or tell their potential investors of such alternatives with their risk/reward tradeoffs means they’re more like zealots or con artists. I mean, most people might trust such alternatives since they’re using the regular financial system. They might love solutions that knock out all the real problems they’ve dealt with efficiently plus make plenty of headway on the more rare or hypothetical risks many cryptocurrency proponents worry about all night.
Save the best for last. If it’s Bitcoin, they might also want to know it’s primarily a volatile financial instrument used for speculation instead of a stable currency they can depend on, as its proponents are selling it. I know people who are day trading these things right now, riding the hype waves profitably, while the adopters driving them and sustaining the systems aren’t getting the much better thing people probably promised them. Many of them have also lost money they wouldn’t have lost storing currency in the traditional financial system. Looks like fraud to me.
The need to use as much energy as Ireland to avoid unscrupulous parties screwing up a few transactions a second is one such implication. It’s a total lie since the regular banking system prevents or catches lots of stuff like that on a daily basis. From there, I pointed out in another comment that a system using regular databases and protocols with distributed checking might take a $5 VPS or one server per participant. Those don’t take the energy of Ireland, insanely slow transactions, or crypto magic.
It sounds like you might endorse the notion that PayPal is more effective than Bitcoin. PayPal supports more transactions per second, catches a lot of fraud, supports chargebacks when fraud does happen, and doesn’t require proof-of-work – it all runs safely on PayPal’s verified servers. This is all true, and for many people PayPal is fine enough.
However, the centralized nature of PayPal does have some problems. There’s always the risk of getting your account frozen, which has happened to countless people. Minecraft made too much money in 2010. Wikileaks pissed off powerful entities in 2012. Google has over 600,000 results for “paypal accounts frozen”. I hear that PayPal freezes lots of crowdfunding efforts in particular.
What it comes down to is trust. If you can trust the corporate entity PayPal to expedite your transactions and send you on your way, then the status quo is fine. But if you have a problem with PayPal, or PayPal has a problem with you, then you need to find an alternative.
You can see the same problem on a larger scale with the SWIFT network. Nearly every international interbank transfer takes place on SWIFT, and it works fine as long as everyone trusts each other. But if you find yourself on the wrong end of US sanctions, suddenly your banking system comes to a screeching halt. Russia, China, and Iran are all too aware of this problem and are trying to build alternatives. Russia is working on SPFS and China is building CIPS. They’re also both stockpiling gold, another asset that won’t freeze you out at a moment’s notice.
Bitcoin never freezes anyone out of their funds. If you have the private key, you control the bitcoin wallet, period. It’s math, not bureaucracy.
“This is all true, and for many people PayPal is fine enough.” “However, the centralized nature of PayPal does have some problems”
You’re almost there. A centralized solution like PayPal works really well except in well-known failure modes. SWIFT is another good example I bring up myself in these discussions as better than Bitcoin so far. There are centralized companies, especially credit unions or nonprofits, that aren’t doing all the shady stuff PayPal does. That’s by design. There are cooperatives leaner than SWIFT, too. So, the logical first thing to explore is how to mix those protections with centralized companies like PayPal. If we do decentralized, the first thing to explore should be proven tech for the centralized case with distributed checking, maybe at the granularity of participating organizations, like with banks and SWIFT. So, so, so much more efficient to do that.
Instead, cryptocurrency proponents paint a false dilemma between for-profit, greedy banks vs distributed, energy-sucking, blockchain system. It’s misleading given all the designs in between. Not to mention they seem to only focus on what for-profit, scumbag banks do instead of what centralized organizations designed for public benefit can do. A little weird to sidestep the whole concept of nonprofit, consumer-focused banks or companies, eh? It’s like they want a specific solution ahead of time looking for justifications for it instead of exploring the vast solution space trying to find what works best for most peoples’ goals.
“Bitcoin never freezes anyone out of their funds. If you have the private key, you control the bitcoin wallet, period. It’s math, not bureaucracy.”
You’re telling me Bitcoin ledgers, exchanges, and/or hardware can’t be blocked or made a felony in a country. I doubt that. Hell, the mining situation makes it look more like a traditional oligopoly. I can’t remember if they’re all in China or not. That would be even worse given it would be an oligopoly whose companies are under the control of one government that’s not about libertarianism and the greater good. There’s currently more diverse control and subversion difficulty in the traditional banking system right now if not doing business with banks that are scumbags. I’d avoid any of them on the bailout list to start with.
Good points all around. On second thought, what you’re describing sounds less like PayPal and more like Ripple.
In May 2011, [the creators of Ripple] began developing a digital currency system in which transactions were verified by consensus among members of the network, rather than by the mining process used by bitcoin, which relies on blockchain ledgers. This new version of the Ripple system was therefore designed to eliminate bitcoin’s reliance on centralized exchanges, use less electricity than bitcoin, and perform transactions much more quickly than bitcoin.
It’s targeting the interbank/SWIFT space, and purports to “do for payments what SMTP did for email”.
Oh yeah, I loved their concept when I looked into this stuff. Interledger was my favorite concept, but Ripple stood out, too. Obviously, I have some technical disagreements, but they’re going in a much smarter direction. Their marketing said you can pay for stuff with quick settlements, multiple parties checking stuff, and none of Bitcoin’s energy problems. The quick settlements in a global system are probably the main selling point for most customers.
there are people unwilling to change the operating model to be more efficient given the option, which is absurd
There are absolutely lots of these people unwilling to change the operating model to be more efficient given the option. This is why I looked for claims from reasonably noteworthy bitcoiners and not random nobodies - though the random nobodies use the same arguments, and quote the noteworthy arguments - and linked and quoted them at length to make it clear that this is not straw but actual arguments they make in real life. This is all real. I’m not sure how I could make that clearer.
I don’t see a quote about choosing PoW over efficient alternatives. All the claims quoted in your post all seem to be something along the lines of “the benefits of proof of work are worth it.” To these claims you respond with the argument that they are “highly questionable to anyone who isn’t already a Bitcoin fan.”
From my read, I’d say you do not address the claim that immutability and a shared transaction consensus are useful with any sort of reasoning or argumentation, just a slew of examples meant to sow doubt in the reader’s mind. You use terms like “waste” to describe the use of energy, which clearly reveals the a priori and entirely unargued assumption that it is not worth it. A better approach would be to lay down a reasonable framework for analysis and explain the limits of immutability and the price being paid for it within that framework.
Ultimately, I still don’t quite understand the thesis of this post. Why should the externality of energy expenditure be regulated by the economics driving it (proponents of PoW blockchains) and not governments?
Is this legal in Europe? In Australia, unless not being tracked were legally considered a “common law right,” it wouldn’t be possible to opt out of it.
I think we need to wait and see, as GDPR will go into effect on May 25 and probably a number of practices like this one will be challenged legally. I personally feel this give-your-consent-or-so-long approach is not in the spirit of the law.
If it’s not legal, they’ll make it legal and sugar-coat it with GDPR in a way that’s impractical or infeasible to the users.
I hope Facebook users can combat this with addons, but as most users are mobile users, they surely lack the addons or the technical know-how to set it up.
Just opt out of Facebook already.
I hope Facebook users can combat this with addons
At some point, the person being abused has to acknowledge that they are being abused, and choose to walk away.
Yeah, just opt out. But sadly there are people who, say, expatriated and have no better way to stay in touch with old friends.
Until a viable replacement comes along, which may never happen, I think it’s a nice hope that they can find a way to concentrate on their use case without all the extra baggage.
I am an expat.
I manage to keep in contact with the friends that matter, the same as I did when I didn’t use Facebook in a different state in my home country.
If they’re actually friends, you find a way, without having some privacy raping mega-corp using every conversation against you.
Agreed, I don’t buy the argument that Facebook is the only way to keep in touch from afar.
I’m an expat, and I have regular healthy contact with my friends and loved ones from another continent, sharing photos and videos and prose. I have no Facebook account.
I hope Facebook users can combat this with addons
Then this will happen: https://penguindreams.org/blog/discoverying-friend-list-changes-on-facebook-with-python/
Unfriend Finder was sent a cease-and-desist order and chose not to fight it. I made my own Python script that did the same thing, and ironically, Facebook’s changes that fixed the Cambridge Analytica issue broke my plugin. It stopped third parties, yes, but it also kept developers from having real API access to our own data.
I also wrote another post about what I really think is going on with the current Facebook media attention:
https://fightthefuture.org/article/facebook-politics-and-orwells-24-7-hate/
You’re not forced to use Facebook. It looks like they’re following GDPR and capturing consent. It seems the biggest issue is the bundling of multiple things into one consent and not letting folks opt in or out individually.
I still use it :-) Anecdotally (other users spotted “in the wild”) I think I’d even say usage is increasing in the last few years.
I couldn’t agree more. The amount of dedication and determination this must have taken is quite impressive.
EDIT: Also worth reading about is Jeri Ellsworth, mentioned in the piece as an inspiration.
As far as I’ve seen, Jeri has all but disappeared from the Internet; I used to follow her YouTube channel quite a bit. It’s a shame, she was a great teacher.
Edit: seems like she’s still active on Twitter
I got excited when she started posting about radio stuff about six months ago, but it looks like it was only a short lived return. She was really one of my favourite technical YouTubers back in the day.
Another “quirks” question: did you find any unexpected quirks of Go that made writing this emulator harder or easier?
In this particular case, it feels like the code isn’t too far from what C code would be: here are some basic data structures and here are some functions that operate on them, mostly at the bit level. No fancy concurrency models nor exciting constructs. I think given that this is an inherently low-level program, most niceties from Go weren’t immediately needed.
I did use some inner functions/closures and hash maps, but could’ve just as well done without them. The bottom line is that the language didn’t get in the way, but I didn’t feel like it was enormously helpful, other than making it easier to declare dependencies and handling the build process for me.
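As a hypothetical illustration of the kind of bit-level work being described (the field names and opcode layout below follow common CHIP-8 documentation conventions, not code from this project), splitting a 16-bit opcode into its fields in Go might look like:

```go
package main

import "fmt"

// decode splits a 16-bit CHIP-8-style opcode into the fields most
// instructions use. Purely illustrative; the names (x, y, n, nn, nnn)
// are the conventional ones from CHIP-8 references.
func decode(op uint16) (group, x, y, n, nn byte, nnn uint16) {
	group = byte(op >> 12)    // top nibble selects the instruction group
	x = byte((op >> 8) & 0xF) // first register index
	y = byte((op >> 4) & 0xF) // second register index
	n = byte(op & 0xF)        // 4-bit immediate
	nn = byte(op & 0xFF)      // 8-bit immediate
	nnn = op & 0xFFF          // 12-bit address
	return
}

func main() {
	group, x, y, n, nn, nnn := decode(0x8AB4)
	fmt.Println(group, x, y, n, nn, nnn) // 8 10 11 4 180 2740
}
```

Almost everything an emulator like this does is shifts and masks of this sort, which is why the author’s point about not needing much from the host language rings true.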
Did you run into any issues with GC pauses? That’s one of the things people worry about when building latency-sensitive applications in Go.
Not the OP, but I would assume this kind of application generates very little garbage in normal operation.
The GC pauses are so minuscule now, for the latest releases of Go, that there should be no latency issues even for realtime use. And it’s always possible to allocate a blob of memory at the start of the program and just use that, to avoid GC in the first place.
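A minimal sketch of that up-front allocation idea, using hypothetical types rather than anything from the emulator under discussion: all state lives in fixed-size arrays inside one struct allocated once, so the emulation loop itself never allocates and gives the collector nothing to do.

```go
package main

import "fmt"

// Machine holds all emulator state in fixed-size arrays, allocated
// once up front. Step below never allocates, so the GC has no work
// to do while the emulator runs. (Hypothetical sketch.)
type Machine struct {
	mem [4096]byte // whole address space, embedded in the struct
	v   [16]byte   // registers
	pc  uint16
}

func NewMachine() *Machine {
	return &Machine{pc: 0x200} // programs conventionally load at 0x200
}

// Step fetches one 16-bit opcode and advances the program counter;
// decoding/dispatch would go where the blank assignment is.
func (m *Machine) Step() {
	op := uint16(m.mem[m.pc])<<8 | uint16(m.mem[m.pc+1])
	m.pc += 2
	_ = op
}

func main() {
	m := NewMachine()
	for i := 0; i < 10; i++ {
		m.Step()
	}
	fmt.Println(m.pc) // 532 (0x200 + 10 instructions * 2 bytes)
}
```

Because the arrays are fields rather than slices, the one `NewMachine` call is the only heap allocation in the program.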
The garbage collector hasn’t been an issue either. Out of the box, I had to add artificial delays to slow things down and maintain the frame rate, so I haven’t done much performance tuning/profiling. I am interested in scenarios where this would be critical though.
Interested in this as well. I’ve been toying with the idea of writing a CHIP-8 emulator in Go and would love to hear what the experience of writing emulators is like.
I did exactly this as a project to learn Go! I used channels in order to control the clock speed and the timer frequency and it ended up being a really nice solution. The only real hangup I had was fighting with the compiler with respect to types and casting, but having type checking overall was a good thing.
This article is pretty hollow. I mean yeah, Bitcoin failed. There’s no doubt about that. The weird balance of power between the developers/miners is poorly designed and there really is no incentive to use Bitcoin/Ethereum/Monero/etc other than for speculation and buying drugs.
However these are just design flaws, the germ of the idea is out there and I’m sure someone will iron out these edge cases. Personally I think it involves consolidating the developers and miners together so that they are one (probably anonymous) entity. You need to trust the developers already, might as well put that into the design instead of also having to trust randos who can afford massive GPU farms.
So your solution to achieving a trustless design is … a design with trust in it?
What bit of the cryptocurrency promise is left?
I mean, it basically boils down to what kind of political system you think would work best. Bitcoin is clearly anarchist, personally I think a monarchy is the way to go.
So your solution to achieving a trustless design is … a design with trust in it?
How can we be sure a trustless design is even possible? With what’s available now, you have to trust the developers to ensure the protocol works as advertised, and to distribute software updates. You have to trust the miners to not pull a 51% attack.
What bit of the cryptocurrency promise is left?
A frictionless, resilient, (hopefully) non-corruptible financial system.
That’s what the people I know in Singapore tell me. They tell me all kinds of messed up stuff about their little surveillance/police state. They also tell me the main rulers are smart people that at least want to take care of the people. The younger one is also a bit tech savvy I’m told. About the closest thing to a monarchy or surveillance state I could tolerate. Close to. Call me paranoid. ;)
You need to trust the developers already
no, you can verify their code or write your own alternative client.
In practice, about 0 users actually do this, and it should be reasonably obvious that this won’t change - particularly for a system aspiring to general adoption. Civilisation runs on division of labour; the universally competent Heinleinian individual is a fictional construct.
I think at least a few hundred people review each change, perhaps thousands. Then you’re not trusting the developers, you’re trusting that if something were wrong then one of the people who reviewed it would make a fuss about it.
I’m guessing your point is that not all open source code is bug free? I don’t see how that contradicts anything I’ve said.
Nope, the DAO hack was a bug in the contract, not Ethereum itself.
My point is that it’s not just about bugs in the software and verifying that the software does as advertised. The developers are BDFLs and have sole discretion over how the cryptocoin evolves. The Ethereum devs decided that they wanted their money back and forked the blockchain to reverse the DAO hack. Lots of users protested, to the point of creating an entirely different cryptocoin with the original blockchain intact, but it clearly failed; look at their prices. Most people just don’t care.
It’s not like the Ethereum project can do whatever they want. They were able to do that fork because enough people thought it was reasonable. Any changes they want to make are at the mercy of public opinion.
Sure, what they can do is limited. But as we have seen, they have enough power to do some pretty nepotistic stuff, with little protest.
enough people thought it was reasonable
I don’t think so, most people just weren’t aware of what was going on or didn’t care. There have been other hacks since the DAO, with lots of money at stake, that didn’t see a fork.
By contrast, your idea of consolidating miners and developers seems like it would let them do whatever they want. Unless I’ve misunderstood your proposal.
Yep, that’s exactly what I’m saying. I don’t think centralization is inherently bad, as long as the people in charge have good intentions.
have you thought through how to select people who have good intentions, and how to ensure that they aren’t corrupted or replaced by people with bad intentions?
I’ve thought about it but haven’t really figured anything out yet. The other thing you have to consider is competence, they can be well-intentioned but if they don’t know how to lead then things could go bad as well.
As far as specific people go I thought about Scott Forstall and Smealum, but I’m not 100% sure they would check both boxes. There’s always a risk, I guess.
So much ignorance in one post… It’s hard to know where to begin.
Bitcoin has a market cap of $117 billion. N64, if that’s your idea of “failed” you have great aspirations indeed!
“there really is no incentive to use Bitcoin/Ethereum/Monero/etc other than for speculation and buying drugs” - Bitcoin was originally designed by Satoshi Nakamoto as a response to the bank failures of 2008. He wanted to disintermediate online transactions, removing banks from the equation altogether. Today thousands of online merchants accept Bitcoin or Bitcoin Cash, and it’s growing. International remittance is the really big application for cryptocurrencies right now though with millions of dollars being transferred internationally every day. Banks aren’t a party to these transactions so to that extent Satoshi’s vision has succeeded.
“buying drugs” - Bitcoin is uniquely poorly suited to confidential transactions given its public ledger. These days the NSA routinely tracks all Bitcoin transactions, so only the most foolish would use it for illegal purposes.
These are probably the weakest arguments against Bitcoin I’ve seen. But the coolest bit about Bitcoin is that it is completely voluntary, so you do your thing, and we’ll do ours.
Real arguments against Bitcoin are:
And I’m sure there are others but literally none of the ones presented here are valid.
These are probably the weakest arguments against Bitcoin I’ve seen.
As it says, this is in response to one of the weakest arguments for Bitcoin I’ve seen. But one that keeps coming up.
But the coolest bit about Bitcoin is that it is completely voluntary, so you do your thing, and we’ll do ours.
When you’re using literally more electricity than entire countries, that’s a significant externality that is in fact everyone else’s business.
I would also like to be able to upgrade my gaming PC’s GPU without spending what the entire machine cost.
This is getting better though.
For what it’s worth, Bitcoin mining doesn’t use GPUs and hasn’t for several years. GPUs are being used to mine Ethereum, Monero, etc., but not Bitcoin or Bitcoin Cash.
When you’re using literally more electricity than entire countries, that’s a significant externality that is in fact everyone else’s business
And yet, still less electricity than… Christmas lights in the US or gold mining.
https://coinaccess.com/blog/bitcoin-power-consumption-put-into-perspective/
When you reach for “Tu quoque” as your response to a criticism, then you’ve definitely run out of decent arguments.
Bitcoin (and all blockchain based technology) is doomed to die as the price of energy goes up.
It also accelerates the exhaustion of many energy sources, pushing energy prices up faster for every other use.
All blockchain based cryptocurrencies are scams, both as currencies and as long term investments.
They are distributed, energy-wasting Ponzi schemes.
Wouldn’t an increase in the cost of energy just make mining difficulty go down? Then the network would just use less energy?
No, because if you reduce the mining difficulty, you decrease the chain safety.
Indeed, the fact that the energy cost is higher than the average Bitcoin revenue does not mean that a sufficiently determined pool can’t pay the difference by double spending.
If energy cost doubles, a mix of two things will happen, as they do when the block reward halves:
Either way, the mining will happen at a price point where the mining cost (energy+capital) meets the block reward value. This cost is what secures the blockchain by making attacks costly.
Either way, the mining will happen at a price point where the mining cost (energy+capital) meets the block reward value.
You forgot one word: average.
Much of the brains in the cryptocurrency scene appear to be in consensus that PoW is fundamentally flawed and this has been the case for years.
PoS has no such energy requirements. Peercoin (2012) was one of the first, Blackcoin, Decred, and many more serve as examples. Ethereum, #2 in “market cap”, is moving to PoS.
So to say “ [all blockchain based technology] is doomed to die as the price of energy goes up” is silly.
Much of the brains in the cryptocurrency scene appear to be in consensus that PoW is fundamentally flawed and this has been the case for years.
Hum… are you saying that Bitcoin miners have no brain? :-D
I know that PoS, in theory, is more efficient.
The fun fact is that all the implementations I’ve seen in the past were based on stakes in PoW-based cryptocurrencies. Has that changed?
As for Ethereum, I will be happy to see how they implement PoS… when they do.
Blackcoin had a tiny PoW bootstrap phase, maybe weeks worth and only a handful of computers. Since then, for years, it has been purely PoS. Ethereum’s goal is to follow Blackcoin’s example, an ICO, then PoW, and finally a PoS phase.
The single problem PoW once reasonably solved better than PoS was egalitarian issuance. With miner consolidation this is far from being the case.
IMHO, fair issuance is the single biggest problem facing cryptocurrency. It is the unsolved problem at large. Solving this issue would immediately change the entire industry.
Well, proof of stake assumes that people care about the system.
It sees the cryptocurrency in isolation.
An economist would object that a stakeholder might gain a lot by breaking the currency itself, despite the in-currency loss.
There are many ways to gain value from a failure: e.g. buying surrogate goods cheap and selling them after the competitor’s failure has increased their relative value.
Or by predicting the failure and then causing it, and selling consulting and books.
Or a stakeholder might have a political reason to damage the people with a stake in the currency.
I’m afraid that proof of stake is a naive solution to a misunderstood economic problem. But I’m not sure: I will surely take a look at Ethereum when it becomes PoS-based.
doomed to die as the price of energy goes up.
Even the ones based on proof-of-share consensus mechanisms? How does that relate?
Can you point to a working implementation so that I can give a look?
Last time I checked, proof-of-share did not even work as a proof of concept… but I’m happy to be corrected.
Blackcoin is Proof of Stake. (I’ve not heard of “Proof of Share”).
Google returns 617,000 results for “pure pos coin”.
Instructions to get on the Casper Testnet (in alpha) are here: https://hackmd.io/s/Hk6UiFU7z# . No need to bold your words to emphasize your beliefs.
The emphasis was on the key requirement.
I’ve seen so many cryptocurrencies die a few days after their ICO that I raised the bar for taking a new one seriously: if it doesn’t have a stable user base exchanging real goods with it, it’s just another waste of time.
Also, note that I’m not against alternative coins. I’d really like to see a working and well designed alt coin.
And I like related experiments as GNU Teller.
I’m just against scams and people trying to fool other people.
For example, the Casper Testnet is a PoS built on top of a PoW (as Ethereum currently is).
So, let’s try again: do you have a working implementation of a proof of stake to suggest?
It’s not live or open-source, so I’d understand if you’re still skeptical, but Algorand has simulated 500,000 users.
Again I don’t seem to understand your anger. We’re on a tech site discussing tech issues. You seem to be getting emotional about something that’s orthogonal to this discussion. I don’t think that emotional exhorting is particularly conducive to discussion, especially for an informed audience.
And I don’t understand what you mean by working implementation. It seems like a testnet does not suffice. If your requirements are: widely popular, commonly traded coin with PoS, then congratulations you have built a set of requirements that are right now impossible to satisfy. If this is your requirement then you’re just invoking the trick question fallacy.
Nano is a fairly prominent example of Delegated Proof of Stake and follows a fundamentally very different model than Bitcoin with its UTXOs.
No anger, just a bit of irony. :-)
By working implementation of a software currency I mean not just code and a few beta testers, but a stable user base that uses the currency for real-world trades.
Actually, that’s probably the minimal definition of “working implementation” for any currency, not just software ones.
I could get a little lengthy about vaporware, marketing and scams, if I have to explain why unused software is broken by definition.
I develop an OS myself that literally nobody uses, and I would never sell it as a working implementation of anything.
I will look to Nano and delegated proofs of stake (and I welcome any direct link to papers and code… really).
But frankly, the sarcasm is due to a little disgust I feel for proponents of PoW/blockchain cryptocurrencies (to date, the only real ones I know of that work, despite being broken as actual long-term currencies): I can understand non-programmers selling what they buy from programmers, but any competent programmer should just say “guys, Bitcoin was an experiment, but it’s pretty evident that it has been turned into a big Ponzi scheme. Keep out of cryptocurrencies! Or you are going to lose your real money for nothing.”
To me, programmers who don’t explain this are either incompetent enough to talk about something they do not understand, or are trying to profit from those other people by selling them their tokens (directly or indirectly).
This does not mean in any way that I don’t think a software currency can be built and work.
But as a hacker, my ethics prevent me from using people’s ignorance against them, as do those who sell them “the blockchain revolution”.
The problem is that in the blockchain space, hypotheticals are pretty much worthless.
Casper I do respect, they’re putting a lot of work in! But, as I note literally in this article, they’re discovering yet more problems all the time. (The latest: the security flaws.)
PoS has been implemented in a ton of tiny altcoins nobody much cares about. Ethereum is a great big coin with hundreds of millions of dollars swilling around in it - this is a different enough use case that I think it needs to be regarded as a completely different thing.
The Ethereum PoS FAQ is a string of things they’ve tried that haven’t quite been good enough for this huge use case. I’ll continue to say that I’ll call it definitely achievable when it’s definitely achieved.
Covert asicboost was fixed with segwit, overt is being used: https://mobile.twitter.com/slush_pool/status/977499667985518592
I also wonder aloud … might this CPU be vulnerable to any SgxSpectre/MeltdownPrime/SpectrePrime/BranchScope/Meltdown/Spectre-like vulnerabilities due to the branch predictor?
If comparing them, so far I’d say Barbara Liskov for the win. The work I like most is the kind that can accelerate other work. There’s lots of influence, given both abstract data types’ influence on language design and CLU’s influence on C++, which got widely used. Her work might have also helped build a lot of the others’ work if it had got more investment or refining. The other two things on her Wikipedia page were the first language for distributed systems and an object-oriented database. I recall she also did something in Byzantine fault tolerance, but I can’t remember if it was big impact.
She wasn’t obscure to CompSci either: they gave her a Turing Award for her contributions. She’s still publishing, with new and old papers here. She definitely should be recognized by modern programmers, especially since distributed systems are the thing these days, with her laying a lot of the groundwork back when client-server on PCs and servers was the thing. True pioneer.
Her work on Byzantine Fault-Tolerance is now used in some cryptocurrencies (Ripple and Stellar for example) so it’s pretty significant.
While we’re acknowledging fundamental abstractions, Emmy Noether should get some credit for her contributions to algebra. Algebraic invariants, well-founded induction, lots of important and subtle stuff there. It’s the kind of math that makes formal semantics of programming languages even possible.
Yes! It was thanks to the Noether language’s tribute that I found out about her. She was amazing.
Liskov’s work is great because she created so many great mental tools that I literally use every day.
They mention bitcoin, but can it still be mined with GPUs? I thought after specialized ICs for bitcoin/sha256 appeared, mining it with GPUs became highly unprofitable.
And if not, are these GPUs used for ethereum? Found this site and I never heard about almost all coins listed there.
Ethereum, Monero, etc.
Everyone just says “bitcoin mining” meaning cryptocurrency mining in general, because “bitcoin” is easy to say and more people are aware of the word I guess.
No, it hasn’t been viable to mine Bitcoin with GPUs for a long time. Most newer cryptocurrencies are “ASIC resistant”, which essentially means they’re designed so that a GPU remains an effective way to mine them.
It’s correct to put “ASIC resistant” in quotes, because resistance is only relative to a particular hash function. Any hashing function can be implemented in hardware via an ASIC if the economics motivate it.
This has already happened with Litecoin and Ethereum, I believe.
That could not have been more prescient really.
“As I said before, hiding in this list are 20-30 bugs that cannot be worked around by operating systems, and will be potentially exploitable. I would bet a lot of money that at least 2-3 of them are.”
Spectre v1 + Spectre v2 + Meltdown = about 2-3 unfixable exploitable CPU bugs.
I’m a bit puzzled why the author seems to think that integer wrap on overflow behaviour has something to do with C and undefined behaviour. The same thing happens with nearly all languages which use the processor’s integer arithmetic, because those semantics are provided by the processor itself. Java, C#, etc. all wrap on overflow. There are some exceptions though - Ada provides the “exception on overflow” semantics the author prefers, but it does come with a significant performance penalty because checking for overflow requires additional instructions after every arithmetic operation.
The point here is that if you want performant arithmetic, it’s all about what the processor is designed to do, not anything to do with the languages. Java defines integer wrap as the language’s standard behaviour, but as a result it incurs a performance penalty for integer arithmetic on processors which don’t behave this way. C doesn’t incur this penalty because it basically accepts that overflow works however the processor implements it. And let’s face it: if your program is reliant on the exact semantics of overflowing numbers, you’re probably doing it wrong anyway.
There are some processors which provide interrupts on integer overflow. This eliminates the performance penalty associated with overflow checks if your language is Ada and so you want to trap on overflow. There are other semantics around too - DSP processors often have “clamp on overflow” instead since that suits the use case better and old Unisys computers use ones complement rather than twos complement so their overflow behaves slightly differently.
Performance penalty of “trap on overflow” can be reduced by clever modeling, for example by allowing delayed trap instead of immediate trap. As-if Infinitely Ranged is one such model. Immediate trap disallows optimizing `a+b-b` to `a`, because if `a+b` overflows the former traps and the latter doesn’t. Delayed trap allows such optimization.
You are mixing up the underlying behaviour of the processor with the defined (or undefined) behaviour of the language. Wrap on integer overflow is indeed the natural behaviour of most common processors, but C doesn’t specify it. The post is saying that some people have argued that wrap-on-overflow should be the defined behaviour of the C language, or at least the implementation-defined behaviour implemented by compilers, and then goes on to provide arguments against that. There is a clear example in the post of where the behaviour of a C program doesn’t match that of 2’s complement arithmetic (wrapping).
That’s the point - in C, it doesn’t happen.
I don’t get the point. The advantage of using integer wrap for C on processors that implement integer wrap is that it is high performance, simplifies compilation, has clear semantics, and is the semantics programmers expect. If you want to argue that it should be e.g. trap on overflow, you need to provide a reason more substantive than theoretical compiler optimizations that are shown by hand waving. The argument that it should be “generate code that overflows but pretend you don’t” needs a stronger justification because the resulting semantics are muddy as hell. I’m actually in favor of a debug-mode overflow trap for C but an optimized mode that uses processor semantics.
Read the post, then; there are substantive reasons in it. I’m not engaging with you if you’re going to start by misrepresenting reasoned arguments as “hand waving”.
“However, while in many cases there is no benefit for C, the code generation engines and optimisers in compilers are commonly general and could be used for other languages where the same might not be so generally true; “
Ok! You think that’s a substantive argument.
You’re making a straw man. What you quoted is part of a much larger post.
That’s not what “straw man” means.
It means that you’re misrepresenting the argument, which you are. I said that the post contained substantive reasons; you picked a particular part and insinuated that I had claimed that that particular part on its own constituted a substantive reason, which I didn’t. And: you said “If you want to argue that it should be e.g. trap on overflow, you need to provide a reason more substantive than theoretical compiler optimizations that are shown by hand waving”, but optimisations have very little to do with trapping being a better behaviour than wrapping, and I never claimed they did, other than to the limited extent that trapping potentially allows some optimisations which wrapping does not. But that was not the only reason given for trapping being a preferable behaviour; again, you misrepresented the argument.
They are related, yes. E.g. whilst signed integer overflow is well defined in most individual hardware architectures (usually as a two’s complement wrap), it could vary between architectures, and thus C leaves signed integer overflow undefined.
The whole argument is odd.