1.  

    I am going to release a big internal change to the Merit web wallet and Desktop Wallet this week that should significantly improve transaction and transaction-history performance. We are moving a lot of processing client-side instead of routing it through our server. I’m helping get this release out the door.

    Most of my other work will be engaging with the community on Discord and Telegram. We also have a bitcointalk announcement that we participate in daily.

    1. 0

      Linus is not ‘abusive’. He’s exactly the same as anyone else. It’s a sample size issue. When you send literally hundreds of emails every day for 2 decades, that’s a lot of emails. Of course amongst them will be some that are rude. There are also a hell of a lot that are not.

      If anyone sent the quantity of publicly viewable emails per day, for decades, that he sends, and they were genuine emails actually thought through and typed out, not automated stuff sent by tools, they would send a couple of rude ones over the years. Guaranteed.

      He’s never said anything rude to anyone he hasn’t had several beers with before and met in person. It’s always to long-term maintainers who have done something extremely unprofessional and inappropriate, like fucking up basic maintainership duties. And guess what? Every time, it’s had the right result: the problem has been fixed.

      The half a dozen times he’s said something really rude in an email, it’s been picked up by people who delight in publicly shaming anyone who doesn’t meet their ridiculous standards, who then dogpile him on whatever the dominant social media site is at the time (Reddit, Twitter, HN, whatever). There are countless people in these threads diagnosing him with autism or Tourette’s or other similar nonsense. There are even some here. It’s stupid.

      Linus has completely cracked if he thinks that a couple of weeks away is going to result in him magically being able to avoid ever offending anyone over email when he sends dozens of emails a day for years.

      EDIT: got a little carried away in this thread. sorry.

      1. 26

        Linus is feeling bad about his behaviour. Who are you to second guess his own feelings about himself?

        1. 3

          What kind of question is that? I’m me. Who else would I be? ‘Who are you to [say thing I disagree with]’ is pretty bad form. I’m as much anyone as anyone else here.

          His ‘feelings about himself’ are the product of being inundated with bullshit about his perfectly reasonable behaviour for years based on a couple of cherry-picked emails out of tens of thousands and outrage culture. When you say things enough times, people believe them even though they know they’re false.

          1. 14

            His ‘feelings about himself’ are the product of being inundated with bullshit about his perfectly reasonable behaviour for years based on a couple of cherry-picked emails out of tens of thousands and outrage culture.

            I believe we’re on the same track on outrage culture. That certainly exists and is total bullshit. I’m hoping that ignoring it will be enough, but I will fight it if necessary.

            When you say things enough times, people believe them even though they know they’re false.

            Sometimes fighting against popular sentiment is heroism, and sometimes it’s sociopathy. I think fighting for the right to behave like an asshole is steering dangerously close to the latter.

        2. 7

          You are making a statistical argument, but did you do a sentiment analysis of his emails? If you are going to make that argument, I recommend you do some statistical analysis of his writings. If what you say is true, his results should be similar to those of other public figures in the field.

          My GUT feeling is that you are wrong, but I’m not going to put in the effort to prove it, because I have other things to do. However, if you feel you are right, I encourage you to actually do the work to find out.

        1. 41

          My respect for Linus just shot up. It’s never too late to reinvent yourself.

          1. [Comment removed by author]

            1. 6

              WHY?

          1. 4

            Going to the zoo, because I fucking love animals. We’re in the 6th mass extinction event; enjoy them while we’ve got them.

            1. 2

              God, I loved SeaMonkey. Used it for like 12 years as my browser/mail/IRC client. I moved to Firefox after Quantum came out. If SeaMonkey ever merges Quantum, I’ll go back.

              1. 4

                Preparing to release a big change to the Merit Lightwallet with major internal changes that improve transaction performance significantly.

                1. 8

                  I love this kind of stuff because it seems young developers confuse the web with the internet. There is more than HTTP out there, folks! For god’s sake, make your own protocols! It’s fun!

                  1. 6

                    I agree. Do you have any recommendation about how to learn to implement your own protocols?

                    1. 11
                      • Assume network drops or delays your packets indefinitely.
                      • Use CBOR for binary protocols and JSON (one message per line) for plaintext protocols as a very safe starting point (minimal sketch below).
                      • Unauthorized peers being able to grow other peers’ internal state opens up the possibility of cheap DoS attacks.
                      • Don’t roll your own crypto.
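
                      A minimal sketch of the “one JSON message per line” framing and the bounded-state point above, with names and limits I made up for illustration; a real implementation would read from a socket, enforce the size cap while reading, add timeouts for peers that stall (the network will drop or delay packets), and hand each line to a JSON parser:

                      ```cpp
                      #include <cstddef>
                      #include <iostream>
                      #include <optional>
                      #include <sstream>
                      #include <string>

                      // Pull one newline-terminated message out of `in`, refusing oversized lines.
                      // A real reader would enforce the cap while reading from the socket and add
                      // timeouts for stalled peers before handing the line to a JSON parser.
                      std::optional<std::string> read_message(std::istream& in, std::size_t max_len = 64 * 1024) {
                          std::string line;
                          if (!std::getline(in, line)) return std::nullopt;  // connection closed / no more data
                          if (line.size() > max_len) return std::nullopt;    // drop the peer: message too big
                          return line;
                      }

                      int main() {
                          // Stand-in for a network stream: two newline-delimited JSON messages.
                          std::istringstream fake_peer("{\"type\":\"ping\"}\n{\"type\":\"tx\",\"id\":42}\n");
                          while (auto msg = read_message(fake_peer)) {
                              std::cout << "got message: " << *msg << "\n";
                          }
                      }
                      ```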
                      1. 7

                        I’ll add that learning about and practicing with FSMs, plus FSMs of FSMs, is good preparation. Most protocols that I looked into were FSMs.

                        1. 4

                          Haha, just edited my post to say finite state machines are your friend. :-P

                          1. 2

                            Yeah. People sometimes make the mistake of assuming the network is reliable and fail to factor in the drops, fixing them on a case-by-case basis and turning the code into a horrible spaghetti mess.

                            FSMs turn that into “what if I receive an init packet while waiting for a reply?” which leads to much more solid designs.
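
                            Here’s a tiny sketch of that style; the states, events, and transitions are invented for the example, but the point is that “init while waiting for a reply” becomes an explicit, deliberate transition instead of an accident:

                            ```cpp
                            #include <initializer_list>
                            #include <iostream>

                            // Invented protocol states and events; real protocols have more of both.
                            enum class State { Idle, AwaitingReply, Established };
                            enum class Event { SendInit, RecvInit, RecvReply, Timeout };

                            // Every (state, event) pair is handled on purpose, which is what keeps
                            // drop and duplicate handling out of spaghetti territory.
                            State step(State s, Event e) {
                                switch (s) {
                                    case State::Idle:
                                        return e == Event::SendInit ? State::AwaitingReply : State::Idle;
                                    case State::AwaitingReply:
                                        if (e == Event::RecvReply) return State::Established;
                                        if (e == Event::RecvInit)  return State::AwaitingReply; // duplicate init: stay put, maybe resend
                                        if (e == Event::Timeout)   return State::Idle;          // give up and start over
                                        return State::AwaitingReply;
                                    case State::Established:
                                        return State::Established; // data flows; teardown omitted for brevity
                                }
                                return State::Idle;
                            }

                            int main() {
                                State s = State::Idle;
                                for (Event e : {Event::SendInit, Event::RecvInit, Event::RecvReply}) s = step(s, e);
                                std::cout << (s == State::Established ? "established\n" : "not established\n");
                            }
                            ```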

                        2. 3

                          Any time you need IPC within or across machines is a chance to implement a protocol. Generally, it’s not a good idea if you don’t know what you are doing, so I would first try on a hobby project. If you are getting paid for the work, do it when you have the chops to do it and the need.

                          This goes for everything: if you are skilled at making it, make it; otherwise use the work of those who are. Clearly, there is a chicken-and-egg problem, where you need to acquire the skill, and that’s where hobby projects or practice projects are great.

                          EDIT: Pro Tip — Finite state machines are your friend.

                          1. 1

                            Do you have experience implementing protocols that are not your own? If not, start with that. You will learn a lot more about protocol design and implementation that way than by reading a textbook or blog posts or whatever.

                            1. 1

                              I agree. I do have experience, but I want to know more about how other people learn and what they recommend since I might have missed something.

                        1. 6

                          Electricity usage is a huge concern even within the cryptocurrency community. There is a lot of work going towards more energy-efficient solutions. However, proof-of-work is still the de facto method. At Merit we still use PoW, but I chose a memory-bound algorithm called Cuckoo Cycle which is more energy efficient since it’s memory-bandwidth bound. I hope to move away from proof-of-work completely in the future, but it’s not easy to get the same properties. Since Merit is, in some ways, half PoW and half PoS (Proof-of-Stake) via our Proof-of-Growth (PoG) algorithm, we are already halfway there.
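
                          To illustrate the general idea with a toy (this is not Merit’s actual Cuckoo Cycle code, and every constant here is made up): a memory-bound PoW spends most of its time doing random reads from a large buffer, so memory bandwidth rather than raw hashing speed becomes the bottleneck.

                          ```cpp
                          #include <cstddef>
                          #include <cstdint>
                          #include <iostream>
                          #include <vector>

                          // Cheap non-cryptographic mixer, a stand-in for a real hash like siphash.
                          static std::uint64_t mix(std::uint64_t x) {
                              x ^= x >> 33; x *= 0xff51afd7ed558ccdULL;
                              x ^= x >> 33; x *= 0xc4ceb9fe1a85ec53ULL;
                              x ^= x >> 33; return x;
                          }

                          int main() {
                              const std::size_t kWords = std::size_t(1) << 24;       // ~128 MB scratch buffer
                              const std::uint64_t header = 0x6d65726974ULL;          // pretend block header
                              const std::uint64_t target = std::uint64_t(1) << 48;   // toy difficulty target

                              // Fill the buffer deterministically from the header.
                              std::vector<std::uint64_t> scratch(kWords);
                              for (std::size_t i = 0; i < kWords; ++i) scratch[i] = mix(header + i);

                              // Each nonce attempt does many dependent random reads, so throughput is
                              // limited by how fast memory can serve them, not by the mixing itself.
                              for (std::uint64_t nonce = 0;; ++nonce) {
                                  std::uint64_t acc = mix(header ^ nonce);
                                  for (int i = 0; i < 64; ++i) acc = mix(acc ^ scratch[acc % kWords]);
                                  if (acc < target) { std::cout << "found nonce " << nonce << "\n"; break; }
                              }
                          }
                          ```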

                          Proof-of-Work is fascinating because it’s philosophically the opposite of fiat money. Fiat money is one of the few things in the world where you can expend less effort and produce more of it. Cryptocurrencies with PoW are the opposite: you produce them only in proportion to the effort expended.

                          1. 2

                            How much more memory efficient is Merit (on the scale of the top 100 countries’ electricity consumption)?

                            The article points out that ASIC miners have found ways of solving algorithms that have previously been thought to be resistant to a bespoke hardware solution.

                            Consuming the same amount of electricity as a large-ish country is certainly fascinating.

                            1. 4

                              Warning! This will be a bummer reply; nothing I say here will be uplifting…

                              Notice, of course, that the difference between the #1 country and the #2 country is large. It likely follows Zipf’s law. The issue with ASICs is that they are hard to acquire, so insiders get access to them first and have a huge advantage. It’s anathema to the goal of having anyone download the software and mine.

                              In the scheme of things, the amount of electricity used to mine cryptocurrencies pales in comparison to the amount of electricity wasted on countless other things. We should just acknowledge that there is something fundamentally wrong with the global economic system that allows for gross externalities that aren’t accounted for. And that there is such a gross disparity of wealth where some countries have such excess capacity for electricity while others struggle with brownouts and blackouts every day.

                              Global warming itself is an incredibly complex problem. Using a slow scripting language for your software? How much hardware are you wasting running at scale? Buying a Tesla? Too bad your electricity is likely dirty, and the production caused 5 years worth of CO2 a normal car puts out. Switching to solar and wind? Too bad the air will be cleaner, causing more sunlight to hit the earth and heat it up faster; even if we stopped now, we have decades of warming built in, and a cleaner atmosphere accelerates that warming.

                              Global warming is such an insanely difficult, complex, and urgent problem that we are missing the forest for the trees.

                              Cryptocurrencies are not tackling the problem of global warming, but neither are most technologies we create every day. I would love to hear how many people on Lobsters are tackling global warming head on. I suspect almost zero. And isn’t that just the most depressing thing? It is for me; I think about this every day when I look at my children.

                              EDIT: Holy poop, I was right, totally Zipf’s law: https://en.wikipedia.org/wiki/List_of_countries_by_electricity_consumption .

                              1. 9

                                NB: this may be ranty ;)

                                In the scheme of things, the amount of electricity used to mine cryptocurrencies pales in comparison to the amount of electricity wasted on countless other things.

                                how about not doing things which currently have no value for society, except being an item for financial speculation, and which burn resources. that would be a start. i have yet to see a valid application of cryptocurrencies which really works. hard cash is still a good thing which works. it’s like voting machines: they may kinda work, but crosses made with a pen on paper are still the best solution.

                                the electricity wasted on other things is due to shitty standby mechanisms and laziness. these things can be fixed. the “currency” part of “cryptocurrency” is to waste resources, which can’t be fixed.

                                Global warming itself is an incredibly complex problem.

                                so-so.

                                Using a slow scripting language for your software? How much hardware are you wasting running at scale?

                                see the fixing part above. fortunately most technology tends to get more efficient the longer it exists.

                                Buying a Tesla? Too bad your electricity is likely dirty, and the production caused 5 years worth of CO2 a normal car puts out.

                                yeah, well, don’t buy cars from someone who shoots cars into orbit.

                                Switching to solar and wind? Too bad the air will be cleaner, causing more sunlight to hit the earth and heat it up faster; even if we stopped now, we have decades of warming built in, and a cleaner atmosphere accelerates that warming.

                                the dimming and warming are two separate effects, though both are caused by burning things. cooling is caused by particles, while warming is caused by gases (CO2, CH4, …). there are some special cases like soot in the (ant)arctic ice, speeding up the melting. (cf. https://en.wikipedia.org/wiki/Global_cooling#Physical_mechanisms , https://en.wikipedia.org/wiki/Global_warming#Initial_causes_of_temperature_changes_(external_forcings) )

                                Cryptocurrencies are not tackling the problem of global warming, but neither are most technologies we create every day. I would love to hear how many people on Lobsters are tackling global warming head on. I suspect almost zero. And isn’t that just the most depressing thing? It is for me; I think about this every day when I look at my children.

                                as global warming doesn’t have a single cause, there isn’t much to do head on. with everything there’s a spectrum here. some ideas which will help:

                                • don’t fly (less CO2).
                                • buy local food when possible, not fruit from around the globe in midwinter. don’t eat much meat (less CO2, CH4, N2O).
                                • use electricity from renewable sources (less CO2).

                                those things would really help if done on a larger scale, and aren’t too hard.

                                1. 2

                                  how about not doing things which currently have no value for society, except being an item for financial speculation, and which burn resources. that would be a start. i have yet to see a valid application of cryptocurrencies which really works.

                                  Buying illegal goods through the internet without the risk of getting caught by the financial transaction (Monero and probably Bitcoin with coin tumblers).

                                  1. 4

                                    mind that i’ve written society: one valid use case is drugs, which shouldn’t be illegal but should be sold by reliable, quality-controlled suppliers. i think other illegal things are illegal for a reason. additionally, i’d argue it’s risky to mail-order illegal things to your doorstep.

                                    1. 2

                                      cryptocurrencies solve a much harder problem than hard cash, which is they have lowered the cost of producing non-state money. Non-state money has existed for thousands of years, but this is the first time in history you can trade globally with it. While the US dollar may be accepted almost everywhere, this is not true for other forms of cash.

                                      1. 4

                                        but what is the real use case?

                                        • if globalized trade continues to exist, so will the classic ways of payment. cryptocurrencies are only useful in this case if you want to do illegal things. there may be a use case in oppressed countries, but the people tend to have other problems there than to buy things somewhere in the world.

                                        • if it ceases to exist, one doesn’t need a cryptocurrency to trade anywhere in the world, as there is no trade.

                                        i’m not a huge fan of the current state of the banking system, but it is a rather deep local optimum. it bugs me that i have to pay transaction fees, but that’s the case with cryptocurrencies, too. i just think that while theoretically elegant, cryptocurrencies do more harm than good.

                                        anecdote: years ago, i paid for a shell account by putting money in an envelope and sending it via mail ;)

                                        1. 2

                                          Cryptocurrencies are a transvestment from centralized tech to decentralized. It’s not what they do, but how they do it that’s different. It’s a technology that allows the private sector to invest in decentralized tech, where in the past they had no incentive to do so. Since the governments of the world have failed so miserably to invest in decentralized technology in the last 20 years, this is the first time that I can remember where the private sector can contribute to building decentralized technology. Note cryptocurrencies are behind investments of decentralized storage, processing, and other solutions, where before the blockchain, they would have been charity cases.

                                          The question you can ask is, why not just stick with centralized solutions? I think the argument is a moral one, about power to the people versus power to some unaccountable third party.

                                          1. 1

                                            It’s a technology that allows the private sector to invest in decentralized tech, where in the past they had no incentive to do so.

                                            i still don’t see exactly where the cryptocurrencies are required for investment in decentralized technology. we have many classic systems which are decentralized: internet (phone before that), electricity grid, water supply, roads, etc. why are cryptocurrencies required for “modern” decentralized systems? it just takes multiple parties who decide that it is a good solution to run a distributed service (like e-mail). how it is paid for is a different problem. one interesting aspect is that the functionality can be tightly coupled with payments in blockchainy systems. i’m not convinced if that is reason enough to use it. furthermore some things can’t be well done due to the CAP theorem. so centralization is the only solution in these cases.

                                            Note cryptocurrencies are behind investments of decentralized storage, processing, and other solutions, where before the blockchain, they would have been charity cases.

                                            I’d say that the internet needs more of the “i run it because i can, not because i can make money with it” spirit again.

                                            1. 1

                                              i still don’t see exactly where the cryptocurrencies are required for investment in decentralized technology.

                                              You are absolutely right! It isn’t a requirement. I love this subject by the way, so let me explain why you are right.

                                              we have many classic systems which are decentralized: internet (phone before that), electricity grid, water supply, roads, etc. why are cryptocurrencies required for “modern” decentralized systems

                                              You are absolutely right here. In the past, our decentralized systems were developed and paid for by the public sector. The private sector, until now, failed to create decentralized systems. The reason we need cryptocurrencies for modern decentralized systems is that we don’t have the political capital to create and fund them in the public sector anymore.

                                              If we had a functioning global democracy, we could probably create many systems in that “i run it because i can, not because i can make money with it” spirit.

                                              That spirit died during the great privatization of computing in the mid 80s, and the privatization of the internet in the mid 90s.

                                  2. 2

                                    I love rants :-) Let’s go!

                                    “currency” part of “cryptocurrency” is to waste resources, which can’t be fixed.

                                    Some people value non-state globally tradeable currencies. Google alone claims to have generated $238 billion in economic activity from their ads and search. https://economicimpact.google.com/ . The question is, how much CO2 did that economic activity create? Likely far greater than all cryptocurrencies combined. But that’s just my guess. It’s not an excuse, I’m just pointing out we are missing the forest for the trees. People follow the money, just as google engineers work for google because the money is there from ads, many people are working on cryptocurrencies because the money is there.

                                    see the fixing part above. fortunately most technology tends to get more efficient the longer it exists.

                                    While true, since our profession loves pop-culture, most technologies are replaced with more fashionable and inefficient ones the longer they exist. Remember when C people were claiming C++ was slow? I do.

                                    the dimming and warming are two separate effects, though both are caused by burning things.

                                    They are separate effects that have a complex relationship with our models of the earth warming. Unfortunately, even most well-meaning climate advocates don’t acknowledge dimming and that it’s not as simple as changing to renewable resources since renewables do not cause dimming, and god knows we need the dimming.

                                    those things would really help if done on a larger scale and aren’t too hard.

                                    Here is my honest opinion: we should have done this 30 years ago, when it wasn’t too late. I was a child 30 years ago. The previous generation handed me this predicament on a silver platter. I do my part: I don’t eat meat because of global warming, I rarely use cars, I use public transport as much as possible, and I work from home as much as possible. Etc., etc.

                                    But I do these things knowing it’s too late. Even if we stopped dumping CO2 in the atmosphere today, we have decades of warming built in that will likely irreparably change our habitat. Even the IPCC assumes we will geoengineer our way with some magical unicorn technology that hasn’t been created yet.

                                    I do my part not because I think they will help, but because I want to be able to look at my children and at least say I tried.

                                    I think one of my next software projects will be helping migrants travel safely, because one of the biggest tragedies and sources of human suffering resulting from climate change has been the refugee crisis, and it is only going to grow.

                                    1. 2

                                      Some people value non-state globally tradeable currencies. Google alone claims to have generated $238 billion in economic activity from their ads and search. https://economicimpact.google.com/ . The question is, how much CO2 did that economic activity create? Likely far greater than all cryptocurrencies combined. But that’s just my guess. It’s not an excuse, I’m just pointing out we are missing the forest for the trees. People follow the money, just as google engineers work for google because the money is there from ads, many people are working on cryptocurrencies because the money is there.

                                      i won’t refute that ads are a waste of resources, i just don’t see why more resources need to be wasted on things which have no use except for speculation. i hope we can do better.

                                      While true, since our profession loves pop-culture, most technologies are replaced with more fashionable and inefficient ones the longer they exist. Remember when C people were claiming C++ was slow? I do.

                                      Javascript has gotten more efficient by orders of magnitude. Hardware is still getting more efficient. There is always room for improvement. As you’ve written, people go where the money is (or can be saved).

                                      They are separate effects that have a complex relationship with our models of the earth warming. Unfortunately, even most well-meaning climate advocates don’t acknowledge dimming and that it’s not as simple as changing to renewable resources since renewables do not cause dimming, and god knows we need the dimming.

                                      But I do these things knowing it’s too late. Even if we stopped dumping CO2 in the atmosphere today, we have decades of warming built in that will likely irreparably change our habitat.

                                      Dimming has an effect. As a reason not to switch to renewable energy, it isn’t a good argument. Stopping pumping out more greenhouse gases would be a good start; they tend to be consumed by plants.

                                      […] we will geoengineer our way with some magical unicorn technology that hasn’t been created yet.

                                      let’s not do this, humans have a tendency to make things worse that way ;)

                                      1. 1

                                        i hope we can do better.

                                        I don’t think our economic system is setup for that.

                                        Javascript has gotten more efficient by orders of magnitude. Hardware is still getting more efficient. There is always room for improvement. As you’ve written, people go where the money is (or can be saved).

                                        I think because Moore’s law is now dead, things are starting to swing back towards efficiency. I hope this trend continues.

                                        Dimming has an effect. As a reason not to switch to renewable energy, it isn’t a good argument. Stopping pumping out more greenhouse gases would be a good start; they tend to be consumed by plants.

                                        I didn’t provide dimming as a reason not to switch to renewables; I provided it because JUST switching to renewables will doom us. As I’ve said, there are decades of warming baked in; there is a lag from the CO2 we have already put in. Yes, we need to stop putting more in, but it’s not enough to just stop. And in fact, stopping and not doing anything else will doom us faster.

                                        let’s not do this, humans have a tendency to make things worse that way ;)

                                        I totally agree. I don’t want countries to start launching nuclear weapons, for example. The only realistic thing that could possibly work is to do massive planting of trees, like I mean billions of trees need to be planted. And time is running out, because photosynthesis stops working at a certain temperature, so many places are already impossible to fix (Iraq, for example, which used to be covered in thick forests thousands of years ago).

                                        1. 1

                                          I don’t think our economic system is setup for that.

                                          aren’t we the system? changes can begin small, just many attempts fail early i suppose.

                                          And in fact, stopping and not doing anything else will doom us faster.

                                          do you have any sources for that?

                                          The only realistic thing that could possibly work is to do massive planting of trees, like I mean billions of trees need to be planted. And time is running out, because photosynthesis stops working at a certain temperature, so many places are already impossible to fix (Iraq, for example, which used to be covered in thick forests thousands of years ago).

                                          well, if the trend continues, greenland will have some ice-free space for trees ;) just stopping deforestation would be a good start though.

                                          1. 1

                                            aren’t we the system?

                                            We did not create the system, we were born into it. Most people see it as reality rather than as something that was designed.

                                            do you have any sources for that?

                                            https://www.sciencedaily.com/releases/2017/07/170731114534.htm

                                              well, if the trend continues, greenland will have some ice-free space for trees ;) just stopping deforestation would be a good start though.

                                            Sorry if I’m wrong, but do I sense a bit of skepticism about the dangers we face ahead?

                                  3. 5

                                    That was such a non-answer full of red herrings. He wanted to know what your cryptocurrency’s electrical consumption is. It’s positioned as an alternative to centralized methods like Bitcoin is. The centralized methods running on strongly-consistent DB’s currently do an insane volume of transactions on cheap machines that can be clustered globally if necessary. My approach is centralized setup with multiple parties involved checking each other. Kind of similar to how multinational finance already works but with more specific, open protocols to improve on it. That just adds a few more computers for each party… individual, company, or country… that is involved in the process. I saw a diesel generator at Costco for $999 that could cover the energy requirements of a multi-national setup of my system that outperforms all crypto-currency setups.

                                    So, what’s the energy usage of your system, can I participate without exploding my electric bill at home (or generator), and, if not, what’s the justification of using that cryptosystem instead of improving on the centralized-with-checking methods multinationals are using right now that work despite malicious parties?

                                    1. 3

                                      How much more memory efficient is Merit (on the scale of the top 100 countries’ electricity consumption)?

                                      Sorry, that’s his question. I can answer it easily: it’s not on that scale. My interpretation of that question was that he was making a joke, which is why I didn’t answer it. If derek-jones was serious about that question, I apologize.

                                      As I mentioned, the algorithm is memory bandwidth bound, I’m seeing half the energy cost on my rig, but I need to do more stringent measurements.

                                      1. 1

                                        More of a pointed remark than a joke. But your reply was full of red herrings, to quote nickpsecurity.

                                        If I am sufficiently well financed that I can consume 10 MW of power, then I will always consume 10 MW. If somebody produces more efficient hashing hardware/software, I will use it to generate more profit, not reduce electricity consumption. Any system that contains a PoW component pushes people to consume as much electricity as they can afford.

                                        1. 1

                                          If somebody produces more efficient hashing hardware/software, I will use it to generate more profit, not reduce electricity consumption.

                                          This is true for any resource and any technology in our global economic system.

                                          I wasn’t trying to reply with red herrings, but to expand the conversation. It’s really interesting that people attack cryptocurrencies for wasting electricity when there is a bigger elephant in the room nobody seems to want to talk about. Everyone knows who butters their bread. Keep in mind I’m not defending wasting electricity, but focusing on electricity is like, to use a computer analogy, focusing only on memory and creating garbage collection to deal with it while ignoring other resources like sockets, pipes, etc. That’s why I like C++: it solves the problem for ALL resources, not just one. We need a C++ for the real world ;-)

                                  4. 2

                                    I answered your question more directly, see response to nickpsecurity.

                                1. 3

                                  I’m doing a lot of writing for Merit. Last week we published an article about the performance of our new Proof-of-Growth algorithm and the numbers look really good.

                                    My primary goal is to write a high-level design for a feature which will allow building communities and custom tokens with Merit. I usually write these things in LaTeX, and will likely do the same this time. I usually start by creating an idea graph using FreeMind, though I never publish those.

                                  1. 10

                                    Just like the author, I thought unikernels would fit a specific space of security and performance.

                                      I would argue the reason they didn’t take off had nothing to do with the limitations of the tech. It took me a long time to learn that technology doesn’t get investment because it’s good, it gets good because of investment. The investment into unikernels just wasn’t there. I disagree with the author; I still think it’s superior to embedded Linux. But that doesn’t matter, VC money went to Docker instead of rump kernels. Money just isn’t that smart and ultimately investors are… simple creatures.

                                    1. 11

                                        Don’t forget backward compatibility with existing APIs and stuff. Lots of tech that takes off does that, where clean slate usually doesn’t.

                                      1. 2

                                          I don’t really understand the Docker/unikernel dichotomy. Can unikernels be used as containers like what Docker enables?

                                          It just seems like very different technology.

                                      1. 3

                                        Great! Applied for an invite. Excited to contribute to the conversation.

                                        1. 3

                                            I, like many fellow engineers, wrote my own, called muda. Why “Muda”? Because productivity is waste. It’s web-based and completely written in C++. Unfortunately, it doesn’t compile with the latest version of Wt.

                                          1. 2

                                            a screenshot in the readme would be helpful.

                                            1. 2

                                              Thanks, good idea. Will do that when I get it running with the new version of Wt.

                                          1. 3

                                            I finally launched a big protocol update for Merit last week. It has had a huge impact on user growth so far. This week will be a big week of article writing and designing the next big feature, which we code-named MRC20.

                                            1. 3

                                              Hopefully, I can provide something unique here. I am a co-founder of Merit, which is a decentralized digital currency. Releasing decentralized software is interesting and unlike any other project that I’ve worked on before. Yes, I actually write a ton of code and most of the protocol-level changes. I also do the release process personally (we are a small team). We had a huge protocol-level change recently and the release process went something like this.

                                              1. Get community buy-in on any proposed protocol changes. Big miners are critical here.
                                              2. Private tests on the regression test chain.
                                              3. People review the PR on GitHub and do private testing.
                                              4. Announce to the community that the software is ready to go to the test network.
                                              5. Deploy changes to the test network, which has dozens of machines privately run and many more publicly run.
                                              6. The software is released on the test network, but the new feature isn’t turned on until a future date.
                                              7. Monitor the test network and start testing it. This process can take a month or more.
                                              8. Announce to the community the new software is ready to be released on the main network and the feature turned on at date X.
                                              9. Merge to master.
                                              10. Release binaries to the community and insist that everyone update ASAP.
                                              11. Watch as the new feature turns on and monitor for issues. Issue patch releases for any problems.

                                              That said, all of the above can only be done with a relatively small community (compared to, say, Bitcoin). The important members who control a lot of the mining power on the network must agree on the changes.

                                              Other approaches we will take in the future would use a signaling mechanism: when it looks like a sufficient majority is signaling support for a feature, it gets turned on.
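
                                              A rough sketch of what that kind of check could look like (hypothetical names and thresholds, similar in spirit to Bitcoin’s version-bits signaling, not Merit’s actual code): count how many of the most recent blocks set a signal bit and activate once a threshold is reached.

                                              ```cpp
                                              #include <cstddef>
                                              #include <cstdint>
                                              #include <vector>

                                              // A block header reduced to the one field that matters for this sketch.
                                              struct BlockHeader {
                                                  std::uint32_t version = 0;  // miners set a bit here to signal support
                                              };

                                              // The feature activates once enough of the last `window` blocks signal the bit.
                                              bool feature_activated(const std::vector<BlockHeader>& chain,
                                                                     std::uint32_t signal_bit,
                                                                     std::size_t window = 2016,
                                                                     std::size_t threshold = 1916) {
                                                  if (chain.size() < window) return false;
                                                  std::size_t signaling = 0;
                                                  for (std::size_t i = chain.size() - window; i < chain.size(); ++i)
                                                      if (chain[i].version & (std::uint32_t(1) << signal_bit)) ++signaling;
                                                  return signaling >= threshold;
                                              }
                                              ```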

                                              This is the most difficult kind of deployment I have ever done. Decentralization makes everything harder here.

                                              1. 16

                                                Fuck, it’s Friday?

                                                1. 11

                                                  Nope, I’m midway through Saturday already! (I live in New Zealand.)

                                                1. 10

                                                  The simplest way to start is to understand your relationship with your boss. You work several weeks before getting paid; you are the creditor, they are the debtor. Saying no should be natural if you understand your boss owes YOU, not the other way around.

                                                  1. 9

                                                    I doubt that will work in most situations for most employees. It was a refreshingly-different take on the subject, though. :)

                                                    1. 5

                                                      It definitely is a disturbing way to think about it. Because typically the creditor is either in a position of power or a peer, but in this case the debtor is in a position of power. No wonder wage theft outstrips all other theft combined. You are more likely to get money stolen from you by your employer than by some thief.

                                                      Though if you look at the history of wage labor, you will find that some of the first wage workers were really slaves who rented themselves out. Sometimes they would give a portion of their wage to their owner, sometimes they wouldn’t :-)

                                                      1. 2

                                                        I don’t know if the model fits, though. We’ve traditionally thought of these things as agreements. Then we built contract law to formalize them. I think that fits better. So, you agreed to do specific things for specific benefits for a specific amount of time. That’s on top of workers having no rights (at-will employment) in many states. The models could be combined, possibly.

                                                        1. 3

                                                          The model of creditor and debtor fits. There is a reason wages are called liabilities on the books.

                                                          1. 2

                                                            Good point.

                                                  1. 2

                                                    This is interesting, but another 20 years of software development continues to prove him wrong.

                                                    The current dominant paradigm is flat, single-ordered lists, and search (perhaps augmented with tags like our dear lobste.rs here).

                                                    This is even more of all the bad stuff he’s railing against at the start of the article, but this is the stuff that works and there are innumerable other approaches dead or dying.

                                                    I suspect that for UIs less freedom is simpler (one button, one list, one query, one purpose, etc.) and not the other way around.


                                                    For developers, I think he was right, and it’s also what we’ve got today. It’s clearly preferable for developers to have a simple model to work against (like URIs + JSON).

                                                    apt-get install firefox (which unpacks to a resource identifier and a standardized, machine-readable package file) is quite probably as good as it gets. It’s a directed graph instead of an undirected graph like his zipper system, but undirected graphs require an unrealistic (and in my opinion probably harmful) amount of federation between producers of APIs and their consumers.

                                                    1. 7

                                                      When the pitch is “good computing is possible”, “bad computing has dominated” isn’t actually a great counterargument – particularly when the history of so much of it comes down to dumb luck, path dependence, tradeoffs between technical ability & marketing skills, and increasingly fast turnover and the dominance of increasingly inexperienced devs in the industry.

                                                      If you’re trying to suggest that the way things shook out is actually ideal for users – I don’t know how to even start arguing against that. If you’re suggesting that it’s inevitable, then I can’t share that kind of cynicism because it would kill me.

                                                      A better world is possible but nobody ever said it would be easy.

                                                      1. 4

                                                        Your comment is such a good expression of how I feel about the status quo! I was just having a similar discussion in another thread about source code, where I said “text is hugely limiting for working with source code”, and somebody objected with “but look at this grep-like tool, it’s totally enough for me”. I can understand when people raise practical objections to better tools (hard to get traction, hard to interface with existing systems etc.). What’s dispiriting is the refusal to even admit that better tools are possible.

                                                        1. 2

                                                          The mistake is believing that we’re anywhere close to a status quo in software development. The tools and techniques used today are completely different from the tools we used 5 and 10 years ago, and are almost unrecognizable next to the tools and techniques used 40 and 50 years ago.

                                                          Some stuff sticks around (keyboards are fast!), but other things change and there is loads of innovative stuff going on all the time. With reference to visual programming: I recently spent a weekend playing with the Unreal 4 SDK’s block programming language (they call it Blueprints); it has fairly seamless C++ integration and I was surprised by how nice it was for certain operations… You might also be interested in Scratch.

                                                          Often, these systems are out there, already existing. Sometimes they’re not in the mainstream because of institutional momentum, but more often they’re not in the mainstream because they’re not good (the implementations or the ideas themselves).

                                                          The proof of the pudding is in the eating.

                                                          1. 4

                                                            I don’t think I can agree with this. I’m pretty sure the “write code-compile-run” approach to writing code that is still in incredibly widespread use is over 40 years old. Smalltalk was developed in the 70s. Emacs was developed in the 70s. Turbo Pascal, which had an integrated compiler and editor, was released in mid-80s (more than 30 years ago). CVS was developed in mid-80s (more than 30 years ago). Borland Delphi and Microsoft Visual Studio, which were pretty much full-fledged IDEs, were released in the 90s (20 years ago). I could go on.

                                                            What do we have now that’s qualitatively different from 20 years ago?

                                                            1. 3

                                                              Yup. Some very shallow things have changed but the big ideas in computing really all date to the 70s (and even the ‘radical’ ideas from the 70s still seem radical). I blame the churn: half of the industry has less than 10 years of experience, and degree programs don’t emphasize an in-depth understanding of the variety of ideas (focusing instead on the ‘royal road’ between Turing’s UTM paper and Java, while avoiding important but complicated side-quests into domains like computability).

                                                              Somebody graduating with a CS degree today can be forgiven for thinking that the web is hypertext, because they didn’t really receive an education about it. Likewise, they can be forgiven for thinking (for example) that inheritance is a great way to do code reuse in large java codebases – because they were taught this, despite the fact that everybody knows it isn’t true. And, because more than half their coworkers got fundamentally the same curriculum, they can stay blissfully unaware of all the possible (and actually existing) alternatives – and think that what they work with is anywhere from “all there is” to “the best possible system”.

                                                              1. 1

                                                                I got your book of essays - interested in your thinking on these topics.

                                                                1. 1

                                                                  Thanks!

                                                                  There are more details in that, but I’m not sure whether or not they’ll be any more accessible than my explanation here.

                                                              2. 2
                                                                • Most languages aren’t AOT compiled; there’s usually a JIT in place (if even that; Ruby and Python are run-time languages through and through). These languages did not exist 20 years ago, though their ancestors did (and died, and had some of the good bits resurrected; I use Clojure regularly, which is both modern and a throwback).

                                                                • Automated testing is very much the norm today; it was a fringe idea 10 years ago and something that you were only crazy enough to do if you were building rockets or missiles or something.

                                                                • Packages and entire machines are regularly downloaded from the internet and executed in production. I had someone tell me that a docker image was the best way to distribute and run a desktop Linux application.

                                                                • Smartphones, and the old-as-new challenges of working around vendors locking them down.

                                                                • The year of the Linux desktop surely came sometime in the last or next 20 years.

                                                                • Near dominance of Linux in the cloud.

                                                                • Cloud computing and the tooling around it.

                                                                • The browser wars ended, though they started to heat up before the 20 year cutoff.

                                                                • The last days of Moore’s law and the 10 years it took most of the industry to realize the party was over.

                                                                • CUDA, related, the almost unbelievable advances in computer graphics. (Which we aren’t seeing in web/UI design, again, probably not for lack of trying, but maybe the right design hasn’t been struck)

                                                                • Success with neural networks on some problem sets and their fledgling integration into other parts of the stack. Wondering when or if I’ll see an NN-based linter I can drop into Emacs.


                                                                I could go on too. QWERTY keyboards have been around for 150 years because they’re good enough and the alternatives aren’t better than having one standard. I don’t think that the fact that my computer has a QWERTY keyboard on it is an aberration or failure, and not for lack of experimentation on my own part and on the parts of others. Now if only we could do something about that caps lock key… Oh wait, I remapped it.


                                                                It’s easy to pick out the greatest hits in computer science from 20, 30, and 40 years ago. There’s a ton of survivorship bias, and you don’t point to all of those COBOL-alikes and stack-based languages which have all but vanished from the industry. If it seems like there’s no progress today, it’s only because it’s more difficult to pick the winners without the benefit of hindsight. There might be some innovation still buried that makes two-way linking better than one-way linking, but I don’t know what it is and my opinion is that it doesn’t exist.

                                                                1. 3

                                                                  Fair enough. Let me clarify my comment, which was narrowly focused on developer tools for no good reason.

                                                                  There is no question that there have been massive advances in hardware, but I think the software is a lot more hit and miss.

                                                                  In terms of advances on the software front, I would point to distributed storage in addition to cloud computing and machine learning. For end users, navigation and maps are finally really good too. There are probably hundreds of other specific examples like incredible technology for animated films.

                                                                  I think my complaints are to do with the fact that most of the effort in the last 20 years seems to have been directed to reimplementing mainframes on top of the web. In many ways, there is churn without innovation. I do not see much change in software development either, as I mentioned in the previous comment (I don’t think automated testing counts), and it’s what I spend most of my time on so there’s an availability bias to my complaints. There is also very little progress in tools for information management and, for lack of a better word, “end user computing” (again, spreadsheets are very old news).

                                                                  I think my perception is additionally coloured by the fact that we ended up with both smartphones and the web as channels for addictive consumption and advertising industry surveillance. It often feels like one step forward and ten back.

                                                                  I hope this comment provides a more balanced perspective.

                                                          2. 2

                                                            In the last 20 years, the ideas in that paper have been attempted a lot, by a lot of people.

                                                            Open source and the internet have given a ton of ideas a fair shake, including these ideas. Stuff is getting better (not worse). The two way links thing is crummy, and you don’t have to take my word for it, you can go engage with any of the dozens of systems implementing it (including several by the author of that paper) and form your own opinions.

                                                            1. 4

                                                              In the last 20 years, the ideas in that paper have been attempted a lot, by a lot of people.

                                                              Dozens of people, and I’ve met or worked with approximately half of them. Post-web, the hypertext community is tiny. I can describe at length the problems preventing these implementations from becoming commercially successful, but none of them are that the underlying ideas are difficult or impractical.

                                                              The two way links thing is crummy, and you don’t have to take my word for it, you can go engage with any of the dozens of systems implementing it (including several by the author of that paper) and form your own opinions.

                                                              I wrote some of those systems, while working under the author of that paper. That’s how I formed my opinions.

                                                              1. 1

                                                                That’s awesome. Maybe you can change my mind!

                                                                Directed graphs are more general than undirected graphs (you can implement two-way undirected graphs out of one-way arrows; you can’t go the other way around). Almost every level of the stack, from the tippy top of the application layer to the deepest depths of CPU caching and branch prediction, is implemented in terms of one-way arrows and abstractions; I find it difficult to believe that this is a mistake.


                                                                EDIT: I realized that ‘general’ in this case has a different meaning for a software developer than it does in mathematics, and here I was using the software developer’s perspective of “can be readily implemented using”. Mathematically, something is more general when it can be described with fewer terms or axioms. Undirected graphs are more general in the mathematical sense because you have to add arrowheads to an undirected graph to make a directed graph, but for the software developer it feels more obvious that you could get a “bidirected” graph by adding a backwards arrow to each forwards arrow. The implementation of a directed graph from an undirected graph is difficult for a software developer because you have to figure out which way each arrow is supposed to go.

                                                                1. 1

                                                                  Bidirectional links are not undirected edges. The difference is not that direction is unknown – it’s that the edge is visible whichever side of the node you’re on.

                                                                  (This is only hard on the web because HTML decided against linkbases in favor of embedded representations that must be mined by a third party in order to reverse them – which makes jump links a little bit easier to initially implement but screws over other forms of linking. The issue, essentially, is that with a naive host-centric way of performing jump links, no portion of the graph is actually known without mining.

                                                                  Linkbases are literally the connection graph, and links are constructed from linkbases. In the XanaSpace/XanaduSpace model, you’ve got a bunch of arbitrary linkbases representing arbitrary subgraphs that are ‘resident’ – created by whoever and distributed however – and when a node intersects with one of the resident links, the connection is displayed and made navigable.

                                                                  Also in this model a link might actually be a node in itself where it has multiple points on either side, or it might have zero end points on one side, but that’s a generalization & not necessarily interesting since it’s equivalent to all combinations of either end’s endsets.)

                                                                  TL;DR: bidirectional links are not undirected links – merely links understood above the level of the contents of a single node.

                                                                  1. 1

                                                                    Ok then, how is it that you construct a graph out of a set of subgraphs? Is that construction also done with two way links, thereby ensuring that every participant constructs the same graph?

                                                                    1. 1

                                                                      Participants are not guaranteed to construct the same graph, and the graphs aren’t guaranteed to even be fully connected. (The only difference between bidirectional links & jump links is that you can see both points.)

                                                                      Instead, you get whatever collection of connected subgraphs are navigable from the linkbases you have resident (which are just lists of directed edges).

                                                                      This particular kind of graph-theory analysis isn’t terribly meaningful for either the web or translit, since it’s the technical detail of how much work you have to do to get a link graph that differs, not the kind of graph itself. (Graph theory is useful for talking about ZigZag, but ZigZag is basically unrelated to translit / hypertext and is more like an everted tabular database.)
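
                                                                      To make that concrete, here is a rough sketch of resident linkbases in Python. The record layout and names are mine, purely for illustration; this is not the actual XanaSpace representation.

                                                                      ```python
                                                                      # Rough sketch of "resident linkbases". Field names are illustrative,
                                                                      # not the actual XanaSpace/XanaduSpace format.
                                                                      from typing import List, NamedTuple

                                                                      class Edge(NamedTuple):
                                                                          src_doc: str   # address of the document the link points from
                                                                          dst_doc: str   # address of the document the link points to

                                                                      # A linkbase is just a list of directed edges, created by whoever
                                                                      # and distributed however.
                                                                      linkbase_a = [Edge("doc:1", "doc:2"), Edge("doc:2", "doc:3")]
                                                                      linkbase_b = [Edge("doc:7", "doc:2")]

                                                                      resident: List[Edge] = linkbase_a + linkbase_b   # whatever you have loaded

                                                                      def connections(doc: str) -> List[Edge]:
                                                                          # A bidirectional link is still a directed edge; the difference is
                                                                          # that it is visible (and navigable) from whichever side you're on.
                                                                          return [e for e in resident if doc in (e.src_doc, e.dst_doc)]

                                                                      print(connections("doc:2"))   # edges from both linkbases touch doc:2
                                                                      ```

                                                                      Two participants with different linkbases resident would see different (and possibly disconnected) subgraphs, which is exactly the point above.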

                                                                      1. 1

                                                                        I guess I’m trying to understand how this is better or different from what already exists. If it’s a curated list of one way links that you can search and discuss freely with others, then guess what, lobste.rs is your dream, the future is now, time to throw one back and celebrate.

                                                                        1. 1

                                                                          I’m trying to understand how this is better or different from what already exists

                                                                          Well, when the project started, none of what we have existed. This was the first attempt.

                                                                          If it’s a curated list of one way links that you can search and discuss freely with others, then guess what, lobste.rs is your dream, the future is now,

                                                                          ‘Link’ doesn’t actually mean ‘URL’ in this sense. A link is an edge between two nodes – each of these nodes being a collection of positions within a document. So, a linkbase isn’t anything like a collection of URLs, but it’s a lot like a collection of pairs of URLs with an array of byte offsets & lengths affixed to each URL. (In fact, this is exactly what it is in the XanaSpace ODL model.) A URL by itself is only capable of creating a jump link, not a bidirectional link.

                                                                          It’s not a matter of commenting on a URL, but of creating sharable lists of connections between sections of already-existing content. That’s the point of linking: that you can indicate a connection between two existing things without coordinating with any authors or owners.

                                                                          URL-sharing sites like lobste.rs provide one quarter of that function: by coordinating with one site, you can share a URL to another site, but you don’t have control over either side beyond the level of an entire document (or, if you’re very lucky and the author put useful anchors, you can point to the beginning of a section on only the target side of the link).

                                                                          1. 1

                                                                            To take an example of a system which steps in the middle and does take greater control over both ends, consider Google’s AMP. I feel like it is one of the worst things anyone has ever tried to do to the internet in its entire existence.

                                                                            Control-oriented systems like AMP, and to a lesser degree sharing sites like Imgur, Pinterest, Facebook, and soon (probably) Medium, represent existential threats to forums like lobste.rs.

                                                                            So, in short, you’re really not selling me on why this two way links thing is better.

                                                                            1. 2

                                                                              We actually don’t have centralization like that in the system. (We sort of did in XU88 and XU92 but that stopped in the mid-80s.)

                                                                              It’s not about controlling the ends. The edges are not part of the ends, and therefore the edges can be distributed and handled without permission from the ends.

                                                                              Links are not part of a document. Links are an association between sections of documents. Therefore, it doesn’t make any sense to embed them in a document (and then require a big organization like Google to extract them and sell them back to you). Instead, people create connections between existing things & share them.

                                                                              I’m having a hard time understanding what your understanding of bidirectional linking is, so let me get down to brass tacks & implementation details:

                                                                              A link is a pair of spanpointers. A spanpointer is a document address, a byte offset from the beginning of the document, and a span length. Anyone can make one of these between any two things so long as you have the addresses. This doesn’t require control of either endpoint. It doesn’t require any third party to control anything either. I can write a link on a piece of paper and give it to you, and you can make the same link on your own computer, without any bits being transferred between our machines.

                                                                              We do not host the links. We do not host the endpoints. We don’t host anything. We let you see connections between documents.

                                                                              Seeing connections between documents manifests in two ways:

                                                                              1. transpointing windows – we draw a line between the sections that are linked together, and maybe color them the same color as the line
                                                                              2. bidirectional navigation – since you can see the link from either side, you can go left instead of going right

                                                                              It’s not about control, or centralization. Documents aren’t aware of their links.

                                                                              The only requirement for bidirectional linking is that an address points to the same document forever. (This is a solved problem: ignore hosts & use content addressing, like IPFS.)
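
                                                                              Taken literally, that description fits in a few lines of Python. The class and field names below are mine, and the document addresses are placeholders; it’s a sketch of the shape of the data, not project code.

                                                                              ```python
                                                                              # Sketch of the structures described above. Names are mine; addresses
                                                                              # are placeholders standing in for stable content addresses.
                                                                              from typing import NamedTuple

                                                                              class SpanPointer(NamedTuple):
                                                                                  doc: str       # permanent document address
                                                                                  offset: int    # byte offset from the start of the document
                                                                                  length: int    # span length in bytes

                                                                              class Link(NamedTuple):
                                                                                  left: SpanPointer
                                                                                  right: SpanPointer

                                                                              # Anyone can make one of these between any two documents, given their
                                                                              # addresses. Neither endpoint has to know, and nobody has to host it.
                                                                              link = Link(
                                                                                  SpanPointer("content-address-of-paper-A", offset=1024, length=80),
                                                                                  SpanPointer("content-address-of-paper-B", offset=512, length=40),
                                                                              )

                                                                              def other_side(l: Link, here: str) -> SpanPointer:
                                                                                  # Bidirectional navigation: from whichever document you're reading,
                                                                                  # the link shows you the span on the other side.
                                                                                  return l.right if l.left.doc == here else l.left

                                                                              print(other_side(link, "content-address-of-paper-A"))
                                                                              ```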

                                                                              1. 1

                                                                                Wow, thank you for taking the time to walk me through these ideas. I think I’m starting to understand a little better.

                                                                                I still think we’ve got this, or could implement it on the existing web stack. I think any user could have implemented zig-zag links in a hierarchical Windows-style file structure since ’98, if not ’95. I think it’s informative that most users do not construct those links; who knows how many of us have tried it in the name of getting organized.

                                                                                I really believe that any interface more complex than a single item is too complex, and if you absolutely must, you can usually present a list without distracting from a UI too badly. I think a minimalist and relatively focused UI is what allows this website to thrive and us to have this discussion.

                                                                                I’m going to be thinking this over a lot more. A system like git stores the differences between documents instead of the documents themselves, so clearly there is room for other ways of relating documents to each other than what we’ve got, and they work!

                                                                                1. 3

                                                                                  I should clarify: I’ve been describing bidirectional links in translit (aka hypertext or transliterature). ZigZag is actually a totally different (incompatible) system. The only similarity is that they’re both interactive methods of looking at associations between data invented by Ted Nelson.

                                                                                  If we want to compare to existing stacks, transliterature is a kind of whole-document authoring and annotation thing like Word, while ZigZag is a personal database like Access – though in both cases the assumptions have been turned inside-out.

                                                                                  You’re right that these things, once they’re understood, aren’t very difficult to implement. (I implemented open source versions of core data structures after leaving the project, specifically as demonstrations of this.)

                                                                                  I really believe that any interface more complex than a single item is too complex, and if you absolutely must, you can usually present a list without distracting from a UI too badly. I think a minimalist and relatively focused UI is what allows this website to thrive and us to have this discussion.

                                                                                  Depending on how you chunk, a site like this has a whole host of items. I see a lot of characters, for instance. I see multiple buttons, and multiple jump links. We’ve sort of gotten used to a particular way of working with the web, so its inherent complexity is forgotten.

                                                                                  thank you for taking the time to walk me through these ideas. I think I’m starting to understand a little better.

                                                                                  No problem! I feel like it’s my duty to explain Xanadu ideas because they’re explained so poorly elsewhere. I spent years trying to fully understand them from public documentation before I joined the project and got direct feedback, and I want to make it easier for other people to learn it than it was for me.

                                                            2. 1

                                                              I wouldn’t say so. What you have is more and more people using the same tools, so you will never get a “perfect” solution. Generally, nature doesn’t provide a perfect system, just one that’s “good enough to survive”. My partner and I are expecting a child at the moment, and more than once the doctor has told us: “This is not perfect, but nature doesn’t care about that. It just cares about good enough to get the job done”.

                                                              Ever since I heard that, I see it everywhere, computers included. Code, and the way we work, runs a huge chunk of important systems, and somehow it works. Maybe it works because it is not perfect.

                                                              I agree that things will change (“for the better”), but it will come in phases. We will have some bigger catastrophe, and afterwards systems and tools will change and adapt. As long as everything sort of works, there is no big reason (for the majority of people) to change it, since they can get the job done and then enjoy the sun, beaches and human interactions.

                                                              1. 1

                                                                Nobody’s complaining that we don’t have perfection here. We’re complaining about the remarkable absence of not-awful in projects by people who should know better.

                                                            3. 3

                                                              I think the best way to describe what we have is “Design by Pop Culture”. Our socio-economic system is a low-pass filter, distilling ideas until you can package them and sell them. Is it the best we’ve got, given those economic constraints? Maybe…

                                                              But that’s like saying “Look, this is the best way to produce cotton, it’s the best we’ve got” during the slave era… (slavery being a different socio-economic system).

                                                            1. 15

                                                              I’m hosting my first TLA+ workshop this week. Beyond excited and beyond nervous.

                                                              1. 2

                                                                Good luck!

                                                                1. 2

                                                                  Kick ass!

                                                                  1. 1

                                                                    Wish I was in Chicago! Do you ever come out to Seattle?

                                                                    1. 2

                                                                      I’d be totally down to host a workshop in Seattle!

                                                                  1. 2

                                                                    Finishing up and launching PoG2 on the Merit blockchain. My body is ready.

                                                                    1. 2

                                                                      I mostly write about the intersection of tech and society. Must write moooore!

                                                                      https://mempko.wordpress.com