This looks really cool. Learning about Partisan and Lasp led me to read about membership protocols like HyParView and broadcast algorithms like Bimodal Multicast, Plumtree and Thicket. I’m also fascinated by fault injection techniques like LDFI, and now Filibuster.
I’m excited to read further :)
Thanks for the kind words.
“maybe IBS maybe who knows” gang reporting in. I feel you, I’m in the same shit (pun intended). The article is nice but, like any good software engineer, I feel you’re overcomplicating the matter.
Recipes, in the way you treat them, are a modern invention, even when they are traditionalized. Recipes make up for a lack of food knowledge in the industrialized world. They are the “no-code” revolution of food. They work until they don’t work anymore, like in your case.
My suggestion is to throw them out of the window and learn real cooking. Cooking is a system, with rules, principles and synergies. It’s not just about flavor; it’s about logistics, availability and health. Recipes are prepackaged solutions, but they don’t hold any ultimate truth: make your own recipes, learn to design and compose meals according to foundational rules. Go back to “barebone” cooking to achieve the flexibility recipes cannot give you. Substitutes will appear, because many of them are contextual, and right now you’re probably limiting yourself to absolute substitutes (like asafoetida for garlic or soy cream for regular cream). I hope this will help you explore food from a different perspective and find your okay spot.
So, I completely agree.
Regarding cooking as a system, what do you recommend as reading material? I’ve had trouble finding books that start from nothing and build up from first principles. What’s a good starting point?
(Is there like a SICP for cooking, or similar?)
Depending on your cooking skills you might want to look at something like molecular gastronomy and the chemistry behind cooking or at least outside of regular recipe books. I’m a decent chef myself and usually have no trouble coming up with my own recipes based on what’s on sale, in season, in the fridge or found hiding in the back of the cupboard, and all that comes down to hard earned experience, especially flavor pairing.
Understanding the processes taking place in the kitchen (or at least having an idea of what is going on) is something I find can help me make my cooking better, more interesting or simpler. Things like understanding how an emulsion works (good when making a dressing or mayonnaise), using acid and base (e.g. vinegar/lemon juice and baking soda/powder) to give vegetables more or less bite (adding a dash of vinegar when boiling potatoes keeps them from disintegrating/becoming soggy), or the relationships between temperature, surface area, oil and salt, e.g. for all of those nice Maillard reactions.
I’m blessed with having no food allergies, but I imagine that something like consistency and mouth feel can be hard to handle when having a much more restricted choice of ingredients. Martin Lersch has a blog at https://khymos.org/ and a free book, Texture, which has a collection of recipes using different hydrocolloids, i.e. substances that gel in contact with water and can be used to thicken, gel, foam, emulsify etc.: https://khymos.org/recipe-collection/
I see that he’s recently restarted his blog, definitely worth a read with lots of interesting observations and recipes. Check out Maximizing Food Flavor by Speeding Up the Maillard Reaction or Ten tips for practical molecular gastronomy.
I can’t really recommend any paper books, all of my reading has been online (with the sole exception being Cooking for Geeks by Jeff Potter, bought at FOSDEM with their usual O’Reilly discount), but for something gawk-worthy (and expensive) have a look at Modernist Cuisine by Nathan Myhrvold.
Hope this helps!
Sorry to hear about your struggles. I went through something very similar a couple years ago. I summarized my story in a TEDx talk about a year after I figured out what my GI issues were: https://www.youtube.com/watch?v=iR3_yIx2X0s
As some encouragement, it (slowly) gets easier and easier. There are foods I still miss, but after going from feeling terrible to feeling wonderful it becomes easier to say no to that soda or slice of pizza.
I’ll check it out, thanks!!
Crikey, I have every sympathy for you. This seems like a great idea, would you consider opening up the source of your little program for other sufferers? I understand if you feel potentially uncomfortable sharing the details it contains.
The plan is to open source everything once I get it working to a level where someone else could actually use it, for sure!
I can’t promise it’ll work, but I’ve gotten good results from the newspaper Python module when it comes to filtering down to a page’s real content. Good luck. This is a neat solution, wish I had thought of it when working out how to handle diet changes that come from being prone to kidney stones.
I’m going to try this module, thanks for the heads up!
That sounds really tough, I’m sorry. I’m glad that you’ve been able to make progress over lockdown and best of luck finding a new diet that works for you.
Re: noise from other page elements
Perhaps selecting/copying the ingredients list and having the script work on your clipboard contents would be a good UX for you? Maybe select, copy and hit a keybind to have your window manager start the script and put the answer in a notification bubble or window?
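As a hypothetical sketch of that workflow (the blocklist contents, function names, and the xclip/notify-send wiring are all my assumptions, not the actual tool from the article):

```python
# Hypothetical sketch: bind a window-manager key to run this script, which
# reads the current selection, scans it against a personal list of problem
# ingredients, and pops the verdict in a desktop notification.
# The blocklist below is purely illustrative.
import subprocess

PROBLEM_INGREDIENTS = {"garlic", "onion", "wheat flour", "cream"}  # example list

def flag_ingredients(text: str) -> list[str]:
    """Return the problem ingredients mentioned in the copied text."""
    lowered = text.lower()
    return sorted(i for i in PROBLEM_INGREDIENTS if i in lowered)

def notify(flags: list[str]) -> None:
    """Show the result in a notification bubble (assumes notify-send exists)."""
    msg = "Contains: " + ", ".join(flags) if flags else "Looks safe!"
    subprocess.run(["notify-send", "Recipe check", msg])

def main() -> None:
    # xclip -o prints the current X selection; Wayland users might use wl-paste.
    clipboard = subprocess.run(
        ["xclip", "-o"], capture_output=True, text=True
    ).stdout
    notify(flag_ingredients(clipboard))
```

The nice part of this shape is that the matching logic stays a plain function, so the clipboard and notification plumbing can be swapped per desktop environment.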
This is a good idea if I’m looking to spot-check an individual recipe, but I do want an automated solution because ideally I’d like to spider recipe blogs and automatically download the ones that I know will work for me.
Then perhaps looking for an element or run of text with a high proportion of ingredient names would work?
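A rough, stdlib-only sketch of that density heuristic might look like this (the unit list and scoring rule are illustrative guesses, not a tested extractor):

```python
# Sketch of the "ingredient density" idea: split the page into text blocks
# and score each by the share of lines that look like ingredient entries
# (leading quantity or a known unit word). Word lists are illustrative.
import re

UNITS = {"cup", "cups", "tbsp", "tsp", "g", "kg", "ml", "oz", "clove", "cloves"}

def looks_like_ingredient(line: str) -> bool:
    """Heuristic: does this line start with a quantity or mention a unit?"""
    tokens = line.lower().split()
    has_qty = bool(re.match(r"^[\d/.]+", line.strip()))
    has_unit = any(t.strip(".,") in UNITS for t in tokens)
    return has_qty or has_unit

def best_block(blocks: list[str]) -> str:
    """Pick the text block whose lines most resemble an ingredient list."""
    def score(block: str) -> float:
        lines = [l for l in block.splitlines() if l.strip()]
        if not lines:
            return 0.0
        return sum(looks_like_ingredient(l) for l in lines) / len(lines)
    return max(blocks, key=score)
```

Feeding it the text of each candidate HTML element should push navigation, comments, and story preamble toward a score of zero while the actual ingredient list scores near one.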
A bit off-topic, but are there videos of this course anywhere online? It looks extremely interesting.
Since the recordings of the course contain student information (e.g., faces, names) we cannot release them per federal guidelines (FERPA).
I was a TA last spring semester, when the health policy discourse in Belgium changed from “covid is a flu” to “full lockdown and borders closed” in a matter of two weeks or so. I was lucky to work with a professor who had a student-first mentality and really pushed to adapt and reduce the stress the situation induced on the students. We had ~60 students enrolled in the course, an introduction to biostatistics for third-year undergrads using the JMP software. Some quick feedback:
Students normally had to pay for a JMP license at a reduced price if they wanted it on their personal computer, because the university provided a room with computers that had the software. Due to the lockdown, we were able to push for free licenses for the students, to be sure everyone could participate in the course. That was a relief. I made a quick install guide and acted as a hotline for any technical problem related to the course.
I gave 4 remote lessons with an 80% attendance each time. Students came with mic and camera off, and it is almost impossible to have oral interaction with a group this size. I used a “Twitch” strategy: asking the students to interact with me in the Teams chat, and answering the questions live or in a delayed fashion. Other students with similar questions began to “react” in the chat, which gave weight to a question and let me spot when the majority did not understand a specific concept. It worked way better than waiting for someone to find the courage to switch on their mic and ask their question.
I don’t know who my students were. With mic/cam off and only written contact by mail or Teams chat, I caught a few names but was not able to associate them with any face.
The professor’s mentality was really important for the students. She managed to provide video-recorded lessons and adapted to the situation. For example, she totally changed the type of exam so she would not have to use one of those ugly spying web platforms ahem. A student-first approach is key to providing a reassuring environment in troubled times. It was not business as usual, and the academics who did not or would not understand that really hurt their relationship with students and the students’ performance.
On a technical level, because we were not allowed to go to our offices, I had to work from my personal computer, which is on Fedora/Gnome Wayland. Giving lessons was a funny set-up:
It was shaky but it worked 99% of the time. It was a fun experience for me.
I started grad school in Belgium and lived there for a few years (Louvain-la-Neuve, UCL) before moving to CMU. I had an Erasmus Mundus fellowship.
It’s cool to see you had an 80% attendance rate for lessons. We had significantly lower attendance in recitations as the semester got closer to the end. I assume this was a combination of the recitations being async and becoming less and less related to the final project, serving more just to reinforce the lectures. Obviously, this posed the problems we talked about in the post.
Funny, it was at UCLouvain where I was a researcher for three years and a TA for a semester. Now I am looking for something in the Netherlands.
The missing part about the 80% attendance was that students had to submit a report 2 weeks after my lessons, and during the lessons I gave more directions than I otherwise would. I found that the key for students to feel they are not on their own is to provide multiple channels to contact you. They had access to direct email, Moodle and Teams. Different profiles of students used those channels, and it really helped them feel they had access to the teaching staff (and they did not mind lagged answers once they realized that an answer would come and that everybody was in a shitty situation).
Attendance levels really depend on how the course is structured and how students are evaluated.
I guess the “true hacker” would be using a shell, tmux, emacs and what have you. But for someone like me, I like the idea of running something like VSCode in the browser and having an online IDE I can use from anywhere (maybe underneath it’s just running Linux on a VM and I have access to the shell for npm i etc.). Does anyone use anything like that? And are they doing it on an iPad?
I develop with VSCode on a Surface Go, which is quite small. I enjoy it, even if the type cover is a bit cramped.
TIL Joe Armstrong has a lobste.rs account that he uses to repost from beyond the grave.
clicked the wrong button, muscle memory as I’m used to submitting my own work. fixed.
“muscle memory as I’m used to submitting my own work”
That’s the best reason I’ve ever read for this mistake. I always look forward to your articles. :)
Brings a whole new meaning to “necroposting”.
(I’m so sorry.)
Performance numbers look really impressive! I love everything about Erlang except that I don’t have any experience writing it! Actors are just such a beautiful idea and really show the benefit of true OOP, which is why I used them in Firestr. Are there any performance benchmarks compared to something like CAF (the C++ Actor Framework)?
Any system that shares the design of Distributed Erlang (e.g., Akka Cluster, Microsoft Orleans) should be able to benefit from our design. We’ve only evaluated our techniques using Erlang, though.
This is awesome. The performance numbers look really impressive. Is there any place I can find up-to-date documentation or a usage guide for Partisan?
Documentation is a bit lacking at this point – I don’t have numbers, and as I’ve unfortunately become used to saying, you don’t get a Ph.D. for writing documentation on the software you write. We’ve got a little bit up in the Lasp docs at http://lasp-lang.org.
Nice work @cmeiklejohn. Congrats on the awesome results!
How is this different from other actors solutions?
For a fair comparison, you should be considering actor systems that are distributed because the architectural decisions made here are specific to distribution. That means we would be comparing only to Distributed Erlang and Akka.
Briefly highlighting some of the differences:
I also wrote a summary on how Orleans differs from using Basho’s Riak Core (built on Erlang) for building fault tolerant, highly-available, distributed applications. It provides a line-by-line comparison of the paper.
I found that once I switched from a computer to a notebook for note-taking and general research, I became a lot more productive. I find having the computer in front of me makes me more distracted and less focused, and I love the ability I have to just stick my notebook in my pocket with a pen in my jacket at all times of the day. I bring it to bars, cafes, train rides, the park, etc.
I used to be a big fan of the Ogami Stone Notebooks, the paper is wonderful to write on with a ballpoint pen and is waterproof. However, once I realized that the paper decomposes at around the five year mark, I switched back to a Moleskine.
The Moleskine has worked well for me – I buy the exact same size and same one every time, I can buy them almost anywhere – abroad, in an airport, in basically every single city I may visit for both school and work. I like the consistency, because I can keep them all together and date them and have a record of whatever I was working on at a given moment related to my research.
It’s an interesting idea, but in my opinion, it reaches a little in service of bolstering the AP story. The observable local real world isn’t eventually consistent, and that’s where we spend all of our time and form all of our intuitions. Even when we see a far off event happening on TV, we perceive it as being perfectly contemporaneous and admit zero possibility that the experience will be altered retroactively to include updated distant information. So I feel like the argument is unconvincing, and I still feel like AP is a great idea waiting for an actual use case.
Personal anecdotal example of correspondingly low value: having dealt with global leaderboards myself for a certain (in)famous, well-trafficked multiplayer shooter, I was super interested to read an article on using AP & CRDTs with leaderboards (https://christophermeiklejohn.com/lasp/erlang/2015/10/17/leaderboard.html). But the benefit of AP was still a mystery to me after reading it; we were able to keep the leaderboards for even this game, which experiences significantly more traffic than all but possibly 2 mobile games (and maybe all), in a single centralized SQL instance with a few read slaves. Certain very high update rate stats went into a NoSQL store. The insert, select and update statements for the SQL and NoSQL stores were the predictable one-liners. Data didn’t have to be specially structured. None of our availability problems over the course of several years had anything to do with network partitioning between members of the cluster. The architecture and the problem-solving around it were so standard I could get junior developers up to speed in hours. Comparing and contrasting that with Lasp, even as an Erlang fan, …
The observable local real world isn’t eventually consistent, and that’s where we spend all of our time and form all of our intuitions.
I don’t think this is a true statement. Knowledge is a function of perception, and is not immediate unless we’re the agent. Reasoning about truth external to us is bound by the laws of math and physics. For example, if a star goes supernova, I will eventually receive that update – but someone closer to the event will undoubtedly see it first. We can compute, based on relative distance, when the event occurred, but it might take some time to form consensus.
I agree that it’s only a true statement insofar as I included the word “local”. I drop a hammer; the hammer is on the ground simultaneously for everyone in the room. There’s no possibility that a larger quorum of people are going to barge in and declare that the hammer did not in fact land on the floor, because it was replaced by a screwdriver before it could, and so now we should all amend our experiential beliefs to account for this new factual evidence. I’ll agree with you as far as cosmological events over interplanetary distances, but the market for those databases may not be very big at the moment.
But the benefit of AP was still a mystery to me after reading it; we were able to keep the leaderboards for even this game, which experiences significantly more traffic than all but possibly 2 mobile games (and maybe all), in a single centralized sql instance with a few read slaves.
The benefit of the CRDT-based leaderboard, using Lasp, is that it allows peer-to-peer synchronization without coordinating with a central MySQL instance: the guarantee is that every member in the system converges, without the risk of message ordering introducing nondeterminism.
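To make that convergence property concrete, here’s a toy sketch of my own (not Lasp’s actual data structure): a leaderboard modeled as a state-based CRDT, a map of player to highest score, where merge is a pointwise max.

```python
# Toy state-based CRDT leaderboard: each replica holds {player: max score}.
# Merge is pointwise max, so it is commutative, associative, and idempotent --
# replicas converge no matter what order updates arrive in, with no central
# coordinator. Purely illustrative, not Lasp's implementation.
def merge(a: dict, b: dict) -> dict:
    """Join two replica states, keeping each player's highest score."""
    return {p: max(a.get(p, 0), b.get(p, 0)) for p in a.keys() | b.keys()}

# Two replicas observe the same updates in different orders...
replica1 = merge({"alice": 100}, {"bob": 250})
replica2 = merge({"bob": 250}, {"alice": 100})
# ...yet end up in the same state.
assert replica1 == replica2 == {"alice": 100, "bob": 250}
```

Because the merge is idempotent, replicas can also re-deliver the same update any number of times without corrupting the state, which is what makes uncoordinated gossip-style propagation safe.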
Doh, I wasn’t clear, my apologies. I understood the paper, and Lasp looks pretty nifty. What I didn’t understand was the motivation or the payoff, given the triviality of the problem space, the low cost-to-implement and cost-to-operate of the “traditional” solution, and the added complexity of the Lasp-based solution. What factors, if any, got a lot better and made the cost-benefit analysis worth it to do it this way? Or was this pure thought experiment?