This is an interesting point of view, as my reaction was, “Awesome! Now that it’s a subscription I don’t have to go through the red tape of software license acquisition each time I need/want a specialized version of IntelliJ or an upgrade license.”
I’m with you. This rage seems insane to me. The most expensive option costs less than a dollar a day. In an industry with routine six-figure incomes. What the hell?
I don’t think it has anything to do with costs. A lot of people (I’m one) really hate so-called subscription models, because you spend money and have nothing to show for it when the “subscription” lapses.
(I use “so-called” and scare quotes because these newer models don’t really resemble traditional subscriptions to, for example, magazines or newspapers. When you subscribe to a magazine, you get to keep the back issues forever; if the company folds or your subscription lapses, it only means that you don’t get new issues, not that your old issues blink out of existence. Software “subscription” models are more like paying rent; and while renting may make sense when dealing with very expensive things like real property which many or most people cannot afford to buy outright, most people really don’t like renting ordinary consumer goods, to the point where business models like Rent-A-Center’s are widely seen as scammy and predatory.)
Frankly I don’t see why this business model is any less scammy or predatory in the realm of software (or consumer electronics which phone home and require rental fees to continue using after the initial purchase – like the WiFi SD card I just bought, which has an undisclosed requirement that I pay for a “cloud subscription” to copy my data over my WiFi network to my computer).
I like the new pricing also. To some degree I think the most vocal people are ones that are stingy (not generating tons of revenue for JetBrains anyway), or work in dysfunctional organizations (which may be more of a problem for JetBrains).
A colleague raised the idea of JetBrains also providing a “buyout” style of license, where you pay one time to continue using the currently existing version(s) in perpetuity, but get no upgrades. This would probably cost more than the single-product licenses, but may mollify people who can justify one-time expenses more easily than ongoing ones.
I still would go for the subscription model - it’s a good value, especially for polyglots.
not generating tons of revenue for JetBrains anyway
This is not about cost. The vast majority of the most vocal people are, e.g., those who love IntelliJ because they consider it by far the best IDE for Java nowadays (and that has been so for years). I don’t have the revenue numbers obviously, but I bet that’s a lot of people and a big chunk of what JetBrains makes from its products.
Up until now, I’ve had the choice to buy my own car or rent one. Overnight, I can only rent. I hope JetBrains reconsiders.
They did provide a perpetual license kinda like I had imagined. https://sales.jetbrains.com/hc/en-gb/articles/204784622-What-is-perpetual-fallback-license-
A week ago I started Neuromancer, but I couldn’t keep up. It’s a seminal work and whatever; I guess I was not in the mood. On the other hand, I picked up Code: The Hidden Language of Computer Hardware and Software, and suddenly I can’t put it down. “A book that takes you on a trip from physics principles up to the workings of real computers? Woot!” I would put it at the same level as The Elements of Computing Systems, though less practical.
I’ve been picking up Metamagical Themas by Douglas Hofstadter a bit this week. It’s been several months going now, but I quite enjoy forays into abstract thinking.
After having read GEB, this has been on my to-read list for ages. I read the chapters about Lisp, and they were fantastic.
Did some performance improvements in the UDP messaging stack for Fire★. Messaging is now about 3x faster. Makes me wonder what I was thinking before! There is probably still a lot of room for improvement too. Advanced data structures are NOT your friend when it comes to performance. This is counterintuitive, but true.
Working on UI for rejoining conversations when you get a network partition or disconnection.
Based on your experience, can you briefly elaborate on why there is tension between advanced data structures and performance?
Cache misses. We no longer have contiguous, random-access storage: most advanced data structures are node-based, which results in large numbers of cache misses. This has been true for a long time, but most people don’t internalize it, myself included.
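A rough way to see this even from Python (the names and sizes below are mine, and the effect is far starker in a systems language like the C++ Fire★ is written in): summing the same array in sequential order vs. a shuffled order does identical work, but the shuffled walk defeats the hardware prefetcher and cache, so it is typically measurably slower.

```python
import random
import time

N = 1_000_000
data = list(range(N))          # contiguous array of object pointers

seq_order = list(range(N))
rand_order = seq_order[:]
random.shuffle(rand_order)     # same indices, cache-hostile order

def sweep(order):
    total = 0
    for i in order:
        total += data[i]
    return total

t0 = time.perf_counter()
s_seq = sweep(seq_order)
seq_time = time.perf_counter() - t0

t0 = time.perf_counter()
s_rand = sweep(rand_order)
rand_time = time.perf_counter() - t0

# Identical work, identical answer; only the memory-access pattern differs.
assert s_seq == s_rand == N * (N - 1) // 2
```

The same principle is why a flat vector often beats a node-based tree or list in practice even when the big-O analysis says otherwise.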
Looking forward to more stuff on prose. I finished what’s currently there and I liked it so far, especially the structure of the material (so meta!).
Thank you for the kind words! Having a clear and visible structure was actually one of the design constraints I set myself for the book, so that it can (hopefully) serve as an example of the style it is teaching.
Just looked at PfP. Very cool. Do you have an email list or something I can subscribe to for updates? I find watching git repos of docs is too noisy to be of use, as you get notified for every spelling correction.
Edit: Just found the LeanPub link, never mind me :)
There isn’t a mailing list. However, I try to keep the noisiness of the repo to a minimum. Basically the only updates are when I issue a PR for a new section that I’d like people to “prose review”. :-)
I’m going through The Elements of Computing Systems again. But this time I’m creating a lexer/parser in Racket (which I’m learning on the side) for the HDL files in the first chapters. I didn’t really like working with the GUI they provide.
Wow, that’s a very interesting book. I wonder how long it’d actually take to go through all those projects to completion, though.
Concepts of Modern Mathematics. It’s from the 1970s, and I think the chapters about abstract algebra are good enough.
That looks like a pretty interesting book. When I was in high school, What is Mathematics? was also a very interesting overview of things I wanted to learn.
It’s a pet peeve of mine that people say abstract or modern algebra instead of just plain algebra. I get that it has to happen because of the terminology the educational system uses, but it’s not a subject name that professional mathematicians use. It’s just algebra. It is commonplace enough that it doesn’t need extra qualifiers.
I kind of blame the fact that Wikipedia is largely edited by undergraduates, who only know algebra from the one undergraduate course they took, which was called abstract algebra. Thus, Wikipedia systematically calls it abstract algebra (and has a bunch of other undergraduate habits). MathWorld tries to clarify the name somewhat.
In my previous company, we used an in-house build system that took 20 minutes on average to build the entire code base. That’s a lot of time down the tubes, considering that we were 10+ developers and each of us built everything several times a day. So I resolved to improve the dire situation and, by parallelizing the steps that comprised a build, I got the time down to a bit over 1 minute. I felt like a hero. Now I read that these guys execute the build and the tests in under a second. FML
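The core of that kind of speedup is simple: treat the build as a dependency graph and run every step whose dependencies are already satisfied in parallel. Here’s a minimal sketch of the idea (the step names and graph are invented for illustration; real steps would shell out to compilers):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical build graph: each step names the steps it depends on.
deps = {
    "codegen": [],
    "compile_a": ["codegen"],
    "compile_b": ["codegen"],
    "compile_c": ["codegen"],
    "link": ["compile_a", "compile_b", "compile_c"],
}

completed = []

def run(step):
    # Stand-in for the real build action (compiler invocation, etc.)
    completed.append(step)

def build(deps):
    done = set()
    with ThreadPoolExecutor() as pool:
        while len(done) < len(deps):
            # Every step whose dependencies are satisfied can run concurrently.
            ready = [s for s in deps
                     if s not in done and all(d in done for d in deps[s])]
            list(pool.map(run, ready))   # run the whole wave in parallel
            done.update(ready)

build(deps)
```

With three independent compile steps per wave, wall-clock time approaches the length of the critical path rather than the sum of all steps, which is where most of that 20-to-1 improvement comes from.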
Saying that syntax doesn’t matter is like saying “Learn the concepts of addition and multiplication, the notation doesn’t matter”. Try multiplying two big numbers using Roman numerals.
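To make the point concrete, here’s a sketch in Python: the only sane way to multiply Roman numerals is to leave the notation entirely, do positional arithmetic, and convert back. The notation itself is what makes the operation easy or hard.

```python
# Value/symbol pairs, largest first, including subtractive forms (CM, IV, ...).
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"), (100, "C"),
         (90, "XC"), (50, "L"), (40, "XL"), (10, "X"), (9, "IX"),
         (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    out = []
    for value, sym in ROMAN:
        while n >= value:
            out.append(sym)
            n -= value
    return "".join(out)

def from_roman(s):
    vals = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}
    total = 0
    # A symbol is subtracted when a larger one follows it (e.g. the X in XL).
    for a, b in zip(s, s[1:] + "I"):
        total += vals[a] if vals[a] >= vals[b] else -vals[a]
    return total

# XLVIII times XII: convert, multiply positionally, convert back.
product = to_roman(from_roman("XLVIII") * from_roman("XII"))  # 48 * 12 = 576
```

Doing the same multiplication while staying inside Roman notation requires doubling-and-halving tricks that medieval reckoners needed tables and abacuses for, which is exactly the point about notation.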
By the way…
… and it matters to recruiters that new employees have the right values, technical skills, and experience, not the right list of languages.
My experience with recruiters in real life is actually quite the opposite.
This is actually about Coq. I was not sure what tags to apply; I wish there were one for automated proof assistants.
Yeah, I’d really like one for formal methods in general. @jcs, possible?
In between the extremely specific “proofassistants” tag (estimate: one post per week) and the useful but extremely general “compsci” tag (estimate: 10 posts per week), I was thinking “formalmethods” (estimate: 3 posts per week) would be a useful middle ground.
Also recently applicable to this other story.
As a working academic in programming languages, I’ve mostly resisted the urge to dive into proof assistants for my work. (As an academic, I mostly resist the urge to do anything that seems to be popular among my peers.)
Remarkably, the points he makes (almost a decade ago) are no less familiar today. I’m more familiar with ACL2 than Coq, but it seems each tool has its own issues when it comes to learning curve.
Reading his summary makes me wonder if making major investments in IDE/UI technology (perhaps proof state visualization?) would help.
Visualization as a debugging aid helps, and in this case I guess it fits naturally with the stepwise development style Coq encourages (I’m no expert). For example, there is prooftree for Proof General.
There’s also the work in Fstar which has some interesting tooling around it. Also has an MSR Cambridge connection.
I upvoted this post in the hope many users read it and come up with their own opinion on what should be considered good vs bad manners in this online community.
The actual coding requires great care and a non-failing talent for accuracy; it is labour-intensive and should therefore be postponed until you are as sure as sure can be that the program you are about to code is, indeed, the program you are aiming for. I know of one —very successful— software firm in which it is a rule of the house that for a one-year project coding is not allowed to start before the ninth month! In this organization they know that the eventual code is no more than the deposit of your understanding. When I told its director that my main concern in teaching students computing science was to train them to think first and not to rush into coding, he just said “If you succeed in doing so, you are worth your weight in gold.” (I am not very heavy).
But apparently, many managers create havoc by discouraging thinking and urging their subordinates to “produce” code. Later they complain that 80 percent of their labour force is tied up with “program maintenance”, and blame software technology for that sorry state of affairs, instead of themselves. So much for the poor software manager. (All this is well-known, but occasionally needs to be said again.)
I wonder how things would be different in a world where this, rather than the Agile focus on delivering “business value” from day one, were the conventional industry wisdom.
I think that it requires balance. Agile/Scrum should die in a taint fire, but I’d be annoyed if someone told me that I had to spend 8 months in “design meetings” before I could write any code, build something, demo it, and get feedback on whether I’m digging at the spot marked X. I could get on board with “the business leaves the coders alone for 8 months, and they don’t commit to putting anything into production code until they know what the fuck they are doing”.
In other words, I don’t think that “thinking” and “coding” deserve to be separated into phases. Coding is a part of the thinking process (that doesn’t mean that all of the code must go into production) and thinking is part of the coding process, unless you’re doing rote work that ought to be automated.
The Agile vs. Waterfall debate is a red herring because it presumes business-driven engineering, which is an unpolishable turd. Waterfall replicates the sociology of a stable but dysfunctional hierarchical company: the work ripples down from level to level as each tier picks off the work it finds interesting: businessmen shout incoherent orders (with great emotion!) in January, designers come up with products in March, architects call the technical shots in May, programmers write the code in July, and ops people and QA support the code in its Eternal September. Agile replicates the sociology of an equally dysfunctional company without a stable hierarchy, which is even worse. In Waterfall, shit rolls downhill. In Agile, shit is flung from all directions. Personally, I prefer not to have shit thrown at or on me from any direction, which means that I don’t want to do business-driven engineering.
Waterfall is only better than Agile because it’s less aggressive. The Waterfall culture is one of seniority and complacency, which means that as long as you don’t piss anyone off, you can hide away and actually produce something. The Agile culture is the macho-subordinate culture that is constantly in your face. However, they’re both terrible and they both produce substandard software.
There isn’t one right feedback cycle. For some projects, weekly demos (or “iterations”) are appropriate. For others, monthly or yearly audit cycles are better. What is objectively true is that engineer-driven companies can produce quality software (not everything will be excellent, but the wins will pay for the losses) while business-driven engineering is doomed to failure.
This is something of a classic…my best resourceful-system-recovery story happened a few years ago (not as dramatic or enjoyably-related as the linked story, but I was sorta proud of myself):
After making a semi-special request, I had recently been granted root access on my workstation at my (university) department. One afternoon a few days after that, I started mucking around trying to do a local install of GHC – installing a newer glibc into /usr/local (since the system’s C library was too old for it), tweaking some libc/ld.so symlinks, so on and so forth. Despite a few hours of bashing my head against it though, I wasn’t able to get it working and eventually gave up, moved on to other things and basically forgot about it (I had an older GHC available and didn’t need the new one for anything critical).
Later that night, however, I was busily tapping away at my shell session, when suddenly:
$ ls
Segmentation fault
$ whoami
Segmentation fault
$ /bin/true
Segmentation fault
Like in the linked story, in my initial semi-panic I almost rebooted the box, but thankfully thought better of it (doing so would have destroyed any chance I had of recovery).
After some further poking around (I don’t quite remember what clued me in to exactly what was going on) I realized that one of my earlier symlink adjustments had broken my x86-64 ld.so – but only in a delayed fashion that didn’t actually manifest itself until the 4:00AM prelink cron job ran (I’ve always been a bit of a night owl, and was still working at 4:00AM). So any dynamically-linked 64-bit binary I tried to execute would just immediately fall on its face before even reaching main(). That’s basically every binary on the system, very much including the ksu binary (it was a Kerberos environment) I’d have needed to actually fix the problem.
The prospect of sheepishly going back to the department IT folks for help un-breaking my system immediately after convincing them that I could in fact be trusted with root access on it was pretty embarrassing, especially as someone who considers himself a decently competent sysadmin.
In considering my options, I remembered I actually did have a running root shell on the system, but it was tucked away in a screen session I wasn’t currently attached to – and /usr/bin/screen, needless to say, was a 64-bit dynamic executable and thus not really working. (Yes, perhaps on general principle I should be admonished for having left a root shell lying around unattended, but that’s another matter, and in this case it was critically useful…)
The other key thing I realized I had at my disposal was the departmental AFS filesystem, which was mounted on the machine. While my x86-64 ld.so was borked, i386 programs continued to work; I just didn’t have i386 versions of anything I actually needed. So I logged in to a 32-bit machine elsewhere in the department, compiled a 32-bit screen from source, dumped the binary into AFS, and hoped like hell screen’s authors had the decency to keep whatever IPC protocol it uses architecture-independent… thankfully, they apparently did, and I was able to re-attach to my 64-bit screen session with my freshly-compiled 32-bit binary, regaining access to my root shell. From there it would have been relatively easy to build 32-bit versions of whatever minimal subset of coreutils I’d need to fix the actual problem, but I realized I wouldn’t even need to do that – there was a 32-bit python interpreter sitting in AFS, so I fired that up as root and manually issued system calls (import os; os.unlink(...); os.symlink(...)) until things in /lib looked like they had before I started messing with them… and then at last:
$ /bin/true
$ echo $?
0
Success! Admin embarrassment avoided.
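For anyone curious, the repair step from the Python prompt amounts to something like this (the paths below are stand-ins under a temp directory; the real ones were loader links under /lib and weren’t recorded):

```python
import os
import tempfile

workdir = tempfile.mkdtemp()
link = os.path.join(workdir, "ld.so")          # stand-in for the broken loader symlink
target = os.path.join(workdir, "ld-orig.so")   # stand-in for the original loader

open(target, "w").close()                      # pretend this is the real loader file

# The actual repair: drop the bad link, point it back at the original.
if os.path.islink(link):
    os.unlink(link)
os.symlink(target, link)

assert os.readlink(link) == target
```

Since os.unlink and os.symlink are thin wrappers over the corresponding system calls, any working interpreter, even a 32-bit one from a network filesystem, is enough to do the surgery that coreutils normally would.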
Where do you keep copies of busybox to ensure you’re not affected? Surely the parent’s problem would have caused problems for you, even with your copy of busybox. Though! I guess you’d have had a shell open and could have cd’d to the directory it was in… Saved… this time. :)
Like in the linked story, in my initial semi-panic I almost rebooted the box, but thankfully thought better of it (doing so would have destroyed any chance I had of recovery).
It would? Couldn’t you have just booted a live cd, mounted your hard drive and done the link changes there?
On a “normal” system, sure, but in this case no, because the IT-administered machines in the department have their bootloader & BIOS locked down.
Thank you for sharing this. It would seem that the mechanisms that make you more effective in the face of hunger and fear of embarrassment are alike!
I’m slowly working my way through Types and Programming Languages. It stands out by actually including answers for almost all of the exercises, making it worthwhile to work through them, which makes it a slow but fulfilling read.
Why is it that most texts, specifically self-paced math textbooks, don’t include solutions for all exercises? It’s overly pesky.
Because in some sense that’s just the way it is with mathematics. Eventually the only arbiter for a correct answer is your own intuition and your ability to convince others… so at some point in mathematical literature they start to assume that’s where your answer key will be sourced from.
Don’t feel confident about a proof you wrote?
Well, why? Maybe we’ve all got some (re)thinking to do
That’s a bit extreme. For the vast majority of us mortals, having access to the solutions is an enormous help in guiding self-study. Anyway, I’m not convinced by your answer, since lots of math books do include answers to some portion of the exercises (usually the even- or odd-numbered ones). Why?
Oh, I agree that they’re helpful—and this is also why they’re included. But they’re also a lot of work to build and kind of a crutch.
can you tell us what good stuff is sinking in after reading that?
i’m reading “artificial intelligence and creativity”, edited by Terry Dartnall from springer
it’s nice. similar ideas to those found in “i am a strange loop” by douglas hofstadter (really worth a read)
The author starts by describing the structure of a system, which in its basic form is composed of stocks and flows. Stocks are those elements you can measure; each is just a quantity: the amount of worldwide oil reserves, the temperature of a room, you name it. A flow is something that makes a stock bigger (an inflow) or smaller (an outflow). A fairly important concept is that of dynamic equilibrium: you can have competing inflows and outflows with the net effect that the stock is kept at a fixed level. Imagine a dam during a flood: the inflow of water could equal that of the water leaving through the open gates. The water level in the dam (the stock) does not change; the system is kept in a dynamic equilibrium.
Another concept is that of delay: flows take time to flow, and stocks act as buffers in a system. Your bank account (a stock) allows you to spend money (an outflow) at a pace different from how you earn it (an inflow). This was certainly profound to me.
And what happens when changes in a stock in turn change the inflows and outflows of that stock? You get feedback loops; you must be familiar with those, having read Hofstadter :). The author mentions two types: balancing feedback loops and reinforcing feedback loops. The former, also called goal-seeking loops, fight to keep the stock at a certain level. The latter are the source of exponential growth or decay. You can have a myriad of combinations of all these types of flows, and you get to visualize really funny patterns over time. Reinforcing loops are always eventually accompanied by balancing loops, because no physical system can grow forever in a finite universe. D'oh!
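A balancing (goal-seeking) loop is easy to sketch in a few lines. In this toy simulation (the numbers are made up, not from the book), the flow each step is proportional to the gap between the stock and its goal:

```python
def simulate(stock, goal, rate, steps):
    """Goal-seeking loop: each step, the flow closes a fraction of the gap."""
    history = [stock]
    for _ in range(steps):
        flow = rate * (goal - stock)   # inflow when below goal, outflow when above
        stock += flow
        history.append(stock)
    return history

levels = simulate(stock=20.0, goal=100.0, rate=0.25, steps=30)
# The gap shrinks geometrically: the stock rises quickly at first,
# then levels off as it approaches the goal (the classic goal-seeking curve).
```

Flip the sign of the coupling (flow proportional to the stock itself rather than to the gap) and the same three lines produce the exponential growth of a reinforcing loop instead.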
It was also insightful to learn that when people study the dynamics of a system, the goal is not to predict what will occur, but to understand how the system would behave when conditions change.
As I mentioned, I’m only half way through the book, and at this point I think my comment qualifies as a spoiler :/
To wrap it up, suffice it to say that all these properties and whatnot stem from the definition of the structure of systems (entangled stocks + flows) given at the beginning of the book. Fascinating to me.
I am (and have been for the past few weeks) still working through Infinite Jest. I’ve been enjoying it for the most part; though there have definitely been some parts I’ve had to slog through.
I’ve wanted to start reading some more educational material, particularly around PL and compilers, but I’ve been a little daunted by the idea of reading it in parallel with IJ. Suggestions and what and how to read would be appreciated!
I’ve heard only good things about Let’s Build a Compiler. Anyone with further good things about it to share with us?
Thanks for the reference, although I’m looking for something a couple steps beyond that as I’m already familiar with type theory and basic compiler implementation.
And good things about IJ? It’s an unprecedented and fascinating look into addiction, depression, and American views on these topics. Those are just the central themes (I’d say, anyway), and the author has much more to say along the way.
I read the Dragon Book (Compilers: Principles, Techniques, and Tools, Aho et al.) back in the day. It’s quite good (considered the seminal text on these matters), though if you’re already familiar with the basics, I don’t know how much it’ll help you.
Maybe I should have entitled the post: ‘If you’re reading a book this week, then tell us about it! If you’re reading none, then no need to share the fact with us. Thank you.’.
I’m reading The Princeton Companion to Mathematics to brush up on my math. It has been really easy to read so far.
Well, I imagine it gets tougher. I saw someone say they read it in 1.5 years, so that’s my target. It is a thousand pages!
TBH counting has been a great difficulty for me, whether things start at 0 or even at 1. Something like counting how many numbers there are from 2 to 12 would pretty much force me to use my fingers.
That’s actually the easiest: (upper - lower) + 1. In your example: (12 - 2) + 1 = 11.
Yup. Off-by-one errors have a way of screwing with your head, I guess :)