1. 4
  1.  

  2. 7

    I think DHH has a lot of accurate points here, but I think he’s wrong about not needing to write SQL or understand the database technology supporting your application. For applications that store little data, I agree it may be possible to treat the database technology as completely opaque. However, I don’t see a way to forgo knowledge of the database layer and avoid writing SQL for things like running migrations on tables with millions of rows.

    As a simple example, creating an index on a large Postgres table without the CONCURRENTLY keyword is a surefire way to block writes on the table for the entire build and cause downtime. I don’t work with ActiveRecord, but it appears there is an abstraction for this keyword (algorithm: :concurrently). But how would you know to use that option if you don’t have an understanding of the database and its locking behavior?
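
    A minimal sketch of the difference (Postgres syntax; the table and index names are hypothetical):

    ```sql
    -- Takes a lock that blocks INSERT/UPDATE/DELETE on the table for the
    -- entire duration of the index build (reads are still allowed):
    CREATE INDEX idx_users_email ON users (email);

    -- Builds the index without holding a lock that blocks writes; slower,
    -- and it cannot run inside a transaction block, but avoids downtime:
    CREATE INDEX CONCURRENTLY idx_users_email ON users (email);
    ```

    In a Rails migration, I believe the equivalent is add_index :users, :email, algorithm: :concurrently, with disable_ddl_transaction! set on the migration class, since CONCURRENTLY can’t run in a transaction.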

    As another example, adding a NOT NULL constraint on a large table will also block reads and writes in Postgres while it validates that all current rows conform to the constraint. You’re better off adding a CHECK constraint with NOT VALID to ensure the column is non-null, then running VALIDATE CONSTRAINT later to avoid downtime. This is the kind of thing where knowing only the abstraction layer, and not the underlying database, will cause problems.
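
    A sketch of the safer pattern (Postgres spells the keyword NOT VALID; the table, column, and constraint names here are hypothetical):

    ```sql
    -- Naive approach: takes an ACCESS EXCLUSIVE lock and scans every row
    -- while holding it, blocking reads and writes for the duration:
    ALTER TABLE users ALTER COLUMN email SET NOT NULL;

    -- Safer: add the constraint without validating existing rows. Only a
    -- brief lock is needed, and new writes are checked immediately:
    ALTER TABLE users
      ADD CONSTRAINT users_email_not_null CHECK (email IS NOT NULL) NOT VALID;

    -- Validate separately; this takes a weaker lock that does not block
    -- concurrent reads or writes while it scans the table:
    ALTER TABLE users VALIDATE CONSTRAINT users_email_not_null;
    ```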

    To be fair, DHH only mentions that Basecamp 3 has no raw SQL in “application logic”, and he never mentions migrations in the post, so maybe he is excluding migration-type SQL commands from this claim.

    1. 3

      As another example, adding a NOT NULL constraint on a large table will also block reads and writes in Postgres while it validates that all current rows conform to the constraint. You’re better off adding a CHECK constraint with NOT VALID to ensure the column is non-null, then running VALIDATE CONSTRAINT later to avoid downtime.

      And this (among other things) is why I just can’t believe the claim that they could move from MySQL to Postgres and “not give a damn”.

      1. 1

        I interpreted that as meaning he wouldn’t care what underlying technology he used, not that the migration process would be trivial.

        1. 1

          But I’m not talking about the migration process either. He will care about the underlying technology when, for example, his team has to tackle vacuum issues – long after the move has been done.

          1. 1

            But if your claim is that you do have to care about the technology, then your problem is with the entire blog post, not just with whether he would give a damn about running PostgreSQL.

    2. 3

      Good point about beginners, but the thing about non-trivial abstractions is that they always leak someplace.

      He uses memory management as an example of something programmers don’t have to think about day to day, but you kinda need to have a mental model of memory. I’ve visited teams that don’t, and they’ve dug themselves into deep holes.

      The cases for data are real. The question is how much rework you have to do when you come across one.

      1. 3

        I don’t know if he’s literally saying you don’t need to know SQL, the language — in which case, whatever, OK — or that you don’t need to know how databases work (constraints, concurrency, join performance, etc.) — in which case no, you do need to know that stuff, or you will find yourself in bewildering pain once your application becomes large or successful enough to be interesting.

        That is classic Rails app evolution, though — you can get so much done without knowing what you’re doing, then a few years later you have a terrifying ball of misconceptions.

        1. 2

          Thankfully beginners can now use “mental compression” on dirty, dirty SQL and can save that precious mental filesystem for information about Ruby meta-programming.