1.

  2.

    I didn’t even know you could insert multiple rows with one statement that way. I’ll have to go look at the SQLite syntax diagrams again!

    1.

      You mean `insert into x values (1),(2)`? Yup! Pretty standard SQL.

    2.

      I don’t have a Go toolchain available; what would these look like if you weren’t timing reading the full CSV into memory before sending it to SQLite? Currently, all the tests read the full file from disk each iteration, except the CSV import, which doesn’t even open the file and lets SQLite do it all.

      I’m guessing that if the data is already coming from memory, the bulk insert and CSV import would look a lot more similar.
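      A minimal sketch of what that could look like, using only the standard library (the CSV payload and the record handling here are hypothetical stand-ins for the article’s data): load the bytes once up front, then let `testing.Benchmark` time only the parsing.

      ```go
      package main

      import (
      	"bytes"
      	"encoding/csv"
      	"fmt"
      	"io"
      	"testing"
      )

      func main() {
      	// Hypothetical stand-in for the article's CSV file, loaded once
      	// so the timed region below never touches the disk.
      	data := []byte("id,name\n1,alice\n2,bob\n")

      	res := testing.Benchmark(func(b *testing.B) {
      		for i := 0; i < b.N; i++ {
      			r := csv.NewReader(bytes.NewReader(data))
      			n := 0
      			for {
      				_, err := r.Read()
      				if err == io.EOF {
      					break
      				}
      				if err != nil {
      					b.Fatal(err)
      				}
      				n++
      			}
      			if n != 3 { // header + 2 rows
      				b.Fatalf("got %d records", n)
      			}
      		}
      	})
      	fmt.Println(res.N > 0)
      }
      ```

      In the real benchmark the loop body would bulk-insert the parsed records into SQLite instead of just counting them; the point is only that the file I/O stays outside the measured loop.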

      1.

        It’s unfortunate that the Go `database/sql` package doesn’t have a more streamlined way of building bulk inserts. I feel like such a common use case should be covered by the standard library.

        Having to build the parameter list and the prepared query’s VALUES string in parallel by hand is kind of tiresome.
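        For what it’s worth, that parallel bookkeeping can at least be wrapped up once in a small helper. This is only a sketch (the `bulkInsert` name and the table/columns are made up), building the `(?,?),(?,?)` placeholder string and the flattened argument slice together:

        ```go
        package main

        import (
        	"fmt"
        	"strings"
        )

        // bulkInsert builds the SQL text and the flattened argument list
        // for a multi-row INSERT. database/sql has no helper for this, so
        // callers typically hand-roll something like it.
        func bulkInsert(table string, cols []string, rows [][]interface{}) (string, []interface{}) {
        	// One "(?,?,...)" group per row, sized to the column count.
        	row := "(" + strings.TrimRight(strings.Repeat("?,", len(cols)), ",") + ")"
        	places := make([]string, len(rows))
        	args := make([]interface{}, 0, len(rows)*len(cols))
        	for i, r := range rows {
        		places[i] = row
        		args = append(args, r...)
        	}
        	sql := fmt.Sprintf("INSERT INTO %s (%s) VALUES %s",
        		table, strings.Join(cols, ", "), strings.Join(places, ", "))
        	return sql, args
        }

        func main() {
        	sql, args := bulkInsert("x", []string{"a", "b"},
        		[][]interface{}{{1, "one"}, {2, "two"}})
        	fmt.Println(sql)
        	fmt.Println(len(args))
        }
        ```

        The returned pair feeds straight into `db.Exec(sql, args...)`. Note that SQLite caps the number of parameters per statement (999 by default in older versions), so a real helper would also chunk the rows.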

        Also, what is with people writing benchmarking code that doesn’t make use of the standard library benchmark functionality… I saw the same with the article about testing gcc-go sqlite vs. go sqlite packages. SMH.