I’ve been using https://github.com/peterldowns/pgtestdb and it has been going great so far. I have replicated the same setup in Python with a custom script.
I wrote my own code for using schemas, but using template databases is actually smarter. Even if a schema is lighter-weight, having to run migrations takes time.
I do something very similar with Python, using pytest-postgresql; I stand up a brand new instance and then build the structures in it I need. I haven’t put this into CI/CD yet, because time, but I expect it to work fine there, too.
I am thinking of setting the test runner up to try to use a templated version of our database running locally first, rather than load the whole thing into the new DB instance at runtime, just to save time. Using the -T flag to createdb might be a better approach.
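For concreteness, the template flow with createdb’s -T/--template flag might look like this (database and file names here are placeholders, not from the comment above):

```shell
# Build the template once, running migrations a single time:
createdb app_template
psql -d app_template -f migrations.sql

# Per test run: cloning from a template is a cheap file-level copy,
# so no migrations have to run here.
createdb -T app_template test_db_1
```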
Call me impatient, but the startup time of this setup was too slow for me. I quickly ended up with long-lived, shared Postgres databases instead. In my case, I started a Docker image with configuration options to keep as much as possible in memory instead of on disk. Then the tests just connected to a hardcoded pg connection and did whatever they needed to test.
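A sketch of that kind of long-lived instance (the image tag and the exact durability trade-offs are my choice here, not taken from the comment — fine for throwaway test data only):

```shell
# Data directory on tmpfs (RAM), durability features disabled:
docker run -d --name pg-test -p 5432:5432 \
  -e POSTGRES_PASSWORD=test \
  --tmpfs /var/lib/postgresql/data \
  postgres:16 \
  -c fsync=off -c synchronous_commit=off -c full_page_writes=off
```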
Sharing a database has some additional benefits: you won’t just test the database from an empty state. If you accidentally forget to add the AND user_id = %3 expression to your query, you’ll quickly find out on the second run of the same test. Though this also depends on you being able to generate new users and data on the fly.
Of course, there are nitty-gritty details here. You’d like to clean up the database now and then so that it doesn’t blow up your memory use. And you probably need an isolated database from time to time. But that’s manageable, and the complexity is well worth it to make tests snappy, imo.
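The missing-filter failure mode described above can be sketched with an in-memory SQLite table (table and column names are made up for illustration):

```python
import sqlite3

# In-memory stand-in for the long-lived shared test database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, user_id INTEGER)")

def orders_for_user(user_id, buggy=False):
    # The buggy variant forgets the user_id filter entirely.
    if buggy:
        sql, args = "SELECT id FROM orders ORDER BY id", ()
    else:
        sql, args = "SELECT id FROM orders WHERE user_id = ? ORDER BY id", (user_id,)
    return [row[0] for row in conn.execute(sql, args)]

# First run: only one user's data exists, so the bug is invisible.
conn.execute("INSERT INTO orders VALUES (1, 100)")
assert orders_for_user(100, buggy=True) == [1]  # passes by luck

# Second run against the same database: another user's rows are still
# around, and the missing filter now leaks them into the result.
conn.execute("INSERT INTO orders VALUES (2, 200)")
assert orders_for_user(200) == [2]                 # correct query
assert orders_for_user(200, buggy=True) == [1, 2]  # bug exposed
```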
(Also, if you can, this is just much easier to do with SQLite)
There’s https://github.com/zonkyio/embedded-postgres-binaries and some other project that I cannot find right now that package relocatable PostgreSQL binaries. There are also a few libraries, similar to the linked one, that use these projects to do this without requiring containers or installing PostgreSQL on your machine.
I have a slight preference for avoiding containers. I think they are great in some scenarios, but in others they make some things easy while making other things more complex.
I’ve been doing something similar, a little more manual, but easy to reason about. I have a Makefile target to start a postgres instance, which I keep running in a terminal while I work on the project. Something like this:
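The start target itself isn’t shown; a minimal sketch consistent with the init target below might be (a guess at the shape, not the author’s actual target — -k just points the unix socket at a writable directory):

```make
postgres-start:
	postgres -D tmp/pg -k /tmp
```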
There’s also a target to initialize the database, something like this:
postgres-init:
	mkdir tmp/pg
	initdb -D tmp/pg
TestMain sets up a new template db with migration scripts run. Each test creates a new db based off that template db, with tests run in parallel.
The postgres instance is also used for local development, with some long-term data for testing. During development, especially with concurrent branches with conflicting schema migrations, I just backup/restore the whole tmp/pg directory. With the instance running for local development/manual testing, might as well use it for automated tests. At the end of the day I shut down the terminal with postgres. Conceptually very simple, and fast.
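The backup/restore step is just a directory copy done while postgres is stopped; a self-contained sketch, using a scratch directory in place of the real data dir:

```shell
set -eu
cd "$(mktemp -d)"                   # scratch project dir for the demo
mkdir -p tmp/pg && echo data > tmp/pg/marker

cp -R tmp/pg tmp/pg.backup          # snapshot before a risky branch switch
rm -rf tmp/pg                       # e.g. conflicting migrations wrecked it
cp -R tmp/pg.backup tmp/pg          # restore and carry on
cat tmp/pg/marker                   # the old state is back
```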
Nice. I’ve used TestContainers (https://testcontainers.com/?language=go) for this and it works quite nicely (uses docker though).
pgtestdb is the real deal. Pair it with running Postgres in tmpfs (RAM) and it practically feels like you are not doing IO at all.
I enjoyed pgtestdb so much I ported it to SQLite for my side projects: https://github.com/terinjokes/sqlitestdb
I was looking for exactly this recently, thank you so much!
This is nice. Is there a similar Go module for local DynamoDB?
does https://testcontainers.com/modules/dynamodb/ work ?
Yes, thank you.