As one of the few people left (apparently) who doesn’t prefer pytest, I feel like I should speak up for the other side of the argument.
So. You’re perfectly able to use assert in unittest-style tests, too – it’s not like they somehow forbid it. It’s just that the whole xUnit family (of which Python’s unittest module is a member) has a tradition of more complex/richer assertions beyond the ones people usually make fun of when suggesting alternatives, and in fact you’ll often find that a unittest-oriented approach is more likely to take the logic involved in a complex assertion and wrap it up for reuse (Django does this, for example, in its extensions to unittest – there are helpers that wrap slightly-tricky implementations like asserting exactly how many DB queries a function performs, or that provide semantic equality checks on blobs of JSON and HTML, etc. – I know there’s a pytest plugin for Django providing equivalents, but I also get the impression that pytest purists frown on them). Also, for completeness’ sake, pytest is only able to do what it does because it literally rewrites your code at the AST level in ways which very much do not boil down to “just” using Python’s assert statement.
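For instance, here’s roughly what those Django helpers look like in use (the app and function names are made up, but assertNumQueries, assertJSONEqual and assertHTMLEqual are real django.test.TestCase methods):

    from django.test import TestCase

    from myapp.reports import article_summary  # hypothetical function under test

    class ArticleTests(TestCase):
        def test_summary_query_count(self):
            # Fails if the wrapped block runs anything other than exactly
            # two queries, and the failure message includes the queries it saw.
            with self.assertNumQueries(2):
                article_summary()

        def test_serialized_payloads(self):
            # Semantic comparisons: key order and insignificant whitespace
            # don't produce spurious failures.
            self.assertJSONEqual('{"a": 1, "b": 2}', {"b": 2, "a": 1})
            self.assertHTMLEqual("<p>hi </p>", "<p>hi</p>")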
And then there’s the verbosity. I feel like pytest outputs huge amounts of irrelevant information when a test fails, but also often omits the actual important bits! You can see it in this post every time pytest hides part of the diff between expected/actual data structures – you actually have to turn the verbosity up even further to see the full details at the critical spot where your test failed (and that’s after scrolling back through everything else pytest dumped into your console).
And then there’s the fixtures. I’m not going to break out the “m” word here, but the level of implicit and hard-to-reason-about work pytest does behind the scenes is just not something I personally see as an advantage. In a unittest-style test case, I can trace everything easily: either it’s defined right there on the TestCase class I’m looking at, or inherited from an ancestor class or some explicit reference to a helper function, and either way I can locate it by following the definitions and imports. In pytest if I see a fixture… well, who knows where it came from? There are a bunch of rules about where pytest will find and load fixtures, which ones it will activate and in which contexts, and so the only rule really seems to be that any code in any file might end up influencing any test through some sort of implicit action-at-a-distance. So I prefer being able to follow standard Python constructs to find the relevant code, and generally see pytest as doing too much implicit stuff for its own good.
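As a minimal (made-up) illustration of what I mean: the test below just names a fixture as a parameter, and the definition can live in a conftest.py any number of directories up – a file the test module never imports:

    # tests/reports/test_summaries.py
    def test_uses_database(db_session):     # where does db_session come from?
        assert db_session is not None

    # tests/conftest.py -- pytest finds this by directory layout, not by import
    import pytest

    @pytest.fixture
    def db_session():
        return object()   # stand-in for real setup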
That said, I know a lot of people do prefer pytest, for various reasons. And it’s good that they can have something which suits their preference. I just wish it weren’t so constantly being presented as objectively better rather than subjectively preferred.
Edit: And because I don’t want this to be pure grumpy ranting, I’ll mention something positive that almost always gets left out of “use pytest, it’s better” discussions: pytest gives you clear ideas about separation of concerns in testing, and I think there is value in that. When you write tests with pytest, its various helpers and features steer you toward separating, and thinking separately about (a quick sketch follows the list):
The logic of the test (in pytest, the function body)
The resources required to carry out the test (in pytest, these are the dependency-injected “fixtures”)
The data the test will operate on (in pytest, these are often supplied via its parametrization helpers)
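To make that separation concrete, here’s a minimal, self-contained sketch of all three pieces (the names are invented):

    import pytest

    @pytest.fixture
    def numbers():
        # the resource the test needs (here just a list, but it could be
        # a DB connection, an HTTP client, etc.)
        return [1, 2, 3]

    @pytest.mark.parametrize("value,expected", [(1, True), (4, False)])
    def test_membership(numbers, value, expected):
        # the test logic itself, run once per parametrized row
        assert (value in numbers) == expected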
If it were just a matter of adopting that way of thinking, and didn’t come with the things mentioned above that bother me, I’d be a wholehearted advocate of pytest.
One thing I think unittest is missing that really seals the deal for pytest is that pytest has a granularity level higher than “a TestCase class”.
Pytest lets you define session-level fixtures, lets you use markers to define “test suites” across multiple files without forcing package grouping, and generally lets you tweak your test runs without imposing many structural restrictions.
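A session-scoped fixture looks like this, for example (a toy case using a temporary directory; real ones are usually databases or external services):

    # conftest.py
    import tempfile
    import pytest

    @pytest.fixture(scope="session")
    def scratch_dir():
        # Set up once for the whole test run, shared by every test that
        # asks for it, and cleaned up at the end of the session.
        with tempfile.TemporaryDirectory() as path:
            yield path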
I agree about assertX being born out of necessity, though we have ended up writing helpers (stuff like assertCSVEquals to get more useful error messages than “these two strings are not the same”).
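(Not our actual implementation, but a rough sketch of the kind of helper I mean, written as a mixin for unittest.TestCase:)

    import csv
    import io

    class CSVAssertionsMixin:
        def assertCSVEquals(self, actual, expected):
            # Compare row by row so a failure points at the first differing
            # row instead of diffing two giant strings.
            actual_rows = list(csv.reader(io.StringIO(actual)))
            expected_rows = list(csv.reader(io.StringIO(expected)))
            self.assertEqual(len(actual_rows), len(expected_rows),
                             "CSV row counts differ")
            for i, (a, e) in enumerate(zip(actual_rows, expected_rows)):
                self.assertEqual(a, e, "CSV rows differ at row %d" % i)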
I think that pytest encourages writing nice, lightweight tests, and though unittest-style classes work well for grouping related tests, I think it starts getting a bit harder with cross-sections.
examples of “test suites a la carte” that are easy with pytest markers (rough sketch after the list):
run all my selenium tests
run my tests that require the database, but don’t require spinning up a sample user
list all my tests that require this fixture, but not this other one
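Roughly, that looks like this (the mark names are made up; custom marks should be registered in pytest.ini or pyproject.toml so pytest doesn’t warn about them):

    import pytest

    @pytest.mark.selenium
    def test_login_flow():
        ...

    @pytest.mark.database
    @pytest.mark.sample_user
    def test_user_report():
        ...

    # pytest -m selenium
    # pytest -m "database and not sample_user"
    # pytest -m selenium --collect-only -q   # list matching tests without running them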
I think that overall pytest is very accepting of modifications from the standard path, so I have never really felt “stuck” using it.
FWIW Django’s unittest-based test tooling allows you to do something very similar to pytest marks – Django does it via the decorator django.test.tag, which can be applied either to a TestCase class as a whole or to individual methods, and then its test runner accepts a list of tags to include/exclude in the run. The implementation isn’t that complex. So I don’t have a need for pytest for that.
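For example (the test names are made up):

    from django.test import TestCase, tag

    @tag("selenium")
    class LoginFlowTests(TestCase):      # tag the whole class...
        @tag("slow")
        def test_full_checkout(self):    # ...or individual methods
            ...

    # ./manage.py test --tag=selenium --exclude-tag=slow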
Similarly, non-class-based unittest-style tests are implementable fairly easily using unittest.FunctionTestCase – which lets you pass in custom setup/teardown as well. Or you could write an extension of unittest.TestLoader that just natively scans a module for functions instead of classes/methods. So again, it’s not that this is something that pytest is uniquely able to do.
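A quick sketch of the FunctionTestCase route (the function names are made up):

    import unittest

    def connect():
        ...

    def disconnect():
        ...

    def check_addition():
        assert 1 + 1 == 2

    # Wrap a plain function, with optional setup/teardown, as a test case:
    case = unittest.FunctionTestCase(check_addition, setUp=connect, tearDown=disconnect)

    # and hand it to the standard machinery via the load_tests protocol:
    def load_tests(loader, tests, pattern):
        tests.addTest(case)
        return tests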
The things that truly are unique about pytest, unfortunately, are mostly the things I personally don’t like about it (such as the implicit dependency-injected fixtures).
I have mixed feelings about pytest, but it has one killer feature: it displays warnings, including deprecation warnings, by default. That’s enough that I recommend it strongly at $DAYJOB — otherwise, only the most savvy Python programmers ever see deprecations.
It’s also great that it doesn’t force an awful subclassing-based API on its users.