Oh wow, I just tested it, he’s right. How on earth did this get through QA?
My question exactly! I’m not surprised people aren’t convinced it’s a real bug until they test it; it’s pretty crazy :(
How do you know that it did get past QA? A QA Engineer’s job is to report on software quality, not to decide whether or not to ship a product. There could be a reported failure of a test case specifically for this issue, followed by an administrative decision to release the product with the failure anyway.
You might be tempted to ask ‘how on Earth did this get past the Developer who wrote it?’, but you can’t really assume that it got past that person either. Software is complex, and a small change made by someone else at a later date, somewhere far away in the code tree, could have an unintended effect on code that worked just fine when it was first written and tested.
It would be interesting to know what the ultimate root cause of the problem was. It’s too bad that outsiders don’t usually get to see that information, because there can be a lot to learn from it.
A QA Engineer’s job is to report on software quality, not to decide whether or not to ship a product.
If the QA department doesn’t own the release, then they aren’t Quality Assurance, they’re merely Quality Information Reporting Staff or something.
I guess I’ve only worked at places where the QA org owns the release, so your comment really surprised me. There have been places that explicitly had NO QA at all, and those had quality as bad as you would expect.
That’s the ideal situation, but is it common? Release dates and feature lists are often very “political”, and at many places QA isn’t allowed an outright veto as a result. In games, for example, QA can at best argue for a release date to slip, or alternatively for scope to be cut given time/resources. But since slipping release dates or features promised in trailers failing to materialize are both seen as bad PR and seriously impact the entire marketing strategy, someone fairly high up in management would have to OK it, and there’s often substantial pushback. Management would weigh QA’s input as just one of many sets of pros/cons, and it ends up being a judgment call about which is the least-bad of the options available at that point: “miss the Christmas season”, “ship for the Christmas season but without promised features”, or “ship for the Christmas season with promised features, but they’re horribly buggy”.
Of course you can say games are “just entertainment”, but this tale from SGI, among others, sounds pretty similar.
That’s the ideal situation, but is it common?
I don’t know if it’s common (I suspect not), but where I work we do this! When I (as team leader) am happy with a release (all important items for that version complete, code reviews done, etc.), my response is always that it’s ready to go when QA say so.
That said, we’re talking about YouTube here. I would be both surprised and disappointed if they chose to ship with QA having raised bugs like this one. My guess is that they didn’t know about this and something wasn’t tested thoroughly.
I’m sorry to hear that you’ve only worked at places like that. It’s like working with teams that don’t use source control, or that never do unit tests, or that can’t version a release, or that have no QA at all. Sometimes you’ve got to do it, but it is generally agreed that it isn’t a recipe for long term success.
A Quality Assurance team is an information reporting staff. It’s right there in the name.
‘assure’ - 1. tell someone something positively or confidently to dispel any doubts they may have.
Some organizations think they can create a team that can force a high quality release by finding and eliminating all of the bugs, a team that refuses to ship until the bugs are all gone - a Quality Ensurance team.
ensure - 1. make certain that (something) shall occur or be the case
Forming a QE team is a bad gambit. The only way for them to be certain of a bug not going into production is to never ship code. If they do ship code, they become the scapegoat at the first sign of trouble. (“How on Earth did this get past the quality team?”) It’s an inherently conflicted group.
Without going into a long dissertation on it, your business will be a lot better off, and you’ll ship better quality software if you have an honest quality reporting team (a QA team) providing actionable data up the chain of command.
The Capers Jones view of the world (inspections, static analysis, formal testing…) has led to some of the highest quality software I’ve ever been a part of, and it has the research data to back it up.
Your straw-man scenario, a company culture that scapegoats QA for total engineering failure (bugs), is one that I have never experienced. I’d love to hear more about your experience and why you feel so strongly about QA not owning releases; otherwise I may be confused about what bothers you so much about QA ownership of releases.
I have a connection at YT. I was able to repro, so I sent it along. Fingers crossed!
The only workaround for this bug that I know of is to shorten all your links. I hate this because it looks scummy like you’re trying to hide something.
Also because it’s really annoying when the URL shortening service shuts down… ;(
This is a problem wherein developers don’t build in bug reporting or don’t make bug reporting easy because it’s seen as something that should only be used for beta testers or in-house.
This number is unknowable, but I can imagine there are plenty of people who find weird bugs or edge cases in Facebook, Youtube, whatever, but because there isn’t an obvious way to report the bug, they just never do.
As evidenced by app stores, there are lots of regular people who will write detailed accounts of bugs they are experiencing. Only instead of getting an email about it, you get it in the form of a public rant and a one-star rating.
Have you ever run a popular public service with a bug reporting address? I created security@[business].com on our About page so whitehats could privately report security problems. We immediately got dozens of support emails per day from users who wanted to unsubscribe from mailing lists, etc. It’s intractable. The signal to noise ratio is unbelievably low.
This is definitely a problem, but I think it can be reduced. For example, if the same page that had that email address on it made it clear how to unsubscribe, maybe it’d be used for that less?
I understand it’s hard supporting lots of people. For example, if there were a contact email address at the bottom of every YouTube page, it’d become quite expensive to triage; but I don’t believe making it impossible to contact anyone to report what might be serious issues is the best (or only) fix.
What other strategies have you found for effectively reporting problems to large companies?
Also in high demand if anyone has good ones: What strategies have you found for large companies to effectively receive problem reports?
Blog posts and HN seem to be the only way ;-)
Disqus ignored this issue until I blogged about it, and a Google employee has now responded to my forum thread about this issue since I blogged this one too!