What this article fails to mention is that none of the popups demonstrated are necessary, and hence of dubious legality. Rather than design a clear cookie consent banner, just defer displaying it until it is necessary. And no, your Google AdWords cookie is not necessary.
It is also ironic that the article itself is covered by a floating banner:

To make Medium work, we log user data. By using Medium, you agree to our Privacy Policy, including cookie policy.
Plus the idea that these bad dialogs exist because no one’s designed a better one is just … hopelessly misguided. Offering a better alternative won’t make sites switch to it, because they’re doing what they’re doing now because it’s bad, not because it’s good.
Yes. It’s basically a form of civil disobedience.
Basically operating on game theory, hoping the other sites don’t break ranks.
Uhhhh… that’s an odd analogy.
Civil disobedience is what you do when the law is unjust or immoral; this is more like “the law doesn’t allow us to be as profitable as we would like so we are going to ignore it.”
Civil disobedience doesn’t have to be ethically or morally just.

civil disobedience, also called passive resistance, the refusal to obey the demands or commands of a government or occupying power, without resorting to violence or active measures of opposition; its usual purpose is to force concessions from the government or occupying power.
Hm; never thought about it that way but fair point! It feels a bit off to compare corporate greed with like … Gandhi, but technically it fits.
Yeah, I find publishing an article like this through Medium a bit ironic and hypocritical.
I’m targeting designers, not companies, in this article. I need to write another article about companies. I think each designer is responsible for their own design and must ensure that their design is ethical.
Yes, but I doubt these dark patterns are being created without the company’s knowledge and explicit direction. So the ethical designer has to be prepared to walk. I guess that’s always the case but likely no one particular dark cookie pattern would seem bad enough on its own to be worth making a stand. But that’s how they get you I guess.
I think @geocar was spot-on in his comment here: this is a social and economical problem, not a technical problem. It can’t be solved by technology.
Better UX is opt-in and, like all UX choices, lends itself easily to blanket statements like “users value efficiency”, which can then be easily distilled into “we want to make it easy for users to get to content so, uh, we just let them opt-in without any hassle, yeah”.
IMHO the right way to “fix the unethical design of cookie consent windows” is 100% non-technical: teach users that websites that don’t have clearly labeled “opt out” and “opt in” buttons, with the default being out, are unsafe. Not “spyware”; that gives rise to all sorts of relativistic talk about what is and isn’t spyware. Unsafe is good enough, and we can get a new catchy term for it if we really must, like trackware or whatever. If they aren’t doing anything shady, then they have no reason not to be upfront about it. Sure, the fact that they’re not being upfront about this doesn’t mean they’re doing anything shady with your data, but why take the chance?
Plenty of nagware back in the day really didn’t do much other than nag you, and we still treated it as unsafe. The fact that a company engages in data siphoning while also holding a thick stack of documents written in the peculiar Legalese dialect of English shouldn’t really matter.
It’s not impossible. There’s an entire generation of Internet users, including 100% non-technical people like my parents (literally), who understands that those “chat to hot singles in your area” banners are bad. We should teach the next generation of Internet users that websites with weird cookie consent windows are bad. That is the kind of thing that makes media owners improve them, not UX advice.
It’s not like this is cold fusion technology: we’ve known how to make efficient “yes/no” dialogs for like thirty years now, so there is no lack of knowledge on how to make clear cookie consent dialogs. The ones currently in use don’t have “bad UX” because the designers can’t figure out a better one – they have exactly the UX that they need to have. By the internal metrics used for promotion and bonus payment, they’re excellent, and the designers who came up with all this crap earned them. If you’re one of them, this whole article is literally bad advice.
I realize this would hurt the Internet advertising industry. That would break my heart only slightly more than something that would hurt the tobacco industry.
“Teaching users” is the wrong approach. Users cannot be taught. Make it a crime to collect unnecessary data.
Teaching all users is probably impossible, as evidenced by the fact that e.g. phishing is still a thing. However, teaching enough of them that tracking becomes unpalatable for mainstream companies is, I think, not just easier to achieve than modifying legislation in a useful manner, but also much more efficient.
Media and advertising companies have a lot of political influence. Even the GDPR, which is neither very strict nor anywhere near as harsh as it should be IMHO, was adopted with some difficulty and against media campaigns that had eye-watering budgets (think prime-time talk shows where “analysts” took turns explaining how the GDPR is going to make it impossible for European companies to sell goods and services).
Even if this could be achieved, I think one of the things that we’ve learned after 30+ years of cyberspace regulation is that regulation and legislation are inherently reactive. By the time an abusive practice, like online tracking, becomes so problematic as to not only require regulation but also overcome the resistance of the industry lobby, it’s already so tightly ingrained that it’s hard to root out, and it has produced so much industry expertise that the industry will quickly wiggle its way around any legal definition and any legal measure. In this case, I bet any attempt to criminalise the collection of unnecessary data would quickly be rendered useless in court by small tweaks that circumvent the definition of “necessary”. User education, on the other hand, is proactive: once a critical mass of users thinks data collection is fishy, attempts to circumvent that wisdom will seem even fishier.
Laws are good, and it’s bad to be fatalistic about the possibility of making good laws.
The CAN-SPAM Act didn’t can all spam, but it did drive spam underground and ensure that every legitimate newsletter has a reasonably easy-to-find unsubscribe link. It was very effective and makes my life better every day. We can and should make more laws like that.

I also speak Japanese, and I will never be able to unsubscribe from a Japanese newsletter I signed up for ten years ago, because Japan doesn’t have an equivalent of the CAN-SPAM Act.
I’m not fatalistic about the possibility of making any good laws, just laws that make it hard for politicians to stay in power :). The CAN-SPAM Act is, in fact, my gauge for what kind of legislation can be passed and under what circumstances.
Back when it was passed, spam was recognized as a problem, but CAN-SPAM regulated just enough of it to be palatable – enough to make a difference but still so little that we jokingly referred to it as the You-Can-Spam Act. Even companies that were literally in the spamming business could generally adapt to it, unless they were already doing something illegal and the spam was just the front.
Criminalising unnecessary data collection, on the other hand, would make the entire business model of even some large companies illegal, and it’s worth noting that, at least in parts of the EU, some of these companies are either the owners of, or are tied to the owners of, various important media outlets. For an MP, supporting this kind of legislation would be the equivalent of committing media suicide. Unnecessary data collection is also one of the big facilitators of modern election campaigns – it’s unlikely that political parties would be willing to support a move against it.
I’m not at all opposed to the passing of laws, and I agree that it would be better to have legislation that makes unnecessary data collection illegal. But I don’t think this is possible in a 5-10-year timeframe.
The cookie consent window is one of the worst things to happen to the internet. I get the idea but the execution just annoys everyone on all sides of the aisle, and I doubt people feel any safer (if they cared at all in the first place).
Why would you consent to being tracked, other than to make the pop-up go away?
GDPR should protect people who don’t understand or don’t care, not the 1% of users who actually read the pop-up.
That’s part of the goal. Most people wouldn’t consent to being tracked, so the law requires you to disclose the unethical behaviour. This article is important in light of the recent lawsuit that affirmed that most of the ‘dark patterns’ in this article are not just unethical, they are also illegal. The article says:

When designing the cookie (tracker) window, we only need to keep one thing in mind: transparency.
This is true from a usability perspective but it’s closely related to a critical legal concept: informed consent. Under the GDPR, you may track people if and only if they have explicitly consented and understand what they are consenting to.
More importantly:

Another point we need to pay attention to is the name “cookie”. The real name of these “cookies” is trackers. Their main purpose is to track you. Therefore, it would be more correct to use the name “tracker”. Better not hide it behind a sweet word.
This is also really critical. You don’t need consent to use cookies. You need consent to track visitors. The mechanism that you use is irrelevant. If you use cookies, HTML5 local data, local cache timing from JavaScript, whatever, it’s the effect that requires consent, not the mechanism.
Companies that rely on tracking people have had a big marketing push to try to frame these things as ‘cookie banners’. They are not. They are consent to be tracked banners. If, like GitHub, you don’t record any linked or linkable data for visitors that have not logged into the site, then you don’t need a pop-up banner at all.
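The mechanism-vs-effect point above can be sketched in a few lines of JavaScript. This is an illustration rather than real browser code: the two storage backends are plain Maps standing in for `document.cookie` and `localStorage`, but the tracking logic is byte-for-byte identical either way, which is why consent attaches to the effect (a stable cross-visit identifier) and not to the word “cookie”.

```javascript
// The tracking logic is identical no matter which storage mechanism
// backs it. The backends here are stand-ins for document.cookie,
// window.localStorage, or any other persistence channel.

function makeTracker(store) {
  return function visitorId() {
    let id = store.get("visitor_id");
    if (id === undefined) {
      id = Math.random().toString(36).slice(2); // pseudonymous identifier
      store.set("visitor_id", id);
    }
    return id;
  };
}

// Two different "mechanisms", same effect:
const cookieJar = new Map();   // stand-in for document.cookie
const localStore = new Map();  // stand-in for window.localStorage

const viaCookie = makeTracker(cookieJar);
const viaLocalStorage = makeTracker(localStore);

// Both identify the visitor stably across "visits":
console.log(viaCookie() === viaCookie());             // true
console.log(viaLocalStorage() === viaLocalStorage()); // true
```

Blocking one channel while leaving the others open changes nothing about the effect, which is the GDPR’s actual target.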
A minor nit:
You actually don’t even need consent to do this.
What is going on is that you need consent to transfer customer data to a third party, when it’s not specifically required to service the users’ intentions:
Companies that rely on tracking people have had a big marketing push to try to frame these things as ‘cookie banners’. They are not. They are consent to be tracked banners.

This is exactly right.
The directive cited by the ad networks that require their publishers to hoist this machinery into the user’s view, in order to get those higher-paying “targeted” impressions, is the ePrivacy Directive, and it’s not even law yet. Maybe 2022 is the year, but the reality is that the GDPR requires that users actually consent to stuff that isn’t to their benefit, and the bid to the regulators is that these forms (a) actually obtain consent to do nasty things and (b) provide the best technological measure (another cookie!) for communicating and distributing that consent. So far, I don’t think the regulators agree, but I also think Europe is a bit distracted at the moment, so I don’t expect it to be resolved very soon either.
We need to keep reminding people that these forms are about a site’s intention to do something that isn’t to their benefit, not about complying with some stupid euro-law. So when they see one, they should consider leaving, or failing that, at least use private browsing/container browsing to keep the data-sharing impact to a minimum.
What is going on is that you need consent to transfer customer data to a third party, when it’s not specifically required to service the users’ intentions.

You also require consent to store the data. In some cases the consent is a side effect of some other action (if I create an account on your web site, then there is implied consent for you to retain my username and profile). The missing part for a lot of sites is that you must also provide a mechanism to withdraw consent. If all personal information is tied to an account, then a delete-account action is sufficient here, as long as it actually deletes the data tied to the account. But if you’re collecting information about visitors to your site who don’t create an account and you don’t have a mechanism for them to request deletion, then you may also be in violation.
It is legitimate to store traffic data to deal with security issues, app crashes and so on. You need consent to process them for other purposes not immediately related to providing the service. E.g. to target content, unless you are in fact a recommendation service.
Also, you cannot ask people to delete their account to opt out of profiling that can be enabled separately. You must let them use the service without it, if they so wish.
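The withdrawal-of-consent point above can be sketched minimally. The in-memory “tables” below are invented for illustration; the principle is that a delete-account action only satisfies the requirement if it purges every store keyed by that user, not just the login record.

```javascript
// Sketch: deleting only the user row would leave linkable personal
// data behind; every store keyed by the user must be purged too.
// These Maps are hypothetical stand-ins for real database tables.

const users = new Map();           // userId -> profile
const analyticsEvents = new Map(); // userId -> [events]

function createUser(id, profile) {
  users.set(id, profile);
  analyticsEvents.set(id, []);
}

function deleteAccount(id) {
  users.delete(id);           // the obvious part
  analyticsEvents.delete(id); // the part many sites forget
}

createUser("u1", { name: "Alice" });
deleteAccount("u1");
console.log(users.has("u1"), analyticsEvents.has("u1")); // false false
```

Leaving out the second `delete` is exactly the “delete-account action that doesn’t actually delete the data” failure mode described above.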
The law got watered down because there’s a multi-trillion-dollar adtech surveillance lobby fighting it to the death.

Keep in mind that most ad cookie popups are intentionally annoying. The surveillance lobby wants you to associate privacy with annoyance, so that you will blame their malicious (non)compliance on the law and demand fewer privacy protections from your legislators.
I will go further than that: having consent as a legal base for processing in GDPR is a mistake.
People often misunderstand GDPR as “you need consent to process user data” but that’s completely wrong.
Consent is one of the 6 legal bases for processing, the other 5 (and two in particular: execution of contracts and legitimate interest) are enough in most legitimate cases. So let’s just get rid of consent.
I am not sure. It might steer people into signing risky contracts to “get better prices” at stores. Some shops might pull shady crap like periodic membership fees once it becomes normalized to sign contracts online. Then, after a couple of years, sue with interest.
But conceptually I agree with you.
Instead of pop-up dialogs, the browser should be able to automatically select the desired behaviour for you. Which means it should always deny any tracking. If one really wants to be tracked, they can override that decision manually for each domain.
Update: Someone else already suggested the same in a comment down below.
Instead of diddling with browsers, we should just make it illegal to track users for purposes of advertising or to sell data about users to third parties.
I think diddling with browsers has shown itself to be much more productive at blocking trackers than the law.
My favorites are the ones Google News often sends me to. They work on desktop, but on mobile you can only see an option to accept all.
I don’t understand why this hasn’t been solved by browser manufacturers. Couldn’t there be a browser API for requesting cookie permissions? Then:

The browser manufacturers would be less likely to use dark patterns, because their incentives are aligned more with users than with sites.

The browser could enforce the user’s selection automatically, without relying on the site to have special infrastructure in place.

You could give the browser a policy to apply to every web site, and never have to deal with this again.
I know this by itself wouldn’t solve the problem, since at first web sites would still (I assume) be required to use banners by law. But if the technology is widespread in browsers, the law can eventually be amended. “If you build it, they will come”.
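A hedged sketch of what such an API might look like. To be clear, no browser ships anything like this today; `requestConsent` and the purpose names below are invented for illustration. The idea is that the user configures a policy once in the browser, and the browser answers every site’s request from that policy, with no site-drawn banner involved.

```javascript
// Hypothetical sketch only: this models a browser-side consent API.
// In a real browser this would live behind something like
// navigator.requestConsent(); here it's a plain function so the
// shape of the idea is runnable.

// The user's one-time, browser-wide policy:
const userPolicy = {
  "strictly-necessary": "granted",
  "preferences": "granted",
  "analytics": "denied",
  "advertising": "denied",
};

// What the browser would do when a site asks:
function requestConsent(purposes) {
  const grants = {};
  for (const p of purposes) {
    // Unknown purposes default to denied -- the safe choice.
    grants[p] = userPolicy[p] ?? "denied";
  }
  return grants;
}

// A site asks once, with no banner of its own:
const grants = requestConsent(["strictly-necessary", "advertising"]);
if (grants["advertising"] === "granted") {
  // only then load the advertising/tracking scripts
}
```

The design choice worth noting is the deny-by-default fallback for unrecognized purposes: it removes the incentive to invent new purpose names to slip past the user’s policy.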
Couldn’t there be a browser API for requesting cookie permissions?

You’ve fallen into the trap of the ad companies. You do not need consent to store cookies on a client device. That is the lie that they are pushing by calling these things ‘cookie banners’. If you require explicit user action to store cookies, then:

You block some legitimate things. For example, DuckDuckGo stores your user preferences in a cookie. This contains the same text string that you can append to the query URL. It is not unique to you in any way (assuming a sufficiently large anonymity set).

You don’t block non-cookie forms of tracking.
That’s precisely the response that the ad vendors want: it harms everyone doing non-evil things and causes backlash against privacy regulations, and it doesn’t stop them from tracking you using any of the other forms of fingerprinting available to them.
I understand these criticisms and they’re valid, but I don’t currently see a realistic alternative that would be better - do you know of one?
The current state seems pretty good to me: sites that spy on their users have to pop up something that annoys their users. Sites that don’t spy on their users don’t. This provides an incentive for people to move. Half the time I see a consent pop-up, my reaction is to just close the tab and go somewhere else for the information. If anything, I’d like to see browser manufacturers (at least, the ones that aren’t Google, since it’s counter to their interests) make these pop ups more annoying.
If Google did not have a massive conflict of interest here, the obvious thing to do would be to penalise sites with consent pop-ups in their search rankings. Other search engines could do that and drive traffic to sites that didn’t spy on their visitors.
The fallout from the recent court case that ruled that some of these ‘dark pattern’ consent pop-ups did not constitute informed consent is likely to be interesting, because it means that sticking up an obnoxious banner that drives people to hit ‘I consent to whatever you want because I don’t understand what’s going on’ is not, in fact, the CYA move; it’s a way of increasing your liability.
Make tracking completely illegal.
I like this solution from a technical standpoint, but I’m not at all convinced that the interests of browser manufacturers align with those of the users rather than the sites. That’s true of Mozilla, to some degree, and maybe of Apple if we’re willing to stretch it a little. But the incentives of Google, which makes the dominant browser today, are pretty much aligned with those of the websites, not those of the users. Real web privacy would make Google pretty much bankrupt.
Related to this subject: https://www.i-dont-care-about-cookies.eu/ is essential to browse the web nowadays.
I am always sad when people recommend this, because it explicitly grants companies permission to spy on you. Consent-O-Matic exists if you want to automate clicking through the tracking banners; it will go through and click all of the deny buttons for you.
It tries to hide the banner first but yeah, Consent-O-Matic looks better, thanks!