I can’t afford what is likely tens of thousands to go through all the legal and technical hoops over a prolonged period of time
Where does the author get that “tens of thousands” figure from?
I can totally understand not wanting to bother with extra bureaucracy for a volunteer-run forum… but it’s just a risk assessment, it’s not that much work. The forum is already moderated so I doubt there’s anything extra to do besides fill in one risk assessment and perhaps check to see if it needs updating once a year or so.
Again, I can understand not wanting to go through the hassle of writing down a formal risk assessment, especially for a forum that you run as an unpaid volunteer, but it’s a bit disingenuous to suggest that this is an enormous legal or financial burden.
It isn’t “just” a risk assessment. The forums allow image uploads, and the law requires scanning for CSAM. It isn’t clear such scanning is possible for a site like this.
Also, they are fairly clear where they reckon the burden will come from:
disproportionately high personal liability for me, and one that could easily be weaponised by disgruntled people (trolls) who are banned for their egregious behaviour
Where did you get that from? I know the author claimed that but I don’t think it is true. I have read Ofcom’s published guidance and it does not say this.
Hash-based scanning will only be required for services deemed to be at high risk of facilitating the sharing of CSAM. Ofcom have published detailed guidance for performing a risk assessment, which I think makes clear that a forum such as LFGSS would not be in this category.
Just to be clear, I’m not saying that the author is wrong to shut down the forum. I completely understand not wanting to risk liability. I just think it is worth trying to be accurate when discussing the law.

This Ofcom doc, the table spanning pages 15-16.
While the categories (pages 1-2) call Lobsters and LFGSS a “smaller site”, they’re also both “Multi-Risk services”, which may require ICU C9 and ICU C10 depending on whether they have 700K UK users or are a “file-storage” or “file-sharing” service, which I can’t find a definition for. Even just following the cross-references in these summaries to start understanding which rules apply is a significant project, and they contain enough legal jargon or implicit legal knowledge that I wouldn’t trust a non-lawyer’s summary.
One of my go-to examples is that occasionally you’ll see a law or legal document that mentions it’ll evaluate whether a reasonable person would do such-and-such. As a non-lawyer that’s easy to read right over and go “sure, of course, no reasonable person would drive on the sidewalk” or whatever the topic is, and feel like you’ve understood the law. But “reasonable person” is actually a 500-year-old bit of legal jargon that depends heavily on case law particular to one’s jurisdiction and intersects with dozens of other terms. It requires significant study to understand, and probably research specific to the particular legal scenario. A tiny, seemingly-obvious phrase can be a project in itself, and such phrases sit within a set of principles and exceptions that are not at all the defaults a programmer’s intuition would suggest.
So, when Ofcom “recommends” standards based on categories, I’m not confident I understand how much legal liability that’s imposing, let alone that I understand the categories. I am not trying to say we have to assume the most draconian definitions and expected enforcement. I’m saying that I’ve been caught out by these easy misunderstandings so many times that I think even understanding these laws is a large project requiring professional expertise, and I am particularly skeptical of the interpretations of laws and contracts written by programmers, including myself.
That’s exactly the document that I think shows LFGSS wouldn’t have to do this. The third column of that table, “Who should implement this”, does not include any category containing LFGSS. Does it?
I hit “Post” too quickly and greatly expanded my post to make a larger point about understanding the laws. To explain a confusing situation: this response by c– was posted when my comment was only the opening sentence about the one table; everything else appeared a few minutes after their reply.
EDIT: Also, yes, LFGSS is likely a file storage/sharing site because it has image uploading. That feature is also in development for Lobsters, to replace a third-party dependency for avatars.
I don’t disagree with your position regarding legal categories at all.
You said:
I am not trying to say we have to assume the most draconian definitions and expected enforcement.
I’m glad. My comments here were a reaction to the author doing just that. I don’t claim to know precisely how this new legislation will play out in the courts, or how Ofcom will actually enforce it, but I don’t think it is helpful to just assume that it is an awful, draconian measure that will place ridiculous burdens on people.
You’re right that we don’t yet know exactly how this will work. But a reasonable reading of Ofcom’s guidance does not say that, for example, every site with image uploads must implement hash-based CSAM scanning. I mean, the paragraph after the table you cite explicitly explains why Ofcom are not recommending that.
Similarly, Ofcom themselves say:
Providers’ safety duties are proportionate to factors including the risk of harm to individuals, and the size and capacity of each provider. This makes sure that while safety measures will need to be put in place across the board, we aren’t requiring small services with limited functionality to take the same actions as the largest corporations. Ofcom is required to take users’ rights into account when setting out steps to take.
My optimistic gullibility might be completely wrong but I just don’t think it is helpful to assume the worst (and especially not to post that as fact).

I think you should’ve hatted this comment.
So, when Ofcom “recommends” standards based on categories, I’m not confident I understand how much legal liability that’s imposing, let alone that I understand the categories.
I think my skepticism of programmer lawyering is similar to yours. But I’ll add what either amounts to a little of it anyway, or what is intended as a question for a real lawyer.
They unambiguously intend their document to be readable by and usable for non-lawyers and even non-UK-persons. Does attempting to comply based on a lay understanding of the categories outlined provide any shield? I would think geoblocking UK IPs would be a decent shield, but I sure hope it doesn’t come to that.
It seems like a site like lobste.rs is so far outside the reasonable intent of this law that it shouldn’t come into play. But I’m not the one running it and I don’t go to the UK, so my arse landing in gaol doesn’t register as much of a risk. I really hope this gets clearer before it becomes a considerable risk for you.
To elaborate on fanf’s point, access to APIs to scan for CSAM based on photo hashes has pretty steep requirements because of the risk of infiltration. CSAM groups have demonstrated considerable technical sophistication to evade law enforcement. Giving one the ability to test against photo hash databases + algorithms would be real bad.
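For illustration only, here is a toy sketch of what hash-based matching looks like conceptually. PhotoDNA and the industry hash lists are deliberately proprietary and access-controlled for exactly the reason above, so everything below (the average-hash scheme, the threshold, the blocklist) is a hypothetical stand-in, not the real system:

```python
# Toy sketch of hash-based image matching. Real systems (e.g. PhotoDNA)
# use proprietary perceptual hashes and access-controlled hash lists,
# precisely because revealing the algorithm would help evaders.

def average_hash(pixels: list[list[int]]) -> int:
    """64-bit "average hash" of an 8x8 grayscale thumbnail (values 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash: int, blocklist: set[int], threshold: int = 5) -> bool:
    """Flag images within `threshold` bits of any known-bad hash; a small
    threshold tolerates re-encoding and resizing but not substantive edits."""
    return any(hamming(image_hash, known) <= threshold for known in blocklist)
```

The hard part isn’t this code; it’s getting (and safely holding) the hash list, which is why access to it is gated.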
This is a really hard problem; however, IFTAS now has offerings around it: https://about.iftas.org/activities/moderation-as-a-service/content-classification-service/

@pushcx, in running Lobsters do you experience similar legal hoops and user threats as Velocio, the LFGSS admin?

Not to minimize the effects of this act on site operators based in the UK, but I don’t really see how Ofcom can act against sites outside the UK, other than demanding that access from the UK be restricted.
Also, while I am not intimately knowledgeable about regulation in the UK, if it’s anything like the common-law system in the US, the law is written quite broadly but can be pared down via lawsuits. Of course, no one wants to be the other party in such a suit.
The international reach of the UK OSA is similar to the Australian OSA I talked about on a recent office hours stream. I’ve worked closely with product legal counsel and have friends + family who are lawyers, so I have some idea how lawyers work and what they’re concerned by, but I’m still not a lawyer, let alone in other countries. I’ve read this linked post and started reading the UK law + Ofcom pages (for fellow Americans, Ofcom seems roughly equivalent to the FCC, with a similar pattern of the law delegating specifics to them as a regulating authority), and much less is determined about the AU law. There have been similar laws in poor despotic countries, but those have been so unlikely to be enforced against the site that I’ve ignored them; wealthy anglosphere countries that might score local political points enforcing laws against American Big Tech/hegemony/politics/culture are enormously different.
With those big caveats, both laws are clear that they claim authority over this forum if it is accessed by individuals in their countries, even though it is a noncommercial hobbyist site run by an American on American servers. The time, resources, and money required to comply with these laws’ registrations and formalizations vastly overwhelm any hobbyist forum’s resources. While we mostly discuss software development, software now mediates intimate relationships so we’ve had infrequent mentions of sex and pornography; it’s quite possible both laws would apply their additional burdens related to pornography, CSAM, and sex trafficking. The costs for non-compliance with the laws are measured in the tens of millions USD, and the UK law includes criminal penalties. It’s not hyperbole to say that these two jurisdictions are outlawing the small mailing lists, forums, and software forges that have quietly been the bulk of online communities for the last 50 years. Only large commercial services can afford to comply with these regulations, especially given the ruinous risks to imperfect compliance.
So, what to do?
Well, for completeness/prebuttal I’ll mention some really shitty options that would result in the death of the community:
Shut down like LFGSS.
Migrate to a commercial forum host like Reddit or Facebook that’s large enough to afford to comply, torching the site’s history, culture, community, and functionality. This would be shutting down with extra steps.
Ignore the laws and continue normally. Both laws would allow any troll to post something that violates the law and report the site to a regulator for destruction.
Burn a few thousand hours of volunteer time attempting to understand and get set up with these laws, plus hundreds more on an ongoing basis. There is no chance of meeting these laws’ requirements without legal advice, which is especially expensive and uncertain around novel regulations with no case law. I’d expect that to cost at least $100k USD to set up and $10k/y thereafter. There’s also the option to make the project 10x larger and more expensive to provide a general guide to other hobbyist forums. While I happily foot the bill for Lobsters, these costs are quite a bit beyond my hobby budget and would force a lot of overhead and culture change to take donations, sell merch, or otherwise attempt to raise funds.
And then there are also some bad options:
Buy a subscription to an IP geolocation database and ban the UK and AU (a sketch of the check follows this list). Those aren’t perfectly accurate, and people in those jurisdictions might use a VPN (whether commercial or homebrew, like wireguarding to a US friend’s home connection), so the laws would still apply. The attempt to ban those jurisdictions might be enough of a good-faith effort that a regulator wouldn’t want to act against the forum, but there’s still significant risk here. I haven’t attempted to geolocate our users but I’d guess we’d lose 20% of active users.
Lobby the US government to pass some sort of home turf law that US entities can ignore laws like these, or create an international treaty to similar effect. This seems unlikely to succeed in the 5 months before the UK OSA takes effect. (This is not an invitation to start a political fight in the comments here, be cool.)
Lobby the AU/UK lawmakers to fix their laws. I know very little about their politics but don’t much expect them to welcome foreigners explaining they should rewrite their brand new marquee laws, so I’m going to guess this is less likely than the prior option.
Get a commitment from an American rights group like the EFF or ACLU that they’d defend Lobsters as a test case to establish case law protecting American/noncommercial communities from these laws. Being a test case would be a huge time commitment that’d probably kill my current entrepreneurial project but might be worth it. Well, assuming a win; I don’t think I’d be too satisfied if I give up on that and then also am bankrupted and go to prison in the UK.
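On the geoblocking option above: the check itself is simple; the problems are database licensing, accuracy, and VPNs. A minimal sketch, assuming you’ve exported the UK and AU ranges from a licensed geolocation database to a file of CIDR blocks (the filename and format here are hypothetical):

```python
import ipaddress

def load_blocked_networks(path: str) -> list:
    """Parse one CIDR block per line, e.g. "203.0.113.0/24"."""
    with open(path) as f:
        return [ipaddress.ip_network(line.strip()) for line in f if line.strip()]

# Hypothetical export from a licensed geolocation database.
BLOCKED = load_blocked_networks("uk_au_cidrs.txt")

def is_blocked(remote_addr: str) -> bool:
    """True if the client address falls inside any blocked range."""
    addr = ipaddress.ip_address(remote_addr)
    return any(addr in net for net in BLOCKED)
```

In practice you’d do this at the edge (CDN or web server) rather than as a linear scan per request in the application, and as noted it does nothing against VPN users.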
And then I don’t see any good options with low cost, low effort, and good chances of normalcy. These laws were written to regulate large commercial platforms and come with burdens that only extant large commercial platforms can afford.
I would especially appreciate it if folks could share links to writeups by lawyers in the relevant jurisdictions, groups of online communities that are faced with these laws, or statements by the regulators relevant to our kind of community (American, hobbyist, noncommercial). I don’t need programmer-playing-lawyer interpretations of the laws as I can generate those myself.

Thanks a lot for this reply. The situation is much more serious than I thought. Thanks for taking the time to research this.

Doesn’t the international reach require extradition or similar to enforce?
Of course. Or holding someone up when they cross the border into one of the mentioned countries. And I would love for @pushcx to be able to travel to Australia and the UK unhindered.
I imagine an alternate reality where @pushcx seeks refuge in an embassy to prevent being extradited to the UK, and is put on an international most-wanted list for operating a highly illegal technical discussion and link-sharing forum.
We are all suspects. Why don’t we use the big services like the other people that have nothing to hide?
Enforcement is expensive and subject to public scrutiny. Ofcom have been very clear about the purpose behind these regulations and it is obviously not to shut down community forums of this sort. The idea that they’d undertake an enforcement action against LFGSS is pretty silly.
However, that’s easy for me to say: I’m not financially liable if it turns out that I’m wrong. I can see why a volunteer forum administrator might not want to take the risk, however small it is.
I don’t think it’s silly at all. It’s an attack vector. Somebody who doesn’t like your site can upload something illegal and then report the site to the “authorities”. Repeat as necessary until the site is shut down.

I’m not sure how your proposed scenario is relevant to this new law. You can already report illegal material, without this law.

My understanding is that this law increases the severity of the consequences.
This law places certain duties on online service providers. Essentially, you have to show that you have considered how your site could be used to share certain categories of illegal or “legal but harmful” material. You must then put into place mitigations appropriate to your level of risk, such as content moderation, or a way for users to report illegal material.
The law doesn’t change the existing penalties for deliberate possession or dissemination of illegal material.
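As a concrete, hypothetical illustration of the kind of mitigation meant here: a user-facing report mechanism feeding a moderation queue. The names below are illustrative, not from the Act or Ofcom’s guidance:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ContentReport:
    content_id: int
    reporter_id: int
    reason: str  # free text in this toy version
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# The queue a moderator works through; persisting it would also serve as
# an auditable record that reports are received and acted on.
MODERATION_QUEUE: list[ContentReport] = []

def report_content(content_id: int, reporter_id: int, reason: str) -> None:
    """Record a user report of potentially illegal material for human review."""
    MODERATION_QUEUE.append(ContentReport(content_id, reporter_id, reason))
```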
I don’t disbelieve you, but I also doubt that you are a lawyer in the UK. This is a new law, apparently pretty broad and ambiguous in certain aspects of its scope. Its enforcement regime has not yet begun, so we can only speculate about that based on existing laws and cases. I can understand why small operators are worried about being made examples of, and the mere perception of that risk is enough to create a chilling effect.
The thing that surprised me the most was that the admin was running a full business just as themselves, without any business entity such as an LLC. While it’s possible to do this, at least in the US, it provides the least amount of protection to reduce risk. I’m surprised they didn’t set something up.

I think “full business” is a strong term; they run donation-supported web forums. Hardly a business.

I think even with donations (maybe especially so) incorporating as a business is a good idea, to avoid commingling funds.

The fundamental issue would remain, though: the act imposes compliance demands on volunteer web services.
It doesn’t have to be a large for-profit business to take advantage of a legal business entity.
The first reaction of the site admin was to shutter the site and close it completely. Maybe if they had a little legal protection they would feel more comfortable with another solution first.
Maybe “do more paperwork and pay more money to the government just in case, to reduce your risk of getting screwed over by the government someday” isn’t the model we want to encourage, especially absent evidence that it actually helps. It normalizes asking permission for things that need no permission.
The discourse meta has discussions about the effects of this as well.