Posted

https://www.theregister.com/2025/01/14/online_safety_act/?td=rt-3a

 

Analysis A little more than two months out from its first legal deadline, the UK’s Online Safety Act is causing concern among smaller online forums caught within its reach. The legislation, which came into law in the autumn of 2023, applies to search services and services that allow users to post content online or to interact with each other.

The government says it is designed to protect children and adults online and creates new legal duties for the online platforms and apps affected.

When one thinks of online harms – death threats, revenge porn, suicide encouragement etc. – one thinks of the largest global platforms and services. But over the decades small hobbyist forums have sprung up on even the most niche topics, while retailers often allow customers to chat to each other to share ideas, challenges, and solutions.

Estimates suggest 100,000 such services will have to comply with the act, and some feel they don't have the resources to do the compliance work, or that their content is not relevant to the kind of harm the law is designed to prevent.

 

Who has to publish a summary risk assessment on their website? Sites with 34 million or more monthly active UK users. The threshold is lower for services that allow users to forward or reshare user-generated content and use a content recommender system: 7 million or more UK users. Organizations with a smaller number of users will simply have to have the document ready should Ofcom request it.
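The publication threshold described above is essentially a two-branch conditional. A minimal sketch, using the 34 million / 7 million figures as quoted (the function name and parameters are illustrative, not anything from the Act itself):

```python
# Illustrative sketch of the publication thresholds quoted above.
# Figures (34M / 7M UK users) are as reported; this is not legal advice.

def must_publish_summary(uk_monthly_users: int,
                         allows_resharing: bool = False,
                         uses_recommender: bool = False) -> bool:
    """True if the service must publish its summary risk assessment on
    its website, rather than merely hold it ready for Ofcom on request."""
    if uk_monthly_users >= 34_000_000:
        return True
    # Lower threshold applies only when the service both lets users
    # forward/reshare user-generated content AND runs a recommender system.
    if allows_resharing and uses_recommender and uk_monthly_users >= 7_000_000:
        return True
    return False

# A small hobbyist forum: the document must exist, but need not be published.
print(must_publish_summary(50_000))                 # False
# A resharing platform with a recommender feed and 8M UK users:
print(must_publish_summary(8_000_000, True, True))  # True
```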

Posted

There will be "boilerplate" forum risk assessment documents available, no doubt. Compliance would be a copy-and-paste of one of those (and maybe even actually publishing it, even though there's no need). I took a look at the link, and here is a snippet:

 

What activity is the Online Safety Act trying to prevent?

  • offences related to information likely to be of use to a terrorist and offences relating to training for terrorism
  • hate offences such as stirring up of racial hatred offence and stirring up of hatred on the basis of religion or sexual orientation
  • sexual exploitation of adults such as causing or inciting prostitution for gain offence
  • human trafficking
  • assisting or encouraging suicide offence
  • the unlawful supply, offer to supply, of controlled drugs, and the unlawful supply, or offer to supply, of articles for administering or preparing controlled drugs
  • weapons offences such as offences relating to firearms, their parts, ammunition, including air guns and shotguns

 

Since the forum is more or less at the "family friendly" end of the internet spectrum, it's pretty easy to align the forum's rules to comply with the above without altering the utility of the forum as-is. Put simply, there will be an end to unmoderated forums (for example, Thunderboat wouldn't have survived were it still going), but for this one, no real issue.

Posted
57 minutes ago, IanD said:

https://www.theregister.com/2025/01/14/online_safety_act/?td=rt-3a

Analysis A little more than two months out from its first legal deadline, the UK's Online Safety Act is causing concern among smaller online forums caught within its reach. [...]

 

I can't see this as being a problem for CWDF, since there is a merry band of Mods who spring into action when the wrong thing is said about toilets. No doubt there will be a template "risk assessment" available online that can be slightly amended to fit.

Posted (edited)

Maybe it will be the end of the Political Section, which does degenerate into personal threats and insults, though I notice it is far more civilised when a very few posters are not contributing. I suppose the question is: is any extra modding effort worth it, or should that area be shut down?

 

I still can't understand why so-called platforms that facilitate the publishing of user-generated posts cannot be subject to the same legal sanctions as the press and individuals, so they can be sued in UK courts or otherwise legally sanctioned. As long as offending content was taken down within X hours of being reported, no further action; if not, they would be as liable as the originator.

Edited by Tony Brooks
Posted (edited)
49 minutes ago, Paul C said:

There will be "boilerplate" forum risk assessment documents available, no doubt. [...] Put simply, there will be an end to unmoderated forums (for example, Thunderboat wouldn't have survived were it still going), but for this one, no real issue. [...]

I don't think the rules themselves will pose any big problem for CWDF, the only issue is how many hoops the site owner has to jump through to meet them -- which seems to be just preparing a risk assessment document. Hopefully not too difficult, just a PITA... 😉 

 

43 minutes ago, Tony Brooks said:

Maybe it will be the end of the Political Section, which does degenerate into personal threats and insults [...] I still can't understand why so-called platforms that facilitate the publishing of user-generated posts cannot be subject to the same legal sanctions as the press and individuals [...]

I don't think the political forum is any riskier than the others; there's nothing in the Online Safety Act saying that people have to be nice to each other, and all the things referred to above are already grounds for posts being deleted and posters being banned. The rules may just have to be updated so they explicitly cover everything covered by the Act.

 

I still think it's better to have the political forum, where such discussions can be shunted off and ignored by those who don't want to read them, to try and keep at least some of the nastiness/political sniping out of the other forums. Doesn't *always* work, but at least it keeps a lot of the cr*p in one place so people don't step in it accidentally... 😉

Edited by IanD
