![Online Safety Act: progress update](https://www.strawberrysocial.com/wp-content/uploads/2025/02/The-Online-Safety-Act-progress-update-and-what-it-means-for-charities.jpg)
The Online Safety Act: progress update and what it means for charities
![Liz Malone-Johnstone](https://www.strawberrysocial.com/wp-content/uploads/2023/11/LMJpic2-150x150.png)
Online Safety Act – latest update
In 2025, Ofcom is shifting its focus to service providers and the actions they must take to comply with new legal duties.
On 16 January 2025, Ofcom published new industry guidance on effective age checks and Children’s Access Assessments, requiring all user-to-user and search services to complete a Children’s Access Assessment by mid-April 2025.
Online Safety Act Timeline
- 26 October 2023: Online Safety Bill received Royal Assent and was passed into law
- 31 January 2024: Encouraging or assisting self-harm, cyberflashing, epilepsy trolling, intimate image abuse, threatening communications and sending false information intended to cause non-trivial harm were introduced as new criminal offences under the Act
- 8 May 2024: Ofcom proposed measures to improve children’s online safety
- 16 December 2024: Ofcom’s policy statement for the Online Safety regime established that providers have a duty to assess the risk of illegal harms on their services, with a deadline of 16 March 2025
- 16 January 2025: Ofcom published new industry guidance on effective age checks and Children’s Access Assessments
See all important dates for Online Safety compliance here >
What is the Online Safety Act? A recap
The Online Safety Act 2023 is a set of laws that protects both children and adults online, holding social media companies and search services accountable for user safety. It requires platforms and service providers to prevent access to harmful content, quickly remove illegal material, and provide clear reporting tools for children and parents. Ofcom has powers to take action against companies that do not follow their new duties.
“The Online Safety Act makes businesses, and anyone else who operates a wide range of online services, legally responsible for keeping people (especially children) in the UK safe online.” – Ofcom
Who does the Online Safety Act apply to?
The Online Safety Act applies to all user-to-user and search services, meaning any online platform that facilitates user-generated content or communication. This includes big tech companies, social media platforms, cloud storage sites and search engines, as well as gaming platforms, review sites, forums and online communities where users can interact. There are some exemptions, such as public bodies.
What about charities?
If a charity or brand owns or operates an online service that enables user-generated content or interactions, it may fall under the scope of the Online Safety Act. This includes:
- Websites with comment sections or forums
- Brands with dedicated online communities such as private member forums
- Charities running interactive platforms or peer support spaces
To be clear, simply having a presence on social media does not bring a charity under the Act. The charity must own or operate the service itself to be affected. However, it’s important to remember that all organisations have a duty to comply with platform policies and follow moderation best practices to keep their communities safe.
Still not sure? Check if the Online Safety Act applies to you with Ofcom’s handy tool >
StrawberrySocial specialises in moderation and online safety for charities and we are proud to support many incredible charities including NSPCC, Samaritans and Sue Ryder. Our experienced team has worked relentlessly to protect high-risk and vulnerable audiences online for over 15 years. Get in touch to learn how we can help safeguard your online community today >