No, the UK’s Online Safety Act Doesn’t Make Children Safer Online

Deeplinks 2025-08-01

Summary:

Young people should be able to access information, speak to each other and to the world, play games, and express themselves online without the government making decisions about what speech is permissible. But in one of the latest misguided attempts to protect children online, internet users of all ages in the UK are being forced to prove their age before they can access millions of websites under the country’s Online Safety Act (OSA). 

The legislation attempts to make the UK “the safest place” in the world to be online by placing a duty of care on online platforms to protect their users from harmful content. It mandates that any site accessible in the UK—including social media, search engines, music sites, and adult content providers—enforce age checks to prevent children from seeing harmful content. Harmful content is defined in three categories, and failure to comply can result in fines of up to 10% of global revenue or court orders blocking services:

  1. Primary priority content that is harmful to children: 
    1. Pornographic content.
    2. Content which encourages, promotes or provides instructions for:
      1. suicide;
      2. self-harm; or 
      3. an eating disorder or behaviours associated with an eating disorder.
  2. Priority content that is harmful to children: 
    1. Content that is abusive on the basis of race, religion, sex, sexual orientation, disability or gender reassignment;
    2. Content that incites hatred against people on the basis of race, religion, sex, sexual orientation, disability or gender reassignment; 
    3. Content that encourages, promotes or provides instructions for serious violence against a person; 
    4. Bullying content;
    5. Content which depicts serious violence against, or graphically depicts serious injury to, a person or animal (whether real or fictional); 
    6. Content that encourages, promotes or provides instructions for stunts and challenges that are highly likely to result in serious injury; and 
    7. Content that encourages the self-administration of harmful substances.
  3. Non-designated content that is harmful to children (NDC): 
    1. Content is NDC if it presents a material risk of significant harm to an appreciable number of children in the UK, provided that the risk of harm does not flow from any of the following:
      1. the content’s potential financial impact;
      2. the safety or quality of goods featured in the content; or
      3. the way in which a service featured in the content may be performed.

    Online service providers must make a judgement about whether the content they host is harmful to children and, if so, address the risk by implementing a number of measures, including but not limited to:

    1. Robust age checks: Services must use “highly effective age assurance” to protect children from this content. If services have minimum age requirements and are not using highly effective age assurance to prevent children under that age from using the service, they should assume that younger children are accessing it and protect them accordingly.

Link:

https://www.eff.org/deeplinks/2025/08/no-uks-online-safety-act-doesnt-make-children-safer-online

Tags:

privacy

Authors:

Paige Collings

Date published:

08/01/2025, 12:32