It’s simple and straightforward for the ASA to answer the two most common questions we receive about online advertising:
- “Yes, we have rules for online advertising”. Our rulebook covers almost every online ad in paid-for space and in media under an advertiser’s control, including their social media channels. It’s an amazing asset which, uniquely, draws together all the major restrictions on the creative content and targeting of online ads in the UK, whether those restrictions come directly from law or through separate regulatory processes.
- “Yes, we have the means to successfully enforce the rules”. Our interventions result in thousands of ads being amended or withdrawn (over 36,000 last year) because we found they didn’t stick to the rules. The vast majority of these ads relate to companies’ claims on their own websites or on their own social media posts, and not ads in paid-for space online.
There’s a third question that takes a little longer to answer: “What about the rule breaking you don’t know about?”
That’s a question that could be asked of the police, Trading Standards or any other enforcement body, and the motivation behind it is generally the same in each case. It’s a request for reassurance that we’re doing everything we reasonably can to tackle emerging areas of detriment arising from advertising.
At the ASA, we’re certainly not complacent about the fact that the overwhelming majority of ads are legal, decent, honest and truthful. And, we’re by no means reliant on complaints from the public (also over 36,000 last year) to act as the eyes and ears of our online regulation, as valuable as those complaints are.
Rather, we take a range of proactive steps to prevent, detect and clamp down on the minority of bad ads. For example, last year we delivered advice and training on over 720,000 occasions to help ensure ads comply with the rules before they go to market. We worked with platforms and networks to disrupt the scourge of scam ads operating in paid-for space. And, we routinely draw on data from a wide variety of sources[1] to help determine where and how we intervene, sharing what we learn with organisations in the Consumer Protection Partnership to help deliver collective regulation.
At the heart of our proactivity is a determination to use technology, market insights and closer working with platforms to deliver protection for consumers and responsible businesses online. Our annual reports provide plenty of case examples of how we’re doing that, but we’ve got plenty more examples in the pipeline, which demonstrate how we innovate to regulate online.
Just as the police use cameras to catch the minority of drivers who break the speed limit, we use CCTV-style monitoring to identify the small number of ads for alcohol, gambling and food high in fat, salt or sugar that break the rules online.
We train our cameras on 49 websites and 12 YouTube channels with a disproportionately high child/under 18 audience and publish our findings in quarterly reports (Q2, Q3 and Q4, 2020), with our latest report, covering the first three months of 2021, provided here.
Later this summer, we’ll highlight learning from these reports to ensure our CCTV-style monitoring continues to evolve and improve, and that the handful of repeat offenders we’ve identified are drawn to the public’s attention.
Insights from social media platforms
There are ethical and practical reasons why monitoring ads delivered to children’s social media accounts is more complicated than tracking ads across the open internet. But, it’s vital that we do this monitoring.
Following a landmark information request to the seven social media platforms most popular with children in the UK, we’ll soon publish a report on Alcohol Ads in Social Media. The report provides a fascinating insight into whether the targeting choices of alcohol brands are likely to result in their ads being delivered to the accounts of children who are falsely registered or incorrectly inferred to be 18 or older.
Later this year, we’ll complete the circle, working with 100 children (and their guardians) to identify if alcohol and other age-restricted ads are, as a matter of fact, being delivered to their social media accounts. This promises to provide a crucial insight into the real world experiences of children, and it serves as an important update to a related project we undertook in 2013.
Using child avatars
Alcohol, gambling and other age-restricted ads are permitted in online media attracting a heavily weighted (75%+) adult audience, but even in these environments we call on advertisers to use media and audience targeting tools to direct their ads away from the minority child audience.
To gain an insight into whether this is happening in practice, we use online avatars as proxies for child and adult audiences and monitor the ads they receive.
We’ll publish imminently the results of our latest avatar monitoring, which updates and expands on the report we published in 2019.
Closer working with platforms
For over 60 years, broadcast and non-broadcast media have helped to uphold the Advertising Codes and provide regulatory teeth on those rare occasions when advertisers are not willing or able to work with the ASA to amend or remove ads that break the rules.
Today, some of the world’s largest online platforms and networks do the same, with the result that the ASA almost never has to refer bad ads from legitimate companies appearing in paid-for space online to its legal backstop.
We are currently exploring with these platforms and networks whether and, if so, how we might standardise the cooperation we receive under a bold, new regulatory framework that aims to bring greater transparency and accountability to the regulatory system online.
The sheer scale and changing nature of online advertising mean that, to be effective, we must innovate to regulate. It’s something we do with genuine passion and purpose, and it helps to provide reassurance to those seeking it that we’re doing everything we reasonably can to police ads online.
[1] Our Formal Intelligence Gathering report