Summary of Council decision:
Five issues were investigated of which one was Not upheld; one was Upheld; two were Upheld in part; and one was Upheld in relation to one ad only.
A TV ad, the advertiser’s website, two Instagram posts, three outdoor posters, and a tweet for EE:
a. The TV ad, seen in August 2020, featured Kevin Bacon showing someone his new phone. He said, “Just got it on EE. They just won number one network for 5G. And number one network seven years in a row”.
At the bottom of the screen text appeared which stated, “Limited UK 5G coverage. Rankings based on awards and analysis from UK RootScore® reports from H2 ’12 to H1 ’20. Not an endorsement. See RootMetrics.co.uk”, followed by text stating, “5G compatible phone & plan required. Check coverage ee.co.uk/networkenvy”.
At the end of the ad a medal appeared with the word “5G” on it and then the EE logo, accompanied by large on-screen text stating, “THE UK’S NO.1 NETWORK”. In the bottom left-hand corner of the screen smaller text stated, “No.1 for network performance” followed by the RootMetrics name and logo.
b. On EE’s website, www.ee.co.uk, seen in August 2020, a section of the home page featured a medal with the word “5G” on it and large text stating, “WE’RE THE NO.1 NETWORK FOR 5G”. The RootMetrics name and logo were accompanied by the text “UK RootMetrics® Report H1 2020” and “No. 1 for Network Performance”. Clicking on linked text which stated, “Find out more” took website visitors to a webpage headed “WE’RE NO.1 FOR 5G”.
c. The first Instagram post, posted 10 August 2020, stated “Get on the winning [medal emoji] network or get network envy. Find out more at ee.co.uk”. The accompanying video stated, “WE’RE NO.1 FOR 5G” with the word “5G” displayed on a medal, followed by the text “THE NO.1 NETWORK 7 YEARS RUNNING”, the RootMetrics name and logo, and the text “UK RootMetrics® Report H1 2020” and “No. 1 for Network Performance”.
d. The second Instagram post, posted 14 August 2020, stated “[…] EE – the UK’s No.1 5G network”. The accompanying video was the same as ad (a).
e. The first poster ad, seen in August 2020, featured a medal with the word “5G” on it, and large text which stated, “WE’RE NO.1 FOR 5G”. The RootMetrics name and logo were displayed at the bottom with the text “UK RootMetrics® Report H1 2020”. Further text at the bottom stated, “No.1 5G network based on analysis of 5G speed, reliability, and availability, data collected by RootMetrics® in Jan-Jun 2020. Tested at locations across the UK with the best commercially available smartphones on 4 national mobile networks across all available network types. Your experiences may vary. Visit ROOTMETRICS.CO.UK for more details”.
f. The second poster, seen in August 2020 in Birmingham, stated, “ENVY ALERT” and featured several stylised location pins with the text “5G” on them. Further text stated “EE HAS 3X MORE 5G COVERAGE IN BIRMINGHAM THAN VODAFONE, THREE AND O2”. The bottom of the poster featured the RootMetrics name and logo, and under the text “SEARCH 5GEE” small print stated, “Coverage claim based on analysis of 5G availability during testing by RootMetrics by IHS Markit in June 2020. Experiences may vary. Verify at www.RootMetrics.com”.
g. The third poster, seen in August 2020 in Glasgow, was the same as ad (f) except that under the location pins it featured the claim “EE HAS MORE 5G COVERAGE IN GLASGOW THAN VODAFONE, THREE AND O2”.
h. The tweet, posted to EE’s own Twitter feed on 23 June 2021, stated “London! We’ve got you covered with 4x more 5G coverage than Three. Find out more ee.co.uk/why-ee/network”. Further text at the bottom of the ad stated, “RootMetrics” and “EE has 4x more 5G coverage in London than Three”. The post featured an animation with two outlined images of Tower Bridge. The top image, labelled “EE”, was outlined in yellow, with the animation showing it filling up from the bottom with detail in yellow and blue. The lower image, labelled “Three”, was outlined in grey, with the animation showing only a small section at the bottom filling up with detail in shades of grey. Text stated, “EE HAS 4X MORE 5G COVERAGE THAN THREE”. The RootMetrics name and logo appeared in the bottom left-hand corner, accompanied by small text that stated, “UK RootMetrics Report H2 2020”.
The ASA received three complaints, from Vodafone Ltd, Three UK Ltd and a member of the public:
1. Vodafone Ltd challenged whether the stylised images of medals, the claim “They just won number one network for 5G” and the references to “awards” in ads (a) to (e) were misleading, because they misrepresented the type and findings of the evidence on which the ‘No.1’ claims were based.
2. Vodafone and a member of the public challenged whether the claims that EE was the “No.1 network for 5G” in ads (a) and (b), “NO.1 FOR 5G” in ads (b), (c) and (e), and “the UK’s No.1 5G network” in ad (d) were misleading and could be substantiated, including because they did not make clear on which relevant measure(s) the claims were based and small print was either absent or insufficiently prominent.
3. The member of the public challenged whether the claim “EE HAS 3X MORE 5G COVERAGE IN BIRMINGHAM THAN VODAFONE, THREE AND O2” in ad (f) and the claim “EE HAS MORE 5G COVERAGE IN GLASGOW THAN VODAFONE, THREE AND O2” in ad (g) were misleading and could be substantiated.
4. Three UK challenged whether the claims “London! We’ve got you covered with 4x more 5G coverage than Three” and “EE has 4x more 5G coverage in London than Three” in ad (h), were misleading and could be substantiated.
5. All the complainants challenged whether the claims challenged at Points 2, 3 and 4 were verifiable, because the ads either did not include information sufficient for consumers to verify the claims, or adequately signpost consumers to where they could find information sufficient to verify the claims.
1. EE Ltd said that the claims and imagery were based on awards and analysis by RootMetrics, which had confirmed they were happy that the ads reflected their finding that EE had the number one network performance for 5G based on relevant, objective measures. EE used the term “awarded” as a shorthand to mean that they were ranked number one.
EE believed consumers would understand the use of the imagery and the references to winning an award to mean that EE was ranked first amongst the mobile network operators (MNOs) tested. They said that because an independent third party had found that to be the case, consumers would not be misled by the claims and imagery used in the ads. Whether EE had formally been awarded number one network for 5G was not relevant so long as EE had been found to have the top-performing network.
Clearcast responded in relation to the TV ad (a), which included the claim “They just won number one network for 5G” and featured imagery of a medal with the words “5G” and the EE logo on it. They explained that RootMetrics was an independent mobile analytics firm which published regular reports ranking the UK’s major MNOs (EE, O2, Three and Vodafone) in six testing categories, including ‘Network Reliability’, ‘Network Speed’, and ‘Data Performance’, plus an ‘Overall Performance’ category. Based on those reports, RootMetrics announced the rankings of the RootScore Awards in each category. Clearcast noted that RootMetrics therefore used the term “award” themselves. They felt that was summarised clearly in the ad’s on-screen text “Rankings based on awards and analysis from UK RootScore® reports from H2’12 to H1’20”, and justified the use of the terms “won” and “award” and the medal imagery. They also felt that phrasing helped to make clear to viewers that the ranking was based on a title awarded to EE by a third party, rather than it being based on EE’s own assessment.
2. EE said that because the claims all referenced 5G, it would be clear that they referred to a technical aspect of the network – the 5G part – and that therefore it would be clear to consumers that EE’s 5G performed better when measured objectively against other networks. Consumers would not understand the claim to be a subjective claim.
EE said that, given the technical complexities involved in RootMetrics’ testing process, they did not think the consumer would need additional information about the objective measures on which the claim was based. They said the key point was that RootMetrics’ testing of mobile network performance used a methodology intended to represent consumers’ mobile experience, by including the metrics most relevant to customer experience of the network. All networks were tested in the same way at the same time, in the same locations.
EE said that RootMetrics carried out routine twice-yearly testing of mobile network performance across the UK. The results of RootMetrics’ routine testing were reported on RootMetrics’ website in ‘RootScore’ reports, but those results related to all data collected by RootMetrics and were not 5G-specific. EE had commissioned RootMetrics to produce a separate report relating to 5G network performance only. They provided that report, dated 29 October 2020, which set out the methodologies used for testing and ranking the MNOs’ 5G networks. In RootMetrics’ routine testing, each test was carried out using whichever generation of network technology the device connected to (e.g. 3G, 4G or 5G) and information was recorded as to which ‘G’ the test was carried out on. Call and text connections did not yet utilise 5G, and therefore the tests recorded as having used 5G were only data tests. When a data test was recorded as having used 5G for the duration of the test period, the test was classified as 5G for metric calculations.
To assess 5G performance RootMetrics had therefore selected from their usual six testing categories only those categories, and metrics within those categories, which related to data performance and which consequently determined consumer experience of the 5G network. Three metrics related to download speed and four related to reliability of connection. Each of those metrics was given equal weighting in RootMetrics’ scoring system. A further metric, availability of 5G, was given more weight than the other metrics, because fast speeds and reliable connections were meaningless if consumers were unable to access them. The results for each metric were ranked by MNO, with a point allocated to the top or joint-top ranked MNO on each metric, and two points on the 5G availability metric. EE scored five points, Three scored four and Vodafone scored two. O2 did not have sufficient 5G availability to be included in the comparison.
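The point-allocation scheme described above can be illustrated with a short sketch. The metric names and scores below are hypothetical; only the weighting rules (one point per speed or reliability metric, two points for 5G availability, with joint-top operators sharing the full weight) are taken from the account above.

```python
# Illustrative sketch of the point-allocation scheme described in the text.
# Metric names and scores in the test data are hypothetical.

def allocate_points(results, weights):
    """results: {metric: {operator: score}}, where a higher score is better.
    weights: {metric: points}. The top or joint-top ranked operator(s) on
    each metric each receive that metric's full weight."""
    totals = {op: 0 for scores in results.values() for op in scores}
    for metric, scores in results.items():
        best = max(scores.values())
        for operator, score in scores.items():
            if score == best:  # top or joint-top ranked on this metric
                totals[operator] += weights[metric]
    return totals
```

Note that a tie on a metric awards the full weight to every joint-top operator, which is why the totals across operators can exceed the sum of the weights.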
EE explained that 5G coverage was only available in limited areas as of the first half of 2020, and therefore 5G was often not available where RootMetrics routinely tested. RootMetrics’ calculations for 5G performance therefore only included testing in the UK’s 16 most populated metro areas, which between them accounted for over 50% of the UK population (Newcastle, Bristol, London, Manchester, Liverpool, Birmingham, Nottingham, Coventry, Sheffield, Leicester, Leeds and Bradford, and Hull in England; Glasgow and Edinburgh in Scotland; Cardiff in Wales; and Belfast in Northern Ireland). Those areas corresponded with Eurostat’s (the EU’s statistical office) definition of a ‘Large Urban Zone’ or ‘Functional Urban Area’ which included a city and the surrounding commuting area.
A total of 338,417 tests were carried out along driving routes across those 16 metro areas. The selection of testing locations and testing times were randomised to reduce the risk of bias, but proprietary algorithms were used to help define driving routes in order to conduct tests in as many locations as possible in each area, to cover mobile phone masts for all MNOs, and to cover any spatial gaps and account for population density. Larger, more densely populated areas were tested over longer periods of time compared to smaller, more sparsely populated areas, to achieve the same level of accuracy and statistical confidence. Driving routes were also optimised to approximate common travel patterns, and testing times were weighted to ensure that peak times and travel periods were well represented.
They said that RootMetrics’ methodology usually included tests conducted indoors, but in the first half of 2020 it had not been possible to do so due to restrictions resulting from the Covid-19 pandemic. However, RootMetrics had increased the number of outdoor locations at which testing was conducted, including adding testing on walking routes, to ensure the same amount of data was collected as in previous periods. They said that the outdoor-only testing was therefore still representative and robust.
EE said that to reflect typical consumer mobile experience, RootMetrics used off-the-shelf Android smartphones purchased directly from the relevant MNO. A primary goal was for their testing to show each network in the best possible light. Using a standard evaluation process, they therefore chose for each network the single best-performing device capable of connecting and performing tests on both 4G and 5G networks, i.e. the device most capable of connecting to 5G on that network. The process usually led them to select the same device for all networks and ensured that their testing showed what the latest technology built into the network could do, thus capturing the best possible user experience of each network. It also enabled them to control for transient inferior performance from specific devices on specific networks, such as might be caused by a temporary software bug.
Devices were used for two consecutive six-month testing periods. They said that using one device per operator helped to assure data quality, because it made it easier to isolate issues or failures and attribute them to a device, network or firmware update. They evaluated two 5G-capable devices, both from Samsung, for use during their two testing periods in 2020; the Samsung Galaxy Note 10+ 5G was the best-performing device for all four networks.
EE explained that RootMetrics only used Android devices because it was not possible to install their testing software on Apple devices without hacking the operating system, which would result in the device not performing in the same way as an Apple device used by a consumer and would introduce an uncontrolled variable. This inability to install such software on Apple devices affected all test and measurement companies. They also said that, because mobile phones were designed to access and make the most of the technology available in the mobile network, there would not be discernible differences between the performance of Apple and Android devices. There was no evidence that Apple device users would experience a different level of performance. Additionally, using older Android devices, or a wide range of Android devices, would not provide a better or clearer picture of network performance. Using older devices introduced the possibility of artificially limiting a network and therefore painting an inaccurate picture of it. Using a wide range of devices would reflect differences in consumer purchasing habits rather than relative network performance. They also believed that device manufacturers would make varying performance on networks an advertising focus if performance did differ on that basis.
Clearcast, responding in relation to the TV ad, said they felt the on-screen text made clear what the “No.1 network for 5G” claim was based on, and highlighted that it appeared on-screen at the same time the claim was being made. They considered the claim was substantiated by RootMetrics’ findings, which were based on extensive analysis and robust data. They provided a document which explained RootMetrics’ methodology and stated the results of testing carried out in six of the 16 metro areas, submitted to them by EE during the ad clearance process. Clearcast said EE had used RootMetrics reports to substantiate a range of advertising claims in recent years and Clearcast had found them to be thorough and convincing. They said that RootMetrics had not previously analysed 5G as a category and so the claim in the ad was new.
3. The claims about network coverage in Birmingham and Glasgow were also based on RootMetrics’ 5G testing data from the first half of 2020.
EE referred to an article on RootMetrics’ website, published on 7 August 2020, titled “5G coverage across the UK’s 16 largest cities: which operator offered the most 5G coverage?”. It included a table that listed the UK’s 16 most populated metro areas and the percentage of those areas covered by each MNO’s 5G network. The table stated that, based on testing carried out across 78.3% of Birmingham’s geographical area (i.e., the land mass sampled), EE covered 58.6% of that area, O2 covered 0%, Three covered 18.9% and Vodafone covered 9.9%. It stated that, based on testing carried out across 67.7% of Glasgow’s geographical area, EE covered 9.8% of that area, compared to 0% for O2, 0% for Three and 8.8% for Vodafone.
EE highlighted that RootMetrics’ approach to geographic testing took account of both land mass and population density. Each geographic area was subdivided into hexagons of approximately 1.7 square kilometres (‘hexes’) and tests were weighted to hexes where the most people lived. That was why, for example, the total land mass sampled for Glasgow was lower than that of Birmingham. They explained that if the majority of tests in a hex connected via 5G, it was characterised as a 5G hex. Coverage claims were based on the number of sampled hexes across the metro area which were characterised as 5G. They highlighted that because coverage was based on whether the majority of tests in a hex were conducted via 5G, and because no indoor location was bigger than a hex, using only outdoor testing during the pandemic had little to no impact on the results for coverage.
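The hex characterisation described above amounts to a simple majority rule applied per hex. A minimal sketch, with the per-hex test counts purely illustrative:

```python
# Minimal sketch of the coverage characterisation described in the text:
# a hex is classified as a 5G hex if the majority of tests within it
# connected via 5G, and a coverage figure is the share of sampled hexes
# classified as 5G. The test counts below are illustrative only.

def five_g_coverage(hex_tests):
    """hex_tests: list of (tests_on_5g, total_tests), one pair per sampled hex.
    Returns the fraction of sampled hexes characterised as 5G hexes."""
    five_g_hexes = sum(1 for on_5g, total in hex_tests if 2 * on_5g > total)
    return five_g_hexes / len(hex_tests)
```

This also shows why outdoor-only testing had little effect on the coverage results: classification depends on the majority of tests across a whole hex, not on any single indoor location within it.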
They acknowledged that their 5G network covered only 1 percentage point more of Glasgow than Vodafone’s. However, they said that in an area the size of Glasgow that would still be a noticeable difference to consumers focused on their access to 5G, as in relative terms it equated to over 10% more coverage.
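The arithmetic behind that point, using the Glasgow figures quoted above, is as follows:

```python
# Glasgow figures from the RootMetrics article quoted above: EE covered 9.8%
# of the sampled area and Vodafone 8.8%. The absolute gap is about
# 1 percentage point, but relative to Vodafone's figure EE's coverage is
# roughly 11% greater, hence "over 10% more coverage".

ee, vodafone = 9.8, 8.8
absolute_gap = ee - vodafone              # about 1 percentage point
relative_gap = (ee / vodafone - 1) * 100  # about 11.4 per cent more
```

The distinction between an absolute difference in percentage points and a relative difference in coverage is the crux of EE's argument here.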
EE also highlighted that they, and all other MNOs, had coverage checkers on their websites. They provided screenshots of their and Vodafone’s checkers showing 5G coverage in Birmingham; their checker showed wider coverage than Vodafone’s.
4. The claim in ad (h) comparing EE’s 5G coverage in London with Three’s was based on data gathered in RootMetrics’ H1 2021 testing period. The London testing was carried out between 14 and 23 March 2021.
EE acknowledged that the animation of Tower Bridge exaggerated the difference between the performance of the two networks, particularly through the use of colour. However, they said the shading in each image was a fair representation of the difference in coverage between the networks, with four times the area of the “EE” Tower Bridge coloured in compared to the “Three” image.
EE provided a document which they said was published on their website. It provided information about RootMetrics’ methodology and data that supported a range of claims made across EE’s advertising. The methodology used was the same as that used in H1 2020 except that different devices were used. EE said that two devices were evaluated for testing in 2021, both from Samsung. The Samsung Galaxy S20 5G was the best-performing device on Three and the Samsung Galaxy Note 20 Ultra 5G was the best-performing for EE, O2 and Vodafone. The reason they used a different device to test Three’s network was because a software bug impacted the performance of the Samsung Galaxy Note 20 Ultra 5G on that network, causing a disproportionate number of dropped calls.
EE said that testing was carried out across 79.35% of London’s geographical area. The substantiation document stated that EE covered 46.3% of that area, and Three covered 10.3%.
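As a quick check of the figures stated above:

```python
# Check of the London figures stated above: EE covered 46.3% of the sampled
# area and Three 10.3%, a ratio of roughly 4.5, consistent with the claim
# that EE had 4x more 5G coverage in London than Three.

ee_london, three_london = 46.3, 10.3
ratio = ee_london / three_london  # roughly 4.5
```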
5. EE acknowledged that at the time ads (a) to (e) were published there was no information available on either the RootMetrics website or EE’s website that provided details of the basis for the “No.1 for 5G” claims. However, on 29 October 2020 RootMetrics had created a report containing the information needed to verify the claims, which was then made available on EE’s website. EE said they had added a qualification to their ads which stated “to verify visit ee.co.uk/claims”.
Responding in relation to the TV ad, ad (a), Clearcast felt that the on-screen text, which stated, “See RootMetrics.co.uk” clearly gave viewers a route to verify the “Number one network for 5G” claim. The website gave viewers full access to the details about the RootMetrics analysis, their awards, and the comparisons involved. Neither payment nor registration was required. Clearcast were also satisfied that the website as a whole was a reasonable place to direct viewers to as they could then choose which report would be most suited to the information they wanted (e.g., best for 5G, best network, or local information) and the depth of information they wanted.
With regard to the coverage claims in the poster ads (f) and (g), EE said the data supporting the claims was in the article on RootMetrics’ website published on 7 August 2020, referenced above. The coverage checkers on MNOs’ websites also gave consumers the ability to easily verify the claims for themselves, as well as to check whether they would have 5G coverage in areas where they lived or to which they travelled regularly.
With regard to the coverage claim in the tweet, ad (h), EE acknowledged that the small text shown during the video inaccurately ascribed the basis of the claim to testing carried out in the second half of 2020, when in fact the testing had been carried out during the first half of 2021. They said that the text “Find out more ee.co.uk/why-ee/network” referred consumers to a webpage which contained further information about RootMetrics awards and a prominent link to a page with full details and test data.
Commenting in relation to the type of information that would need to be provided for consumers to verify both the “No.1 for 5G” and coverage claims, EE said that consumers would not be able to replicate RootMetrics’ testing without a large time and monetary investment. Therefore, an explanation of RootMetrics’ methodology, including the principle of testing in large urban zones and with a representative sample that treated all networks in the same way, would provide sufficient information for consumers to understand the comparison; it would not be necessary to make available the test data.
1. & 2. Upheld in part
The TV ad, website and video in the second Instagram ad (ad (d)) featured the claim “No.1 network for 5G”; ad (d) also featured the claim “the UK’s No.1 5G network” in the post. The other Instagram ad (ad (c)) featured the claim “Get on the winning network […]” in the post, and the video began “WE’RE NO.1 FOR 5G”. All four ads also included the claim “No.1 for Network Performance” next to the RootMetrics logo. In the context of the references to “Network Performance”, the ASA considered consumers were likely to interpret the variations of the “No.1 for 5G” claims in those ads as references to the performance of EE’s 5G network, relating to technical aspects such as coverage and reliability. Because the claims referred to EE being “No.1” and the ads included the RootMetrics logo and stylised images of medals, and ads (a) and (c) additionally prominently referred to EE having “won” or being the “winning” network, we considered consumers would see the claims as comparative claims that, based on an objective comparison of the technical aspects of 5G network performance, EE’s 5G network had been ranked highest in the UK market by RootMetrics.
The poster, ad (e), included small text at the bottom which provided clarification as to the measures on which the headline claim “WE’RE NO.1 FOR 5G […] GET EE OR GET NETWORK ENVY” was based (“5G speed, reliability, and availability”). However, we considered that because the ad was a large outdoor poster, the volume and size of the small print meant that consumers were unlikely to be able to read it. We considered the ad therefore did not make sufficiently clear that the claim was intended to be an objective comparison relating to the technical aspects of 5G networks rather than, for example, a claim that EE offered the ‘best’ 5G plans in terms of factors such as cost or data allowance.
Based on how consumers would interpret ads (a) to (d), and that EE intended ad (e) to be interpreted in the same way, we expected to see evidence that a range of networks had been rated by RootMetrics based on a robustly-conducted comparison of a range of appropriate, relevant and objective performance measures and that EE had received the highest score of all the rated networks. We considered those measures should be as reflective of likely ‘real-world’ consumer experience as possible.
The small print in each ad referenced RootMetrics and referred to either “UK RootScore® reports from H2 ’12 to H1 ’20” or “UK RootMetrics® Report H1 2020”. We considered consumers would therefore understand that those reports provided the evidence to support the “No.1 network for 5G” claims and associated medal imagery and references to “awards” and “winning”. However, the testing on which those scores were based measured network performance on whichever generation of mobile technology the test phone connected to (e.g., 3G, 4G or 5G). The RootScore testing, results and awards therefore did not relate specifically to 5G network performance and were not relevant to the claims in ads (a) to (e) that EE had won or been awarded “No.1” for 5G network performance. We considered the references to those reports in the ads were therefore misleading.
EE had separately commissioned RootMetrics to analyse the subset of their data collected during the first half of 2020 that specifically related to 5G performance. We reviewed the methodology and findings.
With regard to the specific performance metrics tested, we compared the range of metrics used to evaluate overall network performance with those used to evaluate 5G network performance specifically. We accepted that the more limited range of metrics used to evaluate 5G network performance, which related to data only, reflected the type of activities conducted over 5G and the factors that would impact real-world consumer experience. The evidence showed that the performances of EE, Vodafone and Three were generally similar across the seven equally-weighted metrics relating to download speed and reliability of connection. In all but one of those metrics the difference between the first and second-ranked MNO was quite small. For the final metric, availability of 5G, the results showed that 25% of tests carried out on the EE network were conducted on 5G, compared to 5% for Vodafone and 3.5% for Three. We noted that, had EE not had the largest geographical coverage and had that factor not been weighted more heavily than the others, Three would have scored the same as EE overall. However, given the substantial difference between EE’s 5G coverage and that of the other MNOs, we considered it was reasonable to weight 5G availability more heavily than the other factors, since coverage was likely to be one of the most significant considerations for consumers for whom being able to use 5G mattered when selecting a mobile network.
We considered it was important that the comparison was based on a distribution of tests that was geographically representative and up-to-date. We considered consumers who saw the ads would have had a general understanding that, at that time, the 5G rollout was in its relatively early stages, with limitations to its availability, and would understand the “No.1 for 5G” claims in that context. We understood from Ofcom’s Connected Nations 2020 Report (published in December 2020) that in general, the rollout of 5G across the UK by all MNOs had focused primarily in areas with the highest amounts of mobile traffic and which broadly reflected the national distribution of all mobile traffic in the UK. There were some areas of the UK in which 5G was available where RootMetrics had not conducted tests, but we noted that the 16 metro areas largely corresponded to the areas in which 5G was available at that time. Although the 5G rollout had continued throughout 2020, we considered the areas in which testing had been conducted were a sufficiently representative sample of the areas in which 5G was available in the first half of 2020, and that it was reasonable to base claims in ads published in August 2020 on the results of testing carried out in the first half of that year as part of a twice-yearly testing program.
We considered it was also important that the distribution of tests carried out within the 16 metro areas was representative of how consumers would experience 5G availability and performance across those areas, both in terms of location and time of day. We understood that RootMetrics’ testing locations and times were initially randomly selected; testing was then weighted towards peak network usage times, and the choice of locations was informed by population density and travel routes. We considered that it was reasonable for that testing to be carried out only outdoors while Covid-19 restrictions applied. A total of 338,417 tests were conducted, which equated to around 84,600 sets of tests across the 16 metro areas over the six-month period (with each set measuring the performance of all four MNOs in the same place at the same time). We considered that was likely to be a sufficiently representative sample of locations and times across the metro areas to be reflective of real-world consumer experience.
We also considered it was important that the comparison was based on a representative assessment of mobile data users across different networks across the UK, and that it took into account that consumers would be using a range of mobile devices, if that was likely to impact real-world consumer experience of network performance. For example, if certain devices had greater technological compatibility with certain networks (e.g. due to manufacturer, operating system or model), resulting in noticeably better 5G performance for consumers, we would expect the testing methodology to take that into account. However, we understood that at the time the ad appeared different devices performed similarly on 5G and therefore did not impact on real-world consumer experience. While RootMetrics’ methodology was designed to select the Android device that would facilitate the best network performance test results for each network, we understood that in most testing periods the same device was used across all networks. We further understood that while manufacturers were likely to aim to better optimise each new device model to utilise 5G network technology, there was, so far, minimal differentiation between the performance of older and newer 5G devices. We also accepted that it was reasonable not to test using Apple devices given that to do so would require compromising the device’s operating system. In any case, we understood that in general the choice of device, including operating system, had minimal impact on testing metrics related to 5G network performance and that any differentiation would be so minimal that consumers would be unlikely to experience noticeably differing levels of network performance.
We further understood that one of the benefits to the type of methodology used by RootMetrics, compared to methodologies which crowd-sourced data from consumers’ mobile devices, was that tests could be conducted across four MNOs on as close to a like-for-like basis as possible. We accepted that RootMetrics’ approach, using the single best-performing device on each network in an otherwise like-for-like comparison, was designed to compare each network’s optimal performance while controlling for confounding factors. On balance, we considered that this methodology was adequate to support comparative claims regarding 5G network performance.
On that basis, we accepted that EE had the highest score relating to 5G network performance in the first half of 2020, and therefore that the various “No.1 for 5G” claims in ads (a) to (e) had been substantiated.
However, as noted above, the ads referred to RootMetrics’ RootScore reports as evidence, which were not relevant in supporting claims relating only to 5G network performance, and we therefore concluded the ads were misleading on that basis.
Additionally, as referenced above, because the volume and small size of the text in the outdoor poster meant it was unlikely consumers would be able to read it, we concluded ad (e) did not make the basis of the comparison sufficiently clear and was also misleading on that basis.
On points 1 and 2, ad (a) breached BCAP Code rule 3.1 (Misleading advertising).
On points 1 and 2, we also investigated ad (a) under BCAP Code rules 3.9 (Substantiation) and 3.33 (Comparisons with identifiable competitors), but did not find it in breach.
On points 1 and 2, ads (b), (c), (d) and (e) breached CAP Code (Edition 12) rule 3.1 (Misleading advertising).
On points 1 and 2, we also investigated ads (b), (c), (d) and (e) under CAP Code (Edition 12) rules 3.7 (Substantiation) and 3.33 (Comparisons with identifiable competitors), but did not find them in breach.
3. Upheld in relation to ad (g) only
We considered consumers would understand ad (f) to mean that EE’s 5G network covered over three times more of the area of Birmingham than each of their competitors’ 5G networks did. We considered consumers would understand ad (g) to mean that EE’s 5G network covered a greater proportion of the area of Glasgow than each of their competitors’ networks. We further considered that consumers would understand the underlying message of both ads to be that choosing EE over other networks would give them a significantly better chance of connecting to 5G when in the areas referenced.
RootMetrics had defined the extent of Birmingham and Glasgow based on Eurostat’s definition of the Large/Functional Urban Areas which covered those cities and the surrounding commuting areas. We considered those areas were likely to correlate with consumers’ understanding of the extent of those areas. The testing was carried out using the methodology described in points 1 and 2, and the advertising claims were based specifically on the results of the metric relating to the availability of 5G. For the reasons discussed above we considered the testing methodology was adequate to support comparative claims relating to coverage. However, the test results must support the advertising claims as consumers would understand them.
We therefore also considered whether the results of the coverage tests substantiated that choosing EE over other networks would result in consumers having a significantly better chance of connecting to 5G when in the areas referenced. The results showed that in Birmingham, EE’s 5G network covered over three times more of the sampled area than their closest competitor. We concluded that was a significant enough difference to have a real effect on consumer experience when attempting to connect to 5G in Birmingham.
In Glasgow, EE’s 5G network covered only one percentage point more of the sampled area than its closest competitor’s network (Vodafone). We noted that EE’s 9% coverage represented a 12.5% relative increase over Vodafone’s 8% coverage. However, we considered that on balance it was unlikely to be a significant enough difference to have a real effect on consumer experience when attempting to connect to 5G in Glasgow. We considered the evidence therefore did not substantiate the claim as consumers would understand it.
We considered that the evidence demonstrated that the claim in ad (f) was substantiated and was therefore not misleading. However, we concluded that the claim in ad (g) was not substantiated and was therefore misleading.
On that point, we investigated ad (f) under CAP Code (Edition 12) rules 3.1 (Misleading advertising), 3.7 (Substantiation) and 3.33 (Comparisons with identifiable competitors), but did not find it in breach.
On that point, ad (g) breached CAP Code (Edition 12) rules 3.1 (Misleading advertising), 3.7 (Substantiation) and 3.33 (Comparisons with identifiable competitors).
4. Not upheld
We considered consumers would understand the claims in the tweet and accompanying video to mean that EE’s 5G network covered four times more of London than Three’s 5G network, and that choosing EE over Three would result in them having a significantly better chance of connecting to 5G in London. While the animation showed the whole of the EE-labelled Tower Bridge filling up with detail, we considered consumers would interpret that in the context of the Three-labelled bridge and the “4x more 5G coverage” claims, rather than as implying that EE’s 5G network covered 100% of London.
The claim was based on testing carried out in March 2021, using the same methodology used during the first half of 2020 (on which the claims in ads (a) to (g) were based). The only exception was that in 2021, Three’s network was tested with a different device from the one used for the other three networks, because a software bug on that device had a temporary negative effect on Three’s performance. As discussed in the above points, we considered that the methodology of using the best-performing device for each network was adequate to support comparative claims, including those relating to coverage, if the test results supported the advertising claims as consumers would understand them.
The evidence showed that EE’s 5G network covered 46.3% of the sampled area of London, compared to 10.3% for Three. We considered that the claim in the ad had therefore been substantiated and was not misleading.
On that point, we investigated ad (h) under CAP Code (Edition 12) rules 3.1 (Misleading advertising), 3.7 (Substantiation) and 3.33 (Comparisons with identifiable competitors), but did not find it in breach.
The CAP and BCAP Codes required that comparisons with identifiable competitors must be verifiable. That meant that an ad which featured a comparison with an identifiable competitor or competitors needed to include, or direct its audience to, sufficient information to allow them to understand the comparison, and be able to check the claims were accurate, or ask someone suitably qualified to do so.
We considered what would constitute sufficient information to enable consumers and competitors to verify the ads’ comparative “No.1 for 5G” and geographical coverage claims. Because the test results were historic (i.e., specific to the six-month period of testing given the continuing roll-out of 5G after the testing period), we considered consumers and competitors would not be in a position to replicate the tests or have them replicated by someone else, whether or not the full particularities of the testing methodology were made available. We therefore considered that for the claims to be verifiable, the result for each metric at each test location and time would need to be made available in an accessible format. Additionally, full particularities about the methodology by which the data was obtained, categorised, assessed, scored and ranked would need to be provided, so that consumers and competitors could fully understand and interpret the data.
Ads (a) to (e) included variations of the comparative claim that EE was the “UK’s No.1 5G network”. While the ads did not specifically name any of EE’s competitors we considered consumers would be able to readily identify one or more of them. All five ads included reference to RootMetrics data or reports, or RootScore reports. Ads (a), (d) and (e) referred consumers to the home page of RootMetrics’ website and ad (c) referred consumers to ad (b), the home page of EE’s own website.
As referenced above, RootMetrics did not publish any information on its website specifically relating to 5G network performance. Neither was that information made available to consumers on EE’s website at the time ads (a) to (e) were published. We therefore concluded that none of those ads included sufficient information to allow consumers to understand or check the accuracy of the “No.1 5G network” claims, nor did they adequately signpost consumers to where they could find that information. We welcomed EE’s confirmation that they had taken steps to make the information available in a document on their website from 29 October 2020 and had added a signpost in relevant ads to direct consumers to that webpage, but we were concerned that the information was not available when the ads were published.
Notwithstanding that, we reviewed whether the information provided in that document would have been sufficient for consumers to verify the advertising claims. The document began with a series of “Testing Methodology Facts” which provided summarised information about RootMetrics’ methodology. For example, it referenced that testing was conducted in “the 16 most populous urban areas” along “freeways and motorways, major arterials, and residential streets where the population within a market generally lives and travels” and that testing had not been conducted indoors. However, it did not provide further detail about the test locations or timings. It stated which device had been used for testing, but did not provide information about the process by which it had been selected. Additionally, some of the information related to RootMetrics’ general methodology (i.e., tests conducted on 3G and 4G as well as 5G) rather than being specific to the methodology used to support claims solely about 5G. It therefore provided incorrect information about the number of tests conducted, the areas in which some tests were conducted, and which aspects of mobile performance were tested.
A section titled “Ranking Methodology” stated the eight metrics used to make the comparison, referencing that they were “5G-specific”, and explained how many points were assigned to the MNO ranked first, including why 5G availability was given two points instead of one. Underneath, a table listed the “Value” for each metric for each MNO (e.g., for “5th percentile 5G download speed” EE’s value was 26.5 Mbps, and for “Percentage of tests conducted on 5G (availability)” EE’s value was 25.0%). It then stated the “Metric Rank” for each MNO based on the Value in that metric. Another table then ranked the MNOs based on their total scores from the metric rankings.
While the document did include the overall value for each MNO in each metric, we understood each was calculated from the results of around 21,150 individual tests. No other data points were provided or made available elsewhere for consumers to verify the calculations and the advertising claims on which they were based. We additionally considered that the amount of information provided about the testing methodology was not sufficient (or sufficiently accurate) for consumers to fully understand and interpret the data such that they would be able to verify it.
We concluded the comparative claims “No.1 network for 5G” in ads (a) and (b), “NO.1 FOR 5G” in ads (b), (c) and (e), and “the UK’s No.1 5G network” in ad (d) were not verifiable and breached the Codes.
Ads (f) and (g) featured claims about the level of 5G coverage EE had in Birmingham and Glasgow respectively, compared to Vodafone, O2 and Three. Both ads included the text “SEARCH 5GEE” and small text next to the RootMetrics name and logo, which stated, “Coverage claim based on analysis of 5G availability during testing by RootMetrics by IHS Markit in June 2020. Experiences may vary. Verify at www.RootMetrics.com”. We considered consumers and competitors would need additional information in order to understand the basis of the comparison, but that it would be acceptable to provide that information elsewhere so long as its location was adequately signposted in the ads.
We considered that the small text was in such a small font that it was unlikely consumers could read it. The ads featured the RootMetrics name and logo next to the small text, but they were not prominent and we considered they were not, in any case, sufficient to direct consumers to where they could verify the claims. Notwithstanding that, we also noted that the URL www.RootMetrics.com, referenced in the small text in the ads, took internet users to the US-facing version of the website, which did not include the article that EE had stated featured the relevant information. We considered that the ads therefore did not adequately signpost consumers to where they could find the information to verify the coverage claims.
We also reviewed the information made available to consumers. We understood that the four MNOs included 5G coverage checkers on their websites, but noted those were not signposted in the ads. Additionally, while those tools were useful for consumers to check the 5G coverage from the specific MNO whose website they were visiting, we considered they would not be sufficiently comparable for consumers to verify comparative coverage claims such as those in the ads.
The article EE had referenced was included in a list of articles linked from the home page of the UK-facing version of RootMetrics’ website. It provided very brief information about the methodology used for the testing, namely that coverage was tested in 16 urban areas (which were listed) in the first half of 2020 and what RootMetrics meant by “5G coverage”, and linked to more detailed information about RootMetrics’ general methodology (i.e., that which encompassed tests on all generations of mobile technology). It also included the table, referenced at point 3, which showed the percentage of each of 16 metro areas covered by each of the four MNOs’ 5G networks, according to RootMetrics’ testing.
The table provided statistics relating to the overall percentages of coverage in Birmingham and Glasgow (and the other 14 urban areas) but did not provide the data on which those calculations were based, and that data was not made available elsewhere. It also provided very limited information about the testing methodology and the further information it was linked to was not sufficiently relevant to the methodology used to calculate the 5G-specific coverage claims in the ad.
We concluded the article did not provide sufficient information to allow consumers to understand the comparison and be able to check the claims were accurate. Additionally, the article was not adequately signposted. We concluded that the comparative 5G coverage claims in ads (f) and (g) were not verifiable and breached the Code.
Ad (h) featured the claim “London! We’ve got you covered with 4x more 5G coverage than Three. Find out more ee.co.uk/why-ee/network”. Very small text in the video ascribed the source for that claim to “UK RootMetrics Report H2 2020”, which was inaccurate because the claim was based on data from the first half of 2021. As with ads (f) and (g), we considered the further information that consumers would need to understand the claim should have been adequately signposted in the ad. However, we understood that, from the webpage referenced in the ad, consumers had to follow a link in a section which related to EE’s “No1 network” claims rather than its coverage claims, before being taken to a further page from which they could access the relevant document. We also noted the document was dated a few days after the ad appeared and it was therefore not clear that the verification information was available at the time the ad was published. We considered the ad therefore did not adequately signpost consumers to where they could find the information to verify the coverage claim.
Notwithstanding that, we reviewed whether the information provided in that document would have been sufficient for consumers to verify the advertising claims. As with the document which was intended to provide verification for the “No.1 for 5G” claims, it began with “Testing Methodology Facts”. It included a little more detail but again related to RootMetrics’ general methodology without explaining the differences between that and the methodology on which 5G-specific coverage claims were based. A section underneath focused on coverage claims for cities, providing principles-based information about RootMetrics’ methodology for 5G coverage claims. Under the heading “Claim: EE has 2x more 5G coverage in London than O2 & Vodafone & Three”, a table listed the overall percentage of 5G coverage for each MNO in London in the first half of 2021. That heading did not correspond to the claim in ad (h), which was that EE had four times more coverage in London than Three, and the table was therefore not clearly indicated as relevant to that claim.
The document also included what appeared to be side-by-side screenshots of the coverage checkers from each MNO’s website, each labelled with the relevant coverage percentage stated in the table. However, because those checkers drew on each MNO’s own coverage information rather than RootMetrics’ test results, the coverage shown in the screenshots was not based on the same data as the percentage coverage claims in the table. Additionally, the screenshots were of central London only and therefore did not encompass the full extent of the Large/Functional Urban Area of London as defined by Eurostat, and so also did not correspond to the full areas to which the coverage percentages related.
The document provided statistics relating to the overall percentage of 5G coverage in London of each MNO, but did not provide the data on which those calculations were based, and that data was not made available elsewhere. Additionally, the coverage checker screenshots conflated RootMetrics’ coverage test results with information from other sources. The document also provided only limited information about the testing methodology relevant to 5G coverage claims. We concluded the document did not provide sufficient information to allow consumers to understand the basis of the comparison and be able to check the claims were accurate. Additionally, the document was not adequately signposted. We concluded that the comparative 5G coverage claims in ad (h) were not verifiable and breached the Code.
We concluded that ads (a) to (h) did not include, or adequately direct consumers to, sufficient information to allow them to understand the comparisons made, and therefore breached the Code.
On that point, ad (a) breached BCAP Code rule 3.35 (Comparisons with identifiable competitors).
On that point ads (b), (c), (d), (e), (f), (g) and (h) breached CAP Code (Edition 12) rule 3.35 (Comparisons with identifiable competitors).
The ads must not appear again in the form complained of. We told EE Ltd to ensure that the basis of any comparative claims was presented clearly. We also told them to ensure that ads provided sufficient information to enable consumers to verify comparisons with identifiable competitors or adequately signposted consumers to such information.