Social media activity and usage
Estimated population active on social media in 2023
Platforms ranked by usage in 2022/2023
1. WhatsApp
2. Facebook
3. Instagram
4. Facebook Messenger
5. Twitter/X
Current laws and regulations relevant to social media
Advertising Standards Authority (ASA)
The Advertising Standards Authority (ASA) is the UK’s independent regulator of advertising across all media and is responsible for monitoring advertising and marketing campaigns, as well as carrying out investigations in relation to the same. With social media forming an increasingly key part of businesses’ marketing and advertising strategies, the decisions and codes of conduct prepared by the ASA are becoming more relevant by the day.
UK Code of Non-broadcast Advertising and Direct & Promotional Marketing (CAP Code)
The CAP Code imposes rules on the advertising industry in relation to the collection and use of personal information, and it restricts marketing communications delivered by electronic means. It also governs how promotions are run on social media. Some of the ASA’s rulings on this topic include the following:
- Adverts on social media must be “obviously identifiable” as advertising. The ASA has published various rulings against social media influencers for not clearly displaying whether they have been paid to endorse a particular brand or product. At a minimum, it is advised that such posts include a prominent “ad” label up front to highlight that a post has been published for marketing purposes.
- Adverts must not create a false impression or be misleading. For example, the ASA has investigated various influencers for trivializing investment in crypto assets on Instagram by implying it was “straightforward and simple” without highlighting the risk that crypto values fluctuate and are unregulated in the UK.
- Social media promotions must be run fairly. Promotion and competition posts should clearly include conditions covering participation, a closing date, the nature and number of prizes or gifts, and any eligibility or availability restrictions.
Intellectual Property, Copyright and Trade Marks
While social media content provides opportunities for businesses to promote their companies and products to a wide range of customers, one of the key dangers is the risk of intellectual property (IP) infringement. When posting videos and images online, it is important to seek permission from the owner or creator. It is also important that businesses and content creators manage their IP portfolios to enable enforcement and protection against unauthorized users and, where possible, register content to maximize such protection.
IP infringement on social media platforms can occur in many ways. The most common examples are as follows:
- Copyright. Tweets, posts, photographs, videos and artwork on social media platforms and social network profiles may be protected as literary, musical and artistic works under the Copyright, Designs and Patents Act 1988. Copyright is an automatic right in the UK and does not require registration. Therefore, when posting content online, it is important that permission is sought from the creator. However, “resharing” photos on Instagram and “retweeting” posts on X is permitted as long as the creator is credited.
- Trade marks. Social media presents specific challenges for brand use and protection. Due to the volume of material posted, it can be difficult to monitor. Most social media sites have terms and conditions of use that prohibit the unauthorized use of trade marks by third parties. While unregistered marks offer some level of protection, registered trade marks are significantly easier to enforce in the event of a dispute.
Consumer Rights
The UK has wide consumer and other protections in place, all of which could apply to social media platforms, including the following:
- Consumer Rights Act 2015 (Consumer Act). Under the Consumer Act, consumers are afforded various statutory rights and remedies in relation to goods, digital content and services. This piece of legislation also includes an unfair terms regime impacting terms used (whether contractual or noncontractual notices) between businesses and consumers.
- Consumer Protection from Unfair Trading Regulations 2008 (Consumer Regulations 2008). The Consumer Regulations 2008 prohibit unfair commercial practices, such as misleading consumers and aggressive behavior.
- Electronic Commerce (EC Directive) Regulations 2002, SI 2002/2013 (as amended by the Electronic Commerce (Amendment etc.) (EU Exit) Regulations 2019, SI 2019/87) and the Defamation (Operators of Websites) Regulations 2013, SI 2013/3028 (EC and Def Regulations). The EC and Def Regulations provide certain “intermediary defenses” for internet intermediaries, including those relating to hosting, caching or mere conduit.
- Communications Act 2003 (2003 Act). The 2003 Act requires video-sharing services, including social media sites that allow video sharing and live-streaming of audio-visual content, to protect users from harmful or criminal content through measures such as terms and conditions, reporting systems, age verification and parental controls.
- Protection from Harassment Act 1997 (1997 Act). The 1997 Act prohibits harassment and stalking, whether physical or online, and covers issues such as cyberbullying.
- Business Protection from Misleading Marketing Regulations 2008 (2008 Mar Regs). The 2008 Mar Regs prohibit advertising that misleads traders and governs comparative advertising (advertising that identifies a competitor or competitive product). They make engaging in misleading advertising a criminal offense and impose regulatory sanctions for certain comparative advertising.
- Code of Practice (COP) for providers of online social media platforms. This COP offers guidance to social media platform providers on appropriate actions that they should take to prevent bullying, insulting, intimidating and humiliating behaviors on their platforms. It is also particularly relevant to any sites hosting user-generated content.
Defamation Act 2013 (DA 2013) and the Defamation (Operators of Websites) Regulations 2013 (DOW Regulations 2013), SI 2013/3028
The UK does have defamation laws, as well as options for social media providers to defend themselves against defamation claims arising from statements made on their platforms, subject to certain limitations. In conjunction with the eCommerce Regulations, section 5 of the DA 2013 (c 26) provides a defense for the operator of a website where a defamation action is brought in respect of a statement posted on that website, if it was not the operator who posted the statement. The defense can be defeated if the claimant can show that it was not possible for them to identify the person posting the statement, that the claimant gave the operator a notice of complaint in relation to that statement, and that the operator did not respond to the notice of complaint in accordance with the DOW Regulations 2013.
Criminal Justice and Courts Act 2015
In the UK, disclosing a private sexual photograph or film without the consent of the person depicted, and with the intention of causing that individual distress, is an explicit criminal offense.
Data protection laws applicable to social media
Data Protection Act 2018 (UK DPA) and UK GDPR
The UK GDPR is the EU GDPR as transposed into UK law by section 3 of the European Union (Withdrawal) Act 2018. The UK DPA and UK GDPR regulate privacy in the UK and affect social network operators by regulating the bases and purposes on which they can process users’ personal data, their transparency obligations and the rights that must be granted to individuals with regard to their personal data. At the time of publication of this guide, the UK GDPR and EU GDPR remain significantly similar, with the UK DPA providing additional input where permitted under the UK GDPR, and minimal changes to the text from the EU GDPR save for replacing references to European bodies and institutions with their UK equivalents (e.g. referring to the UK Information Commissioner (IC) rather than a supervisory authority).
Privacy and Electronic Communications Directive (PECR)
The PECR sits alongside the UK GDPR and UK DPA and gives individuals specific rights in relation to direct marketing communications.
International Transfers out of the UK
Following both Brexit and the Schrems II case (see the EU data protection section for more information), the IC has prepared the UK International Data Transfer Agreement (IDTA), which is the UK equivalent of the EU SCCs and is used for transfers of personal data out of the UK. The IC has also prepared a UK Addendum, which can be used alongside the EU SCCs to adapt them for UK international transfers, enabling companies to avoid needing both the EU SCCs and the IDTA (where appropriate). As noted in the EU data protection section, transfer risk assessments are also required when considering transfers of personal data out of the UK. For transfers from the UK to the USA, the UK Secretary of State for the Department for Science, Innovation and Technology issued regulations on September 21, 2023 creating the UK-US Data Bridge. The UK-US Data Bridge is essentially an extension of the DPF (see the EU section for further information), enabling transfers between the UK and US for those US entities that have self-certified under the DPF and this new DPF extension. The UK-US Data Bridge will become effective from October 12, 2023. It is important to remember that the DPF and this UK extension do not grant adequacy to all US entities; rather, they apply only to those US entities that are eligible for, and have then self-certified under, the DPF regimes.
Data Protection law and Competition law overlap
The intersection between data protection law and competition or antitrust law is becoming an increasingly relevant consideration, particularly in light of the EU’s DMA and the pending DSA. While both of these are EU laws, they may have indirect (if not direct) effect on companies within the UK; where they do not, the UK Competition Act 1998 will apply, particularly if actions taken by social media providers in relation to the handling of personal data are considered an abuse of a dominant position and, as such, anti-competitive. This will become more relevant in the metaverse, where competitors may need to share information in order for their users to move freely between platforms, but it may also be a live issue now in terms of access to information held, or services provided, by social media companies, which may be offered on different terms within their own group versus to non-group companies.
Children’s data processing
For data protection purposes in the UK, a child is someone under the age of 13, whereas the age at which an individual can generally enter into a legally binding contract in the UK is 18. The impact of these two thresholds differs for the use of social media: there may be questions as to whether someone under the age of 18 can legitimately agree to the terms of service of a social media provider, while the processing of the personal data of platform users aged between 13 and 18 may still be lawful under the UK GDPR.
Age-Appropriate Design Code (the “Children’s Code”)
To assist organizations processing children’s personal data, the IC issued the Children’s Code in September 2021. The Children’s Code requires that designers of apps or social media platforms make children’s privacy a primary consideration. It sets out 15 standards of age-appropriate design reflecting a risk-based approach, with a focus on default settings that ensure children have the best possible access to online services while minimizing data collection and use by default. Many social media platforms have implemented changes to their child privacy and safety measures as a result of the Children’s Code.
Legislative developments on the horizon
Online Safety Act 2023
The Online Safety Bill has now received Royal Assent and became the Online Safety Act 2023 (OSA) on October 26, 2023. The OSA is subject to a transition period in which different parts of the OSA will come into force on a phased basis. Limited provisions came into force on October 26, 2023, most of which relate to interpretation of the OSA, as well as those provisions empowering OFCOM (the regulator under the OSA) to prepare the codes of practice and guidance mandated by the OSA. Further elements of the OSA became effective on January 1, 2024, and further provisions will come into force on April 1, 2024. However, OFCOM’s enforcement of the law will still only happen in a phased approach, as outlined below. Passing the OSA shifts the focus from the theoretical to the practical, with organizations now having to interpret the OSA (and guidance as it is produced) and create practical solutions and assessments of risk and harm. In particular, the OSA obliges providers to identify and remove illegal content, particularly material relating to terrorism and child sexual abuse, and imposes additional duties of care on platforms that are likely to attract child users. OFCOM’s phased approach to implementation is being kept up to date on the OFCOM website. The current information contemplates a three-phased approach, outlined below. The key takeaway is that enforcement of the OSA is likely to begin only at the end of 2024 at the earliest (bearing in mind that 2024 is an election year in the UK).
Phase 1 – Focus on illegal harms
- November 9, 2023 – OFCOM launches its consultation on various Codes of Practice and Guidance relating to illegal harms.
- February 23, 2024 – OFCOM consultation closes.
- Autumn 2024 – OFCOM announces outcome of consultation and its final decisions.
- End of 2024 – OFCOM potentially begins investigations and enforcement actions relating to illegal harms.
Phase 2 – Child safety, pornography and the protection of women and girls
- December 2023 – OFCOM consultation on age assurance for pornographic content.
- Spring 2024 – OFCOM to publish and consult on codes of practice relevant to the protection of children online, as well as children’s access to online services and content.
- Spring 2025 – OFCOM to publish its outcomes regarding its consultations allowing time for providers of regulated services to undertake the relevant action and remedial measures to comply with the law.
- Summer 2025 – OFCOM potentially begins investigations and enforcement relating to child safety, pornography, and the protection of women and girls.
Phase 3 – Additional duties for categorized services
- Spring 2024 – OFCOM to provide the UK Government with its advice regarding categorization of providers of regulated services, as well as draft guidance on transparency obligations required by the OSA.
- Summer 2024 – Government to set the thresholds for categorization in secondary legislation.
- End of 2024 – OFCOM to publish a list of categorized services.
- During 2025 – OFCOM to publish further proposals and draft codes of practice on additional duties for regulated providers, including information about fraudulent advertising and guidance on transparency notices.
- End of 2025 – OFCOM potentially begins investigations and enforcement actions relating to the additional obligations and codes of practice published during late 2024 and 2025.
In terms of sanctions and enforcement powers, organizations not complying with their obligations under the OSA could face fines of up to the greater of £18 million or 10% of worldwide revenue for the previous full accounting period.
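To illustrate how that cap operates, the maximum penalty is simply the larger of the fixed amount and the revenue-based amount. The short sketch below models this (the function name and example figures are illustrative only; the £18 million and 10% thresholds are as stated above):

```python
def osa_max_fine(worldwide_revenue_gbp: float) -> float:
    """Illustrative calculation of the OSA penalty cap: the greater of
    GBP 18 million or 10% of worldwide revenue for the previous full
    accounting period."""
    FIXED_CAP_GBP = 18_000_000
    REVENUE_SHARE = 0.10
    return max(FIXED_CAP_GBP, REVENUE_SHARE * worldwide_revenue_gbp)

# For a company with GBP 500m worldwide revenue, the revenue-based
# figure (GBP 50m) exceeds the fixed cap, so it governs; for smaller
# companies the GBP 18m figure is the relevant maximum.
print(osa_max_fine(500_000_000))  # 50000000.0
print(osa_max_fine(100_000_000))  # 18000000
```

In other words, the fixed £18 million figure acts as a floor on the cap, so the revenue-based measure only bites for larger organizations.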
Digital Markets, Competition and Consumer Bill (Bill)
The headline-grabbing change in the Bill is a wide-ranging new competition regime for digital markets that will put the Competition and Markets Authority (CMA) at the heart of the UK Government’s policy to regulate large digital platforms. This is a step-change in the UK regulatory landscape that provides for a proactive and interventionist regulatory regime—similar to the regulation of financial institutions and utilities—for some of the major Big Tech companies. The Bill proposes substantial reforms to UK competition law more generally that impact all sectors, not just digital markets. The CMA’s enforcement powers to intervene in anticompetitive conduct are expanded, and reforms proposed to the UK’s merger control regime introduce new ways in which the CMA can intervene in M&A deals. Importantly, the Bill provides the CMA with an explicit ability to intervene in conduct that takes place outside the UK and to review M&A deals that have a very limited impact on the UK economy. Finally, the CMA has a whole new suite of tools to enforce consumer law, including the ability to impose fines of up to 10% of global turnover on companies that break the law. This means that the Bill substantially strengthens the role and powers of the CMA for the future. The timing of the adoption of the Bill will largely depend on Parliament. However, according to an official from the Digital Markets Unit, the Bill may not receive royal assent until spring 2024.
Advertising
Some of the key findings of the House of Commons’ Digital, Culture, Media & Sport Committee’s report on influencer culture are that the rapid growth of social media influencing has outstripped advertising regulation, meaning that further reform and stronger enforcement powers are urgently required. The committee is currently considering giving the ASA statutory powers to enforce the CAP Code in order to improve compliance.
Changes to UK data protection laws: the draft Data Protection and Digital Information (No.2) Bill (Data Reform Bill)
The Data Reform Bill has been tabled by the UK Government and is currently going through House of Commons review (having had two readings in Parliament already). The Data Reform Bill aims to clarify, amend and in some cases relax current UK data protection obligations for businesses. Of particular importance to social media platforms is that the Data Reform Bill proposes to raise potential fines for infringements of the PECR (currently set at a maximum of £500,000) to be in line with the UK GDPR (£17.5 million or 4% of global turnover, whichever is higher), which will impact how social media companies deploy cookies to track users (both on- and off-platform). The changes seek to reduce barriers to responsible innovation in a digital economy, and so appear more pro-business, but apply only in relation to the UK. Some of these pro-business proposals are also being met with concern outside the UK, in particular in the EU, where some privacy activists are calling for a reexamination of the UK adequacy determination if the Data Reform Bill is passed as is. If this happens, it would have consequences for, among other issues, data transfers between the UK and EU. With general elections due in the UK during 2024, a “carry-over” motion has been passed by the current UK government extending the review of the Data Reform Bill by 80 days, meaning that review and consideration of the Bill must be completed by December 12, 2024.
Further regulation of AI technology
The UK Government has published a National AI Strategy with a long-term view of establishing a UK framework for the regulation of AI, and changes to legislation affecting the rollout of AI technology and solutions are anticipated. In a further update, on February 6, 2024, the UK government responded to an AI Regulation White Paper consultation and concluded that it “will legislate when we are confident that it is the right thing to do”. In essence, the government considers that, at least for now, the current regulatory framework and regulators are sufficient to address AI issues. This will remain an area to watch.
Increased use and development of the metaverse
We have already seen changes being implemented to prepare for the advent of the metaverse in the IP sphere to ensure that trademarks are appropriately protected in the metaverse; for example, it is now possible to register a trade mark in the following classes: online non-downloadable virtual goods and NFTs (class 42) and financial services, including digital tokens (class 36).
Contributors
Matthew Gough
Partner and Head of the UK Consumer Law Team, Cardiff
E: matthewgough@eversheds-sutherland.com T: +44 292 047 7943
Paula Barrett
Partner and Co-Lead, Global Cybersecurity and Data Privacy, London
E: paulabarrett@eversheds-sutherland.com T: +44 207 919 4634
David Wilkinson
Partner and Head of UK IP, London
E: davidwilkinson@eversheds-sutherland.com T: +44 20 7919 0775
Penelope Jarvis
Legal Director (South African qualified), London
E: penelopejarvis@eversheds-sutherland.com T: +44 207 919 4684
© Eversheds Sutherland. All rights reserved. Eversheds Sutherland is a global provider of legal and other services operating through various separate and distinct legal entities. Eversheds Sutherland is the name and brand under which the members of Eversheds Sutherland Limited (Eversheds Sutherland (International) LLP and Eversheds Sutherland (US) LLP) and their respective controlled, managed and affiliated firms and the members of Eversheds Sutherland (Europe) Limited (each an "Eversheds Sutherland Entity" and together the "Eversheds Sutherland Entities") provide legal or other services to clients around the world. Eversheds Sutherland Entities are constituted and regulated in accordance with relevant local regulatory and legal requirements and operate in accordance with their locally registered names. The use of the name Eversheds Sutherland, is for description purposes only and does not imply that the Eversheds Sutherland Entities are in a partnership or are part of a global LLP. The responsibility for the provision of services to the client is defined in the terms of engagement between the instructed firm and the client.