Current laws and regulations relevant to social media
This section lists various EU laws that may be relevant to social media companies. EU laws apply in every EU member state and may be supplemented by local national law; as such, all of the laws set out below will be applicable in each member state.
EU e-Commerce Directive
The e-Commerce Directive is a set of rules on commercial communications for online contracting. Under the e-Commerce Directive, online intermediaries are exempt from liability for hosting or transmitting information that may infringe legal rights under three exemptions. Mere conduit service providers, which are only passively involved in the transmission of data, are not liable provided they do not initiate the transmission, do not select the receiver of the transmission and do not select or modify the information contained in the transmission. Caching providers are not liable for the automatic and temporary storage of information performed for the sole purpose of making that information’s onward transmission more efficient. Hosting providers are exempt from liability provided they have no actual knowledge of the unlawful activity or information, or of facts or circumstances from which that unlawful activity is apparent. It is important to note that each of these exemptions preserves the power of the courts to grant injunctive relief. In addition, caching and hosting providers can only avoid liability if, upon notice of infringement, they act expeditiously to remove or disable access to that information. Once again, this highlights the importance of an effective notice-and-takedown procedure.
EU Unfair Commercial Practices Directive (Directive)
The Unfair Commercial Practices Directive regulates unfair business practices and applies to online intermediaries, including social media companies. The Directive provides a general prohibition on unfair commercial practices. Some commercial practices that may be problematic on social media platforms are hidden advertising, including misleading influencer marketing; unfair standard contract terms; practices related to in-platform purchases, such as virtual items; and commercial practices put in place by third-party traders through social media platforms, including fake or misleading user reviews or endorsements. The Directive requires that all forms of commercial communication on social media platforms be clearly disclosed.
EU Platform to Business Regulation (P2B Regulation)
The P2B Regulation applies to certain social media platforms where they fall under the definition of an online platform provider by providing online intermediation services. Some of the obligations introduced by the P2B Regulation include that a platform’s terms and conditions (T&Cs) must be easily available to business users and be written in plain and intelligible language; if the platform decides to restrict, suspend or terminate its service to a business user, it must give the business user a statement of reasons; the platform’s T&Cs must describe the main parameters which affect the ranking of goods and services on the platform; and platforms must establish an internal complaint-handling system for business users.
EU Directive on Copyright in the Digital Single Market (Copyright Directive)
The Copyright Directive introduced obligations for online content-sharing platforms to prevent infringing content. This includes, for example, the use of technology to spot and prevent the uploading of infringing content. It grants publishers and news organizations the right to negotiate licenses and receive compensation when online platforms use snippets of their content. The Copyright Directive also allows for certain exceptions and limitations to copyright such as for education, text and data mining and research.
EU Terrorist Content Online Regulation (TCO Regulation)
The TCO Regulation was passed in April 2021 to tackle the spread of unlawful content promoting or facilitating terrorist activity. The TCO Regulation provides a legal framework to ensure that hosting service providers that make content available to the public address the potential misuse of their services for the dissemination of terrorist content online.
EU Digital Single Market Strategy
The aim of the EU Digital Single Market Strategy is to maintain the EU’s position in digital markets globally under the “three pillars” approach:
- Access to online products and services for consumers and businesses
- Shaping the environment for digital networks and services to grow and thrive
- Maximizing the growth potential of the European digital economy
In this respect, the European Commission has developed the DSA package, which includes the Digital Services Act (Regulation (EU) 2022/2065) (DSA) and the Digital Markets Act (Regulation (EU) 2022/1925) (DMA).
DSA
The DSA will create additional obligations for digital service providers, including specific obligations for “very large platforms,” which are defined as platforms with at least 45 million average monthly active users in the EU. The key dates for the DSA under the current timeline include:
- February 17, 2023. Platforms and search engines must publish their user numbers.
- August 25, 2023. Very large search engines and very large platforms, as designated by the European Commission, must comply with the DSA from this date.
- February 17, 2024. The DSA rules apply to all in-scope companies, and EU member states must have established their digital services coordinators (as defined under the DSA) by this date.
The DSA regulates the following areas:
- Content liability. With respect to content liability, the DSA will essentially reproduce the liability safe haven that is currently provided for in the e-Commerce Directive. Therefore, a hosting service, if it has actual knowledge of illegal activity or content, must act expeditiously to remove it or it will be held liable.
- Reporting obligations. Very large platforms will also have specific reporting obligations to enforcement authorities in certain cases—for example, when people’s safety is at stake.
- Accountability. Like the EU GDPR, the DSA will introduce elements of accountability and financial fines. Such fines can reach 6% of the provider’s annual turnover, which is even higher than those that can be issued under the EU GDPR. This shows that the DSA is not just an invitation for platforms to include mechanisms against hate speech; it actually creates binding obligations, with potential fines at stake.
- Notification systems. Online platforms must put mechanisms in place to enable any individual or entity to notify them of the presence of illegal content. Such mechanisms must be easy to access and user friendly. The DSA also introduces the notion of “trusted flaggers” for very large platforms. The platforms will have to treat notifications from trusted flaggers as a priority based on an internal complaints-handling system that they must put in place.
- Information and transparency obligations. The DSA also increases information and transparency obligations, including, for example, the obligation for very large platforms to provide transparency over the main parameters of the decision-making algorithm that is used to offer content. This is in response to the aforementioned concern over freedom of speech.
DMA
The DMA targets large technology companies and imposes new obligations on “gatekeepers” to ensure smaller businesses are treated fairly. The purpose of the new legislation is to encourage competition in digital markets by prohibiting certain commercial practices and requiring bigger players to adhere to positive obligations in order to promote competitiveness, such as providing fair access to application developers. Key dates for the DMA under the current timeline include:
- May 1, 2023. Potential Gatekeepers had 2 months to identify themselves to the European Commission if they met the DMA thresholds.
- July 3, 2023. The European Commission has 45 working days to assess whether the companies that identified themselves as potential Gatekeepers actually meet the thresholds and, if they do, to formally designate them as Gatekeepers for the purposes of the DMA.
- September 6, 2023. The European Commission will notify Designated Gatekeepers following its assessment. Companies notified as Designated Gatekeepers have six months from the date of notification to comply with the DMA.
- March 2024. Gatekeepers are required to publish their DMA compliance reports.
On September 5, 2023, the European Commission published the following list of entities as Designated Gatekeepers for the purposes of the DMA:
Alphabet Inc.
- Google Play
- Google Maps
- Google Shopping
- Google Search
- YouTube
- Android Mobile
- Alphabet’s online advertising service
- Google Chrome
Amazon.com Inc.
- Marketplace
- Amazon Advertising
Apple Inc.
- AppStore
- iOS
- Safari
ByteDance Ltd.
- TikTok
Meta Platforms, Inc.
- Facebook Marketplace
- WhatsApp Messenger
Microsoft Corporation
- Windows PC OS
Data protection laws applicable to social media
The proliferation of data protection laws globally has been happening at an increasing pace, particularly since the introduction of the EU’s General Data Protection Regulation on May 25, 2018. Given the nature of a social media platform, data protection law will always be a critical consideration in a social media environment. This section looks specifically at current data protection laws.
EU General Data Protection Regulation (EU GDPR)
The EU GDPR regulates privacy in the EU and the European Economic Area (EEA) (the EEA comprises all EU member states as well as Iceland, Liechtenstein and Norway) and affects social network operators by regulating the legal bases and purposes on which they can process users’ personal data, their transparency obligations, and the rights that must be granted to individuals with regard to their personal data.
EU ePrivacy Directive (ePD)
The ePD, which has been amended several times since it was enacted, regulates the confidentiality of communications and the use of personal data for marketing communications. The ePD regulates providers of electronic communication services, which must provide secure services and inform subscribers whenever there is a risk to security.
European Data Protection Board (EDPB)
While not a law itself, the EDPB (constituted under the EU GDPR) plays an increasingly important role in EU data protection law through the guidance it issues and, perhaps most importantly, through its input into member state decisions on data protection matters, which are referred to it when there is disagreement among EU data protection supervisory authorities (SAs) on the actions of controllers within the EU. Specifically, the EDPB can issue binding decisions in response to disputes between SAs, which must then be implemented by the relevant lead SA in the member state where the issue arose.
International Data Transfers outside of the EU
Transfers of personal data outside the EU are only allowed where certain conditions are met. One of those is where a country has been “deemed adequate” by the European Commission. This means that the laws in that jurisdiction are deemed by the European Commission to provide the same level of protection for personal data as that given by EU member states. If a country does not have an adequacy decision, then there are certain mechanisms that can be used to enable data transfers, the most commonly used of which are the EU standard contractual clauses (SCCs). Further detail on some of these elements is set out below:
- Countries deemed adequate by the EU - Andorra, Argentina, Canada (commercial organizations only), Faroe Islands, Guernsey, Israel, Isle of Man, Japan, New Zealand, Republic of Korea, Switzerland, the UK and Uruguay.
- US and EU Adequacy - The Trans-Atlantic Data Privacy Framework (DPF) was agreed between the US and the European Commission in March 2022 and adopted into law by the EU in July 2023. As a result, transfers of personal data from the EU to US organizations that participate in the DPF will be considered “adequate”.
- EU SCCs – The European Commission published updated SCCs following the introduction of the GDPR and in light of the outcome of the Schrems II decision.* Using the EU SCCs, together with conducting the appropriate assessments and taking any mitigation measures identified in those assessments, will enable lawful transfers out of the EU to non-adequate jurisdictions.
Post Schrems II, companies are also required – where there is no adequacy decision – to undertake transfer risk assessments covering both the need for the transfer and the country to which the personal data is being transferred. This is to assess whether that country’s laws are “essentially equivalent” to those of the EU for the protection of personal data. If a company finds in its assessment that the third country does not provide essentially equivalent protection, it must consider supplemental measures to improve the protection of the transferred personal data. This can be a particularly burdensome requirement at the best of times, but it may become more onerous when using social media platforms, which usually operate from a global footprint with servers around the world, and where the interplay with personal data that is controlled independently by the social media platform may be very hard to determine. This will then have knock-on consequences in relation to international transfer considerations, such as the implementation of supplemental measures.

* Schrems II was a decision by the Court of Justice of the European Union that essentially struck down the EU-US Privacy Shield mechanism for international transfers (a precursor to the DPF). The decision also cast doubt on the suitability of the then-standard contractual clauses that were the default mechanism for international personal data transfers where a jurisdiction was not considered “adequate” for data protection purposes by the EU.
Legislative developments on the horizon
Regulation Proposal on Child Sexual Abuse Material (CSAM)
This regulation will impose a range of obligations aimed at the detection and removal of CSAM, including risk assessments and mitigation measures, detection and removal orders, reporting, and the use of detection technology. The regulation targets hosting services, interpersonal communication services (i.e., messaging services), app stores and internet access services.
AI
The EU AI regulation is a landmark piece of proposed legislation that is expected to be adopted in 2023 and introduces a tiered regulation of AI systems under which certain use cases are prohibited and others are subject to substantial compliance obligations. Prohibited practices include the use of “subliminal techniques” beyond a person’s consciousness in order to materially distort a person’s behavior, and AI systems that manipulate the vulnerabilities of a specific group of persons due to personal characteristics. “High risk” AI covers a wide range of sectors, such as law enforcement and biometrics, and in such instances AI providers must adhere to burdensome technical and transparency requirements and conformance assessments. Furthermore, the AI Liability Directive will introduce a new standard of tortious liability and disclosure obligations for AI providers and will harmonize rules across the EU.
The metaverse
The EU has taken an active stance in the digital spaces—the AR and VR sectors—and has launched initiatives to prepare for this nascent industry’s maturity. European Commissioner Thierry Breton has stated, “This new virtual environment must embed European values from the outset... Private metaverses should develop based on interoperable standards and no single private player should hold the key to the public square or set its terms and conditions.” While the DSA and DMA will provide regulators with new tools to police digital spaces, the European Commission is expected to publish a Metaverse Regulation this year covering issues such as network infrastructure taxes, digital rules following the DSA and DMA, and safety and interoperability measures.
Data protection
The European Commission’s Work Programme 2023 includes initiatives concerning digital enforcement and enhanced data use. Proposals to amend consumer protection cooperation rules, make updates to harmonize the approach by data protection authorities enforcing the EU GDPR and set up a “common European mobility data space” are to be examined. In addition, it has been acknowledged for some time now that the ePD is outdated. In 2017, the European Commission proposed text for a new e-Privacy Regulation that, when in force, will repeal the ePD. The e-Privacy Regulation is still in draft, having gone through numerous rounds of negotiation. Even after six years it is still unclear what the e-Privacy Regulation will provide or when it will come into law, but areas of focus over the past six years have been on cookies and other tracking technologies, including new technologies such as instant messaging apps, Voice over Internet Protocol platforms and the Internet of Things. These will inevitably be relevant to social media, so a watching brief on this will be important. Another key consideration within the e-Privacy Regulation is the potential for the enforcement to include the one-stop-shop mechanism as set out in the EU GDPR for regulatory purposes. It is also highly likely that the level of fines for non-compliance with the e-Privacy Regulation will mirror those set out in the EU GDPR.
Contributors
Marie McGinley
Partner, International Head of Technology, Dublin
E: mariemcginley@eversheds-sutherland.ie T: +353 1 6441 457
Olaf van Haperen
European TMT Sector Head and Partner, Technology & Data Protection, Rotterdam
E: olafvanhaperen@eversheds-sutherland.com T: +31 1 02 48 80 58
Vincent Denoyelle
Partner, International Head of Media, Paris
E: vincentdenoyelle@eversheds-sutherland.com T: +33 155 7 34 21 2
© Eversheds Sutherland. All rights reserved. Eversheds Sutherland is a global provider of legal and other services operating through various separate and distinct legal entities. Eversheds Sutherland is the name and brand under which the members of Eversheds Sutherland Limited (Eversheds Sutherland (International) LLP and Eversheds Sutherland (US) LLP) and their respective controlled, managed and affiliated firms and the members of Eversheds Sutherland (Europe) Limited (each an "Eversheds Sutherland Entity" and together the "Eversheds Sutherland Entities") provide legal or other services to clients around the world. Eversheds Sutherland Entities are constituted and regulated in accordance with relevant local regulatory and legal requirements and operate in accordance with their locally registered names. The use of the name Eversheds Sutherland, is for description purposes only and does not imply that the Eversheds Sutherland Entities are in a partnership or are part of a global LLP. The responsibility for the provision of services to the client is defined in the terms of engagement between the instructed firm and the client.