What’s Happening in the World?

While the field of data protection is developing at an accelerating pace in our country, developments worldwide remain on the radar of the Personal Data Protection Authority (“Authority”).

As past examples have repeatedly shown, the Authority keeps pace with the global agenda, particularly the European General Data Protection Regulation (“GDPR”), and strives to meet the requirements of the fast-moving data privacy world.

As GRC LEGAL, we closely follow the global agenda and, with this bulletin, present a selection of current news for your information.

The news below covers October-November 2023.

UK’s Online Safety Bill

The bill, which aims to make the UK “the safest place in the world to be online”, received royal assent and became law. However, its content remains controversial, particularly its potential impact on encrypted messaging.

The bill, which had been in the works for years, sought to impose new obligations on how technology firms design, operate and moderate their platforms. Specific harms the bill seeks to address include underage access to online pornography, “anonymous trolls”, scam adverts, the non-consensual sharing of intimate deepfakes, and the dissemination of child sexual abuse material and terrorism-related content.

Although it has become law, online platforms are not expected to comply with all of their obligations under the Online Safety Act immediately. Ofcom, the UK communications regulator responsible for implementing the law, plans to publish implementing rules in three stages:

The first covers how platforms should respond to illegal content, such as terrorism and child sexual abuse material. The guidance published on 9 November, which includes recommendations on how these obligations should be met, once again prioritises children’s online safety.

The second and third phases cover platforms’ obligations on child safety and preventing minors from accessing pornography, as well as producing transparency reports, preventing fraudulent advertising and providing “empowerment tools” to give users more control over the content they are shown. Further consultations on other obligations relating to child safety will take place next spring. Ofcom said it aims to publish a list of “categorised services” – large or high-risk platforms that will be subject to obligations such as producing transparency reports – by the end of next year.

In the event of a breach of the law, the companies concerned can expect fines of up to £18 million (approximately $22 million) or 10 per cent of their global annual turnover, whichever is higher, while senior managers may also face imprisonment.
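As a purely illustrative aid, the sketch below shows how the “whichever is higher” cap works in practice; the turnover figures and the function name are our own hypothetical examples, not anything defined in the Act.

```typescript
// Illustrative only: the maximum fine under the Online Safety Act is the greater of
// £18 million or 10% of global annual turnover. Turnover figures below are hypothetical.
const FIXED_CAP_GBP = 18_000_000;

function maxFineGBP(globalAnnualTurnoverGBP: number): number {
  const turnoverBased = globalAnnualTurnoverGBP * 0.1;
  return Math.max(FIXED_CAP_GBP, turnoverBased);
}

console.log(maxFineGBP(2_000_000_000)); // 200000000 -> 10% of turnover applies
console.log(maxFineGBP(50_000_000));    // 18000000  -> the fixed £18m cap applies
```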

In a statement on the Online Safety Act, UK Home Secretary Suella Braverman said: “The strongest protections in the Online Safety Act are for children. Social media companies will be held accountable for the appalling level of child sexual abuse that occurs on their platforms and our children will be safer,” adding: “We are determined to tackle child sexual abuse wherever it occurs. This law is a big step forward.”

Online Safety Law Creates Controversy!

Although the law was welcomed by child safety advocates, it is also a controversial piece of legislation with a range of opponents, from encrypted messaging apps to the Wikimedia Foundation. Messaging apps such as WhatsApp and Signal have objected to a clause that allows Ofcom to require tech companies to identify “publicly or privately transmitted” child sexual abuse content, which they say fatally undermines their end-to-end encryption. Service providers have suggested that they would rather leave the UK than comply with these rules.

Meanwhile, the Wikimedia Foundation said that the bill’s strict obligations to protect children from inappropriate content could create problems for a service like Wikipedia, which chooses to collect minimal data about its users.

GRC LEGAL Commentary

With this long-awaited law, the United Kingdom has taken a critical step for the digital age regarding children’s safety in online services. While such laws aim to protect children from online risks and limit harmful content, their implementation must strike a delicate balance in the case of applications and communication tools where users have legitimate expectations of privacy.

Although a safe environment on online platforms is extremely important, particularly for detecting and combating online crimes directed against children, fundamental rights such as the protection of personal data must be observed while these safeguards are put in place. With the serious sanctions it introduces, the law clearly expects decisive action on safety concerns, and it remains to be seen how implementation will take shape in light of the further guidance expected to be published.

Meta x Ad-free Subscription

Meta will offer an ad-free subscription version of Instagram and Facebook in the European Union, the European Economic Area and Switzerland from November.

Meta’s move comes after years of privacy litigation, sanctions and court rulings in Europe. These cases left Meta unable to claim a legitimate interest in tracking and profiling users for targeted advertising. Although Meta currently continues to operate without a valid legal basis, contrary to the outcome of those rulings, it announced this summer that it would switch to a system based on user consent.

Under the GDPR, the only legal basis left to Meta for tracking and profiling for targeted advertising purposes was to obtain users’ consent. The ad giant’s “pay us or be tracked” subscription offer is expected to anger privacy advocates, because the new model effectively gives users the choice to “pay, or pay with your privacy”.

According to Meta’s blog post, the ad-free subscription for Facebook and Instagram costs €9.99/month on the web and €12.99/month on iOS or Android. From March 2024, users will also be charged an additional €6/month on the web and €8/month on iOS or Android for each additional account listed in their Account Centre. The cost of Meta’s ad-free subscription can therefore rise quickly for those with multiple accounts on Meta’s social networks. Even for a user with just a single Facebook or Instagram account, an ad-free subscription works out at nearly €120 per year on the web and over €155 per year on mobile.
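To make the arithmetic above concrete, the sketch below estimates the annual cost of the subscription from the prices quoted in Meta’s blog post, including the surcharge for additional Account Centre accounts that applies from March 2024; the function name and the example account counts are our own illustrative assumptions.

```typescript
// Illustrative annual cost of Meta's ad-free subscription, based on the prices
// quoted above: a base monthly fee, plus (from March 2024) a surcharge for each
// additional account listed in the Account Centre.
type Platform = "web" | "mobile";

const BASE_MONTHLY_EUR: Record<Platform, number> = { web: 9.99, mobile: 12.99 };
const EXTRA_ACCOUNT_MONTHLY_EUR: Record<Platform, number> = { web: 6, mobile: 8 };

function annualCostEUR(platform: Platform, additionalAccounts: number): number {
  const monthly =
    BASE_MONTHLY_EUR[platform] +
    additionalAccounts * EXTRA_ACCOUNT_MONTHLY_EUR[platform];
  return monthly * 12;
}

console.log(annualCostEUR("web", 0));    // ~119.88 -> nearly €120/year for one web account
console.log(annualCostEUR("mobile", 0)); // ~155.88 -> over €155/year for one mobile account
console.log(annualCostEUR("mobile", 2)); // ~347.88 -> costs climb quickly with extra accounts
```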

Meta relies on a ruling by the European Court of Justice allowing an “appropriate fee” to be charged if “necessary” for an equivalent service (i.e. one that does not involve tracking and profiling). The new offer is therefore expected to be assessed against the criteria of “necessity” and an “appropriate fee”.

The Irish Data Protection Commission (DPC), Meta’s lead regulator in the European Union under the GDPR, said: “Meta notified the DPC on 27 July of its intention to implement an alternative, consent-based model that offers users a choice between platform versions funded by advertising and subscription versions where they will not be exposed to targeted advertising in exchange for payment.” Meta’s consent model was originally planned for February 2024, but at the DPC’s direction this date was brought forward to November 2023. The DPC asked for the changes to be implemented on the platforms as soon as possible because, in light of its previous findings, it did not consider the legal bases Meta relied on to process the data to be valid.

The DPC, in its role as Lead Supervisory Authority for Facebook and Instagram and acting in consultation with other European supervisory authorities, is conducting a detailed regulatory assessment of Meta’s proposed consent-based model. The assessment is expected to be finalised shortly. The Norwegian Data Protection Authority has also expressed concerns about the subscription and doubts about the validity of consent obtained in this way.

In addition to its obligation to comply with the GDPR, Meta also faces the EU Digital Services Act (“DSA”), which sets conditions for targeted advertising by major platforms, and the Digital Markets Act (“DMA”), which imposes restrictions on the use of personal data for advertising purposes.

It is therefore expected that not only data protection authorities but also the European Commission, which oversees the relevant processes, will be involved in deciding on the validity of Meta’s subscribe-or-be-tracked offer.

Meta is in the Commission’s crosshairs over its approach to the DSA; the European Union’s executive recently asked Meta for more information on its approach to content threats arising from the Israel-Hamas war and election security. It remains to be seen whether the EU will apply the same close scrutiny to Meta’s ad tracking proposal.

In its blog post, Meta argues that giving people the option to pay for their privacy or agree to be tracked “balances the requirements of European regulators by giving users a choice and allows it to continue to serve all people in Europe, the European Economic Area and Switzerland”.

The fact that the subscription is offered to people aged 18 and over raises questions about how it will comply with the requirements in the DSA and DMA that children’s data should not be processed for targeted advertising. “In this evolving regulatory environment, we continue to explore ways in which we can provide a useful and responsible advertising experience for young people,” Meta said.

GRC LEGAL Commentary

Meta, which brings some of the world’s most popular social media platforms under one roof and frequently makes headlines for profiling its users, was barred from targeted advertising without consent by a decision of the European Data Protection Board (“EDPB”). Although Meta aims to overcome this obstacle by offering a paid subscription option, the move effectively eliminates the privacy expectations of users who do not pay the subscription fee.

Especially in light of the global economic downturn, it is hard to argue that users freely consent to targeted advertising and profiling: requiring users to pay for every account they hold with Meta, including additional accounts, is not only unreasonable but will also push users towards the free version. Even if the Commission finds Meta’s practice compliant with the GDPR, it will clearly face obstacles under the DSA and DMA. Although these two laws already impose obligations on Meta regarding targeted advertising, introducing a paid subscription also creates what appears to be a legalised privacy violation, one that may evolve into a far more problematic issue in the future.

The Data Act

The Data Act was formally adopted by a large majority in a plenary vote of the European Parliament in November, marking the final step in the legislative process.

Once the text is published in the Official Journal, a 20-month transition period will begin. The regulation, proposed by the European Commission in February 2022, is part of the European Union’s Data Strategy package and is the last piece of that package to be adopted.

The regulation aims to create a single market for data, focusing on facilitating the voluntary sharing of data by individuals and businesses and harmonising the conditions for the use of certain public sector data, alongside the Data Governance Act already in force. According to European statistics, 80% of industrial data is never used; the Data Act is expected to bring more of this data into use and to generate an additional €270 billion of GDP by 2028.

According to the proposal, the Data Act “ensures that users of a product or related service in the European Union have timely access to the data generated by the use of that product or related service and that those users can use the data, including sharing it with third parties of their choice.” In short, it creates new requirements clarifying who can access and use the data generated by connected products and related services.

GRC LEGAL Commentary

The Data Act is likely to lead to significant changes in data management in the European Union and aims to standardise this area. If the creation of a single market for data in Europe is achieved, it could reduce the inconsistency between the data management rules and standards that currently apply in different countries and provide a more consistent environment for businesses.

Furthermore, giving users control over access to the data generated by the use of a product or service, and the ability to share this data with third parties of their choice, is a promising development that will make data use more transparent and user-friendly. However, it should not be overlooked that one of the most important considerations under this innovative legislation, which supports and facilitates voluntary data sharing, is the protection of personal privacy. The transition period will need to run its course before the effects of the Data Act can be fully evaluated.

YouTube x Ad Blocking

Privacy advocates argue that YouTube’s restrictions on ad blockers violate the European Union’s online privacy laws.

As YouTube tightens its restrictions on ad blockers, privacy advocates in the European Union argue that regulation could put a stop to them. Privacy expert Alexander Hanff filed a complaint with the Irish Data Protection Commission (DPC) in October, arguing that YouTube’s ad blocker detection system violates privacy and contravenes European Union law.

The fight against ad blocker detection is nothing new, but YouTube’s global effort to stop ad blockers has brought the issue back into the spotlight. According to The New York Times, sites like YouTube can detect ad blockers by serving JavaScript code that checks whether anything on the page has changed, or by detecting whether elements needed to load an advert have been blocked.
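For readers curious how such detection works in practice, the sketch below shows a generic “bait element” technique of the kind described above: the page inserts an element styled like an advert and checks whether a blocker has hidden it. This is a hypothetical, simplified example, not YouTube’s actual detection code.

```typescript
// Generic ad-blocker detection sketch (runs in the browser); not YouTube's implementation.
// Many blockers hide or remove elements whose class names match ad filter lists,
// which a page can observe after rendering.
function detectAdBlocker(): Promise<boolean> {
  return new Promise((resolve) => {
    const bait = document.createElement("div");
    bait.className = "adsbox ad-banner"; // class names commonly targeted by filter lists
    bait.style.height = "10px";
    document.body.appendChild(bait);

    // Give the blocker's cosmetic filters a moment to act, then inspect the bait element.
    setTimeout(() => {
      const blocked =
        bait.offsetHeight === 0 ||
        window.getComputedStyle(bait).display === "none";
      bait.remove();
      resolve(blocked);
    }, 100);
  });
}

detectAdBlocker().then((blocked) => {
  if (blocked) {
    // A site could react here, e.g. by showing a prompt instead of the video.
    console.log("Ad blocker appears to be active.");
  }
});
```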

YouTube began testing ad blocker restrictions on a small scale in June, but later confirmed to The Verge that it was ramping up those efforts. As a result, more users running ad blockers will be unable to watch videos on the platform: instead of showing the video, YouTube displays a prompt encouraging them to allow adverts on YouTube or subscribe to YouTube Premium. A report published by Wired states that users are searching for ad blockers unaffected by YouTube’s restrictions, and that ad blockers are being downloaded and uninstalled in record numbers. YouTube maintains that ad blockers violate the platform’s terms of service and prevent creators from earning revenue from ads.

Hanff first approached the European Commission (“Commission”) in 2016 regarding the use of ad blocker detection tools. In response, the Commission confirmed that the scripts used to detect ad blockers fell within the scope of Article 5.3 of the ePrivacy Directive, a rule that requires websites to obtain consent from the user before storing or accessing information on a user’s device, such as cookies.

The Commission said at the time: “Article 5.3 does not limit itself to a specific type of information or technology, such as cookies. Article 5.3 will also apply to the storage of scripts by websites on users’ terminal equipment for the purpose of detecting whether users have installed or used ad blockers.” However, this does not appear to have had a significant impact on how websites detect ad blockers, and the Commission appeared to reverse its position in its proposed 2017 reform of the privacy rules, stating that website providers should be able to check whether a user has an ad blocker installed without their consent.

Hanff is not the only advocate opposing YouTube’s ad blocker restrictions. Patrick Breyer, a German digital rights advocate and Member of the European Parliament, wrote on Mastodon: “YouTube wants to force us into surveillance advertising and tracking with an anti-ad blocking wall.” Breyer also wrote that he has asked the European Commission whether ad blocker detection systems are legal under the ePrivacy Directive.

YouTube spokesperson Christopher Lawton responded to Hanff and Breyer’s challenges by repeating the statement given to The Verge last month, saying that YouTube has launched a global effort to crack down on ad blockers. Lawton added that the company “will co-operate fully with any questions or queries from the DPC”.

If the European Commission decides that YouTube’s ad blocker detection system breaches the EU’s ePrivacy Directive, it could fine the platform and force it to change the feature. It is unclear at this stage how the Commission will respond to Hanff’s complaint, but it is unlikely to result in any changes to the current system for users in the United States.

For the moment, Hanff is not backing down. “I’ve been fighting for almost two decades for stronger protection of privacy and data protection rights,” he said. “If YouTube continues to think they can succeed in continuing to install spyware on our devices, I’m going to try to take them down.”

GRC LEGAL Commentary

While adverts expose many internet users to surveillance advertising and profiling, they also account for a significant portion of the revenues of social media platforms such as YouTube. This tension between privacy and revenue has attracted the attention of privacy experts.

Following the Commission’s confirmation that scripts used to detect ad blockers are subject to users’ prior consent under Article 5.3 of the ePrivacy Directive, attention has turned to what YouTube will decide to do about this practice.

The fine YouTube could face for breaching the ePrivacy Directive if the scripts continue to be used in the same way should clearly act as a deterrent. Even so, while measures may be taken within the borders of the European Union, there is a risk that no changes will be made to the system for users in countries outside its scope, such as the United States.