On 17 October 2024[i] the European Commission adopted the first implementing rules on the cybersecurity of critical entities and networks under the NIS2 Directive, in the form of an Implementing Regulation.[ii] The Regulation is set to come into force in late November, to be precise 20 days after its publication in the Official Journal – which took place on 7 November 2024.
The adoption of the Regulation also coincides with the last day of the deadline set for the EU Member States to transpose the NIS2 Directive into their national laws.
The implementing rules essentially detail measures pertaining to cybersecurity risk management, as well as the obligation to report “significant” incidents to national authorities across the bloc, imposed on companies providing digital infrastructures and services. Specifically, providers of digital services such as cloud computing service providers, data centre service providers, online marketplaces, online search engines and social networking platforms would fall under the scope.
The NIS2 Directive[iii] re-categorises and noticeably expands the previous scope, which initially covered the two categories of i) operators of essential services (OESs) and ii) relevant digital service providers (RDSPs), by classifying covered entities as either Essential Entities (EE) or Important Entities (IE).
EE includes sectors of energy, transport, finance, public administration, health, space, water supply and digital infrastructure such as cloud computing service providers and ICT management.
IE includes sectors of postal services, waste management, chemicals, research organisations, food processing, manufacturing and digital providers such as social networks, search engines and online marketplaces.
With micro and small entities in principle excluded from the scope, the Directive puts in place a size threshold: at least 250 employees, an annual turnover of €50 million or a balance sheet total of €43 million for EE entities, and at least 50 employees, an annual turnover of €10 million or a balance sheet total of €10 million for IE entities.
Nevertheless, an entity may still be considered as ‘essential’ or ‘important’ irrespective of its size, if it is the sole provider of a critical service for societal or economic activity in a given Member State, respectively a trust service provider or any central or regional government entity.
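As a rough illustration only, the size thresholds described above can be expressed as a small sketch (the helper name and figures are ours; size alone does not determine coverage, since the entity must also operate in a listed sector and the sole-provider, trust service and government exceptions apply regardless of size):

```python
def nis2_size_category(employees: int, turnover_eur: float, balance_sheet_eur: float) -> str:
    """Hypothetical sketch of the NIS2 size thresholds described above.

    Note: size alone is not decisive -- sector membership and the
    exceptions for sole providers, trust service providers and
    government entities are not modelled here.
    """
    # Large-enterprise thresholds associated with the EE list
    if employees >= 250 or turnover_eur >= 50_000_000 or balance_sheet_eur >= 43_000_000:
        return "EE-size threshold met"
    # Medium-enterprise thresholds associated with the IE list
    if employees >= 50 or turnover_eur >= 10_000_000 or balance_sheet_eur >= 10_000_000:
        return "IE-size threshold met"
    return "below size thresholds (in principle out of scope)"

# A mid-sized provider with 120 employees and €20m turnover meets the IE-size threshold.
print(nis2_size_category(120, 20_000_000, 8_000_000))
```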
Similar to the GDPR, the Directive requires Member States to impose penalties for non-compliance, the maximums of which differ per classification: at least €10 million or 2% of global annual turnover for the previous fiscal year, whichever is higher, for EE entities, and at least €7 million or 1.4% of global annual turnover for the previous fiscal year, whichever is higher, for IE entities.
Notably, the covered entities’ management bodies, such as the board of directors, would also be held liable for non-compliance.
On the other hand, the Swiss Information Security Act (Informationssicherheitsgesetz, ISG) applies primarily to the federal administration, cantonal authorities and their partner companies, and its revised version is set to come into force on 1 January 2025. Such partner companies may be active in sectors similar to those within the scope of the Directive in the EU, such as the financial and the information and communication sectors, as well as service providers and manufacturers of hardware and software products used by critical infrastructures.
Therefore, supplier companies would indirectly fall under the scope of the ISG, similar to the position under the Directive in the EU. Swiss entities forming part of a supply chain that ultimately targets EU-based entities covered by the Directive would as a result be affected by the requirements and obligations under both instruments.
Specifically, subsidiaries and branches of Swiss entities registered within the EU which fall under either the EE or the IE classification will have to comply with the Directive, including the requirement to register with the national authority of the relevant Member State, among other things. In this scenario, the parent or affiliated entity in Switzerland may also be indirectly caught by the Directive through the supply chain connection.
[i] See here https://ec.europa.eu/commission/presscorner/detail/en/ip_24_5342.
[ii] See here https://eur-lex.europa.eu/eli/reg_impl/2024/2690/oj.
[iii] See here https://eur-lex.europa.eu/eli/dir/2022/2555.
Following the judgment of 4 October 2024 of the Court of Justice of the European Union (CJEU)[i] in case C‑446/21, Maximilian Schrems v Meta Platforms Ireland Ltd (“Meta”), the scope of the collection of personal data on social media platforms, and the restrictions applicable to it in the context of targeted advertising in particular, came under strict scrutiny.
Here, the EU General Data Protection Regulation (GDPR) principles of data minimisation and purpose limitation were specifically examined.
Meta manages the provision of the online social network Facebook in the EU and is the controller within the meaning of the GDPR. The present case concerns data which Meta collects from Facebook users’ activities both on and outside Facebook, including data on visits to, and navigation patterns across, third-party websites and applications. For this, Meta uses cookies, social plug-ins and pixels embedded on the relevant websites for the purpose of targeted advertising.
The CJEU decision brings further clarity to the following:
. the principle of data minimisation under Art. 5(1)(c) GDPR covers all personal data collected by a controller from data subjects or third parties, whether on or outside the platform, for aggregation, analysis and processing in the context of targeted advertising; the retention time must at all times be restricted and the types of personal data distinguished. Furthermore, the principle applies irrespective of the legal basis used for the processing: even if a data subject has consented to targeted advertising, their personal data cannot be used indefinitely.
. Article 9(2)(e) GDPR, on processing of special categories of personal data, would need to be interpreted in a restrictive manner, whereby the mere mentioning of a fact by a data subject in a public setting should not easily give rise to any other information related to that particular fact being labelled as “manifestly made public” and hence legally permitted to be processed.
As a consequence of the CJEU ruling, any operator of a social media network platform or online advertisement company would need to restrict their data pool and put in place an effective data deletion policy.
[i] See here https://curia.europa.eu/juris/document/document.jsf;jsessionid=5CE53D5E3FCC1ABA77F2ACD5AAC2F038?text=&docid=290674&pageIndex=0&doclang=en&mode=req&dir=&occ=first&part=1&cid=1306139.
As a company, it is worth recognizing which tasks should be handled in-house and where the company is better off concentrating on its core business in order to increase efficiency and reduce costs. Particularly in IT, it makes sense to consider bringing in an external service provider, not least for information security reasons. And while companies and service providers like to focus on performance, both would do well to be able to answer the fundamental questions of data protection.
Outsourcing from a data protection perspective
Cloud providers, web hosts, agencies and call centres or IT support companies take on tasks in companies that also entail access to or the processing of personal data held by the company. According to the Federal Act on Data Protection (FADP), the processing of personal data can be transferred to a so-called processor. In this case, the company, as the so-called controller, remains responsible for ensuring that data protection is complied with. The company must ensure careful selection, appropriate instruction and necessary monitoring. The service provider has a reciprocal interest in delineating the duties transferred and clarifying which services are to be remunerated and how.
DPA – Data Processing Agreement
In practice, the data processing agreement (or order processing agreement under the GDPR) has become established, often abbreviated as DPA. Instead of a separate DPA, data protection can also be regulated in an annex to the contract, as is common in the Anglo-American region with the so-called Data Privacy Addendum, (also) DPA for short. More important than where it is regulated is that the responsibilities and obligations are regulated in accordance with the FADP or GDPR. Yet similar pitfalls present themselves time and again.
Important to regulate
The basic principle of both the FADP and the GDPR is to ensure that the processor only processes the transferred data in accordance with the instructions of the client. Suitable technical and organisational measures must be taken to ensure that the rights of the data subject are protected. Accordingly, in addition to the basic scope of the order, and thus the data processing, data security in particular must be determined. On the one hand, this should be appropriate to the risk and effective, and on the other hand, it must correspond to the state of the art. Depending on the sector, audits, pen tests and certifications may also be used for this purpose.
In principle, the processor processes personal data for the controller’s purposes, i.e. the company’s. As a service provider, the processor is therefore not permitted to process this personal data for its own purposes – otherwise it would become a controller itself (in some constellations also referred to as a “joint controller”). For such a change of purpose, a processor must be able to assert its own justification, primarily the express consent of the data subject.
Pitfalls in DPA
The issue of subcontracting, i.e. the use of so-called subcontractors, is often forgotten or overlooked. These must be disclosed before the contract is concluded and may only be engaged with the prior authorization of the company, i.e. the controller. In addition, when outsourcing abroad, the disclosure of personal data abroad and data security in general must also be considered (see also below on outsourcing in a US context).
As mentioned above, the company remains responsible as the controller. To this end, it may make sense to regulate support obligations with contractual agreements (e.g. obligations to cooperate) and/or to take specific organizational measures. This primarily concerns the rights of data subjects (information, rectification, erasure), but also instruments and obligations under the law (data protection violations, data protection impact assessments, etc.).
It is not uncommon for provisions on responsibility and liability, as well as on the resulting costs, to be missing – or to be unilaterally transferred. It seems sensible to adapt liability to the dynamics of the service relationship, in particular to how independently the service provider acts for the company. The costs in particular should be made transparent and are probably best orientated towards the polluter-pays principle.
Outsourcing in a US context
If data is disclosed abroad during outsourcing, it must also be checked whether the countries in which the data is processed guarantee an adequate level of data protection. This is primarily based on the decision of the Federal Council, i.e. the list of countries with an adequate level of data protection in accordance with the Swiss Data Protection Ordinance (DPO). If the service provider is located in, or processes the data in, a country that does not offer a level of data protection adequate compared to Switzerland, additional measures must be taken. Standard data protection clauses (also known as “Standard Contractual Clauses”, SCC) are the first instrument to consider.
The USA is a special case. With the invalidation of the EU-US Privacy Shield – and subsequently the Swiss-US Privacy Shield – as a result of the Schrems II ruling of the European Court of Justice in July 2020, the data-protection-compliant processing of confidential data by a US provider became more complex. In the meantime, SCCs were used as a workaround – without legal certainty as to whether this would be sufficient. Now the new Swiss-U.S. Data Privacy Framework offers adequate protection for personal data processed by certified U.S. companies. To this extent, the Federal Council has put the USA back on the list of countries with an adequate level of data protection and will in future allow the transfer of personal data from Switzerland to certified companies in the USA without additional safeguards. The Federal Council brought the corresponding amendment to the DPO into force on 15 September 2024.
On 22 May 2024 the Swiss Federal Council[i] decided on submitting further reforms in the realm of the anti-money laundering (AML) framework to the Parliament, with an aim to reinforce the competitiveness of the country both as a financial centre and a commercial hub.
These reforms, which are expected to come into force by early 2026, include the introduction of a non-public federal (transparency) register of beneficial owners. A simplified registration will be provided for not only associations and foundations but also sole proprietorships and limited liability companies. The register will be managed by the Federal Department of Justice and Police (FDJP).
Other proposals concern the AML due diligence obligations applicable to certain advisory activities, in particular legal advice. While professional secrecy is maintained, these obligations will apply to certain activities with a potentially increased money laundering risk, such as the founding and structuring of companies as well as real estate transactions.
Specifically, the following will be pivotal:
. The client’s identity must be verified and the beneficial owner and the object and purpose of the transaction or service must be identified;
. If the client, or the transaction or service, has a particularly high risk profile, it may be necessary to clarify the origin of the funds or to request additional explanations about the purpose of the requested transaction or service;
. The measures undertaken in connection with due diligence must be appropriately recorded.
In this respect, the responsibility for supervising the exercise of due diligence obligations by the affected lawyers and legal advisors will be vested upon the self-regulatory organisations (SROs).
Furthermore, additional organisational measures will be put in place concerning i) the circumvention of sanctions under the Embargo Act, ii) cash payments exceeding CHF 15,000 in precious metals trading and iii) cash payments of any amount in the real estate business.
On a different note, on 22 May 2024 the Federal Council[ii] launched a consultation, set to run until mid-September 2024, on the Cybersecurity Ordinance, which essentially outlines the implementation of the obligation to report cyberattacks on critical infrastructures, the national cybersecurity strategy and the duties of the Swiss National Cyber Security Centre (NCSC).
The Ordinance also specifies the entities exempted from the reporting obligation, namely those suffering cyberattacks which have no direct impact on the functioning of the economy or the well-being of the population. In addition, a general exemption would apply to companies with fewer than 50 employees and an annual turnover or annual balance sheet total of less than CHF 10 million, as well as to authorities that are responsible for fewer than 1,000 inhabitants.
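The general company exemption just described can be sketched as follows (a hypothetical helper under our own assumptions – in particular that the employee test and the financial test must both be met, which the final Ordinance text governs; the function name and example figures are invented):

```python
def reporting_exemption_applies(employees: int,
                                turnover_chf: float,
                                balance_sheet_chf: float) -> bool:
    """Sketch of the general company exemption from the cyberattack
    reporting duty described above (assumption: both the employee test
    and the financial test must be met; the draft Ordinance governs).
    Authorities are exempt separately when serving fewer than 1,000 inhabitants.
    """
    financially_small = turnover_chf < 10_000_000 or balance_sheet_chf < 10_000_000
    return employees < 50 and financially_small

# A 30-person firm with CHF 5m turnover would fall under the general exemption.
print(reporting_exemption_applies(30, 5_000_000, 4_000_000))  # True
```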
Lastly, on 15 May 2024 the Federal Council[iii] decided to initiate a consultation, running until early September 2024, on extending the international automatic exchange of information in tax matters (AEOI). Set to apply from 1 January 2026, the extension would concern the new AEOI regarding cryptoassets and the amendment of the standards for the automatic exchange of financial account information.
Notably, the OECD update to the common reporting and due diligence standards for financial account information (CRS) and the new cryptoasset reporting framework (CARF) was published in October 2022. While the amendments to the CRS clarify interpretation issues and take practical experience into account, the CARF regulates the handling of cryptoassets and their providers.
Subject to parliamentary approval, Switzerland thus intends to also implement the CARF with an intention to effectively address existing gaps in the tax transparency mechanism and to ensure equal treatment with respect to traditional assets and financial institutions.
[i] See here https://www.admin.ch/gov/en/start/documentation/media-releases.msg-id-101100.html.
[ii] See here https://www.ncsc.admin.ch/ncsc/en/home/dokumentation/medienmitteilungen/newslist.msg-id-101088.html.
[iii] See here https://www.admin.ch/gov/en/start/documentation/media-releases/media-releases-federal-council.msg-id-101030.html.
The EU Data Act,[1] aiming to regulate fair access to and use of data, entered into force in January 2024 following its publication in the Official Journal of the European Union. The Act is set to become applicable across the bloc as of September 2025.
Setting out rules for the use, access, availability and sharing of generated personal and non-personal data, the Act targets manufacturers of connected products and providers of related services irrespective of their place of establishment – all grouped under the umbrella term of ‘data holders’ either in the form of a natural or legal person.
In this context, ‘connected product’ is defined as “an item that obtains, generates or collects data concerning its use or environment and that is able to communicate product data via an electronic communications service, physical connection or on-device access, and whose primary function is not the storing, processing or transmission of data on behalf of any party other than the user”. In addition, the term ‘related services’ refers to “a digital service, other than an electronic communications service, including software, which is connected with the product at the time of the purchase, rent or lease in such a way that its absence would prevent the connected product from performing one or more of its functions, or which is subsequently connected to the product by the manufacturer or a third party to add to, update or adapt the functions of the connected product”.
Distinguished from the term ‘user’, ‘data recipient’ means “a natural or legal person, acting for purposes which are related to that person’s trade, business, craft or profession, other than the user of a connected product or related service, to whom the data holder makes data available, including a third party following a request by the user to the data holder.”
The following elements set out in the Act carry significance:
- Introduction of data access by design and by default;
- Where direct access may not be possible, data holders shall grant access to products and related services data including metadata, upon users’ requests, both in B2B and B2C settings – save for stricter conditions for security reasons and when sharing data constituting trade secrets;
- Data holders shall make data available to data recipients, upon users’ requests, on fair, reasonable and non-discriminatory terms while maintaining transparency;
- Data holders shall make data available to EU public sector bodies, upon request, on the grounds of public interest;
- Data processing service providers such as cloud computing services shall adopt necessary measures to enable effective interoperability for data access, transfer and use among different providers, and shall put in place contractual terms concerning switching services;
- Data processing service providers shall implement technical, organisational and legal measures to prevent unlawful cross border transfers of and access to non-personal data, retained in the bloc, from third countries;
- Introduction of data licence agreements between data holders (i.e. manufacturers) and users;
- Introduction of a set of requirements for smart contract applications in the context of executing data sharing agreements;
- The Commission’s introduction of non-binding model contractual terms on data access and use, reasonable compensation and protection of trade secrets, alongside standard contractual clauses for cloud computing services based on fair, reasonable and non-discriminatory contractual rights and obligations.
The Data Act shall be read without prejudice to the EU GDPR, whereby those data access rights granted under the former shall be treated separately from the access rights granted to individuals under the latter.
Lastly, the Act’s inherent extraterritoriality will carry direct consequences for manufacturers and providers outside the bloc, including Swiss-based entities. In other words, any commercial activity falling under the scope of the Act – products and services offered to the EU market, or any engagement in data sharing with stakeholders within the EU – will need to undergo the necessary due diligence within the set timeframe in order to ensure timely compliance.
[1] See here https://eur-lex.europa.eu/legal-content/en/TXT/?uri=CELEX:32023R2854&qid=1707924358044.
More than a year after the European Commission’s proposal[i] for a new Cyber Resilience Act, which seeks to protect consumers and businesses from (digital) products with inadequate security features through the introduction of mandatory requirements, a political agreement[ii] was reached on 1 December 2023 between the other two legs of the trilogue, namely the European Parliament and the Council.
The rather comprehensive proposal is set to cover both hardware and software products, which may entail varying levels of risk and therefore require different security measures. As a result, the type of conformity assessment per product is to be adapted to the respective risk level.
Consequently, manufacturers of hardware and software, developers and distributors aiming to import and offer their products on the EU market will essentially have to implement cybersecurity measures across the entire lifecycle of their products, from the design and development stages through to after placement on the market. This concerns not only products sold to end users and consumers, but also those used in companies for production, sourced as precursors and further processed, or forming part of supply chains.
Notably, those products that are already covered by other existing EU legislation, such as the scope of the NIS2 Directive, will be excluded.
In this context, compliance with the proposed legislation will essentially be rendered in the form of a CE marking which is an indication confirming that the products sold on the market of the European Economic Area (EEA) have been duly assessed to meet safety, health and environmental protection requirements.
Furthermore, manufacturers will be obliged to inform consumers of the precise length of time for which a given product is expected to be used.
Applicable to all products connected directly or indirectly to another device or network, the proposed legislation will now have to be formally approved and is expected to enter into force following its publication in the Official Journal.
Given that the EU serves as the most important sales market for many industries and sectors in Switzerland, the direct impact of the proposed legislation on Swiss actors and stakeholders is undeniable. Importantly, Swiss exporters of products that could be classified as “critical” within the meaning of the proposed text will firstly have to prove that the related digital components meet the set security standards, and secondly have to submit conformity assessments as deemed necessary.
[i] See here https://ec.europa.eu/commission/presscorner/detail/en/IP_22_5374.
[ii] See here https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6168.
The Swiss Federal Council has recently announced[1] the launch of a consultation process effective until 29 November 2023 in order to tighten the existing anti-money laundering rules.
The proposed framework particularly focuses on the identification of legal entities, whereby a mandatory federal (transparency) register is set to be introduced containing information on beneficial owners. The non-public register will be coordinated by the Federal Department of Justice and Police (FDJP) and accessible by competent authorities including financial intermediaries. Notwithstanding, a rather simplified procedure will also be put in place for certain legal forms such as sole proprietorships, foundations, associations as well as limited liability companies.
Furthermore, the monetary threshold for due diligence obligations in the context of trade in precious metals and stones will be significantly lowered from CHF 100,000 to CHF 15,000.
An all inclusive obligation for due diligence will also be introduced for cash payments in real estate business irrespective of the monetary amount involved.
By the end of the consultation period the proposal is expected to be presented at the parliament in early 2024.
[1] See here https://www.admin.ch/gov/en/start/documentation/media-releases.msg-id-97561.html.
In a recent decision of the Court of Justice of the European Union (CJEU)[1], namely Breyer v REA, the ongoing question of how and when to strike a balance between commercial interests and the public interest – in particular the rights to privacy, equality and expression – was once again brought forward.
The case concerns an EU-funded research project, namely iBorderCtrl, on the development of AI-enabled emotion recognition technology to be deployed at borders as part of the EU border control management scheme. The issue arose upon the refusal of the EU institutions to disclose information and give full access to documentation relating both to the authorisation of the project and to its progress, the main reasoning given being the protection of stakeholders’ commercial interests.
As a result, an action was brought against the European Research Executive Agency (REA) by a member of the European Parliament (EP) mainly on the grounds that an overriding public interest clearly existed which would justify the need for full disclosure of documentation in particular in the context of a technology which could in principle be utilised for mass surveillance and crowd control purposes.
The CJEU’s verdict, however, stops short of giving the public interest precedence over commercial interests, essentially stipulating that “general considerations” of overriding public interest may not be sufficient to establish a “particularly pressing” need for transparency.
Such a stance could certainly undermine the importance of democratic oversight and public debate and the need for transparency in software development in projects with undeniable impact on individuals at large.
Furthermore, it was confirmed that the tools and technologies developed within the framework of a given project are considered trade secrets, with only the results of the project excluded.
[1] See here for details https://curia.europa.eu/juris/document/document.jsf?mode=DOC&pageIndex=0&docid=277067&part=1&doclang=EN&text=&dir=&occ=first&cid=1901751.
With more than 750 member firms and 36,000 lawyers across 200+ countries, Nextlaw Referral Network[1] is considered the largest legal referral network in the world. Created by Dentons, the network employs a detailed screening system to guarantee the quality of its member firms and has developed proprietary technology allowing members to identify lawyers, legal counsel and advisers at other member firms with the appropriate jurisdiction-specific experience where clients need personalised advice.
[1] See here for more information: https://www.nextlawnetwork.com/.
The European Parliament (EP) has recently[i] voted to adopt its negotiating position in a plenary session on the Artificial Intelligence (AI) Act.
Essentially following a risk-based approach, the rules aim to ensure that the development and use of AI applications and systems in Europe comply with EU rights and values, including “human oversight, safety, privacy, transparency, non-discrimination and social and environmental well-being”.
In a nutshell, next to a revised definition of an AI system in line with the OECD version, the proposed to-do list, targeting providers and deployers among other actors, contains the following:
- ban on emotion-recognition AI;
- ban on “real-time” and “post” remote biometric identification and predictive policing in public spaces;
- ban on biometric categorisation systems using sensitive characteristics;
- ban on social scoring;
- ban on untargeted scraping of facial images, from the internet or CCTV footage, for facial recognition purposes;
- new set of restrictions for general purpose AI and foundation models;
- new set of restrictions on recommendation algorithms on social media;
- assignment of recommender systems to the “high risk” category, placing higher scrutiny on how recommender systems on social media platforms work. Consequently, tech companies could be held more liable for the impact of user-generated content.
Notably, the ban on “post” remote biometric identification would be subject to an exception for law enforcement, upon prior judicial authorisation, in the context of serious crimes.
Furthermore, those generative AI systems based on foundation models, such as ChatGPT, would have to comply with transparency requirements and put in place effective safeguarding mechanisms against illegal content. In the case of use of copyrighted data for training models, detailed summaries would need to be made publicly available. Registration in the EU database will also be obligatory for foundation models.
Importantly, alongside defining responsibilities across AI value chain of various actors involved, the EP proposes the development of non-binding standard contractual clauses to regulate rights and obligations in line with each actor’s level of control in a given value chain.
Taking into account that the AI Act is also set to apply to providers and users of AI systems located outside the EU – provided that the output produced is intended to be used in the EU – these developments are pivotal for the Swiss market.
[i] See here https://www.europarl.europa.eu/pdfs/news/expert/2023/6/press_release/20230609IPR96212/20230609IPR96212_en.pdf; https://www.europarl.europa.eu/doceo/document/TA-9-2023-0236_EN.html.