More than a year after the European Commission’s proposal[i] for a new Cyber Resilience Act, which aims to protect consumers and businesses from digital products with inadequate security features by introducing mandatory requirements, a political agreement[ii] was reached on 1 December 2023 between the other two legs of the trilogue, namely the European Parliament and the Council.

The rather comprehensive proposal is set to cover both hardware and software products, which may entail varying levels of risk and therefore require different security measures. As a result, the type of conformity assessment for each product is to be adapted to its respective risk level.

Consequently, manufacturers, developers and distributors of hardware and software aiming to import and offer their products on the EU market will essentially have to implement cybersecurity measures across the entire lifecycle of their products, from the design and development stages to after placement on the market. This applies not only to products sold to end users and consumers, but also to those used in companies for production, sourced as precursors and further processed, or forming part of supply chains.

Notably, products already covered by other existing EU legislation, such as those falling within the scope of the NIS2 Directive, will be excluded.

In this context, compliance with the proposed legislation will essentially be demonstrated through a CE marking, which confirms that products sold on the market of the European Economic Area (EEA) have been duly assessed to meet safety, health and environmental protection requirements.

Furthermore, manufacturers will be obliged to provide consumers with a precise indication of the period during which a given product can be expected to be used.

Applicable to all products connected directly or indirectly to another device or network, the proposed legislation must now be formally approved and is expected to enter into force following its publication in the Official Journal.

Given that the EU is the most important sales market for many Swiss industries and sectors, the direct impact of the proposed legislation on Swiss actors and stakeholders is undeniable. Importantly, Swiss exporters of products that could be classified as “critical” within the meaning of the proposed text will first have to prove that the related digital components meet the set security standards and, second, submit conformity assessments where deemed necessary.

[i] See here

[ii] See here

The Swiss Federal Council has recently announced[1] the launch of a consultation process, running until 29 November 2023, aimed at tightening the existing anti-money laundering rules.

The proposed framework focuses in particular on the identification of legal entities, whereby a mandatory federal (transparency) register is set to be introduced containing information on beneficial owners. The non-public register will be coordinated by the Federal Department of Justice and Police (FDJP) and accessible to competent authorities, including financial intermediaries. Nevertheless, a simplified procedure will also be put in place for certain legal forms, such as sole proprietorships, foundations, associations and limited liability companies.

Furthermore, the monetary threshold for due diligence obligations in the context of trade in precious metals and stones will be significantly lowered from CHF 100,000 to CHF 15,000.

An all-inclusive due diligence obligation will also be introduced for cash payments in real estate transactions, irrespective of the amount involved.

Following the end of the consultation period, the proposal is expected to be presented to parliament in early 2024.

[1] See here

In a recent decision of the Court of Justice of the European Union (CJEU)[1], namely Breyer v REA, the ongoing question of how and when to strike a balance between commercial interests and the public interest, in particular the rights to privacy, equality and expression, was once again brought to the fore.

The case concerns an EU-funded research project, namely iBorderCtrl, on the development of an AI-enabled emotion recognition technology to be deployed at borders as part of the EU border control management scheme. The issue arose upon the refusal of the EU institutions to disclose information and grant full access to documentation relating both to the authorisation of the project and to its progress. The main reason given was the protection of the commercial interests of stakeholders.

As a result, an action was brought against the European Research Executive Agency (REA) by a member of the European Parliament (EP), mainly on the grounds that an overriding public interest clearly existed which would justify full disclosure of the documentation, in particular in the context of a technology that could in principle be used for mass surveillance and crowd control purposes.

The verdict of the CJEU, however, falls short of giving the public interest precedence over commercial interests, essentially stipulating that “general considerations” of overriding public interest may not be sufficient to establish a “particularly pressing” need for transparency.

Such a stance could well undermine democratic oversight, public debate and the need for transparency in the development of software in projects with an undeniable impact on individuals at large.

Furthermore, it was confirmed that tools and technologies developed within the framework of a given project are considered trade secrets, with only the results of the project being exempted.

[1] See here for details

With more than 750 member firms and 36,000 lawyers across 200+ countries, the Nextlaw Referral Network[1] is considered the largest legal referral network in the world. Created by Dentons, the network employs a detailed screening process to guarantee the quality of its member firms and has developed proprietary technology allowing members to identify lawyers, legal counsel and advisers at other member firms with the appropriate jurisdiction-specific experience where clients need personalised advice.

[1] See here for more information

The European Parliament (EP) has recently[i] voted in a plenary session to adopt its negotiating position on the Artificial Intelligence (AI) Act.

Essentially following a risk-based approach, the discussions centre on ensuring that the development and use of AI applications and systems in Europe comply with EU rights and values, including “human oversight, safety, privacy, transparency, non-discrimination and social and environmental well-being”.

In a nutshell, next to a revised definition of an AI system in line with the OECD version, the proposal sets out a list of obligations targeting providers and deployers, among other actors.

Notably, the ban on “post” remote biometric identification would be subject to an exception for law enforcement, upon prior judicial authorisation, in the context of serious crimes.

Furthermore, generative AI systems based on foundation models, such as ChatGPT, would have to comply with transparency requirements and put in place effective safeguards against illegal content. Where copyrighted data is used for training models, detailed summaries would need to be made publicly available. Registration in the EU database would also be obligatory for foundation models.

Importantly, alongside defining the responsibilities of the various actors involved across the AI value chain, the EP proposes the development of non-binding standard contractual clauses to regulate rights and obligations in line with each actor’s level of control in a given value chain.

Taking into account that the AI Act is set to apply also to providers and users of AI systems located outside the EU, provided that the output produced is intended to be used in the EU, these developments are pivotal for the Swiss market.

[i] See here

Following the binding decision of the European Data Protection Board (EDPB) in April 2023[i], the Irish Data Protection Commission (DPC) announced on 22 May[ii] a fine of EUR 1.2 billion against Meta Platforms Ireland Limited on the grounds of the company’s unlawful transfer of personal data from the EU/EEA to the USA, effectively from July 2020 to date, within the context of the provision of its Facebook-related services. A six-month deadline was imposed for suspending any future transfers and for either deleting the already transferred data or moving it back to the EU.

The subject matter transfers were carried out in a systematic, repetitive and continuous manner on the basis of standard contractual clauses (SCC).

The infringement essentially relates to Article 46(1) of the GDPR, whereby “…a controller or processor may transfer personal data to a third country or an international organisation only if the controller or processor has provided appropriate safeguards, and on condition that enforceable data subject rights and effective legal remedies for data subjects are available.”

Following the European Commission’s draft adequacy decision of December 2022, the political agenda is already in place for a new EU – US data privacy framework, which is set to enter into force by the end of 2023. In this context, the question remains as to the actual effectiveness of the DPC decision in favour of data privacy, given that the six-month deadline could in principle be seen as leeway for the company to delay compliance until the new framework becomes operative.

[i] See here

[ii] See here

The German Federal Financial Supervisory Authority (BaFin) announced on 8 March 2023[i] its general stance on the classification of non-fungible token (NFT) models.

Suggesting a strictly case-by-case analysis, BaFin takes a rather conservative approach towards classifying NFTs as securities, primarily due to their lack of immediate exchangeability. In other words, an NFT could potentially be considered a security only where, for instance, a significant number of tokens embody identical repayment and interest claims.

Similarly, if an NFT embodies ownership-type rights, such as a promise of distribution, the token could in principle be considered an investment under the German Asset Investments Act (VermAnlG). The mere act of speculation by token holders would, on the other hand, not in itself suffice for the NFT in question to serve an investment purpose.

Notably, NFTs can in principle have a potential use in the financial sector, especially where they are transferable and tradable on the financial market and hence embed certain security-like rights, e.g. membership rights or contractual claims similar to those attached to stocks and debt instruments. As stipulated by BaFin, “the transferability can be assumed as a given with the current standards […] whereas tradability requires a minimum of standardisation.”

In short, the nexus here is the definition of the types of rights associated with a given token model, alongside the potential utility of those rights after the token issuance.

Taking a stance similar to the draft EU proposal for a Regulation on Markets in Crypto-Assets (MiCA), BaFin adopts the position that NFT fragmentation, which results in fungible tokens each representing an equal share of an NFT, would in theory satisfy the interchangeability criterion.

On a different note, France, following a parliamentary vote on 28 February 2023[ii], is set to introduce tighter licensing rules for new entrants to its crypto ecosystem in an attempt to harmonise its national laws with the upcoming EU rules. Under the existing rules, entities may opt for a simplified registration procedure with the Autorité des Marchés Financiers (AMF), subject to fewer disclosure requirements. Once the law is passed, new players will face stricter anti-money laundering (AML) measures, namely a clear segregation of customer funds, a new set of reporting guidelines and more detailed disclosures relating to risks and conflicts of interest.

Lastly, the sudden failures and recent regulatory issues in the US financial sector surrounding three financial institutions active in the cryptocurrency industry, namely Silicon Valley Bank (SVB), Signature Bank and Silvergate Capital, have raised questions of confidence and inevitably fuelled further volatility in the industry. A classic bank run, in which large numbers of depositors withdraw funds simultaneously for fear of potential insolvency, is so far seen as the root cause.

In the context of potential contagion risk, however, questions have also arisen as to whether the banking system in Europe has in general more effective risk management infrastructure and stronger liquidity requirements in place.

[i] See here

[ii] See here

With the final votes on the EU proposals for the Regulations on Markets in Crypto-Assets (MiCA) and on the Transfer of Funds (TFR) deferred to April this year, a set of new banking rules was recently approved by the Economic and Monetary Affairs Committee of the European Parliament[i].

In alignment with the Basel III Accord, the amendments would, among other things, oblige banks to apply a risk weight of 1,250% to crypto-asset exposures, the maximum possible level under international standards. In simple terms, banks must in practice hold one euro of capital for every euro of crypto-assets held. In addition, banks would be required to disclose their exposure to crypto-assets and crypto-asset services, as well as a specific description of their risk management policies related to crypto-assets.
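The one-to-one intuition follows directly from the arithmetic: a 1,250% risk weight combined with the standard Basel minimum capital ratio of 8% of risk-weighted assets yields full capital backing. A minimal illustrative sketch (the figures and function name are for illustration only):

```python
# Illustrative sketch: why a 1,250% risk weight implies roughly 1:1 capital backing.
# Assumes the standard Basel minimum capital ratio of 8% of risk-weighted assets.

RISK_WEIGHT = 12.50       # 1,250% expressed as a multiplier
MIN_CAPITAL_RATIO = 0.08  # 8% of risk-weighted assets

def required_capital(exposure_eur: float) -> float:
    """Minimum capital for a crypto-asset exposure under the 1,250% risk weight."""
    risk_weighted_assets = exposure_eur * RISK_WEIGHT
    return risk_weighted_assets * MIN_CAPITAL_RATIO

# One euro of exposure requires one euro of capital: 1 * 12.5 * 0.08 = 1
print(required_capital(1.0))
print(required_capital(1_000_000))
```

Since 12.5 × 0.08 = 1, each euro of crypto-asset exposure must be matched by a full euro of capital, the most punitive treatment available under the framework.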

Next to the introduction of the concept of “shadow banking”, covering, for instance, investment funds and insurers, the updates also make reference to environmental, social and governance (ESG) risks, with strengthened reporting and disclosure requirements.

Relevantly, the Basel Committee on Banking Supervision (BCBS) also set forth a set of rules in December 2022[ii], to be implemented by January 2025, whereby two groups of crypto-assets are defined, based on a number of classification conditions, in order to determine minimum risk-based capital requirements for credit and market risk.

The classification conditions essentially relate to the nature of crypto-assets, issues of legal certainty, the reliability of the design of a given crypto-asset and its underlying network, alongside regulation and supervision of entities performing significant functions.

Here, a distinction is made between tokenised traditional assets and crypto-assets with an effective stabilisation mechanism in place, on the one hand, and unbacked crypto-assets on the other.
In the case of crypto-assets that fail to meet any of the classification conditions, namely Group 2, which are taken to pose additional and higher risks, a “bank’s total exposure to these must not exceed 2% of the bank’s Tier 1 capital and should generally be lower than 1%.”
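The Group 2 limit lends itself to a simple illustrative check against a bank's Tier 1 capital. The sketch below is hypothetical (the function name and figures are not from the rules themselves) and merely restates the 2% hard limit and 1% guideline as arithmetic:

```python
# Illustrative check of the Group 2 exposure limit: total Group 2 crypto
# exposure must not exceed 2% of Tier 1 capital, and should generally
# remain below 1%. Function name and figures are hypothetical.

HARD_LIMIT = 0.02  # 2% of Tier 1 capital
SOFT_LIMIT = 0.01  # 1% general guideline

def check_group2_exposure(tier1_capital: float, group2_exposure: float) -> str:
    ratio = group2_exposure / tier1_capital
    if ratio > HARD_LIMIT:
        return "breach"
    if ratio > SOFT_LIMIT:
        return "above guideline"
    return "within limits"

# A bank with CHF 10 bn of Tier 1 capital may hold at most CHF 200 m of
# Group 2 crypto-assets, and should generally stay below CHF 100 m.
print(check_group2_exposure(10_000_000_000, 150_000_000))  # above guideline
print(check_group2_exposure(10_000_000_000, 250_000_000))  # breach
```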

Lastly, with the test for assessing stablecoins with low risk profiles now taking the form of evaluating both the scope of redemption risks and the level of regulatory supervision, the BCBS rules are generally seen to be of a dynamic nature and would change in line with emerging developments.

[i] See here

[ii] See here

Following a consultation process[i] on the partial revision of the Swiss Financial Market Supervisory Authority’s (FINMA) Anti-Money Laundering Ordinance (AMLO – FINMA), which ran from March to May 2022, and in line with the recent revisions of the federal Anti-Money Laundering Act (AMLA) and its accompanying Ordinance (AMLO) in conformity with the recommendations of the Financial Action Task Force (FATF), FINMA recently announced the revised ordinance[ii], which came into force on 1 January 2023 simultaneously with the revised AMLA and AMLO[iii].

Requiring financial intermediaries to comply with stricter due diligence obligations as of January, namely a duty to verify the identity of beneficial owners, including control holders, and to update client data, the AMLA and AMLO take a risk-based approach but nevertheless remain unclear as to the precise form such identity verification should take. Similarly, neither the AMLO – FINMA nor the agreement on the Swiss banks’ code of conduct with regard to the exercise of due diligence, namely the CDB 20, provides for such a specification.

As a result, it could be argued that financial intermediaries, depending on the risk profile of each individual case, would initially be required to carry out a plausibility check based on their own knowledge of their clients and, if deemed necessary, consult additional sources of information. In other words, mere verification by means of retaining the identity documents of beneficial owners may not generally suffice to fulfil the obligation.

In the case of natural persons with a normal risk profile, information provided by contracting parties on beneficial owners will need to be scrutinised by financial intermediaries to ensure consistency. In the case of legal persons, however, increased risk entails stricter scrutiny.

Under the revised AMLA, the periodic verification and updating of client data, as well as of documents covering all business relationships, is now required, irrespective of events and the risk profile of the entities concerned. The term ‘documents’, when interpreted broadly, would include all information collected as part of the due diligence process at the time each client profile is created.

In the context of regulatory reporting, the test of ‘reasonable suspicion’, which triggers an immediate reporting obligation to the Money Laundering Reporting Office Switzerland (MROS), has been redefined. A reasonable suspicion now exists where the financial intermediary has one or more concrete indications that the assets involved in a given business relationship:

  1. are connected with a criminal offence under Art. 260ter or 305bis of the Swiss Criminal Code;
  2. originate from a crime or a qualified tax offence; or
  3. are subject to the power of disposition of a criminal or terrorist organisation or serve the financing of terrorism, and if this suspicion cannot be dispelled on the basis of additional clarifications.

Moreover, a new right has been introduced allowing financial intermediaries to terminate a reported business relationship where the MROS does not notify them, within 40 working days of a report being made, that the reported information will be forwarded to a prosecution authority.

Notably, the new obligations apply to business relationships initiated from January onwards. For existing business relationships, on the other hand, the revised requirements apply only in connection with the periodic review and updating of client data.

With reference back to the partial revision of the AMLO – FINMA, the amendments mainly provide further clarification on the threshold for transactions involving cryptocurrencies. With regard to risk management, FINMA has now confirmed that “technical measures are needed to prevent the threshold of CHF 1’000 from being exceeded for linked transactions within thirty days (and not just per day).” This obligation, however, applies only to exchange transactions of cryptocurrencies for cash or “other anonymous means of payment”. The scope of application of the AMLO – FINMA is also set to be expanded to include distributed ledger technology (DLT) based trading systems and facilities.
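One conceivable technical measure of the kind FINMA describes is a rolling 30-day aggregation of linked exchange transactions against the CHF 1,000 threshold. The sketch below is a simplified illustration only; the data model, function name and how transactions are identified as "linked" are assumptions, not prescribed by the ordinance:

```python
# Hypothetical sketch of a rolling 30-day threshold check for linked
# cryptocurrency exchange transactions (CHF 1,000 over 30 days, per FINMA).
from datetime import date, timedelta

THRESHOLD_CHF = 1_000
WINDOW_DAYS = 30

def would_exceed_threshold(history: list, new_date: date, new_amount: float) -> bool:
    """Return True if a new exchange transaction, aggregated with all linked
    transactions in the preceding 30 days, would exceed CHF 1,000.
    `history` is a list of (date, amount_chf) tuples for linked transactions."""
    window_start = new_date - timedelta(days=WINDOW_DAYS)
    linked_total = sum(amount for d, amount in history
                       if window_start < d <= new_date)
    return linked_total + new_amount > THRESHOLD_CHF

history = [(date(2023, 1, 5), 400.0), (date(2023, 1, 20), 500.0)]
# 400 + 500 + 200 = 1,100 CHF within 30 days: the transaction would be blocked.
print(would_exceed_threshold(history, date(2023, 1, 25), 200.0))
# By 1 March the earlier transactions have aged out of the window.
print(would_exceed_threshold(history, date(2023, 3, 1), 200.0))
```

The point of the sketch is simply that aggregation must look back across the whole 30-day window rather than resetting each day, which is the change FINMA's clarification makes explicit.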

Relevantly, FINMA has also recognised the updated regulations of the Self-Regulatory Organisation of the Swiss Insurance Association (SRO-SIA), which were revised for the same underlying reasons.

[i] See here

[ii] See here

[iii] See here  

Following the binding decision of the European Data Protection Board (EDPB) of 6 December 2022 on the question of whether Meta’s reliance on its contractual terms to justify its targeted advertising based on personal data was unlawful, further clarity has now been provided by the final decision[i] of 4 January 2023 of the Irish Data Protection Commission (DPC), Meta Platforms Ireland’s supervisory authority.

Meta has now been given three months to establish an alternative legal basis for its targeted advertising model based on personal data. According to the decision, “by making the accessibility of its services conditional on users accepting the updated terms of service, Meta was in fact “forcing” … [users] to consent to the processing of their personal data for behavioural advertising and other personalised services”. This stance was found to be in clear breach of the EU GDPR, in particular its Article 6.

Simply put, the decision requires Meta, within three months, to offer users a version of all its applications that does not use personal data for advertising. Users must also be able to withdraw consent at any time, in which case Meta may not limit its services.

The decision does not, however, extend to Meta’s use of non-personal data for targeted advertising.

Furthermore, the administrative fines imposed have seen an almost ten-fold increase, reaching €210 million in the case of the Facebook services and €180 million in the case of Instagram.

Meta Platforms Ireland retains the option to appeal the decision before the Irish courts, albeit with limited prospects of success given the circumstances and the involvement of the EDPB.

[i] See here