
Strict Liability for Software Developers?

With yet another recent proposal package, the European Commission has stirred discussion, this time around the product liability regime in the European Union (EU).

On 28 September 2022[i], a proposal was put forward to (i) revise the existing Product Liability Directive (PLD), and (ii) introduce a brand-new directive specifically designed to lay down non-contractual fault-based civil liability rules for artificial intelligence (AI), namely the AI Directive.

The text of the proposed revised PLD is currently open for public feedback until 11 December 2022[ii]. The original PLD lays down harmonised liability rules for damage caused to consumers by defective products. The existing framework thus incorporates a strict liability regime for producers regardless of fault, one that cannot be excluded or limited by contract. In other words, a consumer need only establish the damage suffered, the defect, and a causal link between the two.

Among the proposed changes is the expansion of the scope of the definition of a ‘product’ to explicitly cover software, with the European Commission noting that software “is capable of being placed on the market as a standalone product and may subsequently be integrated into other products as a component, and is capable of causing damage through its execution”. Software would as a result be considered a product “irrespective of the mode of its supply or usage, and therefore irrespective of whether the software is stored on a device or accessed through cloud technologies.”

Nevertheless, the source code of software, as well as free and open-source software which is “developed or supplied outside the course of a commercial activity” – that is, one which is not exchanged for a price or for personal data – would not be included within the said scope.

The concept of ‘producer’ no longer appears in the proposed terminology; instead, the term ‘manufacturer’ is broadened in scope. In this context, developers and producers of software, including providers of AI systems within the meaning of the forthcoming AI Act, would be treated as manufacturers. Under the proposed revision, the definition of an ‘economic operator’ would cover additional actors in the supply chain, namely fulfilment service providers, makers of re-manufactured products, and distributors. Where a manufacturer is established outside the EU, or where a manufacturer cannot be identified, an importer of a product, an authorised representative of a manufacturer in the EU, a fulfilment service provider or a distributor could be held liable. Moreover, an online platform provider within the meaning of the Digital Services Act (DSA) “would be liable only when they do so present the product or otherwise enable the specific transaction, and only where the online platform fails to promptly identify a relevant economic operator based in the Union.”

Article 10 of the proposed revised PLD lists the scenarios in which an economic operator is exempt from liability, and stipulates that derogation from such exemption may arise in cases “where the defectiveness of the product is due to any of the following, provided that it is within the manufacturer’s control: (a) a related service; (b) software, including software updates or upgrades; or (c) the lack of software updates or upgrades necessary to maintain safety.”

Furthermore, the range of compensable damages is set to be expanded beyond personal injury and damage to property to include “material losses resulting from loss or corruption of data”. Such a significant shift would mean that product liability risks may cross over with cyber security risks.

Here, the proposed revised PLD makes reference to the definition of data set out in the Data Governance Act, which reads as “any digital representation of acts, facts or information […] including in the form of sound, visual or audio visual recording”. Distinctively, a crypto-asset under the proposed EU Markets in Crypto-Assets Regulation (MiCA) is defined as “a digital representation of value or rights which may be transferred and stored electronically”. Although the definition of data under the Data Governance Act may seem rather broad and could potentially be interpreted to encompass that of MiCA, such an overlap would certainly be detrimental, undermining the proper functioning of projects with disintermediation and decentralised governance in place, decentralised finance (DeFi) applications, as well as emerging web3 protocols and developments. Given the immutable and tamper-resistant record of tokenised transactions appended to distributed ledger technology (DLT) based network infrastructures, it seems implausible to place scenarios of potential data loss or corruption – and associated evidence thereof – in this context on an equal footing with those scenarios envisioned by the European Commission. In other words, such an overlap could only be avoided if the proposed revised PLD is read strictly without prejudice to the scope of the forthcoming MiCA.

Moreover, software inherently continues to develop on an ongoing basis. An all-inclusive qualification of software as a product, coupled with a generous application of the data definition to compensable damages for loss or corruption of data, would risk wrongly expanding the strict liability regime to the detriment of technological advancement.

On the other hand, the proposed AI Directive is set to implement a non-contractual fault-based civil liability regime for damage caused by an AI system. Complementary to the proposed revised PLD, the proposed AI Directive introduces a ‘presumption of causality’ between the fault and the harm suffered by a given claimant. The provisions apply to providers, operators and users of AI systems. The Directive is set to have extraterritorial effect, as it would apply to providers and/or users of AI systems that are available on or operating within the EU market. Given that the Directive is closely aligned with the forthcoming AI Act, particularly as to the classification of high-risk AI systems, any amendments to the latter would need to be duly reflected in the former.

Lastly, in parallel with the developments at the EU regulatory level, a designated committee at the Council of Europe, namely the Committee on Artificial Intelligence (CAI), has recently been set up to draft a convention on artificial intelligence, human rights, democracy and the rule of law. Switzerland will actively participate in the negotiations, as per the decision of the Federal Council in September 2022[iii].


[i] See here https://ec.europa.eu/commission/presscorner/detail/en/ip_22_5807.

[ii] See here https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12979-Product-Liability-Directive-Adapting-liability-rules-to-the-digital-age-circular-economy-and-global-value-chains_en.

[iii] See here https://www.admin.ch/gov/en/start/documentation/media-releases.msg-id-90367.html.
