
The music industry and AI: What legal protection(s) for artists?

The « deepfake », or « hypertrucage » (a multimedia synthesis technique that uses Artificial Intelligence and consists mainly of superimposing existing images, videos or sounds onto other images, videos or sounds to create new content), seems to threaten, among other things, the music industry and the future of its artists.

While the use of this process has, in recent years, stirred up video streaming platforms through content exploiting the personality attributes of a pair of artists from across the Atlantic (notably the singer « The Weeknd » and the Canadian rapper « Drake », with the track « Heart on My Sleeve » released in the first half of 2023), it has also raised numerous concerns, prompting certain state representatives to take up the issue[1].

As the risks associated with the use of such a digital process are particularly extensive, the victim of a deepfake has, under French and European legislation, a lever for legal protection in both civil (I) and criminal (II) law, as well as in the field of personal data protection (III).

I. Civil protection of rights

A. Use of article 9 of the French Civil Code

As the voice is considered by jurisprudence[2] to be an element of personality, the provisions of article 9 of the French Civil Code[3] may apply to the creation of a deepfake without the consent of the person whose voice has been processed.

Under this provision, any person may authorize or prohibit the use of his or her voice, or indeed his or her image, by a third party[4].

The victim whose personal attributes have been used without his or her consent will nevertheless need to be identifiable – even if only by a limited number of people – in order to effectively assert the rights he or she considers to have been violated.

In the event of an infringement of these rights, the victim of the deepfake will then be able to sue for civil liability on this basis, in order to obtain compensation for the damage suffered.

B. Using the provisions of the French Intellectual Property Code

While recourse to copyright or related-rights provisions seems ineffective when a deepfake of a person’s voice is made without their consent – since the voice is not considered a work within the meaning of the French Intellectual Property Code – the situation is different when the creator uses musical works that are themselves protected by copyright or related rights.

Indeed, superimposing a sound onto a musical work in which the creator holds no rights will almost certainly be characterized as an act of infringement, obliging the creator to compensate the rightful owner of the copyright or related rights.

However, a word of caution is in order.

Although they remain strictly defined by the legislator, the parody or short-quotation exceptions could possibly be invoked by the deepfake’s creator to escape conviction.

II. Protection under criminal law

Two provisions of the French Penal Code may be relied on. The first, article 226-8, relating to attacks on the representation of the person, punishes with one year’s imprisonment and a 45,000 euro fine the act of « publishing, by any means whatsoever, a montage made with the words or image of a person without his or her consent, if it is not obvious that it is a montage or if this is not expressly mentioned ».

The second text, article 226-4-1, punishes identity theft with one year’s imprisonment and a 15,000 euro fine, defining it as « the act of usurping the identity of a third party or of using one or more data of any kind enabling him or her to be identified, with a view to disturbing his or her peace of mind or that of others, or to prejudicing his or her honor or consideration ».

It should also be remembered that on October 17, 2023, the French National Assembly adopted at first reading the bill to secure and regulate the digital space (SREN), which included two government amendments on the use of deepfakes, adding two paragraphs to article 226-8 of the French Penal Code.

The first paragraph punishes the dissemination of audio or visual content generated by artificial intelligence and representing a person, without that person’s consent and without any mention that the content is fake (new article 226-8 of the Penal Code).

The second paragraph, meanwhile, creates an aggravating circumstance, punishing the creator with two years’ imprisonment and a 45,000 euro fine when the deepfake has been published on an online public communication service.

III. Protection under the GDPR

Insofar as the voice and voice data are treated as biometric personal data, their use must meet the requirements of the General Data Protection Regulation (GDPR)[5] and, in particular, must have a legal basis, on the one hand, and pursue one or more defined purposes, on the other.

Consequently, in the absence of consent from the data subject, and therefore of a legal basis, a violation of the provisions of the GDPR could be found by the Commission Nationale de l’Informatique et des Libertés (CNIL).

However, the data controller (in other words, the person using, modifying and disseminating the voice) could invoke a « legitimate interest » to justify the processing of said data, particularly insofar as it is based on a humorous or satirical approach…

It is to be hoped that the balance of interests will favor the protection of the rights of the person concerned.

In any case, and despite the absence of a specific legal framework applicable to deepfakes produced without the consent of the person they depict, the victim has a legal arsenal at his or her disposal enabling him or her to assert his or her rights effectively.

In the event of an infringement of these rights, Maëva BAKIR will advise, assist and represent you to ensure the legitimate defense of your interests.

[1] A bill called the « No Fakes Act », drafted and proposed on October 12, 2023 by U.S. Senators Amy Klobuchar, Chris Coons, Marsha Blackburn and Thom Tillis, would prohibit the AI-generated creation of a clone of another person in any audiovisual format without that person’s consent.

[2] Pau Court of Appeal, ruling of January 22, 2001: BICC, April 15, 2002, no. 396.

[3] Article 9 of the Civil Code: « Everyone has the right to respect for his private life. Judges may, without prejudice to compensation for damage suffered, prescribe all measures, such as sequestration, seizure and others, suitable for preventing or stopping an invasion of privacy: these measures may, in urgent cases, be ordered in summary proceedings. »

[4] Judgments: TGI de Paris, December 3, 1975: JCP 1978; TGI de Paris, December 19, 1984: Gaz. Pal. 1985.

[5] EU Regulation 2016/679 of the European Parliament and of the Council of April 27, 2016 on the protection of individuals with regard to the processing of personal data and on the free movement of such data.