The practice of lawyers cannot remain unaffected by recent advances in generative artificial intelligence (AI), whose gradual democratisation and rapid development could eventually reshape certain professional practices. For lawyers experienced in contract negotiation and drafting, these developments simultaneously spark genuine interest and legitimate concern.
In response to the advent of AI, the Council of the Swiss Bar Association (SBA) adopted the SBA Guidelines on the Use of AI in June 2024[1]. These recommendations provide a reference framework for the responsible use of AI by lawyers, including in contractual practice.
According to empirical data from the Liquid Legal Institute (2023), 38.2% of the tasks carried out within law firms are repetitive and potentially automatable. Contract drafting is identified by 55.8% of law firms as one of the main tasks likely to benefit from the use of AI, ranking behind legal research (79.7%) and document analysis (72.1%). Contract analysis (44.1%) and contract review (40.1%) also appear among the top ten tasks that lawyers would consider optimising through AI.[2]
Lawyers thus stand at a crossroads: they must comply with professional and ethical rules while improving their efficiency through the use of AI tools. In this context, it is pertinent to assess in a nuanced manner the main opportunities offered by AI in the context of contract drafting, while remaining mindful of the specific risks its use may entail.
Opportunities of AI
Increased efficiency. AI tools enable lawyers to quickly generate draft contracts, automatically insert standard clauses, or adapt a contract’s format and content to the requirements of a particular legal system. This optimisation frees practitioners from the more tedious aspects of the task, allowing them to focus on strategic matters: negotiation, personalisation, and anticipating potential points of friction.
Enhanced drafting quality. AI contributes to terminological consistency and helps prevent internal contradictions or inconsistencies with other related contractual documents. Such harmonisation is a key factor in ensuring legal clarity and, by extension, security for the contracting parties. This qualitative improvement, in our view, falls within the duty of diligence prescribed in Article 12(a) of the Federal Act on the Free Movement of Lawyers (FMLA) and Article 6 of the Swiss Code of Professional Conduct (SCPC), both of which require lawyers to exercise their profession with care and diligence. Lawyers may thus be expected to integrate contract-automation tools into their practice, particularly where doing so reduces the fees charged to their clients[3].
Proactive risk anticipation. By analysing large volumes of legal data and similar cases, AI systems can detect potentially problematic clauses or ambiguous wording. Early identification of such issues helps prevent future disputes and enhances the robustness of agreements. AI may prove especially effective in situations of fatigue or work overload, acting as a valuable ally in error prevention.
Contextual personalisation. Certain tools offer contractual variations tailored to specific industries, jurisdictions, or even client profiles. In such cases, the lawyer need only formulate a properly contextualised query for the AI tool in order to obtain the necessary adaptations. For instance, a software licence agreement generated for a client active in the fintech sector in Geneva might by default include clauses aligned with the Swiss Financial Market Supervisory Authority (FINMA) requirements, while a similar agreement for a biotechnology start-up in Basel would place greater emphasis on intellectual property protection.
Enhanced argumentative support. Both during the preparatory phase and in the course of negotiations, AI can be a strategic ally for lawyers. Upstream, it allows for the identification of key negotiation objectives, the structuring of alternative scenarios, and the anticipation of opposing arguments—thereby supporting the development of a clear strategy. During negotiations themselves, AI can instantly provide structured suggestions or summaries to substantiate a position or enhance a clause—especially when the other party raises technical arguments in fields where the lawyer may not have in-depth expertise. For example, in the course of negotiating a partnership agreement, one party might refer to a price adjustment mechanism based on a complex mathematical formula; a brief pause would allow the lawyer to consult an AI tool to verify the calculation and immediately propose a revised formulation in line with their client’s interests—an appreciable form of support, given that lawyers are not always renowned for their affinity with mathematics.
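To make this concrete, a price adjustment mechanism of the kind mentioned above often takes the form of an indexation formula. The following clause is purely illustrative, is not drawn from any actual agreement, and uses weights and indices invented for this example:

\[
P_{\text{adj}} = P_0 \left( a + b \cdot \frac{L_t}{L_0} + c \cdot \frac{M_t}{M_0} \right), \qquad a + b + c = 1
\]

where P_0 is the initial contract price, L_t/L_0 and M_t/M_0 track the evolution of agreed labour and materials cost indices between signature and the adjustment date, and a, b and c are weights negotiated by the parties. Checking, for instance, that the weights sum to one, or that a 10% rise in the labour index with b = 0.5 yields a 5% price increase, is exactly the kind of arithmetic verification an AI tool can accelerate, provided the lawyer reviews the output before relying on it.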
Risks of AI
Dependence. AI systems are neither omniscient nor infallible, and the results they produce may be inaccurate or incomplete. Human oversight remains indispensable. Excessive reliance on such systems risks leading to the atrophy of certain professional reflexes or the stagnation of legal knowledge, as critical thinking is progressively delegated to the machine. It is therefore essential to critically examine the outputs generated, correcting or supplementing them where necessary. In the context of contract drafting, such vigilance is imperative in order to avoid mechanically reproducing templates that are ill-suited to the specific needs of the parties.
Bias and hallucinations. AI systems may replicate or even amplify the biases present in their training data. More concerning still, these tools may at times generate entirely fictitious content—a phenomenon referred to as hallucination. Such errors may arise from false information, particularly where the underlying dataset is insufficient. They may also occur when an AI model adjusts its output to align with the perceived expectations of the user. In the absence of rigorous review, these distortions can result in the production of erroneous or misleading documents.
Confidentiality and Data Protection. For practising lawyers, this issue is of critical importance in light of professional secrecy obligations (Article 13 FMLA and Article 4 SCPC). When selecting and using any software, including AI applications, it is essential to clarify in advance what happens to the data entered—specifically, who has access to it, where it is stored, and how it is processed.
The SBA Guidelines on the Use of AI identify three options available to lawyers for ensuring that the use of such tools complies with legal and ethical requirements:
- Internal solution. The AI application is installed and operated locally within the law firm’s network, ensuring that processed data never leaves its infrastructure, whether for storage or processing purposes. This solution, by far the most appropriate, calls for advanced technical expertise and may therefore involve an external IT specialist for its setup.
- Compliant outsourcing. When the software is provided by an external supplier or relies on a dematerialised (cloud-based) infrastructure hosted on third-party servers, it is imperative to strictly comply with the recommendations concerning IT outsourcing, particularly those issued by the SBA regarding the use of cloud services in law firms[4]. This obligation is all the more stringent given that the Federal Supreme Court of Switzerland has recognised that an external provider may be considered an auxiliary within the meaning of Article 13(2) FMLA, thereby rendering the lawyer liable for ensuring that the third party respects professional secrecy[5].
- Informed consent. It is also possible to obtain the client’s express and informed consent, in the form of a formal declaration, to a partial waiver of professional secrecy and of the protections afforded by the Federal Act on Data Protection (FADP), provided that the client has been duly informed of the risks involved.
In the absence of such safeguards, confidential information should under no circumstances be entered into AI systems. In the specific context of contract drafting, which frequently involves such information, imprudent use of these tools may expose the lawyer to disciplinary consequences.
Complexity and Implementation Costs. In the absence of safeguards ensuring data protection, lawyers will need to anonymise documents before processing them through AI systems—an effort that may prove time-consuming and potentially counterproductive. Moreover, beyond pricing considerations, the effective integration of AI within a law firm requires specialised training, technical adaptations, and the development of internal policies. In a professional environment where speed is often expected, such technical constraints may hinder rather than streamline internal processes.
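By way of illustration, the pseudonymisation step referred to above can be sketched in a few lines of Python. This is a minimal sketch only: the placeholder scheme and function names are invented for this example, and genuine anonymisation of legal documents (addresses, amounts, dates, metadata, scanned exhibits) requires far more robust tooling.

```python
# Minimal sketch: replace party names with neutral placeholders before a
# clause is submitted to an external AI tool, then restore them afterwards.
# The names, placeholders and helper functions below are purely illustrative.

def pseudonymise(text: str, parties: dict[str, str]) -> tuple[str, dict[str, str]]:
    """Swap each party name for its placeholder; return the redacted text
    and the reverse mapping needed to restore the original names."""
    reverse_map = {}
    for name, placeholder in parties.items():
        text = text.replace(name, placeholder)
        reverse_map[placeholder] = name
    return text, reverse_map


def restore(text: str, reverse_map: dict[str, str]) -> str:
    """Re-insert the original party names into the AI tool's output."""
    for placeholder, name in reverse_map.items():
        text = text.replace(placeholder, name)
    return text


if __name__ == "__main__":
    clause = "Alpha SA shall indemnify Beta GmbH for any breach of clause 7."
    redacted, mapping = pseudonymise(
        clause, {"Alpha SA": "[PARTY A]", "Beta GmbH": "[PARTY B]"}
    )
    print(redacted)                       # safer text to share with the tool
    print(restore(redacted, mapping))     # original wording recovered
```

Even this simple step, multiplied across every document and every matter, illustrates why the effort can quickly become time-consuming in practice.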
Liability. When a lawyer relies on an AI tool to assist in the drafting or negotiation of a contract, and this use leads to the conclusion of an agreement containing a poorly worded clause—to the detriment of one of the parties—the issue of liability arises. In such cases, the lawyer remains responsible for the proper execution of their mandate and cannot exonerate themselves by invoking an error committed by the AI system. This position is consistent with the duty of diligence set out in Article 12(a) FMLA and Article 6 SCPC, which may entail an extended technological competence[6]. Contract negotiation and drafting thus remain acts that fully engage the lawyer’s professional liability, regardless of the degree of technological assistance involved. In summary, the use of AI tools gives rise to new scenarios in which a lawyer’s liability may be incurred, without altering the underlying legal principles[7]. It is only natural that those who benefit from increased efficiency by delegating certain tasks to AI should also bear the associated risks.
[1] Lignes directrices de la FSA portant sur l’utilisation de l’intelligence artificielle (IA) [SBA Guidelines on the Use of AI], https://digital.sav-fsa.ch/fr/ki-guidelines (accessed 3 July 2025).
[2] LLI Whitepaper, First Global Report on the State of Artificial Intelligence in Legal Practice, No. 3, 2023, pp. 22-23.
[3] Chappuis Benoît/Gurtner Jérôme, La profession d’avocat, Geneva/Zurich/Basel 2021, N 177.
[4] See: Fédération Suisse des Avocats, Utilisation du cloud dans les études d’avocats, available at: https://digital.sav-fsa.ch/fr/transition-numerique-de-l-etude-utilisation-du-cloud (accessed 3 July 2025).
[5] DSFC 145 II 229, consid. 7; see also Nussbaumer-Laghzaoui Arnaud, L’utilisation d’un espace de co-working par un avocat, https://lawinside.ch/777.
[6] Chappuis Benoît/Gurtner Jérôme, La profession d’avocat, Geneva/Zurich/Basel 2021, N 177.
[7] Gurtner Jérôme, Les nouvelles technologies et la responsabilité des avocats : La cybersécurité et l’intelligence artificielle, in : Chappuis Christine/Winiger Bénédict (eds.), Responsabilité civile et nouvelles technologies, Journée de la responsabilité civile 2018, Geneva/Zurich/Basel 2019, p. 72.
Suggested citation: Greinig Scott, Swiss Lawyers and Artificial Intelligence: A Focus on Contractual Practice, Blog of the LexTech Institute, 9 July 2025