Two projects, one question. Within a single week I found myself in the same situation twice: someone wants to use AI to analyse real business data and only afterwards wonders whether that is even legally sound.
Two Real-World Examples
Both applications, a construction schedule dashboard and an online shop analysis, are useful, efficient, and technically straightforward to implement. But both work with personal data, and that is exactly where things get legally interesting.
What Does the Swiss Federal Act on Data Protection (FADP) Say?
The revised Swiss Federal Act on Data Protection (FADP, SR 235.1) has been in force since 1 September 2023. The most important takeaway first: there is no separate Swiss AI law. According to the FDPIC, the existing data protection law applies directly to AI-supported data processing.
This sounds reassuring, but it is a double-edged sword: anyone using AI is subject to the same strict requirements as any other form of data processing.
The 5 Most Important FADP Obligations When Using AI
1. Duty to inform (Art. 19): data subjects must know that their data is being processed and who receives it.
2. Processing agreement (Art. 9): an AI provider acting as processor needs a contract, typically a data processing agreement (DPA).
3. Data protection impact assessment (Art. 22): required where processing, such as high-risk profiling, poses an elevated risk.
4. Automated individual decisions (Art. 21): decisions made solely by the AI with legal effect must be disclosed.
5. Cross-border disclosure (Art. 16 et seq.): transfers abroad, for example to US providers, need adequate safeguards such as standard contractual clauses (SCCs).
What a Typical Business Regulates, and What the FADP Actually Requires
Many companies now have an internal AI policy. That is a good start. But anyone who looks closely at such documents will notice a common gap: they govern how employees should behave, but not the legal basis underneath.
A typical policy contains the following:
- Data categories: personal, sensitive, non-sensitive
- List of approved tools (e.g. Microsoft Copilot, DeepL via company login)
- Rule of thumb: use unapproved tools only for "non-sensitive" queries
- Note on transparency towards customers and partners
- General values: trust, integrity, respect
This looks solid. The problem: the policy tells employees what to do, but not why it applies legally, and it leaves the critical questions unanswered.
The point is not that such policies are worthless. They help raise awareness and prevent obvious mistakes. But they do not replace a legal basis, and they do not close the gaps that the FADP specifically addresses.
The Problem with Claude Pro
Here comes the point that surprises many people, myself included: Claude Pro and Max are consumer products. They run under Anthropic's Consumer Terms, which means no data processing agreement (DPA), no documented standard contractual clauses for cross-border transfers, and no contractual exclusion of model training on your inputs. For private use that is fine; for processing business customer data it is not.
What Works, and What Does It Cost?
The solution is simpler than expected: the Anthropic API (Developer Platform / Commercial Plan).
For small businesses and sole traders, the API with pay-as-you-go is the most cost-effective solution: no monthly base fee, no seat minimum, and the Commercial Terms, including the DPA, apply.
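To make the switch from the consumer app to the API concrete, here is a minimal sketch of a request against Anthropic's Messages endpoint, built with the standard library only. The model name and prompt are placeholders, and the actual network call is deliberately left out; the point is that each request is billed individually, with no base fee.

```python
import json
import os
import urllib.request

# Sketch: a single pay-as-you-go request to the Anthropic Messages API.
# The API key comes from the environment so it never lands in source control.
API_URL = "https://api.anthropic.com/v1/messages"

def build_request(prompt: str, model: str = "claude-sonnet-4-5") -> urllib.request.Request:
    payload = {
        "model": model,                 # placeholder model name
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "x-api-key": os.environ.get("ANTHROPIC_API_KEY", ""),
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode("utf-8"), headers=headers
    )

# urllib.request.urlopen(req) would send it; omitted here on purpose.
req = build_request("Summarise this quarter's order data by segment.")
```

Unlike a flat-rate seat, this request costs a fraction of a cent and only accrues when it is actually sent.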
What This Means in Practice: The Two Case Studies
Case Study 1: Construction Schedule Dashboard
Data involved: Construction plans with employee names, subcontractors, schedules, machine allocation. Where natural persons are identifiable, the FADP applies.
What needs to be done:
- Inform employees and subcontractors (duty to inform)
- Check whether the AI analysis makes "automated individual decisions" (e.g. automatic resource allocation)
- Use Anthropic's API plan, not Claude Pro
- Address AI use in employment contracts and subcontractor agreements
Tip: Construction schedules using only aggregated data (no names, only roles and functions) significantly reduce the data protection workload.
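The aggregation step from the tip above can be sketched in a few lines; the field names are invented for illustration, assuming schedule entries arrive as simple records.

```python
# Sketch: strip personal identifiers from schedule entries before any AI
# analysis, keeping only role-level information. Field names are illustrative.
def aggregate_schedule(entries: list[dict]) -> list[dict]:
    """Replace employee names with roles so no natural person is identifiable."""
    return [
        {"role": e["role"], "task": e["task"], "week": e["week"]}
        for e in entries
    ]

schedule = [
    {"name": "A. Muster", "role": "site manager", "task": "concrete pour", "week": 12},
    {"name": "B. Beispiel", "role": "crane operator", "task": "concrete pour", "week": 12},
]

anonymised = aggregate_schedule(schedule)
# No entry carries a name any more; only roles, tasks, and weeks remain.
```

If the AI only ever sees the output of such a step, the schedule analysis no longer touches identifiable persons.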
Case Study 2: Online Shop Analysis
Data involved: Order history, purchasing behaviour, customer segments, classic personal data often with a profiling character (Art. 5 let. f FADP).
What needs to be done:
- Expand the shop's privacy policy: name the AI use, purpose, and recipient (Anthropic as processor)
- Use the Anthropic API (DPA as data processing agreement under Art. 9)
- For high-risk profiling: obtain explicit consent and conduct a data protection impact assessment (Art. 22)
- Anonymise or pseudonymise data before sending it to the AI, where possible
Tip: For pure analytics, it is often worth aggregating customer data before the AI analysis: at that point it may no longer fall under the FADP at all.
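The pseudonymisation step listed above can be sketched as follows, assuming orders arrive as simple records with an invented `customer_id` field. The mapping table stays on your side, so results can be re-linked internally while the AI provider only ever sees opaque tokens.

```python
import secrets

# Sketch: pseudonymise customer identifiers before sending order data to an
# external AI service. The mapping (real ID -> pseudonym) is kept locally.
def pseudonymise_orders(orders: list[dict]) -> tuple[list[dict], dict[str, str]]:
    mapping: dict[str, str] = {}
    out = []
    for order in orders:
        cid = order["customer_id"]
        if cid not in mapping:
            mapping[cid] = f"cust_{secrets.token_hex(4)}"
        out.append({**order, "customer_id": mapping[cid]})
    return out, mapping

orders = [
    {"customer_id": "C-1001", "total": 89.90, "segment": "returning"},
    {"customer_id": "C-1001", "total": 45.00, "segment": "returning"},
    {"customer_id": "C-2002", "total": 120.50, "segment": "new"},
]
pseudo, mapping = pseudonymise_orders(orders)
```

The same customer keeps the same pseudonym across orders, so segment-level analysis still works, while the original IDs never leave the shop's systems.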
FADP Compliance Checklist
- Privacy policy names the AI use, its purpose, and the AI provider as recipient
- Data processing agreement (DPA) with the AI provider in place (Art. 9)
- Employees, subcontractors, and customers informed (duty to inform, Art. 19)
- Automated individual decisions identified and disclosed (Art. 21)
- Data protection impact assessment conducted for high-risk profiling (Art. 22)
- Data anonymised, pseudonymised, or aggregated before the AI analysis where possible
- Commercial/API plan in use instead of a consumer plan
My Conclusion
AI can create real value in business, whether for a construction schedule dashboard or shop analytics. The Swiss FADP does not prohibit this. It does, however, require transparency, an appropriate contract with the AI provider, and care when handling personal data.
The most common mistake: people start with the AI tool they already know from personal use, Claude Pro/Max or ChatGPT Plus, and forget that these consumer plans are neither contractually nor legally adequate for processing business customer data.
The good news: the Anthropic API is pay-as-you-go, includes all the necessary commercial guarantees including a DPA, and contractually prohibits training on customer data. For small businesses it is often cheaper than a flat-rate plan, because you only pay for what you actually use.
Sources
- [1] FDPIC, Update: The existing data protection law is directly applicable to AI, 8 May 2025 (updated 22 August 2025). Federal Data Protection and Information Commissioner, Bern. edoeb.admin.ch
- [2] Federal Act on Data Protection (FADP), SR 235.1, in force since 1 September 2023. fedlex.admin.ch
- [3] Anthropic, Consumer Terms of Service and Commercial Terms of Service, as of October 2025. anthropic.com/legal
- [4] Anthropic, Data Processing Addendum (DPA), available for API and commercial plans. anthropic.com/legal/data-processing-addendum
- [5] Anthropic, Privacy Policy, section on data use for model training, as of 2025. anthropic.com/legal/privacy
- [6] Fredric Paul, Anthropic: You can still use your Claude accounts to run OpenClaw, NanoClaw and Co., The New Stack, 2025. thenewstack.io
Fun Fact: The OpenClaw Dispute and What It Reveals About Anthropic's ToS
OpenClaw, NanoClaw and similar personal AI agents work exactly as this article describes: they use the OAuth token of a Claude Pro or Max account, without an API key. This makes them affordable, but it sits in a legal grey area.
In early 2025, Anthropic updated its documentation to clarify that using Pro/Max credentials in third-party tools violates the terms of service. The community reaction was fierce, and Anthropic backtracked: "Nothing is changing about how you can use the Agent SDK and MAX subscriptions." The official line since then: personal use is fine; anyone building a business on top of it or processing customer data should use an API key.
That is exactly the point. Using OpenClaw for personal experiments sits within the tolerated zone. Using it to analyse business customer data lands you back at the starting question of this article: no DPA, no training exclusion, no SCC documentation, Consumer ToS. The technology is the same; the legal context is entirely different.
Note: This post provides a practice-oriented overview and does not replace legal advice. For specific data protection questions, consulting a specialist or the FDPIC is recommended.

