Every AI voice platform records calls. It's fundamental to how the technology works. Recordings are used for quality assurance, compliance monitoring, AI training, and dispute resolution. But under GDPR, every recording creates significant legal obligations that most businesses ignore until they receive a subject access request or deletion demand they can't fulfill.
Voice recordings are personal data under GDPR. They identify individuals through voice characteristics and content. That means Articles 6 (lawful basis), 13 and 14 (transparency), 15 (access rights), 17 (deletion rights), and 30 (processing records) all apply. Whether you're the data controller (using AI for your own calls) or the data processor (providing AI services to clients), GDPR compliance for call recordings is non-negotiable.
This guide covers exactly what GDPR requires when recording AI phone calls, what most businesses get wrong, and practical steps to stay compliant in 2026 and beyond. It's not legal advice (consult a lawyer for that), but it's informed by three years of building GDPR-compliant voice AI systems and working with European businesses navigating these requirements.
EUR 8.15M
GDPR fine imposed on Vodafone Spain by the Spanish regulator (AEPD) in 2021 for unlawful marketing calls and data protection failures
Does GDPR Apply to AI Call Recordings?
Yes, always. GDPR applies when you process personal data of individuals in the EU, regardless of where your business is located. Voice recordings are personal data because they allow identification of individuals (voice characteristics are biometric data under some interpretations, and call content often includes names, phone numbers, addresses, and other identifiers).
GDPR applies whether you're the data controller or data processor. If you're using AI to make calls on behalf of your business (sales calls, appointment reminders, customer support), you're the controller. You decide why and how recordings are made. If you're an AI platform provider (like Ringvox, Vapi, or Retell) processing calls on behalf of clients, you're the processor. Both roles have legal obligations, though they differ in scope.
Controllers must ensure there's a lawful basis for recording (usually consent or legitimate interest), provide transparency (privacy policies, disclosure at call start), and honour data subject rights (access, deletion, portability). Processors must follow controller instructions, implement security measures, and sign Data Processing Agreements. If you're unsure which role you occupy, the default assumption is controller. That's the safer position from a compliance standpoint.
Consent Requirements: What You Actually Need
Recording phone calls requires a lawful basis under GDPR Article 6. The most common bases are consent (Article 6(1)(a)) and legitimate interest (Article 6(1)(f)). Which one applies depends on the purpose of the recording and who you're calling.
Consent is required when the recording isn't necessary for the service or when the caller is a consumer (B2C). Consent must be freely given, specific, informed, and unambiguous. That means you need to tell the person the call is being recorded, explain why, and give them a genuine choice to opt out without penalty. In practice, this looks like: "This call is being recorded for quality assurance. If you'd prefer not to be recorded, please let me know now." If they object, you stop recording or end the call.
Legitimate interest can apply when recording is necessary for your business (quality monitoring, compliance with sectoral regulations, dispute resolution) and doesn't override the individual's rights. This is more commonly used in B2B contexts where recording is standard practice. But even with legitimate interest, you must inform the person the call is being recorded and give them the right to object. The key difference from consent is that you don't need their explicit agreement in advance. Disclosure alone is sufficient, unless they object.
AI training data is a grey area. Using call recordings to train your AI models is technically a separate purpose from the original recording. Under GDPR, that means you need a separate lawful basis or explicit consent for training use. The safest approach is to anonymise recordings before using them for training (remove names, numbers, identifiers) or obtain explicit consent that covers both quality assurance and AI training. Most businesses skip this step and assume their original consent or legitimate interest covers training. It probably doesn't.
How to obtain consent at the start of a call without killing conversion: The mistake most businesses make is treating consent as a legal script. "By continuing this call, you consent to the processing of your personal data in accordance with our privacy policy available at..." Nobody listens past the first five words. They hang up. Instead, make it conversational and brief. "Hi, this is Alex from Ringvox. Quick heads-up, this call is recorded so we can make sure everything goes smoothly. Is now a good time?" You've disclosed the recording, given them an easy opt-out (they can say no), and kept the call natural. That's compliant consent.
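In code, the consent disclosure can be a simple gate at call start. Here is a minimal sketch in Python; the `CallState` shape and the naive phrase matching for opt-outs are assumptions for illustration (a real system would detect objections through its speech-understanding layer, not substring checks):

```python
from dataclasses import dataclass

# Hypothetical consent gate at call start. OPT_OUT_PHRASES and CallState
# are illustrative assumptions, not a real platform API.
OPT_OUT_PHRASES = ("don't record", "do not record", "no recording", "stop recording")

@dataclass
class CallState:
    consent_disclosed: bool = False
    recording: bool = False

def handle_call_start(state: CallState, caller_reply: str) -> CallState:
    """Deliver the recording disclosure, then record only if the caller
    does not object. An objection means stop recording or end the call."""
    state.consent_disclosed = True  # the disclosure line has been spoken
    reply = caller_reply.lower()
    state.recording = not any(phrase in reply for phrase in OPT_OUT_PHRASES)
    return state
```

If the caller objects mid-call, the same check applies: flip `recording` off and discard any partial audio.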
Data Storage: Where and How Long
GDPR doesn't require call recordings to be stored in the EU. That's a common misconception. What GDPR requires is adequate safeguards when transferring data to third countries (outside the EU/EEA). If your AI platform stores recordings in US-based cloud infrastructure (AWS us-east-1, Google Cloud us-central1), you need an approved transfer mechanism under GDPR Chapter V.
The current approved mechanisms are Standard Contractual Clauses (SCCs), Binding Corporate Rules (BCRs), or an adequacy decision (where the EU Commission has determined the third country provides adequate data protection). For the US, Schrems II invalidated Privacy Shield in 2020, but the EU-US Data Privacy Framework, adopted in July 2023, restored adequacy for US organisations certified under it. Transfers to non-certified US recipients still require SCCs. Most reputable cloud providers and AI platforms are DPF-certified or offer SCCs as standard. If your provider offers neither, find a new provider.
EU data residency is safer and simpler. Storing recordings in EU-based cloud regions (Ireland, Frankfurt, Paris, Amsterdam) avoids the complexity of international transfers entirely. It also insulates you from US legal processes like CLOUD Act requests. For businesses serious about GDPR compliance, EU residency is the default choice. Non-EU storage should require active justification (cost, specific technical requirements, etc.), not the other way around.
Retention periods must be defined and enforced. GDPR Article 5(1)(e) requires personal data to be kept only as long as necessary for the purpose it was collected. That means you can't store call recordings indefinitely "just in case." You need to define a retention period based on your business needs, document it in your privacy policy, and configure automatic deletion when the period expires.
Common retention periods by use case:
- Quality assurance and training: 30-90 days is typical. Long enough to review calls and coach agents, short enough to minimise data retention.
- Dispute resolution: 12 months is common for commercial disputes; some sectors, like financial services, have longer statutory requirements.
- Compliance monitoring: sectoral regulations may require retention for 3-5 years; know what applies to your industry.
- AI training data: anonymise before long-term storage; raw recordings with identifiable data should be deleted after initial training is complete.
The key is documentation. Your privacy policy should state your retention periods. Your data processing agreements with AI providers should specify how long they retain recordings on your behalf. And your systems should enforce automatic deletion. Manual deletion processes fail. Someone forgets, or the backlog grows, and suddenly you're storing five years of call recordings with no documented retention policy. That's a GDPR violation waiting to happen.
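Automatic enforcement can be as simple as a scheduled job that compares each recording's age against the documented policy. A hedged sketch in Python, with retention periods mirroring the figures above; the recording metadata shape is an assumption:

```python
from datetime import datetime, timedelta, timezone

# Retention periods per call type, mirroring the periods discussed above.
# A real system would load these from the same place the privacy policy
# documents them, so policy and enforcement cannot drift apart.
RETENTION = {
    "quality_assurance": timedelta(days=90),
    "dispute_resolution": timedelta(days=365),
    "compliance": timedelta(days=5 * 365),
}

def expired_recordings(recordings, now=None):
    """Return IDs of recordings past their documented retention period
    (GDPR Article 5(1)(e), storage limitation). Unknown call types fall
    back to a conservative 30-day default."""
    now = now or datetime.now(timezone.utc)
    return [
        r["id"]
        for r in recordings
        if now - r["recorded_at"] > RETENTION.get(r["call_type"], timedelta(days=30))
    ]
```

Run this from a daily scheduled job and delete (or anonymise) everything it returns, logging what was removed and when.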
The Right to Deletion (and Why It's Harder Than You Think)
GDPR Article 17 gives individuals the right to request deletion of their personal data (the "right to be forgotten"). For call recordings, this means that if someone requests deletion, you must delete the recording from all systems, including backups, within one month (extendable by two further months for complex requests under Article 12(3)), or explain why a legal exception applies.
Subject access requests (SARs) under Article 15 are closely related. When someone requests access to their data, you must provide a copy of their call recordings (or transcripts) within one month, free of charge. That means you need to be able to search your recordings by customer identifier (phone number, email, name) and retrieve them quickly. If your recordings are stored in a blob storage bucket with no metadata or indexing, you'll struggle to comply.
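The practical fix is to index recording metadata by customer identifier at ingest time, so a SAR becomes a lookup rather than a scan. An in-memory sketch, assuming recordings are keyed by ID and tagged with phone and email (a production system would keep this index in a database):

```python
from collections import defaultdict

class RecordingIndex:
    """Minimal metadata index so recordings can be found by customer
    identifier within the SAR deadline. Illustrative only: real systems
    index in a database, not in process memory."""

    def __init__(self):
        self._by_identifier = defaultdict(set)
        self._meta = {}

    def add(self, recording_id, phone, email, stored_at):
        """Register a recording under every identifier a SAR might use."""
        self._meta[recording_id] = {"phone": phone, "email": email, "stored_at": stored_at}
        self._by_identifier[phone].add(recording_id)
        self._by_identifier[email].add(recording_id)

    def find(self, identifier):
        """Look up all recording IDs for a phone number or email."""
        return sorted(self._by_identifier.get(identifier, set()))
```

The same index serves deletion requests: `find()` tells you exactly which objects to remove from storage.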
Deletion across all systems is the hard part. Call recordings often live in multiple places: the primary database where they're initially stored, backup systems (daily, weekly, monthly snapshots), log files (error logs, audit trails that reference recording IDs), and AI training datasets (if recordings were used to train or fine-tune models). True deletion requires removing the data from all of these. Most businesses delete the primary record and forget the backups. That's not compliant.
What happens to AI models trained on deleted data? This is an unsolved legal question. If you delete a call recording that was used to train your AI model, does the model itself violate GDPR because it "contains" traces of the deleted data? Legal consensus is emerging that anonymised training data doesn't require deletion (because it's no longer personal data), but models trained on identifiable data may be problematic. The practical approach most businesses take is to anonymise all training data before model training, so deletion requests only affect raw recordings, not models.
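A minimal illustration of the anonymisation pass, assuming regex-based redaction of phone numbers and email addresses only; production pipelines typically add NER-based detection of names and addresses, which simple patterns cannot catch:

```python
import re

# Simplified redaction over call transcripts before they enter a training
# set. These patterns are assumptions covering only phone numbers and
# email addresses; names and street addresses need an NER model.
PHONE_RE = re.compile(r"\+?\d[\d\s\-]{7,}\d")
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def anonymise(transcript: str) -> str:
    """Replace identifiers with placeholder tokens so the transcript is
    no longer personal data in the trivial cases."""
    text = PHONE_RE.sub("[PHONE]", transcript)
    text = EMAIL_RE.sub("[EMAIL]", text)
    return text
```

Redact before long-term storage, then delete the raw recordings once the retention period for the original purpose expires.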
Practical deletion procedure: When you receive a deletion request, you need to: (1) Verify the requester's identity (to prevent malicious deletion requests). (2) Search all systems for recordings associated with the requester (primary storage, backups, logs). (3) Delete or anonymise the data in each system. (4) Document the deletion (date, systems affected, who carried it out) for audit purposes. (5) Confirm deletion to the requester within one month. Most businesses fail at step 2 (searching backups) and step 4 (documentation). Build these processes before you receive your first deletion request, not after.
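The five steps above can be sketched as one auditable routine. The system names and audit-log shape here are assumptions; the point is that every system is searched and every deletion is documented:

```python
from datetime import datetime, timezone

def process_deletion_request(requester, systems, identity_verified, audit_log):
    """Delete all recordings for `requester` across every system.

    systems: mapping of system name -> object exposing find(identifier)
    and delete(recording_id). Returns the audit entry it appended.
    """
    # Step 1: refuse unverified requests to block malicious deletions.
    if not identity_verified:
        raise PermissionError("identity not verified; refuse deletion")
    affected = {}
    # Steps 2-3: search and delete in EVERY system, backups included.
    for name, system in systems.items():
        ids = system.find(requester)
        for recording_id in ids:
            system.delete(recording_id)
        affected[name] = ids
    # Step 4: document the deletion for audit purposes.
    entry = {
        "requester": requester,
        "date": datetime.now(timezone.utc).isoformat(),
        "systems_affected": affected,
    }
    audit_log.append(entry)
    # Step 5 (confirming to the requester) happens outside this sketch.
    return entry
```

Wiring backups into the `systems` mapping is exactly the step most businesses forget; if a store isn't listed here, it never gets searched.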
Data Processing Agreements: The Document Nobody Reads
If you use a third-party AI platform to make or receive calls, you need a Data Processing Agreement (DPA) with that provider. GDPR Article 28 requires controllers to only use processors who provide sufficient guarantees of compliance and to document the relationship in a written contract.
What a DPA must contain:
- The subject matter and duration of processing (e.g., "recording and transcribing AI phone calls for 12 months").
- The nature and purpose of processing (e.g., "customer support, quality assurance, AI training").
- The type of personal data processed (e.g., "voice recordings, phone numbers, names, addresses").
- The categories of data subjects (e.g., "customers, prospects, business contacts").
- The processor's obligations (security measures, sub-processor management, data deletion, breach notification).
- The controller's rights (audit rights, ability to terminate, data return on termination).
Sub-processors are where most DPAs get complicated. Your AI platform doesn't operate in isolation. It uses third-party services: ElevenLabs or Deepgram for voice synthesis, OpenAI or Anthropic for language models, AWS or Google Cloud for infrastructure. Each of these is a sub-processor. Under GDPR, you (the controller) must authorise each sub-processor. Your DPA with the AI platform should list all sub-processors or give the platform general authorisation with a notification requirement ("We can add new sub-processors, but we'll tell you 30 days in advance so you can object").
Red flags in DPAs:
- No mention of EU data residency or transfer mechanisms: your data is probably going to the US without SCCs.
- Processor claims ownership of call data or reserves the right to use it for its own purposes: this violates Article 28; a processor can only act on controller instructions.
- No sub-processor list or authorisation process: you have no idea who has access to your data.
- No audit rights: you can't verify the processor is complying with GDPR.
If your AI provider's DPA has any of these red flags, renegotiate or switch providers.
Practical Implementation Guide
Here's a step-by-step process for implementing GDPR-compliant call recording with AI.
- Step 1: Choose your lawful basis. For B2C calls, use consent. For B2B calls where recording is standard practice, legitimate interest works. Document your choice in your privacy policy and internal processing records (Article 30).
- Step 2: Draft a consent script. If using consent, create a brief, natural disclosure that happens at call start. Test it with real calls to ensure it doesn't kill conversion. Example: "Hi, this is [name] from [company]. This call is recorded for quality assurance. Is now a good time?" If the person objects, end the call or stop recording immediately.
- Step 3: Update your privacy policy. Add a section on call recording that covers: what data is recorded (voice, transcripts, metadata), why it's recorded (quality, compliance, training), legal basis (consent or legitimate interest), retention period (30 days, 12 months, etc.), where it's stored (EU or non-EU with SCCs), and how to request access or deletion.
- Step 4: Sign a DPA with your AI provider. Request a Data Processing Agreement from your AI platform. Review it for the red flags mentioned above. Ensure it covers sub-processors, EU data residency or SCCs, and your audit rights. Don't start processing data without a signed DPA.
- Step 5: Configure retention policies. Set up automatic deletion of call recordings after your defined retention period. Most AI platforms support this. If yours doesn't, you'll need to implement manual deletion processes (set calendar reminders, export and delete old recordings monthly). Manual processes fail over time. Automate where possible.
- Step 6: Set up SAR and deletion request handling. Create a process for handling subject access requests and deletion requests. This includes verifying requester identity, searching all systems for their data, fulfilling the request within one month, and documenting the action. Test the process before you receive real requests.
- Step 7: Train your team. Anyone who handles call recordings (sales managers, support leads, compliance officers) should understand GDPR obligations. Cover the basics: why consent matters, how long recordings are kept, how to handle deletion requests, who to escalate questions to. A 30-minute training session is sufficient.
- Step 8: Document everything. GDPR requires you to maintain records of processing activities (Article 30). Create a simple document that covers what data you process (call recordings), why (quality assurance, training), legal basis (consent, legitimate interest), retention periods (30-90 days), and who has access (internal teams, AI provider, sub-processors). Update it whenever processes change.
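The Article 30 record from Step 8 can live as structured data alongside your configuration, which makes it easy to review and to validate automatically. A sketch; the schema and field names are assumptions, since GDPR prescribes the content of the record, not its format:

```python
# Minimal Article 30 record of processing activities as structured data.
# Field values mirror the examples in the text above.
PROCESSING_RECORD = {
    "activity": "AI call recording",
    "purposes": ["quality assurance", "AI training"],
    "lawful_basis": {
        "B2C": "consent (Art. 6(1)(a))",
        "B2B": "legitimate interest (Art. 6(1)(f))",
    },
    "data_categories": ["voice recordings", "transcripts", "phone numbers"],
    "retention": {"quality_assurance": "90 days", "dispute_resolution": "12 months"},
    "recipients": ["internal QA team", "AI provider", "sub-processors"],
    "last_reviewed": "2026-01-01",
}

def validate_record(record):
    """Return the names of any required fields the record is missing,
    based on the content Article 30(1) asks controllers to document."""
    required = {"activity", "purposes", "lawful_basis",
                "data_categories", "retention", "recipients"}
    return sorted(required - record.keys())
```

Running the validator in CI means the record can't silently go stale when processes change.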
How Ringvox Handles Call Recording Compliance
Ringvox is built for EU businesses, which means GDPR compliance is part of the platform architecture, not an afterthought.
Built-in consent capture: Every Ringvox call flow includes configurable consent scripts. You write the disclosure (or use our templates), and the system ensures it's delivered at call start. If the person opts out, the call ends or continues without recording (your choice).
EU data storage: All call recordings and transcripts are stored in EU-based infrastructure (AWS eu-west-1 in Ireland, with replication to eu-central-1 in Frankfurt). No data leaves the EU unless you explicitly configure non-EU storage. This eliminates the need for SCCs and simplifies compliance.
Configurable retention policies: Ringvox allows you to set retention periods per call type (quality assurance calls: 30 days; dispute resolution calls: 12 months; compliance calls: 5 years). When the period expires, recordings are automatically deleted from primary storage and backups. No manual processes required.
Automated deletion and SAR handling: When a customer requests deletion, you can search by phone number, email, or name, and delete all associated recordings with one click. The system handles deletion across primary storage, backups, and logs. For subject access requests, you can export all recordings and transcripts for a specific customer as a ZIP file, ready to send within the one-month deadline.
DPA provided to all customers: Every Ringvox customer receives a GDPR-compliant Data Processing Agreement that covers call recording, data residency, sub-processors (ElevenLabs, Anthropic, AWS), and audit rights. Sub-processors are documented and covered by SCCs where applicable. If we add a new sub-processor, we notify customers 30 days in advance.
Anonymised training data: Call recordings used to improve Ringvox AI models are anonymised before training. Names, phone numbers, addresses, and other identifiers are redacted or replaced with synthetic data. This ensures deletion requests don't affect model performance and training data doesn't violate GDPR.
Need a GDPR-compliant AI voice platform? See how Ringvox handles consent, storage, and deletion automatically: https://ringvox.co/compliance