Why GDPR Compliance on AWS Matters
The General Data Protection Regulation (GDPR) applies to any organization processing personal data of EU residents, regardless of where your company is located. European regulators have demonstrated aggressive enforcement, with cumulative fines reaching €5.88 billion since 2018 and €1.2 billion issued during 2024 alone.
- Maximum penalty: €20 million or 4% of global annual turnover (whichever is higher)
- 72-hour mandatory breach notification to supervisory authorities
- One-month deadline to fulfill data subject access and erasure requests (extendable by two months for complex cases)
- Requirement to demonstrate compliance on demand to regulators
- Data residency requirements for storing EU personal data
The Four Most Common GDPR Violations on AWS
Data Residency Violations
Storing EU personal data in non-EU regions (especially us-east-1, the AWS default) without adequate safeguards or explicit consent from data subjects.
Missing Consent Management
Lacking proper systems to capture, store, and honor user consent preferences. GDPR requires consent to be freely given, specific, informed, and unambiguous—with easy withdrawal.
Right to Erasure Failures
Inability to completely delete personal data within one month of a request. This includes forgetting data in backups, logs, caches, and replicated storage.
Inadequate Audit Trails
Lacking documentation of data processing activities, consent records, and compliance measures that regulators require on demand.
Configure Data Residency Controls
~15 minutes
Ensure EU personal data stays within approved EU regions using Service Control Policies and resource configuration. AWS offers 8 EU regions for GDPR-compliant data storage.
Prerequisites
- AWS account with administrative access
- AWS Organizations enabled (for Service Control Policies)
- AWS CLI configured (optional, for automation)
Console Steps
1.1 Audit Current Data Locations
- Navigate to AWS Config in the console
- Go to "Advanced queries"
- Run a query to identify all resources storing data and their regions
SELECT
  resourceType,
  resourceId,
  awsRegion,
  configuration.encrypted
WHERE
  resourceType IN (
    'AWS::S3::Bucket',
    'AWS::RDS::DBInstance',
    'AWS::DynamoDB::Table',
    'AWS::EBS::Volume'
  )
1.2 Create Region Restriction SCP
- Navigate to AWS Organizations
- Click "Policies" → "Service control policies"
- Click "Create policy"
- Name it "GDPR-EU-Region-Restriction"
- Paste the policy below
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyNonEURegions",
      "Effect": "Deny",
      "Action": [
        "ec2:RunInstances",
        "rds:CreateDBInstance",
        "s3:CreateBucket",
        "dynamodb:CreateTable",
        "lambda:CreateFunction"
      ],
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "aws:RequestedRegion": [
            "eu-west-1",
            "eu-west-2",
            "eu-west-3",
            "eu-central-1",
            "eu-central-2",
            "eu-north-1",
            "eu-south-1",
            "eu-south-2"
          ]
        }
      }
    }
  ]
}
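Creating the policy is only half the job: an SCP has no effect until it is attached to a root, an organizational unit, or an account. A CLI sketch of both steps, assuming the JSON above has been saved locally as gdpr-scp.json; the policy ID and target ID below are placeholders you would take from the create-policy output and your own Organizations structure:

```shell
# Create the SCP from the JSON above (assumes it is saved as gdpr-scp.json)
aws organizations create-policy \
  --name GDPR-EU-Region-Restriction \
  --type SERVICE_CONTROL_POLICY \
  --description "Deny resource creation outside EU regions" \
  --content file://gdpr-scp.json

# Attach it to the OU (or root) holding your EU workloads.
# Replace the policy and target IDs with the values from your organization.
aws organizations attach-policy \
  --policy-id p-examplepolicyid \
  --target-id ou-xxxx-xxxxxxxx
```

Test the policy against a sandbox account first: a deny SCP applies to every principal in the target, including administrators.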
1.3 Audit S3 Bucket Regions
- Run the script below to identify non-compliant buckets
- Plan migration for any buckets outside EU regions
- Create new buckets in EU regions with encryption enabled
#!/bin/bash
# Audit S3 bucket regions for GDPR compliance
EU_REGIONS="eu-west-1 eu-west-2 eu-west-3 eu-central-1 eu-central-2 eu-north-1 eu-south-1 eu-south-2"

for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text); do
  region=$(aws s3api get-bucket-location --bucket "$bucket" --query 'LocationConstraint' --output text)
  # get-bucket-location returns None for buckets in us-east-1
  if [ "$region" = "None" ]; then region="us-east-1"; fi
  if echo "$EU_REGIONS" | grep -qw "$region"; then
    echo "✓ $bucket: $region (EU-compliant)"
  else
    echo "✗ $bucket: $region (NON-COMPLIANT)"
  fi
done
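For buckets flagged as non-compliant, one migration path is to copy objects into a new EU bucket with aws s3 sync. A sketch, with placeholder bucket names; verify the copy is complete before deleting anything from the source bucket:

```shell
# Copy objects from a non-EU bucket to a new EU bucket (names are placeholders)
aws s3 sync s3://legacy-us-bucket s3://your-gdpr-compliant-bucket \
  --source-region us-east-1 \
  --region eu-west-1

# Compare object counts and total sizes before removing the source data
aws s3 ls s3://legacy-us-bucket --recursive --summarize | tail -2
aws s3 ls s3://your-gdpr-compliant-bucket --recursive --summarize | tail -2
```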
1.4 Create EU-Compliant S3 Bucket
- Navigate to S3 → "Create bucket"
- Choose an EU region (e.g., eu-west-1 Ireland)
- Enable "Server-side encryption" with SSE-S3 or SSE-KMS
- Enable "Block all public access"
- Enable "Bucket Versioning" for audit purposes
# Create GDPR-compliant S3 bucket in EU
aws s3api create-bucket \
  --bucket your-gdpr-compliant-bucket \
  --region eu-west-1 \
  --create-bucket-configuration LocationConstraint=eu-west-1

# Enable encryption
aws s3api put-bucket-encryption \
  --bucket your-gdpr-compliant-bucket \
  --server-side-encryption-configuration \
  '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'

# Block public access
aws s3api put-public-access-block \
  --bucket your-gdpr-compliant-bucket \
  --public-access-block-configuration \
  "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"
Implement Consent Management System
~10 minutes
Create systems to capture, store, and honor user consent preferences with complete audit trails. GDPR requires consent to be freely given, specific, informed, and unambiguous.
Console Steps
2.1 Create Consent Database
- Navigate to DynamoDB in the console
- Click "Create table"
- Table name: UserConsentRecords
- Partition key: userId (String)
- Sort key: timestamp (Number)
- Select "On-demand" capacity mode
- Enable encryption at rest
# Create consent records table
aws dynamodb create-table \
  --table-name UserConsentRecords \
  --attribute-definitions \
    AttributeName=userId,AttributeType=S \
    AttributeName=timestamp,AttributeType=N \
  --key-schema \
    AttributeName=userId,KeyType=HASH \
    AttributeName=timestamp,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --region eu-west-1 \
  --sse-specification Enabled=true

# Enable point-in-time recovery for audit
aws dynamodb update-continuous-backups \
  --table-name UserConsentRecords \
  --point-in-time-recovery-specification PointInTimeRecoveryEnabled=true \
  --region eu-west-1
2.2 Define Consent Record Structure
- Each consent record must include timestamp and consent type
- Store the policy version the user agreed to
- Record the method of consent (web form, API, etc.)
- Include processing purposes and data categories
{
  "userId": "user-12345",
  "timestamp": 1706745600,
  "consentId": "consent-uuid-here",
  "consentType": "marketing",
  "consentGiven": true,
  "legalBasis": "consent",
  "policyVersion": "privacy-policy-v3",
  "processingPurposes": ["marketing_emails", "personalization"],
  "dataCategories": ["email", "preferences"],
  "consentMethod": "web_form",
  "ipAddress": "192.168.1.1",
  "userAgent": "Mozilla/5.0..."
}
2.3 Create Consent API Lambda
- Navigate to Lambda → "Create function"
- Name: gdpr-consent-handler
- Runtime: Python 3.12
- Region: eu-west-1
- Create execution role with DynamoDB access
import json
import time
import uuid

import boto3

dynamodb = boto3.resource('dynamodb', region_name='eu-west-1')
table = dynamodb.Table('UserConsentRecords')

def lambda_handler(event, context):
    action = event.get('action', '')
    if action == 'record_consent':
        return record_consent(event)
    elif action == 'withdraw_consent':
        return withdraw_consent(event)
    elif action == 'get_consent_history':
        return get_consent_history(event)
    return {'statusCode': 400, 'body': json.dumps({'error': 'Invalid action'})}

def record_consent(event):
    data = event.get('data', {})
    required = ['userId', 'consentType', 'consentGiven', 'policyVersion']
    if not all(f in data for f in required):
        return {'statusCode': 400, 'body': json.dumps({'error': 'Missing required fields'})}
    record = {
        'userId': data['userId'],
        'timestamp': int(time.time()),
        'consentId': str(uuid.uuid4()),
        'consentType': data['consentType'],
        'consentGiven': data['consentGiven'],
        'policyVersion': data['policyVersion'],
        'legalBasis': data.get('legalBasis', 'consent'),
        'processingPurposes': data.get('processingPurposes', [])
    }
    table.put_item(Item=record)
    return {'statusCode': 200, 'body': json.dumps({'consentId': record['consentId']})}

def withdraw_consent(event):
    user_id = event.get('userId')
    if not user_id:
        return {'statusCode': 400, 'body': json.dumps({'error': 'userId required'})}
    # Record withdrawal as a new event (maintains audit trail per GDPR Article 7)
    record = {
        'userId': user_id,
        'timestamp': int(time.time()),
        'consentId': str(uuid.uuid4()),
        'consentGiven': False,
        'withdrawalReason': 'user_requested'
    }
    table.put_item(Item=record)
    return {'statusCode': 200, 'body': json.dumps({'message': 'Consent withdrawn'})}

def get_consent_history(event):
    user_id = event.get('userId')
    if not user_id:
        return {'statusCode': 400, 'body': json.dumps({'error': 'userId required'})}
    result = table.query(
        KeyConditionExpression='userId = :uid',
        ExpressionAttributeValues={':uid': user_id}
    )
    # DynamoDB returns numbers as Decimal, which json.dumps cannot serialize
    # natively; default=str converts them for the response
    return {'statusCode': 200, 'body': json.dumps({'history': result.get('Items', [])}, default=str)}
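Once deployed, the handler can be exercised from the CLI. A sketch using the function name above; the user ID and payload values are test placeholders, and the --cli-binary-format flag is needed for inline JSON payloads with AWS CLI v2:

```shell
# Record a consent event (fields match the Lambda's required schema)
aws lambda invoke \
  --function-name gdpr-consent-handler \
  --cli-binary-format raw-in-base64-out \
  --payload '{"action":"record_consent","data":{"userId":"user-12345","consentType":"marketing","consentGiven":true,"policyVersion":"privacy-policy-v3"}}' \
  response.json
cat response.json

# Retrieve the full consent history for the same user
aws lambda invoke \
  --function-name gdpr-consent-handler \
  --cli-binary-format raw-in-base64-out \
  --payload '{"action":"get_consent_history","userId":"user-12345"}' \
  history.json
cat history.json
```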
Enable Right to Erasure Automation
~10 minutes
Implement automated systems to fulfill "right to be forgotten" requests within the one-month statutory timeframe. This requires knowing where user data exists across your AWS infrastructure.
Console Steps
3.1 Create Data Mapping Table
- Navigate to DynamoDB → "Create table"
- Table name: UserDataMapping
- Partition key: userId (String)
- Sort key: dataLocation (String)
- Enable encryption at rest
# Create data mapping table
aws dynamodb create-table \
  --table-name UserDataMapping \
  --attribute-definitions \
    AttributeName=userId,AttributeType=S \
    AttributeName=dataLocation,AttributeType=S \
  --key-schema \
    AttributeName=userId,KeyType=HASH \
    AttributeName=dataLocation,KeyType=RANGE \
  --billing-mode PAY_PER_REQUEST \
  --region eu-west-1 \
  --sse-specification Enabled=true
3.2 Create Erasure Lambda Function
- Navigate to Lambda → "Create function"
- Name: gdpr-data-erasure
- Runtime: Python 3.12
- Attach IAM role with S3 and DynamoDB permissions
- Set timeout to 5 minutes for large deletions
import json
from datetime import datetime, timezone

import boto3

def lambda_handler(event, context):
    user_id = event.get('userId')
    request_id = event.get('requestId', 'manual')
    if not user_id:
        return {'statusCode': 400, 'body': json.dumps({'error': 'userId is required'})}

    s3 = boto3.client('s3')
    dynamodb = boto3.resource('dynamodb', region_name='eu-west-1')
    deletion_log = []

    # Query the data mapping table for every location holding this user's data
    mapping_table = dynamodb.Table('UserDataMapping')
    response = mapping_table.query(
        KeyConditionExpression='userId = :uid',
        ExpressionAttributeValues={':uid': user_id}
    )

    # Delete from each location
    for item in response.get('Items', []):
        try:
            if item.get('serviceType') == 'S3':
                location = item['dataLocation'].replace('s3://', '')
                bucket, key = location.split('/', 1)
                s3.delete_object(Bucket=bucket, Key=key)
                deletion_log.append({'location': item['dataLocation'], 'status': 'deleted'})
            elif item.get('serviceType') == 'DynamoDB':
                target_table = dynamodb.Table(item['tableName'])
                target_table.delete_item(Key={'userId': user_id})
                deletion_log.append({'location': item['dataLocation'], 'status': 'deleted'})
        except Exception as e:
            deletion_log.append({'location': item['dataLocation'], 'error': str(e)})

    # Record the erasure in the consent table for the audit trail
    consent_table = dynamodb.Table('UserConsentRecords')
    consent_table.put_item(Item={
        'userId': user_id,
        'timestamp': int(datetime.now(timezone.utc).timestamp()),
        'consentType': 'erasure_completed',
        'deletedLocations': len(deletion_log),
        'requestId': request_id
    })

    return {
        'statusCode': 200,
        'body': json.dumps({
            'message': 'Data erasure completed',
            'deletedLocations': len(deletion_log),
            'details': deletion_log
        })
    }
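An erasure request can then be triggered from the CLI or wired to a Step Functions workflow. A minimal invocation sketch; the user ID and request ID are placeholders for values your DSAR intake process would supply:

```shell
# Trigger erasure for one user and capture the deletion log
aws lambda invoke \
  --function-name gdpr-data-erasure \
  --cli-binary-format raw-in-base64-out \
  --payload '{"userId":"user-12345","requestId":"dsar-2025-001"}' \
  erasure-result.json
cat erasure-result.json
```

Keep the erasure-result.json output with the DSAR ticket: the deletion log is the evidence a regulator will ask for.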
Configure Monitoring and Breach Detection
~10 minutes
Set up systems to detect potential data breaches and ensure compliance with GDPR's 72-hour notification requirement.
Console Steps
4.1 Enable CloudTrail Logging
- Navigate to CloudTrail → "Create trail"
- Trail name: gdpr-compliance-trail
- Apply to all regions: Yes
- Create new S3 bucket in eu-west-1
- Enable log file encryption with KMS
- Enable log file validation
# Create S3 bucket for audit logs in EU
aws s3api create-bucket \
  --bucket your-company-gdpr-audit-logs \
  --region eu-west-1 \
  --create-bucket-configuration LocationConstraint=eu-west-1

# Enable encryption on audit bucket
aws s3api put-bucket-encryption \
  --bucket your-company-gdpr-audit-logs \
  --server-side-encryption-configuration \
  '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"aws:kms"}}]}'

# Create CloudTrail
# NOTE: unlike the console, the CLI does not add the bucket policy that
# grants cloudtrail.amazonaws.com write access; attach one first or
# create-trail will fail with InsufficientS3BucketPolicyException
aws cloudtrail create-trail \
  --name gdpr-compliance-trail \
  --s3-bucket-name your-company-gdpr-audit-logs \
  --is-multi-region-trail \
  --enable-log-file-validation

# Start logging
aws cloudtrail start-logging --name gdpr-compliance-trail
4.2 Create Breach Alert SNS Topic
- Navigate to SNS → "Create topic"
- Type: Standard
- Name: gdpr-breach-alerts
- Create subscription with security team email
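The same topic and subscription can be scripted. A sketch; the account ID and email address are placeholders, and the recipient must click the confirmation link SNS emails them before alerts are delivered:

```shell
# Create the alert topic in an EU region
aws sns create-topic --name gdpr-breach-alerts --region eu-west-1

# Subscribe the security team (topic ARN and email are placeholders)
aws sns subscribe \
  --topic-arn arn:aws:sns:eu-west-1:123456789012:gdpr-breach-alerts \
  --protocol email \
  --notification-endpoint security-team@example.com \
  --region eu-west-1
```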
4.3 Enable AWS Config Rules
- Navigate to AWS Config → "Rules"
- Add rule: s3-bucket-server-side-encryption-enabled
- Add rule: s3-bucket-public-read-prohibited
- Add rule: rds-storage-encrypted
- Add rule: cloud-trail-enabled
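Each managed rule can also be enabled via the CLI with put-config-rule; the pattern below shows the first rule, and the other three follow the same shape with their own AWS-managed source identifiers:

```shell
# Enable an AWS-managed Config rule (repeat for each rule in the list above)
aws configservice put-config-rule --config-rule '{
  "ConfigRuleName": "s3-bucket-server-side-encryption-enabled",
  "Source": {
    "Owner": "AWS",
    "SourceIdentifier": "S3_BUCKET_SERVER_SIDE_ENCRYPTION_ENABLED"
  }
}' --region eu-west-1
```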
4.4 Enable Amazon Macie for PII Discovery
- Navigate to Amazon Macie
- Click "Get started" → "Enable Macie"
- Create classification job for S3 buckets with customer data
- Schedule weekly scans to discover unprotected PII
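Macie can likewise be enabled and scheduled from the CLI. A sketch using the macie2 API; the account ID, bucket name, and scan day are placeholder assumptions you would adjust:

```shell
# Enable Macie in the account (Macie is enabled per region)
aws macie2 enable-macie --region eu-west-1

# Create a weekly classification job over a bucket holding customer data
# (account ID and bucket name are placeholders)
aws macie2 create-classification-job \
  --job-type SCHEDULED \
  --schedule-frequency '{"weeklySchedule":{"dayOfWeek":"MONDAY"}}' \
  --name gdpr-pii-weekly-scan \
  --s3-job-definition '{"bucketDefinitions":[{"accountId":"123456789012","buckets":["your-gdpr-compliant-bucket"]}]}' \
  --region eu-west-1
```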
Validate Your Configuration
Complete these checks to ensure your AWS infrastructure meets GDPR requirements:
Security Validation Script
Run this script to verify your GDPR compliance configuration:
#!/bin/bash
# GDPR Compliance Validation Script
echo "Validating GDPR compliance configuration..."

# Check consent table exists
echo "Checking consent management system..."
if aws dynamodb describe-table --table-name UserConsentRecords --region eu-west-1 > /dev/null 2>&1; then
  echo "✓ UserConsentRecords table exists"
else
  echo "✗ UserConsentRecords table not found"
fi

# Check data mapping table
echo "Checking data mapping system..."
if aws dynamodb describe-table --table-name UserDataMapping --region eu-west-1 > /dev/null 2>&1; then
  echo "✓ UserDataMapping table exists"
else
  echo "✗ UserDataMapping table not found"
fi

# Check CloudTrail
echo "Checking CloudTrail configuration..."
if aws cloudtrail get-trail-status --name gdpr-compliance-trail > /dev/null 2>&1; then
  echo "✓ GDPR CloudTrail is configured"
else
  echo "✗ GDPR CloudTrail not found"
fi

# Check for non-EU S3 buckets
echo "Checking S3 bucket regions..."
EU_REGIONS="eu-west-1 eu-west-2 eu-west-3 eu-central-1 eu-central-2 eu-north-1 eu-south-1 eu-south-2"
NON_EU=0
for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text 2>/dev/null); do
  region=$(aws s3api get-bucket-location --bucket "$bucket" --query 'LocationConstraint' --output text 2>/dev/null)
  # get-bucket-location returns None for buckets in us-east-1
  if [ "$region" = "None" ]; then region="us-east-1"; fi
  if ! echo "$EU_REGIONS" | grep -qw "$region"; then
    echo "  ✗ Non-EU bucket: $bucket ($region)"
    NON_EU=$((NON_EU + 1))
  fi
done
if [ "$NON_EU" -eq 0 ]; then
  echo "✓ All S3 buckets are in EU regions"
fi

echo ""
echo "GDPR compliance validation complete!"
Common Mistakes to Avoid
Storing EU data in US regions without adequate safeguards. Always use EU regions for EU personal data or implement Standard Contractual Clauses.
Making consent withdrawal harder than giving consent. GDPR Article 7(3) requires withdrawal to be equally easy.
Forgetting data in backups, logs, and caches during erasure requests. All copies must be deleted.
Lacking audit trails for compliance activities. All GDPR activities must be documented and traceable.
Not conducting Data Protection Impact Assessments for high-risk processing activities involving profiling or large-scale data.
Want Continuous GDPR Compliance Monitoring?
Manual compliance checks are time-consuming and error-prone. AWSight automatically monitors your data residency, encryption, and access patterns—alerting you before violations become costly fines.