

Voto Consulting LLC
AWS Engineer - Tokenization & Data Security
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Engineer specializing in Tokenization & Data Security, based in McLean, VA. Contract length and pay rate are unspecified. Requires 6–10 years of IT experience, AWS proficiency, and strong tokenization expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
McLean, VA
-
🧠 - Skills detailed
#IAM (Identity and Access Management) #EC2 #GDPR (General Data Protection Regulation) #Cloud #Scripting #Security #Informatica #IICS (Informatica Intelligent Cloud Services) #Deployment #AWS (Amazon Web Services) #Automation #Data Pipeline #Infrastructure as Code (IaC) #PCI (Payment Card Industry) #DevSecOps #Kubernetes #ETL (Extract, Transform, Load) #Python #Compliance #DevOps #Scala #Data Engineering #S3 (Amazon Simple Storage Service) #Lambda (AWS Lambda) #Data Security #Documentation #Ansible
Role description
Title: AWS Engineer – Tokenization & Data Security
Location: McLean, VA
Must Have: Tokenization & data security experience
Role Overview:
• We are seeking a skilled AWS Engineer with strong experience in tokenization, data security, and containerized application management.
• The ideal candidate will have deep technical expertise in AWS cloud services, automation, scripting, and data protection frameworks.
• This role requires hands-on experience with EKS clusters, CloudFormation Template (CFT) updates, and secure data handling (PII/confidential data masking and tokenization).
Key Responsibilities:
• Design, build, and maintain containerized applications on AWS (EKS/ECS) using automation and custom scripts.
• Implement and manage tokenization and detokenization processes to protect PII and confidential data (a minimal sketch follows this list).
• Work with data security tools to ensure compliance with internal and external security standards (e.g., PCI DSS, GDPR).
• Build and maintain infrastructure as code (IaC) using CloudFormation Templates (CFTs) and automation pipelines.
• Develop and manage scripts (Python, Shell, Ansible, etc.) to automate application builds and deployments.
• Collaborate with security and data engineering teams to implement data masking, token mapping, and encryption solutions.
• Monitor, optimize, and troubleshoot EKS clusters, ensuring high performance and scalability.
• Maintain documentation on infrastructure design, tokenization workflows, and data protection measures.
• Participate in audits, reviews, and assessments of data security systems.
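To ground the tokenization/detokenization responsibility, here is a minimal vault-based sketch in Python. It is illustrative only: the TokenVault class, its in-memory storage, and the token format are assumptions for this example; a production system would sit behind a hardened token service (such as the vendor tools listed under Required Skills) with encrypted, persistent storage and audited access.

```python
# Minimal sketch of vault-based tokenization (assumption: in-memory demo only;
# a real vault would be a hardened, encrypted, access-audited service).
import secrets


class TokenVault:
    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so equal PII values map to equal tokens.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it reveals nothing about the original value.
        token = "tok_" + secrets.token_urlsafe(16)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # A lookup, not a decryption: only the vault can reverse a token.
        return self._token_to_value[token]


vault = TokenVault()
t = vault.tokenize("123-45-6789")          # e.g. 'tok_kX3v...'
assert vault.detokenize(t) == "123-45-6789"
```

The key property, and a reason tokenization is often preferred over reversible encryption for PCI DSS scope reduction, is that the token itself carries no recoverable information about the original value.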
Required Skills & Experience:
• 6–10 years of total IT experience with a strong focus on AWS cloud engineering and data security.
• Hands-on experience with AWS services – EC2, EKS, Lambda, S3, IAM, CloudFormation, and KMS.
• Proven experience in containerization and Kubernetes (EKS) management, including upgrades and patching.
• Proficiency in Python scripting and automation for build/deployment processes.
• Strong understanding of tokenization concepts, token mapping, and data masking techniques.
• Experience with data security tools used for tokenization/detokenization and encryption key management (e.g., Protegrity, Thales CipherTrust, Voltage SecureData, or similar).
• Deep knowledge of PII and confidential data protection standards.
• Experience updating and maintaining CloudFormation Templates (CFTs) and other IaC frameworks (a deployment-automation sketch follows this list).
• Solid understanding of security compliance frameworks (PCI DSS, GDPR, HIPAA).
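As a concrete illustration of the CFT and automation items above, here is a minimal boto3 sketch that pushes an updated CloudFormation template and waits for the rollout to finish. The stack name, template file, and parameter are placeholders, not details from this posting.

```python
# Minimal sketch of automating a CloudFormation Template (CFT) update.
# Placeholders: stack name, template path, and parameter values.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("eks-cluster.yaml") as f:          # hypothetical CFT file
    template_body = f.read()

cfn.update_stack(
    StackName="tokenization-eks",            # hypothetical stack name
    TemplateBody=template_body,
    # Required acknowledgement when the template creates named IAM resources.
    Capabilities=["CAPABILITY_NAMED_IAM"],
    Parameters=[
        {"ParameterKey": "ClusterVersion", "ParameterValue": "1.29"},
    ],
)

# Block until the update completes; raises if the stack rolls back.
cfn.get_waiter("stack_update_complete").wait(StackName="tokenization-eks")
```

In a CI/CD pipeline the same update would typically run behind a CloudFormation change set review rather than a direct call, which is one place the DevSecOps item below fits in.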
Nice-to-Have Skills:
• Exposure to ETL tools and data pipelines (Informatica, IICS).
• Familiarity with DevSecOps and integrating security within CI/CD pipelines.
• Knowledge of AWS KMS, encryption mechanisms, and key rotation policies (see the sketch below).
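For the key-rotation item, the sketch below checks and enables automatic rotation on a customer-managed KMS key with boto3; the alias is a placeholder. DescribeKey is used first because the rotation APIs take a key ID or ARN rather than an alias.

```python
# Minimal sketch: ensure automatic rotation is enabled on a KMS key.
# Placeholder: the alias 'alias/pii-tokenization' is hypothetical.
import boto3

kms = boto3.client("kms")

# Resolve the alias to a key ID (DescribeKey accepts aliases; the
# rotation calls below do not).
key_id = kms.describe_key(KeyId="alias/pii-tokenization")["KeyMetadata"]["KeyId"]

if not kms.get_key_rotation_status(KeyId=key_id)["KeyRotationEnabled"]:
    # Turns on automatic rotation (yearly by default) for a
    # customer-managed symmetric encryption key.
    kms.enable_key_rotation(KeyId=key_id)
```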
Soft Skills:
• Strong analytical and problem-solving abilities.
• Excellent communication and documentation skills.
• Ability to collaborate with cross-functional teams (DevOps, Data, Security).
• Self-driven with a proactive approach to automation and process improvement.






