Voto Consulting LLC

AWS Engineer – Tokenization & Data Security

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Engineer – Tokenization & Data Security, onsite in McLean, VA, with a contract of unspecified duration. Pay is $55–$60/hr on C2C. Requires 6–10 years of experience in AWS and data security, plus expertise in tokenization and containerization.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
McLean, VA
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Cloud #Automation #Kubernetes #Lambda (AWS Lambda) #IAM (Identity and Access Management) #DevOps #Data Security #EC2 #Compliance #Scripting #Deployment #Ansible #Security #Scala #GDPR (General Data Protection Regulation) #S3 (Amazon Simple Storage Service) #Infrastructure as Code (IaC) #PCI (Payment Card Industry) #Data Pipeline #DevSecOps #Informatica #Documentation #Data Engineering #IP (Internet Protocol) #Python #IICS (Informatica Intelligent Cloud Services) #AWS (Amazon Web Services)
Role description
Job Title: AWS Engineer – Tokenization & Data Security
Location: McLean, VA (local or nearby profiles) | Onsite, 5 days/week
Rate: $55–$60/hr on C2C
Openings: 4 (IP-Stratacent)

We are seeking a skilled AWS Engineer with strong experience in tokenization, data security, and containerized application management. The ideal candidate will have deep technical expertise in AWS cloud services, automation, scripting, and data protection frameworks. This role requires hands-on experience with EKS clusters, CFT updates, and secure data handling (PII/confidential data masking and tokenization).

Key Responsibilities:
• Design, build, and maintain containerized applications on AWS (EKS/ECS) using automation and custom scripts.
• Implement and manage tokenization and detokenization processes to protect PII and confidential data.
• Work with data security tools to ensure compliance with internal and external security standards (e.g., PCI DSS, GDPR).
• Build and maintain infrastructure as code (IaC) using CloudFormation Templates (CFTs) and automation pipelines.
• Develop and manage scripts (Python, Shell, Ansible, etc.) to automate application builds and deployments.
• Collaborate with security and data engineering teams to implement data masking, token mapping, and encryption solutions.
• Monitor, optimize, and troubleshoot EKS clusters, ensuring high performance and scalability.
• Maintain documentation on infrastructure design, tokenization workflows, and data protection measures.
• Participate in audits, reviews, and assessments of data security systems.

Required Skills & Experience:
• 6–10 years of total IT experience with a strong focus on AWS Cloud Engineering and Data Security.
• Hands-on experience with AWS services: EC2, EKS, Lambda, S3, IAM, CloudFormation, and KMS.
• Proven experience in containerization and Kubernetes (EKS) management, including upgrades and patching.
• Proficiency in Python scripting and automation for build/deployment processes.
• Strong understanding of tokenization concepts, token mapping, and data masking techniques.
• Experience with data security tools used for tokenization/detokenization and encryption key management (e.g., Protegrity, Thales CipherTrust, Voltage SecureData, or similar).
• Deep knowledge of PII and confidential data protection standards.
• Experience updating and maintaining CloudFormation Templates (CFTs) and other IaC frameworks.
• Solid understanding of security compliance frameworks (PCI DSS, GDPR, HIPAA).

Nice-to-Have Skills:
• Exposure to ETL tools and data pipelines (Informatica, IICS).
• Familiarity with DevSecOps and integrating security within CI/CD pipelines.
• Knowledge of AWS KMS, encryption mechanisms, and key rotation policies.

Soft Skills:
• Strong analytical and problem-solving abilities.
• Excellent communication and documentation skills.
• Ability to collaborate with cross-functional teams (DevOps, Data, Security).
• Self-driven with a proactive approach to automation and process improvement.
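For candidates unfamiliar with the tokenization and token-mapping concepts this role centers on, here is a minimal, illustrative Python sketch of vault-style tokenization. It is an assumption-laden toy, not the API of Protegrity, Thales CipherTrust, Voltage SecureData, or any other tool named above: a real deployment would persist the vault in an encrypted store and manage keys via AWS KMS.

```python
import secrets


class TokenVault:
    """Toy vault-style tokenizer: maps sensitive values to random
    surrogate tokens and keeps a reverse map for detokenization.
    Illustrative only -- production systems use an encrypted,
    access-controlled vault and KMS-managed keys."""

    def __init__(self):
        self._forward = {}  # sensitive value -> token
        self._reverse = {}  # token -> sensitive value

    def tokenize(self, value: str) -> str:
        # Return the existing token so the same value always maps
        # to the same surrogate (consistent token mapping).
        if value in self._forward:
            return self._forward[value]
        token = "tok_" + secrets.token_hex(8)  # random, non-derivable
        self._forward[value] = token
        self._reverse[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Lookup-only: the token carries no information about the
        # original value, so detokenization requires vault access.
        return self._reverse[token]


vault = TokenVault()
token = vault.tokenize("123-45-6789")  # e.g. a PII SSN
original = vault.detokenize(token)
```

The key property shown is that the token is random rather than derived from the value (unlike encryption), so it is only reversible through the vault itself.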