
AWS Data Loss Protection Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an "AWS Data Loss Protection Engineer" on a 6-month contract-to-hire basis in Reston, VA or Plano, TX. It requires 3+ years of experience in data engineering, AWS services, Python, and SQL, plus an AWS certification. An in-person interview is mandatory.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 6, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Reston, VA
Skills detailed: #SQL Queries #AWS (Amazon Web Services) #ML (Machine Learning) #Terraform #Scala #Scrum #SQL (Structured Query Language) #Agile #Data Loss Prevention #Databases #Shell Scripting #Scripting #Security #Monitoring #Cloud #ETL (Extract, Transform, Load) #Data Ingestion #Computer Science #S3 (Amazon Simple Storage Service) #Data Pipeline #Redshift #Data Engineering #Cybersecurity #GitHub #Python #Logging #Version Control #Data Science #Lambda (AWS Lambda) #Strategy #Compliance #Leadership #DevOps #Data Governance #Automation #Unix #Linux #Documentation
Role description
Reston, VA or Plano, TX; onsite interview required. (Local to either location; preference for Reston.)
6-month contract-to-hire (CTH).
US citizens or Green Card holders only; contract-to-hire position.
LinkedIn profile should have been created at least 3-4 years ago and should include a profile picture.
DETAILS NEEDED:
Availability to interview: Is the candidate willing to do an in-person interview in Reston, VA or Plano, TX?
Availability to start:
LinkedIn profile:
Last 4 of SSN:
MM/DD of birth:
Skill highlights: please indicate the number of years of experience with each of the following skills:
• Data engineering and data pipelines
• AWS services:
  • Glue
  • Lambda
  • Step Functions
  • S3
  • Redshift
  • EMR
  • Data Pipeline
• Python scripts for data ingestion, transformation, and automation.
• SQL queries for ETL and analytics workflows.
• Unix/Linux scripting, automation, and system-level data operations.
• Agile
• Terraform
• GitHub
• AWS CloudFormation
• AWS Certification required:
Job Description
• Senior AWS Engineer
• Drive strategy and execution of Fannie Mae's Data Loss Prevention (DLP) Program and/or other relevant FM InfoSec programs in collaboration with a wide range of stakeholders.
• Provide DLP input on the design and configuration of security controls across multiple capabilities, including firewall, proxy, endpoint, and messaging. Assess and influence risk-based prioritizations for DLP and other security controls.
• Advise on and assist with security, data, and technology initiatives that impact the entire organization.
Skills
• Experience developing and implementing a comprehensive Information Security Data Loss Prevention (DLP) program, including defining standards and influencing DLP product and strategy roadmaps. Experience with security control design and configuration in cloud environments.
• Experience assessing and identifying gaps in security controls, processes, and systems, and providing recommendations.
• Experience developing technology and control roadmaps, including research, planning, and stakeholder engagement. Experience identifying and selecting strategic options and identifying resources to meet the defined objectives.
• Experience identifying and determining levels of risk to an organization's networks and systems using cybersecurity techniques and tools such as penetration testing, application security, and intelligence. Experience analyzing data to identify trends or relationships and inform conclusions about the data.
• Experience in thought leadership, training, workforce assessment, and workforce development.
• Skilled in cloud technologies and cloud computing.
• Skills related to Security, including designing and evaluating security systems, identifying security threats, securing computers, assessing vulnerability, etc.
• Skills related to Governance and Compliance, including creating policies, evaluating compliance, conducting internal investigations, developing data governance, etc.
• Skills related to Influencing, including negotiating, persuading others, facilitating meetings, and resolving conflict.
• Skills related to Relationship Management, including managing and engaging stakeholders, customers, and vendors, building relationship networks, contracting, etc.
Education/Work Experience
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• Professional certification(s) desired.
• 10+ years of experience across the appropriate platform specific to cyber, compliance, and DLP.
• Solid IT background and experience.
• Experience as an application developer on projects similar in scope and responsibility.
Key Responsibilities
• Design, build, and maintain scalable, secure, and efficient data pipelines using AWS services such as Glue, Lambda, Step Functions, S3, Redshift, EMR, and Data Pipeline.
• Develop robust Python scripts for data ingestion, transformation, and automation (see the ingestion sketch after this list).
• Write and optimize complex SQL queries for ETL and analytics workflows.
• Operate in Unix/Linux environments for scripting, automation, and system-level data operations.
• Participate in Agile ceremonies (daily stand-ups, sprint planning, retrospectives) and contribute to iterative delivery of data solutions.
• Collaborate with cross-functional teams to gather requirements and translate them into high-level architecture and design documents.
• Communicate technical concepts clearly through documentation, presentations, and stakeholder meetings.
• Implement monitoring, logging, and alerting for data pipelines to ensure reliability and performance (see the monitoring sketch after this list).
• Apply DevOps best practices using GitHub, Terraform, and CloudFormation for infrastructure automation and CI/CD.
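
To make the pipeline and ingestion responsibilities concrete, here is a minimal sketch (not part of the original posting) of the kind of Lambda ingestion step this role describes: read a new CSV object from S3, apply a simple transformation, and load the result into Redshift through the Redshift Data API. All bucket, cluster, table, and column names are hypothetical placeholders.

import csv
import io

import boto3

# Hypothetical resource names -- assumptions for illustration only.
RAW_BUCKET = "example-raw-data"          # S3 landing bucket
CLUSTER_ID = "example-redshift-cluster"  # Redshift cluster identifier
DATABASE = "analytics"
DB_USER = "etl_user"

s3 = boto3.client("s3")
redshift_data = boto3.client("redshift-data")

def handler(event, context):
    """Lambda entry point, triggered by an S3 put event for a new CSV file."""
    # S3 event notifications carry the key of the newly written object.
    key = event["Records"][0]["s3"]["object"]["key"]

    # Ingest: read and parse the CSV from S3.
    body = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))

    # Transform: keep only completed events (illustrative filter).
    completed = [r for r in rows if r.get("status") == "completed"]

    # Load: one parameterized INSERT per row keeps the demo short; a real
    # pipeline would stage the file and issue a single Redshift COPY instead.
    for r in completed:
        redshift_data.execute_statement(
            ClusterIdentifier=CLUSTER_ID,
            Database=DATABASE,
            DbUser=DB_USER,
            Sql="INSERT INTO events (event_id, status) VALUES (:id, :status)",
            Parameters=[
                {"name": "id", "value": r["event_id"]},
                {"name": "status", "value": r["status"]},
            ],
        )
    return {"ingested": len(completed)}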
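
For the monitoring and alerting responsibility, one common pattern is to publish custom CloudWatch metrics from the pipeline code and alarm on them. The sketch below assumes a hypothetical metric namespace and pipeline name; in practice the matching CloudWatch alarm would live in Terraform or CloudFormation alongside the rest of the infrastructure.

import boto3

cloudwatch = boto3.client("cloudwatch")

NAMESPACE = "ExampleDataPipelines"  # hypothetical namespace, an assumption

def publish_pipeline_metrics(pipeline_name: str, rows_processed: int, failed: bool) -> None:
    """Emit custom metrics so dashboards and alarms can track each run."""
    cloudwatch.put_metric_data(
        Namespace=NAMESPACE,
        MetricData=[
            {
                "MetricName": "RowsProcessed",
                "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
                "Value": float(rows_processed),
                "Unit": "Count",
            },
            {
                "MetricName": "RunFailed",
                "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
                "Value": 1.0 if failed else 0.0,
                "Unit": "Count",
            },
        ],
    )

An alarm that fires when RunFailed is at least 1 in an evaluation period can then notify an SNS topic, covering the alerting half of the requirement.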
Additional Job Description
Required Qualifications
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 3+ years of experience in data engineering or a similar role.
• Strong hands-on experience with AWS data services (e.g., EMR, Glue, Lambda, Step Functions, S3, Redshift).
• Advanced proficiency in Python for scripting and automation.
• Solid experience with Unix/Linux shell scripting.
• Strong command of SQL and experience with relational databases.
• Proficiency with GitHub for version control and collaboration.
• Experience with Terraform and/or AWS CloudFormation for infrastructure-as-code.
• Experience working in Agile/Scrum environments.
• Excellent verbal and written communication skills.
• Proven ability to contribute to high-level solution design and architecture discussions.
• AWS certification (e.g., AWS Certified Data Analytics - Specialty, AWS Certified Solutions Architect, or equivalent).
Preferred Qualifications
• Exposure to machine learning pipelines or data science workflows.
• Experience with data governance, security, and compliance best practices.