

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a 6-month remote contract (with potential extension) paying $60.00-$70.00/hr on W2. It requires US citizenship, a Bachelor's degree in Computer Science or a related field, 6+ years of experience, proficiency in Python and Terraform, and knowledge of data architecture and compliance.
Country: United States
Currency: $ USD
Day rate: 560
Date discovered: September 4, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Visualization #AI (Artificial Intelligence) #Data Quality #Version Control #Data Architecture #Python #AWS (Amazon Web Services) #Storage #Automation #Compliance #Data Science #Computer Science #GIT #ML (Machine Learning) #Data Warehouse #Apache Spark #Spark (Apache Spark) #Agile #Tableau #Observability #Security #Azure #Data Ingestion #Programming #Data Engineering #Data Processing #Data Lake #Automated Testing #Anomaly Detection #Deployment #Data Security #Cloud #Apache Kafka #Documentation #"ETL (Extract, Transform, Load)" #AWS Kinesis #Data Pipeline #Terraform #Kafka (Apache Kafka)
Role description
Title : Data Engineer
Location : REMOTE
Contract : 6 Month Contract with potential extension
Citizenship Requirement: US Citizenship
Shift : 4/10 1st Shift
Pay : $60.00/hr - $70.00/hr on W2!
Job Description
This Lockheed Martin Enterprise Business & Digital Transformation (EBDT) Data Engineer position will be in the EBDT Innovation and Technology Solutions (ITS) organization. As a Data Engineer, you will be responsible for the development, optimization, and management of data ingestion, transformation, and storage processes using modern frameworks. The successful candidate will implement infrastructure-as-code and automation solutions to streamline and build resiliency into data processing workflows. You will work closely with our data science and analytics teams to create and optimize data and machine learning models that support stakeholder needs.
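As a purely illustrative sketch (not part of the posting), a batch ingestion and transformation step of the kind described above might look like the following in PySpark; the bucket paths and column names are assumptions made for the example.

```python
# Minimal PySpark sketch: ingest raw CSV, clean it, and land it in a
# partitioned Parquet zone. Paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-orders").getOrCreate()

# Read raw files landed by an upstream process (assumed location).
raw = spark.read.option("header", "true").csv("s3a://example-bucket/raw/orders/")

# Basic cleansing: require a key, normalize the event timestamp,
# and derive a partition column.
clean = (
    raw.dropna(subset=["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write to the curated zone of the data lake, partitioned for pruning.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"
)
```

Partitioning on a derived date column is a common choice in this kind of step because downstream warehouse and ML workloads typically filter by date.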
The responsibilities of this position include, but are not limited to:
• Design, build, and maintain large-scale data systems, including data warehouses, data lakes, and data pipelines
• Develop and implement data architectures that meet the needs of our business stakeholders
• Collaborate with data scientists and analysts to develop and deploy machine learning models and data products
• Ensure data quality, security, and compliance with standards
• Develop and maintain technical documentation of data systems and architectures
• Troubleshoot and resolve data-related issues and optimize system performance
• Develop and implement automated testing and deployment scripts to ensure smooth and efficient delivery of data solutions (a brief illustrative sketch follows this list)
• Collaborate with cross-functional teams to identify and prioritize data requirements and develop solutions to meet business needs
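As a hedged illustration of the automated testing responsibility above (again, not taken from the posting), data quality rules are often expressed as a small pytest suite run in the deployment pipeline; the table location, columns, and rules below are assumptions.

```python
# Hypothetical data quality tests for a curated table, runnable with pytest.
import pytest
from pyspark.sql import SparkSession, functions as F

CURATED_PATH = "s3a://example-bucket/curated/orders/"  # assumed location


@pytest.fixture(scope="session")
def orders_df():
    spark = SparkSession.builder.appName("dq-tests").getOrCreate()
    return spark.read.parquet(CURATED_PATH)


def test_order_id_is_present_and_unique(orders_df):
    # The primary key must be non-null and unique across the table.
    total = orders_df.count()
    distinct_ids = orders_df.select("order_id").dropna().distinct().count()
    assert distinct_ids == total


def test_no_future_dated_orders(orders_df):
    # Records stamped after "now" usually indicate an upstream ingestion bug.
    future = orders_df.filter(F.col("order_ts") > F.current_timestamp()).count()
    assert future == 0
```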
Basic Qualifications
• Bachelor's degree in Computer Science, Information Technology, or a related field
• 6+ years of experience in data engineering, software development, or a related field
• Experience programming in Python and with infrastructure-as-code tools such as Terraform
• Experience with data architecture, data modeling, and data warehousing concepts
• Experience with data pipeline tools such as Apache Spark, Apache Kafka, and AWS Kinesis (see the sketch after this list)
• Understanding of data security and compliance principles and practices
• Excellent problem-solving skills, with the ability to analyze complex data systems and identify areas for improvement
• Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams
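To make the pipeline-tooling qualification above concrete, here is a rough Python sketch of a Kafka consumer that lands events as newline-delimited JSON, using the kafka-python package; the topic, broker address, consumer group, and output path are hypothetical.

```python
# Hypothetical landing job: consume JSON events from Kafka and append them
# to a local NDJSON file. A production pipeline would batch to object storage.
import json

from kafka import KafkaConsumer  # kafka-python

consumer = KafkaConsumer(
    "orders-events",                       # assumed topic name
    bootstrap_servers="localhost:9092",    # assumed broker
    group_id="landing-writer",             # assumed consumer group
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

with open("orders-events.jsonl", "a", encoding="utf-8") as sink:
    for message in consumer:
        # message.value is already a dict thanks to the deserializer above.
        sink.write(json.dumps(message.value) + "\n")
```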
Desired Skills
• Experience with Artificial Intelligence and Machine Learning technologies
• Experience with cloud-based data platforms such as AWS and Azure
• Experience with agile development methodologies and version control systems such as Git
• Experience with data visualization tools such as Tableau and Cognos Analytics
• Familiarity with complex data regulatory requirements such as NIST 800-171 and SOX
• Familiarity with OpenTelemetry standards
• Experience optimizing observability data and performing real-time anomaly detection (see the sketch below)
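As a final illustrative sketch tied to the observability items above (not part of the posting), the snippet below wraps a simple z-score outlier check in an OpenTelemetry span; the span name, sample values, and threshold are assumptions.

```python
# Hypothetical example: flag an outlier latency sample and record the result
# as an attribute on an OpenTelemetry span exported to the console.
import statistics

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer(__name__)


def is_anomalous(window, value, threshold=3.0):
    # Flag values more than `threshold` standard deviations from the window mean.
    if len(window) < 2:
        return False
    stdev = statistics.pstdev(window)
    return stdev > 0 and abs(value - statistics.mean(window)) / stdev > threshold


with tracer.start_as_current_span("latency-anomaly-check") as span:
    samples = [120.0, 118.5, 121.2, 119.8]   # made-up latency history (ms)
    latest = 450.0                            # made-up new observation (ms)
    flagged = is_anomalous(samples, latest)
    span.set_attribute("anomaly.flagged", flagged)
    print("anomaly detected:", flagged)
```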