Santcore Technologies

Senior Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Remote, Northern Virginia) with a contract length of "unknown." Pay rate is "unknown." Requires USC or GC status, a Bachelor's degree with eight years of experience or a Master's with six, and proficiency in Python, PySpark, and AWS services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
๐Ÿ—“๏ธ - Date
October 23, 2025
🕒 - Duration
Unknown
-
๐Ÿ๏ธ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
Northern Virginia, VA
-
🧠 - Skills detailed
#Deployment #ETL (Extract, Transform, Load) #Agile #AWS Glue #Big Data #Apache Spark #PySpark #Monitoring #DynamoDB #Computer Science #Kafka (Apache Kafka) #Data Modeling #Data Engineering #Scrum #Cloud #Data Processing #Python #Lambda (AWS Lambda) #Oracle #Redshift #Scala #Continuous Deployment #Kanban #Hadoop #Spark (Apache Spark) #Data Manipulation #Data Science #SQL (Structured Query Language) #Datasets #Data Integrity #Database Performance #Schema Design #AWS (Amazon Web Services)
Role description
Senior Data Engineer
Location: Northern Virginia (Remote)
Status: USC or GC required

Responsibilities include:
• Design, develop, and implement robust ETL solutions using Python and PySpark to extract, transform, and load data from various sources into AWS data services.
• Optimize ETL processes for performance and scalability using AWS Glue, EMR, Step Functions, and Lambda to ensure efficient data processing and timely delivery.
• Ensure data integrity and quality throughout the ETL process by implementing thorough data validation checks and error-handling mechanisms.
• Manage AWS services such as Glue, EMR, Step Functions, and Lambda, including configuration, monitoring, and troubleshooting, to maintain operational excellence.
• Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to understand data requirements and deliver tailored ETL solutions.
• Troubleshoot complex technical issues and provide advanced operational support to internal MITRE customers in AWS.

Required Qualifications:
• Bachelor's degree with eight years' related experience, or a master's degree with six years' related experience, preferably in a technical major such as engineering or computer science.
• Self-motivated, curious, and collaborative, with a passion for learning new technologies and developing new skills.
• Demonstrated experience developing ETL pipelines using Python and PySpark, with a strong understanding of data processing techniques.
• Expertise in SQL for data manipulation, querying, and optimization across database platforms including Postgres, DynamoDB, Oracle, and Redshift.
• Hands-on experience with AWS Glue, EMR, Step Functions, and Lambda for building and orchestrating ETL workflows in a cloud environment.
• Experience implementing Continuous Integration/Continuous Deployment (CI/CD) pipelines using AWS CDK or similar tools to automate deployment and testing of ETL solutions.

Preferred Qualifications:
• Previous experience leading or mentoring a team of developers/engineers in a collaborative environment.
• AWS certifications such as AWS Certified Developer or AWS Certified Solutions Architect, demonstrating proficiency in AWS services and best practices.
• Familiarity with big data technologies such as Apache Spark, Hadoop, or Kafka for processing large-scale datasets.
• Experience in data modeling and schema design for optimizing database performance and scalability.
• Experience working in Agile development methodologies, such as Scrum or Kanban, for iterative and collaborative project delivery.
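As a rough illustration of the data-validation and error-handling work this role describes, here is a minimal sketch in plain Python (no PySpark dependency, so it stays self-contained). The record schema, field names, and validation rules are hypothetical, not taken from the posting; a real pipeline would apply the same pattern inside a Glue job or PySpark transformation.

```python
# Minimal sketch of an ETL validation step: route bad records to a reject
# set instead of failing the whole load. Schema and rules are hypothetical.
from datetime import date

REQUIRED_FIELDS = ("id", "amount", "event_date")  # hypothetical schema


def validate_row(row: dict) -> list:
    """Return a list of validation errors for one record (empty = valid)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = row.get("amount")
    if amount is not None and not isinstance(amount, (int, float)):
        errors.append("amount is not numeric")
    return errors


def split_valid_invalid(rows):
    """Separate loadable records from rejected ones with their error lists."""
    valid, rejected = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejected.append((row, errs))
        else:
            valid.append(row)
    return valid, rejected


rows = [
    {"id": 1, "amount": 9.5, "event_date": date(2025, 10, 23)},
    {"id": 2, "amount": "N/A", "event_date": date(2025, 10, 23)},  # bad amount
]
good, bad = split_valid_invalid(rows)
```

The design choice here (quarantining invalid records rather than raising) mirrors the posting's emphasis on data integrity with timely delivery: one malformed row should not block the rest of the load.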