Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in McLean, VA, on a contract basis. It requires 8+ years in data engineering, expertise in Python, SQL, and AWS, and financial services experience. Only US Citizens, Green Card holders, or EAD candidates are eligible.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
πŸ—“οΈ - Date discovered
September 17, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
McLean, VA
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #MongoDB #Data Governance #Redshift #Datasets #Data Engineering #Scala #ETL (Extract, Transform, Load) #Azure #Data Science #Data Warehouse #Data Lake #Security #AWS S3 (Amazon Simple Storage Service) #Kafka (Apache Kafka) #Snowflake #Agile #Distributed Computing #SQL (Structured Query Language) #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #Computer Science #GCP (Google Cloud Platform) #Compliance #NoSQL #Hadoop #Databases #Java #Python #AWS (Amazon Web Services)
Role description
Senior Data Engineer
Location: McLean, VA
Employment Type: Contract
Note: This role is open only to US Citizens, Green Card holders, and EAD candidates.
Key Responsibilities
• Architect and implement robust ETL/ELT pipelines using Spark, Python, and SQL
• Design and manage data lakes and data warehouses (e.g., Snowflake, Redshift)
• Collaborate with data scientists, analysts, and product teams to deliver clean, reliable datasets
• Optimize data workflows for performance, scalability, and cost-efficiency
• Ensure data governance, security, and compliance with financial regulations
• Develop real-time data streaming solutions using Kafka or similar tools
• Mentor junior engineers and contribute to best practices in data engineering
Required Skills & Qualifications
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field
• 8+ years of experience in data engineering, preferably in financial services
• Expertise in Python, SQL, and one or more JVM languages (Java/Scala)
• Hands-on experience with AWS (S3, EMR, Lambda, Glue, Redshift) or Azure/GCP
• Strong knowledge of distributed computing frameworks (Spark, Hadoop)
• Experience with NoSQL databases (MongoDB, Cassandra)
• Familiarity with CI/CD pipelines and Agile methodologies
• Excellent problem-solving and communication skills