Data Lake Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Lake Engineer in Dallas, TX (Hybrid) for 24 months+, requiring a minimum of 12 years of experience. Key skills include AWS (Lake Formation, S3, Glue), Python, PySpark, and DevOps (GitLab, Terraform).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 26, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Cloud #GitLab #Data Lake #AWS (Amazon Web Services) #Terraform #Lambda (AWS Lambda) #Python #Spark (Apache Spark) #DynamoDB #IAM (Identity and Access Management) #PySpark #Agile #S3 (Amazon Simple Storage Service) #DevOps
Role description
Job Title: Data Lake Engineer
Job Location: Dallas, TX (Hybrid/Locals)
Duration: 24 months+, with possibility of extension. W2 only; a minimum of 12 years of experience is required.
Overview:
• The client is refactoring how data is collected, stored, and exposed across the enterprise, using the latest cloud architectures and tooling and a DevOps mindset for quicker feedback loops, while leveraging Agile principles.
• The team is looking to solve these pain points with a Data Lakehouse solution that stores, governs, and provides the right data to the right customer at the right time, and that will integrate with and accelerate both business and technology analytical users.
• This solution is, and will remain, the foundation for Southwest Airlines going forward, built on a cloud-based platform that can scale and continuously integrate with new tools and concepts to provide exponential value.
Technical Requirements
AWS
• Lake Formation
• S3
• Glue - Crawler, Catalog, Glue Registry, Glue Jobs
• Step Functions
• DynamoDB
• IAM
• Lambda
DevOps
• GitLab
Languages
• Python (AWS)
• PySpark
Frameworks
• Serverless
• Stacker
• CloudFormation
• Terraform