Changing Technologies, Inc.

AWS Data Lake Developer (Senior Level)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a contract role for a Senior AWS Data Lake Developer, running more than 6 months and paying $91.00 per hour. Required skills include 7+ years of experience with AWS services, ETL processes, and programming (Python, Java). The work location is hybrid in Raleigh, NC.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$728 (= 8 hours × $91.00/hour)
🗓️ - Date
November 13, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Raleigh, NC 27603
🧠 - Skills detailed
#MS SQL (Microsoft SQL Server) #Cloud #Microsoft SQL Server #Agile #SQL Server #Compliance #Informatica #ML (Machine Learning) #Data Access #Microsoft SQL #Data Warehouse #S3 (Amazon Simple Storage Service) #Scripting #Data Engineering #Unix #Looker #Databases #Data Modeling #Scala #Shell Scripting #DMS (Data Migration Service) #Data Processing #.Net #Apache Hive #Storage #Java #Data Lake #Scrum #Hadoop #Data Ingestion #Microsoft Power BI #SQL (Structured Query Language) #Talend #Big Data #AWS (Amazon Web Services) #Azure #DevOps #Code Reviews #Oracle #Data Analysis #ETL (Extract, Transform, Load) #VBA (Visual Basic for Applications) #Python #Redshift #Datasets #Data Pipeline #Security #Database Design #Spark (Apache Spark) #Migration #BI (Business Intelligence) #API (Application Programming Interface) #Automation #AWS Glue #Programming #Bash #Data Architecture
Role description
Overview

We are seeking a highly skilled Senior AWS Data Lake Developer to join our dynamic data engineering team. This role involves designing, developing, and maintaining scalable data lake solutions on AWS, leveraging big data technologies and cloud services to enable advanced analytics and data-driven decision-making. The ideal candidate will possess extensive experience with cloud-based data architectures, ETL processes, and a broad set of programming and database skills to support complex data ecosystems. This position offers an exciting opportunity to work on cutting-edge data solutions in a collaborative environment focused on innovation and excellence.

Responsibilities
• Design, develop, and optimize scalable data lake architectures on AWS using big data tools such as Apache Hive, Spark, and related technologies.
• Implement robust ETL workflows using tools like Informatica and Talend, plus custom scripting in Python or Bash/Unix shell, to extract, transform, and load large datasets efficiently (an illustrative sketch follows this description).
• Develop and maintain data pipelines integrating diverse sources such as Microsoft SQL Server, Oracle, and other relational databases, ensuring high performance and reliability.
• Collaborate with cross-functional teams to gather requirements, design data models, and implement solutions supporting analytics platforms like Looker.
• Build and maintain data warehouses supporting business intelligence initiatives using SQL, VBA, and advanced analysis techniques.
• Develop RESTful APIs for seamless data access across applications.
• Perform data analysis, model training, and validation to support machine learning initiatives.
• Ensure adherence to best practices in database design, security, compliance, and performance tuning within cloud environments.
• Participate in Agile development cycles, contributing to sprint planning, code reviews, and continuous improvement efforts.

Qualifications
• Proven expertise in designing and implementing large-scale data lakes on AWS, with extensive experience in AWS services such as S3, Glue, EMR, and Redshift, along with familiarity with Azure Data Lake platforms.
• Strong programming skills in Java, Python, and scripting languages including Bash (Unix shell) or similar.
• Deep understanding of big data ecosystems including Hadoop, Apache Hive, Spark, and related technologies.
• Experience with ETL tools such as Informatica or Talend; proficiency in SQL for complex query development across multiple databases (Microsoft SQL Server, Oracle).
• Knowledge of modern analytics tools like Looker; ability to translate business needs into technical solutions.
• Familiarity with database design principles, data modeling, and warehousing concepts.
• Experience working within an Agile environment; strong problem-solving skills combined with excellent analysis capabilities.
• Ability to perform model training and validation for machine learning applications is a plus.
• Excellent communication skills with the ability to collaborate effectively across teams.

This position is ideal for a seasoned professional passionate about leveraging cloud technologies for innovative data solutions while working in a fast-paced environment committed to excellence in analytics-driven decision-making.
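For a sense of the scripted ETL work described above, below is a minimal, hypothetical PySpark sketch of the kind of job that might run on AWS Glue or EMR: it extracts raw CSV from an S3 landing zone, applies basic cleansing, and loads partitioned Parquet into a curated zone. All bucket names, paths, and column names are placeholder assumptions, not details from this posting.

# Illustrative sketch only -- not this team's actual pipeline.
# Bucket names, paths, and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-sketch").getOrCreate()

# Extract: read raw CSV files landed in a hypothetical S3 "raw" zone.
# (On EMR, s3:// resolves via EMRFS; standalone Spark installs typically use s3a://.)
raw = spark.read.option("header", True).csv("s3://example-raw-zone/orders/")

# Transform: basic typing, deduplication, and filtering; a production job
# would add validation and business rules on top of this.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
)

# Load: write date-partitioned Parquet to a curated zone that engines such
# as Hive, Athena, or Redshift Spectrum can query directly.
(clean
    .withColumn("order_date", F.to_date("order_ts"))
    .write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-curated-zone/orders/"))

spark.stop()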
Job Types: Full-time, Contract
Pay: $91.00 per hour
Expected hours: 40 per week
Benefits: Health insurance
Experience:
• Proficient in using various AWS services: 7 years (Required)
• Modernizing applications (refactoring, migration): 7 years (Required)
• Data ingestion & storage (S3, Kinesis, DMS): 7 years (Required)
• Data processing & transformation (AWS Glue, EMR): 7 years (Required)
• DevOps practices and automation tools: 7 years (Required)
• Software development (Python, Java/.NET coding): 10 years (Required)
• Strong communication & collaboration skills: 10 years (Required)
• Agile/Scrum framework environments: 7 years (Required)
• Data analytics and BI solutions (Power BI): 3 years (Required)
• AWS Certification: 1 year (Required)
• State Government: 1 year (Required)
License/Certification: U.S. citizenship or a Green Card (Required)
Ability to Commute: Raleigh, NC 27603 (Required)
Work Location: Hybrid remote in Raleigh, NC 27603