Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 8+ years of experience, focusing on migrating data pipelines to AWS and Snowflake. Key skills include ETL/ELT, Python programming, and MongoDB knowledge. Contract length and pay rate are unspecified.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
August 30, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Unknown
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
United States
🧠 - Skills detailed
#Programming #Python #DataAccuracy #DataArchitecture #Scripting #ETL #ELT #DataModeling #Databases #Agile #Cloud #Snowflake #Informatica #MongoDB #DataPipeline #AWS #Scala #Security #DataEngineering #Teradata
Role description
Capstone IT is helping our client find a Senior Data Engineer to lead the modernization of their data pipelines and architecture. The work centers on migrating existing pipelines from Informatica and Teradata to modern cloud-based technologies: AWS, Snowflake, and MongoDB. The engineer will be hands-on with data modeling and ETL/ELT and will play a key role in driving cloud data solutions using Python and document databases. This role suits someone with strong technical acumen, an innovative mindset, and a passion for modern data technologies.

Top 4 Skills
1. AWS & Snowflake data architecture
2. ETL/ELT and data warehousing (Teradata, Informatica)
3. Python programming for data engineering
4. MongoDB and document database knowledge

Responsibilities
• Lead cloud data modernization from Teradata/Informatica to AWS and Snowflake
• Design and implement scalable data models and warehouses in Snowflake
• Develop clean, testable, and documented code following defined standards
• Collaborate in Agile ceremonies and proactively contribute to team goals
• Maintain data accuracy and availability and meet non-functional requirements (NFRs)
• Participate in the on-call rotation after conversion to permanent hire
• Adhere to security and communication protocols

Qualifications
• Strong experience with AWS, Snowflake, and cloud data pipelines
• 8+ years of experience in data engineering
• Background in data warehousing and ETL tools (Teradata, Informatica)
• Proficient in Python and scripting for data workflows
• Experience with MongoDB or other document-based databases preferred
• Familiarity with Agile development methodologies
• Strong communicator and collaborator