

Forbes Technical Consulting
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month remote contract. Key skills include Snowflake, SQL, Python, and AWS technologies. Experience with data migration, ETL processes, and agile environments is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
624
🗓️ - Date
April 30, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Batch #Python #Data Pipeline #Streamlit #Scrum #Databases #RDS (Amazon Relational Database Service) #Snowflake #Agile #Data Cleansing #SnowPipe #IICS (Informatica Intelligent Cloud Services) #Unix #AWS (Amazon Web Services) #API (Application Programming Interface) #Data Engineering #Redshift #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Scala #Migration #Informatica #S3 (Amazon Simple Storage Service) #Alteryx #Lambda (AWS Lambda) #Data Warehouse #Cloud #Kafka (Apache Kafka) #Scripting
Role description
Data Engineer, Snowflake
Duration: 6 months
Location: Remote; US-based
Responsibilities
• Design, build, and maintain robust data pipelines and warehouse solutions in a cloud-first environment
• Implement advanced Snowflake capabilities (Streams, Tasks, Snowpipe, data sharing) for real-time and batch processing
• Lead migration initiatives from legacy data warehouses to Snowflake with minimal disruption
• Build data applications including Streamlit apps and Snowflake Native Apps
• Work hands-on with AWS cloud architecture using S3, Lambda, API Gateway, RDS, and related services
• Collaborate with Product teams in an agile/scrum environment to translate requirements into solutions
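For illustration only (this sketch is not part of the posting): the AWS bullet above describes event-driven pipeline work, and a common pattern is an S3-triggered Lambda that collects newly landed object keys for downstream loading (e.g. via Snowpipe or a COPY job). The handler name and event shape below follow the standard S3 event notification format; the downstream step is assumed, not specified by the role.

```python
import json


def handler(event, context):
    """Hypothetical S3-event Lambda: collect object keys for downstream loading.

    In a real pipeline these keys would be handed to Snowpipe or a COPY INTO
    job; here we simply return them so the shape of the step is visible.
    """
    keys = [
        rec["s3"]["object"]["key"]
        for rec in event.get("Records", [])
        if "s3" in rec
    ]
    return {"statusCode": 200, "body": json.dumps({"staged": keys})}
```

A candidate discussing this role would typically pair such a trigger with a Snowflake external stage and an auto-ingest Snowpipe, but those pieces are infrastructure configuration rather than code.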
Requirements
Core skills
• Proficient in SQL, PL/SQL, relational databases, and dimensional modeling
• Experience with Python or Unix scripting
• Solid experience with Snowflake, including familiarity with Streams, Tasks, and Snowpipe
• Experience with Kafka or other streaming data platforms
• Strong understanding of data warehousing concepts and the full SDLC
• Experience building scalable ETL/data pipelines using Informatica IICS, Alteryx, or similar tools
• Hands-on experience with AWS native technologies: Glue, Lambda, Kinesis, Lake Formation, S3, Redshift
• Experience with data cleansing, validation, and wrangling
• Strong verbal and written communication skills
• Comfortable working in ambiguous, fast-changing environments
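As a purely illustrative example (not from the posting) of the data cleansing and validation work listed above, a minimal row-level cleansing step might trim whitespace and drop records missing a required key. The function name and the `id` field are hypothetical choices for the sketch.

```python
def cleanse(rows):
    """Hypothetical cleansing step: trim string fields, drop rows lacking an 'id'."""
    cleaned = []
    for row in rows:
        # Normalize: strip surrounding whitespace from every string value.
        row = {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        # Validate: keep only rows with a non-empty 'id' after trimming.
        if row.get("id"):
            cleaned.append(row)
    return cleaned
```

In practice this kind of logic would usually live in an ETL tool (Informatica IICS, Alteryx) or a Snowflake transformation layer rather than hand-rolled Python, but the shape of the check is the same.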