

The Brixton Group
Data Engineer (Snowflake, Informatica, Teradata, Oracle)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 5+ years of experience in Snowflake and Informatica and 4+ years in Teradata, Oracle, and Python. The contract runs 6+ months, is 100% remote, and focuses on data pipeline design and ETL workflows.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 20, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#"ETL (Extract, Transform, Load)" #Teradata #Data Processing #Data Engineering #Data Integration #Data Pipeline #Scala #Databases #Automation #Informatica #Oracle #Snowflake #SQL (Structured Query Language) #Data Modeling #Data Extraction #Complex Queries #Python
Role description
Duration: 6+ months
Location: 100% REMOTE
Responsibilities:
• Design, build, and maintain robust, scalable, and efficient data pipelines
• Develop and manage data integrations using Informatica and other ETL tools
• Implement and optimize data models and transformations in Snowflake
• Work with enterprise databases such as Teradata and Oracle for data extraction, transformation, and loading
• Write efficient Python scripts for data processing, automation, and validation
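As a rough illustration of the last responsibility, here is a minimal Python sketch of a row-validation helper of the kind such scripts often contain. The function name, field rules, and sample data are illustrative assumptions, not details from this client's actual pipeline.

```python
# Hypothetical validation helper: splits incoming rows into valid and
# invalid sets based on required non-empty fields. Illustrative only.

def validate_rows(rows, required_fields):
    """Return (valid, invalid) lists; a row is valid when every
    required field is present and neither None nor empty string."""
    valid, invalid = [], []
    for row in rows:
        if all(row.get(f) not in (None, "") for f in required_fields):
            valid.append(row)
        else:
            invalid.append(row)
    return valid, invalid

if __name__ == "__main__":
    sample = [
        {"id": 1, "name": "alpha", "amount": 10.0},
        {"id": 2, "name": "", "amount": 5.5},        # missing name
        {"id": 3, "name": "gamma", "amount": None},  # missing amount
    ]
    ok, bad = validate_rows(sample, ["id", "name", "amount"])
    print(len(ok), len(bad))  # 1 2
```

In practice a script like this would sit between the data extraction step and the Snowflake load, routing invalid rows to a quarantine table for review.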
Required Skills & Qualifications
• 5+ years of experience with Snowflake data warehousing
• 5+ years of experience building ETL/ELT workflows using Informatica
• 4+ years of experience with Teradata and Oracle databases
• 4+ years of experience in Python for data engineering and automation
• Strong SQL skills for complex queries and performance tuning
• Experience with data modeling concepts (dimensional and relational models)
• Understanding of data warehousing and analytics architectures
26-00217






