

GTN Technical Staffing
Bilingual Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Bilingual Senior Data Engineer on a contract-to-hire basis, located in San Diego. Required skills include ETL/ELT, Snowflake, Informatica IICS, SQL, Python, and DevOps. Candidates should have 8+ years of relevant experience and be fluent in English and Spanish.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
February 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Diego, CA
-
🧠 - Skills detailed
#Datasets #GIT #Data Governance #Documentation #IICS (Informatica Intelligent Cloud Services) #Deployment #dbt (data build tool) #Programming #Data Lineage #Forecasting #SQL (Structured Query Language) #BI (Business Intelligence) #Azure #Cloud #Microsoft Power BI #Snowflake #Data Pipeline #Security #Azure DevOps #DevOps #Data Warehouse #Data Security #Observability #Automated Testing #Data Modeling #ETL (Extract, Transform, Load) #Automation #Monitoring #Python #Data Engineering #Compliance #Informatica #Leadership #DataOps
Role description
Job Title
Senior Data Engineer
Location
Hybrid, located in San Diego
Employment Type
Contract to Hire
Role Summary
As a Data Engineer, you will design, build, and operate reliable, governed, and cost-efficient data pipelines that connect core platforms and support analytics, operations, and digital decision-making at scale. This role blends hands-on engineering with ownership of how data moves across the enterprise, with a strong focus on automation, observability, and performance.
You will partner closely with analysts, product owners, automation engineers, and architects to turn complex business needs into well-designed, automated data products.
Key Responsibilities
Data Engineering & Pipelines
• Design, develop, and optimize ETL/ELT pipelines
• Build reusable and governed data models
• Translate business requirements into reliable data products
• Implement data pipeline monitoring and alerting
• Optimize performance and manage data platform costs
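To give a flavor of the monitoring and alerting work described above, here is a minimal sketch in Python. The thresholds and function names are hypothetical illustrations, not the team's actual tooling:

```python
from datetime import datetime, timedelta

# Hypothetical SLA thresholds; real values would come from each pipeline's contract.
MAX_STALENESS = timedelta(hours=24)
MIN_ROW_COUNT = 1

def check_pipeline_health(last_loaded_at: datetime, row_count: int,
                          now: datetime) -> list:
    """Return a list of alert messages; an empty list means the load looks healthy."""
    alerts = []
    if now - last_loaded_at > MAX_STALENESS:
        alerts.append(f"stale: last load at {last_loaded_at.isoformat()}")
    if row_count < MIN_ROW_COUNT:
        alerts.append(f"empty load: {row_count} rows")
    return alerts

# A load from 30 hours ago that produced zero rows triggers both alerts.
alerts = check_pipeline_health(
    last_loaded_at=datetime(2026, 2, 24, 0, 0),
    row_count=0,
    now=datetime(2026, 2, 25, 6, 0),
)
```

In practice these checks would run on a schedule and feed an alerting channel rather than return a list, but the shape of the logic is the same.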
Data Platforms & Tools
• Develop pipelines using Snowflake, Informatica IICS, and dbt
• Build analytics-ready datasets for forecasting, logistics, and member analytics
• Support BI and reporting use cases
Automation & DevOps
• Implement CI/CD pipelines for data workflows
• Automate testing and deployments
• Manage source control using Git
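Automated testing for data workflows often means data-quality assertions run in CI. A minimal sketch of two such checks, modeled loosely on dbt's generic `not_null` and `unique` tests (the function names and sample rows are illustrative):

```python
def assert_not_null(rows, column):
    """Fail if any row is missing a value for `column`."""
    missing = [i for i, r in enumerate(rows) if r.get(column) is None]
    if missing:
        raise AssertionError(f"{column} is null in rows {missing}")

def assert_unique(rows, column):
    """Fail if values of `column` repeat across rows."""
    seen = set()
    for r in rows:
        v = r[column]
        if v in seen:
            raise AssertionError(f"duplicate {column}: {v!r}")
        seen.add(v)

members = [
    {"member_id": 1, "region": "US"},
    {"member_id": 2, "region": "MX"},
]
assert_not_null(members, "member_id")
assert_unique(members, "member_id")
```

Wired into a CI/CD pipeline, failures like these block a deployment before bad data reaches downstream consumers.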
Governance & Security
• Apply data governance, security, and compliance standards
• Document data lineage, transformations, and operational processes
Collaboration & Leadership
• Partner with analysts, product owners, automation engineers, and architects
• Mentor junior engineers
• Contribute to data engineering standards and best practices
Required Skills
Data Engineering
• ETL / ELT
• Data Pipelines
• Data Warehousing
• Data Modeling (Kimball, Inmon)
• Data Lineage
• Data Observability
Cloud & Platforms
• Snowflake
• Informatica IICS (CDI, CAI)
• Azure Data Services
• Power BI
Programming & Querying
• SQL
• Python
DevOps & Automation
• Git
• CI/CD Pipelines
• Azure DevOps
• DataOps
• Automated Testing
Governance & FinOps
• Data Governance
• Data Security
• Compliance
• Cost Optimization
• FinOps
Ways of Working
• Cross-functional collaboration
• Technical documentation
• Mentorship
Qualifications
• 8+ years of experience as a Data Engineer, Software Developer, or Automation Engineer
• Strong hands-on experience with SQL and Python
• Experience building cloud-based data pipelines and data warehouses
• Production experience with Snowflake and Informatica IICS
• Experience with Git and CI/CD pipelines
• Strong understanding of data modeling concepts (Kimball or Inmon)
• Ability to translate business and operational needs into automated data solutions
• Comfortable working with cross-functional, multicultural teams
• Fluent in English and Spanish
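The Kimball modeling concept referenced above separates descriptive attributes (dimensions) from measurable events (facts). A toy sketch with hypothetical member and shipment data, showing the star-schema join that powers analytics queries:

```python
# Dimension: one row per member, carrying descriptive attributes.
dim_member = {
    101: {"name": "Ana", "language": "es"},
    102: {"name": "Bo", "language": "en"},
}

# Fact: one row per shipment event, keyed into the member dimension.
fact_shipments = [
    {"member_key": 101, "units": 3},
    {"member_key": 101, "units": 2},
    {"member_key": 102, "units": 5},
]

def units_by_language(facts, dim):
    """Aggregate a fact measure by a dimension attribute (a star-schema rollup)."""
    totals = {}
    for row in facts:
        lang = dim[row["member_key"]]["language"]
        totals[lang] = totals.get(lang, 0) + row["units"]
    return totals

totals = units_by_language(fact_shipments, dim_member)
```

In a warehouse like Snowflake the same rollup would be a SQL join and GROUP BY; the point of the model is that measures live in facts and attributes live in dimensions.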
Nice-to-Have Skills
• Experience with dbt
• Experience implementing monitoring and alerting for data platforms
• Experience with FinOps or cost governance for cloud data platforms
• Experience supporting analytics and BI teams
Cultural Attributes
• Self-directed and comfortable working with minimal supervision
• Calm under pressure and adaptable when constraints change
• Comfortable moving forward with imperfect information
• Brings positive energy and curiosity to the team
• Enjoys building things and solving messy problems