Data Engineer (Cloud)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Cloud) based in Mexico; the contract length and pay rate are unspecified. Key skills include AWS, Python, SQL, and experience with Snowflake. Spanish language proficiency is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 27, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Databases #Data Mart #Informatica #Cloud #Informatica PowerCenter #Unix #NoSQL #IICS (Informatica Intelligent Cloud Services) #Scripting #Airflow #Snowflake #AWS Lambda #SageMaker #AWS (Amazon Web Services) #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Data Engineering #Deployment #Athena #GitHub #Data Science #Python #ETL (Extract, Transform, Load) #Data Lake #Data Pipeline #Data Warehouse #Lambda (AWS Lambda) #Redshift
Role description
Job Summary: Data Engineer - Local to Mexico

• Source, screen, and payroll candidates in Mexico or Latin America; Spanish language required.
• Work in the Central Standard Time (CST) zone; roles open to candidates across Latin American countries.
• Collaborate with Data Science teams to design, develop, and implement data warehouse architecture and solutions.
• Develop, test, and maintain data pipelines, ETL processes, and feature engineering pipelines using AWS (Lambda, S3, Glue, Kinesis, Athena, etc.), Python, SQL, and Unix scripting.
• Work with databases such as Snowflake (preferred), Redshift, and DB2; demonstrate strong proficiency in SQL and dimensional modeling.
• Implement and support feature stores (e.g., SageMaker, Tecton) and work with NoSQL databases.
• Utilize CI/CD tools (GitHub, GitHub Actions, CodePipeline, CloudFormation) for automated deployment and code management.
• Troubleshoot and resolve complex ETL and SQL issues in Data Lake, Data Warehouse, and Data Mart environments.
• Analyze requirements, conduct research, and propose technical solutions to address business needs.
• Demonstrate strong analytical, problem-solving, and communication skills; work independently and collaboratively within teams.
• Maintain flexibility and adaptability for learning new technologies and methodologies.
• Nice-to-have: experience with Airflow, Informatica PowerCenter/IICS, and industry-standard ETL/ELT tools.
• Ensure deliverables are completed in a timely and accurate fashion; provide status reports as required.
• Uphold company values; demonstrate integrity, maturity, and a constructive approach to challenges.
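To give a flavor of the pipeline work described above, here is a minimal, self-contained Python sketch of a Lambda-style ETL handler with a small feature-engineering step. The record schema, field names, and derived feature are hypothetical illustrations, and the AWS trigger (S3/Kinesis) is replaced by a plain event dict so the sketch runs locally; it is not the employer's actual codebase.

```python
# Hypothetical ETL sketch: clean raw records and derive a simple feature,
# in the shape of an AWS Lambda handler. All names are illustrative.

def transform(records):
    """Drop incomplete rows and derive an example feature."""
    out = []
    for r in records:
        if r.get("amount") is None:  # skip rows missing the amount field
            continue
        amount = round(float(r["amount"]), 2)
        out.append({
            "order_id": r["order_id"],
            "amount_usd": amount,
            "is_large_order": amount >= 100.0,  # derived feature
        })
    return out


def lambda_handler(event, context):
    # In AWS this would be invoked by an S3 or Kinesis trigger; here the
    # event carries the records directly so the sketch stays runnable.
    return {"records": transform(event.get("records", []))}


if __name__ == "__main__":
    sample = [
        {"order_id": 1, "amount": "120.5"},
        {"order_id": 2, "amount": None},
        {"order_id": 3, "amount": 10},
    ]
    print(lambda_handler({"records": sample}, None))
```

In a real deployment, the cleaned records would be written back to S3 or loaded into Snowflake/Redshift, with the handler packaged and released through the CI/CD tooling the posting mentions (GitHub Actions, CodePipeline, CloudFormation).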