CCS Global Tech

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer, working remotely for 6 to 12 months, paying "pay rate". It requires strong experience in Snowflake, Matillion, AWS, and Python. Candidates must be a USC/GC with solid SQL expertise and an understanding of ELT best practices.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
January 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Python #SQL (Structured Query Language) #Cloud #Matillion #Version Control #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Data Modeling #Snowflake #Logging #Monitoring #Data Quality #Scala #Data Engineering #S3 (Amazon Simple Storage Service) #Security #ETL (Extract, Transform, Load) #EC2 #Storage #BI (Business Intelligence) #Automation #Data Pipeline #Data Processing #Databases
Role description
Position: Data Engineer
Location: Remote
Duration: 6 to 12 Months
Only USC/GC

Job Description
We are seeking a skilled Data Engineer with strong experience in Snowflake and Matillion, along with hands-on exposure to AWS and Python. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions to support business intelligence and data-driven decision-making.

Key Responsibilities
• Design, develop, and maintain ETL/ELT pipelines using Matillion and Snowflake
• Build and optimize data models in Snowflake for analytics and reporting
• Ingest data from multiple sources (databases, APIs, files, cloud services)
• Optimize Snowflake performance, cost, and storage using best practices
• Develop and maintain Python scripts for data processing, automation, and orchestration
• Work with AWS services such as S3, Lambda, Glue, EC2, and CloudWatch
• Ensure data quality, reliability, and security across pipelines
• Collaborate with analytics, BI, and business stakeholders to understand data requirements
• Implement monitoring, logging, and error-handling mechanisms
• Support CI/CD and version control for data pipelines

Required Skills & Qualifications:
• Experience with Snowflake (data modeling, performance tuning, SQL)
• Hands-on experience with Matillion ETL
• Solid SQL expertise
• Working knowledge of AWS
• Proficiency in Python for data engineering use cases
• Understanding of ELT best practices and cloud-based architectures