

CCS Global Tech
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer, working remotely for 6 to 12 months; the pay rate is not disclosed. It requires strong experience in Snowflake, Matillion, AWS, and Python. Candidates must be US Citizens or Green Card holders (USC/GC) with solid SQL expertise and an understanding of ELT best practices.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: January 17, 2026
Duration: More than 6 months
Location: Remote
Contract: Unknown
Security: Unknown
Location detailed: United States
Skills detailed: #Python #SQL (Structured Query Language) #Cloud #Matillion #Version Control #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Data Modeling #Snowflake #Logging #Monitoring #Data Quality #Scala #Data Engineering #S3 (Amazon Simple Storage Service) #Security #ETL (Extract, Transform, Load) #EC2 #Storage #BI (Business Intelligence) #Automation #Data Pipeline #Data Processing #Databases
Role description
Position: Data Engineer
Location: Remote
Duration: 6 to 12 Months
Open to US Citizens and Green Card holders (USC/GC) only
Job Description
We are seeking a skilled Data Engineer with strong experience in Snowflake and Matillion, along with hands-on exposure to AWS and Python. The ideal candidate will be responsible for designing, building, and maintaining scalable data pipelines and analytics solutions to support business intelligence and data-driven decision-making.
Key Responsibilities
• Design, develop, and maintain ETL/ELT pipelines using Matillion and Snowflake (a minimal Python sketch of this load-then-transform pattern follows this list)
• Build and optimize data models in Snowflake for analytics and reporting
• Ingest data from multiple sources (databases, APIs, files, cloud services)
• Optimize Snowflake performance, cost, and storage using best practices
• Develop and maintain Python scripts for data processing, automation, and orchestration
• Work with AWS services such as S3, Lambda, Glue, EC2, and CloudWatch
• Ensure data quality, reliability, and security across pipelines
• Collaborate with analytics, BI, and business stakeholders to understand data requirements
• Implement monitoring, logging, and error-handling mechanisms
• Support CI/CD and version control for data pipelines
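For illustration, here is a minimal sketch of the kind of ELT load step described above, written in Python with the snowflake-connector-python package. The warehouse, database, schema, stage, and table names are hypothetical placeholders, and the pattern shown (COPY INTO a raw table, then transform inside Snowflake) is one common ELT approach, not this team's actual pipeline.

import os

import snowflake.connector  # pip install snowflake-connector-python


def load_daily_events() -> None:
    # Connect using credentials from the environment; every object name
    # below (warehouse, database, schema, stage, tables) is hypothetical.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # ELT step 1: bulk-load JSON files already staged in S3 into a raw
        # table assumed to have a single VARIANT column named "payload".
        cur.execute("""
            COPY INTO raw_events
            FROM @events_stage/daily/
            FILE_FORMAT = (TYPE = 'JSON')
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        # ELT step 2: transform inside Snowflake, casting VARIANT fields
        # into typed columns for analytics and reporting.
        cur.execute("""
            INSERT INTO events_flat (event_id, event_ts, event_type)
            SELECT payload:id::NUMBER,
                   payload:ts::TIMESTAMP_NTZ,
                   payload:type::STRING
            FROM raw_events
        """)
    finally:
        conn.close()


if __name__ == "__main__":
    load_daily_events()

In a Matillion deployment, the same load-then-transform sequence would typically be built as orchestration and transformation jobs rather than hand-written Python; the script above only shows the underlying Snowflake operations.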
Required Skills & Qualifications:
• Experience with Snowflake (data modeling, performance tuning, SQL)
• Hands-on experience with Matillion ETL
• Solid SQL expertise
• Working knowledge of AWS (see the Lambda sketch after this list)
• Proficiency in Python for data engineering use cases
• Understanding of ELT best practices and cloud-based architectures
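As a small example of the AWS side of this stack, the sketch below shows a hypothetical Lambda handler that reacts to S3 object-created notifications and logs the new keys to CloudWatch Logs. The handler name and the downstream hand-off are assumptions; a real pipeline might pass each key to a Matillion job or a Snowpipe load, which is omitted here.

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event, context):
    # S3 put-notification events carry a "Records" list; each record
    # names the bucket and object key that just landed.
    records = event.get("Records", [])
    for record in records:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Lambda's default logger writes to CloudWatch Logs, giving the
        # monitoring trail the responsibilities above call for.
        logger.info("New object ready for ingestion: s3://%s/%s", bucket, key)
    return {"statusCode": 200, "body": json.dumps({"processed": len(records)})}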






