Charter Global

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer on a 5-month+ remote contract; the pay rate is TBD. It requires 10+ years of experience, domain expertise in DPTM and DSCS, proficiency in AWS, and strong skills in Python/Java and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Fixed Term
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #PySpark #Terraform #Data Catalog #Data Science #Logging #Java #SQL (Structured Query Language) #IAM (Identity and Access Management) #Dataiku #GIT #AWS (Amazon Web Services) #Data Lake #Python #ETL (Extract, Transform, Load) #Cloud #Databricks #SageMaker #Monitoring #Docker #Redshift #Data Access #Spark (Apache Spark) #Data Quality #Data Engineering #S3 (Amazon Simple Storage Service) #ML (Machine Learning) #Jenkins #Data Pipeline
Role description
Job Title: Sr. Data Engineer
Location: Remote/WFH
Duration: 5 months+ contract
Number of Positions: 2
Description
• Design, develop, and maintain data pipelines to extract and load data into data lakes and warehouses.
• Build and optimize data transformation rules and data models for analytical and operational use.
• Collaborate with Product Analysts, Data Scientists, and ML Engineers to make data accessible and actionable.
• Implement data quality checks, maintain data catalogs, and utilize orchestration, logging, and monitoring tools.
• Apply test-driven development methodologies to ELT/ETL pipeline construction.
Qualifications
• 10+ years of relevant experience in data engineering.
• Domain expertise in DPTM (Discovery Preclinical and Translational Medicine) and DSCS (Development Sciences and Clinical Supply).
• Proficiency with AWS services including S3, IAM, Redshift, SageMaker, Glue, Lambda, Step Functions, and CloudWatch.
• Hands-on experience with platforms such as Databricks and Dataiku.
• Strong coding skills in Python/Java and SQL (Redshift preferred), plus experience with Jenkins, CloudFormation, Terraform, Git, Docker, and Spark (2–3 years with PySpark).
Best Regards,
David Roy | Accounts Manager – US Staffing | Charter Global Inc. | https://www.charterglobal.com
LinkedIn