

Creative Global Consulting
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of 12+ months, offering $40.00 - $45.00 per hour. Candidates must be US citizens residing on the East Coast, with 10+ years of ETL and data warehousing experience, and strong SQL proficiency.
Country
United States
Currency
$ USD
Day rate
360
Date
March 18, 2026
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
Remote
Skills detailed
#Monitoring #Agile #Data Governance #Computer Science #Data Integrity #AWS (Amazon Web Services) #Data Quality #Redshift #Scripting #Quality Assurance #Scala #Storage #S3 (Amazon Simple Storage Service) #Oracle #Cloud #Data Integration #NoSQL #Leadership #RDS (Amazon Relational Database Service) #AWS Glue #PySpark #GitHub #Compliance #Python #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Databases #Spark (Apache Spark) #Lambda (AWS Lambda) #Git #DevOps #Data Manipulation #Data Engineering #Shell Scripting #Ansible #Documentation #PostgreSQL #Automation
Role description
Client: Securities and Exchange Commission
POP: 12+ months
Location: Remote, candidate must reside on the East Coast
US Citizen
SCOPE
The Data Engineer will be pivotal in designing, developing, and maintaining robust ETL (Extract, Transform, Load) processes to ensure seamless data flow between our diverse data sources and target data stores. You will be responsible for building and optimizing automated pipelines, ensuring data quality, and accommodating future data format changes. This position requires a strong technical foundation and a proactive approach to problem-solving.
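For illustration only, here is a minimal sketch of the kind of AWS Glue PySpark job this scope describes: extract from a Glue Data Catalog table, apply schema mappings, and load partitioned Parquet to S3. All database, table, column, and bucket names below are hypothetical assumptions, not details from this posting.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract: read a source table registered in the Glue Data Catalog
# (database and table names are hypothetical)
source = glue_context.create_dynamic_frame.from_catalog(
    database="example_db",
    table_name="example_filings",
)

# Transform: rename and cast fields to the target schema
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("filing_id", "string", "filing_id", "string"),
        ("filed_at", "string", "filed_at", "timestamp"),
        ("amount", "double", "amount", "double"),
        ("filing_year", "int", "filing_year", "int"),
    ],
)

# Load: write partitioned Parquet to a hypothetical S3 prefix,
# queryable from Redshift Spectrum or EMR
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={
        "path": "s3://example-bucket/curated/filings/",
        "partitionKeys": ["filing_year"],
    },
    format="parquet",
)

job.commit()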
REQUIRED SKILLS
Bachelor's degree in Computer Science, Information Systems, or a related field.
10+ years of experience in data integration, ETL development, and data warehousing.
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, PostgreSQL) and NoSQL databases.
Experience with scripting languages such as Python or shell for automation and data manipulation (an illustrative sketch follows this list).
Experience with cloud technologies, including AWS Glue, Lambda, CloudFormation/Ansible, S3, Redshift, and EMR.
Experience with Git, GitHub, and CI/CD pipelines for DevOps and data engineering.
ETL development with AWS Glue, Python, PySpark, and RDS.
Solid understanding of data governance principles and data quality best practices.
Ability to work independently and as part of a collaborative team in an Agile environment.
Excellent problem-solving, analytical, and communication skills.
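As referenced in the scripting item above, the following is a minimal sketch of the kind of Python automation this role calls for: start a Glue job run with boto3, poll for a terminal state, and publish a CloudWatch metric that an alarm can page on. The Glue job name and metric namespace are hypothetical assumptions.

import time

import boto3

glue = boto3.client("glue")
cloudwatch = boto3.client("cloudwatch")

JOB_NAME = "example-filings-etl"  # hypothetical Glue job name

# Kick off the ETL job and capture its run id
run_id = glue.start_job_run(JobName=JOB_NAME)["JobRunId"]

# Poll until the run reaches a terminal state
while True:
    state = glue.get_job_run(JobName=JOB_NAME, RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
        break
    time.sleep(30)

# Emit a custom metric so a CloudWatch alarm can alert on failures
cloudwatch.put_metric_data(
    Namespace="Example/ETL",  # hypothetical namespace
    MetricData=[{
        "MetricName": "JobFailed",
        "Value": 0.0 if state == "SUCCEEDED" else 1.0,
        "Unit": "Count",
    }],
)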
TASKS
Pipeline Design & Development: Design, develop, and implement scalable and efficient ETL pipelines using modern data integration tools and technologies.
Data Transformation: Transform and cleanse data from various sources (databases, APIs, cloud storage, etc.) to ensure accuracy, consistency, and compliance with data governance policies.
Data Store Management: Develop and maintain optimized data models and data warehousing solutions utilizing platforms like Oracle, PostgreSQL, Redshift, and EMR. Focus on performance tuning and query optimization.
Automation & Monitoring: Build and maintain automated ETL jobs, incorporating robust monitoring and alerting mechanisms for proactive issue detection and resolution.
Data Quality Assurance: Implement data quality checks and validation rules throughout the ETL process to guarantee data integrity (an illustrative sketch follows this list).
Documentation: Create and maintain comprehensive documentation for ETL processes, data models, and system configurations.
Collaboration: Work closely with business stakeholders and other teams to understand data requirements and deliver effective solutions.
Future-Proofing: Proactively assess and adapt data integration processes to accommodate evolving data formats, sources, and business needs, ensuring designs can absorb future data changes.
All other duties as assigned by leadership.
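For the Data Quality Assurance task above, here is a minimal sketch of validation rules in PySpark that fail the run loudly so an orchestrator can alert and halt downstream loads. The dataset path, column names, and rules are hypothetical assumptions, not requirements from this posting.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

# Hypothetical curated dataset produced by the ETL pipeline
df = spark.read.parquet("s3://example-bucket/curated/filings/")

# Rule 1: primary key must be present and unique
null_keys = df.filter(F.col("filing_id").isNull()).count()
dupe_keys = df.groupBy("filing_id").count().filter(F.col("count") > 1).count()

# Rule 2: amounts must be non-negative
bad_amounts = df.filter(F.col("amount") < 0).count()

failures = {
    "null_keys": null_keys,
    "duplicate_keys": dupe_keys,
    "negative_amounts": bad_amounts,
}
failed = {name: n for name, n in failures.items() if n > 0}

if failed:
    # Fail loudly so alerting fires and downstream loads halt
    raise ValueError(f"Data quality checks failed: {failed}")
print("All data quality checks passed.")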
Pay: $40.00 - $45.00 per hour
Work Location: Remote





