

Revel IT
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position for a fully remote contract offering competitive pay; candidates need 3–5 years of data engineering or data integration experience. Key skills include JavaScript, SQL, Python, and experience with ETL/ELT pipelines. Familiarity with cloud platforms like AWS, Azure, or GCP is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
March 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Columbus, OH
-
🧠 - Skills detailed
#Data Quality #Datasets #Data Migration #AWS (Amazon Web Services) #Cloud #API (Application Programming Interface) #JavaScript #GCP (Google Cloud Platform) #Python #SQL (Structured Query Language) #Migration #Data Integration #Scala #Data Lake #Data Pipeline #Data Architecture #SaaS (Software as a Service) #Computer Science #Azure #Data Engineering #ETL (Extract, Transform, Load) #Documentation #Code Reviews
Role description
Location: Fully Remote – EST
Role Overview:
We are seeking Data Engineers to help build, maintain, and optimize cloud-based data solutions supporting SaaS platforms. This role focuses on developing data migration scripts, building ETL/ELT pipelines, and writing logic to manipulate and move data across systems.
The ideal candidate will be highly proficient in JavaScript, SQL, and Python, and comfortable working with large datasets, transforming data, and ensuring data is efficiently migrated and integrated across cloud environments.
Key Responsibilities:
• Develop and maintain data migration scripts to move and transform data across systems and platforms.
• Build and optimize ETL/ELT pipelines to support data integration, transformation, and analytics initiatives.
• Write efficient JavaScript, SQL, and Python logic to manipulate, transform, and move data between systems.
• Collaborate with data architects and engineering teams to implement scalable and reliable data solutions.
• Support data transformation and migration initiatives for cloud-based SaaS platforms.
• Monitor and troubleshoot data pipelines and integration processes to ensure reliability and performance.
• Ensure data quality, integrity, and consistency across systems.
• Contribute to documentation, code reviews, and data engineering best practices.
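The core of the responsibilities above is writing extract-transform-load logic that moves data between systems. A minimal sketch of that pattern in Python (one of the posting's required languages) might look like the following; the record fields, table name, and in-memory SQLite target are illustrative assumptions, not details from the posting:

```python
# Minimal ETL sketch: extract source records, transform them to the
# target schema, and load them into a database. All data and names
# (the "users" table, the sample records) are hypothetical.
import sqlite3


def extract():
    # In a real pipeline this would read from an API, file, or source DB.
    return [
        {"id": 1, "name": " Alice ", "signup": "2024-01-15"},
        {"id": 2, "name": "Bob", "signup": "2024-02-03"},
    ]


def transform(records):
    # Normalize whitespace and keep only the fields the target needs.
    return [(r["id"], r["name"].strip(), r["signup"]) for r in records]


def load(rows, conn):
    # Idempotent load: re-running the migration replaces existing rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users "
        "(id INTEGER PRIMARY KEY, name TEXT, signup TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO users VALUES (?, ?, ?)", rows)
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract()), conn)
    print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 2
```

In practice the same shape scales up: the extract step pulls from a SaaS API or source database, the transform step runs in SQL or Python against large datasets, and the load step targets a cloud warehouse on AWS, Azure, or GCP.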
Qualifications:
• Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or a related field (or equivalent experience).
• 3–5 years of experience in data engineering or data integration roles.
• Strong proficiency in JavaScript, SQL, and Python.
• Experience developing ETL/ELT pipelines and data migration scripts.
• Experience manipulating and transforming large datasets across systems.
• Familiarity with cloud-based data platforms (AWS, Azure, or GCP).
• Strong problem-solving and analytical skills.
Preferred:
• Experience working in SaaS environments or cloud-native applications.
• Familiarity with data warehousing, data lakes, or modern analytics platforms.
• Experience working with API-based data integrations or event-driven architectures.