

Idexcel
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with 10+ years of experience, focused on GCP to AWS migration. It offers a long-term contract, competitive pay, and requires strong skills in cloud platforms, ETL tools, and programming languages like Python or Java.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 25, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, United States
-
🧠 - Skills detailed
#Data Storage #Scala #Security #Talend #Apache Spark #Oracle #Programming #Storage #Kafka (Apache Kafka) #Data Integration #Migration #Cloud #Java #Data Pipeline #AWS Migration #Apache Kafka #PostgreSQL #Data Quality #Spark (Apache Spark) #Data Engineering #Computer Science #MySQL #SQL (Structured Query Language) #Web Services #Database Design #ETL (Extract, Transform, Load) #Azure #GCP (Google Cloud Platform) #Data Modeling #AWS (Amazon Web Services) #Python #Data Governance
Role description
Job Title: Senior Data Engineer (10+ years)
Location: New York City, NY/Remote
Duration: Long term
Responsibilities:
Primary:
We are seeking skilled Cloud Engineers experienced in both Google Cloud Platform (GCP) and Amazon Web Services (AWS) for a GCP to AWS migration project.
The ideal candidate will have substantial hands-on cloud experience, a deep understanding of cloud architecture, and proficiency in complex cloud migrations.
Collaborate with stakeholders to understand data requirements and translate them into technical solutions
Design, develop, and maintain data pipelines and ETL processes
Implement data integration and transformation solutions to ensure data quality and consistency
Optimize data storage and retrieval for performance and scalability
Monitor and troubleshoot data pipelines and systems to ensure data availability and reliability
Implement and maintain data governance and security measures
Stay up-to-date with emerging data engineering technologies and best practices
Act with integrity, professionalism, and personal responsibility to uphold the firm’s respectful and courteous work environment.
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field
Proven experience as a Data Engineer or in a similar role
Strong programming skills in languages such as Python, Java, or Scala
Experience with data integration and ETL tools, such as Apache Spark, Apache Kafka, or Talend
Proficiency in SQL and database technologies (e.g., PostgreSQL, MySQL, or Oracle)
Familiarity with cloud platforms and services (e.g., AWS, Azure, or GCP)
Knowledge of data modeling and database design principles
Strong analytical and problem-solving skills
Effective communication and collaboration skills