

GxP Associates
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position requiring 7–9 years of experience, a Bachelor's degree in Computer Science or related field, strong SQL and Python/Scala skills, and expertise in ETL tools and cloud platforms. Contract length and pay rate are unspecified.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Kafka (Apache Kafka) #Data Quality #GCP (Google Cloud Platform) #Data Lake #Scala #Data Science #Big Data #BigQuery #Data Modeling #Python #Data Processing #AWS (Amazon Web Services) #Compliance #Data Pipeline #Cloud #Data Architecture #Data Security #Hadoop #Snowflake #ETL (Extract, Transform, Load) #Data Engineering #Computer Science #Spark (Apache Spark) #Redshift #SQL (Structured Query Language) #Azure #Data Warehouse #Data Governance #Airflow #Security #Programming #Storage
Role description
Job Summary:
We are seeking an experienced Data Engineer with 7–9 years of experience to design, build, and maintain scalable data pipelines and data infrastructure. The ideal candidate will work closely with data scientists, analysts, and business teams to ensure reliable data availability and support data-driven decision-making.
Key Responsibilities:
• Design, develop, and optimize scalable ETL/ELT data pipelines
• Build and maintain data architectures using cloud platforms and distributed systems
• Develop and manage data warehouses and data lakes
• Ensure data quality, integrity, and governance across systems
• Optimize performance of data processing and storage solutions
• Collaborate with cross-functional teams including Data Science, Analytics, and Engineering
• Implement data security and compliance best practices
• Troubleshoot and resolve data-related issues in production environments
Required Qualifications:
• Bachelor’s degree in Computer Science, Engineering, or related field
• 7–9 years of experience in data engineering or related roles
• Strong proficiency in SQL and programming languages such as Python or Scala
• Hands-on experience with ETL tools and frameworks (e.g., Spark, Airflow)
• Experience with cloud platforms such as AWS, Azure, or GCP
• Knowledge of data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
• Familiarity with big data technologies (Hadoop, Kafka)
• Understanding of data modeling, data governance, and data security
• Strong problem-solving and communication skills
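To illustrate the extract-transform-load pattern the responsibilities above center on, here is a minimal sketch in plain Python using the standard-library sqlite3 module. The table names, columns, and transformation are hypothetical, chosen only for the example; production pipelines would typically use the tools named in the posting (Spark, Airflow, a cloud warehouse) rather than SQLite.

```python
import sqlite3

# Extract: read raw order rows from a source table (in-memory DB for the demo).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER, status TEXT)")
src.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "paid"), (2, 800, "refunded"), (3, 4300, "paid")],
)

# Transform: keep only paid orders and convert cents to dollars.
rows = src.execute(
    "SELECT id, amount_cents / 100.0 FROM raw_orders WHERE status = 'paid'"
).fetchall()

# Load: write the cleaned rows into a destination (warehouse-style) table.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE orders_clean (id INTEGER, amount_usd REAL)")
dst.executemany("INSERT INTO orders_clean VALUES (?, ?)", rows)

print(dst.execute("SELECT COUNT(*), SUM(amount_usd) FROM orders_clean").fetchone())
# (2, 55.5)
```

The same three stages scale up directly: the extract step becomes a Spark or Kafka source, the transform becomes a distributed DataFrame job, and the load targets Snowflake, Redshift, or BigQuery, with Airflow orchestrating the schedule.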





