Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 12+ years of experience, including 8+ years on GCP. It offers a competitive pay rate, requires strong SQL and Python skills, and prefers GCP Professional Data Engineer certification. Remote work is available.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 23, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Houston, TX
🧠 - Skills detailed
#Data Engineering #Data Modeling #Airflow #Data Processing #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Java #BigQuery #Infrastructure as Code (IaC) #Python #Schema Design #Data Analysis #Terraform #Agile #Dataflow #Monitoring #Scala #Deployment #Apache Beam #Data Integrity #Version Control #Data Quality #Spark (Apache Spark) #SQL (Structured Query Language) #DevOps #Apache Spark #Data Pipeline #Computer Science #Storage #Compliance #GIT #Security #Data Security #Cloud #GDPR (General Data Protection Regulation)
Role description

Job Summary:

We are seeking a skilled and experienced GCP Data Engineer to join our data engineering team. The ideal candidate will be responsible for designing, building, and managing scalable data pipelines on Google Cloud Platform (GCP), ensuring efficient and secure data flow across the organization.

Key Responsibilities:

   • Design, build, and maintain robust and scalable ETL/ELT data pipelines on GCP.

   • Develop and optimize data processing workflows using services such as BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Functions (see the illustrative sketch after this list).

   • Work closely with data analysts, scientists, and stakeholders to understand data needs and deliver reliable data solutions.

   • Implement data quality checks, monitoring, and alerting mechanisms.

   • Automate data pipeline deployments using Infrastructure as Code (e.g., Terraform, Deployment Manager).

   • Ensure security, compliance, and governance of data pipelines and storage.

   • Troubleshoot and resolve performance, reliability, and data integrity issues.
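
As a minimal sketch of the kind of streaming pipeline described above (not part of the posting itself), the snippet below uses Apache Beam in Python to read JSON events from a Pub/Sub topic and append them to a BigQuery table; the project, topic, table, and schema names are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal streaming pipeline of the type this role builds,
# reading JSON events from Pub/Sub and writing them to BigQuery (runnable on Dataflow).
# The project, topic, table, and schema values below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Runner, project, and region are normally supplied as command-line flags;
    # streaming=True marks this as an unbounded (Pub/Sub-driven) pipeline.
    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,event_time:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```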

Required Skills and Qualifications:

   • Bachelor's degree in Computer Science, Engineering, or a related field.

   • 12+ years of experience in data engineering, with at least 8 years on GCP.

   • Strong expertise in SQL and Python (or Java/Scala).

   • Hands-on experience with BigQuery, Dataflow/Apache Beam, Cloud Storage, and Pub/Sub.

   • Familiarity with data modeling, schema design, and data warehousing best practices.

   • Knowledge of DevOps tools and CI/CD pipelines.

   • Experience with version control systems like Git.

   • Strong problem-solving skills and ability to work in an agile environment.

Preferred Qualifications:

   • GCP Professional Data Engineer certification.

   • Experience with real-time data processing and streaming pipelines.

   • Familiarity with Apache Spark (via Dataproc) and Airflow.

   • Exposure to data security and compliance standards (e.g., GDPR, HIPAA).