

GCP Data Engineer
Job Summary:
We are seeking a skilled and experienced GCP Data Engineer to join our data engineering team. The ideal candidate will be responsible for designing, building, and managing scalable data pipelines on Google Cloud Platform (GCP), ensuring efficient and secure data flow across the organization.
Key Responsibilities:
• Design, build, and maintain robust and scalable ETL/ELT data pipelines on GCP.
• Develop and optimize data processing workflows using services like BigQuery, Dataflow, Dataproc, Pub/Sub, and Cloud Functions (see the pipeline sketch after this list).
• Work closely with data analysts, data scientists, and business stakeholders to understand data needs and deliver reliable data solutions.
• Implement data quality checks, monitoring, and alerting mechanisms.
• Automate data pipeline deployments using Infrastructure as Code (e.g., Terraform, Deployment Manager).
• Ensure security, compliance, and governance of data pipelines and storage.
• Troubleshoot and resolve performance, reliability, and data integrity issues.
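As an illustration of the pipeline work described above, below is a minimal sketch of a streaming Dataflow pipeline written with the Apache Beam Python SDK: it reads JSON events from Pub/Sub and appends them to BigQuery. The project, topic, table, and event fields are hypothetical placeholders, not a description of this role's actual stack.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a JSON Pub/Sub payload into a flat row for BigQuery.
    event = json.loads(message.decode("utf-8"))
    return {"user_id": event["user_id"], "event_type": event["event_type"]}


def run() -> None:
    # Pass --runner=DataflowRunner, --project, --region, and --temp_location on the
    # command line to execute on Dataflow; streaming=True keeps the Pub/Sub read unbounded.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")  # hypothetical topic
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",              # hypothetical table
                schema="user_id:STRING,event_type:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()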
Required Skills and Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 12+ years of experience in data engineering, including at least 8 years on GCP.
• Strong expertise in SQL and Python (or Java/Scala).
• Hands-on experience with BigQuery, Dataflow/Apache Beam, Cloud Storage, and Pub/Sub.
• Familiarity with data modeling, schema design, and data warehousing best practices.
• Knowledge of DevOps tools and CI/CD pipelines.
• Experience with version control systems like Git.
• Strong problem-solving skills and ability to work in an agile environment.
Preferred Qualifications:
• GCP Professional Data Engineer certification.
• Experience with real-time data processing and streaming pipelines.
• Familiarity with Apache Spark (via Dataproc) and Airflow (see the orchestration sketch after this list).
• Exposure to data security and compliance standards (e.g., GDPR, HIPAA).
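The orchestration familiarity noted above can be illustrated with a minimal Airflow DAG sketch, such as might run on Cloud Composer, that schedules a daily BigQuery rollup. The DAG id, dataset, table, and query are hypothetical placeholders; the operator assumes the apache-airflow-providers-google package and Airflow 2.4+ (older versions use schedule_interval instead of schedule).

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",   # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Aggregate raw events into a reporting table; project/dataset/table names are placeholders.
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT event_type, COUNT(*) AS event_count "
                    "FROM `my-project.analytics.events` "
                    "GROUP BY event_type"
                ),
                "useLegacySql": False,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "daily_event_counts",
                },
                "writeDisposition": "WRITE_TRUNCATE",
            }
        },
    )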