

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 7+ years of experience, focusing on Python, GCP, and BigQuery. It offers a hybrid work location in Cincinnati, OH, with a competitive pay rate. Key skills include ETL/ELT development and data pipeline orchestration.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 13, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Cincinnati, OH
Skills detailed:
#SQL Queries #Agile #Data Engineering #BigQuery #Airflow #Data Analysis #Data Processing #Scala #Tableau #BI (Business Intelligence) #Data Governance #Looker #Security #Version Control #Data Quality #Git #Compliance #Python #Dataflow #Monitoring #Data Pipeline #Infrastructure as Code (IaC) #SQL (Structured Query Language) #Data Science #ETL (Extract, Transform, Load) #Datasets #Storage #Data Security #Terraform #Scripting #GCP (Google Cloud Platform) #Cloud
Role description
Greetings from IT Engagements!
Role: Data Engineer (Python, GCP, BigQuery) (no H-1B)
Location: Cincinnati, OH (Hybrid)
We are seeking a skilled Data Engineer with hands-on experience in Python, Google Cloud Platform (GCP), and BigQuery to join our team. The ideal candidate will be responsible for building and optimizing scalable data pipelines, supporting data warehousing initiatives, and collaborating with cross-functional teams to enable data-driven decision-making.
Responsibilities
Design, develop, and maintain robust and scalable ETL/ELT data pipelines using Python and GCP services (a minimal sketch follows this list).
Work extensively with BigQuery for large-scale data processing, transformation, and analytics.
Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and deliver timely solutions.
Optimize data flows and queries for performance and cost efficiency.
Implement data quality checks, validation rules, and monitoring processes.
Work with GCP services like Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and Composer as needed.
Ensure compliance with data governance and security policies.
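As a rough illustration of the pipeline and data quality responsibilities above, the sketch below loads a CSV file from Cloud Storage into BigQuery with the google-cloud-bigquery client and runs a minimal row-count check. The project, bucket, and table names are hypothetical placeholders, not details from this posting.

```python
from google.cloud import bigquery

# Hypothetical identifiers -- substitute real project/bucket/table names.
PROJECT_ID = "my-project"
SOURCE_URI = "gs://my-bucket/raw/orders.csv"
TABLE_ID = "my-project.analytics.orders"

client = bigquery.Client(project=PROJECT_ID)

# Configure a CSV load job; schema autodetection keeps the sketch short.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(SOURCE_URI, TABLE_ID, job_config=job_config)
load_job.result()  # block until the load completes

# Minimal data quality check: fail loudly if the load produced no rows.
table = client.get_table(TABLE_ID)
if table.num_rows == 0:
    raise ValueError(f"Data quality check failed: {TABLE_ID} is empty")
print(f"Loaded {table.num_rows} rows into {TABLE_ID}")
```

In a production pipeline the validation step would typically cover schema, null rates, and freshness rather than a single row count.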
Required Skills
7+ years of experience as a Data Engineer or similar role.
Strong proficiency in Python for data processing and scripting.
Hands-on experience with Google Cloud Platform (GCP) services.
Deep knowledge of BigQuery: writing complex SQL queries, managing datasets, optimizing performance.
Experience with data pipeline orchestration tools (e.g., Airflow, Cloud Composer); see the DAG sketch after this list.
Familiarity with version control (e.g., Git) and CI/CD processes.
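For the orchestration requirement, a minimal Airflow DAG might wrap a BigQuery transformation in a single task, as sketched below. The DAG ID, schedule, and table names are assumptions for illustration, and exact operator and parameter names vary by Airflow and Google provider version.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# Hypothetical pipeline: aggregate raw orders into a reporting table daily.
with DAG(
    dag_id="daily_orders_elt",
    schedule="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    transform_orders = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": """
                    SELECT order_id, SUM(amount) AS total_amount
                    FROM `my-project.raw.orders`  -- hypothetical table
                    GROUP BY order_id
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "order_totals",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```

On Cloud Composer, the same DAG file is simply dropped into the environment's DAGs bucket and picked up by the scheduler.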
Preferred Skills
Experience with Terraform or Infrastructure as Code (IaC) on GCP (a rough Python illustration follows this list).
Knowledge of Data Governance and Data Security best practices.
Exposure to Looker, Tableau, or other BI tools.
Experience working in Agile environments.
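Terraform itself is written in HCL rather than Python, so as a stand-in in this document's language, the sketch below provisions the same kind of BigQuery resources imperatively with the client library; Terraform's google_bigquery_dataset and google_bigquery_table resources express this declaratively instead. All names here are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Dataset provisioning -- what Terraform's google_bigquery_dataset declares.
dataset = bigquery.Dataset("my-project.analytics")
dataset.location = "US"
client.create_dataset(dataset, exists_ok=True)  # idempotent, like `terraform apply`

# Table provisioning with an explicit schema.
schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("total_amount", "NUMERIC"),
]
table = bigquery.Table("my-project.analytics.order_totals", schema=schema)
client.create_table(table, exists_ok=True)
```

The advantage of the Terraform route is that the desired state lives in version-controlled config, which ties directly into the Git and CI/CD expectations listed above.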
Soft Skills
Strong problem-solving abilities and attention to detail.
Excellent communication and collaboration skills.
Ability to work independently in a fast-paced environment.
Thank you
vinaya@itengagements.com