

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 12-month contract, remote (EST hours). It requires expertise in Python, SQL, and GCP, along with mobile app data engineering experience. Key skills include BigQuery, dbt, Airflow, and Looker.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 30, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #SQL (Structured Query Language) #Cloud #Data Science #Looker #dbt (data build tool) #Data Access #Fivetran #Automation #BigQuery #Data Integration #Data Engineering #GCP (Google Cloud Platform) #Computer Science #Statistics #Data Lake #Python #Mathematics #Airflow #Data Ingestion
Role description
Senior Data Engineer - GCP
This role is with a DeWinter Digital Media Partner
Remote role - must work EST hours, as the client is based in NYC
12-month+ contract (or contract-to-hire, if desired)
We're seeking a highly skilled Data Engineer with a strong background in Python and SQL within GCP environments to join our Data Engineering team. This role will leverage data from our widely used mobile applications, so prior data engineering experience with mobile apps is required. In this role, you'll be instrumental in building robust data integration pipelines that feed into our data lakes and warehouses. Your expertise will be crucial in ensuring the quality and integrity of our data stores.
We're looking for a collaborative individual who can also work autonomously. You'll partner closely with key stakeholders to understand and implement business requirements, ensuring that data deliverables are consistently met.
What you'll do:
• Construct highly consumable and cost-efficient data products, synthesizing information from diverse source systems.
• Prepare raw data and enrich it with business value, providing consistent dimensions and metrics for downstream workflows.
• Ingest raw data using Fivetran and Python, making it accessible in BigQuery for company-wide use (see the sketch after this list).
• Design and maintain reliable, consistent data creation workflows, repairing and enhancing them as needed to adapt to changing data or manage costs.
• Develop Looker Views and Models to democratize data access.
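To make these responsibilities concrete, here is a minimal sketch of the kind of pipeline the role involves: an Airflow DAG that lands raw mobile-app events in BigQuery, then runs dbt to build downstream models. All names here (the project and table IDs, the dbt selector, and the fetch_events stand-in for a Fivetran-style extractor) are hypothetical illustrations, not the client's actual setup.

```python
"""Illustrative pipeline sketch (Airflow 2.x): ingest mobile-app events
into BigQuery, then run dbt. All identifiers are hypothetical."""
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator
from google.cloud import bigquery

RAW_TABLE = "my-project.raw.mobile_events"  # hypothetical landing table


def fetch_events() -> list[dict]:
    """Stand-in for a real extractor (Fivetran connector, vendor API, etc.)."""
    return [{"user_id": "u1", "event": "app_open", "ts": "2025-07-30T12:00:00Z"}]


def load_to_bigquery() -> None:
    """Append raw JSON rows to the landing table in BigQuery."""
    client = bigquery.Client()
    job = client.load_table_from_json(
        fetch_events(),
        RAW_TABLE,
        job_config=bigquery.LoadJobConfig(
            write_disposition="WRITE_APPEND",
            autodetect=True,
        ),
    )
    job.result()  # block until the load job finishes; raises on failure


with DAG(
    dag_id="mobile_events_pipeline",
    schedule="@hourly",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    ingest = PythonOperator(
        task_id="ingest_raw_events",
        python_callable=load_to_bigquery,
    )
    # dbt is where raw events get the "consistent dimensions and metrics"
    transform = BashOperator(
        task_id="run_dbt",
        bash_command="dbt run --select mobile",
    )

    ingest >> transform
```

In practice, Fivetran would handle most managed-source ingestion, with custom Python like the above covering sources it doesn't support; Looker Views and Models would then sit on top of the dbt-built tables.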
What you'll bring:
• Expert-level SQL skills.
• Proven experience building and managing data products in modern cloud environments (GCP preferred).
• Strong proficiency in Python for data ingestion and workflow automation.
• Hands-on experience with BigQuery, dbt, Airflow, and Looker.
• Excellent communication skills and a demonstrated ability to collaborate effectively across technical and non-technical teams.
• A Bachelor's degree in a quantitative field (e.g., computer science, statistics, mathematics, engineering, data science) or equivalent practical experience.