

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position for a 12-month contract, offering competitive pay. Requires 10+ years of data engineering experience, strong SQL skills, and expertise in ETL/ELT pipelines. Familiarity with cloud environments and data governance is essential.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 8, 2025
Project duration: More than 6 months
Location type: Unknown
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: New York City Metropolitan Area
Skills detailed: #Data Quality #BI (Business Intelligence) #Datasets #BigQuery #dbt (data build tool) #ETL (Extract, Transform, Load) #GIT #Security #Scala #Data Security #Data Modeling #Data Access #Cloud #SQL (Structured Query Language) #Data Pipeline #GCP (Google Cloud Platform) #Looker #Agile #Azure #Strategy #Data Analysis #Data Engineering #Data Integrity #Microsoft Power BI #Redshift #Visualization #Tableau #Version Control #Snowflake #AWS (Amazon Web Services) #Data Governance
Role description
12-month contract
What you'll do:
• Design and develop high-performance, scalable, and reliable data models for our attribution and measurement platforms.
• Build and maintain robust ETL/ELT pipelines to ingest, transform, and load large datasets from various sources.
• Collaborate with data engineers and analysts to define semantic layers and ensure consistency across data sources.
• Manage the end-to-end pixel tracking solution, ensuring high availability and low-latency data capture for critical measurement needs.
• Implement and promote best practices for data governance, data quality, and data security.
• Enable self-service data access and analysis for stakeholders through well-designed data platforms and tools.
• Monitor data pipeline performance, troubleshoot issues, and optimize for efficiency and cost.
• Contribute to the overall architecture and strategy of our data platform.
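To give candidates a flavor of the pipeline work above, here is a minimal, purely illustrative extract-transform-load sketch in Python. It is not part of the role description; the source, field names, and in-memory "warehouse" are hypothetical stand-ins for real ingestion endpoints and warehouse tables.

```python
# Illustrative only: a minimal ETL step with a simple data-quality gate.
# The source records and warehouse structure are hypothetical.

def extract(source):
    """Pull raw records from a source (here, just a list of dicts)."""
    return list(source)

def transform(records):
    """Normalize fields and drop rows that fail a basic quality check."""
    cleaned = []
    for row in records:
        if row.get("revenue") is None:  # data-quality gate: reject null revenue
            continue
        cleaned.append({
            "campaign": row["campaign"].strip().lower(),
            "revenue": float(row["revenue"]),
        })
    return cleaned

def load(records, warehouse):
    """Append transformed rows to the target table (a dict of lists here)."""
    warehouse.setdefault("attribution", []).extend(records)
    return len(records)

source = [
    {"campaign": " Spring_Sale ", "revenue": "120.5"},
    {"campaign": "Launch", "revenue": None},  # rejected by the quality gate
]
warehouse = {}
loaded = load(transform(extract(source)), warehouse)
```

In a production setting these stages would typically be orchestrated and materialized in a warehouse rather than in memory, but the shape (ingest, validate/normalize, load) is the same.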
Who you are:
• 10+ years of hands-on data engineering experience, with a strong track record of designing, building, and optimizing data platforms in high-volume or ad tech environments.
• A strong collaborator who thrives in a cross-functional setting, effectively communicating technical concepts to diverse audiences.
• Strong SQL skills, including complex joins, aggregations, and performance tuning.
• Experience working with semantic layers and data modeling for analytics.
• Solid understanding of data analysis and visualization best practices.
• Passionate about data quality and governance, with an eye for detail and a commitment to maintaining data integrity.
• Experience using version control systems, preferably Git.
• Excellent communication skills and the ability to work cross-functionally.
• Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift).
• Experience working in cloud environments such as AWS, GCP, or Azure (nice to have).
• Experience migrating from legacy BI tools (e.g., Tableau or Power BI) to Looker.
• Experience working in agile data teams and managing BI projects.
• Familiarity with dbt or other data transformation frameworks.
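As a small, illustrative example of the kind of join-plus-aggregation SQL the requirements refer to, the snippet below runs a query against an in-memory SQLite database via Python's standard `sqlite3` module. The schema and table names are hypothetical and exist only for this sketch.

```python
# Illustrative only: a join + aggregation over a hypothetical
# campaigns/conversions schema, using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE campaigns (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE conversions (campaign_id INTEGER, revenue REAL);
    INSERT INTO campaigns VALUES (1, 'search'), (2, 'social');
    INSERT INTO conversions VALUES (1, 100.0), (1, 50.0), (2, 25.0);
""")

# Per-campaign conversion counts and total revenue, highest revenue first.
rows = conn.execute("""
    SELECT c.name, COUNT(*) AS conversions, SUM(v.revenue) AS revenue
    FROM campaigns AS c
    JOIN conversions AS v ON v.campaign_id = c.id
    GROUP BY c.name
    ORDER BY revenue DESC
""").fetchall()
```

The same query shape (joining fact rows to a dimension and aggregating) carries over directly to warehouse platforms like Snowflake, BigQuery, or Redshift, or to a dbt model.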