

Data Engineer - Cloud & Data Warehouse
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a remote contract role for a Data Engineer - Cloud & Data Warehouse, open to U.S. residents. It requires 3–5 years of data engineering experience, strong SQL skills, and familiarity with cloud environments. Pay rate and contract length are unspecified.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date discovered
September 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
1099 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Remote
🧠 - Skills detailed
#Strategy #GCP (Google Cloud Platform) #Looker #AWS (Amazon Web Services) #Data Modeling #Microsoft Power BI #Version Control #Data Analysis #ETL (Extract, Transform, Load) #Agile #Visualization #Scala #Azure #Git #Data Integrity #Datasets #SQL (Structured Query Language) #Data Engineering #Data Pipeline #BI (Business Intelligence) #Security #Redshift #dbt (data build tool) #Data Warehouse #Data Access #Data Security #Data Quality #Snowflake #Tableau #BigQuery #Cloud #Data Governance
Role description
Workstate is looking for a Data Engineer to join our dynamic team on a contract basis! This role is perfect for individuals who thrive in a fast-paced environment and have a passion for leveraging data to drive business insights. As a contract Data Engineer, you will deliver robust data solutions while collaborating with stakeholders to meet project goals.

At Workstate, we prioritize a culture of innovation, teamwork, and excellence. Our consultants engage with clients not only to solve immediate problems but also to add strategic value to their businesses. If you're ready to make a meaningful impact while working with a talented team, this opportunity is for you!

This is a remote contract position, available to U.S. residents based in the continental U.S. who are authorized to engage in 1099 contracting without need for visa sponsorship or transfer.

Job Duties:
Design and develop high-performance, scalable, and reliable data models for our attribution and measurement platforms.
Build and maintain robust ETL/ELT pipelines to ingest, transform, and load large datasets from various sources (a sketch of this kind of pipeline step follows this list).
Collaborate with data engineers and analysts to define semantic layers and ensure consistency across data sources.
Manage the end-to-end pixel tracking solution, ensuring high availability and low-latency data capture for critical measurement needs.
Implement and promote best practices for data governance, data quality, and data security.
Enable self-service data access and analysis for stakeholders through well-designed data platforms and tools.
Monitor data pipeline performance, troubleshoot issues, and optimize for efficiency and cost.
Contribute to the overall architecture and strategy of our data platform.
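As a rough illustration of the pipeline work described above, here is a minimal warehouse-SQL sketch (Snowflake-style dialect assumed) that deduplicates raw pixel events and rolls them up into a daily attribution summary. All schema, table, and column names (raw.pixel_events, analytics.attribution_daily, and so on) are hypothetical and introduced only for this example, not taken from the posting.

```sql
-- Hypothetical ELT step: deduplicate raw pixel events and aggregate them into
-- a daily attribution summary. Table and column names are illustrative only.
CREATE OR REPLACE TABLE analytics.attribution_daily AS
WITH deduped AS (
    SELECT
        event_id,
        campaign_id,
        user_id,
        event_type,
        event_ts,
        -- keep only the earliest record per event_id
        ROW_NUMBER() OVER (PARTITION BY event_id ORDER BY event_ts) AS rn
    FROM raw.pixel_events
)
SELECT
    CAST(event_ts AS DATE)                                      AS event_date,
    campaign_id,
    COUNT(*)                                                    AS events,
    COUNT(DISTINCT user_id)                                     AS unique_users,
    SUM(CASE WHEN event_type = 'conversion' THEN 1 ELSE 0 END)  AS conversions
FROM deduped
WHERE rn = 1
GROUP BY 1, 2;
```

In practice a step like this would typically run incrementally over a bounded date window and be monitored for latency and cost, in line with the pipeline-monitoring duty above.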
Requirements: We are looking for candidates who possess the following skills and experience:
3–5 years of hands-on data engineering experience, with a strong track record of designing, building, and optimizing data platforms in high-volume or ad tech environments.
A strong collaborator who thrives in a cross-functional setting, effectively communicating technical concepts to diverse audiences
Strong SQL skills, including complex joins, aggregations, and performance tuning (see the illustrative model sketch after this list)
Experience working with semantic layers and data modeling for analytics
Solid understanding of data analysis and visualization best practices
Passionate about data quality and governance, with an eye for detail and a commitment to maintaining data integrity
Experience using version control systems, preferably Git
Excellent communication skills and the ability to work cross-functionally
Familiarity with modern data warehousing platforms (e.g., Snowflake, BigQuery, Redshift)
Experience working in cloud environments such as AWS, GCP, or Azure (nice to have)
Experience migrating from legacy BI tools (e.g., Tableau, Power BI) to Looker
Experience working in agile data teams and managing BI projects
Familiarity with dbt or other data transformation frameworks
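To give a sense of the SQL and dbt expectations listed above, here is a minimal dbt-style model sketch (Snowflake dialect assumed) that joins ad spend to the daily attribution summary and exposes one row per campaign per day as a consistent semantic-layer grain. The model, source, and column names (fct_campaign_performance, stg_ad_spend, attribution_daily) are hypothetical and introduced only for this example; config(), ref(), is_incremental(), and {{ this }} are standard dbt constructs.

```sql
-- models/marts/fct_campaign_performance.sql
-- Illustrative dbt model: joins spend to attributed events and exposes a
-- campaign/day grain for downstream BI tools. Names are hypothetical.
{{ config(materialized='incremental', unique_key='campaign_date_key') }}

WITH spend AS (
    SELECT campaign_id, spend_date, SUM(spend_usd) AS spend_usd
    FROM {{ ref('stg_ad_spend') }}
    GROUP BY 1, 2
),

attribution AS (
    SELECT event_date, campaign_id, unique_users, conversions
    FROM {{ ref('attribution_daily') }}
)

SELECT
    a.campaign_id || '-' || CAST(a.event_date AS VARCHAR) AS campaign_date_key,
    a.event_date,
    a.campaign_id,
    a.unique_users,
    a.conversions,
    s.spend_usd,
    -- NULLIF guards against divide-by-zero when a campaign has no conversions
    s.spend_usd / NULLIF(a.conversions, 0)                AS cost_per_conversion
FROM attribution a
LEFT JOIN spend s
    ON  s.campaign_id = a.campaign_id
    AND s.spend_date  = a.event_date
{% if is_incremental() %}
WHERE a.event_date > (SELECT MAX(event_date) FROM {{ this }})
{% endif %}
```

The incremental materialization and the NULLIF guard are meant to reflect the performance-tuning and data-quality expectations in the requirements; the details would of course depend on the client's actual warehouse and dbt project conventions.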