

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a remote Data Engineer position on a W2 contract, requiring 10+ years of experience in data engineering and business intelligence. Key skills include GCP BigQuery, Databricks, Power BI, and strong SQL proficiency.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 3, 2025
Project duration: Unknown
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed:
#GCP (Google Cloud Platform) #Data Lake #DAX #Data Modeling #Spark (Apache Spark) #Microsoft Power BI #ETL (Extract, Transform, Load) #Agile #PySpark #Data Science #Visualization #Data Lakehouse #Scala #Delta Lake #SQL (Structured Query Language) #Data Engineering #Data Pipeline #BI (Business Intelligence) #Security #Data Warehouse #Databricks #BigQuery #Cloud #Clustering #Data Governance
Role description
Data Engineer
Remote
W2 Contract
Job Summary:
We are seeking a skilled and motivated Data Engineer / BI Developer to join our data team. The ideal candidate will have strong experience with Google Cloud Platform (GCP) tools such as BigQuery and Databricks, along with proficiency in building and delivering interactive Power BI dashboards. You will be responsible for developing scalable data pipelines, transforming raw data into actionable insights, and supporting business intelligence initiatives.
Key Responsibilities:
Design, build, and optimize ETL/ELT pipelines on GCP Databricks using PySpark and SQL (a minimal sketch follows this list).
Develop and maintain data models in BigQuery, ensuring high performance and reliability.
Integrate data from multiple sources and ensure data consistency, quality, and accuracy.
Create intuitive and interactive Power BI dashboards and reports for stakeholders.
Collaborate with data scientists, analysts, and business stakeholders to gather requirements and deliver solutions.
Monitor, troubleshoot, and optimize data workflows and reports for performance and scalability.
Implement data governance, security, and best practices in cloud data environments.
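As a rough, non-authoritative illustration of the pipeline work described above, here is a minimal PySpark ETL step of the kind typically run on Databricks. The bucket path, table names, and column names are invented for this sketch and are not part of the posting.

```python
# Minimal ETL sketch for Databricks (PySpark). All paths, tables,
# and columns below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw JSON landed in cloud storage (hypothetical bucket).
raw = spark.read.json("gs://example-bucket/raw/orders/")

# Transform: deduplicate, drop bad rows, derive a date column.
orders = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_total") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write a Delta table partitioned by date for cheap incremental reads.
(orders.write
       .format("delta")
       .mode("overwrite")
       .partitionBy("order_date")
       .saveAsTable("analytics.orders_clean"))
```

On recent Databricks runtimes Delta Lake is the default table format, so the explicit format("delta") mainly serves as documentation; partitioning the output by date is the usual pattern behind the "scalable pipelines" responsibility, since downstream jobs can reprocess a single day instead of the full history.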
Required Skills and Qualifications:
10+ years of experience in data engineering, business intelligence, or related roles.
Proven hands-on experience with:
Google BigQuery – data modeling, query optimization, partitioning, clustering (a minimal sketch follows this list).
GCP Databricks – building scalable data pipelines using PySpark, notebooks.
Power BI – DAX, Power Query (M), dashboard design, and data visualization best practices.
Strong SQL skills and experience with cloud data warehouses.
Familiarity with data lakes, Delta Lake, and data lakehouse concepts.
Ability to work independently and in a team in a fast-paced agile environment.
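For the BigQuery item above, here is a minimal sketch of partitioning and clustering using the official google-cloud-bigquery Python client. The project, dataset, table, and column names are hypothetical.

```python
# Sketch: create a partitioned, clustered BigQuery table via the
# google-cloud-bigquery client. All names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

ddl = """
CREATE TABLE IF NOT EXISTS `my_project.analytics.orders_clean` (
  order_id    STRING,
  customer_id STRING,
  order_total NUMERIC,
  order_date  DATE
)
PARTITION BY order_date   -- prune scans to the dates a query filters on
CLUSTER BY customer_id    -- co-locate rows for selective customer filters
"""
client.query(ddl).result()  # wait for the DDL job to finish
```

Partitioning by order_date lets BigQuery skip entire partitions when queries filter on date, and clustering by customer_id sorts rows within each partition so selective filters scan fewer bytes, which is what the "query optimization" bullet is getting at.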