

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract position for 12+ months, offering $80.00 - $90.00 per hour. It requires 10+ years of data engineering experience, with strong SQL, Power BI, and ETL skills, preferably on Azure. The work location is in person.
Country
United States
Currency
$ USD
Day rate
720
Date discovered
August 12, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Fremont, CA 94538
Skills detailed
#Deployment #Data Lake #Azure SQL #IoT (Internet of Things) #Talend #GCP (Google Cloud Platform) #Programming #Snowflake #Data Warehouse #Azure Stream Analytics #OBIEE (Oracle Business Intelligence Enterprise Edition) #Azure #Data Bricks #Spark (Apache Spark) #Informatica #Migration #SQL (Structured Query Language) #Databricks #SAP #Tableau #AWS (Amazon Web Services) #BI (Business Intelligence) #Spark SQL #Data Engineering #Azure DevOps #Redshift #ADF (Azure Data Factory) #Amazon Redshift #Visualization #ETL (Extract, Transform, Load) #Microsoft Power BI #Schema Design #Synapse #Big Data #PySpark #DevOps #Cloud
Role description
Must-Have Skills:
10+ years of experience with modern data engineering, data warehousing, and data lake technologies on cloud platforms such as Azure, AWS, GCP, or Databricks; Azure experience is preferred over other cloud platforms.
Minimum 12+ years of proven experience with SQL, schema design, and dimensional data modeling
2+ years of experience with Power BI
5+ years of experience with a reporting/visualization tool such as Tableau or OBIEE
Solid knowledge of data warehouse best practices, development standards, and methodologies
Experience with ETL/ELT tools such as ADF, Informatica, and Talend, and with data warehousing technologies such as Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, and Google BigQuery
Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
Preferred Skills:
Event Hubs, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB knowledge.
SAP ECC/S/4 and HANA knowledge.
Azure DevOps and CI/CD deployments; cloud migration methodologies and processes
Job Type: Contract
Pay: $80.00 - $90.00 per hour
Expected hours: 40 per week
Benefits:
401(k)
Dental insurance
Health insurance
Vision insurance
Work Location: In person