

Senior Data Engineer – Azure, Databricks & BI
⭐ Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer – Azure, Databricks & BI, offering $80 to $90/hr on W2 for a remote position. Requires 5+ years in cloud platforms, 12+ years in SQL/data modeling, and experience with ETL tools and big data technologies.
🌎 Country: United States
💱 Currency: $ USD
💰 Day rate: 720
🗓️ Date discovered: August 12, 2025
🕒 Project duration: Unknown
🏝️ Location type: Remote
📄 Contract type: W2 Contractor
🔒 Security clearance: Unknown
📍 Location detailed: Fremont, CA
🧠 Skills detailed: #Deployment #Azure Data Factory #Data Lake #Azure SQL #IoT (Internet of Things) #Data Modeling #Talend #GCP (Google Cloud Platform) #Programming #Snowflake #Data Warehouse #Azure Stream Analytics #OBIEE (Oracle Business Intelligence Enterprise Edition) #Azure #Spark (Apache Spark) #Informatica #Migration #SQL (Structured Query Language) #Azure Databricks #Databricks #SAP #Tableau #AWS (Amazon Web Services) #BigQuery #BI (Business Intelligence) #Spark SQL #Data Engineering #Azure DevOps #Redshift #ADF (Azure Data Factory) #Amazon Redshift #Visualization #ETL (Extract, Transform, Load) #Microsoft Power BI #Schema Design #Synapse #Big Data #PySpark #DevOps #Cloud
Role description
Job Title: Senior Data Engineer – Azure, Databricks & BI
Work Location: Fremont, CA - Remote
Pay Rate: $80 to $90/hr on W2
Positions Available: 1
Interview Process: One 45-minute interview via Microsoft Teams
Must-Have Skills
• Cloud Platforms: 5+ years (onsite) / 2+ years (offshore) of experience in modern data engineering, data warehousing, and data lakes on platforms such as Azure, AWS, GCP, or Databricks (Azure preferred).
• SQL & Data Modeling: 12+ years (onsite) / 5+ years (offshore) of proven experience with SQL, schema design, and dimensional data modeling.
• Data Warehousing: Strong knowledge of best practices, development standards, and methodologies.
• ETL/ELT Tools: Experience with tools such as Azure Data Factory (ADF), Informatica, Talend, and data warehouse technologies like Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, or Google BigQuery.
• Big Data & Programming: Strong experience with Databricks, Spark, PySpark, and Spark SQL.
• Soft Skills: Independent self-learner with a proactive “let’s get this done” mindset; able to thrive in a fast-paced, dynamic environment. Strong communication and teamwork abilities.
Additional Requirements for Some Roles:
• 2+ years of experience with Power BI.
• 5+ years of experience with any reporting/visualization tool (e.g., Tableau, OBIEE).
Nice-to-Have Skills
• Experience with Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
• SAP ECC / S/4HANA knowledge.
• Familiarity with Azure DevOps, CI/CD deployments, and cloud migration methodologies.
Please connect with me at (609) 257-2872 or share your contact details at anil.chamoli@russelltobin.com.
Anil Chamoli
Lead - Recruitment
Contact: 609-257-2872
Email: anil.chamoli@russelltobin.com