

Lead Azure Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Azure Data Engineer on a contracting basis in Pleasanton, CA (Hybrid), focused on developing data pipelines with Azure Databricks. Key skills include Python, SQL, NoSQL, and ETL, along with prior data engineering experience.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 15, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Pleasanton, CA
Skills detailed:
#Azure Databricks #NoSQL #Debugging #Data Bricks #Security #Apache Spark #Data Manipulation #Snowflake #JavaScript #Synapse #Scala #ETL (Extract, Transform, Load) #SQL Queries #Data Integration #SQL (Structured Query Language) #Data Modeling #Data Extraction #Apache Kafka #Pandas #Programming #Azure #Databases #Data Engineering #Azure DevOps #Spark (Apache Spark) #GIT #Business Analysis #Data Science #Automation #Version Control #BigQuery #Data Quality #Batch #NumPy #Database Management #Compliance #DevOps #Schema Design #Data Conversion #Data Processing #Data Governance #Big Data #Java #JSON (JavaScript Object Notation) #AWS (Amazon Web Services) #Database Schema #Indexing #Python #Kafka (Apache Kafka) #Libraries #Data Pipeline #Databricks #GitHub #Data Lake #MongoDB #Cloud #Azure cloud
Role description
Job Title: Lead Azure Data Engineer
Location: Pleasanton, CA (Hybrid)
Job Type: Contract
About the Role:
We are looking for a highly skilled Senior Azure Databricks (ADB) Lead Developer to join our Data Engineering team. This role involves developing large-scale batch and streaming data pipelines on Azure Cloud.
Key Responsibilities:
• Design and Develop: Create real-time and batch data pipelines using Azure Databricks, Apache Spark, and Structured Streaming (a brief sketch follows this list).
• Data Processing: Write efficient ETL scripts and automate workflows using Python.
• Data Integration: Integrate with various data sources and destinations, including DB2, MongoDB, and other enterprise-grade data systems.
• Performance Optimization: Tune Spark jobs for optimal performance and cost-effective compute usage on Azure.
• Collaboration: Work with platform and architecture teams to ensure secure, scalable, and maintainable cloud data infrastructure.
• CI/CD Support: Implement CI/CD for Databricks pipelines and notebooks using tools like GitHub and Azure DevOps.
• Stakeholder Communication: Interface with product owners, data scientists, and business analysts to translate data requirements into production-ready pipelines.
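
To make the first two responsibilities more concrete, here is a minimal, illustrative PySpark sketch of a Structured Streaming pipeline of the kind this role describes: it reads JSON order events from a Kafka topic, computes five-minute revenue windows, and appends the results to a Delta table. The broker address, topic name, schema fields, and storage paths are placeholders, not details of the actual environment.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

# On Databricks a SparkSession named "spark" already exists; this builder is for local testing.
spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Hypothetical schema for the incoming JSON payload.
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a stream from Kafka (broker and topic are placeholders).
raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Parse the JSON value column and aggregate revenue into 5-minute event-time windows.
orders = (
    raw.select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
    .select("o.*")
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"))
    .agg(F.sum("amount").alias("total_amount"))
)

# Append the aggregates to a Delta table with checkpointing (paths are placeholders).
query = (
    orders.writeStream
    .format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/orders_agg")
    .trigger(processingTime="1 minute")
    .start("/mnt/delta/orders_agg")
)

# Block the driver so the stream keeps running (not needed in a Databricks notebook).
query.awaitTermination()

On Databricks the spark session and Delta Lake support come with the runtime; running this sketch elsewhere would also require the Spark Kafka connector and the delta-spark package.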
Required Skills:
• Proven experience in data engineering.
• Python Proficiency:
  • Data Manipulation: Using libraries like Pandas and NumPy for data manipulation and analysis (see the sketch after this list).
  • Data Processing: Writing efficient ETL scripts.
  • Automation: Automating repetitive tasks and workflows.
  • Debugging: Strong debugging skills to troubleshoot and optimize code.
• Database Management:
  • SQL: Advanced SQL skills for querying and managing relational databases.
  • NoSQL: Experience with NoSQL databases like MongoDB or Cassandra.
  • Data Warehousing: Knowledge of data warehousing solutions like Google BigQuery or Snowflake.
• Big Data Technologies:
  • Kafka: Knowledge of data streaming platforms like Apache Kafka.
• Version Control:
  • Git: Using version control systems for collaborative development.
• Data Modeling:
  • Schema Design: Designing efficient and scalable database schemas.
  • Data Governance: Ensuring data quality, security, and compliance.
• Database Management (DB2 and MongoDB):
  • DB2: Understanding of DB2 architecture, SQL queries, and database management.
  • MongoDB: Knowledge of MongoDB schema design, indexing, and query optimization.
• Programming Skills:
  • Proficiency in languages such as Java, Python, or JavaScript to write scripts for data extraction and transformation.
  • Experience with BSON (Binary JSON) for data conversion.
• Cloud Services:
  • Experience with cloud platforms like AWS or Azure for deploying and managing databases.
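
As a concrete, purely illustrative example of the Python, Pandas/NumPy, and MongoDB skills listed above, the sketch below cleans a raw CSV extract and loads it into a MongoDB collection with a unique index. The file name, column names, connection string, and database and collection names are all hypothetical.

import numpy as np
import pandas as pd
from pymongo import ASCENDING, MongoClient

# Extract: read a raw CSV export (file and column names are placeholders).
raw = pd.read_csv("orders_export.csv", dtype={"order_id": str})

# Transform: basic cleanup and enrichment with Pandas/NumPy.
raw["order_date"] = pd.to_datetime(raw["order_date"], errors="coerce")
raw["amount"] = pd.to_numeric(raw["amount"], errors="coerce")
clean = (
    raw.dropna(subset=["order_id", "order_date", "amount"])
       .drop_duplicates(subset=["order_id"])
       .copy()
)
clean["amount_usd"] = np.round(clean["amount"], 2)
clean["size_band"] = np.where(clean["amount_usd"] > 1_000, "large", "standard")

# Load: insert the cleaned records into MongoDB and index the main query path
# (connection string, database, and collection names are placeholders).
client = MongoClient("mongodb://localhost:27017")
orders = client["sales"]["orders"]
orders.create_index([("order_id", ASCENDING)], unique=True)
orders.insert_many(clean.to_dict(orient="records"))

The unique index on order_id reflects the kind of indexing decision the MongoDB bullet refers to: it enforces de-duplication at write time and keeps lookups on that key efficient.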
Preferred Skills:
• Experience with Java or Scala in Spark streaming.
• Familiarity with Azure services like Data Lake, Data Factory, Synapse, and Event Hubs.
• Background in building data platforms in regulated or large-scale enterprise environments.