Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer; the contract length and pay rate are not specified. Required skills include 5+ years in SQL and Python, 3+ years with Azure services, and experience with Git and CI/CD pipelines. Preferred certifications are "Microsoft Certified: Azure Data Engineer Associate" and "Databricks Certified Data Engineer Associate or Professional".
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
May 20, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Git #Data Storage #Databricks #Azure SQL #Data Pipeline #SQL (Structured Query Language) #Azure #Vault #Programming #SQL Server #.NET #Azure Databricks #Data Engineering #Cloud #Storage #Logic Apps #Azure Cloud #Data Lake #Azure Data Factory #Python #ADLS (Azure Data Lake Storage) #Database Management #ETL (Extract, Transform, Load) #Database Schema #Azure ADLS (Azure Data Lake Storage) #Data Integration #Scala #ADF (Azure Data Factory)
Role description
Technical Skills:

• Programming & Tools:
   • 5+ years of experience in SQL and Python; .NET is a plus.
   • 3+ years of experience with Azure cloud services, including:
      • Azure SQL Server
      • Azure Data Factory (ADF)
      • Azure Databricks (highlighted expertise)
      • Azure Data Lake Storage (ADLS)
      • Azure Key Vault
      • Azure Functions
      • Logic Apps
   • 3+ years of experience with Git and deploying code via CI/CD pipelines.
• Certifications (Preferred):
   • Microsoft Certified: Azure Data Engineer Associate
   • Databricks Certified Data Engineer Associate or Professional

Responsibilities:

1. Data Pipeline Development: Create and manage scalable data pipelines to collect, process, and store large volumes of data from various sources.
2. Data Integration: Integrate data from multiple sources, ensuring consistency, quality, and reliability.
3. Database Management: Design, implement, and optimize database schemas and structures to support data storage and retrieval.
4. ETL Processes: Develop and maintain ETL (Extract, Transform, Load) processes to ensure accurate and efficient data movement between systems.