

Jobs via Dice
Azure Synapse & Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Synapse & Data Engineer on an 18-month contract, 100% remote (EST candidates only). Key skills include Azure Synapse Analytics, Python, Spark, SQL, and API integration. Experience with data governance and operational support is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 16, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
California, United States
-
🧠 - Skills detailed
#Synapse #Data Governance #Compliance #REST (Representational State Transfer) #Alation #CLI (Command-Line Interface) #Security #Pandas #ADLS (Azure Data Lake Storage) #Azure DevOps #Data Engineering #REST API #Data Access #ETL (Extract, Transform, Load) #Vault #Python #Scala #PySpark #Data Ingestion #SQL (Structured Query Language) #JSON (JavaScript Object Notation) #Azure Synapse Analytics #SQL Queries #Delta Lake #API (Application Programming Interface) #Spark (Apache Spark) #Azure #Azure ADLS (Azure Data Lake Storage) #DevOps #Libraries #Storage #Data Lake #Data Modeling #Azure CLI (Azure Command Line Interface)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, CAYS Inc, is seeking the following. Apply via Dice today!
Role: Azure Synapse & Data Engineer
Term: 18-month contract
Location: 100% remote (EST candidates only)
Job Description
Platform Familiarity
Deep understanding of Azure Synapse Analytics architecture (SQL pools, Spark pools, pipelines)
Proficient in navigating Synapse Studio and managing workspaces
Experience accessing and managing Azure Data Lake Storage Gen2
Python & Spark
Maintain and debug PySpark scripts for ETL/ELT processes (see the sketch after this list)
Use Python libraries (e.g., requests, json, pandas) for data ingestion
Configure and tune Spark jobs for performance
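The PySpark duties above typically boil down to read-transform-write jobs over ADLS Gen2. A minimal sketch of that pattern follows; the storage paths, column names, and dedupe key are illustrative assumptions, not details from this posting.

from pyspark.sql import SparkSession, functions as F

# In a Synapse Spark pool a `spark` session already exists; building one here
# keeps the sketch self-contained.
spark = SparkSession.builder.appName("synapse-etl-sketch").getOrCreate()

# Hypothetical ADLS Gen2 locations (abfss://<container>@<account>.dfs.core.windows.net/...)
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_delta/"

df = (
    spark.read.parquet(raw_path)
         .withColumn("ingested_at", F.current_timestamp())  # stamp each load
         .dropDuplicates(["order_id"])                       # assumes an order_id business key
)

df.write.format("delta").mode("overwrite").save(curated_path)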
SQL & Data Modeling
Write and optimize T-SQL queries (see the query sketch after this list)
Create and manage views, stored procedures, and tables
Familiarity with data formats such as Parquet and Delta Lake
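On the T-SQL side, queries are often issued from Python against a Synapse SQL endpoint. The sketch below uses pyodbc with a parameterized query; the server, database, table, and authentication mode are placeholder assumptions that would vary by workspace.

import pyodbc

# Hypothetical Synapse SQL endpoint and database; driver and auth options depend on the environment.
conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=example-workspace-ondemand.sql.azuresynapse.net;"
    "Database=exampledb;"
    "Authentication=ActiveDirectoryInteractive;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    cursor.execute(
        "SELECT TOP (10) order_id, order_total "
        "FROM dbo.orders WHERE order_date >= ?",   # parameterized rather than string-concatenated
        "2026-01-01",
    )
    for row in cursor.fetchall():
        print(row.order_id, row.order_total)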
API Integration
Consume REST APIs with authentication (OAuth2, API keys); see the ingestion sketch after this list
Handle pagination, rate limits, and error responses
Parse and transform JSON responses
Schedule API calls via Synapse pipelines
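The API integration items above describe a common ingestion loop: acquire an OAuth2 token, page through a REST resource, back off on rate limits, and parse the JSON payload. A rough sketch follows; the token URL, API URL, grant type, and response fields ("items", "next") are all assumptions for illustration.

import time
import requests

TOKEN_URL = "https://login.example.com/oauth2/token"  # hypothetical OAuth2 token endpoint
API_URL = "https://api.example.com/v1/orders"         # hypothetical REST resource

def get_token() -> str:
    # Client-credentials grant; in practice the client id/secret would come from Key Vault.
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "client_credentials",
        "client_id": "example-client-id",
        "client_secret": "example-secret",
    }, timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def fetch_all() -> list:
    headers = {"Authorization": f"Bearer {get_token()}"}
    records, url = [], API_URL
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code == 429:  # rate limited: honor Retry-After, then retry
            time.sleep(int(resp.headers.get("Retry-After", "5")))
            continue
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload.get("items", []))
        url = payload.get("next")    # next-link style pagination; None ends the loop
    return records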
Azure & DevOps
Use Azure Key Vault for secrets and credentials (see the Key Vault sketch after this list)
Monitor jobs using Azure Monitor and Log Analytics
Access blob storage via Azure CLI or Storage Explorer
Manage code via Azure DevOps (branching, pull requests, versioning)
Deploy updates using CI/CD pipelines
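As a small illustration of the Key Vault item above, the Azure SDK for Python can fetch secrets at runtime instead of hard-coding credentials; the vault URL and secret name below are placeholders.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential resolves the ambient identity (managed identity, Azure CLI login, etc.).
credential = DefaultAzureCredential()
client = SecretClient(vault_url="https://example-kv.vault.azure.net", credential=credential)

api_key = client.get_secret("example-api-key").value  # retrieved at runtime, never checked into source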
Security & Access
Configure RBAC roles for data access
Manage private endpoints and VNet integration
Understand data governance and compliance requirements
Operational Support
Monitor pipeline health and job status
Respond to alerts and perform root cause analysis
Document workflows, dependencies, and escalation paths
Communicate with stakeholders and manage SLAs






