DATAEXL INFORMATION LLC

DevOps Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a DevOps Engineer on a 6-month contract, based in San Francisco, CA; Dallas, TX; Hopkins, MN; Charlotte, NC; or Atlanta, GA. Key skills include Azure/AWS, CI/CD, Terraform, and data engineering fundamentals.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 1, 2026
🕒 - Duration
6 months (contract-to-perm possible)
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Francisco, CA
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #Big Data #AWS (Amazon Web Services) #Kubernetes #Deployment #DevOps #GitHub #Bash #Azure DevOps #IAM (Identity and Access Management) #Spark (Apache Spark) #Triggers #Data Lake #Data Engineering #Data Encryption #Cloud #Storage #Compliance #Data Manipulation #Microsoft Azure #Git #Python #Docker #Data Governance #GitLab #Database Administration #Continuous Deployment #Data Modeling #Data Pipeline #ADLS (Azure Data Lake Storage) #Business Analysis #ETL (Extract, Transform, Load) #Terraform #Azure Synapse Analytics #Security #Data Science #Infrastructure as Code (IaC) #Databricks #Synapse #Scripting #Programming #SQL (Structured Query Language) #Agile #GDPR (General Data Protection Regulation) #Automation #Monitoring #Version Control #Azure #Logging
Role description
Only W2. No C2C, no H1B, no student visa.
Position title: DevOps Engineer
Location: San Francisco, CA; Dallas, TX; Hopkins, MN; Charlotte, NC; or Atlanta, GA
Contract: 6 months to perm, or a 6-month contract
In office: 3 days a week onsite
Interview process: 2 video interviews to hire

Supporting Azure and AWS Databricks from a DevOps perspective as an Analytics Platform Engineer requires a hybrid skill set spanning cloud infrastructure management and deployment, automation, and CI/CD practices; a background in, or exposure to, ELT, data engineering, and database administration concepts is helpful. As a member of the Analytics Platform Engineering team, you will build the Databricks foundation for use within U.S. Bank, developing automation via Terraform modules or other tooling to deploy Databricks workspaces along with other cloud resources. We are building a platform on top of a PaaS offering, so you will also implement a CI/CD solution in GitLab.

Core Responsibilities
• Implement CI/CD pipelines: Design and maintain continuous integration and continuous deployment pipelines using tools such as GitLab, Azure DevOps, or GitHub Actions for deploying data pipelines and infrastructure changes.
• Infrastructure as Code (IaC): Provision and manage Azure analytics services (Databricks, etc.) through code using tools such as Terraform, ARM templates, or Bicep to ensure consistency and repeatability.
• Platform operations: Deploy, manage, and optimize the performance and resource utilization of Azure data services, including monitoring and troubleshooting data pipeline failures and performance issues.
• Automation & scripting: Automate routine operational tasks and application deployments using scripting languages such as Python, PowerShell, or Bash.
• Security & compliance: Implement security best practices, including identity and access management (IAM), data encryption, and compliance with data governance policies (e.g., GDPR) within the platform.
• Collaboration: Work closely with data engineers, data scientists, and business analysts to translate data requirements into robust technical solutions and foster a DevOps culture within the organization.

Key Skills and Qualifications
• Cloud platform expertise: Deep knowledge of Microsoft Azure and/or AWS services, specifically Azure and/or AWS Databricks. Exposure to Azure Synapse Analytics (SQL pools, Spark pools, pipelines), Azure Data Factory (pipelines, triggers, data flows), Azure Data Lake Storage (ADLS), and Azure Monitor is beneficial as well.
• DevOps tools & methodologies:
• CI/CD platforms: Expertise in GitLab, Azure DevOps (Azure Pipelines, Boards, Repos), or GitHub Actions.
• Version control: Strong experience with Git and branching strategies.
• IaC tools: Proficiency in Terraform, Bicep, or ARM templates.
• Containerization: Experience with Docker and container orchestration (Azure Kubernetes Service, AKS) is highly beneficial.
• Programming & scripting languages: Proficiency in Python, SQL (including query optimization), and scripting languages such as PowerShell or Bash for automation and data manipulation tasks.
• Data engineering fundamentals: Basic understanding of data engineering principles such as ETL/ELT processes, data modeling, data warehousing, and big data concepts.
• Monitoring & logging: Experience implementing monitoring, logging, and alerting solutions using tools such as Azure Monitor, Log Analytics, and Application Insights.
• Soft skills: Excellent problem-solving, analytical, and communication skills to work effectively in fast-paced, agile environments.
• Preferred certifications: Microsoft Certified: Azure Data Engineer Associate (DP-203) or Microsoft Certified: Azure DevOps Engineer Expert (AZ-400) are highly valued.
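To give a flavor of the automation work described above, here is a minimal Python sketch of rendering per-environment `terraform.tfvars.json` input for a Databricks workspace module. All names, regions, and values are hypothetical illustrations, not the actual configuration used in this role:

```python
import json

# Hypothetical per-environment settings for a Databricks workspace Terraform
# module; the keys and values here are illustrative placeholders only.
ENVIRONMENTS = {
    "dev":  {"workspace_name": "analytics-dev",  "sku": "premium", "region": "centralus"},
    "prod": {"workspace_name": "analytics-prod", "sku": "premium", "region": "eastus2"},
}

def render_tfvars(env: str) -> str:
    """Render a terraform.tfvars.json payload for one environment."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env}")
    # sort_keys keeps the output deterministic, which keeps CI diffs clean
    return json.dumps(ENVIRONMENTS[env], indent=2, sort_keys=True)

if __name__ == "__main__":
    print(render_tfvars("dev"))
```

A CI/CD job (e.g., a GitLab pipeline stage) could call a script like this to generate the variable file before running `terraform plan`/`apply`, keeping environment configuration in one reviewed place rather than scattered across pipelines.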