DevOps Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a DevOps Engineer with a contract length of "unknown" and a day rate of $440 USD. Key skills include AWS, Terraform, Python, and CI/CD. Requires 5-7 years of experience in cloud platform engineering and data platform automation.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
πŸ—“οΈ - Date discovered
September 19, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Plano, TX
-
🧠 - Skills detailed
#Data Lake #Lambda (AWS Lambda) #Automation #EC2 #AWS (Amazon Web Services) #Python #API (Application Programming Interface) #Redshift #VPC (Virtual Private Cloud) #SQS (Simple Queue Service) #GIT #Cloud #Data Science #Data Pipeline #Databricks #Version Control #Infrastructure as Code (IaC) #Compliance #Terraform #SNS (Simple Notification Service) #GitHub #Monitoring #ELB (Elastic Load Balancing) #IAM (Identity and Access Management) #S3 (Amazon Simple Storage Service) #Data Engineering #CLI (Command-Line Interface) #Scripting #DevOps #Athena #Security #DynamoDB
Role description
• Lead the design, development, and implementation of innovative platform automation solutions for the Databricks data lake on AWS.
• Automate the provisioning, configuration, and management of core AWS services (S3, IAM, EC2, VPC, Lambda, CloudWatch, SNS, SQS, ELB, Route53) using Infrastructure as Code (Terraform).
• Develop robust automation scripts and tools in Python for Databricks workspaces, clusters, jobs, and security configurations, leveraging the Databricks APIs and CLI.
• Design, implement, and maintain CI/CD pipelines (e.g., GitHub Actions) for continuous delivery of infrastructure, platform, and automation code.
• Implement and enforce security best practices, compliance, and cost-optimization strategies across the AWS and Databricks data platform.
• Integrate and automate the management of AWS data services (Redshift, DynamoDB, Glue, Athena) within the Databricks ecosystem.
• Drive continuous improvement and innovation in platform operations, monitoring, and performance.
• Collaborate with data engineering, data science, governance, and operations teams to deliver automated solutions that improve productivity, platform stability, and customer experience.

Required Skills & Qualifications:
• 5-7 years of professional experience in cloud platform engineering, DevOps, or data platform automation.
• AWS (mandatory): extensive hands-on experience with S3, IAM, EC2, API Gateway, Step Functions, VPC, Lambda, CloudWatch, SNS, SQS, ELB, and Route53.
• DevOps and automation (mandatory):
• Proven expertise in Infrastructure as Code (IaC) with Terraform.
• Advanced Python scripting for platform automation, API integration, and data pipeline orchestration.
• A passion for identifying and implementing innovative automation solutions.
• Solid understanding of CI/CD principles and practical experience with tools such as GitHub Actions.
• Proficiency with Git for version control.
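To give a flavor of the "Python scripting for platform automation and API integration" work described above, here is a minimal sketch of creating a Databricks cluster through the Databricks REST API (the `/api/2.0/clusters/create` endpoint). The cluster name, Spark version, node type, and the `DATABRICKS_HOST`/`DATABRICKS_TOKEN` environment variables are hypothetical placeholders; a real tool would add error handling, retries, and configuration management.

```python
import json
import os
import urllib.request


def cluster_payload(name: str, spark_version: str, node_type: str, workers: int) -> dict:
    """Build the request body for a Databricks clusters/create call."""
    return {
        "cluster_name": name,
        "spark_version": spark_version,
        "node_type_id": node_type,
        "num_workers": workers,
        # Auto-terminate idle clusters as a simple cost-optimization measure.
        "autotermination_minutes": 30,
    }


def create_cluster(host: str, token: str, payload: dict) -> dict:
    """POST the payload to the workspace's clusters/create endpoint."""
    req = urllib.request.Request(
        f"{host}/api/2.0/clusters/create",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # returns e.g. {"cluster_id": "..."}


if __name__ == "__main__":
    # DATABRICKS_HOST and DATABRICKS_TOKEN are assumed environment variables.
    body = cluster_payload("etl-autoscale", "13.3.x-scala2.12", "i3.xlarge", 2)
    print(create_cluster(os.environ["DATABRICKS_HOST"],
                         os.environ["DATABRICKS_TOKEN"], body))
```

Keeping the payload builder separate from the HTTP call makes the script easy to unit-test and to reuse from a CI/CD pipeline (e.g., a GitHub Actions job) without a live workspace.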