

Sr. DevOps Engineer – BigData Platforms and Cloud Infrastructure (Remote)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a Sr. DevOps Engineer role for BigData Platforms and Cloud Infrastructure, offering a 5-month remote contract at $60-$69.28/hr. It requires 7+ years of DevOps experience, with expertise in big data, cloud services, and tools such as Kubernetes, Docker, and Apache Spark.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
552
-
🗓️ - Date discovered
June 17, 2025
🕒 - Project duration
3 to 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Management #Data Processing #Cloud #Storage #Deployment #Impala #dbt (data build tool) #LDAP (Lightweight Directory Access Protocol) #Scripting #JavaScript #Database Administration #Automation #HBase #HDFS (Hadoop Distributed File System) #Kafka (Apache Kafka) #Groovy #YARN (Yet Another Resource Negotiator) #Bash #Ansible #Snowflake #Linux #Data Governance #Kubernetes #GitLab #Apache Spark #Kudu #Scala #Python #Data Lakehouse #YAML (YAML Ain't Markup Language) #Trino #Docker #Spark (Apache Spark) #Hadoop #GIT #Big Data #Monitoring #Data Storage #Data Ingestion #Java #Data Lake #"ETL (Extract #Transform #Load)" #Maven #Compliance #Data Engineering #Jenkins #Kerberos #Computer Science #DevOps #Airflow
Role description
We are seeking a Sr. DevOps Engineer for a large, global B2B high-tech company. In this role, you will maintain and support a dedicated data environment within the BigData infrastructure, handling capacity planning, data pruning, database administration, and performance tuning for various tables and queries.
This is a 5-month, 40-hour-per-week remote contract in the US (extension and conversion are possible). You will be a W2 employee of Stage 4 Solutions; health benefits and a 401(k) are offered.
Responsibilities
• Performance tuning for various tables and queries.
• Responsible for access control, secrets management, and enforcing compliance with data governance policies.
• Support various CFI data management and data ingestion projects.
• Continuously monitor and optimize system performance and reliability.
• Diagnose and resolve issues in multi-source data ingestion pipelines, ETL processes, and data storage solutions.
• Develop CI/CD pipelines to automate code testing, integration, and deployment processes to ensure rapid delivery of new features.
• Collaborate with data engineering and development teams to integrate changes seamlessly into the deployment pipeline.
Requirements
• 7+ years of experience in a DevOps/SRE function, with at least 3 years focused on big data platforms, data engineering, and cloud infrastructure services.
• Experience designing, building, and maintaining scalable, robust cloud infrastructure, using Kubernetes and Docker for containerization and orchestration.
• Knowledge of MLOps platforms and practices.
• Hands-on experience with data processing tools like Apache Spark, dbt, Kafka, and Airflow.
• Strong knowledge of Hadoop components, including Spark Streaming, HDFS, HBase, YARN, Hive, Impala, Atlas, and Kudu.
• Familiarity with Snowflake, ClickHouse, Trino, Starburst, and data lakehouse technologies.
• Experience in securing the Hadoop stack using Sentry, Ranger, LDAP, and Kerberos KDC.
• Skilled in leveraging Software Configuration Management (SCM) and build tools like Git, GitLab, Nexus, Maven, Grunt, Jenkins, Docker, and Ansible for ongoing CI/CD operations.
• Good knowledge of CentOS 7.x and Linux system administration.
• Strong experience with scripting languages such as Python, Bash, Go, and Groovy (plus YAML for configuration) for automating Big Data cluster deployment and monitoring.
• Ability to learn quickly in a fast-paced, dynamic team environment.
• Strong communication skills and the ability to work collaboratively.
• Software development background with experience in OOP languages such as Java or JavaScript (Node.js) is a plus.
• B.S. or Master's degree in Computer Science or a related field, or commensurate work experience.
Please submit your resume to our network at https://www.stage4solutions.com/careers/ (apply to the Sr. DevOps Engineer – BigData platforms and cloud infrastructure (Remote) role).
Please feel free to forward this job post to others you think may be interested.
Stage 4 Solutions is an equal-opportunity employer. We celebrate diversity and are committed to providing employees with an inclusive environment that is free of discrimination and harassment. All employment decisions are based on the job requirements and candidates’ qualifications, without regard to race, color, religion/belief, national origin, gender identity, age, disability, marital status, genetic information, or other applicable legally protected characteristics.
Compensation: $60/hr. - $69.28/hr.