Akkodis

Cloudera Data Engineer - 100% Remote

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloudera Data Engineer on a contract basis (C2C & W2) for 100% remote work, offering $55 - $65 per hour. Requires 7+ years in data engineering, 4+ years with Cloudera, and strong Scala skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
November 6, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Python #Data Warehouse #Scala #Cloud #Big Data #EC2 #AWS EC2 (Amazon Elastic Compute Cloud) #Bash #Apache Spark #Data Engineering #ETL (Extract, Transform, Load) #Security #Data Pipeline #Computer Science #Metadata #HiveQL #VPC (Virtual Private Cloud) #Scripting #Spark (Apache Spark) #Data Governance #YARN (Yet Another Resource Negotiator) #S3 (Amazon Simple Storage Service) #Hadoop #DevOps #AWS (Amazon Web Services) #Data Integrity #Data Processing #IAM (Identity and Access Management) #Java #Cloudera #Migration #Programming #HDFS (Hadoop Distributed File System)
Role description
Akkodis is seeking a Cloudera Data Engineer for a Contract (C2C & W2) position; this is a 100% Remote role. Ideal applicants have a solid background in the Cloudera cluster/platform, data engineering, and AWS.

Rate Range: $55 - $65 per hour. The rate may be negotiable based on experience, education, geographic location, and other factors.

Job Summary
• We are seeking a Cloudera Data Engineer to support the migration of a Medicaid Data Warehouse implementation in an AWS environment. The resource will support the migration and continued operations of an existing Cloudera/Hive/Scala-based data pipeline environment from one AWS account to another.
• This position is responsible for ensuring a seamless transition, validating data integrity and job performance, and maintaining reliable daily operations post-migration.
• The role will work closely with the existing project team on the underlying AWS infrastructure (VPC, IAM, S3, EC2, networking). The resource will focus on Cloudera cluster migration, data pipeline reconfiguration, and operational stability.

Key Responsibilities
• Replicate and configure the existing Cloudera cluster (HDFS, YARN, Hive, Spark) in the new AWS account.
• Coordinate with the project team to ensure proper infrastructure provisioning (EC2, security groups, IAM roles, and networking).
• Reconfigure cluster connectivity and job dependencies for the new environment.
• Migrate and validate metadata stores (Hive Metastore, job configs, dependencies).
• Validate job execution and data outputs for parity with the existing environment.
• Deploy, test, and operate existing Hive and Spark (Scala) jobs post-migration (a short illustrative sketch of such a job follows the role description).
• Maintain job schedules, dependencies, and runtime configurations.
• Monitor job performance, identify bottlenecks, and apply tuning or code-level optimizations.
• Troubleshoot failures and implement automated recovery or alerting where applicable.
• Monitor Cloudera Manager dashboards, cluster health, and resource utilization.
• Manage user roles and access within the Cloudera environment.
• Implement periodic data cleanup, archiving, and housekeeping processes.
• Document configurations, migration steps, and operational runbooks.

Required Skills and Experience:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 7+ years of experience in data engineering or big data development.
• 4+ years of experience with the Cloudera platform (HDFS, YARN, Hive, Spark, Oozie).
• Experience deploying and operating Cloudera workloads on AWS (EC2, S3, IAM, CloudWatch).
• Strong proficiency in Scala, Java, and HiveQL; Python or Bash scripting experience preferred.
• Strong proficiency in Apache Spark and Scala programming for data processing and transformation.
• Hands-on experience with the Cloudera distribution of Hadoop.
• Hands-on experience implementing business-rules processing using Drools.
• Able to work with infrastructure, DevOps, and data governance teams in a multi-disciplinary environment.

Preferred Qualifications:
• Cloudera certification (e.g., CDP Data Engineer or Cloudera Administrator).
• Experience with Cloudera version upgrades or AWS-to-AWS environment migrations.
• Experience in public-sector or large enterprise data environments.

If you are interested in this position, please click APPLY NOW. For other opportunities available at Akkodis, go to www.akkodis.com.
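For illustration only (not part of the posting), below is a minimal sketch of the kind of Spark (Scala) job over Hive tables that this role would operate and validate post-migration. The object, table, and column names are hypothetical placeholders, not details from the engagement.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch: read a Hive table, apply a simple transformation,
// and write the result back to Hive. All names below are hypothetical.
object ClaimsAggregation {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ClaimsAggregation")
      .enableHiveSupport() // use the Hive Metastore configured for the cluster
      .getOrCreate()

    // Hypothetical source table in the existing warehouse
    val claims = spark.table("warehouse.claims_daily")

    // Stand-in for the pipeline's business logic: aggregate approved claims
    val summary = claims
      .filter(col("claim_status") === "APPROVED")
      .groupBy(col("provider_id"))
      .agg(sum(col("claim_amount")).as("total_approved"))

    // Overwrite the target Hive table for the daily run
    summary.write
      .mode("overwrite")
      .saveAsTable("warehouse.claims_summary_daily")

    spark.stop()
  }
}
```

Validating parity after migration would typically mean running a job like this in both AWS accounts and comparing row counts and aggregates on the output tables.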
If you have questions about the position, please contact Narendra Pratap at (213) 410-5211 or narendra.pratap@akkodis.com.

Equal Opportunity Employer/Veterans/Disabled

Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits, and a 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria.

Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client. To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy

The Company will consider qualified applicants with arrest and conviction records.