

Akkodis
Hadoop Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Hadoop Developer with a contract length of "unknown" and a pay rate of $62-$65/hour. Location options include Charlotte, NC, Richmond, VA, or Kennesaw, GA. Key skills required are ETL, Hadoop, Spark, and Python, with 10+ years of relevant experience.
Country
United States
Currency
$ USD
-
Day rate
520
-
Date
October 7, 2025
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Yes
-
Location detailed
Charlotte, NC
-
Skills detailed
#Hadoop #SQL (Structured Query Language) #Data Extraction #Business Analysis #Sqoop (Apache Sqoop) #Security #Cloud #Kafka (Apache Kafka) #Perl #ETL (Extract, Transform, Load) #YARN (Yet Another Resource Negotiator) #Cloudera #Strategy #Automation #Spark (Apache Spark) #Informatica #Impala #Python #Unix #Scripting
Role description
Hi,
Akkodis is seeking a Hadoop & ETL Developer for a contract position with a client in Charlotte, NC; Richmond, VA; or Kennesaw, GA. Ideally, we are looking for candidates with strong experience in ETL and Hadoop.
Pay Range: $62-$65/hour. The rate may be negotiable based on experience, education, geographic location, and other factors.
Full JD:
As a Hadoop & ETL Developer, you will develop the required code on the Hadoop platform to support Incentive Compensation Management (ICM) requirements. The individual will be responsible for understanding designs, proposing high-level and detailed design solutions, proposing out-of-the-box technical solutions for business requirements, and providing solutions to production-related issues. The individual should be flexible in working with the offshore team and provide SME knowledge of Hadoop and related tools. As an individual contributor, the person should have good analytical skills to make quick decisions during production outages and process abend situations. The role involves engaging in discussions with the information architecture team to arrive at design solutions, proposing new technology adoption ideas, attending project meetings, partnering with offshore teammates, and coordinating with other support teams such as APS, testing, and upstream and downstream partners.
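For context, the production-triage side of this work on a YARN-based Cloudera cluster might look like the minimal Python sketch below. It only wraps the standard YARN CLI and assumes that CLI is available on the PATH; the default application ID and output file name are illustrative placeholders, not details taken from this posting.

# Minimal triage sketch (assumption: the `yarn` CLI is installed and on PATH).
import subprocess
import sys

def yarn_app_status(app_id: str) -> str:
    """Return the raw output of `yarn application -status <app_id>`."""
    result = subprocess.run(
        ["yarn", "application", "-status", app_id],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

def save_yarn_logs(app_id: str, out_path: str) -> None:
    """Dump aggregated container logs from `yarn logs -applicationId <app_id>` to a file."""
    with open(out_path, "w") as log_file:
        subprocess.run(
            ["yarn", "logs", "-applicationId", app_id],
            stdout=log_file,
            text=True,
            check=True,
        )

if __name__ == "__main__":
    # The default ID below is a placeholder; pass a real application ID as the first argument.
    app_id = sys.argv[1] if len(sys.argv) > 1 else "application_0000000000000_0001"
    print(yarn_app_status(app_id))
    save_yarn_logs(app_id, app_id + ".log")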
Job Responsibilities:
Perform development activities such as analysis and coding, propose improvement ideas, and drive development activities at offshore
Work on multiple projects concurrently, taking ownership of and pride in the work; attend project meetings, understand requirements, design solutions, and develop code
Partner with offshore teammates and work with other support teams such as L2, testing, and upstream/downstream partners
Work with Business Analysts to understand requirements and work on high-level and detailed designs to address real-time issues in production
Partner with the Information Architecture team to propose technical solutions to business problems
Identify gaps in technology and propose viable solutions
Take accountability for the technical deliveries from offshore
Work with the development teams and QA during the post-code-development phase
Identify improvement areas within the application and work with the respective teams to implement them
Ensure adherence to defined processes, quality standards, and best practices, maintaining high quality levels in all deliverables
Required Qualifications:
10+ years of experience developing ETL solutions using an ETL tool such as Informatica
10+ years of experience developing applications using Hadoop, Spark, Impala, Hive, and Python (see the sketch after this list)
10+ years of experience running, using, and troubleshooting ETL and the Cloudera Hadoop ecosystem, i.e. Hadoop FS, Hive, Impala, Spark, Kafka, Hue, Oozie, YARN, Sqoop, and Flume
Experience with Autosys JIL scripting
Proficient scripting skills, i.e. Unix shell and Perl scripting
Knowledge of troubleshooting data-related issues
Experience processing large amounts of structured and unstructured data with MapReduce
Experience with SQL and relational databases, developing data extraction applications
Experience with data movement and transformation technologies
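To give a concrete sense of the Spark, Hive, and Python experience listed above, here is a minimal, hypothetical PySpark sketch of a Hive-to-Hive ETL job. It is not taken from the client's codebase: the database, table, and column names (icm_stage.sales_txn, icm_mart.rep_incentives, rep_id, amount) and the flat 5% rate are placeholders, and it assumes a Spark installation with Hive metastore support enabled.

# Minimal PySpark ETL sketch (all table/column names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("icm_incentive_rollup")
    .enableHiveSupport()  # assumes a configured Hive metastore
    .getOrCreate()
)

# Extract: read raw transactions from a (hypothetical) Hive staging table.
txns = spark.table("icm_stage.sales_txn")

# Transform: total eligible sales per rep and period, then apply a flat 5% incentive rate.
incentives = (
    txns.filter(F.col("status") == "ELIGIBLE")
        .groupBy("rep_id", "period")
        .agg(F.sum("amount").alias("eligible_sales"))
        .withColumn("incentive_amount", F.col("eligible_sales") * F.lit(0.05))
)

# Load: write the result to a (hypothetical) Hive mart table.
incentives.write.mode("overwrite").saveAsTable("icm_mart.rep_incentives")

spark.stop()

In practice, a job like this would typically be parameterized, scheduled through Oozie or Autosys, and written to partitioned tables rather than doing a full overwrite.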
Skills:
Application Development
Automation
Solution Design
Technical Strategy Development
Architecture
Equal Opportunity Employer/Veterans/Disabled
Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy.
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
· The California Fair Chance Act
· Los Angeles City Fair Chance Ordinance
· Los Angeles County Fair Chance Ordinance for Employers
· San Francisco Fair Chance Ordinance