

Matlen Silver
Python/Big Data Developer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Python/Big Data Developer on an 18-month W2 contract at $65/hour in Charlotte, NC. Requires 5+ years of Hadoop experience, proficiency in PL/SQL, Unix shell scripting, and familiarity with big data frameworks.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
December 20, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#React #RDBMS (Relational Database Management System) #Big Data #Spark (Apache Spark) #Shell Scripting #Data Framework #ETL (Extract, Transform, Load) #Hadoop #Unix #Programming #Regression #Security #Agile #Linux #Apache Spark #SQL (Structured Query Language) #ECR (Elastic Container Registry) #Kafka (Apache Kafka) #Python #Angular #Data Security #Scripting
Role description
Python/Big Data Developer
18 Month W2 Contract
$65/hour
Charlotte, NC
Hybrid 3 days onsite
Mid-level to Senior software developer to design, develop, maintain, and test software for the ECR RCF team, collaborating with the development team and business partners to ensure successful delivery and implementation of application solutions in a dynamic Agile environment. The ability to achieve subject matter expertise quickly on new applications is needed. Programming experience with Python, Hadoop/Big Data, PL/SQL with RDBMS, and Unix shell scripting is required; Angular 5, React, and NodeJS are a plus.
Day-to-Day Responsibilities:
o Develop and support the Enterprise Credit Risk ETL platform
o Develop unit, integration, regression, and performance test scripts and test suites for the framework
o Follow the Agile development process - work refinement, estimation, retrospectives, and other rituals
o Interface with users and other team members to understand requirements and to engineer and analyze solution options
o Communicate solutions effectively to teams and mentor junior resources
o Engage the Architecture team as needed during development and support
o Participate in POCs - evaluate tools and technologies as needed
Required Skills:
• 5+ years of experience working with Hadoop and its ecosystem
• Proficiency in Hadoop, PL/SQL with RDBMS, and Unix shell scripting
• Familiarity with big data frameworks such as Apache Spark and Kafka
• Strong understanding of Linux/Unix systems and shell scripting
• Knowledge of data security practices in Hadoop environments
• Excellent problem-solving and communication skills
• Ability to handle multiple tasks, lead the team through delivery, and adapt to a constantly changing environment
• Ability to learn quickly and work with minimal supervision
