

Big Data Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Developer with 12+ years of experience, focusing on Hadoop technologies. Contract length is unspecified, located in Columbus, OH / Jersey City, NJ, with a pay rate of "unknown." Key skills include Java, Python, and financial domain knowledge.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
June 25, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Columbus, OH
Skills detailed
#Spark (Apache Spark) #Azure #Data Extraction #Data Governance #Kafka (Apache Kafka) #Data Security #ML (Machine Learning) #NoSQL #Kubernetes #Tableau #Version Control #Data Accuracy #Data Ingestion #AWS (Amazon Web Services) #Apache Spark #GCP (Google Cloud Platform) #Datasets #Data Architecture #Data Quality #Agile #SQL Queries #Data Pipeline #Metadata #Scala #BI (Business Intelligence) #Docker #Hadoop #DevOps #Qlik #HDFS (Hadoop Distributed File System) #Compliance #Python #Java #Databases #SQL (Structured Query Language) #Big Data #Cloudera #Scrum #HBase #GIT #ETL (Extract, Transform, Load) #Data Management #Batch #Data Processing #Microsoft Power BI #Automation #PCI (Payment Card Industry) #Security #Cloud #Programming #Data Science #Visualization #MIFID (Markets in Financial Instruments Directive) #Computer Science #Project Management
Role description
Big Data Developer - Hadoop
Location: Columbus, OH / Jersey City, NJ
Experience Level: 12+ Years
Business Line: Corporate & Investment Bank / Consumer & Community Banking
Please send your resume to Sganapathy@radiantze.com
Position Overview
One of the world's leading financial institutions is seeking an experienced Big Data Developer with extensive Hadoop expertise to join its Technology team. As part of the client's commitment to innovation and digital transformation, you will play a critical role in building scalable data solutions that power banking operations, risk management, and customer experience initiatives. The ideal candidate will have deep technical expertise in Java/Python programming and a comprehensive understanding of large-scale financial data processing in enterprise environments.
Key Responsibilities
Data Architecture & Development
• Design and develop scalable big data solutions using Hadoop ecosystem components (HDFS, MapReduce, Hive, Spark, Kafka, HBase)
• Build and optimize ETL pipelines for processing large volumes of financial data
• Implement data ingestion frameworks for various sources, including real-time streaming and batch processing
• Develop and maintain data models for complex financial datasets
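To illustrate the batch-processing side of the ingestion work described above, here is a minimal pure-Python sketch (the record fields and batch size are hypothetical, not from the posting) of chunking a record stream into fixed-size batches, the basic pattern behind batch ingestion:

```python
from typing import Dict, Iterable, Iterator, List

def batch(records: Iterable[Dict], size: int) -> Iterator[List[Dict]]:
    """Yield fixed-size batches from a record stream; the last batch may be smaller."""
    buf: List[Dict] = []
    for rec in records:
        buf.append(rec)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:
        yield buf

# Hypothetical trade records: seven records in batches of three -> sizes 3, 3, 1
trades = [{"trade_id": i, "amount": 100 + i} for i in range(7)]
batches = list(batch(trades, 3))
print([len(b) for b in batches])  # [3, 3, 1]
```

In a real pipeline the same chunking pattern would sit in front of a Spark or Kafka sink rather than an in-memory list.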
Programming & Technical Implementation
• Write efficient, maintainable code in Java and Python for big data applications
• Develop MapReduce jobs, Spark applications, and streaming data processing solutions
• Create and optimize SQL queries and stored procedures for data extraction and transformation
• Implement data quality checks and validation frameworks
Client-Specific Data Management
• Process and analyze massive transaction volumes, trading data, credit card transactions, and regulatory reporting datasets
• Support digital banking initiatives and customer analytics platforms
• Implement risk management solutions for credit risk, market risk, and operational risk
• Ensure compliance with banking regulations including SOX, PCI-DSS, and internal data governance standards
• Handle sensitive customer and financial data following the client's security protocols and encryption standards
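On the PCI-DSS point above: full card numbers should never appear in logs or derived datasets, and a common practice is masking all but the last four digits. A minimal sketch (the function name and input format are illustrative):

```python
def mask_pan(pan: str) -> str:
    """Mask a card number, keeping only the last four digits."""
    digits = "".join(ch for ch in pan if ch.isdigit())
    return "*" * (len(digits) - 4) + digits[-4:]

print(mask_pan("4111 1111 1111 1234"))  # ************1234
```

Production systems would pair masking like this with encryption at rest and tokenization, per the client's own security standards.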
Performance & Optimization
• Monitor and tune Hadoop cluster performance for optimal resource utilization
• Optimize data processing jobs for improved performance and cost efficiency
• Implement data partitioning and compression strategies
• Troubleshoot and resolve performance bottlenecks in big data pipelines
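The partitioning strategy mentioned above usually means grouping data by a key (most often a date, as in Hive-style `trade_date=YYYY-MM-DD` directory layouts) so queries can skip irrelevant partitions. A toy in-memory sketch with hypothetical fields:

```python
from collections import defaultdict

def partition_by(records, key):
    """Group records by a partition key, mimicking date-partitioned table layouts."""
    parts = defaultdict(list)
    for rec in records:
        parts[rec[key]].append(rec)
    return dict(parts)

# Hypothetical trade rows spanning two partition dates
rows = [
    {"trade_date": "2025-06-24", "symbol": "AAPL"},
    {"trade_date": "2025-06-25", "symbol": "MSFT"},
    {"trade_date": "2025-06-25", "symbol": "GOOG"},
]
parts = partition_by(rows, "trade_date")
print({d: len(v) for d, v in sorted(parts.items())})  # {'2025-06-24': 1, '2025-06-25': 2}
```

A query filtered to one date would then read only that partition's files, which is where the performance win comes from.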
Required Qualifications
Experience & Education
• Bachelor's degree in Computer Science, Engineering, or a related field
• 12+ years of total IT experience, with a minimum of 6 years in big data technologies
• 5+ years of hands-on experience with the Hadoop ecosystem (HDFS, MapReduce, Hive, Spark)
• 4+ years of experience in the banking, financial services, or fintech industry
• Experience working in large enterprise environments with complex data architectures
Technical Skills
• Expert-level proficiency in Java and Python programming
• Strong experience with Hadoop distributions (Cloudera, Hortonworks, MapR)
• Proficiency in Apache Spark (Scala/Python), Kafka, HBase, and Hive
• Experience with data serialization formats (Avro, Parquet, ORC)
• Knowledge of SQL and NoSQL databases
• Familiarity with cloud platforms (AWS, Azure, GCP) and their big data services
• Experience with version control systems (Git) and CI/CD pipelines
Financial Domain Knowledge
• Deep understanding of banking operations, trading systems, and financial markets
• Experience with financial data formats and industry standards (FIX, SWIFT, etc.)
• Knowledge of regulatory requirements (Basel III, Dodd-Frank, MiFID II)
• Understanding of risk management and compliance frameworks
Additional Skills
• Experience with data visualization tools (Tableau, Power BI, Qlik)
• Knowledge of machine learning frameworks and algorithms
• Understanding of data security and encryption techniques
• Experience with Agile/Scrum development methodologies
Preferred Qualifications
• Master's degree in Computer Science, Data Science, or a related field
• Certifications in Hadoop technologies (Cloudera, Hortonworks)
• Experience with real-time trading systems and market data processing
• Knowledge of containerization technologies (Docker, Kubernetes)
• Experience with data governance and metadata management tools
• Understanding of DevOps practices and infrastructure automation
Key Competencies
• Strong analytical and problem-solving skills
• Excellent communication and collaboration abilities
• Ability to work in fast-paced, high-pressure financial environments
• Attention to detail and commitment to data accuracy
• Ability to mentor junior developers and lead technical initiatives
• Strong project management and multitasking capabilities
Regards,
Radiantze Inc