Marchon Partners

ODI Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior ODI Developer with a contract length of "unknown", offering a day rate of $600. The position requires expertise in ODI, Oracle Data Warehouse, Linux, Python scripting, and a strong understanding of the Banking domain.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
April 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#API (Application Programming Interface) #ETL (Extract, Transform, Load) #Agile #GIT #Scripting #Documentation #Oracle #Code Reviews #Data Architecture #Data Engineering #Cloud #CRM (Customer Relationship Management) #IP (Internet Protocol) #Unix #Data Modeling #Data Warehouse #SQL Queries #Scrum #SQL (Structured Query Language) #DBA (Database Administrator) #Jira #Unit Testing #Automation #BitBucket #Linux #Kanban #Apache Airflow #Data Processing #Python #Data Pipeline #Data Transformations #dbt (data build tool) #Computer Science #ODI (Oracle Data Integrator) #Scala #Airflow
Role description
Senior ODI Developer

Role Overview: Seeking a highly skilled and experienced Senior ODI Developer to join our Private Banking Systems team. The ideal candidate will have expertise in a range of technologies, including ODI (Oracle Data Integrator), Oracle Data Warehouse, Linux, and Python scripting; a deep understanding of the Banking domain is a big plus. As a Data Engineer, you will play a pivotal role in designing, developing, and maintaining data solutions.

Key Responsibilities:
• Build ODI mappings/interfaces, packages, procedures, scenarios, topology configurations, ODI Agents, and load plans to integrate data from multiple enterprise systems.
• Build PL/SQL queries, procedures, and data loading processes, ensuring high performance and scalability to meet the evolving data needs of various applications.
• Design, develop, and maintain ETL/ELT pipelines using Oracle Data Integrator (ODI).
• Collaborate effectively with cross-functional teams, including other data engineers, the DBA group, analysts, and business stakeholders, to understand data requirements and deliver solutions.
• Monitor and troubleshoot RMJ jobs, ODI workflows, sessions, agents, and data pipelines in Linux environments.
• Perform root cause analysis for failures related to ODI workflows, RMJ jobs, network connectivity, API integrations, and file transfers.
• Optimize ETL workflows to improve reliability, performance, and scalability.
• Use scripting and automation tools to support data processing and operational workflows.
• Work in Linux/Unix environments, using command-line tools and shell scripts for job automation and troubleshooting.
• Maintain comprehensive documentation of data processes, configurations, and best practices.
• Participate in walk-throughs that review program specifications, source code, and all supporting technical documentation, including screens/reports; provide feedback in accordance with team standards and guidelines.
• Participate in the implementation of changes, enhancements, and newly developed programs.
• Conduct technical research and provide recommendations; develop proofs of concept or prototypes, contributing to the technical design of applications.
• Help identify coding patterns and anti-patterns, and enforce implementation of the patterns through code reviews.
• Quickly resolve issues encountered by business lines in the production environment, maintaining a helpful, "high touch" approach to working with business users; perform root cause analysis, technology evaluation, and performance tuning.

Desired Qualifications:
• Degree in Computer Science, Engineering, or a related technical area
• 7+ years of extensive hands-on experience with ODI, Oracle Data Warehouse, Oracle PL/SQL, Linux, Python scripting, and the ODI admin module (ODI Agent setup, log configuration, certificate installation)
• Must have experience building PL/SQL queries for Oracle Server (incl. stored procedures, functions, etc.) and must understand basic principles of data modeling
• Excellent collaboration and communication skills, particularly in high-stress situations
• Experience with Python and Linux scripting, CLE, and networking fundamentals (APIs, IP/ports, SFTP/FTP connectivity)
• High proficiency in development practices: unit testing, Continuous Integration (CI/CD), refactoring, clean code
• Experience with Bitbucket/Git source control management
• Problem-solving skills; able to anticipate upcoming risks and issues and address them accordingly
• Ability to interpret and troubleshoot applications using logs
• Proactive approach and good communication skills
• Experience with agile methodologies (Scrum, Kanban) and tools (Jira)

Nice to Have:
• Private Banking domain experience
• Working experience in the financial services industry
• Knowledge of financial applications such as FIS AddVantage, CRD, and CRM Pivotal
• Experience with Apache Airflow for workflow orchestration
• Knowledge of dbt (Data Build Tool) for modern data transformations
• Exposure to cloud data platforms or hybrid data architectures

Key Competencies:
• Strong analytical and problem-solving skills
• Ability to work with large-scale enterprise data environments
• Excellent collaboration and communication skills
• Ability to manage multiple priorities in a fast-paced environment
• Commitment to continuous learning and technology innovation