

Compunnel Inc.
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "unknown," located in "unknown." Key skills include advanced proficiency in Java, Python, and JSON, plus experience with Snowflake, PostgreSQL, and AWS services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Hampshire, United States
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #Agile #Informatica #SQL (Structured Query Language) #Java #IAM (Identity and Access Management) #AWS (Amazon Web Services) #Python #ETL (Extract, Transform, Load) #Linux #Cloud #RDS (Amazon Relational Database Service) #Docker #Automation #PostgreSQL #Stories #Unix #Debugging #JSON (JavaScript Object Notation) #Version Control #Ansible #Maven #Scripting #Data Engineering #Data Integration #EC2 #Documentation #Data Manipulation #DevOps #S3 (Amazon Simple Storage Service) #Shell Scripting #Snowflake #Jenkins #Databases #Jira
Role description
About the Company
Extensive experience with ETL technologies.
About the Role
Design and develop ETL reporting and analytics solutions.
Responsibilities
• Knowledge of Data Warehousing methodologies and concepts – preferred
• Advanced data manipulation languages and frameworks (Java, Python, JSON) – required
• RDBMS experience (Snowflake, PostgreSQL) – required
• Knowledge of Cloud platforms and Services (AWS – IAM, EC2, S3, Lambda, RDS) – required
• Designing and developing low- to moderately-complex data integration solutions – required
• Experience with DevOps and Continuous Integration/Continuous Delivery tools (Maven, Jenkins, Stash, Ansible, Docker) – preferred
• Expert in SQL and stored procedures on relational databases
• Strong debugging, analysis, and production support skills
• Application Development based on JIRA stories (Agile environment)
• Demonstrable experience with ETL tools (Informatica, SnapLogic)
• Experience working with Python in an AWS environment
• Create, update, and maintain technical documentation for software-based projects and products.
• Solve production issues.
• Interact effectively with business partners to understand business requirements and assist in generation of technical requirements.
• Participate in architecture, technical design, and product implementation discussions.
• Working knowledge of Unix/Linux operating systems and shell scripting
• Experience developing sophisticated Continuous Integration and Continuous Delivery (CI/CD) pipelines, including software configuration management, test automation, version control, and static code analysis.
• Excellent interpersonal and communication skills
• Ability to work with global Agile teams
• Proven ability to deal with ambiguity and work in a fast-paced environment
• Ability to mentor junior data engineers.
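The responsibilities above center on ETL-style data manipulation with Python and JSON. As a hedged illustration only (the record layout and field names below are hypothetical, not taken from the posting), a minimal extract-transform step of the kind this role describes might look like:

```python
import json

# Hypothetical raw payload, as it might arrive from an upstream extract.
raw = """
[
  {"id": 1, "name": " Alice ", "amount_usd": "1200.50"},
  {"id": 2, "name": "Bob", "amount_usd": "300"}
]
"""

def transform(records):
    """Normalize names and cast amounts to float (a typical cleanup pass)."""
    cleaned = []
    for rec in records:
        cleaned.append({
            "id": rec["id"],
            "name": rec["name"].strip(),
            "amount_usd": float(rec["amount_usd"]),
        })
    return cleaned

def run_etl(payload):
    # Extract: parse the JSON payload into Python dicts.
    records = json.loads(payload)
    # Transform: clean each record.
    # (Load would follow here; a real job might write to Snowflake or PostgreSQL.)
    return transform(records)

print(run_etl(raw))
```

In a production pipeline the load step would target an RDBMS such as Snowflake or PostgreSQL; this sketch stops at the transform stage to stay self-contained.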
Qualifications
Education details
Required Skills
Advanced data manipulation languages and frameworks (Java, Python, JSON) – required
RDBMS experience (Snowflake, PostgreSQL) – required
Knowledge of Cloud platforms and Services (AWS – IAM, EC2, S3, Lambda, RDS) – required
Designing and developing low- to moderately-complex data integration solutions – required
Preferred Skills
Experience with DevOps and Continuous Integration/Continuous Delivery tools (Maven, Jenkins, Stash, Ansible, Docker)
