

Compunnel Inc.
Data Engineer - KUMDC5768358
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown," offering a pay rate of "$$$/hour." Key skills include strong SQL, data masking technologies, Python, and AWS experience. A BS/BA or equivalent experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 20, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Hampshire, United States
-
🧠 - Skills detailed
#Bash #RDBMS (Relational Database Management System) #Oracle #Automation #Shell Scripting #Data Analysis #AWS (Amazon Web Services) #MySQL #Cloud #Data Quality #SQL (Structured Query Language) #Data Pipeline #Informatica #Automated Testing #ETL (Extract, Transform, Load) #Unix #Data Engineering #Data Management #Python #SQL Server #Monitoring #Scripting #JCL (Job Control Language)
Role description
Top Skills / Overview of the JD
Data masking (similar to ETL, but mostly transforming data and placing it into a lower-level environment) - effectively creating test and dev data; masking removes the PII
Strong SQL - DB2 systems are being leveraged, so IBM Db2 experience is helpful, but any relational DB experience is good (Postgres, MySQL, SQL Server, Oracle) - foundational data engineering knowledge, complex SQL, design/schema work, and the relational and logical nature of data systems - looking for more depth than the Associate level
Masking tech is needed - Delphix is being used, but there are others (DATPROF, IRI FieldShield, IBM InfoSphere Optim, Informatica Persistent Data Masking, Gigantics) - any ETL experience in lieu of masking tech would be beneficial
Python, AWS
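To make the masking workflow above concrete, here is a minimal Python sketch of the kind of transformation described: copying production-style rows into a lower environment while replacing PII columns with deterministic surrogates. The table, column names, and salt are hypothetical illustrations, not part of the role or any specific tool such as Delphix.

```python
import hashlib

# Hypothetical production rows; "name" and "ssn" are the PII columns.
PROD_ROWS = [
    {"id": 1, "name": "Alice Smith", "ssn": "123-45-6789", "balance": 100.0},
    {"id": 2, "name": "Bob Jones", "ssn": "987-65-4321", "balance": 250.5},
]

def mask_value(value: str, salt: str = "dev-env") -> str:
    """One-way deterministic mask: the same input always yields the same
    token, so masked data stays joinable across tables."""
    digest = hashlib.sha256((salt + value).encode("utf-8")).hexdigest()
    return digest[:12]

def mask_rows(rows, pii_columns=("name", "ssn")):
    """Return copies of the rows with PII columns masked; non-PII
    columns (ids, amounts) pass through unchanged for realistic tests."""
    masked = []
    for row in rows:
        out = dict(row)
        for col in pii_columns:
            out[col] = mask_value(str(out[col]))
        masked.append(out)
    return masked

dev_rows = mask_rows(PROD_ROWS)
```

Deterministic (rather than random) masking is one common design choice because it preserves referential integrity when the same PII value appears in multiple tables.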
The Expertise and Skills You Bring
• Software development experience with a focus on data engineering/analysis.
• BS/BA or advanced degree, or equivalent experience preferred.
• Strong experience with Unix shell scripting (bash, ksh) and scheduling / orchestration tools (Control-M).
• Strong proficiency in Python, with experience developing data pipelines, automation scripts, and reusable data transformation logic.
• Experience developing APIs.
• Experience working in the cloud, preferably AWS.
• Experience with Data Masking solutions (Delphix, Optim).
• Proficiency with DB2.
• Extensive experience in SQL language.
• Proven data analysis skills; not afraid to work in a complex data ecosystem.
• Knowledge of native stored procedures is a plus.
• Proficient in developing data management solutions using RDBMSs such as Oracle, DB2, CockroachDB, and Postgres.
• Experience with data quality frameworks, including automated testing, validation, and monitoring to ensure trusted data.
• Any experience in mainframe languages and technologies is a plus (COBOL, CICS, VSAM, JCL, utilities, File-AID, Endevor, debuggers, schedulers, etc.).
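The data quality bullet above mentions automated testing, validation, and monitoring. As a rough illustration only (the rule names and row shape are made up, not from the listing), a minimal rule-based validator might look like this in Python:

```python
# A minimal sketch of automated data quality validation: run row-level
# rules over a batch and collect failures before the data is promoted.
def validate(rows):
    """Return a list of (row_index, rule_name) failures; an empty list
    means the batch passed all checks."""
    failures = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: primary-key uniqueness.
        if row.get("id") in seen_ids:
            failures.append((i, "duplicate_id"))
        seen_ids.add(row.get("id"))
        # Rule 2: balance must be present and non-negative.
        balance = row.get("balance")
        if balance is None or balance < 0:
            failures.append((i, "invalid_balance"))
    return failures

clean_batch = [{"id": 1, "balance": 10.0}, {"id": 2, "balance": 0.0}]
dirty_batch = [{"id": 1, "balance": 10.0}, {"id": 1, "balance": -5.0}]
```

In practice these checks would run inside a scheduler (e.g. Control-M, as named above) so a failing batch blocks downstream loads instead of silently propagating bad data.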
