HCL Global Systems Inc

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer requiring 10+ years of experience, focusing on SQL, Oracle, AWS, and Java Spring Batch for ETL. Located in Smithfield, RI (Hybrid), the contract is W2 only, with a strong preference for financial domain experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 20, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Smithfield, RI
-
🧠 - Skills detailed
#Data Modeling #Data Engineering #S3 (Amazon Simple Storage Service) #SnowSQL #Maven #DevOps #Informatica #Jenkins #SQL (Structured Query Language) #Data Vault #Vault #Batch #ETL (Extract, Transform, Load) #Oracle #Scrum #Data Analysis #Ansible #Lambda (AWS Lambda) #Docker #Agile #Deployment #Python #Java #Kanban #AWS (Amazon Web Services) #Computer Science #Snowflake
Role description
Job Details: Data Engineer
Location: Smithfield, RI (Hybrid)
Mode: W2 only (no C2C/1099)

Required skills:
• Strong SQL for querying and data validation
• Oracle
• AWS
• ETL experience with Java Spring Batch for the ETL data transformation (see the illustrative sketch after this list)
• Note: the ETL work is done in Java

Nice to have:
• Python
• Snowflake
• Financial domain experience
• Note: Informatica ETL experience is not helpful to us in this role. It's fine if a candidate has it, but the experience we need is on the Java ETL side.

The expertise and skills you bring:
• Bachelor's or Master's degree in a technology-related field (e.g. Engineering, Computer Science), required, with 10+ years of experience
• Advanced SQL/SnowSQL knowledge and experience working with Snowflake databases
• Solid experience working with AWS (Batch, S3)
• 5+ years of application development using Java / Spring Batch
• 3+ years of Python development
• Hands-on experience with SQL query optimization and tuning to improve performance is desirable
• Experience with job scheduling tools (Control-M preferred)
• Proven data analysis skills
• Strong data modeling skills with either Dimensional or Data Vault models
• Good to have: working experience with some or all of the following: AWS, containerization, associated CI/CD build and deployment pipelines, Lambda development
• Experience with DevOps and Continuous Integration / Continuous Delivery (Maven, Jenkins, Stash, Ansible, Docker)
• Experience with Agile methodologies (Kanban and Scrum) is a plus
• Proven track record of handling ambiguity and working in a fast-paced environment
• Good interpersonal skills for working with multiple teams across the organization
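Since the posting stresses that the ETL transformation work is done in Java with Spring Batch (not Informatica), here is a minimal sketch of what a chunk-oriented Spring Batch ETL job looks like, assuming Spring Boot 3 / Spring Batch 5. Everything specific in it (the Trade record, the trades table and columns, the output path) is hypothetical and for illustration only; none of it comes from the posting.

```java
// Minimal chunk-oriented Spring Batch ETL sketch (assumes Spring Boot 3 / Spring Batch 5).
// All domain names here (Trade, the "trades" table, the output path) are hypothetical.
import javax.sql.DataSource;

import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.job.builder.JobBuilder;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.FileSystemResource;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class TradeEtlJobConfig {

    // Hypothetical source row.
    public record Trade(long id, String symbol, double amount) {}

    // Extract: stream rows from the relational source (e.g. Oracle) with a cursor-based reader.
    @Bean
    public JdbcCursorItemReader<Trade> tradeReader(DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<Trade>()
                .name("tradeReader")
                .dataSource(dataSource)
                .sql("SELECT id, symbol, amount FROM trades")
                .rowMapper((rs, rowNum) -> new Trade(
                        rs.getLong("id"), rs.getString("symbol"), rs.getDouble("amount")))
                .build();
    }

    // Transform: normalize symbols; returning null filters a record out of the chunk.
    @Bean
    public ItemProcessor<Trade, Trade> tradeProcessor() {
        return trade -> trade.amount() <= 0
                ? null
                : new Trade(trade.id(), trade.symbol().trim().toUpperCase(), trade.amount());
    }

    // Load: write a CSV that a downstream loader (e.g. a Snowflake COPY INTO stage) could pick up.
    @Bean
    public FlatFileItemWriter<Trade> tradeWriter() {
        return new FlatFileItemWriterBuilder<Trade>()
                .name("tradeWriter")
                .resource(new FileSystemResource("build/trades.csv"))
                .lineAggregator(t -> t.id() + "," + t.symbol() + "," + t.amount())
                .build();
    }

    // One transaction per 100-item chunk: read, process, write, commit.
    @Bean
    public Step tradeStep(JobRepository jobRepository, PlatformTransactionManager txManager,
                          JdbcCursorItemReader<Trade> reader,
                          ItemProcessor<Trade, Trade> processor,
                          FlatFileItemWriter<Trade> writer) {
        return new StepBuilder("tradeStep", jobRepository)
                .<Trade, Trade>chunk(100, txManager)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
    }

    @Bean
    public Job tradeEtlJob(JobRepository jobRepository, Step tradeStep) {
        return new JobBuilder("tradeEtlJob", jobRepository)
                .start(tradeStep)
                .build();
    }
}
```

In a setup like this, the processor bean is where the Java-side transformation logic the posting asks about would live, with the reader and writer mapping onto the listed SQL/Oracle and AWS/Snowflake touchpoints.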