Eliassen Group

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Smithfield, RI, on a contract of unspecified duration, paying $65.00 to $70.00/hr. It requires 6–9 years of ELT/ETL experience plus strong Python, SQL, and orchestration skills using Apache Airflow or Control-M.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
560 (≈ $70/hr × 8 hours, the top of the quoted hourly range)
πŸ—“οΈ - Date
May 15, 2026
πŸ•’ - Duration
Unknown
🏝️ - Location
On-site
πŸ“„ - Contract
W2 Contractor
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
Smithfield, RI
🧠 - Skills detailed
#TypeScript #Airflow #Oracle #PostgreSQL #SQL (Structured Query Language) #Databases #EC2 #Snowflake #Deployment #Data Engineering #Automation #GitHub #Data Processing #Groovy #IAM (Identity and Access Management) #Jenkins #Scripting #AWS (Amazon Web Services) #SNS (Simple Notification Service) #Cloud #ETL (Extract, Transform, Load) #Apache Airflow #Docker #S3 (Amazon Simple Storage Service) #Scala #SQS (Simple Queue Service) #Data Pipeline #Angular #Python #Golang #Lambda (AWS Lambda) #Storage #DynamoDB
Role description
Onsite in Smithfield, RI

Our client seeks a Data Engineer to design, build, and support enterprise-scale data platforms and pipelines. The role focuses on ELT/ETL across Snowflake, Oracle, and PostgreSQL, with strong Python, SQL, and orchestration using Apache Airflow or Control-M. You will drive CI/CD, containerization with Docker, and AWS services to deliver scalable, reliable data solutions.

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, a 401(k) with company matching, and life insurance.

Rate: $65.00 to $70.00/hr W2

Responsibilities
• Design, build, and maintain scalable ELT/ETL pipelines across Snowflake, Oracle, and PostgreSQL.
• Develop robust data models and transformations to support analytics and application workloads.
• Orchestrate complex data workflows using Apache Airflow and/or Control-M for reliability and efficiency (a minimal DAG sketch follows this listing).
• Implement CI/CD pipelines with GitHub and Jenkins for automated build, test, and deployment.
• Containerize and deploy data services using Docker with secure, multi-stage builds and registries.
• Leverage AWS services such as Lambda, SQS/SNS, EC2, S3, CloudWatch, and IAM for cloud-native solutions (see the Lambda sketch after this listing).
• Write performant Python and SQL for data processing, quality, and automation.
• Monitor, troubleshoot, and optimize data pipelines and platform components.

Experience Requirements
• 6–9 years designing, building, and supporting enterprise-scale data engineering or platform solutions.
• Strong hands-on experience with ELT/ETL pipelines across Snowflake, Oracle, and PostgreSQL.
• Deep understanding of data structures, relational databases, and cloud-native storage technologies.
• Proficiency in Python and SQL, with scripting experience in Shell and Groovy.
• Proven orchestration experience with Apache Airflow and/or Control-M.
• Hands-on Docker experience, including image creation, multi-stage builds, registries, and secure deployments.
• Strong CI/CD experience with GitHub and Jenkins, including automated builds, testing, and deployments.
• Solid understanding of AWS services, including Lambda, SQS/SNS, EC2, S3, CloudWatch, and IAM.
• Exposure to Angular, Node.js, and TypeScript (preferred).
• Experience with AWS DynamoDB (preferred).
• Familiarity with Go (Golang) for backend or data platforms (preferred).
• Broader experience with PostgreSQL and AWS-based architectures (preferred).

Education Requirements
• Bachelor’s degree, preferably in Engineering or Business.
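The Airflow responsibility above maps to a standard pattern of small, dependency-ordered tasks. Below is a minimal sketch of a daily ELT DAG, assuming Airflow 2.4+ (for the `schedule` argument); the DAG id, task ids, and extract/load callables are illustrative placeholders, not the client's actual pipeline.

```python
# Minimal daily ELT DAG sketch -- assumes Airflow 2.4+.
# All names (dag_id, task ids, callables) are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_source(**context):
    # Placeholder: pull rows from the source system (e.g., Oracle or PostgreSQL).
    print("extracting source rows")


def load_snowflake(**context):
    # Placeholder: stage and load the extracted rows into Snowflake.
    print("loading into Snowflake")


with DAG(
    dag_id="orders_elt",              # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,                    # do not backfill missed runs
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_source)
    load = PythonOperator(task_id="load", python_callable=load_snowflake)

    extract >> load                   # extract must finish before load starts
```

In practice the callables would resolve Oracle, PostgreSQL, and Snowflake credentials through Airflow connections rather than hard-coding them.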
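For the AWS responsibility, one common cloud-native shape is an SQS-triggered Lambda that lands each message in S3 for downstream pipelines. This is a sketch under assumptions: the bucket name, environment variable, and key layout are invented for illustration and are not details from the posting.

```python
# Sketch of an SQS-triggered Lambda handler that writes raw payloads to S3.
# RAW_BUCKET and the key layout are hypothetical, not from the job posting.
import json
import os

import boto3

s3 = boto3.client("s3")
BUCKET = os.environ.get("RAW_BUCKET", "example-raw-bucket")


def handler(event, context):
    """Persist each SQS message body as a raw JSON object in S3."""
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])      # SQS delivers the body as a string
        key = f"raw/{record['messageId']}.json"   # messageId is assigned by SQS
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload))
    return {"processed": len(records)}
```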