

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "$X - $Y" located in Merrimack, Smithfield, or Westlake. Key skills required include Oracle, Snowflake, AWS, Python, and ETL processes.
Country
United States
-
Currency
$ USD
-
Day rate
600
-
Date discovered
September 30, 2025
-
Project duration
Unknown
-
Location type
On-site
-
Contract type
Unknown
-
Security clearance
Unknown
-
Location detailed
Smithfield, RI
-
Skills detailed
#Snowflake #Security #Batch #Data Quality #Jenkins #Data Processing #AWS (Amazon Web Services) #Data Architecture #Data Pipeline #S3 (Amazon Simple Storage Service) #Artifactory #Automation #Deployment #Scala #Data Engineering #Shell Scripting #Informatica #Python #Unix #Data Governance #EC2 #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Hadoop #Spark (Apache Spark) #Big Data #Scripting #Documentation #Data Integration #AI (Artificial Intelligence) #Oracle #Databases #GIT #Data Modeling #Cloud
Role description
Responsibilities
Kforce has a client that is seeking a Data Engineer. This position can be located in Merrimack, Smithfield, or Westlake.
Summary: We are looking for a hardworking, highly motivated Data Engineer with strong expertise in analyzing existing solutions, designing new solutions, and helping team members build robust and scalable solutions using the best software design and development practices. In this role, you will be responsible for leading the engineering and development of quality software components and applications for brokerage products. You will build, modernize, and maintain Core & Common tools and Data Solutions. You will also apply and adopt a variety of cloud-native technologies for the products. In addition to building software, you will have an opportunity to help define and implement development practices, standards, and strategies.
Duties:
• Design, develop, and maintain data pipelines and ETL processes using Oracle, Snowflake, AWS Batch, and Python
• Perform system analysis, including analyzing system requirements and identifying system specifications
β’ Implement and manage data integration solutions to ensure seamless data flow across systems
β’ Develop and optimize Unix shell scripts for automation and data processing tasks
β’ Utilize AWS Batch for scheduling and executing ETL pipelines and batch processing jobs
β’ Implement CI/CD pipelines using tools like Jenkins, Git, and Artifactory to automate deployment and integration processes
β’ Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs
β’ Ensure data quality, integrity, and security across all data platforms
β’ Troubleshoot and resolve data-related issues in a timely manner
β’ Document data engineering processes and maintain comprehensive technical documentation
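The duties above center on the extract-transform-load pattern. A minimal, runnable sketch of that pattern in Python, using in-memory stand-ins for the source and target (in the role described, the extract would read from Oracle and the load would write to Snowflake; the record layout here is purely hypothetical):

```python
# Illustrative ETL sketch. The in-memory source and warehouse lists stand in
# for Oracle and Snowflake; the field names are assumptions for illustration.

def extract(source):
    """Pull raw records from the source system."""
    return list(source)

def transform(records):
    """Apply simple data-quality rules: drop rows missing an id, normalize names."""
    return [
        {"id": r["id"], "name": r["name"].strip().upper()}
        for r in records
        if r.get("id") is not None
    ]

def load(records, target):
    """Append cleaned records to the target store and report the row count."""
    target.extend(records)
    return len(records)

source = [
    {"id": 1, "name": " alice "},
    {"id": None, "name": "bob"},   # fails the data-quality rule
    {"id": 2, "name": "Carol"},
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

In a production pipeline each stage would be a separate, independently retryable step (e.g., an AWS Batch job), but the extract/transform/load separation shown here is the core structure.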
Requirements
β’ Strong experience with Oracle databases and Snowflake data warehousing
β’ Hands-on experience with AWS Data Engineering services including AWS Batch, S3, EMR, EC2
β’ Experience with big data technologies (e.g., Hadoop, Spark)
β’ Expertise in Informatica for data integration and ETL processes
β’ Solid understanding of data modeling and data architecture principles
β’ Strong SQL skills for data querying and manipulation
β’ Familiarity with CI/CD tools such as Jenkins, Git, and Artifactory
β’ Knowledge of data governance and data quality best practices
β’ Knowledge of Python for data processing and automation
β’ Solid understanding of data warehousing concepts and ETL processes
β’ Proficiency in Unix and shell scripting
β’ Excellent analytical and problem-solving skills
β’ Strong communication and collaboration skills
β’ Ability to work independently and as part of a team in a fast-paced environment
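Several of the requirements above (SQL for querying, Python, data-quality checks) can be illustrated together in a short runnable sketch. Python's built-in sqlite3 is used here only as a stand-in database; the table and business rule are hypothetical examples, not part of the posting:

```python
import sqlite3

# sqlite3 stands in for Oracle/Snowflake purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, symbol TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO trades (symbol, qty) VALUES (?, ?)",
    [("AAPL", 100), ("MSFT", -5), ("AAPL", 50)],
)

# A basic data-quality check: count rows violating an assumed business rule
# (quantity must be positive).
bad_rows = conn.execute("SELECT COUNT(*) FROM trades WHERE qty <= 0").fetchone()[0]

# An aggregation typical of warehouse-style queries.
totals = dict(
    conn.execute("SELECT symbol, SUM(qty) FROM trades WHERE qty > 0 GROUP BY symbol")
)
```

The same querying and validation patterns carry over to Oracle or Snowflake; only the driver and connection details change.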
The pay range is the lowest to highest compensation we reasonably in good faith believe we would pay at posting for this role. We may ultimately pay more or less than this range. Employee pay is based on factors like relevant education, qualifications, certifications, experience, skills, seniority, location, performance, union contract and business needs. This range may be modified in the future.
We offer comprehensive benefits including medical/dental/vision insurance, HSA, FSA, 401(k), and life, disability & AD&D insurance to eligible employees. Salaried personnel receive paid time off. Hourly employees are not eligible for paid time off unless required by law. Hourly employees on a Service Contract Act project are eligible for paid sick leave.
Note: Pay is not considered compensation until it is earned, vested and determinable. The amount and availability of any compensation remains in Kforce's sole discretion unless and until paid and may be modified in its discretion consistent with the law.
This job is not eligible for bonuses, incentives or commissions.
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.
By clicking "Apply Today" you agree to receive calls, AI-generated calls, text messages or emails from Kforce and its affiliates, and service providers. Note that if you choose to communicate with Kforce via text messaging the frequency may vary, and message and data rates may apply. Carriers are not liable for delayed or undelivered messages. You will always have the right to cease communicating via text by using key words such as STOP.