

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Alpharetta, GA, with a 6-month contract (likely extensions) at a pay rate of "TBD." Key skills include ETL (SSIS, Informatica), Python/Java, Azure/AWS, and Unix/Linux. Insurance industry experience is preferred.
Country
United States
Currency
$ USD
Day rate
496
Date discovered
September 13, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
Unknown
Security clearance
Unknown
Location detailed
Alpharetta, GA
Skills detailed
#Cloud #Azure #Scripting #Unix #GitLab #Strategy #Databricks #Data Strategy #GitHub #Datasets #Scala #Computer Science #Data Engineering #Python #Linux #Programming #Agile #C++ #SSIS (SQL Server Integration Services) #AWS (Amazon Web Services) #GIT #Informatica #Documentation #Scrum #Data Manipulation #BI (Business Intelligence) #ETL (Extract, Transform, Load) #Java #Data Pipeline #Big Data
Role description
• Cannot work with subvendors
• Local to the Alpharetta area only, please
Title: Data Engineer
Location: Alpharetta, GA
Note: Hybrid (1–5 days in office; Wednesdays required). Fully onsite for the first 2 weeks.
Duration: 6 months, with likely extensions.
About the role:
• This role supports a new Big Data/cloud-driven initiative, enabling the ingestion, transformation, and management of massive datasets across cloud and on-prem environments. The engineer will support the client's data-driven product strategy, ensuring scalable, accurate, and efficient data pipelines for business and customer-facing platforms.
• The client is a global leader in analytics and information services. The role offers the opportunity to work with cutting-edge Big Data technologies (Databricks, Azure, AWS, Informatica, SSIS). A background in insurance is highly preferred.
• The team supports internal development teams, QA, and business stakeholders building data-driven solutions and products. It offers a collaborative, innovation-driven environment and exposure to highly visible projects impacting data strategy and customer solutions.
• The role involves designing, building, and maintaining data pipelines, and enabling business intelligence and analytics through scalable, accurate data transformation and integration.
Daily breakdown:
• 40% – Designing and developing ETL/data pipelines (SSIS, Informatica, Databricks, Python, Java)
• 30% – Collaborating with dev/QA/business teams (Agile/Scrum participation, requirement gathering, solutioning)
• 20% – Troubleshooting, bug fixes, and technical issue resolution
• 10% – Documentation and keeping up with new tools/tech
Must Have
Ranked must-haves:
1. ETL/Data Engineering: SSIS, Informatica, Databricks – 5+ years
2. Programming: Python and/or Java – 3+ years
3. Cloud: Azure and/or AWS – 2–3+ years
4. Unix/Linux proficiency – 2+ years
5. Git/GitHub/GitLab experience – 1+ year
Education/Certs:
• Bachelor's degree in Computer Science, Engineering, or a related field preferred.
• Certifications in cloud (Azure/AWS), Informatica, or Databricks a plus.
Environment/Tech Ecosystem: Mix of cloud (Azure, AWS), Big Data (Databricks), ETL tools (SSIS, Informatica), modern dev practices (Agile, Git, TDD).
Work type: Primarily new data pipelines and application build; some maintenance on existing solutions.
Pluses
• Familiarity with C/C++ and additional scripting languages.
• Test-driven development knowledge.
• Prior experience in highly regulated industries (financial services, healthcare, insurance, legal tech).
Basic Functions:
• Write and review portions of detailed specifications for the development of system components of moderate complexity.
• Complete moderate to complex bug fixes.
• Work closely with other development team members, Quality Analysts, and Business to understand product requirements and translate them into software designs.
• Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
• Resolve technical issues as necessary.
• Keep abreast of new technology developments.
• Make major code design changes to new and existing products, and serve as a primary point of contact for products they develop or own.
Technical Skills:
• Proficiency with data manipulation/ETL tools: SSIS, Informatica
• Ability to work with complex data models.
• Proficiency in Java.
• Proficiency in other development languages, including but not limited to Python, C/C++, etc.
• Familiarity with Unix/Linux servers and commands
• Familiarity with industry best practices such as code coverage.
• Hands-on experience with cloud technologies: Azure/AWS.
• Hands-on experience with Databricks
• Experience working in software development methodologies (e.g., Agile, Waterfall)
• Experience working with Git (GitLab/GitHub).
• Knowledge of test-driven development.
• Ability and desire to learn new processes and technologies
• Excellent oral and written communication skills.
Accountabilities:
• Write and review portions of detailed specifications for the development of system components of moderate complexity.
• Complete moderate to complex bug fixes.
• Work closely with other development team members, Quality Analysts, and Business to understand product requirements and translate them into software designs.
• Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
• Resolve technical issues as necessary.
• Keep abreast of new technology developments.
• Make major code design changes to new and existing products, and serve as a primary point of contact for products they develop or own.