

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract position in Raleigh, NC, lasting ~14 months with a pay rate of "TBD." Key requirements include SQL, Python, and expertise in data warehousing and cloud platforms such as Snowflake and AWS.
Country: United States
Currency: $ USD
Day rate: TBD
Date discovered: August 13, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Raleigh, NC
Skills detailed: #Data Pipeline #BigQuery #Cloud #SAS #Physical Data Model #Data Processing #Data Lake #SAP #dbt (data build tool) #Snowflake #Data Integration #ETL (Extract, Transform, Load) #Scripting #Teradata SQL #Data Warehouse #DataStage #Python #SQL Server #Teradata #IICS (Informatica Intelligent Cloud Services) #GitLab #Public Cloud #Data Engineering #Oracle #AWS (Amazon Web Services) #Fivetran #EDW (Enterprise Data Warehouse) #Redshift #AWS Glue #SQL (Structured Query Language) #Data Modeling #Business Objects #SAP BODS (BusinessObjects Data Services) #Informatica Cloud #Informatica #Data Integrity #Data Architecture #Data Mart #BO (Business Objects) #Version Control
Role description
Primary Job Title:
Data Engineer
Alternate/Related Job Titles:
• ETL Developer
• Data Integration Engineer
• Cloud Data Pipeline Engineer
Location:
Raleigh, NC
Onsite Flexibility:
Onsite
Contract Details:
• Position Type: Contract
• Contract Duration: ~14 months
• Start: 08/12/2025
• End: 10/12/2026
Job Summary:
The Data Engineer is responsible for designing, building, and maintaining data pipelines that support integrations for the Enterprise Data Warehouse, Operational Data Store, Data Marts, and related platforms, in alignment with FCB-defined guidelines. This role requires deep technical expertise in data integration, data warehousing, and cloud-based platforms, along with proficiency in both legacy and modern ETL/ELT tools.
Key Responsibilities:
• Design, build, and maintain complex data pipelines for enterprise data integration (see the sketch following this list).
• Support existing SAS ETL scripts with maintenance and minor enhancements.
• Implement solutions using ETL tools such as SAP BusinessObjects Data Services (BODS), Informatica Intelligent Cloud Services (IICS), and IBM DataStage.
• Leverage ELT tools such as dbt, Fivetran, and AWS Glue for data processing.
• Develop and maintain large-scale data warehousing solutions using Oracle, Netezza, Teradata, and SQL Server.
• Work with cloud-based data platforms including Snowflake, AWS Redshift, and Google BigQuery.
• Create and manage logical and physical data models using relational or dimensional modeling practices.
• Optimize the performance of data pipelines and database objects.
• Utilize GitLab for version control and CI/CD processes.
• Troubleshoot and resolve data integrity issues.
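To give candidates a concrete feel for the pipeline work above, here is a minimal Python sketch of one extract-transform-load step. It is illustrative only: sqlite3 stands in for whichever DB-API driver the actual platform provides (Oracle, SQL Server, Snowflake, etc.), and the stg_customer table and CSV column names are hypothetical.

    import csv
    import sqlite3  # stand-in for any DB-API 2.0 driver (Oracle, SQL Server, Snowflake, etc.)

    def load_customers(csv_path: str, conn: sqlite3.Connection) -> int:
        """Extract rows from a CSV export, apply a light transform, and load
        them into a staging table. All names here are hypothetical placeholders."""
        cur = conn.cursor()
        cur.execute(
            "CREATE TABLE IF NOT EXISTS stg_customer ("
            "customer_id TEXT, full_name TEXT, state TEXT)"
        )
        rows = 0
        with open(csv_path, newline="", encoding="utf-8") as f:
            for rec in csv.DictReader(f):
                # Transform: trim whitespace and normalize the state code.
                cur.execute(
                    "INSERT INTO stg_customer VALUES (?, ?, ?)",
                    (rec["customer_id"].strip(),
                     rec["full_name"].strip(),
                     rec["state"].strip().upper()),
                )
                rows += 1
        conn.commit()
        return rows

    if __name__ == "__main__":
        connection = sqlite3.connect(":memory:")
        # loaded = load_customers("customers.csv", connection)  # path is a placeholder

Production pipelines built with the tools listed above add scheduling, incremental loads, and error handling around this same extract-transform-load core.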
Required Experience:
• Experience designing and building data warehouses and data lakes.
• Strong understanding of data warehouse principles and concepts.
• Hands-on work in large-scale data warehousing applications.
• Public cloud data platform experience (Snowflake, AWS).
Nice-to-Have Experience:
• Familiarity with Google BigQuery.
• Experience with performance tuning in high-volume ETL/ELT processes.
Required Skills:
• Expertise in SQL.
• Proficiency in at least one scripting language (e.g., Python).
• Strong knowledge of data architecture, design patterns, and modeling (see the star-schema sketch below).
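As a small illustration of the dimensional modeling knowledge this role calls for, the following sketch creates a hypothetical star-schema fragment (one dimension table and one fact table). The table and column names are invented for illustration; sqlite3 is used only so the snippet runs anywhere.

    import sqlite3  # stand-in; the DDL itself is generic, portable SQL

    # Hypothetical star-schema fragment: a fact table keyed to a dimension
    # through a surrogate key, as in typical dimensional models.
    DDL = [
        """CREATE TABLE dim_customer (
            customer_key INTEGER PRIMARY KEY,
            customer_id  TEXT NOT NULL,  -- natural key from the source system
            full_name    TEXT,
            state        TEXT
        )""",
        """CREATE TABLE fact_orders (
            order_key    INTEGER PRIMARY KEY,
            customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
            order_date   TEXT,
            order_amount REAL
        )""",
    ]

    conn = sqlite3.connect(":memory:")
    for stmt in DDL:
        conn.execute(stmt)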
Preferred Skills:
• Experience with SAS ETL.
• Proficiency with SAP BODS, IICS, and IBM DataStage.
• Experience with dbt, Fivetran, and AWS Glue.
Additional Skills:
• Logical and physical data modeling expertise.
• GitLab CI/CD workflow knowledge.
• Experience in resolving data integrity issues (a reconciliation sketch follows below).
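As a rough illustration of the data-integrity work mentioned above, this Python sketch compares row counts between a source table and its warehouse copy, a common first-pass reconciliation check. It assumes nothing about the actual platforms involved: sqlite3 again stands in for the real drivers, and the table name is a caller-supplied placeholder.

    import sqlite3  # stand-in; swap in the drivers for the actual source and target platforms

    def reconcile_row_counts(source: sqlite3.Connection,
                             target: sqlite3.Connection,
                             table: str) -> bool:
        """First-pass integrity check: compare row counts between a source
        table and its warehouse copy before digging into column-level diffs."""
        # Note: interpolating the table name is acceptable for a trusted,
        # hard-coded list of tables; it is not safe for untrusted input.
        src = source.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        tgt = target.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        if src != tgt:
            print(f"{table}: source={src}, target={tgt}, off by {abs(src - tgt)} rows")
        return src == tgt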
Benefits:
• Medical, Vision, and Dental Insurance Plans
• 401k Retirement Fund
About GTT:
GTT is a minority-owned staffing firm and a subsidiary of Chenega Corporation, a Native American-owned company in Alaska. As a Native American-owned, economically disadvantaged corporation, we highly value diverse and inclusive workplaces. Our clients are Fortune 500 banking, insurance, financial services, and technology companies, along with some of the nation's largest life sciences, biotech, utility, and retail companies across the US and Canada. We look forward to helping you land your next great career opportunity!
Job Number: 25-24673