

Data Engineer (contract)
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract position in Charlotte, NC, lasting 12 months, offering a W2 engagement. Key skills include Big Data technologies, Microsoft SQL Server, and data architecture. A bachelor's degree and prior financial institution experience are preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 11, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Big Data #Data Modeling #Cloud #Data Warehouse #RDS (Amazon Relational Database Service) #Scrum #SSRS (SQL Server Reporting Services) #Computer Science #GitHub #Deployment #SSAS (SQL Server Analysis Services) #Data Engineering #Hadoop #Jenkins #Data Integration #Spark (Apache Spark) #Data Governance #Agile #ETL (Extract, Transform, Load) #Security #SSIS (SQL Server Integration Services) #Microsoft SQL #MS SQL (Microsoft SQL Server) #Physical Data Model #SQL Server #Microsoft SQL Server #Data Architecture #SQL (Structured Query Language) #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Compliance
Role description
Title: Data Engineer
Location: 300 S Brevard St, Charlotte, NC
Duration: 12 months
Work Engagement: W2
Work Schedule: 3 days in office / 2 days remote
Benefits on offer for this contract position: health insurance, life insurance, 401(k), and voluntary benefits
Summary:
In this contingent resource assignment, you may:
• Consult on or participate in moderately complex initiatives and deliverables within Specialty Software Engineering, and contribute to large-scale planning related to Specialty Software Engineering deliverables.
• Review and analyze moderately complex Specialty Software Engineering challenges that require an in-depth evaluation of variable factors.
• Contribute to the resolution of moderately complex issues and consult with others to meet Specialty Software Engineering deliverables, leveraging a solid understanding of the function, policies, procedures, and compliance requirements.
• Collaborate with client personnel in Specialty Software Engineering.
Responsibilities:
• Part of a team that provides data integration, delivery, and support for risk reporting.
• Will use strong database knowledge (Big Data and MS SQL Server) and will be responsible for meeting application architecture specifications and creating data warehousing strategies.
• Play a role in planning, designing, implementing, and maintaining the consolidated database environment.
• Interface with systems of record, platform teams, and other database engineers to support and deliver for the shared database environment.
• Responsible for managing the underlying infrastructure needed for building and sustaining the database environment.
• Maintain existing data model queries, perform enhancements as part of an Agile delivery team, and deliver major/minor security fixes, data controls, and vulnerability remediation as required.
• Lead moderately complex initiatives and deliverables within the technical risk data domain environment.
• Design, code, test, debug, and document work efforts associated with the technology and data domain, including upgrades and deployments.
• Resolve moderately complex issues as part of a Scrum delivery team to meet existing client needs, leveraging a solid understanding of the function, policies, procedures, or compliance requirements.
• Update data sets into CARAT (Centralized Aggregated Risk and Analytics Tools) and RDS (Risk Data System) to allow the control team to conduct further analytics and reporting.
Qualifications:
• Applicants must be authorized to work for ANY employer in the U.S. This position is not eligible for visa sponsorship.
• Bachelor's degree in Computer Science, Electrical and Computer Engineering, Electronic Engineering, Information Systems, Information Technology, or a related technical discipline.
• Proven experience in data engineering, with a strong understanding of data architecture and pipeline development.
• Hands-on experience with Big Data technologies, including Hadoop, Spark, Hive, and/or AWS data services.
• Proficiency in Microsoft SQL Server and related tools such as SSIS, SSRS, and SSAS.
• Strong expertise in SQL, ETL processes, and data modeling.
• Solid understanding of data integration and ETL data load processes.
• Demonstrated experience in the end-to-end design and delivery of data warehouse solutions.
• Familiarity with Agile methodologies and participation in Scrum ceremonies.
• Knowledge of logical and physical data modeling techniques.
• Google Cloud Platform (GCP) certification or other advanced cloud certifications (preferred).
• Experience with data governance, risk management, and compliance frameworks (preferred).
• Working knowledge of Confluence, Jenkins, and GitHub.
• Prior experience working within a large financial institution is highly desirable.