

Data Integration Lead/Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Integration Lead/Architect on a contract basis for over 6 months, offering $68.10 - $75.00 per hour. Key requirements include 7+ years in Data Engineering, ETL/ELT tools, SQL Server, and experience with Cloud Data Lakes and real-time pipelines.
Country
United States
Currency
$ USD
Day rate
600
Date discovered
September 20, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Austin, TX
Skills detailed
#Kafka (Apache Kafka) #Data Lakehouse #Cloud #ETL (Extract, Transform, Load) #RDBMS (Relational Database Management System) #Data Lake #SSIS (SQL Server Integration Services) #BI (Business Intelligence) #AWS Lambda #Documentation #Databases #Data Analysis #SQL (Structured Query Language) #Data Integration #Scala #Python #Microsoft Power BI #Data Warehouse #AWS (Amazon Web Services) #Compliance #Batch #Data Engineering #Kubernetes #Snowflake #API (Application Programming Interface) #Data Processing #Monitoring #Data Quality #Security #SQL Server #Strategy #Data Governance #Data Pipeline #Lambda (AWS Lambda)
Role description
We are seeking a highly experienced Senior Data Engineer / Data Integration Architect to design, implement, and manage enterprise-level data integration solutions. This role is critical to building scalable, reliable, and secure data pipelines that support both batch and real-time data processing needs. You will collaborate with business teams, data analysts, and developers to deliver robust, high-performance data integration solutions.
Responsibilities:
Develop data integration strategy, architecture, and roadmap (e.g., batch vs. real-time, API vs. ETL).
Maintain documentation for data flows, mappings, and transformation logic across all workloads.
Ensure performance, scalability, and reliability of data movement processes.
Implement data validation, cleansing, and reconciliation processes to ensure data quality.
Optimize data workflows for performance, cost-efficiency, and scalability.
Define and enforce data integration standards and best practices.
Design and develop data pipelines for internal and external data from diverse sources.
Build API-based integrations, data sharing, and transformation solutions.
Implement data integration monitoring systems, including logs, error handling, and alert mechanisms.
Ensure compliance with data governance, security, and regulatory requirements (PII, HIPAA, etc.).
Communicate effectively with business stakeholders, presenting solutions and progress updates.
Required Skills & Qualifications:
7+ years of experience with Data Warehouse design and development.
7+ years in Data Engineering and Data Integration Architecture.
Hands-on expertise with ETL/ELT tools and SQL Server (or other RDBMS).
Proven experience developing scalable data pipelines.
Strong background in data quality frameworks, QA/QC implementation.
5+ years working with SSIS and Cloud Data Lakes (e.g., Snowflake).
5+ years with BI Tools (Power BI preferred).
Experience with API-based integrations (MuleSoft preferred).
3+ years working with real-time pipelines (Kafka, Python).
Strong problem-solving, communication, and collaboration skills.
Preferred Skills:
Experience with Data Lakehouse Architecture.
Knowledge of data handling compliance (PII, HIPAA).
Experience with Salesforce, AWS Lambda, Kubernetes, and MS Access databases.
Familiarity with Microsoft 365 suite.
Job Types: Full-time, Contract
Pay: $68.10 - $75.00 per hour
Expected hours: 37.5 per week
Application Question(s):
Two professional references are required for the interview process.
Experience:
Data Warehouse: 8 years (Preferred)
Data Engineering: 8 years (Preferred)
Data integration architecture: 8 years (Preferred)
ETL / ELT tool: 8 years (Preferred)
SQL Server or other RDBMS: 8 years (Preferred)
Developing Data pipelines: 8 years (Preferred)
Data quality framework, QA/QC: 8 years (Preferred)
Cloud data lake and SSIS: 5 years (Preferred)
BI tools (e.g., Power BI): 5 years (Preferred)
API (e.g., MuleSoft): 5 years (Preferred)
Real-time pipelines (Kafka, Python): 3 years (Preferred)
Work Location: On the road