

Motion Recruitment
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 24-month contract, paying competitively, located in Charlotte, NC or Irving, TX (Hybrid). Requires 5+ years in Software Engineering, expertise in ETL/ELT workflows, data governance, and proficiency with GCP and Apache Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
April 15, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Metadata #GCP (Google Cloud Platform) #Data Quality #Batch #Data Engineering #Scala #Airflow #Data Management #Data Governance #Delta Lake #PySpark #Cloud #Spark (Apache Spark) #Consulting #ETL (Extract, Transform, Load) #Apache Airflow #Compliance #Datasets #Data Pipeline
Role description
Outstanding long-term contract opportunity! A well-known Financial Services Company is looking for a Data Pipeline Engineer in Charlotte, NC or Irving, TX (Hybrid).
Work with the brightest minds at one of the largest financial institutions in the world. This is a long-term contract opportunity that includes a competitive benefit package! Our client has been around for over 150 years and is continuously innovating in today's digital age. If you want to work for a company that is not only a household name, but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.
Contract Duration: 24 Months
Required Skills & Experience
• 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education
• Ability to design and develop ETL/ELT workflows and data pipelines for batch and real-time processing (experience on GCP or any other cloud platform is acceptable, e.g., using PySpark).
• Build and maintain data pipelines for reporting and downstream applications using open source frameworks and cloud technologies
• Implement operational and analytical data stores leveraging Delta Lake and modern database concepts (BigQuery or similar); a minimal sketch follows this list.
• Optimize data structures for performance and scalability across large datasets.
• Apply best practices for data governance, lineage tracking, and metadata management, including integration with Google Dataplex for centralized governance and data quality enforcement.
• Develop, schedule, and orchestrate complex workflows using Apache Airflow, with strong proficiency in designing and managing Airflow DAGs.
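As a rough illustration of the kind of pipeline work described above, here is a minimal PySpark batch ETL sketch that lands data in a Delta table. It is a sketch only: the bucket paths, column names, and table layout are hypothetical, and it assumes the delta-spark package is available on the cluster.
```python
# Minimal batch ETL sketch: raw CSV -> cleansed Delta table.
# All paths, columns, and names are hypothetical; assumes delta-spark is installed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("batch-etl-sketch")
    # Enable Delta Lake support on a plain Spark session.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Extract: read raw files from object storage (hypothetical GCS path).
raw = spark.read.option("header", True).csv("gs://example-bucket/raw/transactions/")

# Transform: enforce types and drop unusable rows.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("txn_date", F.to_date("txn_date"))
       .filter(F.col("amount").isNotNull())
)

# Load: write a partitioned Delta table for downstream reporting.
(clean.write.format("delta")
      .mode("overwrite")
      .partitionBy("txn_date")
      .save("gs://example-bucket/curated/transactions"))
```
On GCP the same transform could just as well land in BigQuery via the Spark BigQuery connector; Delta is shown here only because the posting calls it out.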
What You Will Be Doing
• Consult on complex initiatives with broad impact and large-scale planning for Software Engineering.
• Review and analyze complex, multi-faceted, larger-scale, or longer-term Software Engineering challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors.
• Contribute to the resolution of complex and multi-faceted situations requiring solid understanding of the function, policies, procedures, and compliance requirements that meet deliverables.
• Strategically collaborate and consult with client personnel.
• Design and develop ETL/ELT workflows and data pipelines for batch and real-time processing.
• Build and maintain data pipelines for reporting and downstream applications using open-source frameworks and cloud technologies.
• Implement operational and analytical data stores leveraging Delta Lake and modern database concepts.
• Optimize data structures for performance and scalability across large datasets.
• Collaborate with architects and engineering teams to ensure alignment with target-state architecture.
• Apply best practices for data governance, lineage tracking, and metadata management, including integration with Google Dataplex for centralized governance and data quality enforcement.
• Develop, schedule, and orchestrate complex workflows using Apache Airflow, with strong proficiency in designing and managing Airflow DAGs (a minimal DAG sketch follows this list).
• Troubleshoot and resolve issues in data pipelines and ensure high availability and reliability.
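As a rough sketch of the Airflow DAG design mentioned above, here is a minimal three-task pipeline. Task bodies, IDs, and the schedule are hypothetical placeholders, written against the Airflow 2.x API.
```python
# Minimal Airflow 2.x DAG sketch: extract -> transform -> load, daily.
# Task bodies, IDs, and the schedule are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw data from the source system (placeholder)."""


def transform():
    """Cleanse and reshape the extracted data (placeholder)."""


def load():
    """Publish curated data for reporting (placeholder)."""


with DAG(
    dag_id="daily_reporting_pipeline",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    # Retries with a delay are a common reliability baseline for pipelines.
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Declare dependencies: extract must finish before transform, then load.
    t_extract >> t_transform >> t_load
```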
Posted By: Rachel LeClair






