

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 4–6 years of experience, focusing on ETL, Snowflake, SQL, and AWS in the financial services industry. It is a remote contract position of up to one year; the pay rate is not disclosed.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Not specified
🗓️ - Date discovered
June 17, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Version Control #Redshift #Schema Design #Lambda (AWS Lambda) #Snowflake #Complex Queries #Data Governance #Documentation #SQL (Structured Query Language) #Scala #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Data Quality #Git #Data Ingestion #Consulting #ETL (Extract, Transform, Load) #Compliance #Data Engineering #Data Pipeline #Data Analysis #Security #Airflow
Role description
Location: Remote (US-based)
Duration: Up to 1-Year Contract (via Consultancy)
Technologies: ETL, Snowflake, SQL, AWS
Industry: Financial Services / Payments
About the Role:
We are seeking a skilled Data Engineer to support ongoing data infrastructure initiatives for a leading financial payments platform. This is a remote, US-based contract opportunity (up to one year) delivered through a consultancy, ideal for mid-level data engineers looking to grow their experience in a high-impact, regulated environment. The role will focus on building and optimizing data pipelines across a modern data stack with an emphasis on Snowflake and AWS.
Key Responsibilities:
• Design, develop, and maintain robust ETL/ELT pipelines to support data ingestion, transformation, and delivery.
• Work with large-scale data sets in Snowflake and optimize performance and scalability of SQL-based queries and data models.
• Collaborate with data analysts, engineers, and business stakeholders to ensure data solutions meet financial reporting and compliance requirements.
• Integrate data from various sources (structured and unstructured) using AWS-native services and third-party tools.
• Monitor, troubleshoot, and improve data workflows, ensuring data quality, consistency, and reliability across systems.
• Contribute to documentation and best practices for scalable data engineering processes.
Qualifications:
• 4–6 years of hands-on experience in data engineering or a related field.
• Proficient in developing ETL pipelines using modern tools and techniques.
• Strong experience with Snowflake, including performance tuning and schema design.
• Proficient in SQL with the ability to write complex queries and optimize them for performance.
• Hands-on experience with AWS services (e.g., S3, Glue, Lambda, Redshift).
• Understanding of data governance, privacy, and security in a financial or regulated environment.
Preferred Qualifications:
• Prior experience working in or with financial services or payments platforms.
• Familiarity with CI/CD workflows, orchestration tools (e.g., Airflow), and version control (Git).
• Experience working in a consulting or client-facing environment.