
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer (Contract) for a prominent FinTech client, offering $110.00 - $120.00 per hour. Requires 8+ years in software engineering, 5+ years in AWS data pipelines, and expertise in PySpark, Terraform, and Airflow. Remote work location.
Country: United States
Currency: $ USD
Day rate: 960
Date discovered: August 27, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Remote
Skills detailed: #Databases #Computer Science #SQL (Structured Query Language) #AWS Glue #Cloud #Scala #Data Quality #Debugging #Data Engineering #Spark (Apache Spark) #Airflow #Data Pipeline #Observability #Security #Terraform #PySpark #NoSQL #AWS (Amazon Web Services)
Role description
Data Engineer (Contract) at Invu Technology
Invu Technology Inc is a respected boutique technology services firm based in Silicon Valley, with two decades of experience delivering top-tier services to leading startups and corporate clients. Our team members have gained invaluable experience working on high-profile, mission-critical projects for companies such as American Express, Cisco, Autodesk, Boku, Convoke Systems, Rapid Recon, and Billd.
We are currently seeking a highly skilled Data Engineer for a contract role to join an extensive engagement with one of our key clients, a prominent player in the FinTech industry.
Key Responsibilities
• Play a pivotal role in building robust data pipelines with a strong focus on performance, reliability, and data quality.
• Collaborate within a team environment to construct data pipelines within an AWS ecosystem.
• Ensure the flexibility, scalability, and robust security of data pipelines.
• Ensure the quality and observability of data pipelines.
• Utilize tools and frameworks including PySpark, Terraform, Airflow, and AWS Glue to build and manage scalable data solutions.
• Engage closely with fellow team members and technology leaders to drive project success.
Required Experience:
• 8+ years of software engineering experience.
• 5+ years of hands-on data engineering experience constructing data pipelines within AWS.
• Extensive familiarity with building data pipelines using PySpark, Terraform, Airflow, and AWS Glue, as well as SQL and NoSQL databases.
• Experience using Terraform to provision cloud environments for the development and execution of data pipelines.
• Experience with debugging and observability tools that ensure robust diagnostics and quality levels.
• Demonstrated expertise in best practices for team collaboration and engineering.
• Strong problem-solving abilities and meticulous attention to detail.
• Excellent communication and collaboration skills.
Additional Requirements:
• Bachelor's degree in Computer Science or a related field.
• Must be based in the US and possess authorization to work in the US.
Job Type: Contract
Pay: $110.00 - $120.00 per hour
Application Question(s):
Do you have PySpark experience?
Work Location: Remote