

Invu Technology
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a contract of unspecified length, offering a pay rate of $90.00 - $100.00 per hour. It requires 5+ years in software engineering, 3+ years constructing AWS data pipelines, and a Bachelor's degree in Computer Science. Hybrid work is based in San Francisco, CA.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
800
-
🗓️ - Date
March 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Francisco, CA 94111
-
🧠 - Skills detailed
#Data Quality #S3 (Amazon Simple Storage Service) #Redshift #Security #Spark (Apache Spark) #Scala #Computer Science #NoSQL #Airflow #Observability #Databases #Java #Hadoop #AWS (Amazon Web Services) #Data Engineering #SQL (Structured Query Language) #Python #Terraform #Data Pipeline
Role description
About the job
Invu Technology Inc is a respected boutique technology services firm based in Silicon Valley, with two decades of experience delivering top-tier services to leading startups and corporate clients. Our team members have gained invaluable experience working on high-profile, mission-critical projects for companies such as American Express, Cisco, Autodesk, Boku, Convoke Systems, Rapid Recon, and Billd.
We are currently seeking a highly skilled Data Engineer to join an extensive engagement with one of our key clients, a prominent player in the FinTech industry.
Key Responsibilities:
This individual will play a pivotal role in building robust data pipelines with a strong focus on performance, reliability, and data quality.
Collaborate within a team environment to construct data pipelines within an AWS ecosystem.
Ensure the flexibility, scalability, and robust security of data pipelines.
Ensure quality levels and observability of data pipelines.
Engage closely with fellow team members and technology leaders to drive project success.
Required Experience:
5+ years of software engineering experience.
3+ years of hands-on data engineering experience constructing data pipelines within AWS.
Extensive familiarity with building data pipelines using Java, Python, Redshift, S3, Airflow, Glue, and Terraform, as well as SQL and NoSQL databases.
Experience with Hadoop, Spark, and other data tools.
Demonstrated expertise in best practices for team collaboration and engineering.
Strong problem-solving abilities and meticulous attention to detail.
Excellent communication and collaboration skills.
Additional Requirements:
Bachelor's degree in Computer Science or a related field.
Must be based in the US and possess authorization to work in the US.
Position is hybrid, working 3 days a week in the downtown San Francisco office.
Pay: $90.00 - $100.00 per hour
Application Question(s):
Are you able to work onsite in San Francisco 3 days per week?
How many years of experience do you have building data pipelines in AWS?
How many years of professional experience do you have with Python or Java?
Work Location: Hybrid remote in San Francisco, CA 94111