Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer (Contract) for a prominent FinTech client, offering $110.00 - $120.00 per hour. Requires 8+ years in software engineering, 5+ years in AWS data pipelines, and expertise in PySpark, Terraform, and Airflow. Remote work location.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
960
-
πŸ—“οΈ - Date discovered
August 27, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Remote
-
🧠 - Skills detailed
#Databases #Computer Science #SQL (Structured Query Language) #AWS Glue #Cloud #Scala #Data Quality #Debugging #Data Engineering #Spark (Apache Spark) #Airflow #Data Pipeline #Observability #Security #Terraform #PySpark #NoSQL #AWS (Amazon Web Services)
Role description
Data Engineer (Contract) at Invu Technology

Invu Technology Inc is a respected boutique technology services firm based in Silicon Valley, with two decades of excellence in delivering top-tier services to leading startups and corporate clients. Our team members have gained invaluable experience working on high-profile, mission-critical projects for esteemed companies such as American Express, Cisco, Autodesk, Boku, Convoke Systems, Rapid Recon, and Billd. We are currently seeking a highly skilled Data Engineer for a contract role on an extended engagement with one of our key clients, a prominent player in the FinTech industry.

Key Responsibilities
● Play a pivotal role in building robust data pipelines with a strong focus on performance, reliability, and data quality.
● Collaborate within a team environment to construct data pipelines in an AWS ecosystem.
● Ensure the flexibility, scalability, and robust security of data pipelines.
● Ensure quality levels and observability of data pipelines.
● Use tools and frameworks including PySpark, Terraform, Airflow, and AWS Glue to build and manage scalable data solutions.
● Engage closely with fellow team members and technology leaders to drive project success.

Required Experience:
● 8+ years of software engineering experience.
● 5+ years of hands-on data engineering experience building data pipelines in AWS.
● Extensive familiarity with building data pipelines using PySpark, Terraform, Airflow, and AWS Glue, as well as SQL and NoSQL databases.
● Experience using Terraform to provision cloud environments for developing and running data pipelines.
● Experience with debugging and observability tools that ensure robust diagnostics and quality levels.
● Demonstrated expertise in best practices for team collaboration and engineering.
● Strong problem-solving abilities and meticulous attention to detail.
● Excellent communication and collaboration skills.
Additional Requirements:
● Bachelor's degree in Computer Science or a related field.
● Must be based in the US and possess authorization to work in the US.

Job Type: Contract
Pay: $110.00 - $120.00 per hour
Application Question(s): Do you have PySpark experience?
Work Location: Remote