Senior Data Engineer – Quantitative Data Pipelines

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer focused on quantitative data pipelines, offering a contract of more than 6 months at a pay rate of $90,000 - $144,000 annually. Key skills include Databricks, AWS, Python, and SQL, with 7+ years of experience required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
654.55
-
🗓️ - Date discovered
June 4, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #Data Manipulation #Data Quality #Data Engineering #dbt (data build tool) #Data Pipeline #Scala #ETL (Extract, Transform, Load) #Python #Data Architecture #Cloud #Datasets #Libraries #Data Science #Airflow #Apache Airflow #Documentation #Pandas #SQL (Structured Query Language) #ML (Machine Learning) #Databricks
Role description
What we're working on: We help global enterprises launch digital products that reach millions of users. Our projects involve massive datasets, complex pipelines, and real-world impact across industries. We're seeking a Data Engineer to join our remote team and help us solve meaningful problems with clean, scalable data solutions.
What You'll Do
• Design, develop, and maintain efficient ETL/ELT data pipelines using Databricks, AWS, dbt, and Apache Airflow
• Transform and prepare data for quantitative analysis and machine learning applications
• Collaborate directly with data scientists and quant analysts to understand their data requirements
• Implement data quality checks and validation processes to ensure accuracy of financial data
• Optimize data models and queries for improved performance
• Document data flows, transformations, and technical specifications
• Support on-the-fly development requests in a fast-paced environment
• Troubleshoot and resolve data pipeline issues independently
What We're Looking For
• 7+ years of experience as a Data Engineer, with at least 2 years focusing on the Databricks ecosystem
• Strong proficiency in Python, including data science libraries like Pandas
• Proven experience with AWS cloud services and data architecture
• Experience with dbt for data transformation and documentation
• Familiarity with Apache Airflow for workflow orchestration
• Strong SQL skills for data manipulation and analysis
• Excellent communication skills, with the ability to collaborate directly with end users
• Self-starter capable of working independently with minimal supervision
• Ability to work in a fast-paced environment with changing requirements
The job is 100% remote; please ensure you have a comfortable office setup at your desired work location. Ongoing recruitment – no set deadline.
Salary
Salary range: $90,000 - $144,000 annually, with final compensation determined by your qualifications, expertise, experience, and the role's scope.
In addition to competitive pay, we offer a variety of benefits to support your professional and personal growth, including:
• Flexible working hours in a remote environment
• Health insurance (medical and dental) for W2 employees
• 401(k) contribution
• A professional development fund to enhance your skills and knowledge
• 15 days of paid time off annually
• Access to soft-skill development courses to further your career
This is a full-time position requiring a minimum of 40 hours per week, Monday through Friday. At Lumenalta, we are committed to creating an environment that prioritizes growth, work-life balance, and the diverse needs of our team members.