

Lead Data Python Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a "Lead Data Python Engineer – Capital Markets" in Irvine, CA or Los Angeles, CA (Hybrid – 3 Days/Week Onsite), a contract of unspecified duration offering $70–$75/hr. It requires 5–8+ years of data engineering experience; strong expertise in Python, PySpark, Hive, SQL, and AWS; and capital markets experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
1099 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Los Angeles, CA
-
🧠 - Skills detailed
#Security #Spark (Apache Spark) #Data Storage #Data Processing #Consulting #AWS (Amazon Web Services) #Compliance #ETL (Extract, Transform, Load) #React #ADaM (Analysis Data Model) #Cloud #SQL (Structured Query Language) #Python #PySpark #Data Pipeline #Scala #Java #Data Governance #Data Engineering #Angular #Storage
Role description
Hello,
We have 3 urgent openings for a "Lead Data Python Engineer – Capital Markets". These positions are based in Irvine, CA or Los Angeles, CA (Hybrid – 3 Days/Week Onsite).
Job Title: Lead Data Python Engineer – Capital Markets
Location: Irvine, CA or Los Angeles, CA (Hybrid – 3 Days/Week Onsite)
Client: Sapient / Capital Group
Rate: $70–$75/hr (W2 or 1099)
Job Description:
We are seeking a highly skilled Data Python Engineer with strong Capital Markets / Financial Services experience to join our client team at Capital Group. The ideal candidate will have deep expertise in Python, PySpark, Hive, and SQL, combined with strong data engineering skills on AWS cloud platforms.
This role requires working onsite 3 days a week in either Irvine, CA or Los Angeles, CA, collaborating closely with stakeholders to build scalable data pipelines and solutions that drive insights and efficiency within the financial domain.
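For illustration only (this sketch is not part of the client's description), the following shows the kind of PySpark pipeline the role describes: reading a Hive table of trades, computing daily notional volume per instrument, and writing partitioned Parquet to S3. The table name, columns, and S3 bucket are hypothetical placeholders.

# Illustrative sketch only: a minimal PySpark pipeline of the kind described above.
# Table name, columns, and the S3 bucket are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

def main():
    # Enable Hive support so spark.table() can resolve Hive-managed tables.
    spark = (
        SparkSession.builder
        .appName("capital-markets-trades-pipeline")
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read raw trade records from a (hypothetical) Hive table.
    trades = spark.table("raw.trades")

    # Basic cleansing and enrichment: drop non-positive quantities, compute
    # notional value, and aggregate daily volume per instrument.
    daily_volume = (
        trades
        .filter(F.col("quantity") > 0)
        .withColumn("notional", F.col("quantity") * F.col("price"))
        .groupBy("trade_date", "instrument_id")
        .agg(
            F.sum("notional").alias("total_notional"),
            F.count("*").alias("trade_count"),
        )
    )

    # Write the result to S3 as partitioned Parquet for downstream analytics.
    (
        daily_volume.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("s3a://example-bucket/curated/daily_volume/")
    )

    spark.stop()

if __name__ == "__main__":
    main()

Enabling Hive support on the SparkSession is what lets spark.table() resolve Hive-managed tables; on AWS such a job would typically run on EMR or Glue, often with the metastore backed by the Glue Data Catalog.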
Key Responsibilities:
• Design, build, and maintain robust data pipelines using Python, PySpark, Hive, and SQL.
• Work on large-scale data processing and transformation in AWS cloud environments.
• Collaborate with Capital Markets teams to deliver high-performance data solutions supporting trading, risk, and financial analytics.
• Ensure compliance with data governance, quality, and security standards.
• Partner with cross-functional teams to align technical solutions with business goals.
Required Skills & Experience:
• 5–8+ years of professional experience in data engineering and financial services.
• Strong hands-on expertise in Python, PySpark, Hive, and SQL.
• Solid background working with capital markets or financial domain data.
• Experience with AWS services for data storage, processing, and orchestration.
• Strong problem-solving skills with the ability to work in a fast-paced environment.
• Excellent communication and collaboration abilities.
ABOUT US:
Anagh Technologies is a technical consulting firm specializing in UI, Front-End, and Full-Stack web technologies. We currently have 30+ positions in Angular, React, Node, and Java.
If you are technically strong, we can get you an offer within 2 weeks at most, as we will consider you for multiple roles at once. If you are interested and available, please email your resume and contact information to adam.h AT anaghtech.com. Thank you for your time.