

Amtex Systems Inc.
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in New York City, NY, focused on Alternative Data ingestion. The contract length is unspecified, with a pay rate of "$XX/hour." It requires 15+ years at US financial investment firms, expert-level Python and SQL, and dbt proficiency.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Datasets #Python #Data Engineering #Pandas #Spark (Apache Spark) #API (Application Programming Interface) #Data Science #PySpark #Snowflake #SQL (Structured Query Language) #AWS S3 (Amazon Simple Storage Service) #Data Pipeline #dbt (data build tool) #NumPy #Cloud #Data Ingestion #AWS (Amazon Web Services) #Data Quality #Airflow #ETL (Extract, Transform, Load) #Storage #Libraries #Web Scraping #SciPy
Role description
Job Title: Senior Data Engineer / Data Ingestion Expert (Alternative Data)
Location: New York City, NY (Park Avenue & 55th/56th St)
Work Model: 4 Days On-site / 1 Day Remote (Virtual Machine-based environment)
Department: Research
Interview Process
1. Screening: Two 45-minute video interviews.
2. Final: A 2.5-hour in-person interview in NYC (intensive live coding and architectural scenarios).
Eligibility Requirements (Strict)
• Language: Clear, professional English communication.
• Tenure: Minimum 10 years of professional work experience within the USA.
Role Overview
A premier $9B AUM Hedge Fund is seeking an elite Senior Data Engineer to join our Research Department. This role is focused on the end-to-end ingestion and integration of Alternative Data at scale (hundreds of TBs to Petabytes). You will not just execute; you will be expected to provide architectural suggestions, improve existing workflows, and own the data pipeline from raw ingestion to downstream analytics.
Technical Requirements
• Core Languages: Expert-level Python and SQL.
• Data Transformation: Extensive experience with dbt (Data Build Tool) is critical for building modular, testable transformations.
• Infrastructure & Tools:
• Snowflake: Expertise in Snowflake, including vendor data shares and data loading.
• Cloud/Storage: AWS S3, SFTP servers, and robust API integration experience.
• Orchestration: Experience with Prefect (or similar tools such as Airflow or Dagster); an illustrative flow sketch follows this list.
• Python Stack: Deep knowledge of NumPy, SciPy, and Pandas. Familiarity with high-performance libraries such as Polars, Dask, or PySpark is highly preferred.
• Environment: Experience working within Windows-based Virtual Machines (VMs).
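By way of illustration only, the sketch below shows one minimal way this stack might fit together: a Prefect flow that pulls a vendor file from S3 and copies it into a raw Snowflake table. Every name in it (bucket, key, table, warehouse, environment variables) is a placeholder assumption, not a detail of this role.

```python
# Illustrative sketch only: a minimal Prefect flow that stages a vendor file
# from S3 and COPYs it into a raw Snowflake table. All names below are
# placeholder assumptions, not the fund's actual setup.
import os

import boto3
import snowflake.connector
from prefect import flow, task


@task(retries=2, retry_delay_seconds=60)
def download_from_s3(bucket: str, key: str, local_path: str) -> str:
    """Pull the raw vendor file from S3 to local storage on the VM."""
    boto3.client("s3").download_file(bucket, key, local_path)
    return local_path


@task
def load_into_snowflake(local_path: str, table: str) -> None:
    """Upload the file to the table stage and COPY it into the raw table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],    # assumed env vars; a real
        user=os.environ["SNOWFLAKE_USER"],          # setup might use a secrets
        password=os.environ["SNOWFLAKE_PASSWORD"],  # manager or key-pair auth
        warehouse="INGEST_WH",
        database="RAW",
        schema="ALT_DATA",
    )
    try:
        cur = conn.cursor()
        # Path handling simplified for the sketch.
        cur.execute(f"PUT file://{os.path.abspath(local_path)} @%{table}")
        cur.execute(
            f"COPY INTO {table} FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


@flow(name="vendor-ingest")
def ingest_vendor_file(bucket: str = "alt-data-bucket",
                       key: str = "vendor/transactions.csv") -> None:
    path = download_from_s3(bucket, key, "vendor_transactions.csv")
    load_into_snowflake(path, "VENDOR_TRANSACTIONS_RAW")


if __name__ == "__main__":
    ingest_vendor_file()
```

A dbt project would then typically layer modular, tested transformations on top of such a raw table before it reaches downstream analytics.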
Key Responsibilities
• Data Ingestion: Build ETL processes to ingest complex Alternative Data (Credit Card transactions, Geolocation, Satellite imagery, Web scraping, etc.) from APIs and S3 into Snowflake.
• Analytics Integration: Integrate raw data into internal Python-based analytics systems and SQL processes.
• System Optimization: Evaluate an imperfect legacy system and proactively suggest and implement improvements for data quality and alerting (see the illustrative check sketch after this list).
• End-to-End Ownership: Monitor and maintain the pipelines you build. This includes early-morning troubleshooting (as early as 6:00 AM) to ensure data availability for the trading day.
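Purely as an illustration of the data-quality and morning-monitoring responsibilities above, the sketch below flags empty or stale raw tables and sends a plain email alert. The table names, load-timestamp column, lag threshold, and SMTP host are all assumptions.

```python
# Illustrative sketch only: a simple morning freshness/row-count check with a
# basic email alert. Table names, thresholds, and the SMTP host are assumptions.
import os
import smtplib
from email.message import EmailMessage

import snowflake.connector

# Hypothetical raw tables and their load-timestamp columns.
TABLES_TO_CHECK = {
    "RAW.ALT_DATA.VENDOR_TRANSACTIONS_RAW": "LOADED_AT",
    "RAW.ALT_DATA.GEOLOCATION_RAW": "LOADED_AT",
}
MAX_LAG_HOURS = 24


def find_issues(conn) -> list[str]:
    """Return human-readable problems: empty tables or data older than the threshold."""
    issues = []
    cur = conn.cursor()
    for table, ts_col in TABLES_TO_CHECK.items():
        cur.execute(
            f"SELECT COUNT(*), DATEDIFF(hour, MAX({ts_col}), CURRENT_TIMESTAMP()) "
            f"FROM {table}"
        )
        row_count, lag_hours = cur.fetchone()
        if row_count == 0:
            issues.append(f"{table}: no rows loaded")
        elif lag_hours is not None and lag_hours > MAX_LAG_HOURS:
            issues.append(f"{table}: latest data is {lag_hours}h old (limit {MAX_LAG_HOURS}h)")
    return issues


def send_alert(issues: list[str]) -> None:
    """Email the on-call engineer; a real setup might page or post to Slack instead."""
    msg = EmailMessage()
    msg["Subject"] = "Alt-data pipeline check failed"
    msg["From"] = "pipelines@example.com"
    msg["To"] = "data-eng@example.com"
    msg.set_content("\n".join(issues))
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        problems = find_issues(conn)
        if problems:
            send_alert(problems)
    finally:
        conn.close()
```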
Qualifications & Candidate Profile
• Experience: 15+ years of experience at US financial investment companies (Hedge Funds or Asset Managers preferred).
• Alternative Data Expertise: Deep familiarity with non-market data sources and how they are used to generate investment signals.
• Communication: Must possess excellent communication skills to bridge the gap between Data Scientists, Engineers, and Business Stakeholders.
• Maturity: High sense of "ownership"—you build it, you run it, you fix it.
• Efficiency: Ability to work in a high-pressure environment with rapid turnaround times (e.g., processing 10+ datasets in a 2-week sprint).