

Sr ETL Data Engineer: W2 Role
Featured Role | Apply direct with Data Freelance Hub
This role is for a "Sr ETL Data Engineer" with a contract length of "unknown" and a pay rate of "$60.00 to $70.00 per hour." Located in "NYC: Hybrid," it requires 8+ years of experience in data engineering, strong SQL skills, and proficiency in cloud platforms.
Country: United States
Currency: $ USD
Day rate: 560
Date discovered: July 13, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: New York, NY
Skills detailed: #Data Access #Agile #Data Engineering #Data Pipeline #Datasets #Airflow #Python #SQL (Structured Query Language) #AI (Artificial Intelligence) #Storage #Azure #Observability #Security #MySQL #ETL (Extract, Transform, Load) #dbt (data build tool) #Terraform #Data Security #Data Warehouse #Luigi #Data Analysis #Databases #Data Integration #Data Science #BI (Business Intelligence) #Scripting #ML (Machine Learning) #Scala #Kubernetes #Redshift #PostgreSQL #Data Lake #Data Quality #Monitoring #BigQuery #Snowflake #Compliance #GIT #AWS (Amazon Web Services) #Docker #Database Schema #Cloud
Role description
Role: Sr ETL Data Engineer
Location: NYC: Hybrid
U.S. citizens (USC) and Green Card (GC) holders can apply
About The Role
You will be responsible for architecting and operating scalable, reliable data infrastructure that supports both operational and analytical workloads. This includes managing ETL pipelines, optimizing storage and access patterns, and supporting structured data modelling for use by analysts, data scientists, and application developers.
Responsibilities
• Design, build, and maintain robust data infrastructure to support analytics, reporting, and sustainability workflows
• Own the architecture and administration of relational and cloud-native databases (e.g., PostgreSQL, Snowflake, Redshift, MySQL)
• Build and manage ETL/ELT pipelines for ingesting and transforming data across systems and third-party sources
• Optimize database schemas, indexes, partitioning strategies, and storage for both query performance and cost-efficiency
• Enable and support analytics platforms by providing clean, well-documented, queryable datasets for downstream use (BI, dashboards, AI/ML)
• Implement and monitor data quality, governance, and access control policies
• Collaborate with data analysts, ESG specialists, and application developers to streamline data access and analytical readiness
• Automate and scale platform operations using infrastructure-as-code, containerization, and cloud-native orchestration tools
• Establish and manage data observability, pipeline monitoring, and alerting for reliability and integrity
• Support ESG data integration for internal and external sustainability disclosures, investment analysis, and regulatory reporting
Qualifications
• 8+ years of experience (or 5+ with exceptional expertise) in data engineering, data platform infrastructure, or analytics-focused engineering
• Strong command of SQL and database optimization techniques
• Experience with data warehouse systems (e.g., Snowflake, Redshift, BigQuery) and OLAP/OLTP hybrid design
• Proficiency with Python or a similar language for scripting, transformation, and data operations
• Familiarity with data pipeline frameworks (e.g., Airflow, dbt, Luigi) and orchestration practices
• Hands-on experience with cloud platforms (e.g., AWS, Azure) and cloud-native data services
• Experience building data platforms for analytics, BI, or machine learning use cases
• Knowledge of CI/CD, Git, and Agile methodologies applied to data workflows
• Solid understanding of data security, access control, and compliance
Preferred Skills
• Experience working with ESG/sustainability datasets or regulatory data (e.g., MSCI, CDP, SFDR)
• Knowledge of investment data models, portfolio risk data, or financial instrument structures
• Exposure to data versioning, columnar formats (Parquet, ORC), or data lake architectures
• Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation) and containerization (e.g., Docker, Kubernetes)
• Experience with data observability, monitoring pipelines, and alerting frameworks
Compensation: From $60.00 to $70.00 per hour