BCforward

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in New York, NY, on a long-term contract at a day rate of $520 USD. Key skills required include Python, SQL, data modeling, and Airflow. Experience with data warehousing tools such as BigQuery and Databricks is essential.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
520
πŸ—“οΈ - Date
December 2, 2025
πŸ•’ - Duration
More than 6 months
🏝️ - Location
Hybrid
πŸ“„ - Contract
Unknown
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
New York, NY
🧠 - Skills detailed
#Data Warehouse #BigQuery #Delta Lake #Oracle #Data Pipeline #Data Modeling #Data Engineering #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Spark (Apache Spark) #Airflow #SQL Server #Databricks #Python #SQL Queries #Cloud
Role description
Title: Senior Data Engineer
Location: New York, NY 10281 (Hybrid)
Duration: Long-Term Contract (Possible Full-Time)
Interview Process: 1–2 Rounds

Overview:
We are seeking a highly skilled Data Engineer with strong expertise in building data pipelines and integrating data from multiple sources. This role supports FP&A initiatives and IT spend tracking, leveraging tools like IBM Apptio. The ideal candidate is a self-starter who can work independently and deliver high-quality solutions in a fast-paced environment.

Key Responsibilities:
β€’ Design and build ETL pipelines from scratch using Python.
β€’ Develop efficient, reusable, and maintainable code following OOP principles.
β€’ Perform advanced data modeling (schemas, entity relationships).
β€’ Write and optimize complex SQL queries across multiple dialects (SQL Server, DB2, Oracle).
β€’ Create and manage Airflow DAGs for workflow orchestration (see the sketch after this description).
β€’ Ingest, cleanse, govern, and report on data from warehouses such as BigQuery and Databricks Delta Lakehouse.
β€’ Collaborate with stakeholders to integrate data from diverse sources.
β€’ Learn and adapt to new cloud-based business applications and tools.

Must-Have Skills:
β€’ Python (ETL development, OOP concepts).
β€’ SQL (expert-level across multiple dialects).
β€’ Data modeling (schemas, entity relationships).
β€’ Airflow (DAG development).
β€’ Data warehousing (BigQuery, Databricks Delta Lakehouse).

Nice-to-Have Skills:
β€’ Familiarity with Spark.
β€’ Prior experience with IBM Apptio.
β€’ Exposure to FP&A or IT financial tracking tools.

Why This Role?
β€’ High-impact position in a fast-moving environment.
β€’ Opportunity to work with cutting-edge tools and technologies.
β€’ Potential for long-term engagement and full-time conversion.
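
For context on the Airflow and Python requirements above, here is a minimal sketch of a DAG wrapping a single Python ETL task. It assumes Airflow 2.x; the DAG id, task logic, and sample data are hypothetical illustrations, not details from the posting.

```python
# Minimal Airflow 2.x DAG sketch: one Python ETL task.
# All names and data below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_transform_load():
    """Placeholder ETL step: pull rows, clean them, and load them.

    A real pipeline would read from a source system (e.g. SQL Server
    or Oracle) and write to a warehouse such as BigQuery or a
    Databricks Delta Lakehouse.
    """
    rows = [{"app": "crm", "spend_usd": 1200}]          # fake extract
    cleaned = [r for r in rows if r["spend_usd"] >= 0]  # fake transform
    print(f"Loaded {len(cleaned)} rows")                # fake load


with DAG(
    dag_id="it_spend_etl",            # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    etl = PythonOperator(
        task_id="extract_transform_load",
        python_callable=extract_transform_load,
    )
```

In practice, each responsibility above (source extraction, modeling, warehouse loads) would typically be its own task or task group; this sketch only illustrates the basic shape of a DAG.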