TEK NINJAS

Lead Data Engineer (Onsite)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer (Onsite) in New York City, NY, lasting 6-12 months, with a pay rate of $75-$80/hr. Key skills include Data Architecture, SQL, Python, AWS, and experience in financial services, particularly asset management.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 8, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
New York, NY
-
🧠 - Skills detailed
#Strategy #AWS (Amazon Web Services) #Business Analysis #Snowflake #Data Integration #Microsoft Power BI #Cloud #Security #SQL (Structured Query Language) #Airflow #Python #Data Quality #Data Orchestration #Data Engineering #ETL (Extract, Transform, Load) #Leadership #Quality Assurance #BI (Business Intelligence) #Visualization #Data Pipeline #Data Security #Azure #Data Architecture #Compliance #Computer Science #Documentation #Scala #Tableau #Apache Airflow #Data Governance #Data Accuracy
Role description
Job Title: Lead Data Engineer / Data Architect
Location: New York City, NY (Onsite – 5 days/week)
Duration: 6-12+ months
Pay: $75-$80/hr
Tech skill sets: Data Architecture, Data Modelling, expert-level SQL, Python, AWS technologies (Glue), Snowflake, and data engineering design patterns
Domain: Asset Management, Alternative Investments, Financial Services

About the Role:
We are seeking an experienced Lead Data Engineer / Data Architect to spearhead the design, development, and implementation of our enterprise data platform. This role will be instrumental in shaping and driving our data engineering roadmap, ensuring scalability, performance, and innovation. The ideal candidate combines deep technical expertise with strong leadership skills and a solid understanding of financial services data domains.

Key Responsibilities:
• Lead the data architecture and engineering strategy, ensuring alignment with business and technology roadmaps.
• Design and implement data models, pipelines, and data integration frameworks across multiple platforms.
• Partner with stakeholders to translate business requirements into scalable data solutions.
• Performance Optimization: Optimize data pipelines for performance, scalability, and reliability, including query tuning and resource management within Snowflake.
• Drive adoption of best practices in data engineering design patterns and modern cloud architectures.
• Data Quality Assurance: Implement and monitor data validation procedures to ensure data accuracy and consistency across systems.
• Collaboration and Communication: Work closely with project managers, data architects, and business analysts to align project milestones and deliverables with business goals.
• Mentor and guide data engineering teams (onsite and offshore) to deliver high-quality outcomes.
• Ensure compliance with data governance, security, and privacy standards.
• Documentation: Create and maintain detailed documentation of data pipelines, data flow diagrams, and transformation logic.
• Issue Resolution: Troubleshoot and resolve issues related to data pipelines, including job failures and performance bottlenecks.

Required Qualifications:
• Bachelor's degree in Computer Science, Information Technology, or a related field.
• 10+ years of experience in data engineering with a strong focus on Data Architecture, Data Modelling (conceptual, logical, physical), ELT processes, and data pipeline development.
• Hands-on experience with the Snowflake cloud data platform, including data sharing, secure views, and performance optimization.
• Proficiency in SQL and familiarity with data integration and ETL/ELT tools.
• AWS technologies (Glue, EMR).
• Python for data engineering workflows.
• Strong understanding of data engineering design patterns.
• Strong problem-solving skills and the ability to work independently to meet deadlines.
• Excellent communication skills for effectively interacting with technical and non-technical stakeholders.

Preferred Qualifications:
• Certifications in Snowflake or relevant data technologies.
• Experience in the financial services sector, especially asset management / alternative investments, with an understanding of data security and compliance requirements.
• Familiarity with cloud platforms (e.g., AWS, Azure) and data orchestration tools (e.g., Apache Airflow).
• Knowledge of data visualization tools (e.g., Tableau, Power BI).