Lead Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer with financial services experience, offering a contract of unknown length at a day rate of $600 USD. Key skills include Python, Databricks, SQL, and cloud platforms. Leadership experience is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
600
🗓️ - Date discovered
August 28, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Poughkeepsie, NY
🧠 - Skills detailed
#Redshift #Java #PostgreSQL #R #Dataflow #Azure #Data Lake #SQL (Structured Query Language) #Synapse #ETL (Extract, Transform, Load) #Leadership #Oracle #S3 (Amazon Simple Storage Service) #Databricks #Data Orchestration #Programming #Data Modeling #Snowflake #MS SQL (Microsoft SQL Server) #Cloud #Data Engineering #GCP (Google Cloud Platform) #AWS (Amazon Web Services) #BigQuery #Python
Role description
Lead Data Engineer with financial services experience to lead a team of two engineers, with hands-on work in Python, Databricks, SQL, and cloud platforms.
• Hands-on experience in data engineering, with a proven track record of designing and implementing complex data solutions.
• Experience in a leadership or management role, successfully leading and mentoring data engineering teams.
• Python
• Databricks
KNOWLEDGE, SKILLS, AND ABILITIES
• Strong proficiency in programming languages such as Python, R, or Java.
• Expertise in SQL and experience with various database technologies (e.g., PostgreSQL, MS SQL, or Oracle PL/SQL).
• Experience with cloud-based data platforms (e.g., Databricks, Snowflake, Fabric).
• Proven experience with cloud platforms (e.g., AWS, Azure, GCP) and their respective data services (e.g., S3, Redshift, EMR, Glue, BigQuery, Dataflow, Azure Data Lake, Azure Synapse).
• Deep understanding of ETL/ELT processes, data modeling, data warehousing, and data lake architectures.
• Familiarity with data orchestration and CI/CD pipelines.