Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 12+ month contract, remote (PST hours), offering a competitive pay rate. Requires 10+ years in IT, 4+ years in data engineering, and strong Azure stack experience. Databricks certification is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 29, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Data Modeling #Databricks #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Scala #AWS (Amazon Web Services) #Data Pipeline #Datasets #PySpark #Snowflake #Spark (Apache Spark) #Spark SQL #Azure #SQL (Structured Query Language) #Data Engineering #Consulting
Role description
KPI Partners is a five-time Gartner-recognized data, analytics, and AI consulting company. We are leaders in data engineering on Azure, AWS, Google Cloud, Snowflake, and Databricks. Founded in 2006, KPI has over 400 consultants and has successfully delivered more than 1,000 projects for our clients. We are looking for skilled data engineers who want to work with the best team in data engineering.

Title: Senior Data Engineer
Location: Remote – PST hours (8 AM – 5 PM PST)
Job Type: Contract – 12+ months

About the Role:
We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team at KPI, working on challenging, multi-year data transformation projects. This is an excellent opportunity for a talented data engineer to play a key role in building innovative data solutions using Azure-native services and related technologies. If you are passionate about working with large-scale data systems and enjoy solving complex engineering problems, this role is for you.

Key Responsibilities:
• Data Engineering: Design, develop, and implement data pipelines and solutions using PySpark, SQL, and related technologies.
• Collaboration: Work closely with cross-functional teams to understand business requirements and translate them into robust data solutions.
• Data Warehousing: Design and implement data warehousing solutions, ensuring scalability, performance, and reliability.
• Continuous Learning: Stay up to date with modern data engineering technologies and trends, and apply them to improve our data platform.
• Mentorship: Guide and mentor junior data engineers, ensuring best practices in coding, design, and development.

Must-Have Skills & Qualifications:
• 10+ years of overall experience in the IT industry.
• 4+ years of experience in data engineering, with a strong background in building large-scale data solutions.
• 4+ years of hands-on experience developing and implementing data pipelines using the Azure stack (Azure, ADF, Databricks, Functions).
• Proven expertise in SQL for querying, manipulating, and analyzing large datasets.
• Strong knowledge of ETL processes and data warehousing fundamentals.
• Self-motivated and independent, with a "let's get this done" mindset and the ability to thrive in a fast-paced, dynamic environment.

Good-to-Have Skills:
• Databricks certification is a plus.
• Data modeling, Azure Architect certification, CPG experience.

Apply Now!
If you're ready to take on a senior role and work on transformative projects with a talented team, we encourage you to apply today. Please send your resume and contact information to saiteja.kukkadapu@kpipartners.com