Take2 Consulting, LLC

Senior Data Engineer

This role is for a Senior Data Engineer with a contract length of "unknown," offering a pay rate of "unknown" and remote work location. Key skills include Python, Spark, Azure, and data modeling. Experience in AEC industry preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 27, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Tampa, FL
🧠 - Skills detailed
#DevOps #ADF (Azure Data Factory) #Observability #Python #ETL (Extract, Transform, Load) #Airflow #Data Architecture #Monitoring #AI (Artificial Intelligence) #Security #Databricks #Azure DevOps #Azure #Azure cloud #Documentation #Consulting #Version Control #Scala #Leadership #Azure Data Factory #ML (Machine Learning) #Data Modeling #Agile #PySpark #Apache Airflow #Data Engineering #SQL (Structured Query Language) #BI (Business Intelligence) #GIT #Microsoft Power BI #Spark (Apache Spark) #Cloud #Project Management #Data Pipeline #API (Application Programming Interface)
Role description
As a Senior Data Engineer, you will be at the forefront of developing innovative data solutions that empower architecture, engineering, and construction (AEC) firms worldwide. In this pivotal role, you will lead efforts to design, develop, test, and maintain scalable data platforms that support critical business functions. Collaborating with cross-functional teams, you will enable data-driven decision-making, enhance operational efficiency, and drive technological advancements.
Key Responsibilities
• Architect and Develop Data Pipelines: Design, build, and optimize ETL/ELT processes using tools such as Spark, PySpark, and Databricks to facilitate seamless data flow across systems.
• Code Excellence and Testing: Write high-quality, scalable, and maintainable Python code that meets business needs and industry standards, and develop comprehensive unit tests to uphold code quality.
• Data Modeling & Warehousing: Implement robust data models and apply data warehousing concepts, leveraging expertise in data architecture and cloud services such as Azure.
• Performance Monitoring & Optimization: Continuously assess and improve application performance, troubleshoot issues promptly, and implement best practices for security and efficiency.
• Collaboration & Mentorship: Participate in Agile ceremonies, mentor junior engineers, and actively seek peer feedback to foster a collaborative environment.
• Documentation & Communication: Maintain detailed technical documentation and clearly communicate project status, risks, and estimates to stakeholders, ensuring transparency and alignment.
Qualifications & Skills
Required:
• Expert-level proficiency in Python
• Advanced experience with Spark, PySpark, T-SQL, and Databricks
• Deep understanding of ETL/ELT processes, data modeling, and data warehousing
• Extensive experience with Azure cloud services, version control (Azure DevOps, Git), and performance tuning
• Strong knowledge of security protocols, including row-level and object-level security
• Solid understanding of Medallion architecture and RESTful API invocation
• Proven ability to work independently in a fast-paced environment
Preferred:
• Experience with orchestration tools such as Apache Airflow or Azure Data Factory
• Familiarity with Microsoft Fabric, Power BI, and cross-tenant data sharing
• Knowledge of AI/ML concepts, cloud cost management, and data pipeline security in Fabric
• Team leadership experience and proficiency with observability tooling
Nice to Have:
• Background in project management, financial concepts, or relevant certifications in Azure, Python, SQL, or Databricks
Why Join Take2?
• Opportunity to work on cutting-edge data projects impacting top-tier clients
• Clear career progression with exposure to leadership roles and advanced technologies
• A collaborative, innovative, and learning-focused environment
• Competitive compensation package and benefits
• Access to ongoing training, certifications, and skill development initiatives