

Tal Search Group, Inc.
ETL Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an ETL Developer in Tallahassee, Florida, for 1 year, at an hourly W2 pay rate. It requires a bachelor's degree or equivalent experience, seven years of ETL development experience, and expertise in Azure tools, SQL, and data governance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 28, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Tallahassee, FL
-
🧠 - Skills detailed
#Data Warehouse #Data Processing #Azure #ADF (Azure Data Factory) #Data Lake #Data Engineering #Azure Databricks #Microsoft Power BI #Cloud #Data Integration #Azure Synapse Analytics #Data Transformations #Data Quality #Databricks #Datasets #Documentation #Data Governance #Data Lineage #Data Access #Python #Computer Science #Azure Data Factory #SQL (Structured Query Language) #Synapse #ETL (Extract, Transform, Load) #Security #Spark (Apache Spark) #Compliance #Scala #BI (Business Intelligence)
Role description
Job Title: Product Developer
Location: Tallahassee, Florida (On-site 5 days a week)
Duration: 1 year, with possible extension
Rate: Hourly W2 + CHP
Start Date: July 1, 2026
Tasks and Activities:
General Experience Expectations: Typically has four (4) years of high-tech industry, product engineering, and IT work experience.
The Department's OIT is seeking the services of four (4) experienced Product Developers under the working title of Extract, Transform, and Load (ETL) Developers.
Required Qualifications:
A bachelor’s degree from an accredited college or university in Computer Science, Information Systems, or other related field, or four (4) years of equivalent work experience is required. Relevant experience may be substituted for education on a year-for-year basis when applicable.
The Department requires the following experience, skills, and knowledge for this position:
• A minimum of seven (7) years of experience in ETL development and data engineering, with at least three (3) years of hands-on experience working with ADF, Azure Databricks, Azure Synapse Analytics, and Azure Purview;
• Proven track record of building and optimizing large-scale ETL pipelines for high-performance, high-availability environments;
• Expertise in Spark, Python, and/or Scala for large-scale data transformations;
• Strong Structured Query Language (SQL) proficiency and experience working with complex data structures;
• In-depth knowledge of data governance, security protocols, and role-based access control (RBAC) within the Azure ecosystem; and
• Ability to design ETL processes that are resilient, efficient, and fully compliant with regulatory standards.
Preferred Qualifications:
The Department prefers the Candidates to have the following experience, skills, and/or knowledge for this position:
• Microsoft Certified: Azure Data Engineer Associate;
• Microsoft Certified: Azure Solutions Architect Expert;
• Microsoft Certified: Azure Fundamentals; and
• Databricks Certified: Data Engineer Associate.
Scope of Work/Job Characteristics:
The ETL Developers will drive the development of data integration pipelines that enable efficient, reliable access to critical data within the client’s Information Management System (CIMS) Data Warehouse or Data Lake on Azure. The role leverages a technical stack that includes Azure Data Factory (ADF), Azure Databricks, Azure Synapse, Power BI, and Azure Purview, and will be at the forefront of transforming complex datasets into actionable business insights.
ETL Pipeline Design and Development
• Lead the design and development of high-performing ETL processes to integrate and transform data across disparate sources;
• Deliver efficient, reliable pipelines that meet business needs and maintain the highest standards of security; and
• Utilize ADF to automate and streamline data workflows, ensuring smooth transitions from source to target.
Data Integration and Transformation
• Build and manage complex ETL workflows that extract, transform, and load data for downstream analytics and reporting, ensuring data is accurate, timely, and secure; and
• Take ownership of data quality and validation, creating resilient ETL processes that ensure only trusted data reaches its destination.
Cloud Platform Expertise
• Leverage the full power of the Azure ecosystem—ADF, Databricks, Synapse, and Purview—to manage and process high volumes of structured and unstructured data, delivering solutions that are scalable and performance-optimized; and
• Integrate large datasets into Azure Synapse Analytics, enabling analytics teams to deliver data-driven insights that support the Department’s mission.
Performance Optimization
• Continuously optimize ETL jobs to minimize latency and maximize throughput; and
• Ensure the architecture supports fast, reliable data access for end-users and systems, meeting stringent performance metrics.
Security and Compliance
• Embed security and compliance best practices in every step of the ETL process;
• Protect sensitive data by adhering to industry standards and ensuring compliance with the Department’s data governance policies; and
• Use Azure Purview to enforce data governance, track data lineage, and ensure that data handling meets the highest standards of integrity.
Collaboration and Stakeholder Engagement
• Partner with cross-functional teams—data engineers, analysts, business stakeholders, and security experts—to design and implement ETL solutions that meet the Department’s evolving needs; and
• Act as a technical leader and mentor, helping guide junior team members and providing expert guidance on data processing and transformation best practices.
Documentation and Best Practices
• Develop and maintain clear, detailed documentation for ETL processes, ensuring the team can consistently deliver high-quality, reliable solutions; and
• Establish and enforce best practices for data handling, ETL development, and security, driving a culture of excellence and accountability.
WE ARE UNABLE TO WORK WITH THIRD PARTIES, CANNOT PROVIDE VISA SPONSORSHIP OR TRANSFER, AND CAN ONLY HIRE ON A W-2 BASIS. THANK YOU!