Robert Half

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with over 5 years of experience, focusing on building scalable data pipelines and optimizing ETL processes. Contract length exceeds 6 months, with a pay rate of "X". Remote work available. Key skills include SQL, Python, and cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
545
-
πŸ—“οΈ - Date
October 16, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Metro Jacksonville
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Informatica #Data Storage #Scala #Tableau #Data Modeling #Datasets #BigQuery #Cloud #Data Pipeline #Snowflake #Data Quality #Python #SSIS (SQL Server Integration Services) #Microsoft Power BI #BI (Business Intelligence) #Version Control #Databricks #Synapse #Azure Data Factory #Data Engineering #Visualization #ETL (Extract, Transform, Load) #Storage #Security #Data Access #Azure #SQL (Structured Query Language) #API (Application Programming Interface) #Data Architecture #Data Lifecycle #AWS (Amazon Web Services) #Data Processing #Automation #ADF (Azure Data Factory) #Data Governance #GIT
Role description
We’re looking for a Data Engineer to join our growing analytics team! This role is ideal for someone who enjoys building scalable data pipelines, optimizing ETL processes, and working cross-functionally to make data accessible and actionable.

Key Responsibilities
• Design, build, and maintain reliable data pipelines to collect, process, and transform large datasets.
• Collaborate with analysts, developers, and business stakeholders to ensure seamless data flow across systems.
• Develop and maintain ETL/ELT frameworks for structured and unstructured data sources.
• Optimize data storage and retrieval for analytics and reporting.
• Ensure data quality, integrity, and security throughout the data lifecycle.
• Work with cloud platforms (Azure, AWS, or GCP) and modern data tools.

Qualifications
• 5+ years of experience as a Data Engineer or in a similar data-focused role.
• Strong proficiency with SQL and data modeling concepts.
• Hands-on experience with ETL tools (Azure Data Factory, SSIS, Informatica, etc.).
• Experience with Python or Scala for data processing.
• Familiarity with cloud data architectures (Azure Synapse, Snowflake, Databricks, or BigQuery).
• Understanding of data governance, security, and best practices.

Preferred Skills
• Experience working with Power BI, Tableau, or similar visualization tools.
• Exposure to API integrations and automation workflows.
• Knowledge of CI/CD processes and version control (Git).