Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a long-term contract in Atlanta, GA, offering a pay rate of "X". Requires 1–3 years of experience, strong SQL skills, proficiency in Azure tools, and Microsoft Azure Data Engineer Certification.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
May 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Code Reviews #Synapse #SQL (Structured Query Language) #Big Data #Azure DevOps #Microsoft Power BI #Data Integration #Spark (Apache Spark) #Data Pipeline #API (Application Programming Interface) #Azure Databricks #ML (Machine Learning) #NoSQL #Cloud #Scala #Data Engineering #Agile #Documentation #ADF (Azure Data Factory) #Data Lake #Data Modeling #Python #Databricks #Vault #Normalization #BI (Business Intelligence) #Oracle #Data Normalization #Deployment #Datasets #DevOps #AI (Artificial Intelligence) #Automation #Azure Data Factory #Hadoop #Data Governance #Azure #Informatica #ETL (Extract, Transform, Load) #SSAS (SQL Server Analysis Services) #Programming #Databases #Docker #Microsoft Azure #Oracle GoldenGate #SSIS (SQL Server Integration Services) #Jira
Role description
Open to all US citizens and Green Card holders who are able to work onsite in Atlanta, GA. This is a long-term contract (hours worked / hours paid) on W2. As a contractor, you will have the ability to opt into health, dental, and vision benefits, as well as a 401k.

We are seeking a Data Engineer with experience manipulating and transforming data in a software engineering environment. This role requires deep technical knowledge and hands-on experience working with both traditional relational systems and modern cloud and big data technologies. The ideal candidate will have experience normalizing databases, structuring data for analytics, and building pipelines that support enterprise data initiatives.

Key Responsibilities:
• Develop, test, and support scalable data pipelines using both cloud and on-premises technologies.
• Normalize relational and NoSQL databases to meet the requirements of consuming applications.
• Design and construct datasets that are machine-readable, consistent, and optimized for analysis.
• Integrate and transform raw data from multiple sources into unified, business-ready formats.
• Collaborate with cross-functional teams to deliver end-to-end data engineering and analytics solutions.
• Create functional and technical design documentation for data workflows and system components.
• Perform code reviews and contribute to reusable solution components and best practices.
• Support automation and deployment efforts using CI/CD and DevOps methodologies.
• Participate in Agile development processes and ensure high-quality delivery through collaborative team efforts.

Required Skills and Experience:
• 1–3 years of experience in data engineering, data integration, or software engineering focused on data.
• Strong proficiency in SQL, data modeling, and data normalization techniques.
• Hands-on experience with:
  • Microsoft Azure tools: Azure Data Lake, Azure Data Factory, Azure Synapse, Azure Databricks, Azure Key Vault, Power BI.
  • Big data tools: Hadoop, Spark, Hive.
  • ETL/BI tools: SSIS, SSAS, Informatica, Oracle GoldenGate.
  • Programming languages: Python (including for AI/ML development).
  • Automation tools: Autosys or similar.
• Experience implementing various data models and working with structured, semi-structured, and unstructured data.
• Familiarity with containerization tools such as Docker and OpenShift.
• Experience working in medium to large enterprise environments and collaborating across multiple projects and teams.
• Microsoft Azure Data Engineer Certification.
• Knowledge of data governance practices and tools (e.g., Unity Catalog in Databricks).
• Experience designing API- and web service-based data integration solutions.

Preferred Qualifications:
• Industry experience in the Gas/Utilities sector and understanding of related data strategies.
• Experience using JIRA and/or Azure DevOps for work management and delivery tracking.
• Strong understanding of Agile development and DevOps/CI-CD pipelines.
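
For context on the kind of day-to-day work the responsibilities describe (normalizing raw data and producing business-ready datasets), here is a minimal, illustrative PySpark sketch of the sort a candidate might write on Azure Databricks. It is not part of the posting: the storage paths, column names, and Delta output format are all assumed purely for illustration.

```python
# Illustrative sketch only: hypothetical paths and column names, not from the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer_normalization").getOrCreate()

# Read semi-structured raw data landed in the data lake (placeholder ADLS path).
raw = spark.read.json("abfss://raw@examplelake.dfs.core.windows.net/customers/")

# Basic normalization: consistent casing, typed columns, deduplication.
curated = (
    raw.withColumn("email", F.lower(F.trim(F.col("email"))))
       .withColumn("signup_date", F.to_date("signup_date", "yyyy-MM-dd"))
       .dropDuplicates(["customer_id"])
)

# Write a consistent, analysis-ready dataset to a curated zone (Delta format assumed).
curated.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplelake.dfs.core.windows.net/customers/"
)
```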