

Insight Global
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in New Orleans, LA, offering $67-74/hour for a 12-month contract. Requires a Bachelor’s degree, 3-6 years in data engineering, proficiency in SQL and Python, and experience with cloud platforms like Oracle and Azure.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
592
🗓️ - Date
October 2, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New Orleans, LA 70129
🧠 - Skills detailed
#Data Modeling #Monitoring #Scala #Python #SQL (Structured Query Language) #Storage #ADF (Azure Data Factory) #Data Architecture #Data Engineering #Oracle #Version Control #Documentation #PySpark #Azure #Synapse #Databricks #Spark (Apache Spark) #Cloud #Computer Science #BI (Business Intelligence) #Logging #Data Quality #Datasets #Azure Data Factory #ETL (Extract, Transform, Load) #Deployment
Role description
Data Engineer
Location: New Orleans, LA (On-site, 5 days/week)
Pay: $67-74/hour
Contract: 12-month contract with potential for permanent hire
Role Overview
We’re seeking a Data Engineer to help build internal data capabilities that support our cloud-based transformation initiative. You’ll be responsible for ingesting, transforming, and mapping data across multiple systems—ensuring seamless integration between backend Oracle environments and frontend reporting platforms.
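As a rough sketch of what this ingestion work can look like in practice, here is a minimal PySpark example that pulls a backend Oracle table into a data lake over JDBC. The host, service name, schema, table, credentials, and paths are placeholders invented for illustration; they are not details of this role.

```python
# Hypothetical sketch: ingest a backend Oracle table into Spark via JDBC
# for downstream transformation and mapping. All connection details and
# names below are placeholders, not taken from the posting.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle_ingest_sketch").getOrCreate()

source = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB")  # placeholder host/service
    .option("dbtable", "SALES.TRANSACTIONS")                    # placeholder schema.table
    .option("user", "etl_user")                                 # placeholder credentials
    .option("password", "***")
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

# Land the raw extract in the lake so reporting-side transforms
# never query the operational Oracle system directly.
source.write.mode("overwrite").parquet("/lake/raw/transactions")
```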
Responsibilities
Design, develop, and maintain ETL/ELT pipelines using Azure and related platforms (see the sketch after this list)
Implement orchestration, scheduling, and monitoring for high-availability data flows
Ensure data quality through validation, testing, logging, and automated monitoring
Optimize storage, partitioning, and compute performance in Fabric Lakehouse environments
Collaborate with Data Architects to align with enterprise data models and governance standards
Partner with Analysts to deliver curated, trusted datasets for BI and advanced analytics
Apply CI/CD practices to data workflows, including version control and automated deployment
Maintain documentation of data flows, schemas, and processes
Support team knowledge sharing and perform other duties as assigned
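The pipeline and data-quality responsibilities above can be illustrated with a short, hypothetical PySpark sketch: one extract-transform-load step with a fail-fast validation check. The dataset names, columns, and paths are assumptions for illustration only.

```python
# Hypothetical sketch: a minimal ETL step with a basic data-quality gate.
# Table and column names (raw orders, order_id, amount) are invented
# for illustration; they are not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl_sketch").getOrCreate()

# Extract: read a raw dataset (placeholder path).
raw = spark.read.parquet("/lake/raw/orders")

# Transform: normalize types and derive a load date for partitioning.
orders = (
    raw.withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("load_date", F.current_date())
)

# Validate: fail fast if required keys are missing or duplicated.
null_keys = orders.filter(F.col("order_id").isNull()).count()
dup_keys = orders.count() - orders.dropDuplicates(["order_id"]).count()
if null_keys or dup_keys:
    raise ValueError(
        f"Data quality check failed: {null_keys} null keys, {dup_keys} duplicates"
    )

# Load: write partitioned output for downstream BI consumption.
orders.write.mode("overwrite").partitionBy("load_date").parquet("/lake/curated/orders")
```

Failing the run on a broken key constraint keeps bad records out of curated zones and surfaces the problem through orchestration monitoring, in line with the validation, logging, and monitoring duties listed above.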
Qualifications
Bachelor’s degree in Computer Science, Information Systems, Data Engineering, or related field
3–6 years of professional experience in data engineering or ETL development
Hands-on experience with cloud-based platforms (Oracle, Fabric, Databricks, Synapse, or equivalent)
Strong proficiency in SQL and Python (PySpark or Scala a plus)
Solid understanding of data modeling principles and performance optimization
Familiarity with orchestration tools (Azure Data Factory, Fabric Pipelines, or similar)
What We Offer
Opportunity to work on a high-impact transformation initiative
Fast-paced startup environment with enterprise-level backing
Clear path to permanent employment for top performers
Collaborative team culture focused on innovation and scalability
Benefit packages for this role may include healthcare insurance offerings and paid leave as provided by applicable law
Job Types: Full-time, Contract, Permanent
Pay: $67.00 - $74.00 per hour
Expected hours: 40 per week
Benefits:
Health insurance
Paid time off
Vision insurance
Work Location: In person