Jobs via Dice

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Charlotte, NC or Denver, CO, for 12+ months at $50/hr on W2. Key skills include Python, Prefect, AWS, and Terraform. Experience with data pipelines and cloud optimization is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
🗓️ - Date
October 4, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Infrastructure as Code (IaC) #Automation #S3 (Amazon Simple Storage Service) #GCP (Google Cloud Platform) #Data Pipeline #Data Storage #Cloud #Python #SAP #Athena #Redshift #Data Engineering #Scripting #Visualization #Azure #Storage #Computer Science #Data Analysis #Data Integrity #Tableau #Deployment #Datasets #Lambda (AWS Lambda) #AWS (Amazon Web Services) #Terraform #Data Science #Data Processing #JavaScript #ETL (Extract, Transform, Load) #ML (Machine Learning) #Scala
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Intellisoft Technologies, is seeking the following. Apply via Dice today!

Hello,

Please find the requirement below and let me know your thoughts.

Position: Data Engineer
Location: Charlotte, NC or Denver, CO (4x/week onsite; local consultants only)
Contract Length: 12+ months
Rate: $50/hr on W2, without benefits
Interview: Face-to-face

Top Requirements:
• Python
• Prefect
• AWS
• Athena
• Terraform
• JavaScript

Plusses:
• FinOps background
• Redshift, S3, and Lambda

Day-to-Day Responsibilities / Project Specifics:
This resource will initially help automate projects and do hands-on code work to improve processes. The resource will build data pipelines, put analyses together, and feed them into visualization tools such as Tableau. We are seeking support in analyzing how we pull data, working with SAP systems, and evolving how the organization moves down the cloud pathway. The job will include data analysis/engineering as well as responsibilities around reporting on cloud costs for AWS, Google, and Azure. An internal team has already created the AWS costing blueprint, and those resources will be available. The work requires understanding data flows, how data is tied together, and how to get the pipelines working and automated.

Job Description:
We are seeking an experienced Advanced Data Engineer to join our team, focusing on supporting and enhancing our data pipelines, visualizations, and analytics capabilities. The ideal candidate will have a robust background in data engineering and analytics, with a deep understanding of modern data technologies and frameworks. This role demands a strong technical skill set, the ability to work collaboratively with cross-functional teams, and a keen eye for detail.

Key Responsibilities:
• Design, develop, and maintain scalable data pipelines using Prefect to ensure efficient data flow across systems.
• Implement and optimize data storage solutions using AWS services, including Athena, for high-performance querying and analysis.
• Utilize Python for scripting and automation to enhance data processing workflows and integrations.
• Employ Terraform for infrastructure as code, ensuring reliable and consistent deployment of data-related resources.
• Develop CI/CD pipelines to automate testing, integration, and deployment processes for data applications.
• Create interactive and insightful data visualizations using JavaScript frameworks to support decision-making processes.
• Apply advanced analytics techniques to extract valuable insights from complex datasets, driving business growth and innovation.
• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver tailored solutions.
• Monitor and troubleshoot data pipelines, ensuring data integrity and reliability across platforms.
• Implement cloud optimization strategies to maximize efficiency and reduce costs across cloud services.
• Leverage Google Cloud Platform (GCP) for data solutions and integration with existing AWS infrastructure.

Minimum Qualifications:
• Proven experience as a Data Engineer or in a similar role, with a focus on data pipelines and analysis.
• Strong expertise in Prefect, AWS, Athena, Python, Terraform, and JavaScript.
• Solid understanding of CI/CD practices and tools to streamline data engineering workflows.
• Familiarity with advanced analytics techniques and their application in business contexts.
• Experience with cloud optimization strategies and tools to enhance performance and cost-effectiveness.
• Proficiency in Google Cloud Platform (GCP) and its integration with other cloud services.
• Excellent problem-solving skills, with the ability to diagnose and resolve complex data issues.
• Strong communication skills to collaborate effectively with technical and non-technical teams.
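As a rough illustration of the Prefect pipeline work described in the responsibilities above, here is a minimal extract/transform/load flow sketch. The dataset, cost figures, and task bodies are hypothetical placeholders (the posting does not specify them), and the import fallback simply keeps the sketch runnable where Prefect is not installed.

```python
# Minimal sketch of a Prefect-style ETL flow for cloud-cost reporting.
# All data and task bodies are illustrative placeholders, not from the posting.
try:
    from prefect import flow, task  # Prefect 2.x
except ImportError:
    # Fallback no-op decorators so the sketch runs without Prefect installed.
    def task(fn):
        return fn

    def flow(fn):
        return fn


@task
def extract() -> list[dict]:
    # In practice this might query Athena or pull billing exports from S3.
    return [
        {"service": "ec2", "cost": 120.0},
        {"service": "s3", "cost": 30.0},
        {"service": "ec2", "cost": 80.0},
    ]


@task
def transform(rows: list[dict]) -> dict[str, float]:
    # Aggregate cost per service, e.g. for a Tableau-bound costing report.
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["service"]] = totals.get(row["service"], 0.0) + row["cost"]
    return totals


@task
def load(totals: dict[str, float]) -> None:
    # In practice this could write to Redshift or publish to a dashboard feed.
    for service, cost in sorted(totals.items()):
        print(f"{service}: ${cost:.2f}")


@flow
def cloud_cost_pipeline() -> dict[str, float]:
    rows = extract()
    totals = transform(rows)
    load(totals)
    return totals


if __name__ == "__main__":
    cloud_cost_pipeline()
```

With Prefect installed, the same flow picks up scheduling, retries, and observability from the framework; without it, the fallback decorators leave it as plain Python.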
• Bachelor's degree in Computer Science, Engineering, or a related field; advanced degrees are a plus.

Preferred Qualifications:
• Experience with additional AWS services such as Redshift, S3, and Lambda.
• Knowledge of machine learning frameworks and their integration with data engineering processes.
• Ability to work in a fast-paced environment and adapt to changing requirements and priorities.

Fill in the following details for submission:
Candidate Legal Name:
Contact Numbers:
Email Address ID:
Current Location:
Availability:
Relocation:
Total Experience:
US Experience:
Presently Working:
LinkedIn:
Last 4 Digits of SSN:
DOB (DD-MM):
Full Education Details:
Visa Status and Expiry:

Official references from recent projects:
• Full Name:
  Phone:
  Official Email ID:
  Designation:
  Client Name:
• Full Name:
  Phone:
  Official Email ID:
  Designation:
  Client Name:

Thanks & Regards,
Vasu
Intellisoft Technologies Inc.
11494 Luna Road, Ste 280, Farmers Branch, TX 75234
(O) ext 131