Lead Data Engineer - IGrid/Power Systems

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead Data Engineer in Charlotte, NC, on a 12-month contract at $92.25 - $102.25/hr. Requires 8-15 years of experience, a degree in Computer Science or related fields, and expertise in AWS technologies, Power Systems, and data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
816
-
πŸ—“οΈ - Date discovered
August 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Programming #Redshift #Python #BitBucket #Batch #Hadoop #Migration #SQL (Structured Query Language) #Infrastructure as Code (IaC) #Scripting #ETL (Extract, Transform, Load) #Data Modeling #Data Quality #Logging #SNS (Simple Notification Service) #Databases #HBase #Computer Science #Data Science #Cloud #Django #Data Ingestion #Data Pipeline #Data Encryption #API (Application Programming Interface) #Big Data #Terraform #EC2 #AWS (Amazon Web Services) #Deployment #SQS (Simple Queue Service) #Datasets #Data Management #Athena #Scala #Automation #Security #DynamoDB #Data Engineering #Leadership #Java #PySpark #Data Analysis #Lambda (AWS Lambda) #Spark (Apache Spark) #Data Security #S3 (Amazon Simple Storage Service) #Pandas #DevOps
Role description
Primary Talent Partners has a new contract opening for a Lead Data Engineer with our large power and utilities client in Charlotte, NC. This is a 12-month contract with potential for extension. Pay: $92.25 - $102.25/hr; W2 contract, no PTO, no benefits. ACA-compliant supplemental package available for enrollment.

Description:
• Support or collaborate with application developers, database architects, data analysts, and data scientists to ensure optimal data delivery architecture throughout ongoing projects/operations.
• Design, build, and manage analytics infrastructure that can be utilized by data analysts, data scientists, and non-technical data consumers, enabling the big data platform's analytics functions.
• Develop, construct, test, and maintain architectures, such as databases and large-scale processing systems, that help analyze and process data the way the Analytics organization requires.
• Develop highly scalable data management interfaces and software components using programming languages and tools.
• Work closely with Data Science staff to convert existing or new models into scalable analytical solutions.
• Design, document, build, test, and deploy data pipelines that assemble large, complex datasets from various sources and integrate them into a unified view.
• Identify, design, and implement operational improvements: automating manual processes, data quality checks, error handling and recovery, and re-designing infrastructure as needed.
• Create data models that allow analytics and business teams to derive insights about customer behaviors.
• Build new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications.
• Obtain data from the System of Record and establish batch or real-time data feeds to provide analysis in an automated fashion.
• Develop techniques supporting trending and analytic decision-making processes.
• Apply technologies for a responsive front-end experience.
• Ensure systems meet business requirements and industry practices.
• Research opportunities for data acquisition and new uses for existing data.
• Develop dataset processes for data modeling, mining, and production.
• Integrate data management technologies and software engineering tools into existing structures.
• Employ a variety of languages and tools (e.g., scripting languages).

Core Responsibilities:
• Provide technical direction, guide the team on key technical aspects, and own product tech delivery.
• Lead the design, build, test, and deployment of components, where applicable in collaboration with lead developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead).
• Understand requirements/use cases to outline technical scope and lead delivery of the technical solution.
• Confirm required developers and skill sets specific to the product.
• Provide leadership, direction, peer review, and accountability to developers on the product (key responsibility).
• Work closely with the Product Owner to align on delivery goals and timing.
• Assist the Product Owner with prioritizing and managing the team backlog.
• Collaborate with Data and Solution Architects on key technical decisions, including the architecture and design to deliver the requirements and functionality.

Core Experience and Abilities:
• Power Systems engineering experience.
• DER Dispatch experience.
• Automated fault analysis systems experience.
• Ability to perform hands-on development and peer review for certain components/tech stack on the product.
• Standing up development instances and migration paths (with required security and access/roles).
• Develop components and related processes (e.g., data pipelines and associated ETL processes, workflows).
• Lead implementation of an integrated data quality framework.
• Ensure optimal framework design and load-testing scope to optimize performance (specifically for Big Data).
• Support data scientists with testing and validation of models.
• Perform impact analysis and identify risk from design changes.
• Ability to build new data pipelines, identify existing data gaps, and provide automated solutions to deliver analytical capabilities and enriched data to applications.
• Ensure test-driven development.
• Experience leading teams to deliver complex products.
• Strong technical and communication skills.
• Strong skills with business stakeholder interactions.
• Strong solutioning and architecture skills.
• Experience building real-time, event-driven data ingestion streams.
• Ensure data security and permissions solutions, including data encryption, user access controls, and logging.

Core Technical Skills:
• Experience with native AWS technologies for data and analytics such as Athena, S3, Lambda, Glue, EMR, Kinesis, SNS, CloudWatch, etc.
• Tools and languages such as Django, Python, Java, Scala, Pandas.
• Infrastructure as Code technology such as Terraform.
• AWS services such as S3, EMR, Glue, Lambda, Athena, Kinesis, EC2, SNS, SQS, CloudWatch.
• Experience with databases such as Redshift, DocumentDB, DynamoDB, and MongoDB.
• Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS.

Additional Technical Skills (nice to have, but not required for the role):
• Hadoop platform (Hive, HBase, Druid)
• Spark
• PySpark
• SQL
• Workflow automation
• DevOps pipeline (CI/CD); Bitbucket; Concourse
• API frameworks

Requirements:
• Degree in Computer Science, Engineering, or related fields.
• 8-15 years of experience.

Primary Talent Partners is an Equal Opportunity / Affirmative Action employer committed to diversity in the workplace.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws. If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us at info@primarytalentpartners.com. #PTPJobs