
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Charlotte, NC (Hybrid) for a long-term contract, paying $60.00 - $65.00 per hour. Key skills include Python, PySpark, AWS, and SQL. Experience in data engineering and ETL tools is essential.
Country
United States
Currency
$ USD
-
Day rate
520
-
Date discovered
June 15, 2025
Project duration
More than 6 months
-
Location type
Hybrid
-
Contract type
Unknown
-
Security clearance
Unknown
-
Location detailed
Charlotte, NC 28203
-
Skills detailed
#S3 (Amazon Simple Storage Service) #Data Manipulation #Data Warehouse #Jenkins #Oracle #Scrum #REST (Representational State Transfer) #BitBucket #Data Mining #Business Analysis #Data Integration #Databases #Agile #GitHub #Lambda (AWS Lambda) #Informatica #Python #RDS (Amazon Relational Database Service) #API (Application Programming Interface) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #DataStage #Data Profiling #Batch #Unix #DevOps #REST API #Data Engineering #Automation #PySpark #Unit Testing #MDM (Master Data Management) #Data Analysis #Sybase #Programming #Scala #SQL (Structured Query Language) #Kafka (Apache Kafka) #Spark (Apache Spark) #SQL Server #Triggers #Data Extraction #Classification
Role description
Hello,
Hope you are doing well!
I'm reaching out regarding this opening with our client, who is looking for resources for their project. If you are interested, kindly share your resume at gautam.u@sapear.com or call me at 346-770-0776.
Position: Data Engineer
Location: Charlotte, NC (Hybrid)
Duration: Long Term
Top Skills: Python, PySpark, AWS, SQL
Job Description: The primary role of the Data Engineer is to function as a critical member of a data team by designing data integration solutions that deliver business value in line with the company's objectives. They are responsible for the design and development of data/batch processing, data manipulation, data mining, and data extraction/transformation/loading into large data domains using Python/PySpark and AWS tools.
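The extraction/transformation/loading work this paragraph describes follows a common pattern. As a minimal, illustrative sketch only (the real role uses PySpark and AWS Glue/S3; the sample data, table name, and schema below are hypothetical), the same pattern can be shown with just the Python standard library:

```python
# Minimal ETL sketch using only the standard library (csv + sqlite3).
# In production this would be PySpark reading from S3 and writing to a
# warehouse; the CSV text, "orders" table, and schema are made up here.
import csv
import io
import sqlite3

RAW_CSV = """order_id,amount,currency
1,60.00,USD
2,65.00,USD
3,bad-row,USD
"""

def extract(text):
    """Parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Keep only rows with valid numeric fields, casting types."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["order_id"]), float(row["amount"]), row["currency"]))
        except ValueError:
            continue  # in a real pipeline, log the rejected row for defect tracking
    return clean

def load(rows, conn):
    """Write cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 125.0) -- the malformed row is dropped
```

The three-stage split (extract, transform, load as separate functions) mirrors how such pipelines are typically structured so each stage can be unit-tested in isolation.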
Responsibilities:
Provide scoping, estimating, planning, design, development, and support services to a project.
Identify and develop the technical detail design document.
Work with developers and business areas to design, configure, deploy, and maintain custom ETL Infrastructure to support project initiatives.
Design and develop data/batch processing, data manipulation, data mining, and data extraction/transformation/loading (ETL Pipelines) into large data domains.
Document and present solution alternatives to clients, which support business processes and business objectives.
Work with business analysts to understand and prioritize user requirements.
Design, develop, test, and implement application code.
Follow proper software development lifecycle processes and standards.
Perform quality analysis of the products; responsible for defect tracking and classification.
Track progress and intervene as needed to eliminate barriers and ensure delivery.
Resolve or escalate problems and manage risk for both development and production support.
Coordinate vendors and contractors for specific projects or systems.
Maintain deep, current knowledge of technical and industry best practices, trends, and methodologies.
Skills and Knowledge:
Development experience with a specific focus on data engineering.
Hands-on development experience using Python and PySpark as ETL tools.
Experience with AWS services such as Glue, Lambda, MSK (Kafka), S3, Step Functions, RDS, and EKS.
Experience with databases such as Postgres, SQL Server, Oracle, and Sybase.
Experience with SQL database programming, SQL performance tuning, relational model analysis, queries, stored procedures, views, functions, and triggers.
Strong technical experience in design (mapping specifications, HLD, LLD) and development (coding, unit testing).
Knowledge of developing UNIX scripts and Oracle SQL/PL-SQL.
Experience with data models, data mining, data analysis, and data profiling.
Experience working with REST APIs.
Experience with workload automation tools such as Control-M and AutoSys.
Good knowledge of CI/CD DevOps processes and tools such as Bitbucket, GitHub, and Jenkins.
Strong experience with Agile/Scrum methodology.
Experience with other ETL tools (DataStage, Informatica, Pentaho, etc.).
Knowledge of MDM, data warehousing, and data analytics.
Thank you,
Gautam Kaushik
Job Types: Full-time, Contract
Pay: $60.00 - $65.00 per hour
Expected hours: 80 per week
Benefits:
Flexible schedule
Schedule:
8 hour shift
Monday to Friday
Ability to Commute:
Charlotte, NC 28203 (Required)
Ability to Relocate:
Charlotte, NC 28203: Relocate before starting work (Required)
Work Location: In person