PySpark Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a PySpark Developer on a 6-month contract in Charlotte, NC, offering $50.00 - $60.00 per hour. Requires 2+ years in Python, PySpark, Apache Airflow, and experience in the Retail industry. Hybrid work, onsite two days a week.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
480
-
πŸ—“οΈ - Date discovered
August 13, 2025
πŸ•’ - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Charlotte, NC 28202
-
🧠 - Skills detailed
#Airflow #SQL (Structured Query Language) #Storage #ETL (Extract, Transform, Load) #Compliance #Data Storage #Apache Airflow #Data Processing #Python #JSON (JavaScript Object Notation) #RDBMS (Relational Database Management System) #Deployment #PySpark #PostgreSQL #Kubernetes #Spark (Apache Spark) #Spark SQL
Role description
Greetings from IT Engagements! IT Engagements is a global staff augmentation firm providing a wide range of talent on-demand and total workforce solutions. We have an immediate opening for the position below with one of our premium clients.

Role: PySpark Developer
Location: Charlotte, NC

Our client, a leader in their industry, has an excellent opportunity for a PySpark Developer on a 6-month contract in Charlotte, NC, working hybrid with two days a week onsite.

Responsibilities:
- Design, develop, and maintain PySpark-based data transformation pipelines for diverse formats (JSON, CSV, RDBMS, streams) on Kubernetes/on-prem platforms.
- Optimize and tune PySpark jobs to efficiently handle medium- to large-scale data volumes.
- Develop and implement data workflows leveraging Apache Airflow.
- Support and maintain PySpark applications, ensuring reliability and performance.
- Collaborate with cross-functional teams to ensure high-quality data processing and compliance.

Requirements:
- 2+ years of experience in Python and PySpark development.
- 2+ years of experience designing and deploying PySpark data transformation pipelines on Kubernetes/on-prem platforms.
- 2+ years of experience in performance optimization for large-scale data processing.
- 2+ years of experience designing and implementing Apache Airflow workflows.
- 2+ years of experience with Spark SQL and PostgreSQL for data storage and querying.
- Experience working in the Retail industry.

Thank you
mani@itengagements.com

Job Type: Contract
Pay: $50.00 - $60.00 per hour
Expected hours: 40 per week
Work Location: In person
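The pipelines described above center on turning semi-structured inputs such as JSON into tabular rows. As a minimal sketch of that kind of transformation, here is a flattening step using only the Python standard library (the field names and sample records are hypothetical; an actual pipeline for this role would express the same idea with PySpark DataFrames):

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten a nested dict into a single-level dict
    with dot-separated keys, e.g. {"a": {"b": 1}} -> {"a.b": 1}."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

# Hypothetical retail-style order records, for illustration only.
raw = '[{"order_id": 1, "customer": {"id": 42, "region": "NC"}, "total": 19.99}]'
rows = [flatten(rec) for rec in json.loads(raw)]
print(rows[0])
# {'order_id': 1, 'customer.id': 42, 'customer.region': 'NC', 'total': 19.99}
```

In PySpark the same shape of transformation is usually written with nested-struct column selection (e.g. `df.select("customer.id", ...)`) on a DataFrame read via `spark.read.json`; the stdlib version above just illustrates the record-level logic.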