

Senior Data Engineer – PySpark & AWS
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer – PySpark & AWS, requiring 12+ years of experience and strong skills in PySpark, AWS, Python, and data troubleshooting. Contract length is unspecified, pay is on a W2 basis, and the role is hybrid, based in Dallas, TX / McLean, VA.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
June 18, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Debugging #AWS (Amazon Web Services) #Cloud #Snowflake #Python #Spark (Apache Spark) #EC2 #JSON (JavaScript Object Notation) #DynamoDB #API (Application Programming Interface) #PySpark #S3 (Amazon Simple Storage Service) #Data Engineering
Role description
One of my clients is looking for a Senior Data Engineer – PySpark & AWS, based in Dallas, TX / McLean, VA (Hybrid), for a contract role.
W2 only!
No H-1B or OPT/CPT, please.
Required Skills & Qualifications:
• A minimum of 12 years of experience is required.
• Hands-on experience with PySpark, especially for building pipelines that integrate with AWS Step Functions.
• Proven ability to develop business logic to handle dynamic and cloud-based data sources, including Snowflake.
• Strong Python development skills with a focus on API integrations and backend services.
• Experience working with JSON and Parquet file formats in AWS environments (see the illustrative sketch after this list).
• Deep understanding of AWS data tools and services: S3, EC2, DynamoDB, Glue.
• Excellent debugging, performance tuning, and data troubleshooting skills.
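To ground these requirements, here is a minimal, illustrative PySpark sketch of the kind of pipeline stage the role describes: reading JSON from S3, applying simple business logic, and writing partitioned Parquet back to S3, as a job that an AWS Step Functions state might trigger via EMR or Glue. The bucket paths, column names (event_id, event_timestamp), and job structure are hypothetical examples, not details of the client's actual pipeline.

```python
# Illustrative sketch only: paths and column names below are hypothetical.
import sys

from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def main(input_path: str, output_path: str) -> None:
    # A job like this would typically be submitted by an AWS Step Functions
    # state (for example through EMR or Glue) as one stage of a larger pipeline.
    spark = SparkSession.builder.appName("json-to-parquet").getOrCreate()

    # Read semi-structured JSON that has landed in S3.
    df = spark.read.json(input_path)

    # Example business logic: drop records without an ID and derive a date column.
    cleaned = (
        df.filter(F.col("event_id").isNotNull())
          .withColumn("event_date", F.to_date("event_timestamp"))
    )

    # Write columnar Parquet back to S3, partitioned for downstream consumers
    # such as Glue catalog tables or Snowflake external stages.
    cleaned.write.mode("overwrite").partitionBy("event_date").parquet(output_path)

    spark.stop()


if __name__ == "__main__":
    # e.g. spark-submit job.py s3://example-bucket/raw/events/ s3://example-bucket/curated/events/
    main(sys.argv[1], sys.argv[2])
```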
If interested, please send your resume to harsh@hireplusinfotech.com