Python Developer with GCP, Cloud, Data - Client W2

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Developer with GCP and Cloud Data expertise, offering a 24-month contract at $53.13/hr in Chandler, AZ. Requires 4+ years in Software Engineering, experience with Hadoop, Hive, pySpark, AWS S3, and database design.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
424
πŸ—“οΈ - Date discovered
September 23, 2025
πŸ•’ - Project duration
More than 6 months
🏝️ - Location type
On-site
πŸ“„ - Contract type
W2 Contractor
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Chandler, AZ
🧠 - Skills detailed
#Hadoop #Consulting #Security #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Unix #Data Pipeline #PySpark #Compliance #AWS S3 (Amazon Simple Storage Service) #Shell Scripting #Spark (Apache Spark) #MySQL #ETL (Extract, Transform, Load) #Data Engineering #Scripting #Python #Dremio #Cloud #Database Design #Storage #GCP (Google Cloud Platform)
Role description
Python Developer with GCP, Cloud, Data

Duration: 24-month contract with possible conversion to FTE
Location: Chandler, AZ only (3 days a week in office)
Rate: $53.13/hr W2
Interviews: 1st round - 60-minute MS Teams call with Lead Engineers; since this is an engineering role, it includes a technical aptitude review. 2nd round - 30-minute MS Teams meeting.

In this contingent resource assignment you may:
- Consult on or participate in moderately complex initiatives and deliverables within Software Engineering and contribute to large-scale planning related to Software Engineering deliverables.
- Review and analyze moderately complex Software Engineering challenges that require an in-depth evaluation of variable factors.
- Contribute to the resolution of moderately complex issues and consult with others to meet Software Engineering deliverables, leveraging a solid understanding of the function's policies, procedures, and compliance requirements.
- Collaborate with client personnel in Software Engineering.

Required Qualifications:
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education.

JOB DESCRIPTION: Minimum 4 years of hands-on experience with:
- Building data pipelines using the big-data stack: Hadoop, Hive, pySpark, Python (a minimal illustrative sketch follows at the end of this description)
- Amazon AWS S3: object storage, security, and data service integration with S3
- Data modelling and database design
- Job scheduler: Autosys
- PowerBI, Dremio
- Unix/shell scripting, CI/CD pipeline
- Exposure to GCP cloud data engineering is a plus

Manager Notes:
- The contractors need to be proactive; they can't wait to be told what to do.
- Must be accountable, in addition to having the technical skills.
- The tech stack listed above covers the technologies being used to build data pipelines.
- They need to model and design the data, build pipelines, apply logic to transform the data, and troubleshoot.
- They should have a strong understanding of Autosys and experience implementing it.
- Ability to automate using Spark, Python, and Hadoop/Hive.
- Should have a fundamental background in database design (MySQL or any standard database).
- Exposure to cloud data engineering is a big plus, but not required.
- Financial services experience is a plus but not required; having domain knowledge is helpful.

Technical Assessment:
- We need a clear understanding of the candidate's technical work experience; they need to be able to describe the work they have done.
- Overall problem solving: given a problem, how efficiently does their thought process drive towards a solution?

Skill Highlights - please indicate the number of years with each of the following skills:
• Hadoop
• Hive
• pySpark
• Python
• Amazon AWS S3
• Data modelling
• Database design
• Job scheduler: Autosys
• PowerBI, Dremio
• Unix/shell scripting
• CI/CD pipeline
• GCP cloud data engineering (ideal)
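
To give a concrete feel for the pipeline work the description refers to (pySpark and Python over Hadoop/Hive, reading from AWS S3), here is a minimal, hypothetical sketch. It assumes a cluster with Hive support and the s3a (hadoop-aws) connector configured; the bucket, column, and table names are placeholders, not details specified by the client.

```python
# Minimal illustrative PySpark pipeline: read raw CSV from S3,
# apply simple transformation logic, and persist a Hive table.
# All names below (bucket, columns, database.table) are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily_sales_pipeline")   # hypothetical job name
    .enableHiveSupport()               # allows saveAsTable to use the Hive metastore
    .getOrCreate()
)

# Read raw data from S3 object storage (assumes s3a connector is configured).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("s3a://example-bucket/raw/sales/")
)

# Apply transformation logic: type casting, filtering, and a derived column.
transformed = (
    raw
    .withColumn("sale_date", F.to_date("sale_date"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
    .withColumn("load_ts", F.current_timestamp())
)

# Persist the result as a Hive table for downstream consumers (e.g. PowerBI/Dremio).
transformed.write.mode("overwrite").saveAsTable("analytics.daily_sales")

spark.stop()
```

In an environment like the one described, a job of this kind would typically sit behind a Unix/shell wrapper, be triggered on an Autosys schedule, and be deployed through a CI/CD pipeline, matching the surrounding skills listed in the posting.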