Big Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Big Data Engineer with a 24-month contract in Chandler, AZ (Hybrid). Requires 4+ years in software engineering, expertise in Hadoop, AWS S3, data modeling, and Unix scripting. GCP experience is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 30, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Chandler, AZ
-
🧠 - Skills detailed
#Python #Hadoop #Scripting #Dremio #AWS S3 (Amazon Simple Storage Service) #Shell Scripting #Unix #Cloud #Data Pipeline #Big Data #Database Design #AWS (Amazon Web Services) #Security #Data Engineering #GCP (Google Cloud Platform) #Spark (Apache Spark) #S3 (Amazon Simple Storage Service) #Storage #PySpark #Compliance #Consulting
Role description
Outstanding long-term contract opportunity! A well-known Financial Services Company is looking for a Big Data Engineer in Chandler, AZ (Hybrid). Work with the brightest minds at one of the largest financial institutions in the world. This long-term contract opportunity includes a competitive benefits package! Our client has been around for over 150 years and continues to innovate in today's digital age. If you want to work for a company that is not only a household name but also truly cares about satisfying customers' financial needs and helping people succeed financially, apply today.

Contract Duration: 24 Months

Required Skills & Experience
• 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education.
• Minimum 4 years of hands-on experience with:
• Building data pipelines using a big-data stack (Hadoop, Hive, PySpark, Python)
• Amazon AWS S3: object storage, security, and data service integration with S3
• Data modeling and database design
• Job scheduling with Autosys
• Power BI and Dremio
• Unix/shell scripting and CI/CD pipelines

Desired Skills & Experience
• Exposure to GCP cloud data engineering

What You Will Be Doing
• Consult on or participate in moderately complex initiatives and deliverables within Software Engineering, and contribute to large-scale planning related to Software Engineering deliverables.
• Review and analyze moderately complex Software Engineering challenges that require an in-depth evaluation of variable factors.
• Contribute to the resolution of moderately complex issues and consult with others to meet Software Engineering deliverables, leveraging a solid understanding of the function, policies, procedures, and compliance requirements.
• Collaborate with client personnel in Software Engineering.

Posted By: Blair Richardson