

Big Data Engineer with GCP Experience
Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Engineer with GCP experience in Phoenix, AZ (Hybrid), on a contract of unspecified length offering $50-$60/hr C2C. It requires 4+ years of big data engineering experience and proficiency in SQL, Python, and GCP services.
Country: United States
Currency: $ USD
Day rate: 480
Date discovered: August 9, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Corp-to-Corp (C2C)
Security clearance: Unknown
Location detailed: Phoenix, AZ
Skills detailed: #ETL (Extract, Transform, Load) #Dataflow #SQL (Structured Query Language) #Compliance #Data Storage #Storage #Strategy #Apache Beam #Security #Python #Terraform #Data Security #Cloud #DevOps #GDPR (General Data Protection Regulation) #Data Pipeline #Version Control #Scrum #GCP (Google Cloud Platform) #BigQuery #GIT #Kubernetes #Data Science #Data Lake #Big Data #Data Quality #Airflow #Java #Batch #Data Engineering #AWS (Amazon Web Services) #Docker #Scala #Computer Science #Azure #Agile
Role description
About the Company
Droisys is an innovation technology company focused on helping companies accelerate their digital initiatives from strategy and planning through execution. We leverage deep technical expertise, Agile methodologies, and data-driven intelligence to modernize systems of engagement and simplify human/tech interaction.
Amazing things happen when we work in environments where everyone feels a true sense of belonging and when candidates have the requisite skills and opportunities to succeed. At Droisys, we invest in our talent and support career growth, and we are always on the lookout for amazing talent who can contribute to our growth by delivering top results for our clients. Join us to challenge yourself and accomplish work that matters.
Big Data Engineer with GCP Experience
Phoenix, AZ (Hybrid)
Rate Range: $50 to $60/hr C2C (all-inclusive)
About the Role
We are seeking an experienced Big Data Engineer with strong proficiency in Google Cloud Platform (GCP) to join our growing Data Engineering team in Phoenix, AZ. You will be responsible for building and optimizing our data pipelines, architectures, and data flow, with a strong focus on scalable and secure cloud-based solutions.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines and ETL processes using GCP services.
• Work with structured and unstructured data from various sources and ensure high data quality and reliability.
• Collaborate with data scientists, analysts, and business teams to understand requirements and deliver high-impact data solutions.
• Develop, deploy, and monitor data pipelines using tools such as Apache Beam, Dataflow, BigQuery, Cloud Composer (Airflow), and Pub/Sub (see the sketch following this list).
• Optimize data storage and retrieval strategies for performance and cost-efficiency using GCP-native tools.
• Implement data security and governance best practices in alignment with company and regulatory policies.
• Troubleshoot issues in data workflows and provide support for production systems.
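As a minimal sketch of the kind of pipeline this role involves, the example below uses the Apache Beam Python SDK to read JSON messages from Pub/Sub and append them to BigQuery; the project, topic, table, and schema names are placeholders, not details from this posting.

import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming pipeline; on GCP this would typically run on the Dataflow runner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Placeholder Pub/Sub topic.
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/example-project/topics/events")
            # Decode each message and parse it as JSON.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Placeholder BigQuery table and schema; rows are appended as they arrive.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()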
Required Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field (Master's preferred).
• 4+ years of experience in big data engineering, with 2+ years working directly with GCP-based data solutions.
• Proficiency in SQL, Python, and Java or Scala.
• Hands-on experience with GCP services: BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Functions, and Cloud Composer (see the DAG sketch after this list).
• Strong understanding of data warehousing, data lakes, and stream/batch processing.
• Experience with CI/CD, version control (e.g., Git), and DevOps principles.
• Knowledge of data security, privacy laws, and compliance (e.g., HIPAA, GDPR) is a plus.
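As a minimal sketch of the Cloud Composer (Airflow) orchestration mentioned above, the DAG below schedules a daily BigQuery rollup job; the DAG name, project, dataset, and query are placeholders, not details from this posting.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",        # placeholder DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Aggregate raw events into a daily counts table (placeholder SQL and tables).
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": (
                    "SELECT DATE(event_ts) AS day, COUNT(*) AS events "
                    "FROM `example-project.analytics.events` GROUP BY day"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "analytics",
                    "tableId": "daily_event_counts",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )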
Preferred Qualifications
• GCP Professional Data Engineer Certification or equivalent.
• Experience with Terraform, Docker, or Kubernetes.
• Prior experience in Agile/Scrum development environments.
• Exposure to other cloud platforms (AWS, Azure) is a plus.
Droisys is an equal opportunity employer. We do not discriminate based on race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law. Droisys believes in diversity, inclusion, and belonging, and we are committed to fostering a diverse work environment.