GCP Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
800
-
πŸ—“οΈ - Date discovered
September 9, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Hadoop #Azure #Data Security #Documentation #Snowflake #AWS (Amazon Web Services) #Spark (Apache Spark) #Scala #Java #Python #PostgreSQL #Big Data #Databases #Datasets #Database Management #GCP (Google Cloud Platform) #Cloud #Data Quality #BI (Business Intelligence) #Data Pipeline #Programming #ETL (Extract, Transform, Load) #Data Integration #PySpark #Redshift #Data Analysis #MySQL #Computer Science #Data Science #Security #Data Engineering
Role description
FINTECH Company - W2 Contract, 6 months w/ extension
Data Engineer - Sunnyvale, CA
Pay Rate: $95-100/hr
• 7+ years of experience as a Data Analyst or Data Engineer
• Must have worked with GCP, SQL, BI Reporting, and PySpark
• Big tech background

Job Description:
We are seeking a talented and motivated Data Engineer to join our dynamic team in San Jose, CA. As a Data Engineer, you will play a crucial role in designing, building, and maintaining our data infrastructure to support our innovative financial technology solutions.

Key Responsibilities:
• Data Pipeline Development: Design, develop, and maintain scalable data pipelines to process and analyze large datasets.
• Data Integration: Integrate data from various sources, ensuring data quality and consistency.
• Database Management: Optimize and manage databases, ensuring high performance and availability.
• ETL Processes: Develop and maintain ETL (Extract, Transform, Load) processes to support data warehousing and analytics.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
• Data Security: Implement and maintain data security measures to protect sensitive information.
• Documentation: Create and maintain comprehensive documentation for data processes and systems.

Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, or a related field.
• Experience: 7+ years of experience in data engineering or a related role.
• Technical Skills:
• Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL).
• Strong programming skills in Python or Java.
• Experience with big data technologies (e.g., Hadoop, Spark).
• Familiarity with cloud platforms (e.g., AWS, GCP, Azure).
• Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift).
• Soft Skills:
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration abilities.
• Ability to work in a fast-paced, dynamic environment.

Benefits:
• Comprehensive health, dental, and vision insurance.
• 401(k) plan with company match.
• Flexible work hours and remote work options.
• Professional development opportunities.