

Insight Global
Data Engineer (GCP, SQL, BI Reporting, PySpark)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (GCP, SQL, BI Reporting, PySpark) in Sunnyvale, CA, on a 6-month W2 contract paying $55-$62/hr. Requires 7+ years of experience, proficiency in SQL, and a big tech background.
Country
United States
Currency
$ USD
-
Day rate
496
-
Date
January 6, 2026
Duration
More than 6 months
-
Location
Hybrid
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
San Jose, CA
-
Skills detailed
#MySQL #Datasets #Database Management #Python #Spark (Apache Spark) #Data Security #Java #Snowflake #Scala #PostgreSQL #Data Pipeline #Cloud #Redshift #AWS (Amazon Web Services) #Security #Data Analysis #PySpark #Computer Science #Documentation #ETL (Extract, Transform, Load) #Data Integration #Hadoop #SQL (Structured Query Language) #BI (Business Intelligence) #GCP (Google Cloud Platform) #Data Quality #Data Science #Databases #Data Engineering #Programming #Azure #Big Data
Role description
FINTECH Company - W2 Contract, 6 months with extension
Data Engineer - Sunnyvale, CA
Pay Rate: $55/hr-$62/hr
• 7+ years of experience as a Data Analyst or Data Engineer
• Must have worked with GCP, SQL, BI Reporting, and PySpark
• Big tech background
Job Description:
We are seeking a talented and motivated Data Engineer to join our dynamic team in San Jose, CA. As a Data Engineer, you will play a crucial role in designing, building, and maintaining our data infrastructure to support our innovative financial technology solutions.
Key Responsibilities:
• Data Pipeline Development: Design, develop, and maintain scalable data pipelines to process and analyze large datasets (a minimal, illustrative PySpark sketch follows this list).
• Data Integration: Integrate data from various sources, ensuring data quality and consistency.
• Database Management: Optimize and manage databases, ensuring high performance and availability.
• ETL Processes: Develop and maintain ETL (Extract, Transform, Load) processes to support data warehousing and analytics.
• Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
• Data Security: Implement and maintain data security measures to protect sensitive information.
• Documentation: Create and maintain comprehensive documentation for data processes and systems.
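The pipeline and ETL responsibilities above are the PySpark-facing part of the role. Below is a minimal sketch of that kind of batch job, assuming a hypothetical raw CSV feed of transactions; the bucket paths, column names (transaction_id, merchant_id, amount, created_at), and output layout are made up for illustration and are not from the posting.

# Hypothetical PySpark ETL sketch: extract raw transactions, clean and aggregate,
# and load a daily rollup for downstream reporting. All names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-transaction-rollup").getOrCreate()

# Extract: read raw transaction records (path and schema are assumptions).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("gs://example-bucket/raw/transactions/*.csv")
)

# Transform: drop malformed rows, normalize types, and aggregate per merchant per day.
daily = (
    raw.dropna(subset=["transaction_id", "merchant_id", "amount"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("txn_date", F.to_date("created_at"))
       .groupBy("merchant_id", "txn_date")
       .agg(
           F.count("transaction_id").alias("txn_count"),
           F.sum("amount").alias("total_amount"),
       )
)

# Load: write the rollup partitioned by date for downstream BI reporting.
(
    daily.write
         .mode("overwrite")
         .partitionBy("txn_date")
         .parquet("gs://example-bucket/curated/daily_merchant_totals/")
)

spark.stop()

Partitioning the output by date is one reasonable design choice here: BI reports typically filter on a date range, so date-partitioned files keep those downstream queries cheap.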
Qualifications:
• Education: Bachelor's degree in Computer Science, Engineering, or a related field.
• Experience: 7+ years of experience in data engineering or a related role.
• Technical Skills:
• Proficiency in SQL and experience with relational databases (e.g., MySQL, PostgreSQL); see the illustrative query sketch after this list.
• Strong programming skills in Python or Java.
• Experience with big data technologies (e.g., Hadoop, Spark).
• Familiarity with cloud platforms (e.g., AWS, GCP, Azure).
• Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift).
• Soft Skills:
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration abilities.
• Ability to work in a fast-paced, dynamic environment.
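As a rough illustration of the SQL and BI reporting skills listed above, here is a small, hypothetical Spark SQL example that ranks merchants by daily spend over the rollup produced in the earlier sketch; the path, view name, and column names are assumptions, not part of the posting.

# Hypothetical BI-style query sketch: window ranking with Spark SQL over a temp view.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bi-report-sketch").getOrCreate()

# Assume the curated rollup written by the ETL sketch above.
daily = spark.read.parquet("gs://example-bucket/curated/daily_merchant_totals/")
daily.createOrReplaceTempView("daily_merchant_totals")

# Rank merchants by total spend within each day -- a typical BI reporting query.
report = spark.sql("""
    SELECT
        txn_date,
        merchant_id,
        total_amount,
        RANK() OVER (PARTITION BY txn_date ORDER BY total_amount DESC) AS spend_rank
    FROM daily_merchant_totals
""")

# Keep only the top ten merchants per day and preview the result.
report.filter("spend_rank <= 10").show()

spark.stop()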
Benefits:
• Comprehensive health, dental, and vision insurance.
• 401(k) plan with company match.
• Flexible work hours and remote work options.
• Professional development opportunities.






