Wise Skulls

Senior Data Engineer (USC AND GC ONLY)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Richardson, TX, on a 6-month contract (with possible extension) at a pay rate of "TBD." It requires 9+ years in Data Engineering, ETL, and Teradata, plus 3+ years with GCP, BigQuery, and containerization technologies.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 11, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richardson, TX
-
🧠 - Skills detailed
#Metadata #Kubernetes #Cloud #Docker #Scripting #SQL (Structured Query Language) #DataStage #Microservices #Python #Datasets #Informatica #Agile #Dataflow #BigQuery #Data Engineering #Storage #Scala #Big Data #GCP (Google Cloud Platform) #Data Pipeline #ETL (Extract, Transform, Load) #Teradata #NoSQL #Deployment
Role description
Now Hiring: Senior Data Engineer (GCP / Big Data / ETL)
Location: Richardson, TX (On-site, mandatory)
Duration: 6 Months (Possible Extension)

Job Summary
We're seeking an experienced Senior Data Engineer with deep expertise in Data Warehousing, ETL, Big Data, and modern GCP-based data pipelines. This role is ideal for someone who thrives in cross-functional environments and can architect, optimize, and scale enterprise-level data solutions in the cloud.

Must-Have Skills (Non-Negotiable)
• 9+ years in Data Engineering and Data Warehousing
• 9+ years of hands-on ETL experience (Informatica, DataStage, etc.)
• 9+ years working with Teradata
• 3+ years of hands-on GCP and BigQuery experience
• Experience with Dataflow, Pub/Sub, Cloud Storage, and modern GCP data pipelines
• Strong background in query optimization, data structures, metadata, and workload management
• Experience delivering microservices-based data solutions
• Proficiency in Big Data and cloud architecture
• 3+ years with SQL and NoSQL
• 3+ years with Python or similar scripting languages
• 3+ years with Docker, Kubernetes, and CI/CD for data pipelines
• Expertise in deploying and scaling applications in containerized environments (Kubernetes)
• Strong communication, analytical thinking, and the ability to collaborate across technical and non-technical teams
• Familiarity with Agile/SDLC methodologies

Key Responsibilities
• Build, enhance, and optimize modern data pipelines on GCP
• Implement scalable ETL frameworks, data structures, and workflow dependency management
• Architect and tune BigQuery datasets, queries, and storage layers
• Collaborate with cross-functional teams to define data requirements and support business objectives
• Lead efforts in containerized deployments, CI/CD integrations, and performance optimization
• Drive clarity in project goals, timelines, and deliverables during Agile planning sessions

📩 Interested? Apply now or DM us to explore this opportunity! Share your resume at moin@wiseskulls.com or call us at +1 (669) 207-1376.