

GBIT (Global Bridge InfoTech Inc)
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Richardson, TX, with a contract length of "unknown" and a pay rate of "unknown." Required skills include strong Snowflake experience, advanced SQL proficiency, ETL/ELT tools, and cloud platform familiarity.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richardson, TX
-
🧠 - Skills detailed
#Talend #Data Science #Clustering #Datasets #Cloud #Informatica #"ETL (Extract, Transform, Load)" #Data Governance #Data Ingestion #Python #Programming #AWS (Amazon Web Services) #Data Quality #Security #SQL (Structured Query Language) #dbt (data build tool) #Batch #Big Data #Compliance #GCP (Google Cloud Platform) #Snowflake #Data Engineering #Airflow #Data Pipeline #Data Lake #Azure #Scala #Java #Data Modeling #Data Mart #Data Warehouse
Role description
Role: Data Engineer
Location: Richardson, TX
Key Responsibilities
• Design, develop, and maintain scalable data pipelines using Snowflake and cloud platforms
• Build and optimize data models, data warehouses, and data marts
• Implement ETL/ELT workflows for structured and semi-structured data
• Work with large datasets to ensure data quality, integrity, and performance optimization
• Collaborate with business stakeholders, analysts, and data scientists to deliver data solutions
• Optimize Snowflake performance (clustering, partitioning, query tuning, cost optimization)
• Develop and manage data ingestion frameworks from multiple sources (APIs, streaming, batch)
• Ensure data governance, security, and compliance standards
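For context on the ETL/ELT responsibilities above, here is a minimal pure-Python sketch of an extract-transform-load step. It is illustrative only: the function and field names are hypothetical, and a real pipeline for this role would target Snowflake via a connector, dbt, or an Airflow DAG rather than in-memory lists.

```python
# Minimal illustrative ETL step: extract raw semi-structured records,
# apply data-quality transforms, and load into an in-memory "table".
# All names here are hypothetical stand-ins for warehouse components.

def extract():
    # Stand-in for an API/batch source of semi-structured data
    return [
        {"id": "1", "amount": "19.99", "region": " us-east "},
        {"id": "2", "amount": None, "region": "us-west"},  # bad record
        {"id": "3", "amount": "5.00", "region": "us-east"},
    ]

def transform(rows):
    # Enforce data quality: drop rows missing an amount, normalize types
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue
        clean.append({
            "id": int(row["id"]),
            "amount": float(row["amount"]),
            "region": row["region"].strip(),
        })
    return clean

def load(rows, table):
    # Stand-in for a warehouse insert (e.g., a Snowflake COPY INTO)
    table.extend(rows)
    return len(rows)

warehouse_table = []
loaded = load(transform(extract()), warehouse_table)
print(loaded)  # 2 (the record with a missing amount is dropped)
```

The same shape (validate, normalize, then load) is what tools like dbt and Airflow orchestrate at scale.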
Required Skills
• Strong hands-on experience with Snowflake (must-have)
• Proficiency in SQL (advanced level)
• Experience with ETL/ELT tools (e.g., dbt, Informatica, Talend, Airflow)
• Experience in cloud platforms (AWS / Azure / GCP)
• Strong programming skills in Python / Scala / Java
• Experience with data modeling (star/snowflake schema)
• Familiarity with data lakes, data warehousing concepts, and big data technologies
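The star-schema modeling called out above can be illustrated with a toy example. Table and column names here are hypothetical; in practice the schema would be defined as Snowflake DDL or dbt models, and the aggregation would be a SQL GROUP BY.

```python
# Toy star schema: one fact table keyed to a dimension table.
# Hypothetical names; a real warehouse would define these as tables.

dim_customer = {
    1: {"name": "Acme", "segment": "enterprise"},
    2: {"name": "Globex", "segment": "smb"},
}

fact_orders = [
    {"customer_key": 1, "amount": 100.0},
    {"customer_key": 2, "amount": 40.0},
    {"customer_key": 1, "amount": 60.0},
]

def revenue_by_segment(facts, dim):
    # Join fact rows to the dimension and aggregate, mirroring the
    # GROUP BY queries a star schema is designed to make cheap.
    totals = {}
    for row in facts:
        segment = dim[row["customer_key"]]["segment"]
        totals[segment] = totals.get(segment, 0.0) + row["amount"]
    return totals

print(revenue_by_segment(fact_orders, dim_customer))
# {'enterprise': 160.0, 'smb': 40.0}
```

A snowflake schema would further normalize the dimension (e.g., splitting segment into its own table); the fact-to-dimension join pattern is the same.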






