

TigerGraph Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a TigerGraph Developer based in Austin, TX or Sunnyvale, CA, with a contract duration of 12+ months. It requires 10+ years of experience, expertise in GSQL and TigerGraph, and skills in data pipeline integration and performance tuning.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 26, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Austin, Texas Metropolitan Area
Skills detailed: #Data Engineering #Data Pipeline #REST (Representational State Transfer) #Storage #SQL Queries #RDBMS (Relational Database Management System) #TigerGraph #Cloud #Scala #Monitoring #NoSQL #Data Ingestion #Kafka (Apache Kafka) #S3 (Amazon Simple Storage Service)
Role description
Client: TCS
Title: TigerGraph Developer
Location: Austin, TX or Sunnyvale, CA (Onsite)
Duration: 12+ Months
Rate: Open
Visa Status: Any
Relevant Experience (in Yrs.): 10+
Passport & LinkedIn - Must for Submission
Job Description:
Graph Modeling & Design:
• Design logical and physical graph schemas using GSQL based on domain requirements.
• Optimize graph data models for query efficiency and scalability.
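As a small illustration of the schema-design work this covers, a minimal GSQL graph schema might look like the following sketch (the domain, vertex, edge, and graph names are all hypothetical, not taken from the client's requirements):

```gsql
// Hypothetical domain: people and the people they know
CREATE VERTEX Person (PRIMARY_ID person_id STRING, name STRING, age INT)
CREATE UNDIRECTED EDGE Knows (FROM Person, TO Person, since DATETIME)
CREATE GRAPH Social (Person, Knows)
```

In practice, schema choices such as edge direction, attribute placement, and vertex granularity drive both query efficiency and scalability, which is what the optimization bullet above refers to.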
TigerGraph Development:
• Develop and maintain GSQL queries, loading jobs, and custom functions.
• Implement TigerGraph's REST endpoints and integrate with internal/external systems via APIs or streaming platforms.
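To sketch what GSQL query development looks like, here is a minimal installed query (assuming a hypothetical graph named Social with Person vertices and undirected Knows edges; all names are illustrative):

```gsql
// Hypothetical one-hop "friends" query on a Social graph
CREATE QUERY friends_of (VERTEX<Person> p) FOR GRAPH Social {
  Start = {p};
  // Traverse Knows edges from the input person to neighboring persons
  Friends = SELECT t
            FROM Start:s -(Knows:e)- Person:t;
  PRINT Friends;
}
INSTALL QUERY friends_of
```

Once installed, TigerGraph exposes the query as a RESTPP endpoint (e.g. GET /query/Social/friends_of?p=<id> on the REST++ port, 9000 by default), which is how such queries are typically integrated with external systems.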
Data Pipeline Integration:
• Build pipelines to ingest data from sources like RDBMS, NoSQL, Kafka, or cloud storage (e.g., S3, GCS) into TigerGraph.
• Collaborate with data engineering to ensure high-throughput, low-latency data ingestion.
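The file-based side of this ingestion work is typically expressed as a GSQL loading job; a minimal sketch, assuming a hypothetical Social graph with a Person vertex and a local CSV file, might look like:

```gsql
// Hypothetical loading job: ingest a CSV of people into a Social graph
CREATE LOADING JOB load_people FOR GRAPH Social {
  DEFINE FILENAME people_file = "/data/people.csv";
  LOAD people_file
    TO VERTEX Person VALUES ($"person_id", $"name", $"age")
    USING HEADER="true", SEPARATOR=",";
}
RUN LOADING JOB load_people
```

For streaming and cloud sources, TigerGraph also ships Kafka and S3 data connectors that feed the same loading-job machinery, which is usually how the Kafka/S3 ingestion mentioned above is wired up.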
Performance Tuning & Monitoring:
• Analyze and optimize query performance using profiling tools and TigerGraph monitoring interfaces.
• Conduct regular health checks, index tuning, and memory optimization for high-scale environments.
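As one example of the index-tuning work mentioned above: in TigerGraph 3.x+, a secondary index on a vertex attribute is added through a schema-change job, which can speed up attribute-filtered queries. A hedged sketch (graph, vertex, and index names are hypothetical):

```gsql
// Hypothetical sketch: add a secondary index on Person.name via a schema-change job
CREATE GLOBAL SCHEMA_CHANGE JOB add_name_index {
  ALTER VERTEX Person ADD INDEX person_name_idx ON (name);
}
RUN GLOBAL SCHEMA_CHANGE JOB add_name_index
```

Whether an index actually helps depends on the query workload; profiling the query before and after, via the monitoring interfaces noted above, is the usual check.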
MUST HAVE SKILLS:
• GSQL
• TigerGraph