

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract for 2 months, 100% remote, requiring EST hours. Key skills include GCP expertise, Python, IAM integration, and SQL. Candidates should have experience in regulated environments and handling large user datasets. GCP certifications are mandatory.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 6, 2025
Project duration
1 to 3 months
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#LDAP (Lightweight Directory Access Protocol) #Terraform #Scala #SQL (Structured Query Language) #Containers #Deployment #REST (Representational State Transfer) #Security #Cloud #ETL (Extract, Transform, Load) #Datasets #Computer Science #Data Processing #Data Pipeline #Metadata #Storage #GIT #API (Application Programming Interface) #Data Engineering #IAM (Identity and Access Management) #Debugging #Python #Batch #REST API #BigQuery #GCP (Google Cloud Platform) #Docker #Automation #Documentation
Role description
Title: Data Engineer - Data Pipeline
Position Type: Contract, 2 months
Location: 100% Remote
Must be able to work EST hours.
Responsibilities:
We are seeking a Data Engineer to design and implement a cloud-native data processing and API integration system. The solution will ingest identity data from upstream sources, detect record-level changes, and synchronize user metadata to a downstream system via API. This is a hands-on role focused on scalable data handling, automation, and fault-tolerant service deployment within GCP.
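For illustration, a minimal Python sketch of the ingest / change-detection / sync flow described above. The field names, downstream endpoint, and token value are placeholder assumptions, not details from this posting; a production version would read snapshots from BigQuery or Pub/Sub, run on Cloud Run or Cloud Composer, and pull credentials from Secret Manager.

import hashlib
import json

import requests

DOWNSTREAM_URL = "https://example.invalid/api/users"  # placeholder endpoint
API_TOKEN = "replace-me"  # in GCP this would come from Secret Manager


def record_fingerprint(record: dict) -> str:
    # Stable content hash of a user record, used to detect record-level changes.
    canonical = json.dumps(record, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def detect_changes(current: list[dict], previous_hashes: dict[str, str]) -> list[dict]:
    # Compare each record's hash against the last synced snapshot; keep only changed records.
    changed = []
    for record in current:
        key = record["user_id"]  # assumed primary key
        if previous_hashes.get(key) != record_fingerprint(record):
            changed.append(record)
    return changed


def sync_downstream(changed: list[dict]) -> None:
    # Push each changed record to the downstream system via its REST API.
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    for record in changed:
        resp = requests.post(DOWNSTREAM_URL, json=record, headers=headers, timeout=30)
        resp.raise_for_status()  # real code would add retries with backoff


if __name__ == "__main__":
    upstream = [{"user_id": "u1", "email": "a@example.com", "role": "analyst"}]
    previous = {"u1": "hash-from-last-run"}
    print(detect_changes(upstream, previous))  # sync_downstream(...) would then push these

Hashing each record gives a cheap record-level change signal without diffing every field, which keeps the comparison tractable across millions of user records.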
Qualifications:
• Experience integrating with IAM or identity systems (e.g., LDAP, Okta, custom directories)
• Background working in regulated or high-security environments
• Experience handling large-scale user datasets (millions of records)
• Familiarity with hybrid data processing (batch + streaming)
• GCP Certifications
• Bachelor's or Master's degree in Computer Science or Data Engineering, or equivalent work experience
• Experience in backend development or data engineering roles focused on identity, security, or metadata systems
• Strong Python engineering for data processing and backend development
• Advanced experience with GCP services: BigQuery, Cloud Run, Cloud Functions, Cloud Composer, Pub/Sub, Cloud Storage, Secret Manager, Cloud Scheduler
• Experience interacting with REST APIs, including OAuth2 or token-based authentication (a brief sketch follows this list)
• Terraform for cloud infrastructure automation
• Proficiency with SQL for data transformation and validation
• Strong understanding of CI/CD, containers (Docker), and Git workflows
• Comfortable working with structured metadata, user roles, and directory-style data
• Able to work independently and meet delivery milestones
• Strong documentation and debugging skills
• Must adhere to enterprise security and change control practices
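As a rough illustration of the OAuth2 / token-based API interaction mentioned above, a Python sketch of a client-credentials flow; the token URL, client credentials, and endpoint are placeholders, not details from this posting.

import requests

TOKEN_URL = "https://auth.example.invalid/oauth2/token"  # placeholder authorization server
API_URL = "https://api.example.invalid/v1/users"         # placeholder protected endpoint


def fetch_token(client_id: str, client_secret: str) -> str:
    # OAuth2 client-credentials grant: exchange client credentials for a bearer token.
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def list_users(token: str) -> list[dict]:
    # Call the protected endpoint with the bearer token in the Authorization header.
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {token}"}, timeout=30)
    resp.raise_for_status()
    return resp.json()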