E-Solutions

Data Administrator

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Administrator/Data Pipeline Engineer with expertise in Cribl and Vector, focusing on telemetry pipelines. Contract length is unspecified, pay rate is competitive, and location is Bellevue, WA (Remote OK). Key skills include Snowflake and Azure Data Explorer experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
October 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Observability #Oracle #Azure #Monitoring #SQL (Structured Query Language) #Documentation #Cloud #Data Pipeline #Data Lifecycle #DBA (Database Administrator) #Data Ingestion #Compliance #Snowflake #Debugging #Scala #ETL (Extract, Transform, Load) #Security #Data Governance
Role description
Hiring Now: Data Admin | Cribl & Vector

Note: This is not a DBA role (SQL or Oracle); it is a Data Admin position.

We are looking for an experienced Data Admin / Data Pipeline Engineer with strong hands-on expertise in Cribl and Vector to design and maintain secure, scalable, and high-performance telemetry pipelines. You’ll play a critical role in building data pipelines that power real-time and historical security insights for global-scale systems.

Location: Bellevue, WA (Remote OK)
Type: Contract | C2C & W2

Role Overview:
The Data Admin / Data Pipeline Engineer will design, implement, and operationalize data flows that ensure reliability, scalability, and performance at every stage of the data lifecycle. You’ll develop and optimize pipelines that collect, transform, and route security telemetry data from multiple sources into analytical platforms such as Snowflake and Azure Data Explorer.

Key Responsibilities:
• Design and manage high-performance telemetry pipelines using Vector and Cribl.
• Develop data ingestion, transformation, and routing workflows for large-scale data systems.
• Integrate multiple telemetry sources into Snowflake and Azure Data Explorer for analytics.
• Ensure data consistency, reliability, and scalability across distributed systems.
• Collaborate with security, data, and cloud engineering teams to improve observability and performance.
• Optimize data flows for real-time monitoring and historical analytics.
• Maintain documentation and ensure data governance compliance.

Required Skills:
• Proven hands-on experience with Cribl and Vector (must-have).
• Strong understanding of data pipeline design, telemetry systems, and observability.
• Experience working with Snowflake and Azure Data Explorer (ADX).
• Proficiency in data transformation and routing techniques.
• Excellent analytical, debugging, and performance-tuning skills.
• Ability to work independently in a remote and collaborative environment.
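For a sense of the collect-transform-route workflow described above, here is a minimal Vector configuration sketch. All names and paths are illustrative assumptions, not details from this posting, and the console sink is a stand-in for a production destination.

```toml
# Hypothetical sketch of a minimal Vector telemetry pipeline.
# File paths and component names are illustrative assumptions.

# Collect raw telemetry from local log files.
[sources.security_logs]
type = "file"
include = ["/var/log/security/*.log"]

# Parse and enrich each event with VRL (Vector Remap Language).
[transforms.parse_events]
type = "remap"
inputs = ["security_logs"]
source = '''
. = parse_json!(.message)
.ingested_at = now()
'''

# Demo sink only. In practice, events bound for Snowflake or Azure Data
# Explorer are typically staged through object storage or a message queue
# rather than written directly from Vector.
[sinks.debug_out]
type = "console"
inputs = ["parse_events"]
encoding.codec = "json"
```

A Cribl Stream pipeline covers the same collect/transform/route stages with its own source, pipeline-function, and destination configuration.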
If you’re passionate about data observability, telemetry pipelines, and real-time analytics, this role is a perfect fit for you. Join us and help build data pipelines that make security smarter and faster.

Wishing you a wonderful day ahead!

Sincerely,
Kuldeep Choudhary
Kuldeep.c@e-solutionsinc.com
2 N Market St, Suite #400, San Jose, CA 95113
LinkedIn | www.e-solutionsinc.com
USA | CANADA | UK | SINGAPORE | MALAYSIA | INDIA

“Disclaimer: E-Solutions Inc. provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state and local laws. We especially invite women, minorities, veterans, and individuals with disabilities to apply. EEO/AA/M/F/Vet/Disability.”