

Trilyon, Inc.
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a "Python Data Engineer" on a 6-month remote contract, requiring expertise in GCP, BigQuery, Python, and SQL. Candidates should have experience with data pipeline orchestration and CI/CD practices, with a focus on scalable data infrastructure for analytics.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
October 29, 2025
Duration
More than 6 months
-
Location
Remote
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Santa Clara County, CA
-
Skills detailed
#Data Pipeline #SQL (Structured Query Language) #Data Processing #Data Modeling #Data Engineering #GitHub #Cloud #Batch #Dataflow #GCP (Google Cloud Platform) #Apache Airflow #Deployment #Business Analysis #ML (Machine Learning) #BigQuery #Datasets #Data Ingestion #Python #Version Control #Big Data #DevOps #Scala #GIT #GitLab #BI (Business Intelligence) #Programming #Data Science #Airflow #Storage #Libraries
Role description
Currently we are seeking a "Python Data Engineer" for one of our clients, a leading multinational corporation.
Position: Python Data Engineer
Location: Remote
Duration: 6 months
Role Summary
We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.
Key Responsibilities
• Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
• Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence, advanced analytics, and machine learning (a minimal ingestion sketch follows this list).
• Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
• Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.
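For illustration only, the following is a minimal sketch of the kind of batch ingestion step described above, assuming the google-cloud-bigquery Python client library; the project, bucket, and table names are hypothetical placeholders rather than details of the client's environment.

# Minimal batch-load sketch (hypothetical names), assuming the
# google-cloud-bigquery client library is installed and authenticated.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Load CSV files from Cloud Storage into an analytical table.
load_job = client.load_table_from_uri(
    "gs://example-bucket/events/2025-10-29/*.csv",  # hypothetical bucket and path
    "example-project.analytics.daily_events",       # hypothetical destination table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows")

In practice, a step like this would be parameterized and scheduled by an orchestrator; see the DAG sketch after the requirements list below.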
Required Skills and Experience
• Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
◦ BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment.
◦ Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
• Programming & Querying:
◦ Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
◦ SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
• Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar); a minimal DAG sketch follows this list.
• DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.
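For illustration only, a minimal sketch of how such a pipeline might be orchestrated, assuming Apache Airflow 2.x; the DAG ID, task IDs, and callables are hypothetical placeholders.

# Minimal Airflow 2.x DAG sketch (hypothetical DAG and task names).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_bigquery(**context):
    # Placeholder: run a BigQuery load job like the ingestion sketch above.
    pass


def check_row_counts(**context):
    # Placeholder: simple data-quality check on the loaded table.
    pass


with DAG(
    dag_id="daily_events_ingest",  # hypothetical DAG ID
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="load_to_bigquery", python_callable=load_to_bigquery)
    quality_check = PythonOperator(task_id="check_row_counts", python_callable=check_row_counts)

    ingest >> quality_check  # run the quality check only after the load succeeds

In a CI/CD setup, a repository like this would typically also run linting and unit tests on each merge or pull request (for example via GitLab CI or GitHub Actions) before the DAG is deployed.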
Equal Employment Opportunity
Trilyon is an Equal Opportunity Employer, committed to fairness and respect for all individuals. We value diversity in age, disability, ethnicity, gender, gender identity, religion, and sexual orientation, believing it drives innovation and better service. Employment decisions are made impartially, without regard to any protected characteristic under federal, state, or local law. Our diverse team drives innovation, competitiveness, and creativity, enhancing our ability to effectively serve our clients and communities. This commitment to diversity makes us stronger and more adaptable.
Warm Regards,
Signature






