Allure Consultant

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract of unspecified length, offering a day rate of $640 USD. Required skills include GCP expertise, Python, SQL, data pipeline orchestration, and DevOps experience. Industry experience in big data and analytics is essential.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
640
🗓️ - Date
October 28, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Programming #BI (Business Intelligence) #GitLab #Cloud #Dataflow #SQL (Structured Query Language) #DevOps #Apache Airflow #Libraries #Deployment #Scala #GCP (Google Cloud Platform) #Storage #Data Processing #Datasets #Data Pipeline #GitHub #GIT #BigQuery #Batch #Data Science #Data Engineering #Version Control #Big Data #Data Modeling #ML (Machine Learning) #Python #Airflow #Data Ingestion #Business Analysis
Role description
Role Summary
We are looking for a highly skilled and motivated Data Engineer to join our team. The ideal candidate will be responsible for designing, building, and maintaining scalable data infrastructure that drives business intelligence, advanced analytics, and machine learning initiatives. You must be comfortable working autonomously, navigating complex challenges, and driving projects to successful completion in a dynamic cloud environment.

Key Responsibilities
• Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
• Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence, advanced analytics, and machine learning.
• Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics.
• Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.

Required Skills and Experience
• Cloud Platform Expertise (GCP Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
  • BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment (see the first sketch at the end of this listing).
  • Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
• Programming & Querying:
  • Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
  • SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
• Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar); see the second sketch at the end of this listing.
• DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.

Skills: Cloud Dataflow/Beam, cloud platform expertise (GCP focus), DevOps/CI/CD, data pipeline orchestration, SQL, Pub/Sub, Cloud Storage
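
To give a concrete flavor of the BigQuery and Python work described above, here is a minimal, hypothetical sketch that runs a parameterized aggregation with the official google-cloud-bigquery client. The project, dataset, table, and column names are illustrative placeholders, not details from this posting.

```python
# Hypothetical example only: project, dataset, table, and column names are
# placeholders, not details from this posting.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

# Parameterized daily aggregation over a (hypothetical) events table.
query = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `example-project.analytics.events`
    WHERE event_date BETWEEN @start_date AND @end_date
    GROUP BY event_date
    ORDER BY event_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2025-10-01"),
        bigquery.ScalarQueryParameter("end_date", "DATE", "2025-10-28"),
    ]
)

# Run the query and print one row per day.
for row in client.query(query, job_config=job_config).result():
    print(row.event_date, row.event_count)
```

Using query parameters rather than string formatting is the usual way to keep queries like this safe and cache-friendly; the same pattern scales to the performance-tuning work the role calls out.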
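
Likewise, a minimal sketch of the pipeline orchestration mentioned in the requirements: a daily Apache Airflow DAG that appends a Cloud Storage export into a BigQuery table. The DAG id, bucket, and table names are assumptions for illustration only.

```python
# Hypothetical example only: the DAG id, bucket, and table names are
# placeholders, not details from this posting.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Append the day's newline-delimited JSON export from Cloud Storage
    # into a BigQuery analytics table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events_to_bigquery",
        bucket="example-raw-exports",
        source_objects=["events/{{ ds }}/*.json"],
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.analytics.events",
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )
```

A DAG like this could typically run unchanged on Cloud Composer, GCP's managed Airflow service, which is why the posting lists the two tools together.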