

Bayside Solutions
Data Engineer
This role is for a Data Engineer on a W2 contract, remote (based out of Cupertino, CA), with a salary range of $124,800 - $145,600 per year. It requires 5+ years of data engineering experience and expertise in Astronomer.io, Apache Airflow, and Kubernetes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
October 3, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Security #SaaS (Software as a Service) #Compliance #CLI (Command-Line Interface) #ML (Machine Learning) #Apache Airflow #dbt (data build tool) #Datasets #Azure #Metadata #Data Engineering #Airflow #Data Quality #Kubernetes #API (Application Programming Interface) #Scala #Data Lineage #Observability #Automation #GCP (Google Cloud Platform) #Cloud #AWS (Amazon Web Services) #Deployment #Visualization #Logging
Role description
Data Engineer
W2 Contract
Salary Range: $124,800 - $145,600 per year
Location: Cupertino, CA - Remote Role
Duties and Responsibilities:
• Design, deploy, and manage Astronomer.io (Astro) environments, including organization/workspace setup, Deployments, queues, executors, secrets, and runtime upgrades.
• Utilize the Astronomer CLI and Astro Platform API for local development, automation, and CI/CD workflows (astro dev, astro deploy, service accounts, and API tokens).
• Implement and manage workflow orchestration using Apache Airflow on Astronomer, including DAG optimization, deferrable operators, retries, backfills, and task dependencies.
• Integrate OpenLineage on Astronomer to track end-to-end data lineage and enable visibility across pipelines, datasets, and ML workflows.
• Set up Astro Observe to define data products, manage freshness SLAs, and configure automated alerts (Slack, PagerDuty, email) without modifying DAG code.
• Deploy and manage Astronomer in hybrid mode (SaaS control plane + customer data plane) to meet enterprise security, compliance, and privacy requirements.
• Collaborate closely with cross-functional teams to maintain high data quality, reliable pipelines, and robust observability using Astro logs, metrics, and dashboards.
• Troubleshoot and optimize Airflow DAGs, containerized workloads, and Astronomer Deployments for scalability, cost-efficiency, and high availability.
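For context on the CI/CD duty above (deploying via the Astro CLI with API tokens), a minimal GitHub Actions sketch might look like the following. The workflow name, secret name, and deployment ID placeholder are hypothetical; only the `astro deploy` command and `ASTRO_API_TOKEN` environment variable come from the Astro CLI itself:

```yaml
name: deploy-astro            # hypothetical workflow name
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      # Deployment or Workspace API token stored as a repo secret (name is an assumption)
      ASTRO_API_TOKEN: ${{ secrets.ASTRO_API_TOKEN }}
    steps:
      - uses: actions/checkout@v4
      - name: Install Astro CLI
        run: curl -sSL install.astronomer.io | sudo bash -s
      - name: Deploy project to Astro
        run: astro deploy <your-deployment-id>   # replace with the target Deployment ID
```

Local development would follow the same pattern with `astro dev start` to run the project in containers before pushing.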
Requirements and Qualifications:
• 5+ years of data engineering experience designing and managing large-scale, distributed data systems.
• Expert-level experience with Astronomer.io (Astro):
  • Hands-on with Astro Deployments, runtime management, executors, and queues.
  • Proficiency with the Astro CLI and Astro Platform API for automation and CI/CD.
  • Experience configuring Astro Observe, data product definitions, SLAs, and alerts.
• Strong knowledge of OpenLineage integration with Astronomer for data and ML lineage tracking.
• Deep expertise in Apache Airflow on Astronomer: DAG authoring, sensors/deferrables, retries, and provider integrations.
• Solid understanding of Kubernetes for Astronomer data-plane operations and scaling containerized data applications.
• Proven ability to design observability strategies for Airflow pipelines using metrics, tracing, and logging.
Preferred Qualifications:
• Experience integrating Astronomer with metadata platforms like Marquez, DataHub, or Atlan for lineage visualization.
• Familiarity with dbt on Airflow using Astronomer Cosmos.
• Prior experience deploying Astronomer in regulated environments with strict compliance and governance controls.
• Knowledge of cloud-native data ecosystems (AWS/GCP/Azure) and secure, multi-region Astronomer setups.
Desired Skills and Experience
Astronomer.io, Astro environments, organization/workspace setup, Deployments, queues, executors, secrets, runtime upgrades, Astronomer CLI, Astro Platform API, local development, automation, CI/CD workflows, astro dev, astro deploy, service accounts, API tokens, Apache Airflow, workflow orchestration, DAG optimization, deferrable operators, retries, backfills, task dependencies, OpenLineage, data lineage, ML workflows, Astro Observe, data products, freshness SLAs, automated alerts, Slack, PagerDuty, hybrid deployment, SaaS control plane, customer data plane, enterprise security, compliance, privacy requirements, data quality, observability, Astro logs, metrics, dashboards, containerized workloads, scalability, cost-efficiency, high availability, distributed data systems, runtime management, DAG authoring, sensors, deferrables, provider integrations, Kubernetes, data-plane operations, tracing, logging, Marquez, DataHub, Atlan, DBT, Cosmos, regulated environments, governance controls, cloud-native data ecosystems, AWS, GCP, Azure, multi-region setups
Bayside Solutions, Inc. is not able to sponsor any candidates at this time. Additionally, candidates for this position must qualify as W2 candidates.
Bayside Solutions, Inc. may collect your personal information during the position application process. Please reference Bayside Solutions, Inc.'s CCPA Privacy Policy at www.baysidesolutions.com.