

Sr Cloud Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Cloud Data Engineer on a long-term W2 contract in San Francisco, CA, offering competitive pay. Key skills include ETL processes, big data technologies, and cloud experience (Azure/AWS). Proficiency in data pipeline orchestration and real-time analytics is essential.
Country
United States
Currency
$ USD
Day rate
520
Date discovered
August 9, 2025
Project duration
Unknown
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
San Francisco, CA
Skills detailed
#ETL (Extract, Transform, Load) #Databases #Hadoop #Kafka (Apache Kafka) #NoSQL #Terraform #Automation #Cloud #DevOps #Data Pipeline #Data Warehouse #Data Ingestion #Data Science #Data Lake #Big Data #Airflow #Luigi #Data Engineering #Datasets #AWS (Amazon Web Services) #Scala #Spark (Apache Spark) #Infrastructure as Code (IaC) #Azure #ML (Machine Learning) #Deployment
Role description
Job Title: Sr Cloud Data Engineer
Client: Workato
Location: San Francisco, CA
Employment Type: Long-term W2 Contract
Job Summary:
We are seeking a skilled Cloud Data Engineer with expertise in designing, developing, and optimizing large-scale data pipelines and analytics solutions on Azure and/or AWS. The ideal candidate will have strong experience with ETL processes, big data technologies, and real-time analytics, as well as proficiency in cloud infrastructure setup and management.
Key Responsibilities:
• Design, develop, and maintain scalable ETL pipelines and data workflows using modern tools and frameworks.
• Work with Azure Data Services and/or AWS Data Services to implement cloud-based data solutions.
• Build and orchestrate data pipelines with Luigi, Airflow, or similar workflow management tools (see the sketch after this list).
• Integrate, process, and analyze large datasets using Hadoop, Spark, and other big data technologies.
• Develop and maintain data lakes, data warehouses, and real-time analytics solutions.
• Collaborate with data scientists, analysts, and business teams to deliver reliable, high-performance data solutions.
• Implement cloud infrastructure best practices, ensuring scalability, performance, and cost efficiency.
• Monitor, troubleshoot, and optimize data workflows for quality and efficiency.
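For illustration only, here is a minimal sketch of the kind of workflow orchestration referenced above, assuming Apache Airflow 2.x; the DAG name, task names, and placeholder functions are hypothetical and not part of the client's actual stack.

```python
# Illustrative only: a minimal Airflow 2.x DAG with hypothetical task names,
# sketching the extract -> transform -> load ordering described above.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw records from a source system.
    print("extracting source data")


def transform():
    # Placeholder: clean and reshape the extracted records.
    print("transforming data")


def load():
    # Placeholder: write the transformed records to the warehouse.
    print("loading into the warehouse")


with DAG(
    dag_id="example_etl_pipeline",        # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies define the pipeline order.
    t_extract >> t_transform >> t_load
```

The same ordering could equally be expressed in Luigi or another orchestrator named above.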
Required Skills & Qualifications:
• Proven experience as a Data Engineer or similar role in cloud environments (Azure and/or AWS).
• Strong expertise in ETL processes, data ingestion, and data transformation.
• Proficiency with big data tools (Hadoop, Spark, Hive, etc.).
• Experience building and managing data pipelines and workflow orchestration using Luigi, Airflow, etc.
• Hands-on experience with data lakes, data warehouses, and real-time streaming frameworks such as Kafka and Kinesis (see the streaming sketch after this list).
• Solid understanding of cloud infrastructure and deployment automation.
• Knowledge of relational and NoSQL databases.
• Strong problem-solving skills and ability to work in cross-functional teams.
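As a point of reference for the streaming requirement, the following is a minimal sketch of real-time ingestion with Spark Structured Streaming and Kafka, assuming PySpark 3.x with the spark-sql-kafka connector available; the broker address, topic name, and storage paths are hypothetical.

```python
# Illustrative only: read a Kafka topic with Spark Structured Streaming and
# land it in a data-lake path; broker, topic, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_stream_example").getOrCreate()

# Kafka exposes key/value as binary columns; cast the value to a string here.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "events")                       # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

# Write the raw stream as Parquet with a checkpoint so offsets survive restarts.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://example-lake/raw/events/")                 # hypothetical path
    .option("checkpointLocation", "s3a://example-lake/_checkpoints/events/")
    .start()
)

query.awaitTermination()
```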
Nice to Have:
• Experience with DevOps for data engineering (CI/CD pipelines; Infrastructure as Code with Terraform or CloudFormation; see the sketch at the end of this posting).
• Familiarity with machine learning data pipelines.
• Exposure to multi-cloud environments.
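For the Infrastructure-as-Code item above, a minimal Python sketch follows. It uses the AWS CDK (which synthesizes CloudFormation, one of the tools named) rather than Terraform, assumes aws-cdk-lib v2, and the stack, construct, and bucket names are hypothetical.

```python
# Illustrative only: define a versioned, encrypted S3 bucket (a raw data-lake
# zone) as code with the AWS CDK; stack and construct names are hypothetical.
import aws_cdk as cdk
import aws_cdk.aws_s3 as s3
from constructs import Construct


class DataLakeStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Raw-zone bucket: versioned for recoverability, server-side encrypted.
        s3.Bucket(
            self,
            "RawZoneBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
        )


app = cdk.App()
DataLakeStack(app, "ExampleDataLakeStack")
app.synth()
```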