Avacend Inc

Cloud Cost Optimization

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Cost Optimization specialist on a contract from 11/17/2025 to 4/25/2026, remote (EST preferred). It requires a degree in Computer Science or Data Science, 2+ years in cloud operations, expertise in AWS, GCP, and Azure, and proficiency in Python, ETL, and ML.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
๐Ÿ—“๏ธ - Date
November 4, 2025
🕒 - Duration
More than 6 months
-
๐Ÿ๏ธ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Bash #Spark (Apache Spark) #Oracle #AWS Glue #Apache Airflow #DevOps #Data Pipeline #ETL (Extract, Transform, Load) #Flask #ML (Machine Learning) #Data Storage #PySpark #GitHub #AWS (Amazon Web Services) #Athena #Data Science #Version Control #Storage #Anomaly Detection #Computer Science #Documentation #Cloud #Data Lake #Jira #Snowflake #Libraries #AI (Artificial Intelligence) #Scripting #Public Cloud #Angular #Golang #Programming #TensorFlow #Python #PyTorch #Azure #GIT #Airflow #Agile #Forecasting #Jenkins #Scala #Security #GitLab
Role description
Duration: Potential contract-to-hire
Start/End Dates: 11/17/2025 - 4/25/2026
Location: Remote option; EST time zone preferred

Job Description:

What You Will Do:
• Design and implement AI agents to monitor and optimize cloud resources based on findings and recommendations from Cloud Service Providers.
• Develop predictive models for drift detection, cost anomaly detection, and forecasting of public cloud resources and spend.
• Automate operational workflows using machine learning and intelligent scripting.
• Integrate AI-driven insights with Cloud Service Providers such as AWS, GCP, and Azure, and with existing data and tools.
• Conduct anomaly detection for security, cost optimization, and performance analytics.
• Design, build, and maintain scalable ETL pipelines using AWS Glue and other cloud-native services.
• Use AWS Athena for interactive querying of data stored in data lakes.
• Manage and optimize data storage and processing using the Snowflake cloud data platform.
• Orchestrate complex workflows and data pipelines using Apache Airflow DAGs.
• Continuously evaluate emerging AI technologies and tools for operational improvements.
• Maintain documentation and best practices for AI/ML integration in cloud systems.

Our Minimum Requirements Include:
• Bachelor's or Master's degree in Computer Science, Data Science, or a related technical field, or equivalent experience.
• Proven ability to build and deploy ML models, with at least 2 years focused on cloud operations.
• Solid knowledge of cloud technologies (AWS, GCP, Azure, OCI).
• Experience with Python, PySpark, and ML libraries such as PyTorch, TensorFlow, or scikit-learn.
• Comfort working with streaming data, APIs, and telemetry systems.
• Experience with AWS Glue ETL, AWS Athena, Snowflake, and Apache Airflow DAGs.
• Strong communication and cross-functional collaboration skills.
• Experience with Agile and DevOps operating models, including project tracking tools (e.g., Jira), Git (or any version control system), and CI/CD systems (e.g., GitLab, GitHub Actions, Jenkins).
• Proficiency in general-purpose programming languages (Python, Golang, Bash) and related development platforms and technologies.

Preferred Qualifications:
• Understanding of the cloud technologies and services of one or more providers, including AWS, GCP, Azure, Oracle, and Alibaba.
• An established record of leading technical initiatives, delivering results, and a commitment to fostering a supportive work environment.
• Hard-working and dedicated to providing quality support for your customers.
• Full-stack development experience with Angular for frontend and Flask for backend application development.
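To give a flavor of the cost anomaly detection work described in the responsibilities above, here is a minimal illustrative sketch, not the employer's actual implementation: it flags daily cloud-spend outliers with a trailing-window z-score. The function name, window size, and threshold are all assumptions for demonstration; a production system would draw spend data from provider billing APIs rather than a hard-coded list.

```python
from statistics import mean, stdev

def flag_cost_anomalies(daily_spend, window=7, threshold=3.0):
    """Return indices of days whose spend deviates more than `threshold`
    standard deviations from the trailing `window`-day mean.
    Illustrative sketch only; parameters are arbitrary assumptions."""
    anomalies = []
    for i in range(window, len(daily_spend)):
        baseline = daily_spend[i - window:i]        # trailing window
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(daily_spend[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Example: steady ~$100/day spend with a spike on day 9
spend = [100, 102, 98, 101, 99, 100, 103, 97, 100, 250, 101]
print(flag_cost_anomalies(spend))  # [9]
```

In practice the same idea generalizes to per-service or per-account series, and the static threshold would typically be replaced by a learned forecasting model of the kind the role calls for.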