JoCo

Senior AWS Cloud Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Cloud Data Engineer with a contract length of "unknown," offering a pay rate of "unknown." Key skills include AWS, Apache Iceberg, Snowflake, Python, and SQL. A Bachelor's degree and strong AWS data engineering experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Dallas-Fort Worth Metroplex
-
🧠 - Skills detailed
#Snowflake #Compliance #Data Lakehouse #API (Application Programming Interface) #Data Quality #Schema Design #S3 (Amazon Simple Storage Service) #Infrastructure as Code (IaC) #Lambda (AWS Lambda) #Data Integrity #Data Access #Spark (Apache Spark) #Scala #Big Data #Computer Science #Monitoring #Cloud #Data Modeling #Airflow #ETL (Extract, Transform, Load) #Data Lake #Documentation #IAM (Identity and Access Management) #Java #Hadoop #AWS (Amazon Web Services) #Apache Iceberg #Data Engineering #Python #SQL (Structured Query Language) #Terraform #DevOps #Data Pipeline
Role description
What is the position?

The IT Engineer will be responsible for designing and developing modern data lakehouse solutions within the AWS ecosystem, with a strong focus on Apache Iceberg and Snowflake. This role involves building scalable data platforms, developing secure data interfaces, and ensuring performance, cost efficiency, and data integrity across cloud-based systems.

What are the responsibilities?

As an IT Engineer, you will:
• Architect and implement AWS-based lakehouse solutions using Amazon S3, Apache Iceberg, and Snowflake
• Build and automate ETL/ELT pipelines using native AWS services (Glue, Lambda, Step Functions, etc.)
• Develop and manage Apache Iceberg tables, including schema design and optimization
• Design and support Snowflake data models, pipelines, and analytics use cases
• Create secure, scalable APIs for data access using AWS tools (API Gateway, Lambda, IAM)
• Optimize performance and cost across data platforms and pipelines
• Ensure data quality, integrity, and compliance through validation and monitoring
• Collaborate with cross-functional teams to deliver data-driven solutions
• Provide technical support and maintain documentation for data systems

What are the requirements?

• Bachelor’s degree in Computer Science, Information Technology, or a related field
• Strong experience in AWS-based data engineering
• Proficiency in Python, Java, or Scala
• Advanced SQL and data modeling experience (including Snowflake)
• Experience with AWS services such as S3, Glue, Lambda, API Gateway, and IAM
• Hands-on experience with Apache Iceberg and big data technologies (e.g., Spark, Hadoop)
• Experience building data pipelines and using orchestration tools (e.g., Airflow or AWS-native tools)
• Experience with API development and DevOps/IaC practices (e.g., Terraform)
• Strong analytical and communication skills
• Preferred: Experience with Snowflake data engineering and analytics architectures; AWS or other cloud/data engineering certifications

You would be really happy here if:
• You can be counted on in crucial times, bringing strong focus and completing projects successfully and efficiently.
• You know how to evaluate problems and develop appropriate solutions.

JoCo is an Equal Opportunity Employer. We are committed to providing a diverse and inclusive workplace and do not discriminate on the basis of race, color, religion, gender, gender identity, sexual orientation, national origin, disability, age, or any other characteristic protected by applicable law. All employment decisions are made based on qualifications, merit, and business needs.