Senior AWS Cloud Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Cloud Engineer in McKinney, TX, on a long-term contract. It requires 3+ years of experience with Python and AWS, expertise in building cloud applications and CI/CD, and strong knowledge of AWS services. AWS certification is a plus.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 26, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Hybrid
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
McKinney, TX
🧠 - Skills detailed
#Deployment #Security #IAM (Identity and Access Management) #Athena #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #SQS (Simple Queue Service) #Data Management #ML (Machine Learning) #Web Services #Lambda (AWS Lambda) #GIT #AWS IAM (AWS Identity and Access Management) #Apache Airflow #Scripting #Databases #Database Performance #AWS Machine Learning #Cloud #ECR (Elastic Container Registry) #SQL (Structured Query Language) #EC2 #Python #Data Pipeline #SNS (Simple Notification Service) #Airflow #Computer Science #Programming #Redshift #Agile
Role description
Sr AWS Cloud Engineer
McKinney, TX (Hybrid, 3 days a week)
F2F interview | Long-term contract
Contact: santosh@ebusinesstechcorp.com
What You Can Bring:
• Bachelor's degree in Computer Science/Engineering, Information Systems, or equivalent work experience in a technical position.
• 3+ years of experience in software engineering, primarily using Python and AWS architectures.
• Proven experience building and deploying cloud applications, with an understanding of CI/CD processes.
• Strong coding and scripting experience with Python and SQL.
• Experience implementing Airflow on AWS (MWAA) is a plus.
• Prior domain experience in Life Insurance, Annuity, or Financial Services is a plus.
• AWS certifications are considered a strong plus.
• Excellent verbal and written communication skills.
• Ability to work independently and as part of a team.
• Strong programming knowledge and experience with Python.
• Specific working experience with primary AWS services, including Lambda, Glue, EventBridge, SQS, AWS CLI, S3, and Redshift.
• Experience with a variety of services such as Athena, EC2, EBS, CloudWatch, CloudTrail, ECS, ECR, EMR, IAM, SNS, SES, etc.
• Extensive hands-on experience, including design and implementation, across a broad range of database services on Amazon Web Services (AWS).
• Solid understanding of the various data management and data pipeline tools available in AWS.
• Experience building and deploying applications using the AWS CDK.
• Experience with Airflow, including administering and configuring Amazon Managed Workflows for Apache Airflow (MWAA).
• Experience creating and deploying CloudFormation Templates (CFTs).
• Experience with lifecycle management of S3 buckets.
• Clear understanding of cloud database security, including AWS IAM users and access, IAM roles and policies, federated users, and permissions.
• Good understanding of AWS encryption methodologies and AWS KMS services.
• Experience with database performance testing and capacity planning.
• Working knowledge and experience with the software development life cycle (SDLC) and agile/iterative methodologies.
• Knowledge of AWS machine learning offerings, knowledge bases, and Bedrock is a plus.
• Application development and deployment experience with AWS, CI/CD pipelines, Git, and related technologies.