Strategic Staffing Solutions

AWS Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer; the contract length and pay rate are unspecified. Located in Charlotte, NC (Hybrid/Remote), candidates must have 5+ years of software development experience, strong Python and Terraform skills, and experience with AWS services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 5, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #DynamoDB #Dynatrace #Data Engineering #Collibra #Data Catalog #Data Pipeline #IAM (Identity and Access Management) #Lambda (AWS Lambda) #Observability #S3 (Amazon Simple Storage Service) #OpenSearch #Pytest #Python #Scala #API (Application Programming Interface) #Data Marketplace #GitHub #ML (Machine Learning) #Security #AWS (Amazon Web Services) #Terraform #VPC (Virtual Private Cloud) #Metadata #Kafka (Apache Kafka) #Infrastructure as Code (IaC) #Automated Testing
Role description
Data Engineer

Location: Charlotte, NC (Hybrid/Remote)

Work Setting: Preference for hybrid, but open to fully remote; candidates will be required to work 9 AM-5 PM EST with the team. Limited to candidates in the CST and EST time zones.

Role Summary & Description:

The team is building modern, enterprise data platforms that make data easy to discover, understand, trust, and consume at scale. The Data Marketplace is a core capability in this vision, serving as the enterprise gateway for discovering data assets across the enterprise. We are seeking a Senior Software Engineer with a strong background in our tech stack to build and operate the software, data, and metadata foundations of the Data Marketplace. This is a hands-on, individual-contributor role focused on platform needs: scalable services, search and catalog capabilities, and other experiences that improve how employees find and use data.

Responsibilities:
• Design, develop, and maintain Python-based services, Lambda functions, and data pipelines.
• Provision and manage AWS infrastructure using Terraform, following best practices for modularity, security, and maintainability.
• Build and operate AI/ML and search capabilities on Amazon OpenSearch, including vector search, semantic retrieval, and integration with LLM-driven workflows.
• Collaborate with data, platform, and security teams to deliver end-to-end solutions across AWS services (Lambda, API Gateway, Glue, S3, DynamoDB, IAM, VPC, etc.).
• Modernize Dynatrace alarming using Terraform and Python.
• Champion code quality through reviews, automated testing, CI/CD pipelines, and observability.
• Mentor junior engineers, share best practices, and contribute to architectural decisions and technical standards.
• Think unconventionally to find the best solution for a defined use case with fuzzy requirements.
• Bring a self-starter mentality: be willing to do your own research to solve problems, clearly present findings, and engage in conversation about what makes one solution better than another.

Must-Have Tech Stack:
• 5+ years of professional software development experience; must be able to code.
• Python: strong proficiency, including type hints, testing (unittest/pytest), and packaging
• Terraform, specifically for Infrastructure as Code targeting AWS
• GitHub
• Confluence
• Amazon OpenSearch: index design, query DSL, and vector/k-NN search
• AI/ML: embeddings, RAG patterns, foundation models, and ML model integration
• AWS services: DynamoDB, Glue, IAM, S3, API Gateway, Lambda

Nice to Have:
• Collibra, or experience integrating with enterprise data catalog or governance platforms
• Kafka or other event-driven architecture
• AWS services: Lake Formation
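To illustrate the "vector/k-NN search" skill the posting asks for, here is a minimal sketch of building an OpenSearch k-NN query body. The field name `embedding`, the index name, and the vector sizes are illustrative assumptions, not details from the posting; the target index would need that field mapped as `knn_vector` with a matching dimension.

```python
from typing import Any


def build_knn_query(vector: list[float], k: int = 5, field: str = "embedding") -> dict[str, Any]:
    """Build an OpenSearch k-NN query body for approximate nearest-neighbor search.

    The field name defaults to the hypothetical `embedding`; a real index
    mapping would define it as a `knn_vector` of the embedding's dimension.
    """
    return {
        "size": k,
        "query": {
            "knn": {
                field: {
                    "vector": vector,
                    "k": k,
                }
            }
        },
    }


# The body would typically be sent with the opensearch-py client, e.g.:
#   client.search(index="data-marketplace-assets", body=build_knn_query(embedding))
```

In a semantic-retrieval or RAG flow, `vector` would be the embedding of the user's query, and the returned documents would feed an LLM prompt.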
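The "type hints, testing (unittest/pytest)" bar can be sketched with a small, fully annotated helper and a pytest-style test. The function itself is a hypothetical example, not something from the posting.

```python
def normalize_asset_name(raw: str) -> str:
    """Normalize a data-asset name for catalog lookup (illustrative helper):
    trim edges, collapse internal whitespace, lowercase."""
    return " ".join(raw.split()).lower()


# pytest-style test; in a real package this would live under tests/ and run via `pytest`:
def test_normalize_asset_name() -> None:
    assert normalize_asset_name("  Customer   Orders ") == "customer orders"
    assert normalize_asset_name("SALES") == "sales"
```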