

VBeyond Corporation
AWS Data Engineer - Bedrock (W2)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Data Engineer with Bedrock in Chicago, IL, on a long-term contract. Key skills include Python, PySpark, AWS, and big data experience. An engineering degree and technical certifications are preferred.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: November 6, 2025
Duration: Unknown
Location: Hybrid
Contract: W2 Contractor
Security: Unknown
Location detailed: Chicago, IL
Skills detailed:
#Docker #PostgreSQL #Scala #Knowledge Graph #Big Data #NoSQL #Kafka (Apache Kafka) #SQL (Structured Query Language) #Redis #Lambda (AWS Lambda) #Data Engineering #Data Pipeline #Elasticsearch #Spark (Apache Spark) #GIT #Athena #PySpark #Datasets #Airflow #Agile #AWS (Amazon Web Services) #IAM (Identity and Access Management) #Python #Databases #Complex Queries #Terraform
Role description
Job Description
Job Title: Senior AWS Data Engineer with Bedrock
Location: Chicago, IL (3 days onsite/2 days remote/week)
Employment Type: Contract (Long-Term)
Mandatory skills:
• Python, PySpark, AWS & Bedrock (a brief illustrative sketch follows the skills lists below)
Good-to-have skills:
• EMR, Spark, Kafka/Kinesis
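For context only (this is not part of the posting), here is a minimal sketch of how the mandatory skills fit together: a PySpark job that enriches records with output from an Amazon Bedrock model via boto3. The bucket paths, column names, model ID, and prompt are illustrative assumptions, not details from the role.

import json
import boto3
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("bedrock-enrichment").getOrCreate()

def summarize(text):
    # boto3 clients are not picklable, so the client is created inside the UDF
    # rather than on the driver. Payload follows the Anthropic messages format
    # used by Claude models on Bedrock (an assumption); a production job would
    # batch calls and handle throttling and retries.
    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": f"Summarize: {text}"}],
    })
    resp = client.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model ID
        body=body,
    )
    return json.loads(resp["body"].read())["content"][0]["text"]

summarize_udf = udf(summarize, StringType())

df = spark.read.parquet("s3://example-bucket/notes/")  # placeholder input path
df.withColumn("summary", summarize_udf(df["note_text"])) \
  .write.mode("overwrite").parquet("s3://example-bucket/notes_enriched/")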
Our vision:
To inspire new possibilities for the health ecosystem with technology and human ingenuity.
What is in it for you?
You will work as a Data Engineer with deep expertise in Python, AWS, big data ecosystems, and SQL/NoSQL technologies, driving scalable, real-time data solutions with CI/CD and stream-processing frameworks.
Responsibilities:
• Proficient developer in multiple languages (Python is a must), with the ability to quickly learn new ones.
• Expertise in SQL: complex queries, relational databases (preferably PostgreSQL), and NoSQL databases such as Redis and Elasticsearch.
• Extensive big data experience, including EMR, Spark, Kafka/Kinesis, and optimizing data pipelines, architectures, and datasets.
• AWS expert with hands-on experience in Lambda, Glue, Athena, Kinesis, IAM, EMR/PySpark, and Docker.
• Proficient in CI/CD development using Git, Terraform, and agile methodologies.
• Comfortable with stream-processing systems (Storm, Spark Streaming) and workflow management tools (Airflow); an illustrative streaming sketch follows this list.
• Exposure to knowledge graph technologies (Graph DB, OWL, SPARQL) is a plus.
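Illustrative only: a minimal Spark Structured Streaming sketch of the kind of real-time pipeline described above, reading JSON events from Kafka and landing them as partitioned Parquet on S3. The broker address, topic, schema, and paths are assumptions, and the spark-sql-kafka connector must be on the classpath.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, to_date
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-ingest").getOrCreate()

# Hypothetical event schema.
schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
    .withColumn("event_date", to_date(col("event_time")))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "s3://example-bucket/events/")  # placeholder sink
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
    .partitionBy("event_date")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()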
Educational Qualifications:
• Engineering Degree (BE/ME/BTech/MTech/BSc/MSc).
• Technical certification in multiple technologies is desirable.






