Technology Partners

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Pacific, Missouri, on a contract-to-hire basis, paying $73.50 - $105 per hour. It requires 3-5+ years of experience in data engineering and expertise in Golang, RESTful APIs, cloud platforms, and Kubernetes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
840
-
🗓️ - Date
May 16, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Pacific, MO
-
🧠 - Skills detailed
#Golang #Datasets #REST (Representational State Transfer) #.Net #Apache Beam #Storage #Apache Kafka #Cloud #Deployment #Data Processing #Data Modeling #Kafka (Apache Kafka) #REST API #Scala #NoSQL #Dataflow #GCP (Google Cloud Platform) #API (Application Programming Interface) #Data Engineering #Data Science #Computer Science #Docker #Azure #Kubernetes #Databases #AWS (Amazon Web Services) #Unit Testing #Data Storage #Security
Role description
• Location: Pacific, Missouri
• Type: Contract to Hire
• Job #46935

Technology Partners is currently seeking a talented Senior Data Engineer (Golang, Cloud, Kubernetes). Do you have experience developing large-scale data APIs using RESTful architecture and deploying containerized applications on cloud platforms like Google Cloud? Let us help you make your next big career move a reality!

What You Will Be Doing:
As a senior member of our data engineering team, you will design and implement distributed analysis capabilities across diverse datasets to accelerate crop system development. You will collaborate with top-tier talent to solve complex technical challenges with real-world impact. Your role involves guiding and mentoring other engineers, exploring and selecting appropriate technology stacks, and leading technical initiatives by communicating your strategic vision. You will bring your expertise to relevant projects and have opportunities to present our innovations at industry conferences.

While there is a strong preference for local candidates, those residing in Austin, Charlotte, Dallas, Denver, Chicago/Lockport, Houston, Lansing, Memphis, or San Antonio will be considered if they are willing to work in the office at least three days per week for the first 60 days.

Key Responsibilities:
• Develop and maintain scalable data APIs following RESTful principles to support large scientific datasets.
• Design, implement, and optimize distributed analysis pipelines using cloud-native technologies.
• Collaborate with software engineers, data scientists, and stakeholders to translate scientific datasets into impactful software products.
• Mentor other engineers on data engineering best practices, algorithms, and modern technology stacks.
• Lead technical initiatives, including evaluating new tools and frameworks suitable for large-scale data processing.
• Contribute to open-source projects and industry conferences by presenting innovative solutions and sharing expertise.
• Ensure robust unit testing and implement test-driven development methodologies across projects.
• Manage containerized deployments using Docker and orchestrate them with Kubernetes clusters.
• Build and support cloud infrastructure on Google Cloud Platform, ensuring scalability and security.
• Model and optimize data storage solutions, including relational and NoSQL databases, for large datasets.

Required Skills & Experience:
• 3-5+ years of experience in data engineering, software development, or related fields.
• Proven experience developing and launching software products or features in Go.
• Extensive experience building data-intensive APIs using RESTful approaches.
• Hands-on experience with stream processing using Apache Kafka.
• Strong familiarity with unit testing and test-driven development methodologies.
• Proficiency in containerization with Docker and deployment orchestration using Kubernetes.
• Experience constructing and maintaining cloud infrastructure on Google Cloud Platform, AWS, or Azure.
• Data modeling expertise for large-scale relational or NoSQL databases.
• Excellent verbal and written communication skills in English.
• Bachelor's degree in Computer Science, Engineering, Data Science, or a related discipline.

Desired Skills & Experience:
• Familiarity with protocol buffers and gRPC for efficient data serialization.
• Experience with Google Cloud Dataflow or Apache Beam for large-scale data processing.
• Background working with scientific datasets, especially in bioinformatics or genomics.
• Knowledge of variant data storage, annotation, and genotype-phenotype correlation.

Pay: $73.50 - $105 per hour

Keywords: Go, cloud data engineering, Kubernetes, REST API, Kafka, Google Cloud Platform, data modeling, bioinformatics, gRPC, containerization

We are interested in every qualified candidate who is eligible to work in the United States. However, we are not able to provide sponsorship at this time or accept candidates who would require a corp-to-corp agreement.

If this position sounds like you, WE SHOULD TALK! Your better future is ready, and we want to put the right tools in your hands to get you there. Let's go!

Looking for more opportunities with Technology Partners? Check out technologypartners.net/jobs (https://technologypartners.net/jobs/)!

All offers of employment at Technology Partners are contingent upon clear results of a thorough background check and drug screening that meet corresponding laws and regulations at the city, state, and federal level. Pay ranges are influenced by candidate qualifications, experience, and role specifics; the actual rate is determined based on skills and market conditions and is subject to change by the employer. Pay negotiations follow all state and federal legal guidelines.