

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 12-month remote contract, requiring 5+ years in Data Engineering. Key skills include AWS, Kubernetes, Python, and data platform optimization. W-2 only; no Corp-to-Corp or 1099.
Country: United States
Currency: $ USD
Day rate: 640
Date discovered: July 10, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Data Security #Kubernetes #PostgreSQL #Vault #Data Engineering #ETL (Extract, Transform, Load) #Deployment #Lambda (AWS Lambda) #Logging #SQS (Simple Queue Service) #Scala #Data Quality #Agile #Spark (Apache Spark) #Data Lakehouse #Data Lake #Java #Containers #Redshift #Big Data #Cloud #API (Application Programming Interface) #RDS (Amazon Relational Database Service) #Python #Git #SQL (Structured Query Language) #AWS (Amazon Web Services) #Terraform #Aurora #Athena #Hadoop #IAM (Identity and Access Management) #S3 (Amazon Simple Storage Service) #SNS (Simple Notification Service) #Kafka (Apache Kafka) #Storage #AWS IAM (AWS Identity and Access Management) #Security #Data Encryption #Debugging
Role description
Location: Remote (must be able to work East Coast hours)
Duration: 12-month contract with strong potential for extension or full-time conversion
W-2 only (no Corp-to-Corp or 1099)
Brooksource is searching for a Subject Matter Expert Data Engineer to join our Fortune 500 Energy & Utilities client. The successful candidate will help develop the company's Data Fabric: an interconnected network of data capabilities and data products designed to deliver data efficiently and at scale. Candidates should have expertise in developing and building data platforms, with demonstrated experience overcoming obstacles and avoiding common pitfalls. They should also be able to optimize and automate delivery to production using the required tech stack, and be adaptable to changing demands and priorities in an Agile development environment.
We are specifically looking for individuals with 5+ years of experience in Data Engineering and/or Software Engineering roles who can provide knowledge and support to our existing engineers.
REQUIRED QUALIFICATIONS:
Must have experience with comparable platform engineering/management solutions:
• Building/optimizing a Data Lakehouse with open table formats (see the Hudi sketch after this list)
• Kubernetes deployments/cluster administration
• Transitioning on-premises big data platforms to scalable cloud-based platforms like AWS
• Distributed systems, microservice architecture, and containers
• Cloud streaming use cases in Big Data ecosystems (e.g., EMR, EKS, Hadoop, Spark, Hudi, Kafka/Kinesis)
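To make the open-table-format point concrete, here is a minimal PySpark sketch of an upsert into an Apache Hudi table on S3. It assumes a Spark session with the Hudi bundle on the classpath; the table name, record key, and bucket are hypothetical, not taken from the client's platform.

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("hudi-lakehouse-sketch")
    # Hudi requires Kryo serialization and its Spark SQL extension.
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
    .getOrCreate()
)

# Hypothetical meter-reading records.
df = spark.createDataFrame(
    [("m1", "2025-07-10T00:00:00Z", 42.0)],
    ["meter_id", "event_ts", "reading"],
)

hudi_options = {
    "hoodie.table.name": "meter_readings",                   # hypothetical table
    "hoodie.datasource.write.recordkey.field": "meter_id",   # record key for upserts
    "hoodie.datasource.write.precombine.field": "event_ts",  # latest record wins
    "hoodie.datasource.write.operation": "upsert",
}

# Upsert into the lakehouse table; Hudi handles indexing and compaction.
df.write.format("hudi").options(**hudi_options).mode("append").save(
    "s3a://example-lake/meter_readings"                      # hypothetical bucket
)

The same write pattern extends to streaming ingestion from Kafka/Kinesis via Spark Structured Streaming.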
Must have experience with the below tech stack:
• GitHub and GitHub Actions
• AWS: IAM, API Gateway, Lambda, Step Functions, Lake Formation, EKS & Kubernetes, Glue (Catalog/ETL/Crawler), Athena, and S3, with strong foundational concepts (object store vs. block store, encryption/decryption, storage tiers, etc.); see the Lambda sketch after this list
• Apache Hudi
• Apache Flink
• PostgreSQL and SQL
• RDS (Relational Database Service)
• Python
• Java
• Terraform Enterprise (must be able to explain what Terraform is used for, understand and explain basic principles such as modules, providers, and functions, and be able to write and debug Terraform)
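As a sketch of how several of these pieces fit together, the following hypothetical Lambda handler is triggered by an S3 event notification and reads the new object with boto3. It assumes an execution role granting s3:GetObject (the IAM and S3 concepts called out above); none of the names come from the client's environment.

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # S3 event notifications deliver one or more records per invocation.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Requires s3:GetObject on this bucket via the Lambda execution role (IAM).
    obj = s3.get_object(Bucket=bucket, Key=key)
    body = obj["Body"].read()

    # With SSE-S3/SSE-KMS enabled, the object is decrypted transparently on read.
    return {"bucket": bucket, "key": key, "size_bytes": len(body)}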
DESIRED QUALIFICATIONS:
• Helm
• Kafka and Kafka Schema Registry
• AWS services: CloudTrail, SNS, SQS, CloudWatch, Step Functions, Aurora, EMR, Redshift, Iceberg
• Secrets management platforms: Vault, AWS Secrets Manager (see the sketch after this list)
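For the secrets-management item, a minimal boto3 sketch of fetching a database credential from AWS Secrets Manager; the secret name and JSON shape are hypothetical (Vault would use its own client, e.g. hvac, instead).

import json
import boto3

client = boto3.client("secretsmanager")

def get_db_credentials(secret_name="prod/data-platform/postgres"):  # hypothetical name
    # Requires secretsmanager:GetSecretValue on this secret via IAM.
    resp = client.get_secret_value(SecretId=secret_name)
    # Assumes the secret stores JSON like {"username": "...", "password": "..."}.
    return json.loads(resp["SecretString"])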
CORE RESPONSIBILITIES AND SOFT SKILLS:
• Provides technical direction, engages the team in discussions on how best to guide/build features on key technical aspects, and is responsible for product tech delivery
• Works closely with the Product Owner and team to align on delivery goals and timing
• Collaborates with architects on key technical decisions for data and the overall solution
• Leads design and implementation of data quality check methods (see the sketch after this list)
• Ensures data security and permissions solutions, including data encryption, user access controls, and logging
• Thinks unconventionally to find the best way to solve a defined use case with fuzzy requirements
• Self-starter mentality (willing to do their own research to solve problems, and able to clearly present findings and discuss why one solution is better than another)
• Thrives in a fail-fast environment involving mini PoCs and an inspect-and-adapt process
• Questioning and improvement mindset (ready to ask why something is currently done the way it is and to suggest alternatives)
• Customer-facing skills (interfacing with stakeholders and other product teams via pairing, troubleshooting support, and debugging issues they encounter with our products)
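As one illustration of the data quality responsibility above, a minimal PySpark sketch of a check that gates a table on null and duplicate keys; the column name and pass criteria are hypothetical.

from pyspark.sql import DataFrame
from pyspark.sql import functions as F

def check_quality(df: DataFrame, key: str = "meter_id") -> dict:
    # Count rows once, then derive null-key and duplicate-key violations.
    total = df.count()
    null_keys = df.filter(F.col(key).isNull()).count()
    duplicate_keys = total - df.dropDuplicates([key]).count()
    return {
        "row_count": total,
        "null_keys": null_keys,
        "duplicate_keys": duplicate_keys,
        # Downstream publishing can be gated on this flag.
        "passed": null_keys == 0 and duplicate_keys == 0,
    }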
Eight Eleven Group provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, national origin, age, sex, citizenship, disability, genetic information, gender, sexual orientation, gender identity, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state, and local laws.