

Data Engineer - Senior
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer, fully remote for more than 6 months, with a salary of $50,000–$100,000 annually. It requires 7+ years of experience, strong Python or Java skills, SQL proficiency, and familiarity with cloud-based storage.
Country
United States
Currency
$ USD
-
Day rate
454.55
-
Date discovered
August 30, 2025
Project duration
More than 6 months
-
Location type
Remote
-
Contract type
W2 Contractor
-
Security clearance
Unknown
-
Location detailed
Tulsa, OK
-
Skills detailed
#Python #Batch #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Modeling #AWS S3 (Amazon Simple Storage Service) #Data Quality #Kafka (Apache Kafka) #Agile #Cloud #Unit Testing #Airflow #EC2 #AWS (Amazon Web Services) #Datasets #Normalization #Scala #Data Engineering #Java #GCP (Google Cloud Platform) #Data Analysis #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #Storage #Data Processing #Data Governance
Role description
What We're Working On
We help global enterprises launch digital products that reach millions of users. Our projects involve massive datasets, complex pipelines, and real-world impact across industries.
What You'll Do
• Join the team as a Senior-Level Data Engineer
• Design, build, and maintain reliable ETL pipelines from the ground up
• Work with large, complex datasets using Python or Java and raw SQL
• Build scalable, efficient data flows and transformations
• Collaborate with data analysts, product managers, and developers to deliver actionable data to stakeholders
• Ensure data quality, consistency, and performance across systems
What We're Looking For
• 7+ years of experience as a Data Engineer
• Strong skills in Python or Java for data processing
• Proficiency in SQL, especially for querying large datasets
• Experience with batch and/or stream data processing pipelines
• Familiarity with cloud-based storage and compute (e.g., AWS S3, EC2, Lambda, GCP Cloud Storage)
• Knowledge of data modeling, normalization, and performance optimization
• Comfort working in agile, collaborative, and fully remote environments
• Fluency in English (spoken and written)
Nice to Have (Not Required)
• Experience with Airflow, Kafka, or similar orchestration/messaging tools
• Exposure to basic data governance or privacy standards
• Unit testing and CI/CD pipelines for data workflows
This job is 100% remote; please ensure you have a comfortable home office setup in your preferred work location.
Salary
Salary range: $50,000–$100,000 annually, with final compensation determined by your qualifications, expertise, experience, and the role's scope.
In addition to competitive pay, we offer a variety of benefits to support your professional and personal growth, including:
• Flexible working hours in a remote environment
• Health insurance (medical and dental) for W2 employees
• 401(k) contribution
• A professional development fund to enhance your skills and knowledge
• 15 days of paid time off annually
• Access to soft-skill development courses to further your career
This is a full-time position requiring a minimum of 40 hours per week, Monday through Friday.
At Lumenalta, we are committed to creating an environment that prioritizes growth, work-life balance, and the diverse needs of our team members.
Ongoing recruitment; no set deadline.