Data Engineer - Senior

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer, offering a full-time remote contract of more than 6 months at $50,000 - $100,000 annually. Key requirements include 7+ years of data engineering experience, proficiency in Python or Java, and SQL expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
454.55 (rounded; see the derivation sketch after the role description)
-
🗓️ - Date discovered
August 4, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Tulsa, OK
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Java #Scala #GCP (Google Cloud Platform) #Normalization #Agile #Lambda (AWS Lambda) #AWS S3 (Amazon Simple Storage Service) #Data Engineering #Data Governance #SQL (Structured Query Language) #Data Modeling #Storage #Batch #Data Analysis #Kafka (Apache Kafka) #AWS (Amazon Web Services) #Cloud #EC2 #Unit Testing #Airflow #Data Processing #Datasets #Data Quality #Python #ETL (Extract, Transform, Load)
Role description
What We're Working On
We help global enterprises launch digital products that reach millions of users. Our projects involve massive datasets, complex pipelines, and real-world impact across industries.

What You'll Do
• Join the team as a Senior-Level Data Engineer
• Design, build, and maintain reliable ETL pipelines from the ground up
• Work with large, complex datasets using Python or Java and raw SQL
• Build scalable, efficient data flows and transformations
• Collaborate with data analysts, product managers, and developers to deliver actionable data to stakeholders
• Ensure data quality, consistency, and performance across systems

What We're Looking For
• 7+ years of experience as a Data Engineer
• Strong skills in Python or Java for data processing
• Proficiency in SQL, especially for querying large datasets
• Experience with batch and/or stream data processing pipelines
• Familiarity with cloud-based storage and compute (e.g., AWS S3, EC2, Lambda, GCP Cloud Storage)
• Knowledge of data modeling, normalization, and performance optimization
• Comfort working in agile, collaborative, and fully remote environments
• Fluency in English (spoken and written)

Nice to Have (Not Required)
• Experience with Airflow, Kafka, or similar orchestration/messaging tools
• Exposure to basic data governance or privacy standards
• Unit testing and CI/CD pipelines for data workflows

This job is 100% remote; please ensure you have a comfortable home office setup in your preferred work location.

Salary
Salary range: $50,000 - $100,000 annually, with final compensation determined by your qualifications, expertise, experience, and the role's scope. In addition to competitive pay, we offer a variety of benefits to support your professional and personal growth, including:
• Flexible working hours in a remote environment
• Health insurance (medical and dental) for W2 employees
• 401(k) contribution
• A professional development fund to enhance your skills and knowledge
• 15 days of paid time off annually
• Access to soft-skill development courses to further your career

This is a full-time position requiring a minimum of 40 hours per week, Monday through Friday. At Lumenalta, we are committed to creating an environment that prioritizes growth, work-life balance, and the diverse needs of our team members. Ongoing recruitment; no set deadline.
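A note on the day rate shown in the listing header: the unrounded figure (454.5454545455) is exactly the top of the salary band divided over a 220-day working year. The Python sketch below reproduces that arithmetic; the $100,000 figure comes from the posting, while 220 working days per year is an assumption, not something the listing states.

```python
# Minimal sketch of how the listed day rate appears to be derived.
# Assumption (not stated in the posting): 220 billable days per year,
# applied to the top of the annual salary band.

ANNUAL_SALARY_TOP = 100_000    # upper end of the posted range, USD
WORKING_DAYS_PER_YEAR = 220    # assumed: ~52 weeks * 5 days, minus holidays/PTO

day_rate = ANNUAL_SALARY_TOP / WORKING_DAYS_PER_YEAR
print(day_rate)                  # 454.54545454545456 (matches the raw listing value)
print(f"${day_rate:,.2f}/day")   # $454.55/day
```

Because the division reproduces the listed decimal exactly, this convention seems the most plausible explanation, but it remains an inference rather than a stated fact.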