Data Engineer - Senior

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer, offering a full-time remote position for over 6 months with a salary range of $50,000 - $100,000. Requires 7+ years of experience, strong skills in Python or Java, and proficiency in SQL.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
454.55
-
🗓️ - Date discovered
August 11, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #SQL (Structured Query Language) #Batch #Data Processing #ETL (Extract, Transform, Load) #Data Modeling #Datasets #EC2 #Scala #Unit Testing #AWS (Amazon Web Services) #AWS S3 (Amazon Simple Storage Service) #Data Quality #S3 (Amazon Simple Storage Service) #Storage #Python #Lambda (AWS Lambda) #Airflow #Agile #Data Analysis #Data Governance #Data Engineering #Kafka (Apache Kafka) #Java #Normalization #Cloud
Role description
What We're Working On
We help global enterprises launch digital products that reach millions of users. Our projects involve massive datasets, complex pipelines, and real-world impact across industries.

What You'll Do
• Join the team as a Senior-Level Data Engineer
• Design, build, and maintain reliable ETL pipelines from the ground up
• Work with large, complex datasets using Python or Java and raw SQL
• Build scalable, efficient data flows and transformations
• Collaborate with data analysts, product managers, and developers to deliver actionable data to stakeholders
• Ensure data quality, consistency, and performance across systems

What We're Looking For
• 7+ years of experience as a Data Engineer
• Strong skills in Python or Java for data processing
• Proficiency in SQL, especially for querying large datasets
• Experience with batch and/or stream data processing pipelines
• Familiarity with cloud-based storage and compute (e.g., AWS S3, EC2, Lambda, GCP Cloud Storage)
• Knowledge of data modeling, normalization, and performance optimization
• Comfortable working in agile, collaborative, fully remote environments
• Fluent in English (spoken and written)

Nice to Have (Not Required)
• Experience with Airflow, Kafka, or similar orchestration/messaging tools
• Exposure to basic data governance or privacy standards
• Unit testing and CI/CD pipelines for data workflows

This job is 100% remote – please ensure you have a comfortable home office setup in your preferred work location.

Salary
Salary range: $50,000 - $100,000 annually, with final compensation determined by your qualifications, expertise, experience, and the role's scope. In addition to competitive pay, we offer a variety of benefits to support your professional and personal growth, including:
• Flexible working hours in a remote environment
• Health insurance (medical and dental) for W2 employees
• 401(k) contribution
• A professional development fund to enhance your skills and knowledge
• 15 days of paid time off annually
• Access to soft-skill development courses to further your career

This is a full-time position requiring a minimum of 40 hours per week, Monday through Friday. At Lumenalta, we are committed to creating an environment that prioritizes growth, work-life balance, and the diverse needs of our team members. Ongoing recruitment – no set deadline.