

Data Engineer/Backend Engineer Remote
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer/Backend Engineer position, offering a 12+ month contract at $60/hr. Remote work is available. Key skills include AWS-EMR, Python, Spark, and SQL. Requires 3+ years of experience in data engineering or backend development.
Country: United States
Currency: $ USD
Day rate: $480
Date discovered: June 25, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Los Angeles, CA
Skills detailed: #Spark (Apache Spark) #Unix #NoSQL #AWS (Amazon Web Services) #Apache Spark #GCP (Google Cloud Platform) #dbt (data build tool) #Airflow #Agile #Leadership #Linux #Python #Java #Databases #SQL (Structured Query Language) #Migration #GIT #Debugging #ETL (Extract, Transform, Load) #Cloud #Data Engineering
Role description
Please contact: To discuss this opportunity, reach out to our Talent Acquisition Specialist Abhinav Chakraborty at Abhinav.Chakraborty@generistek.com or by phone at 630-576-1925.
We have a contract Data Engineer/Backend Engineer (Remote) role for our client in Los Angeles, CA. Please let me know if you or any of your friends would be interested in this position.
Position Details:
Data engineer/Backend Engineer Remote-Los Angeles CA
Location : Remote
Project Duration : 12+ Months Contract
Pay Rate : $60/hr. on W2
Must-have skills/qualifications (technical, soft skills, certifications, tools):
• Technical: AWS-EMR, Airflow, dbt, GCP, BigQuery, BQ-ETL, Python, Spark, Java, Git, Unix/Linux, CI/CD tools
• Soft skills: communication, proactive attitude, quick learner
• Ideal experience level (years, leadership, industries): 3+ years
• Preferred industries or companies for background: Internet, Technology, Consumer Products
• Desired personality or work style: analytical thinker, independent and accountable, collaborative, agile and flexible
• Key attributes or values sought in the candidate: integrity, ownership, good communication
Minimum Qualifications:
• 3+ years of experience in data engineering or backend development
• Strong proficiency in Python
• Experience with distributed processing tools like Apache Spark
• Solid SQL skills and experience with both relational and NoSQL databases
• Hands-on with GCP or AWS and relevant cloud APIs
• Familiarity with Git, CI/CD, and testing frameworks
• Strong communication and time management; can operate independently
Role Details:
• Primary responsibilities (daily/weekly): software development, design, coding, debugging
• Key projects or initiatives for the role: migration of Search reporting tools to the cloud (AWS/GCP)
• Success metrics or KPIs for this role: successful migration to the cloud and deprecation of on-prem tools
• How is success measured? Parity of cloud tools with on-prem tools, supporting all reporting use cases with no business impact.