

BeaconFire Inc.
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in California, offering market-rate pay. Candidates must have 5-6 years of experience with SQL and Python and expertise in building ETL pipelines; experience in legal or privacy work is a plus. The position is hybrid.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
California, United States
-
🧠 - Skills detailed
#Airflow #Python #"Data Engineering" #Monitoring #Security #"ETL (Extract, Transform, Load)" #"SQL (Structured Query Language)"
Role description
Title: Data Engineer
Location: California
Rate: Market Rate
Candidate Requirements
Must-Have Skills
1. SQL
2. Python
3. Experience building ETL pipelines
Nice-to-have Skills
1. Volunteer work or an internship with a law firm, or experience working in privacy
2. Experience remediating data loss
Job Description:
• We are looking for an experienced Data Engineer to support our infrastructure work.
• This role is primarily responsible for building and modifying high-complexity pipelines and alerting systems for our legal team.
• We're looking for someone senior, with at least 5-6 years of experience in SQL and Python-based pipeline development (Dataswarm, Airflow, etc.); a minimal illustrative sketch of this kind of pipeline follows this list.
• Beyond years of experience, due to the technical complexity of the role, we'd ideally like to find someone who has worked on data warehousing and has built, and led end-to-end projects to build, high-complexity monitoring and alerting systems.
• We're a very cross-functional (XFN) team, so someone who is communicative, collaborative, comfortable navigating ambiguity, tenacious in finding answers, and cognizant of privacy and security would do very well in this role.
• Strong preference for someone based on the West Coast, and a slight preference for someone who could occasionally come into the SF office, though that's not required.
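To make the must-have skills concrete, the sketch below shows the general shape of a Python/Airflow ETL pipeline with a failure-alerting hook, roughly the kind of work the bullets above describe. It is not part of the original posting: the DAG name, task names, schedule, and alert behavior are hypothetical placeholders, and it assumes a recent Airflow 2.x install.

```python
# Illustrative sketch only -- not from the posting. Shows a minimal
# extract -> transform -> load DAG with an on-failure alerting hook.
# All identifiers (dag_id, task names, notify_on_failure) are hypothetical.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def notify_on_failure(context):
    """Alerting hook: Airflow calls this when a task in the DAG fails.
    A real pipeline might page on-call or post to a chat channel here."""
    task_id = context["task_instance"].task_id
    log.error("ETL task %s failed for run date %s", task_id, context["ds"])


def extract(**context):
    # Placeholder: pull source rows (e.g. via a SQL query) for the run date.
    return [{"id": 1, "value": 42}]


def transform(**context):
    # Placeholder: apply business logic to the extracted rows.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "value": row["value"] * 2} for row in rows]


def load(**context):
    # Placeholder: write transformed rows to the warehouse target table.
    rows = context["ti"].xcom_pull(task_ids="transform")
    log.info("Loaded %d rows", len(rows))


with DAG(
    dag_id="legal_etl_example",          # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,  # alert on task failure
    },
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```

In a real pipeline the placeholder functions would run SQL against the warehouse, and notify_on_failure would page on-call or post to a monitoring channel rather than only logging.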





