

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a "Senior Data Engineer" on a 3-6+ month contract, remote (California PST). Key skills include Python, GCP (BigQuery, Dataflow), SQL, and experience with data pipelines. A Bachelor's degree in Computer Science or related field is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 7, 2025
Project duration: 3 to 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: California, United States
Skills detailed: #Storage #AWS (Amazon Web Services) #Code Reviews #BigQuery #SQL (Structured Query Language) #Programming #Security #JavaScript #Azure #Documentation #GCP (Google Cloud Platform) #Scala #Data Engineering #.Net #Python #Dataflow #Informatica #Leadership #Java #ETL (Extract, Transform, Load) #Microservices #Computer Science #Business Analysis #Data Pipeline #Cloud
Role description
Position Title: Data Engineering Expert
Duration: 3-6+ month contract with possible extension
Work Location: California - PST Shift (Remote)
This is a remote role with limited travel to San Francisco, CA.
Feedback from the Hiring Manager on must-haves:
• Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
• Strong background in software development.
• In-depth knowledge of software architecture principles, design patterns, and best practices.
  o Experience with GCP services such as BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
• Experience with data pipeline development (Informatica background a plus).
• Experience with SQL and BigQuery (see the Python query sketch after this list).
• Must have experience developing and optimizing code, and be able to provide code reviews and ensure all QA checks are completed.
• Proficiency in multiple programming languages and frameworks, such as Java, .NET, Python, or JavaScript.
• Experience with cloud platforms (Google Cloud) and microservices architecture.
• Experience with Ascend ETL.
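For illustration only, a minimal sketch of the kind of hands-on Python-plus-BigQuery work these requirements describe, using the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical placeholders rather than details from this role.

```python
# Hypothetical example: run an aggregation query against BigQuery from Python.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project ID

query = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `example-project.analytics.events`  -- hypothetical dataset/table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# Submit the query job and wait for the results.
for row in client.query(query).result():
    print(row.event_date, row.event_count)
```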
Updates from client:
Why the Hiring Manager did not like previous candidates:
1. Candidates were far too senior and, in most cases, focused on high-level architecture/leadership roles.
2. Not the right mix of technology… I need strong Data Engineers with Python and GCP skills.
What he is looking for:
1. Strong data engineering background, preferably with Ascend.io, but must be very strong with Python.
2. Strong GCP skills… BigQuery, Cloud Run, Composer, etc. (see the Composer DAG sketch after this list).
3. Needs to be a true engineer who can get their hands dirty in the code; this person needs to be very hands-on at the keyboard.
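As a rough illustration of the Composer side of that stack, here is a minimal sketch of an Airflow DAG that schedules a daily BigQuery job; the DAG ID, query, and operator configuration are assumptions made for the example, not part of the job description.

```python
# Hypothetical Cloud Composer (Airflow) DAG scheduling a daily BigQuery query.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_events_rollup",   # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    rollup_events = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                # Hypothetical rollup query; a real pipeline would parameterize this.
                "query": "SELECT event_date, COUNT(*) AS event_count "
                         "FROM `example-project.analytics.events` "
                         "GROUP BY event_date",
                "useLegacySql": False,
            }
        },
    )
```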
Job Description:
Responsibilities:
• Collaborate with clients, business analysts, and development teams to understand business requirements and translate them into technical solutions.
• Design and develop end-to-end architecture for complex software systems, ensuring scalability, reliability, and security.
• Evaluate and select appropriate technologies, frameworks, and tools to meet project requirements.
• Create architectural blueprints, diagrams, and documentation to communicate the solution design to stakeholders.
• Assist development teams throughout the software development lifecycle.
• Conduct regular code reviews and provide technical guidance to ensure adherence to architectural standards.
• Identify and mitigate technical risks and issues, ensuring timely resolution.
• Stay up-to-date with industry trends and emerging technologies, providing recommendations for continuous improvement.
• Act with integrity, professionalism, and personal responsibility to uphold the firm's respectful and courteous work environment.
Qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
• In-depth knowledge of software architecture principles, design patterns, and best practices.
• Experience with GCP services such as BigQuery, Cloud Storage, Dataflow, and Pub/Sub (see the pipeline sketch after this list).
• Experience with data pipeline development (Informatica background a plus).
• Experience with SQL and BigQuery.
• Must have experience developing and optimizing code, and be able to provide code reviews and ensure all QA checks are completed.
• Proficiency in multiple programming languages and frameworks, such as Java, .NET, Python, or JavaScript.
• Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and microservices architecture.
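As a final illustration of the Dataflow and Pub/Sub experience called for above, below is a minimal Apache Beam sketch of a streaming pipeline that reads events from Pub/Sub and writes them to BigQuery; the topic, table, and schema are hypothetical placeholders, not details from this posting.

```python
# Hypothetical streaming Dataflow pipeline: Pub/Sub -> parse JSON -> BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# In practice this would be launched with --runner=DataflowRunner plus project/region flags.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadEvents" >> beam.io.ReadFromPubSub(
            topic="projects/example-project/topics/events"  # hypothetical topic
        )
        | "ParseJson" >> beam.Map(lambda message: json.loads(message.decode("utf-8")))
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",  # hypothetical table
            schema="user_id:STRING,event_type:STRING,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```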