

Data Engineer, Mid-Level
Featured Role | Apply direct with Data Freelance Hub
This role is for a Mid-Level Data Engineer to design data-centric solutions on Google Cloud Platform for a Materials Management Platform. Contract length is "unknown," with a pay rate of "$XX/hour." Key skills include GCP, ETL, Java, and Python. A bachelor's degree in a related field is required.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 6, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Detroit Metropolitan Area
Skills detailed
#Dataflow #SQL (Structured Query Language) #Databases #Deployment #Security #Monitoring #Cloud #ETL (Extract, Transform, Load) #Datasets #Computer Science #Data Processing #Storage #AI (Artificial Intelligence) #Data Engineering #Programming #Python #Logging #Version Control #NoSQL #Google Cloud Storage #Java #SonarQube #GCP (Google Cloud Platform) #Data Governance #Documentation
Role description
Verified Job On Employer Career Site
Job Summary:
Apex Systems is a world-class IT services company serving thousands of clients across the globe. It is seeking a Data Engineer to design and deploy data-centric architecture on Google Cloud Platform for its Materials Management Platform, part of a larger IT transformation effort. The role involves building ETL pipelines, optimizing data workflows, and collaborating with stakeholders to ensure data requirements align with business objectives.
Responsibilities:
• Design and implement data-centric solutions on Google Cloud Platform (GCP) using tools such as BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, and the GCP APIs.
• Build ETL pipelines to ingest data from heterogeneous sources into the system (a minimal sketch of such a pipeline follows this list).
• Develop data processing pipelines in languages such as Java and Python to extract, transform, and load (ETL) data.
• Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets.
• Deploy and manage SQL and NoSQL databases, such as Cloud SQL, Bigtable, or Firestore, based on project requirements.
• Optimize data workflows for performance, reliability, and cost-effectiveness on GCP infrastructure.
• Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
• Use GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures.
• Troubleshoot and resolve issues related to data processing, storage, and retrieval.
• Promptly address code quality issues flagged by SonarQube, Checkmarx, FOSSA, and Cycode throughout the development lifecycle.
• Implement security measures and data governance policies to ensure the integrity and confidentiality of data.
• Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
• Develop and maintain documentation for data engineering processes, supporting knowledge transfer and ease of system maintenance.
• Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
• Mentor and guide junior team members, fostering a collaborative, knowledge-sharing environment.
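As a rough illustration of the ETL work described above, here is a minimal sketch of such a pipeline, assuming Apache Beam in Python on the Dataflow runner (Dataflow is one of the GCP tools the posting lists). The project, bucket, and table names, the schema, and the parse_line helper are placeholders for illustration, not part of the posting.

```python
# Minimal sketch: read CSV files from Google Cloud Storage, transform each
# record, and load the rows into BigQuery. All names below are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    """Map one 'material_id,quantity' CSV line to a BigQuery row dict."""
    material_id, quantity = line.split(",")
    return {"material_id": material_id, "quantity": int(quantity)}


options = PipelineOptions(
    runner="DataflowRunner",          # "DirectRunner" works for local testing
    project="example-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText(
            "gs://example-bucket/materials/*.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "Load" >> beam.io.WriteToBigQuery(
            "example-project:materials.inventory",
            schema="material_id:STRING,quantity:INTEGER",
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same shape generalizes to the other sources the role mentions: swap the ReadFromText step for a Pub/Sub or Cloud SQL source and adjust the parse step to match.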
Qualifications:
Required:
• Requires a bachelor's or foreign equivalent degree in computer science, information technology, or a technology-related field.
Company:
Apex Systems, a division of On Assignment, provides organizations with IT staffing solutions to address gaps in their current workforce. Founded in 1995 and headquartered in Richmond, Virginia, USA, the company employs 1,001-5,000 people and is currently late stage. Apex Systems has a track record of offering H-1B sponsorship.