

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract, remote in Michigan, with a pay rate of $60-$70/hr. Requires 7+ years in data engineering, 5+ years with GCP, and expertise in Python, Apache Spark, and ETL. Google Cloud certification preferred.
Country: United States
Currency: $ USD
Day rate: 560
Date discovered: June 11, 2025
Project duration: Unknown
Location type: Remote
Contract type: W2 Contractor
Security clearance: Yes
Location detailed: United States
Skills detailed: #Security #Apache Spark #Scala #Observability #BitBucket #ETL (Extract, Transform, Load) #Automation #Data Quality #Anomaly Detection #Python #POSTMAN #Cloud #Monitoring #IAM (Identity and Access Management) #API (Application Programming Interface) #Batch #GCP (Google Cloud Platform) #Spark (Apache Spark) #VPC (Virtual Private Cloud) #Jenkins #Computer Science #SQL (Structured Query Language) #DevOps #Data Engineering #Compliance #Dataflow #Apache Airflow #BigQuery #GIT #Storage #Data Science #Airflow #Data Governance #PySpark
Role description
Akkodis is seeking a GCP Data Engineer for a contract role with a client located in Michigan (100% remote). We're ideally looking for applicants with solid experience in GCP, Python, APIs, and Spark; ETL experience is a big plus.
Pay Range: $60-$70/hr. on W2. Pay may be negotiable based on experience, education, geographic location, and other factors.
Key Responsibilities
Data Engineering & Development
• Design, build, and optimize scalable ELT/ETL pipelines to process structured and unstructured data across batch and streaming systems.
• Architect and deploy cloud-native data workflows using GCP services including BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, Dataflow, and Cloud Composer.
• Build high-throughput Apache Spark workloads in Python and SQL, with performance tuning for scale and cost.
• Develop parameterized DAGs in Apache Airflow with retry logic, alerting, SLA/SLO enforcement, and robust monitoring (a minimal sketch follows this list).
• Build reusable frameworks for high-volume API ingestion, transforming Postman collections into production-ready Python modules.
• Translate business and product requirements into scalable, efficient data systems that are reliable and secure.
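To give a flavor of the Airflow work described above, here is a minimal sketch of a parameterized DAG with retry logic, per-task SLA enforcement, and failure alerting. All names (DAG id, task callables, alert address, params) are hypothetical illustrations, not taken from the posting:

```python
# Hypothetical sketch: parameterized Airflow DAG with retries, SLAs,
# and email alerting, illustrating the responsibilities listed above.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

default_args = {
    "owner": "data-eng",                   # hypothetical owner
    "retries": 3,                          # retry logic on task failure
    "retry_delay": timedelta(minutes=5),
    "email": ["data-alerts@example.com"],  # hypothetical alert address
    "email_on_failure": True,
    "sla": timedelta(hours=1),             # per-task SLA enforcement
}

def extract(**context):
    """Pull a batch of records from a source API (stubbed here)."""
    print("extracting for", context["ds"])

def load_to_bigquery(**context):
    """Load the extracted batch into BigQuery (stubbed here)."""
    print("loading for", context["ds"])

with DAG(
    dag_id="example_ingestion_pipeline",   # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
    params={"source": "orders"},           # parameterized via DAG params
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(
        task_id="load_to_bigquery", python_callable=load_to_bigquery
    )
    extract_task >> load_task
```

In practice, the stubbed callables would be replaced by the API-ingestion and BigQuery-load modules the role describes, and alerting would typically route to a paging or chat integration rather than plain email.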
Cloud Infrastructure & Security
• Implement IAM and VPC-based security to manage and deploy GCP infrastructure for secure data operations.
• Ensure robustness, scalability, and cost-efficiency of all infrastructure, following FinOps best practices.
• Apply automation through CI/CD pipelines using tools like Git, Jenkins, or Bitbucket.
Data Quality, Governance & Optimization
• Design and implement data quality frameworks covering monitoring, validation, and anomaly detection (see the illustrative check after this list).
• Build observability dashboards to ensure pipeline health and proactively address issues.
• Ensure compliance with data governance policies, privacy regulations, and security standards.
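As a rough illustration of the anomaly-detection side of such a framework, here is a minimal sketch using a basic z-score rule on daily row counts. The function name, threshold, and sample numbers are all hypothetical:

```python
# Hypothetical data quality check: flag a daily row count as anomalous
# if it deviates more than 3 standard deviations from recent history.
from statistics import mean, stdev

def is_anomalous(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Return True if today's count is a statistical outlier vs. history."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu  # flat history: any change is anomalous
    return abs(today - mu) / sigma > z_threshold

# Example: last 7 days of row counts for a pipeline, then today's count.
recent_counts = [10_120, 9_980, 10_340, 10_050, 10_210, 9_890, 10_400]
print(is_anomalous(recent_counts, 4_200))  # True: likely a partial load
```

A production framework would persist these metrics, expose them on the observability dashboards mentioned above, and trigger the same alerting path as pipeline failures.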
Collaboration & Project Delivery
• Work closely with cross-functional stakeholders including data scientists, analysts, DevOps, product managers, and business teams.
• Effectively communicate technical solutions to non-technical stakeholders.
• Manage multiple concurrent projects, shifting priorities quickly and delivering under tight timelines.
• Collaborate within a globally distributed team, with real-time engagement through 2 p.m. U.S. Eastern Time.
Qualifications & Certifications
• Education: Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
• Experience: 7+ years in data engineering, with 5+ years of hands-on experience on GCP.
• Proven track record with tools and services like BigQuery, Cloud Composer (Apache Airflow), Cloud Functions, Pub/Sub, Cloud Storage, Dataflow, and IAM/VPC.
• Demonstrated expertise in Apache Spark (batch and streaming), PySpark, and building scalable API integrations (a brief sketch follows this list).
• Advanced Airflow skills including custom operators, dynamic DAGs, and workflow performance tuning.
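For candidates gauging the expected Spark level, the following is a minimal sketch of a PySpark batch job of the kind implied above: read raw events, aggregate daily, and write partitioned output. The bucket paths and column names are hypothetical:

```python
# Hypothetical PySpark batch job: read raw events, aggregate per day,
# and write date-partitioned output. Paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_event_rollup").getOrCreate()

# Hypothetical raw-event location in Cloud Storage.
events = spark.read.parquet("gs://example-bucket/raw/events/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# Partition by date so downstream loads can prune partitions efficiently.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "gs://example-bucket/curated/daily_event_rollup/"
)
spark.stop()
```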
Certifications
• Google Cloud Professional Data Engineer certification preferred.
If you are interested in this role, please click APPLY NOW. For other opportunities available at Akkodis, or any questions, feel free to contact me at govind.choudhary@akkodisgroup.com.
Equal Opportunity Employer/Veterans/Disabled
Benefits offerings include but are not limited to:
• 401(k) with match
• Medical insurance
• Dental insurance
• Vision assistance
• Paid holidays
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy.
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
• The California Fair Chance Act
• Los Angeles City Fair Chance Ordinance
• Los Angeles County Fair Chance Ordinance for Employers
• San Francisco Fair Chance Ordinance