

BigQuery Data Engineer | W2 | 10+ Years | GCP
Featured Role | Apply direct with Data Freelance Hub
This role is for a BigQuery Data Engineer with 10+ years of experience, focusing on GCP and cloud migration. It offers a remote contract with a competitive pay rate. Key skills include Python, SQL, BigQuery, and data pipeline development.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 26, 2025
Project duration: Unknown
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Delta Lake #Cloud #Dremio #Data Pipeline #dbt (data build tool) #Scala #Logging #GIT #Databases #SQL (Structured Query Language) #Monitoring #Snowflake #Spark (Apache Spark) #Datasets #Java #Apache Spark #Apache Iceberg #Version Control #Airflow #AWS (Amazon Web Services) #Data Processing #Data Science #BigQuery #Migration #GCP (Google Cloud Platform) #NoSQL #Kafka (Apache Kafka) #Python #ETL (Extract, Transform, Load) #Data Engineering
Role description
Job Description:
Job Title: Senior Data Engineer
Location: Remote
Description
We are seeking a hands-on Data Engineer to join our team and contribute to critical backend components that are being migrated to a modern cloud environment (AWS or GCP). This role involves close collaboration with the architecture team to design and develop scalable, high-performance data pipelines and backend services to support advanced analytics and data science initiatives.
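As a hedged illustration of the ELT pipeline work this role describes, here is a minimal Python sketch that loads a file into BigQuery and materializes a cleaned table with the google-cloud-bigquery client. The project, dataset, table, and bucket names are hypothetical placeholders, not details from this posting.

from google.cloud import bigquery

# Hypothetical identifiers, for illustration only.
PROJECT = "my-analytics-project"
RAW_TABLE = f"{PROJECT}.staging.events_raw"
CLEAN_TABLE = f"{PROJECT}.analytics.events_clean"

client = bigquery.Client(project=PROJECT)

# Extract/Load: ingest a CSV export from Cloud Storage into a staging table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/events.csv",  # hypothetical bucket/path
    RAW_TABLE,
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    ),
)
load_job.result()  # block until the load finishes

# Transform: run SQL inside BigQuery to refresh the cleaned table (the "T" in ELT).
query_job = client.query(
    f"""
    CREATE OR REPLACE TABLE `{CLEAN_TABLE}` AS
    SELECT user_id, event_type, TIMESTAMP(event_ts) AS event_ts
    FROM `{RAW_TABLE}`
    WHERE user_id IS NOT NULL
    """
)
query_job.result()
print(f"Loaded {load_job.output_rows} rows; clean table refreshed.")

Keeping the transform in SQL that runs inside BigQuery, rather than pulling data out to transform it, is the usual reason postings like this say ELT rather than ETL.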
Key Responsibilities:
• Support cloud migration of backend and data components (AWS/GCP)
• Collaborate with architects to design robust systems and integrations
• Design, develop, and maintain data pipelines and ELT workflows
• Implement solutions enabling Data Scientists to access, transform, and analyze data
• Write clean, efficient code in Python, with occasional Java involvement
• Work with SQL/NoSQL databases and build scalable data models
• Engage with CI/CD tools, monitoring systems, and infrastructure
• Ensure reliability and performance in production environments
• Troubleshoot and debug data pipeline and infrastructure issues
• Translate business needs into reliable datasets and models
• Follow best practices in modern data engineering and cloud development
Must-Have Qualifications:
• 10+ years of experience in data engineering or backend development
• Strong hands-on coding experience in Python
• Experience with cloud platforms: GCP preferred, AWS acceptable
• Hands-on experience with BigQuery
• Strong SQL skills; experience with NoSQL databases
• Familiarity with distributed data processing tools such as Apache Spark (see the sketch after this list)
• Experience integrating with backend/data services
• Solid understanding of CI/CD, version control (Git), and testing practices
• Strong analytical, communication, and time management skills
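To make the Spark familiarity above concrete, here is a minimal PySpark sketch of a distributed aggregation. The input path, output path, and column names are hypothetical and only illustrate the shape of such a job.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-daily-rollup").getOrCreate()

# Hypothetical input location; Parquet reads are distributed across the cluster.
events = spark.read.parquet("gs://my-bucket/events/")

# Distributed aggregation: daily event counts per event type.
daily_counts = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .count()
)

daily_counts.write.mode("overwrite").parquet("gs://my-bucket/rollups/daily/")
spark.stop()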
Nice-to-Have Skills:
• Experience with dbt (Data Build Tool) or BigQuery Dataform
• Familiarity with Java (basic to intermediate level)
• Experience with Snowflake, Dremio, or similar data platforms
• Knowledge of Apache Iceberg, Delta Lake, or other open table formats
• Familiarity with orchestration tools such as Airflow, Prefect, or Dagster (a minimal sketch follows this list)
• Exposure to real-time data tools (e.g., Kafka, Spark Streaming)
• Experience with monitoring and logging tools in cloud environments
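As a hedged sketch of the orchestration familiarity mentioned above, here is a minimal Airflow DAG (2.4+ syntax) with two dependent tasks. The DAG id, schedule, and task bodies are hypothetical placeholders rather than anything specified in this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical task bodies; a real pipeline would call load/transform logic here.
def extract_and_load():
    print("load raw files into the staging table")

def transform():
    print("run the ELT transform into the analytics table")

with DAG(
    dag_id="events_daily_elt",       # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_task = PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    load_task >> transform_task  # transform runs only after the load succeeds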