

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Bernards, NJ, offering a contract position. Pay rate is unspecified. Requires 8+ years of data engineering experience, strong SQL and Python skills, and expertise in Google Cloud technologies, especially BigQuery and Airflow.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 23, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Bernardsville, NJ
Skills detailed
#Agile #UAT (User Acceptance Testing) #Data Architecture #GitHub #Teradata #Documentation #Dataflow #Version Control #Data Mart #Scripting #BigQuery #Data Engineering #Airflow #Data Science #Migration #Deployment #Scrum #Cloud #BTEQ #Python #Data Pipeline #Data Modeling #Bash #ETL (Extract, Transform, Load) #Data Governance #Data Quality #Data Warehouse #SQL (Structured Query Language) #Programming #GCP (Google Cloud Platform)
Role description
Hi,
Please find the job description below and let me know the best time to connect with you. You can call or text me directly at 469-217-5559, and share your resume at Singh.ankita@net2source.com.
Legal name:
Location (Country, Street name (including apt/house # if applicable), city, state, and zip code):
Relocate?
Rate:
Availability:
Phone #:
Mobile #:
Skype ID:
Email address:
Full visa expiration date:
Hiring Status:
Role: GCP Data Engineer
Work location: Bernards, NJ
Contract
JOB DESCRIPTION
Responsibilities
• Contribute to the migration of a legacy data warehouse to a Google Cloud-based data warehouse for a major telecom.
• Collaborate with Data Product Managers and Data Architects to design, implement, and deliver successful data solutions
• Help architect data pipelines for the underlying data warehouse and data marts
• Design and develop very complex ETL pipelines in Google Cloud data environments (a minimal pipeline sketch follows this list)
• Our legacy tech stack includes Teradata; the new stack centers on GCP cloud data technologies such as BigQuery and Airflow, with SQL and Python as the primary languages
• Maintain detailed documentation of your work and changes to support data quality and data governance
• Support QA and UAT data testing activities
• Support deployment activities to higher environments
• Ensure high operational efficiency and quality of your solutions to meet SLAs and support our commitment to customers (Data Science and Data Analytics teams)
• Be an active participant in, and advocate of, agile/Scrum practice to ensure team health and process improvements
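To make the responsibilities concrete, here is a minimal, hypothetical sketch of the kind of pipeline this migration implies: an Airflow DAG that loads a nightly Teradata extract (already landed in GCS) into a BigQuery staging table, then runs a SQL transformation into a data mart. All DAG, bucket, dataset, and table names are illustrative placeholders, not details from this posting.

```python
# Hypothetical sketch only: names like "legacy_stage.orders" are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="teradata_to_bq_orders",  # placeholder DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Stage the nightly extract from GCS into a BigQuery staging table.
    load_stage = GCSToBigQueryOperator(
        task_id="load_stage",
        bucket="legacy-extracts",  # placeholder bucket
        source_objects=["orders/{{ ds }}/*.csv"],
        destination_project_dataset_table="legacy_stage.orders",
        write_disposition="WRITE_TRUNCATE",
        skip_leading_rows=1,
    )

    # Transform staged rows into the reporting data mart with plain SQL.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": """
                    MERGE `mart.orders` t
                    USING `legacy_stage.orders` s
                    ON t.order_id = s.order_id
                    WHEN MATCHED THEN UPDATE SET status = s.status
                    WHEN NOT MATCHED THEN INSERT ROW
                """,
                "useLegacySql": False,
            }
        },
    )

    load_stage >> transform
```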
Basic Qualifications
• 8+ years of data engineering experience developing large data pipelines in very complex environments
• Very strong SQL skills and the ability to build very complex transformation data pipelines using a custom ETL framework in the Google BigQuery environment
• Exposure to Teradata and the ability to understand complex Teradata BTEQ scripts
• Strong Python programming skills
• Strong skills in building Airflow jobs and debugging issues
• Ability to optimize queries in BigQuery (see the dry-run sketch after this list)
• Hands-on experience with Google Cloud data technologies (GCS, BigQuery, Dataflow, Pub/Sub, Data Fusion, Cloud Functions)
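As one illustration of BigQuery query tuning, the sketch below uses the google-cloud-bigquery client's dry-run mode to check how many bytes a query would scan before it runs, a common first step when controlling cost and verifying partition pruning. The project, table, and column names are placeholders.

```python
# Hypothetical sketch only: project and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-telecom-project")  # placeholder project

# dry_run=True estimates the scan without executing or billing the query.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job_config.query_parameters = [
    bigquery.ScalarQueryParameter("run_date", "DATE", "2025-07-23"),
]

query = """
    SELECT order_id, status
    FROM `mart.orders`
    WHERE order_date = @run_date  -- filter on the partition column to prune
"""

job = client.query(query, job_config=job_config)
print(f"Query would process {job.total_bytes_processed / 1e9:.2f} GB")
```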
Preferred Qualifications
• Experience with the cloud data warehouse technology BigQuery
• Nice to have: experience with GCP cloud technologies (GCS, Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions)
• Nice to have: exposure to Teradata
• Solid experience with job orchestration tools like Airflow and the ability to build complex jobs
• Experience writing and maintaining large data pipelines using a custom ETL framework
• Ability to automate jobs using Python (see the sketch after this list)
• Familiarity with data modeling techniques and data warehousing standard methodologies and practices
• Very good experience with a code version control repository such as GitHub
• Good scripting skills, including Bash and Python
• Familiarity with Scrum and Agile methodologies
• Problem solver with strong attention to detail and excellent analytical and communication skills
• Ability to work in an onsite/offshore model and lead a team
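As one illustration of automating jobs with Python, the sketch below triggers a run of the hypothetical DAG from the earlier sketch through Airflow's stable REST API. The host, credentials, and configuration values are placeholders; a real script would read them from configuration or a secrets store.

```python
# Hypothetical sketch only: host, credentials, and DAG id are placeholders.
import requests

AIRFLOW_URL = "https://airflow.example.com/api/v1"  # placeholder host
DAG_ID = "teradata_to_bq_orders"  # placeholder DAG from the earlier sketch

# POST /dags/{dag_id}/dagRuns queues a new run with an optional conf payload.
resp = requests.post(
    f"{AIRFLOW_URL}/dags/{DAG_ID}/dagRuns",
    json={"conf": {"run_date": "2025-07-23"}},
    auth=("svc_user", "svc_password"),  # placeholder basic-auth credentials
    timeout=30,
)
resp.raise_for_status()
print("Triggered run:", resp.json()["dag_run_id"])
```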
Skills: Google Cloud Platform, cloud computing, cloud providers, information technology, technology
Best Regards,
Ankita | Talent Acquisition Specialist
N2S Global Workforce Solutions Ltd.
Global HQ Address: 270 Davidson Ave, Suite 704, Somerset, NJ 08873, USA