

Net2Source Inc.
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a 12+ month contract, remote (EST), offering competitive pay. Key skills include database management, cloud databases, and machine learning. Must have a Bachelor's in Computer Science and 5 years of experience with relational databases.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 6, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Cloud #Consulting #Disaster Recovery #Monitoring #Leadership #Databases #Computer Science #Data Warehouse #Java #Database Design #Logging #Deployment #SQL Server #ETL (Extract, Transform, Load) #Database Performance #Storage #Data Pipeline #Database Administration #Oracle #MySQL #RDBMS (Relational Database Management System) #Database Management #GCP (Google Cloud Platform) #Golang #Clustering #ML (Machine Learning) #PostgreSQL #Security #Dataflow #Data Engineering #Python #SQL (Structured Query Language)
Role description
Role name: Data Engineer
Location: Remote (EST)
Duration: 12+ Months
Job Description:
Responsibilities:
• Function as the lead Google data team point of contact supporting the NOTAM data platform.
• Collaborate closely with data producers and data consumers to understand data needs, provide consultation, and align data solutions.
• Lead database administration best practices, including backup and recovery, performance tuning, scaling, data archival, and database design, and provide implementation support.
• Create and deliver best practices, recommendations, sample code, and technical presentations, adapting to different levels of key business and technical stakeholders.
• Analyze on-premises and cloud database environments, consulting on the optimal design for performance and deployment on Google Cloud Platform. Support the design, development, and maintenance of RDBMS, data warehouse, and data pipeline solutions.
Preferred Skillset Requirement:
• Experience with database management tools for backups, recovery, snapshot management, sharding, partitioning, and database performance tuning.
• Experience working with cloud databases such as AlloyDB, Cloud SQL, and BigQuery.
• Experience with MLOps, data warehousing, and data pipeline development, including ETL and ELT, Dataflow, and Cloud Functions.
• Experience with application development.
• Experience in database administration techniques, including storage, clustering, availability, disaster recovery, security, logging, performance tuning, monitoring, and auditing.
• Experience developing, deploying, and managing machine learning models, including writing software in one or more languages such as Java, Python, or Go.
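As a rough sketch of the ETL and data pipeline work these bullets describe, a minimal extract-transform-load step in Python might look like the following. This is illustrative only: sqlite3 stands in for a managed cloud database such as Cloud SQL, and all table and column names are hypothetical.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Hypothetical ETL step: extract raw rows, normalize them, load curated rows."""
    cur = conn.cursor()
    # Extract: read raw event rows from the (hypothetical) source table.
    cur.execute("SELECT id, payload FROM raw_events")
    rows = cur.fetchall()
    # Transform: trim whitespace and lowercase the payload text.
    cleaned = [(rid, payload.strip().lower()) for rid, payload in rows]
    # Load: write the cleaned rows into the (hypothetical) curated table.
    cur.executemany(
        "INSERT INTO curated_events (id, payload) VALUES (?, ?)", cleaned
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    # In-memory database as a stand-in for a real cloud instance.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_events (id INTEGER, payload TEXT)")
    conn.execute("CREATE TABLE curated_events (id INTEGER, payload TEXT)")
    conn.executemany(
        "INSERT INTO raw_events VALUES (?, ?)", [(1, "  Login "), (2, " Click ")]
    )
    print(run_etl(conn))  # prints 2 (rows loaded)
```

In practice, a role like this would implement the same extract-transform-load pattern against managed services (for example, Dataflow jobs feeding BigQuery), with the transform logic and schemas driven by the stakeholders' data needs.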
Must-Have Qualifications:
• US Citizen
• Bachelor's degree in Computer Science or equivalent practical experience.
• 5 years of experience with relational database technologies such as PostgreSQL, MySQL, SQL Server, or Oracle.
• Experience working with business stakeholders to understand requirements, provide technical leadership, and educate teams on GCP best practices.
Regards
Divyansh