DMV IT Service

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract of more than 6 months, paying $450 per day. Key skills include GCP expertise, SQL proficiency, and data pipeline development. The position is remote and requires 3+ years of relevant experience.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
450
πŸ—“οΈ - Date
October 9, 2025
πŸ•’ - Duration
More than 6 months
🏝️ - Location
Remote
πŸ“„ - Contract
Unknown
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
Freeport, ME
🧠 - Skills detailed
#Scala #Google Cloud Storage #Web Services #Data Engineering #Jira #Computer Science #Debugging #UAT (User Acceptance Testing) #BigQuery #GitHub #GCP (Google Cloud Platform) #Compliance #Cybersecurity #Cloud #Security #SQL (Structured Query Language) #SQL Queries #Data Pipeline #Data Ingestion #ETL (Extract, Transform, Load) #Consulting #Datasets #Storage #Databases #SQL Server #Version Control #Oracle
Role description
Job Title: Data Engineer
Location: Freeport, ME
Employment Type: Contract

About Us:
DMV IT Service LLC, founded in 2020, is a trusted IT consulting firm specializing in IT infrastructure optimization, cybersecurity, networking, and staffing solutions. We partner with clients to achieve their technology goals through expert guidance, workforce support, and innovative solutions. With a client-focused approach, we also provide online training and job placements, ensuring long-term IT success.

Job Purpose:
The Data Engineer will design, develop, and maintain robust data ingestion and pipeline processes on the Google Cloud Platform (GCP). The role focuses on transforming raw data into structured, high-quality datasets that support business analytics, reporting, and decision-making. This position ensures the reliability, scalability, and security of data workflows while continuously optimizing performance and supporting production systems.

Requirements:

Key Responsibilities:
Design, build, and optimize data ingestion and transformation pipelines within GCP (a minimal ingestion sketch appears below this posting).
Translate business requirements into detailed technical designs and specifications.
Develop solutions using a combination of code development and configuration tools.
Perform data validation and troubleshooting through advanced SQL queries (see the example check below this posting).
Conduct testing activities such as unit, system, and user acceptance testing, and document the results.
Manage project deliverables independently or as part of a team, ensuring timely completion.
Participate in code and design reviews, maintaining compliance with security and coding standards.
Provide technical expertise and production support during on-call rotations.
Track project timelines and communicate progress to stakeholders.
Identify and recommend process enhancements that improve data efficiency and reliability.

Required Skills & Experience:
Education: Bachelor's degree in Computer Science, Information Technology, or a related discipline.
Experience: At least 3 years of professional experience in data engineering, data pipeline development, or related fields.

Technical Skills (Must Have):
Proficiency with Google Cloud Platform (GCP) components, including:
Google Pub/Sub
BigQuery
Google Dataform
Google Cloud Storage
Cloud Composer
Strong knowledge of SQL for querying, analysis, and debugging.
Experience with data ingestion into BigQuery.
Familiarity with GitHub for version control.

Preferred Skills:
Experience using Cloud Data Fusion.
Working knowledge of relational databases such as SQL Server, DB2, or Oracle.
Understanding of RESTful API and SOAP web services integration.
Exposure to Jira and Confluence for project collaboration.
Experience connecting to various Google Cloud services and external databases.
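For context on the ingestion work this role describes: the sketch below shows one minimal way to stream JSON messages from Pub/Sub into BigQuery using the Python client libraries. It is illustrative only and not part of the posting; the project, subscription, and table names are hypothetical placeholders.

```python
# Minimal sketch: stream JSON messages from Pub/Sub into BigQuery.
# All resource names below are hypothetical placeholders.
import json

from google.cloud import bigquery, pubsub_v1

PROJECT_ID = "example-project"                      # hypothetical project
SUBSCRIPTION_ID = "raw-events-sub"                  # hypothetical subscription
TABLE_ID = "example-project.analytics.raw_events"   # hypothetical table

bq_client = bigquery.Client(project=PROJECT_ID)
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def handle_message(message: pubsub_v1.subscriber.message.Message) -> None:
    """Decode one Pub/Sub message and stream it into BigQuery."""
    row = json.loads(message.data.decode("utf-8"))
    errors = bq_client.insert_rows_json(TABLE_ID, [row])
    if errors:
        message.nack()  # leave unacknowledged so Pub/Sub redelivers it
    else:
        message.ack()

if __name__ == "__main__":
    future = subscriber.subscribe(subscription_path, callback=handle_message)
    try:
        future.result()  # block and process messages until interrupted
    except KeyboardInterrupt:
        future.cancel()
```

Streaming inserts keep latency low; for heavier transformations, batch loads orchestrated by Cloud Composer or Dataform models (both named in the posting) would be the more typical choice.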
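Likewise, the SQL validation responsibility might look like the following check run through the BigQuery client. Again a sketch only: the project, table, and column names are hypothetical.

```python
# Minimal sketch of a SQL data-validation check in BigQuery.
# Table and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project

VALIDATION_QUERY = """
SELECT
  COUNTIF(event_id IS NULL) AS missing_ids,
  COUNT(*) AS total_rows
FROM `example-project.analytics.raw_events`
WHERE DATE(ingested_at) = CURRENT_DATE()
"""

row = next(iter(client.query(VALIDATION_QUERY).result()))
if row.missing_ids:
    raise ValueError(f"{row.missing_ids} of {row.total_rows} rows missing event_id")
print(f"Validation passed: {row.total_rows} rows checked")
```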