

Lorven Technologies Inc.
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Hanover, NH, on a contract basis. Key skills include ETL/ELT pipeline development, cloud platforms (AWS/Azure/GCP), and big data technologies (Spark, Snowflake). Requires experience in data architecture, compliance, and data quality management.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 14, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Hanover, NH
🧠 - Skills detailed
#Version Control #Kafka (Apache Kafka) #BigQuery #Redshift #Data Pipeline #Batch #ETL (Extract, Transform, Load) #Data Analysis #Data Warehouse #Data Lineage #Snowflake #Scala #GCP (Google Cloud Platform) #Hadoop #Databricks #Data Mart #Metadata #Data Storage #Documentation #Compliance #Data Science #Synapse #Data Architecture #AWS (Amazon Web Services) #Data Processing #Data Ingestion #Data Privacy #Big Data #Spark (Apache Spark) #Data Engineering #Security #Storage #Azure #Cloud #Monitoring #Deployment #S3 (Amazon Simple Storage Service) #Data Quality
Role description
Position: Data Engineer
Location: Hanover, NH
Contract
Job description:
Data Engineering & Pipelines
• Design, develop, and maintain scalable ETL/ELT pipelines for batch and real-time data processing.
• Build and optimize data ingestion workflows from multiple structured and unstructured data sources.
• Ensure data is cleansed, validated, and transformed according to business rules, as in the sketch following this list.
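A minimal PySpark sketch of the batch side of such a pipeline, assuming a hypothetical orders dataset with order_id, amount, and created_at columns; the S3 paths and validation rules are placeholders for real business rules:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: ingest raw data (bucket path and schema are hypothetical).
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Cleanse and validate: drop records that fail basic business rules.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull())
       .filter(F.col("amount") > 0)
)

# Transform and load: derive a partition column, write to a curated zone.
enriched = clean.withColumn("order_date", F.to_date("created_at"))
enriched.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)
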
Data Architecture & Modeling
• Build robust data models, data warehouse layers (staging, integration, semantic), and data marts; one possible layering is sketched after this list.
• Work with architects to define data standards, metadata, lineage, and governance frameworks.
• Optimize data storage and compute performance across cloud and on-prem platforms.
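One way the staging / integration / semantic layering might be expressed in Spark SQL; the table and view names are illustrative assumptions, not a prescribed model:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("warehouse_layers").getOrCreate()

# Staging layer: curated files registered as-is (names are hypothetical).
spark.read.parquet("s3://example-bucket/curated/orders/") \
    .createOrReplaceTempView("stg_orders")

# Integration layer: conformed, deduplicated entities.
spark.sql("""
    CREATE OR REPLACE TEMP VIEW int_orders AS
    SELECT DISTINCT order_id, customer_id, amount, order_date
    FROM stg_orders
""")

# Semantic layer / data mart: a business-facing aggregate.
spark.sql("""
    CREATE OR REPLACE TEMP VIEW mart_daily_revenue AS
    SELECT order_date, SUM(amount) AS revenue
    FROM int_orders
    GROUP BY order_date
""")
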
Cloud & Big Data Technologies
• Develop solutions using cloud platforms such as AWS, Azure, or GCP.
• Work with big data technologies such as Spark, Databricks, Snowflake, Hadoop, Kafka, and Flink; a streaming example follows this list.
• Manage cloud data services such as S3, Redshift, Glue, BigQuery, Synapse, or similar technologies.
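For the real-time side, a minimal Spark Structured Streaming read from Kafka might look like this; the broker address, topic, and storage paths are placeholders, and the spark-sql-kafka connector is assumed to be on the classpath:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_stream").getOrCreate()

# Ingest from Kafka (broker and topic are hypothetical).
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "orders-events")
         .load()
)

# Kafka delivers key/value as binary; decode before downstream use.
decoded = events.select(F.col("value").cast("string").alias("payload"))

# Sink to cloud storage with checkpointing so the stream can recover.
query = (
    decoded.writeStream.format("parquet")
           .option("path", "s3://example-bucket/stream/orders/")
           .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")
           .start()
)
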
Collaboration & Delivery
• Partner closely with data analysts, data scientists, product managers, and engineering teams.
• Translate business requirements into technical specification documents.
• Support deployment, monitoring, and troubleshooting of data pipelines (see the logging sketch below).
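A small pattern that supports the monitoring and troubleshooting duty: wrap each pipeline step so failures are logged with context. The step names and callables are hypothetical:

import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_step(name, fn):
    # Run one pipeline step, logging outcomes for monitoring and triage.
    log.info("starting step %s", name)
    try:
        result = fn()
        log.info("step %s succeeded", name)
        return result
    except Exception:
        log.exception("step %s failed", name)
        raise

# Usage (extract_orders / transform_orders are hypothetical callables):
# run_step("extract_orders", extract_orders)
# run_step("transform_orders", transform_orders)
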
Quality, Security & Compliance
• Implement data quality checks, audits, and monitoring frameworks, as in the sketch following this list.
• Ensure compliance with data privacy, security, and regulatory standards.
• Maintain proper documentation, version control, and configuration management.
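A minimal sketch of the kind of data quality check described above, assuming a PySpark DataFrame keyed on order_id; the rules and zero-tolerance thresholds are illustrative:

from pyspark.sql import functions as F

def audit_quality(df):
    # Fail fast on null keys and duplicates; metrics feed monitoring.
    # The column name (order_id) and thresholds are assumptions.
    total = df.count()
    null_keys = df.filter(F.col("order_id").isNull()).count()
    duplicates = total - df.dropDuplicates(["order_id"]).count()

    if null_keys > 0:
        raise ValueError(f"{null_keys} rows have a null order_id")
    if duplicates > 0:
        raise ValueError(f"{duplicates} duplicate order_id rows found")
    return {"rows": total, "null_keys": null_keys, "duplicates": duplicates}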