Highbrow Technology Inc

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with 7+ years of experience in Data Engineering, focusing on migration and consolidation. The position is remote, the pay rate is not disclosed, and it requires strong skills in Python, SQL, and cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
March 10, 2026
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
New Jersey, United States
🧠 - Skills detailed
#GDPR (General Data Protection Regulation) #Snowflake #Data Migration #Data Privacy #DMP (Data Management Platform) #AWS (Amazon Web Services) #Data Governance #Cloud #GCP (Google Cloud Platform) #Microservices #Python #FHIR (Fast Healthcare Interoperability Resources) #Migration #SQL (Structured Query Language) #PySpark #Data Integration #Oracle #Scala #Data Pipeline #Data Warehouse #BigQuery #Delta Lake #AI (Artificial Intelligence) #Knowledge Graph #Redshift #Azure #Data Processing #Synapse #Data Engineering #Databricks #ETL (Extract, Transform, Load) #SAP #Observability #Spark (Apache Spark) #Data Management #MDM (Master Data Management) #Kafka (Apache Kafka)
Role description
Senior Data Engineer – Migration & Consolidation
Experience Required: 7+ Years
Location: Remote (United States)
Openings: 20+
Travel: Up to 40% travel to client locations may be required

Required Skills
• 7+ years of experience in Data Engineering, System Integration, or Data Migration
• Strong experience with Python and SQL
• Hands-on experience with PySpark and modern data processing frameworks
• Experience with ETL/ELT and data integration tools
• Strong experience building scalable and maintainable data pipelines
• Experience with cloud platforms such as AWS, Azure, or GCP
• Experience with modern data warehouse and lakehouse platforms such as Snowflake, Databricks, BigQuery, Redshift, Synapse, or Delta Lake
• Experience integrating enterprise systems such as SAP and Oracle
• Knowledge of semantic modeling, knowledge graphs, or ontology frameworks
• Exposure to AI-driven tooling or LLM-based solutions in data workflows
• Strong problem-solving skills and the ability to work in complex enterprise data environments
• Experience collaborating with technical teams and enterprise stakeholders

Preferred Skills
• Experience with real-time streaming platforms such as Kafka, Kinesis, Event Hubs, or Pub/Sub
• Experience with enterprise Master Data Management platforms
• Knowledge of CI/CD pipelines and infrastructure-as-code tools
• Experience building APIs and microservices
• Understanding of data governance, data privacy, and regulatory frameworks such as GDPR, HIPAA, SOC 2, or CCPA
• Familiarity with data observability, catalog, and lineage platforms
• Knowledge of enterprise business processes such as financial operations, supply chain, procurement, or revenue workflows
• Knowledge of industry standards such as EDI, HL7, FHIR, SWIFT, and XBRL