GeoLogics Corporation

Data Engineer - Cybersecurity & GRC Experience - UPDATED 02/10/2026

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with Cybersecurity and GRC experience. It is remote, the contract length is unspecified, and the hourly rate is open. It requires expertise in data warehousing, ETL/ELT, SQL, and API integration. Candidates must be US citizens; familiarity with cloud platforms is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Python #Automated Testing #Informatica #Programming #Azure #Databricks #Microsoft Power BI #GIT #Data Integration #Automation #Cybersecurity #Data Engineering #MySQL #Scala #Security #ETL (Extract, Transform, Load) #Storage #SQL Server #Version Control #Datasets #Data Governance #ML (Machine Learning) #Data Analysis #GraphQL #AWS (Amazon Web Services) #BI (Business Intelligence) #SQL (Structured Query Language) #Data Pipeline #Compliance #Cloud #Java #Databases #PostgreSQL #Matillion #Snowflake
Role description
[Note from the recruiter: if you have the specific experience described below, make sure it shows in the body of your resume. Thanks.]

GeoLogics is working with Raytheon Technologies on a search for a Cyber Data Engineer with expertise in data warehousing, cybersecurity, governance, and ETL.

Remote | Must be a US Citizen | Hourly rate: OPEN

Summary:
We are seeking an experienced Data Engineer to join the Governance, Risk, and Compliance (GRC) team at Raytheon Technologies. The candidate will work closely with our GRC DevOps team and various IT and Cybersecurity stakeholders to design, implement, and maintain data warehousing solutions. This role focuses on building scalable data pipelines and models, transforming raw data (from structured, semi-structured, and unstructured sources) into curated datasets, and ensuring data is accessible for BI reporting and AI/ML use cases.

Responsibilities:
• Collaborate with Business and Data Analysts as well as Front-end and Full Stack AI Developers to understand data requirements and deliver scalable solutions that support large-scale automation initiatives incorporating AI/ML.
• Design, develop, and optimize ETL/ELT pipelines that process, model, and transform data from raw to curated layers, enabling seamless integration into published layers for BI and advanced analytics.
• Implement and manage data warehousing solutions using object storage, Snowflake, Databricks, Matillion, and Informatica.
• Develop and maintain APIs to facilitate secure and efficient data integration between IT, Cyber, and GRC systems, applications, and data pipelines.
• Ensure the accuracy, reliability, and scalability of data pipelines and data models.
• Support the ingestion, integration, and transformation of large datasets to meet IT, Cybersecurity, and GRC operational and reporting needs.
• General: Partner with stakeholders to understand their data and reporting requirements and provide tailored solutions.
• General: Stay informed on the latest advancements in data engineering, warehousing, and integration tools and methodologies.

Qualifications:
• Proven experience as a Data Engineer with a focus on data warehousing, ETL/ELT development, and pipeline design.
• Strong proficiency in SQL and experience with relational and non-relational databases (e.g., MySQL, PostgreSQL, SQL Server, Snowflake, Databricks).
• Experience building APIs, integrating data pipelines with RESTful or GraphQL APIs, and implementing CI/CD for pipelines and SQL transformations (Git workflows, automated testing, release/version control); a brief illustrative sketch follows this list.
• Hands-on experience with ETL/ELT tools and platforms such as Matillion, Informatica, or equivalent.
• Proficiency in programming languages such as Python or Java for building and optimizing data pipelines.
• General: Expertise in cloud platforms (AWS, Google Cloud, Azure) and their data services.
• General: Familiarity with BI tools like Power BI and an understanding of how to prepare data for reporting needs.
• General: Strong analytical and problem-solving skills with a focus on delivering high-quality, scalable solutions.
• General: Excellent communication and collaboration skills for cross-functional teamwork.
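To make the raw-to-curated and CI/CD expectations above concrete, here is a minimal, hypothetical sketch. Nothing in it comes from GeoLogics or Raytheon: SQLite stands in for the warehouse platforms named in the posting (Snowflake, Databricks), and the raw_findings / curated_findings tables and their columns are invented GRC-flavored examples. It shows one way raw, semi-structured records might be landed in a raw layer, flattened into a typed curated table for BI, and covered by the kind of automated test a Git-based CI workflow could run on each commit. It assumes a SQLite build with the JSON1 functions, which most recent Python distributions include.

```python
"""Illustrative raw-to-curated transform sketch (not from the posting).

Assumptions: SQLite stands in for the warehouse, and the raw_findings /
curated_findings tables are hypothetical GRC-style examples.
"""
import json
import sqlite3


def load_raw(conn: sqlite3.Connection, records: list[dict]) -> None:
    """Land semi-structured records as-is in a raw layer table."""
    conn.execute("CREATE TABLE IF NOT EXISTS raw_findings (payload TEXT)")
    conn.executemany(
        "INSERT INTO raw_findings (payload) VALUES (?)",
        [(json.dumps(r),) for r in records],
    )


def transform_to_curated(conn: sqlite3.Connection) -> None:
    """Flatten raw JSON payloads into a typed, curated table for BI use."""
    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS curated_findings (
            finding_id TEXT PRIMARY KEY,
            severity   TEXT NOT NULL,
            system     TEXT NOT NULL
        )
        """
    )
    conn.execute(
        """
        INSERT OR REPLACE INTO curated_findings (finding_id, severity, system)
        SELECT json_extract(payload, '$.id'),
               UPPER(json_extract(payload, '$.severity')),
               json_extract(payload, '$.system')
        FROM raw_findings
        WHERE json_extract(payload, '$.id') IS NOT NULL
        """
    )


def test_transform() -> None:
    """The kind of automated check a CI workflow might run on each commit."""
    conn = sqlite3.connect(":memory:")
    load_raw(conn, [{"id": "F-1", "severity": "high", "system": "erp"},
                    {"severity": "low", "system": "hr"}])  # missing id: dropped
    transform_to_curated(conn)
    rows = conn.execute(
        "SELECT finding_id, severity FROM curated_findings"
    ).fetchall()
    assert rows == [("F-1", "HIGH")]


if __name__ == "__main__":
    test_transform()
    print("curated layer transform OK")
```

In practice the same pattern would be expressed in the posting's named tooling (Matillion or Informatica jobs, Snowflake or Databricks SQL), with the test run by a Git-triggered pipeline rather than a local script.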
Preferred Qualifications (pluses, not required):
• Experience working on Cybersecurity or GRC-related projects or in related industries.
• Working knowledge of machine learning and AI concepts (model registry/model hub workflows or equivalent).
• Familiarity with data governance, security, and compliance principles.
• Understanding of regulatory compliance standards and frameworks.

This role offers the opportunity to work on impactful projects that bridge data engineering, analytics, and AI to drive innovation and efficiency in the GRC domain.

Interested? Please send your resume to sgephart@geologics.com.

Sam Gephart
Recruiter, GeoLogics Corporation
888-303-3603
sgephart@geologics.com