

GeoLogics Corporation
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with expertise in data warehousing and ETL, offering a remote contract with an open hourly rate. Key skills include SQL, Python, and experience with Snowflake and Databricks. Cybersecurity or GRC industry experience is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Cybersecurity #Programming #Azure #Security #Informatica #Matillion #SQL (Structured Query Language) #Java #AWS (Amazon Web Services) #SQL Server #AI (Artificial Intelligence) #Python #ETL (Extract, Transform, Load) #Cloud #Microsoft Power BI #Compliance #Databricks #Storage #Scala #Automation #PostgreSQL #Data Governance #Data Engineering #Data Integration #BI (Business Intelligence) #MySQL #ML (Machine Learning) #Data Pipeline #GraphQL #Data Analysis #Databases #Datasets #Snowflake
Role description
GeoLogics is working with Raytheon Technologies in search of a Data Engineer with expertise in data warehousing and ETL.
Raytheon Technologies
Job Title: Data Engineer
Location: Remote
Must be a US Citizen
Hourly rate: OPEN
Job Summary:
We are seeking an experienced Data Engineer to join the Governance, Risk, and Compliance (GRC) team at Raytheon Technologies. The candidate will work closely with our GRC DevOps team and various IT and Cybersecurity stakeholders to design, implement, and maintain data warehousing solutions. This role focuses on building scalable data pipelines and models, transforming raw data into curated datasets, and ensuring data is accessible for BI reporting and AI/ML use cases. This role offers the opportunity to work on impactful projects that bridge data engineering, analytics, and AI to drive innovation and efficiency in the GRC domain.
Key Responsibilities:
• Collaborate with Business and Data Analysts, as well as Front-end and Full Stack AI Developers, to understand data requirements and deliver scalable solutions that support large-scale automation initiatives incorporating AI/ML.
• Design, develop, and optimize ETL/ELT pipelines to process, model, and transform data from raw to curated layers, enabling seamless integration into published layers for BI and advanced analytics.
• Implement and manage data warehousing solutions using object storage, Snowflake, Databricks, Matillion, and Informatica.
• Develop and maintain APIs to facilitate secure and efficient data integration between various IT, Cyber, and GRC systems, applications, and data pipelines.
• Ensure the accuracy, reliability, and scalability of data pipelines and data models.
• Support the ingestion, integration, and transformation of large datasets to meet IT, Cybersecurity, and GRC operational and reporting needs.
• Partner with stakeholders to understand their data and reporting requirements and provide tailored solutions.
• Stay informed on the latest advancements in data engineering, warehousing, and integration tools and methodologies.
Qualifications:
• Proven experience as a Data Engineer with a focus on data warehousing, ETL/ELT development, and pipeline design.
• Strong proficiency in SQL and experience with relational and non-relational databases (e.g., MySQL, PostgreSQL, SQL Server, Snowflake, Databricks).
• Experience building APIs and integrating data pipelines with RESTful or GraphQL APIs.
• Hands-on experience with ETL/ELT tools and platforms such as Matillion, Informatica, or equivalent.
• Proficiency in programming languages such as Python or Java for building and optimizing data pipelines.
• Expertise in cloud platforms (AWS, Google Cloud, Azure) and their data services.
• Familiarity with BI tools like Power BI and an understanding of how to prepare data for reporting needs.
• Strong analytical and problem-solving skills with a focus on delivering high-quality, scalable solutions.
• Excellent communication and collaboration skills for cross-functional teamwork.
Preferred Qualifications (pluses, not required):
• Experience working on Cybersecurity or GRC-related projects or industries.
• Working knowledge of machine learning and AI concepts.
• Familiarity with data governance, security, and compliance principles.
• Understanding of regulatory compliance standards and frameworks.
If interested, please send your resume to sgephart@geologics.com.
Sam Gephart
Recruiter
GeoLogics Corporation
888-303-3603
sgephart@geologics.com





