

GeoLogics Corporation
Data Engineer - Building Pipeline - ETL/ELT
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with expertise in ETL/ELT, data warehousing, and pipeline building. It offers remote work (US citizenship required) on a contract of unspecified duration, at hourly rates of $65-$70 (W2) or $80-$85 (C2C). Key skills include Python, SQL, API integration, and experience with Snowflake and Databricks.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date
March 6, 2026
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Databases #ETL (Extract, Transform, Load) #SQL Server #AI (Artificial Intelligence) #Programming #Datasets #Data Pipeline #Microsoft Power BI #Compliance #SQL (Structured Query Language) #.Net #Normalization #Snowflake #GIT #Cloud #Security #PostgreSQL #Data Modeling #Azure #Scala #Data Engineering #Version Control #Java #Cybersecurity #AWS (Amazon Web Services) #BI (Business Intelligence) #Data Integration #API (Application Programming Interface) #Automated Testing #Databricks #Data Analysis #Matillion #Informatica #MySQL #Storage #C# #Automation #ML (Machine Learning) #Python #GraphQL
Role description
GeoLogics is working with Raytheon Technologies in search of a Data Engineer with expertise in data warehousing, ETL/ELT, and pipeline building.
Key details:
• Coding API integrations and ETL/ELT (a minimal Python sketch appears below)
• ETL/ELT: data modeling and normalization across disparate data sets
• ETL and data movement: within and across ADLS/Blob storage, the Snowflake medallion structure, and Databricks
• Coding languages: Python, Java, C#/.NET, SQL
Remote
Must be a US Citizen
Hourly rate: $65-$70 (W2, non-benefited) or $80-$85 (C2C, through your own company)
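The bullets above mention coding API integrations and ETL/ELT. As a rough illustration of that pattern, the Python sketch below pulls records from a REST endpoint, normalizes them, and lands them in a staging file. The endpoint URL, field names, and staging path are hypothetical placeholders, not details from this listing.

```python
"""Minimal extract-transform-stage sketch for a REST API integration.
All names below (endpoint URL, fields, staging path) are illustrative
assumptions; the posting does not specify a schema or endpoint."""

import csv
from pathlib import Path

import requests

API_URL = "https://api.example.com/v1/findings"  # hypothetical endpoint
STAGING_PATH = Path("staging/findings_raw.csv")  # hypothetical landing zone


def extract(url: str) -> list[dict]:
    """Fetch raw JSON records, failing fast on HTTP errors."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()


def transform(records: list[dict]) -> list[dict]:
    """Normalize raw records into the fields downstream layers expect."""
    return [
        {
            "finding_id": r.get("id"),
            "severity": (r.get("severity") or "unknown").lower(),
            "reported_at": r.get("reportedAt"),
        }
        for r in records
    ]


def load(rows: list[dict], path: Path) -> None:
    """Write the curated rows to a CSV staging file."""
    path.parent.mkdir(parents=True, exist_ok=True)
    with path.open("w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["finding_id", "severity", "reported_at"])
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    load(transform(extract(API_URL)), STAGING_PATH)
```

In a pipeline like the one this role describes, the staging step would feed Snowflake or Databricks rather than a local CSV; the extract/transform/load split stays the same.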
Summary:
We are seeking an experienced Data Engineer to join the Governance, Risk, and Compliance (GRC) team at Raytheon Technologies. The candidate will work closely with our GRC DevOps team and various IT and Cybersecurity stakeholders to design, implement, and maintain data warehousing solutions. This role focuses on building scalable data pipelines and models, transforming raw data (from structured, semi-structured, and unstructured sources) into curated datasets, and ensuring data is accessible for BI reporting and AI/ML use cases.
Responsibilities:
• Collaborate with Business and Data Analysts, as well as Front-End and Full-Stack AI Developers, to understand data requirements and deliver scalable solutions that support large-scale automation initiatives incorporating AI/ML.
• Design, develop, and optimize ETL/ELT pipelines to process, model, and transform data from raw to curated layers, enabling seamless integration into published layers for BI and advanced analytics (see the medallion sketch after this list).
• Implement and manage data warehousing solutions using object storage, Snowflake, Databricks, Matillion, and Informatica.
• Develop and maintain APIs to facilitate secure and efficient data integration between various IT, Cyber, and GRC systems, applications, and data pipelines.
• Ensure the accuracy, reliability, and scalability of data pipelines and data models.
• Support the ingestion, integration, and transformation of large datasets to meet IT, Cybersecurity, and GRC operational and reporting needs.
• General: Partner with stakeholders to understand their data and reporting requirements and provide tailored solutions.
• General: Stay informed on the latest advancements in data engineering, warehousing, and integration tools and methodologies.
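The raw-to-curated responsibilities above map to the bronze and silver layers of the medallion structure the posting mentions. As a hedged sketch of that promotion step, the PySpark snippet below types, normalizes, and deduplicates a raw table into a curated one; the table and column names are invented for illustration, since the listing does not specify a schema.

```python
"""Hedged bronze-to-silver sketch for the medallion pattern named in
the posting. Table and column names are illustrative assumptions."""

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("grc-bronze-to-silver").getOrCreate()

# Bronze: raw, append-only landing data (hypothetical table name).
bronze = spark.read.table("grc_bronze.vuln_scans")

# Silver: typed, normalized, deduplicated records ready for BI/AI layers.
silver = (
    bronze
    .withColumn("scan_ts", F.to_timestamp("scan_ts"))     # enforce types
    .withColumn("severity", F.lower(F.trim("severity")))  # normalize values
    .filter(F.col("asset_id").isNotNull())                # basic quality gate
    .dropDuplicates(["asset_id", "scan_ts"])              # one row per scan
)

silver.write.mode("overwrite").saveAsTable("grc_silver.vuln_scans")
```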
Qualifications:
• Proven experience as a Data Engineer with a focus on data warehousing, ETL/ELT development, and pipeline design.
• Strong proficiency in SQL and experience with relational and non-relational databases (e.g., MySQL, PostgreSQL, SQL Server, Snowflake, Databricks).
• Experience building APIs, integrating data pipelines with RESTful or GraphQL APIs, and implementing CI/CD for pipelines and SQL transformations (Git workflows, automated testing, release/version control); a pytest sketch follows this list.
• Hands-on experience with ETL/ELT tools and platforms such as Matillion, Informatica, or equivalent.
• Proficiency in programming languages such as Python or Java for building and optimizing data pipelines.
• General: Expertise in cloud platforms (AWS, Google Cloud, Azure) and their data services.
• General: Familiarity with BI tools like Power BI and an understanding of how to prepare data for reporting needs.
• General: Strong analytical and problem-solving skills with a focus on delivering high-quality, scalable solutions.
• General: Excellent communication and collaboration skills for cross-functional teamwork.
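The CI/CD qualification above implies SQL transformations that can be tested automatically on every commit. As a hedged sketch, the pytest case below checks a small deduplication query against an in-memory SQLite database so it can run in CI without warehouse access; the schema and dedup rule are invented for illustration, and a real pipeline would point the same test pattern at Snowflake or Databricks.

```python
"""Hedged example of unit-testing a SQL transformation in CI.
Schema, data, and the dedup rule are illustrative assumptions."""

import sqlite3

import pytest

# Transformation under test: keep only the latest scan per asset.
DEDUP_SQL = """
SELECT asset_id, severity, scan_ts
FROM (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY asset_id ORDER BY scan_ts DESC
           ) AS rn
    FROM raw_scans
)
WHERE rn = 1
"""


@pytest.fixture
def conn():
    c = sqlite3.connect(":memory:")  # no warehouse needed in CI
    c.execute("CREATE TABLE raw_scans (asset_id TEXT, severity TEXT, scan_ts TEXT)")
    c.executemany(
        "INSERT INTO raw_scans VALUES (?, ?, ?)",
        [
            ("srv-01", "high", "2026-01-01"),
            ("srv-01", "low", "2026-02-01"),  # newer row should win
            ("srv-02", "medium", "2026-01-15"),
        ],
    )
    yield c
    c.close()


def test_dedup_keeps_latest_scan_per_asset(conn):
    rows = {a: (s, t) for a, s, t in conn.execute(DEDUP_SQL)}
    assert rows["srv-01"] == ("low", "2026-02-01")
    assert len(rows) == 2
```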