

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
592
-
🗓️ - Date discovered
September 16, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Snowflake #AzureCloud #DataPipeline #Databricks #Datasets #SQLServer #Cloud #Snowpipe #Scala #SQL #DataEngineering #ETL #DataIngestion #SnowSQL #Azure #DataAnalysis
Role description
Job Summary
Optomi, in partnership with a leading telecommunications company, is seeking a data engineer to join their team. You will play a key role in transitioning legacy SQL Server processes to Snowflake, managing ETL jobs, and ensuring data pipelines are efficient, scalable, and reliable.
Qualifications Required:
• Strong hands-on experience with SQL and Snowflake.
• Solid understanding of ETL processes and data pipeline orchestration.
• Experience with Azure cloud services.
• Experience working in a Databricks environment.
• Familiarity with Infoworks as a data ingestion/scheduling platform.
• Self-motivated, curious, and able to thrive in a fast-moving, collaborative environment.
• Excellent communicator and team player—confident but humble.
Job Responsibilities:
• Manage and maintain data ingestion jobs using Infoworks for scheduling.
• Build and optimize scalable data pipelines and ETL processes from various sources into Snowflake.
• Migrate existing processes from SQL Server to Snowflake, including stored procedures.
• Collaborate with data analysts, engineers, and stakeholders to define and deliver clean, well-modeled datasets.
• Take ownership of your work—always looking for the next challenge and opportunities to improve.
• Ensure flexibility and reliability in scheduling and data operations.
Nice to Haves:
• Experience with the broader Snowflake ecosystem (e.g., Snowpipe, SnowSQL).
• Domain knowledge in HR, HRIS, or People Analytics.
• Previous experience building in modern cloud data environments.