

InstantServe LLC
Data Warehouse Engineer
Featured Role | Apply direct with Data Freelance Hub
This is a remote Data Warehouse Engineer contract role; the contract length and pay rate are unspecified. Key skills include Azure, Snowflake, SQL, and data modeling. A Bachelor's degree and 5 years of programming experience are required.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: March 4, 2026
Duration: Unknown
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Nevada, United States
Skills detailed: #Programming #Data Warehouse #Scripting #ETL (Extract, Transform, Load) #Cloud #Statistics #Databases #Web Services #Java #SQL (Structured Query Language) #C++ #Visualization #Scala #Scrum #Snowflake #Tableau #Datasets #Azure #Agile
Role description
This role develops and implements pipelines that extract, transform, and load data into an information product, helping the organization build a comprehensive data repository. It focuses on ingesting, storing, processing, and analyzing large datasets; creates scalable, high-performance web services for tracking data; translates complex technical and functional requirements into detailed designs; creates scripts to integrate various data sources; investigates options for data storage and processing to ensure the most streamlined solutions; and works closely with the statistics team to implement data analytics pipelines.
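For illustration, the extract/transform/load flow described above can be sketched as follows. This is a minimal example using Python's built-in sqlite3 as a stand-in for the actual Snowflake/Azure warehouse; the table name, fields, sample rows, and status-normalization transform are all hypothetical, not taken from the posting.

```python
import sqlite3

def extract():
    # In practice this would pull from an API, file drop, or source database.
    return [
        {"case_id": "C-001", "filed": "2026-01-15", "status": "open "},
        {"case_id": "C-002", "filed": "2026-02-03", "status": "CLOSED"},
    ]

def transform(rows):
    # Normalize free-text status values before loading.
    return [{**row, "status": row["status"].strip().lower()} for row in rows]

def load(rows, conn):
    # Idempotent load: re-running the pipeline replaces rows by primary key.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cases "
        "(case_id TEXT PRIMARY KEY, filed TEXT, status TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO cases VALUES (:case_id, :filed, :status)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT case_id, status FROM cases ORDER BY case_id").fetchall())
```

Running this prints `[('C-001', 'open'), ('C-002', 'closed')]`. The same extract/transform/load separation scales to a real warehouse by swapping the sqlite3 connection for a Snowflake or Azure SQL connection.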
Experience and Skills Required-
• Experience with Azure, Snowflake, or other cloud technologies
• Understanding of data-warehousing and data-modeling techniques
• Knowledge of visualization and analytics tools (e.g., Tableau, Power BI)
• SQL, C++, Java, or another data programming language
• Knowledge of databases, stored procedures, and common scripting languages
• Good interpersonal, communication, and organizational skills
Experience and Skills Preferred-
• 5 years' programming experience
• Bachelor's degree or an equivalent combination of education and experience, such as relevant IT certifications
• Experience with Scrum or Agile methodologies
• Knowledge of national standards such as USJR, NODS, NIEM, and/or Justice Counts a benefit
• Knowledge of courts and justice processes a plus






