Data Engineer with Scope Scripting Experience

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with Scope Scripting experience, requiring 10-15 years of experience. It is a C2C contract with on-site work in Redmond, Washington. Key skills include Scope/iScope scripting, SQL, Azure Data Services, and data visualization tools.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
May 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Redmond, WA
🧠 - Skills detailed
#Data Architecture #Python #Scala #Scripting #Azure #BI (Business Intelligence) #Consulting #Data Pipeline #Grafana #Storage #Data Processing #C# #ETL (Extract, Transform, Load) #Data Engineering #Visualization #Synapse #Data Quality #ADF (Azure Data Factory) #Data Lake #SQL (Structured Query Language) #Microsoft Power BI #Data Enrichment
Role description
Who We Are
Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.

Job Description
Job Title: Data Engineer with Scope Scripting Experience
Job Type: C2C
Experience: 10-15 Years
Location: Redmond, Washington (On-site)

Required Skills
• Mandatory: Experience with Scope/iScope scripting and Cosmos VCs
• Strong proficiency in SQL, C#, and scripting languages
• Experience with Python and/or Scala is a plus
• Proficiency in Azure Data Services (ADF, Synapse, Data Lake)
• Familiarity with ETL processes and data enrichment pipelines
• Expertise in data visualization tools (Power BI, Grafana)
• Strong analytical and problem-solving skills
• Knowledge of data quality validation and KPI development

Responsibilities
• Design and develop scalable end-to-end data pipelines
• Write and maintain Scope/iScope scripts for large-scale data processing
• Construct and optimize data architectures and storage systems
• Perform data validation and define effective data KPIs
• Develop and deploy pipelines using Azure Synapse, ADF, and Data Lakes
• Generate insights through data engineering and analytics
• Manage Azure subscriptions and monitor data environments
• Visualize data using tools such as Power BI and Grafana
• Collaborate with stakeholders on data requirement gathering and delivery

Qualification
• Bachelor's degree or equivalent combination of education and experience.
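The responsibilities above include data quality validation and KPI development. As a rough illustration of what that kind of work involves, the sketch below shows a minimal validation-plus-KPI pass in Python; the record fields, quality rules, and KPI names are hypothetical examples, not taken from this posting.

```python
from dataclasses import dataclass

@dataclass
class Record:
    # Hypothetical record shape for illustration only.
    user_id: str
    revenue: float

def validate(records):
    """Split records into valid and invalid using simple quality rules:
    a record must have a non-empty user_id and non-negative revenue."""
    valid, invalid = [], []
    for r in records:
        (valid if r.user_id and r.revenue >= 0 else invalid).append(r)
    return valid, invalid

def kpis(records):
    """Compute example KPIs over the validated subset of records."""
    valid, _invalid = validate(records)
    total = len(records)
    return {
        "row_count": total,
        "valid_ratio": len(valid) / total if total else 0.0,
        "total_revenue": sum(r.revenue for r in valid),
    }

if __name__ == "__main__":
    data = [Record("u1", 10.0), Record("", 5.0), Record("u2", -1.0)]
    print(kpis(data))
```

In a production pipeline these checks would typically run inside the data platform itself (e.g. as a pipeline stage), with the KPI output feeding a dashboard in a tool like Power BI or Grafana.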