

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer contract (W2) position, remote, requiring 12 years of experience, strong skills in Snowflake, Python or DB2, BI tools (Looker, Grafana), and cloud environments (AWS, Azure, GCP).
Country: United States
Currency: $ USD
Day rate: -
Date discovered: June 17, 2025
Project duration: Unknown
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Grafana #Cloud #BI (Business Intelligence) #Snowflake #SQL (Structured Query Language) #Scala #Python #Datadog #Looker #AWS (Amazon Web Services) #Prometheus #GCP (Google Cloud Platform) #Dimensional Modeling #Monitoring #Data Modeling #Azure #ETL (Extract, Transform, Load) #Data Engineering #Data Pipeline
Role description
Job Title: Data Engineer
Location: Remote
Type: Contract (strictly W2)
Open to candidates on any independent visa; H-1B transfers also considered.
Job Summary:
We are seeking a skilled Data Engineer with strong experience in Snowflake, data modeling, and building robust data pipelines using Python or DB2. The ideal candidate should also have hands-on expertise with BI tools like Looker, Grafana, or equivalents, and a strong understanding of infrastructure monitoring and alerting systems.
This role will play a key part in designing scalable data solutions, ensuring data availability, and integrating analytics systems to support critical business decisions.
Required Skills & Qualifications:
• 12 years of professional experience as a Data Engineer or in a similar role.
• Strong hands-on experience with Snowflake (warehousing, performance tuning, role-based access control).
• Proficiency in data modeling (dimensional modeling, star/snowflake schemas).
• Experience building ETL/ELT pipelines in Python and/or DB2.
• Familiarity with Looker, Grafana, or other BI/reporting tools.
• Exposure to infrastructure monitoring systems (e.g., Prometheus, Datadog, CloudWatch, PagerDuty).
• Strong understanding of SQL and performance optimization techniques.
• Experience working with cloud environments (AWS, Azure, or GCP).
• Excellent communication and collaboration skills.
Interested candidates are requested to submit their updated resumes to Raj@tekvividinc.com