

Big Data Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Developer with a contract length of "unknown" and a pay rate of "unknown." Candidates must have 10+ years of experience, proficiency in Snowflake, Qlik, and Oracle, and strong programming skills in Python, Java, or Scala.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
July 29, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
Houston, TX
Skills detailed
#Data Pipeline #Java #Big Data #dbt (data build tool) #Python #Data Governance #Scala #Qlik #Security #Databases #Oracle #ETL (Extract, Transform, Load) #Data Engineering #Cloud #Programming #Data Warehouse #Data Integrity #Snowflake #Data Integration #SQL (Structured Query Language)
Role description
We are seeking a seasoned Senior Data Engineer with 10+ years of experience in designing and implementing large-scale data solutions across cloud and on-premises environments. The ideal candidate will be proficient in modern data integration frameworks and skilled in building scalable, high-performance data pipelines.
Key Responsibilities:
- Design, develop, and implement robust data pipelines and ETL solutions for the Snowflake Cloud Data Warehouse (a minimal loading sketch follows this list).
- Integrate data from various sources using tools such as Qlik, dbt, and traditional databases (Oracle, SQL).
- Use programming languages such as Python, Java, or Scala to create custom data workflows.
- Collaborate with stakeholders to support end-user testing and ensure data integrity.
- Conduct data validation, quality checks, and analytics to maintain high reporting standards.
- Follow established IT policies, security standards, and best practices throughout the project lifecycle.
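To make the pipeline work concrete, here is a minimal sketch of staging and bulk-loading a CSV extract into Snowflake using the snowflake-connector-python package. The warehouse, database, schema, stage, and table names are hypothetical placeholders, not details taken from this role.

```python
import os
import snowflake.connector  # pip install snowflake-connector-python

def load_orders() -> None:
    """Stage a local CSV extract and bulk-load it into a Snowflake table."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="RAW",          # hypothetical schema
    )
    try:
        cur = conn.cursor()
        # PUT uploads (and gzip-compresses) the file to a named internal stage.
        cur.execute("PUT file:///tmp/orders.csv @raw_stage OVERWRITE = TRUE")
        # COPY INTO performs the validated bulk load from the stage.
        cur.execute(
            "COPY INTO orders FROM @raw_stage/orders.csv.gz "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) "
            "ON_ERROR = 'ABORT_STATEMENT'"
        )
    finally:
        conn.close()

if __name__ == "__main__":
    load_orders()
```

In practice a workflow like this would run under an orchestrator and pull connection details from a secrets manager rather than plain environment variables.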
Required Skills:
- Proven experience with Snowflake, Qlik Replicate, Oracle, SQL, and dbt.
- Strong programming background in Python, Java, or Scala.
- Deep understanding of data pipeline design and performance optimization.
- Expertise in both cloud and on-premises data integration strategies.
- Strong grasp of data governance, testing methodology, and reporting best practices (see the validation sketch below).
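As an illustration of the data validation and quality checks mentioned above, the following sketch flags common row-level problems in a pandas extract. The column names and rules are hypothetical; in production, checks like these would typically live in dbt tests or a dedicated data-quality framework.

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values")
    if df["amount"].lt(0).any():
        failures.append("negative amounts")
    if df["order_date"].isna().any():
        failures.append("missing order_date")
    return failures

# Small batch with one of each defect, to show what the checks catch.
batch = pd.DataFrame(
    {
        "order_id": [1, 2, 2],
        "amount": [10.0, -5.0, 3.5],
        "order_date": ["2025-07-01", None, "2025-07-02"],
    }
)
print(validate_orders(batch))
# ['duplicate order_id values', 'negative amounts', 'missing order_date']
```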