

Data Engineer with Snowflake & AWS
Featured Role | Apply direct with Data Freelance Hub
This is a Data Engineer position based in Plano, Texas, running more than 6 months and offering $50.00 - $55.00 per hour. It requires 8+ years of IT experience and expertise in Snowflake, AWS, SQL, and data pipeline tools.
Country
United States
Currency
$ USD
Day rate
440
Date discovered
August 2, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Plano, TX 75075
Skills detailed
#GIT #SQL (Structured Query Language) #Azure #Data Lake #Jira #GCP (Google Cloud Platform) #Talend #Data Pipeline #Big Data #Version Control #Automation #BitBucket #Informatica #Airflow #Databases #Python #ETL (Extract, Transform, Load) #Cloud #Data Engineering #AWS (Amazon Web Services) #Scripting #dbt (data build tool) #Oracle #Migration #Deployment #Bash #Hadoop #Apache Airflow #Snowflake
Role description
This is a fully onsite position in Plano, Texas (5 days a week in office); anyone with less than 8 years of experience will be automatically rejected.
- 8+ years of IT experience troubleshooting data-driven defects and Snowflake/data lake/AWS integration
- The Data Engineer supports research on data needed for new functionality and troubleshoots data-driven defects.
- Experience writing SQL for at least one relational database, e.g., Oracle
- Extensive knowledge of Hive SQL on Hadoop
- Experience with version control through Bitbucket, issue tracking through Jira, and CI/CD
- Extensive hands-on experience with the Snowflake cloud data platform (see the Snowflake/AWS load sketch after this list).
- Strong proficiency in SQL and performance tuning in Snowflake.
- Experience with data pipeline tools and ETL/ELT processes (e.g., Apache Airflow, Talend, Informatica, dbt); a minimal Airflow example also follows this list.
- Solid understanding of cloud ecosystems (AWS, Azure, or GCP) and how Snowflake integrates with them.
- Experience with data modeling, warehousing concepts, and best practices.
- Knowledge of scripting languages (Python, Bash) for automation and orchestration.
- 4 to 5 years of experience in data analytics, DWH, and big data
- Should have experience in developing, deploying, and publishing code
- Experience with migration projects is an added advantage
- Experience in an AWS environment is an added advantage
- Git, CI/CD pipelines, and Talend experience is a plus.
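To give a flavor of the Snowflake/data lake/AWS integration work mentioned above, here is a minimal sketch of bulk-loading files from an S3 stage into a Snowflake table with the snowflake-connector-python package. The account, user, storage integration, stage, and table names are hypothetical placeholders, not details from this role.

```python
"""Minimal sketch: load S3 files into Snowflake via an external stage (all names hypothetical)."""
import snowflake.connector

# Connection parameters are placeholders; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.us-east-1",   # hypothetical account identifier
    user="ETL_SVC_USER",           # hypothetical service user
    password="***",                # never hard-code credentials in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # External stage pointing at an S3 data-lake prefix (assumes a pre-built storage integration).
    cur.execute("""
        CREATE STAGE IF NOT EXISTS raw.orders_stage
          URL = 's3://example-data-lake/orders/'
          STORAGE_INTEGRATION = s3_int
          FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Bulk-load new files; Snowflake skips files it has already loaded into this table.
    cur.execute("COPY INTO raw.orders FROM @raw.orders_stage ON_ERROR = 'ABORT_STATEMENT'")
    # Per-file load results are useful when troubleshooting data-driven defects.
    print(cur.fetchall())
finally:
    conn.close()
```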
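Likewise, the data pipeline requirement might look something like the following Apache Airflow DAG sketch; the DAG id, tasks, schedule, and object names are illustrative assumptions only, not this client's actual pipeline.

```python
"""Minimal Airflow DAG sketch for a daily extract-and-load pipeline (names are assumptions)."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    # Placeholder: pull data from a source system and land it in the data lake.
    print("extracting to s3://example-data-lake/orders/")


def load_into_snowflake(**context):
    # Placeholder: run COPY INTO (or dbt models) against Snowflake.
    print("loading into ANALYTICS.RAW.ORDERS")


with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)
    extract >> load  # simple linear dependency: extract first, then load
```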
Job Types: Full-time, Contract
Pay: $50.00 - $55.00 per hour
Application Question(s):
Can you work 5 days a week in the office?
Are you OK with the given rate of 55 - 60 USD/hour?
Do you have hands-on experience with relational databases?
Do you have 8+ years of experience troubleshooting data-driven defects and Snowflake/data lake/AWS integration?
Do you have a minimum of 5 years of experience with Apache Airflow, Talend, Informatica, or dbt?
Do you have 4 to 5 years of experience in data analytics, DWH, and big data?
Do you have proficiency with Python or Bash for automation and orchestration?
Are you a US citizen?
Ability to Commute:
Plano, TX 75075 (Required)
Ability to Relocate:
Plano, TX 75075: Relocate before starting work (Required)
Work Location: In person