

Data Scientist
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Scientist in Dallas, TX, requiring 5+ years of experience in data architecture or engineering. Key skills include SQL, Python, and cloud platforms. A bachelor's or master's degree in a related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
August 7, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Data Manipulation #GCP (Google Cloud Platform) #Airflow #Spark (Apache Spark) #SQL (Structured Query Language) #Data Modeling #BigQuery #Cloud #Data Science #Apache Spark #dbt (data build tool) #SQL Server #Normalization #Python #AWS (Amazon Web Services) #Data Architecture #MySQL #Data Engineering #PostgreSQL #Azure #Data Warehouse #Databases #Snowflake #Scripting #Computer Science #Redshift
Role description
Title: Data Scientist
Location: Dallas, TX
Job Description:
Bachelor's or master's degree in Computer Science, Data Science, Information Systems, or a related field.
5+ years of professional experience in data architecture, data engineering, or a related field.
Expert in SQL (e.g., complex joins, CTEs, window functions, performance tuning).
Strong experience with Python for data manipulation, scripting, and pipeline development.
Proficiency with relational databases (e.g., PostgreSQL, MySQL, SQL Server) and cloud data warehouses (e.g., Snowflake, Redshift, BigQuery).
Familiarity with data modeling techniques (e.g., star schema, snowflake schema, normalization).
Hands-on experience with tools like Airflow, dbt, Apache Spark, or similar is a plus.
Knowledge of cloud platforms (AWS, GCP, or Azure) is a strong advantage.
Strong analytical and problem-solving skills.
Excellent communication and collaboration skills.