ConglomerateIT

Dataiku Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Dataiku Developer on a 12+ month contract in Richardson, TX (Hybrid), offering a competitive pay rate. Key skills include Dataiku DSS, Python, SQL, ETL solutions, and cloud platforms. A bachelor's degree in a related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
February 19, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Richardson, TX
-
🧠 - Skills detailed
#Pandas #Scala #Complex Queries #Data Architecture #Data Processing #Automation #Azure #Data Engineering #Data Modeling #Dataiku #GIT #Python #Data Pipeline #Cloud #Programming #Agile #Deployment #Scrum #"ETL (Extract, Transform, Load)" #Mathematics #AWS (Amazon Web Services) #GCP (Google Cloud Platform) #Datasets #SQL (Structured Query Language) #Version Control #Computer Science
Role description
Job Title: Dataiku Developer
Tax Term: W2/1099
Location: Richardson, TX (Hybrid)
Employment Type: Contract
Duration: 12+ months
About Us
ConglomerateIT is a certified pioneer in providing premium end-to-end Global Workforce Solutions and IT Services to diverse clients across various domains. Visit us at http://www.conglomerateit.com. Our mission is to establish global, cross-cultural human connections that further the careers of our employees and strengthen the businesses of our clients. We are driven to use the power of our global network to connect businesses with the right people, without bias. We provide Global Workforce Solutions with affability.
Job Summary
We are looking for an experienced Dataiku Developer with strong Python and SQL expertise to design, build, and manage scalable data pipelines and analytics workflows on the Dataiku DSS platform. The ideal candidate will play a key role in developing datasets, automating processes, optimizing transformations, and supporting production deployments in a fast-paced enterprise environment.
Core Technical Requirements
• Hands-on experience with Dataiku DSS, including workflow development and dataset creation
• Strong programming skills in Python (pandas) for data processing and transformation
• Advanced SQL expertise with the ability to develop and optimize complex queries
• Experience building and maintaining ETL/data pipeline solutions
• Solid understanding of data warehousing principles and data modeling concepts
• Experience working with large datasets and implementing data validation and quality checks
• Exposure to cloud platforms such as AWS, GCP, or Azure
• Understanding of scalable data architecture and distributed processing concepts
• Experience supporting and troubleshooting production workflows
• Ability to integrate data from multiple enterprise systems
• Working knowledge of CI/CD methodologies and deployment practices
• Experience with version control systems such as Git
• Familiarity with Agile/Scrum frameworks
• Understanding of workflow automation and release management processes
Soft Skills & Collaboration
• Strong analytical thinking and problem-solving abilities
• Excellent communication skills, both verbal and written
• Ability to collaborate effectively with cross-functional teams, including data engineers and business stakeholders
• Capability to translate business needs into technical solutions
Educational Qualifications
• Bachelor's degree in Computer Science, Engineering, Mathematics, or a related discipline
• Relevant professional experience in data engineering or analytics development
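As a rough illustration of the pandas-based data validation and quality checks mentioned above, here is a minimal sketch. The `validate_orders` function, its column names, and the sample data are hypothetical examples, not part of this role or any client system.

```python
import pandas as pd

def validate_orders(df: pd.DataFrame) -> list:
    """Run basic data-quality checks and return a list of issue descriptions."""
    issues = []
    # Required columns must be present (hypothetical schema)
    for col in ("order_id", "amount", "order_date"):
        if col not in df.columns:
            issues.append(f"missing column: {col}")
    # Primary key must be unique
    if "order_id" in df.columns and df["order_id"].duplicated().any():
        issues.append("duplicate order_id values")
    # Amounts must be non-null and non-negative
    if "amount" in df.columns:
        if df["amount"].isna().any():
            issues.append("null values in amount")
        elif (df["amount"] < 0).any():
            issues.append("negative values in amount")
    return issues

# Example run with deliberately bad rows
sample = pd.DataFrame({
    "order_id": [1, 2, 2],
    "amount": [10.0, -5.0, 7.5],
    "order_date": ["2026-02-01", "2026-02-02", "2026-02-03"],
})
print(validate_orders(sample))
# → ['duplicate order_id values', 'negative values in amount']
```

In a Dataiku DSS flow, a check of this kind would typically live in a Python recipe or a metrics/checks step between input and output datasets.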