Golden Technology

AI Assisted Data Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AI Assisted Data Developer on a contract basis, remote in ET or CT time zone. Requires 3+ years of data engineering experience, strong SQL, Python, Azure, Snowflake, and familiarity with AI tools like Claude.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 24, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Integration #Data Pipeline #Data Quality #AWS (Amazon Web Services) #AI (Artificial Intelligence) #ChatGPT #Computer Science #Data Engineering #Data Modeling #dbt (data build tool) #Compliance #Data Governance #Redshift #Snowflake #ETL (Extract, Transform, Load) #GitHub #Debugging #GCP (Google Cloud Platform) #Git #Talend #Deployment #Scala #SQL (Structured Query Language) #Azure #Informatica #Monitoring #ML (Machine Learning) #Documentation #Airflow #Data Privacy #Python #Security #Programming #Normalization #Automation #Cloud #Version Control #Spark (Apache Spark) #Data Warehouse #Kafka (Apache Kafka) #BigQuery
Role description
Your Next Career Move Starts Here!

The Role: AI Assisted Data Developer
Location: Remote in ET or CT time zone
Type: Contract

What You’ll Do
We are seeking an AI Assisted Data Developer who combines strong data engineering and analytics skills with the ability to leverage AI-powered tools to accelerate development, improve data quality, and optimize workflows. This role focuses on building scalable data solutions while using AI to enhance productivity, automate repetitive tasks, and generate insights more efficiently.

An AI Assisted Data Developer works with an LLM through the full SDLC: planning and analysis, development plans, work and build management, testing harnesses and deployment pipelines, and testing and refactoring are all done in Claude.

Key Responsibilities
• Develop, test, and deploy high-quality software solutions using modern programming languages and frameworks
• Design, build, and maintain scalable data pipelines, ETL/ELT processes, and data integrations
• Leverage AI-assisted tools to accelerate SQL development, data transformation, debugging, and documentation
• Develop and optimize data models for analytics, reporting, and operational use cases
• Validate, test, and refine AI-generated queries, pipelines, and code to ensure accuracy and performance
• Collaborate with business stakeholders, analysts, and engineers to translate requirements into data solutions
• Implement data quality checks, monitoring, and governance practices
• Integrate AI/ML capabilities (e.g., APIs, models, automation tools) into data workflows where applicable
• Build reusable components, templates, and prompt frameworks to improve team efficiency
• Document data flows, architectures, and AI-assisted processes

Required Qualifications
• Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field (or equivalent experience)
• 3+ years of experience in data engineering, data development, or analytics engineering
• Strong data and full-stack development experience, including Azure, Snowflake, Python, Informatica, and LLM tools such as Claude
• Strong SQL skills and experience with relational and/or cloud data warehouses (e.g., Snowflake, BigQuery, Redshift)
• Proficiency in at least one programming language (e.g., Python, Scala)
• Experience with ETL/ELT tools (e.g., dbt, Informatica, Airflow, Talend)
• Familiarity with AI-assisted development tools (e.g., Claude, GitHub Copilot, ChatGPT, CodeWhisperer)
• Understanding of data modeling concepts (star/snowflake schemas, normalization, dimensional modeling)
• Experience with version control (Git) and CI/CD practices

Additional Qualifications
• Experience integrating AI/ML models or working with large language models (LLMs) in data workflows
• Knowledge of prompt engineering for generating SQL, documentation, and transformations
• Experience with cloud platforms (AWS, Azure, GCP) and modern data stacks
• Familiarity with data governance, lineage, and cataloging tools
• Exposure to real-time/streaming data tools (e.g., Kafka, Spark Streaming)
• Understanding of data privacy, security, and compliance considerations

Key Skills
• AI-assisted development (prompting, iteration, validation)
• Data pipeline design and optimization
• Analytical thinking and problem-solving
• Data quality and governance mindset
• Communication with technical and non-technical stakeholders
• Adaptability in a rapidly evolving data and AI landscape

Why Golden Technology?
Founded in 1997, Golden Technology has grown from a two-person vision into a trusted partner for Fortune 500 clients nationwide. Along the way, we’ve built a culture centered on family, professional growth, and giving back to our communities through our Golden Community initiatives.

Here’s What Sets Us Apart
• Family-first culture that values balance and support.
• Career development through mentorship, opportunities, and investment in our people.
• Community impact by dedicating time, talent, and resources to the places we live and work.

What’s In It For You
• Exciting opportunities with innovative companies.
• Guidance from a recruiting team with decades of experience.
• A professional, supportive culture built on trust and collaboration.

Ready to take the next step? Apply today and let’s build your brighter future together.