OVA.Work

ETL Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior ETL Developer in CA, offering over 6 months of full-time work at a competitive pay rate. Requires 3+ years of ETL experience, strong SQL skills, and proficiency with tools like Informatica and Talend.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 9, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Scala #SSIS (SQL Server Integration Services) #Informatica #Redshift #Data Profiling #Business Analysis #DevOps #Data Engineering #Documentation #Computer Science #Big Data #Spark (Apache Spark) #Deployment #Azure #BigQuery #Data Warehouse #API (Application Programming Interface) #Snowflake #Data Manipulation #Data Analysis #Talend #Automation #Airflow #GCP (Google Cloud Platform) #Data Architecture #DataStage #Python #Cloud #Scripting #Data Modeling #Kafka (Apache Kafka) #SQL (Structured Query Language) #Data Integration #Data Pipeline #AWS Glue #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Code Reviews #dbt (data build tool) #Storage #Databases #SQL Server #Hadoop #Oracle
Role description
Job Title: ETL Developer
Location: CA
Employment Type: Full-time
Experience Level: Senior

About The Role
We are seeking a skilled ETL Developer to design, develop, and maintain robust data integration solutions. The ideal candidate will be responsible for transforming, cleaning, and loading large volumes of data from multiple sources into data warehouses or analytical platforms. You will work closely with data analysts, engineers, and business teams to ensure the availability, accuracy, and efficiency of enterprise data pipelines.

Key Responsibilities
• Design, develop, and implement ETL (Extract, Transform, Load) processes and data integration workflows.
• Work with diverse data sources, including SQL databases, APIs, flat files, and cloud storage systems.
• Develop and maintain data pipelines to ensure reliable and scalable data movement.
• Optimize ETL processes for performance, scalability, and error handling.
• Collaborate with data architects and business analysts to define data models and mappings.
• Perform data profiling, validation, and quality checks to ensure accuracy and consistency.
• Monitor ETL jobs, troubleshoot failures, and perform root cause analysis.
• Maintain documentation for ETL processes, data flows, and transformations.
• Support data warehouse design and ensure data is properly structured for reporting and analytics.
• Participate in code reviews, deployment planning, and continuous improvement initiatives.

Required Skills & Qualifications
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 3+ years of experience in ETL development, data integration, or data engineering.
• Strong proficiency in SQL and experience with ETL tools such as Informatica, Talend, SSIS, DataStage, Pentaho, or AWS Glue.
• Hands-on experience with data warehousing concepts and relational databases (e.g., Oracle, SQL Server, Snowflake, Redshift, BigQuery).
• Knowledge of scripting languages (Python, Shell, etc.) for automation and data manipulation.
• Understanding of data modeling, performance tuning, and error handling.
• Familiarity with cloud platforms (AWS, Azure, or GCP) and modern data pipeline tools (Airflow, dbt, etc.) is a plus.

Preferred Qualifications
• Experience with big data technologies (Spark, Hadoop, Kafka).
• Exposure to API integration and RESTful data sources.
• Knowledge of DevOps and CI/CD practices for data pipeline deployment.
• Strong analytical and problem-solving skills with attention to detail.

Why Join Us
• Opportunity to work on cutting-edge data technologies.
• Collaborative and innovative team environment.
• Competitive salary and benefits package.
• Career growth and learning opportunities in data engineering and analytics.
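For candidates new to the pattern, the Extract-Transform-Load workflow described in this posting can be sketched minimally in Python using only the standard library. This is an illustrative sketch, not part of the role's actual stack; all table, column, and sample data names below are hypothetical:

```python
import csv
import io
import sqlite3

# Hypothetical flat-file extract with one malformed row.
RAW_CSV = """id,name,amount
1, Alice ,100.5
2,Bob,not_a_number
3,Carol,42
"""

def extract(text):
    """Extract: read rows from a flat-file (CSV) source."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: trim whitespace, validate types, drop bad records."""
    clean = []
    for row in rows:
        try:
            clean.append((int(row["id"]), row["name"].strip(), float(row["amount"])))
        except ValueError:
            # A production pipeline would log or route rejects for root cause analysis.
            continue
    return clean

def load(rows, conn):
    """Load: write validated rows into a warehouse-style table (SQLite here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone())  # (2, 142.5)
```

Tools like Informatica, SSIS, or AWS Glue implement this same extract/transform/load shape at scale, adding the scheduling, monitoring, and error-handling duties listed under Key Responsibilities.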