eAspire Technolabs Inc.

ETL Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an ETL Engineer in Windsor, CT, with a 12+ year experience requirement. The contract offers a competitive pay rate and demands expertise in ETL tools, SQL, data modeling, and cloud platforms. A Bachelor's degree is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 21, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Windsor, CT
-
🧠 - Skills detailed
#Python #Computer Science #Data Lake #"ETL (Extract, Transform, Load)" #Shell Scripting #Snowflake #DataStage #ADF (Azure Data Factory) #Airflow #Scripting #AWS (Amazon Web Services) #Data Warehouse #AWS Glue #Data Quality #Data Processing #Documentation #SSIS (SQL Server Integration Services) #Data Architecture #Cloud #Talend #Azure #PostgreSQL #Oracle #Data Integration #Migration #Data Migration #MySQL #Dataflow #SQL Server #SQL (Structured Query Language) #Scala #Automation #Databases #Informatica #Data Modeling #Azure Data Factory #Data Pipeline
Role description
Role: ETL Engineer
Location: Windsor, CT (Onsite)
Visa: H1B
Experience: 12+ Years

Job Summary:
We are seeking a skilled ETL Engineer to design, develop, and maintain scalable data integration solutions. The ideal candidate will be responsible for extracting data from various sources, transforming it into usable formats, and loading it into data warehouses or analytical systems. This role requires strong experience with ETL tools, SQL, data modeling, and performance optimization.

Key Responsibilities:
• Design, develop, and maintain ETL workflows and data pipelines for large-scale data integration.
• Extract, transform, and load data from various structured and unstructured sources into target systems (e.g., Data Warehouse, Data Lake).
• Work closely with data architects, analysts, and business stakeholders to understand data requirements and deliver high-quality data solutions.
• Optimize ETL processes for performance, scalability, and reliability.
• Ensure data quality, accuracy, and integrity across all stages of data processing.
• Monitor ETL jobs, troubleshoot issues, and perform root cause analysis for failures or delays.
• Implement automation and scheduling for ETL processes using tools such as Airflow, Control-M, or similar.
• Support data migration, cleansing, and validation activities as part of ongoing data initiatives.
• Create and maintain technical documentation for ETL processes and data flows.

Required Skills and Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related field.
• 12+ years of experience in ETL development and data integration.
• Hands-on experience with ETL tools such as Informatica, Talend, DataStage, SSIS, Pentaho, or similar.
• Strong proficiency in SQL and experience working with relational databases (e.g., Oracle, SQL Server, MySQL, PostgreSQL).
• Experience with data warehousing concepts, data modeling, and dimensional modeling (Star/Snowflake schema).
• Knowledge of Python, Shell scripting, or other scripting languages for automation.
• Familiarity with cloud-based ETL and data platforms (e.g., AWS Glue, Azure Data Factory, Google Cloud Dataflow, Snowflake).
• Strong analytical, problem-solving, and communication skills.