ETL Designer

⭐ - Featured Role | Apply directly with Data Freelance Hub
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
-
πŸ—“οΈ - Date discovered
September 16, 2025
πŸ•’ - Project duration
Unknown
🏝️ - Location type
Unknown
πŸ“„ - Contract type
Unknown
πŸ”’ - Security clearance
Unknown
πŸ“ - Location detailed
Plano, TX
🧠 - Skills detailed
#Batch #Dataflow #Deployment #GIT #Security #SQL (Structured Query Language) #API (Application Programming Interface) #Data Ingestion #Data Architecture #Python #Agile #BigQuery #Cloud #Data Integrity #Version Control #Data Lake #GCP (Google Cloud Platform) #Apache Airflow #Airflow #ETL (Extract, Transform, Load) #Computer Science #Apache Beam #Monitoring #Data Processing #Scala #Data Engineering #Storage
Role description
Job Description:
We are seeking a highly experienced ETL Designer to join our team. The ideal candidate will have 12 to 15 years of experience designing and implementing data solutions, with a strong focus on cloud data warehousing and real-time data processing. This role requires advanced knowledge of a range of data technologies and the ability to translate business requirements into scalable, efficient data architectures.

Key Responsibilities:
• Design, develop, and maintain robust ETL pipelines on GCP using tools such as Dataflow, BigQuery, Cloud Composer (Apache Airflow), and Pub/Sub (see the pipeline sketch after this description)
• Optimize data ingestion, transformation, and loading processes for high-volume, real-time, and batch data processing
• Ensure data integrity, quality, and security throughout the ETL lifecycle
• Collaborate with data engineers, analysts, and business teams to understand data needs and translate them into efficient ETL solutions
• Implement best practices for performance tuning, error handling, and monitoring of ETL jobs
• Work with structured and unstructured data sources, integrating API-based, event-driven, and batch processing workflows
• Automate and document ETL processes to enhance maintainability and scalability
• Use Git for version control and deployment of ETL workflows, ensuring smooth CI/CD integration

Key Skills:
• 12+ years of experience in ETL design and development
• Proficiency in SQL, Python, and Apache Beam for data processing on Dataflow and Dataproc
• Strong experience with advanced DML operations and query optimization
• Strong expertise in GCP services (BigQuery, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer)
• Experience with data modeling, warehousing concepts, and data lakes
• Hands-on experience with orchestration tools such as Apache Airflow or Cloud Composer, including DAG design, error handling, and monitoring (see the DAG sketch after this description)
• Knowledge of performance tuning, data partitioning, and optimization techniques
• Experience with real-time streaming ETL and event-driven architectures is a plus
• Familiarity with Git for version control and deployment of ETL pipelines
• Strong problem-solving skills and the ability to work in an agile environment

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
• Proven experience in ETL design and implementation for data warehouse (DWH) systems
• Strong problem-solving skills and the ability to work in a fast-paced environment
• Excellent communication and collaboration skills
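For illustration, here is a minimal sketch of the kind of streaming pipeline the responsibilities describe: an Apache Beam job that reads JSON events from Pub/Sub and loads them into BigQuery. The project, topic, table, and schema names are hypothetical placeholders, not part of the posting.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    # Decode a Pub/Sub message into a BigQuery-ready row.
    record = json.loads(message.decode("utf-8"))
    return {"event_id": record["id"], "payload": record.get("payload", "")}


# Streaming mode is required for the Pub/Sub source; on GCP this would
# run on Dataflow via --runner=DataflowRunner.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            topic="projects/my-project/topics/events"  # hypothetical topic
        )
        | "ParseJson" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",  # hypothetical table
            schema="event_id:STRING,payload:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```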
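Likewise, a minimal sketch of the Airflow/Cloud Composer DAG pattern named under Key Skills, showing the retry and failure-callback error handling the posting mentions. The DAG id, schedule, and notify_on_failure helper are hypothetical, and the Airflow 2.4+ API is assumed.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder task body; a real job would trigger a Dataflow or BigQuery step.
    print("running ETL step")


def notify_on_failure(context):
    # Hypothetical alert hook; in practice, wire this to email/Slack/monitoring.
    print(f"task failed: {context['task_instance'].task_id}")


with DAG(
    dag_id="daily_etl",                      # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                       # Airflow 2.4+ keyword
    catchup=False,
    default_args={
        "retries": 2,                        # retry transient failures
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,
    },
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```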