PALNAR

Only W2 | Sr. Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer on a contract basis, offering a competitive pay rate. Candidates should have 4–8 years of data integration experience; proficiency in ETL tools, SQL, and Python; and familiarity with cloud platforms like GCP.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 14, 2026
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Catalog #Kafka (Apache Kafka) #BigQuery #Automation #Computer Science #Data Pipeline #Batch #Dataflow #ETL (Extract, Transform, Load) #Redshift #Talend #Boomi #Data Lineage #Snowflake #XML (eXtensible Markup Language) #Scala #GCP (Google Cloud Platform) #REST (Representational State Transfer) #NiFi (Apache NiFi) #Business Analysis #SQL (Structured Query Language) #SSIS (SQL Server Integration Services) #Databases #BI (Business Intelligence) #Data Management #Metadata #Documentation #JSON (JavaScript Object Notation) #Compliance #Data Manipulation #Data Science #Data Ingestion #GDPR (General Data Protection Regulation) #Spark (Apache Spark) #Scripting #SaaS (Software as a Service) #DevOps #Informatica #Data Engineering #Debugging #Security #Python #Data Integration #Cloud #Monitoring #Data Security #Data Quality #Apache NiFi
Role description
Job Summary: We are looking for a detail-oriented and technically skilled Data Integration Engineer to design, develop, and manage robust data integration solutions. The ideal candidate will have hands-on experience integrating data across disparate systems, building ETL/ELT pipelines, and ensuring the accuracy, quality, and consistency of enterprise data. You will play a key role in enabling seamless data flow between systems to support business intelligence, analytics, and operational needs.

Key Responsibilities:

Integration Design & Development
• Design and implement data integration workflows between internal and external systems, including APIs, databases, SaaS applications, and cloud platforms.
• Develop and maintain scalable ETL/ELT pipelines for structured and unstructured data using tools like Informatica, Talend, SSIS, Apache NiFi, or custom Python/SQL scripts (see the batch pipeline sketch after this description).
• Build and manage real-time and batch data pipelines leveraging technologies like Kafka and Spark Streaming (see the streaming consumer sketch after this description).

Data Quality & Governance
• Ensure high data quality, accuracy, and consistency during data ingestion and transformation.
• Implement data validation, cleansing, deduplication, and monitoring mechanisms.
• Contribute to metadata management, data lineage, and data catalog initiatives.

Collaboration & Troubleshooting
• Collaborate with data engineers, business analysts, data scientists, and application teams to understand integration needs and deliver effective solutions.
• Troubleshoot and resolve data integration and pipeline issues in a timely manner.
• Provide documentation and knowledge transfer for developed solutions.

Platform & Infrastructure Support
• Support data movement across hybrid environments (on-prem, cloud, and third-party systems).
• Work with DevOps or platform teams to ensure the scalability, security, and performance of data integration infrastructure.

Qualifications:
• Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
• 4–8 years of experience in data integration or data engineering, with strong ETL and SQL skills.
• Strong experience with integration tools such as Informatica, Talend, MuleSoft, SSIS, or Boomi.
• Proficient in SQL, Python, and scripting for data manipulation and automation.
• Experience with cloud data platforms (GCP) and services such as Google Cloud Dataflow.
• Familiarity with REST/SOAP APIs, JSON, XML, and flat-file integrations.

Preferred Skills:
• Experience with message queues or data streaming platforms (Kafka, RabbitMQ, Kinesis).
• Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, BigQuery).
• Knowledge of data security, privacy, and compliance best practices (HIPAA, GDPR, etc.).
• Prior experience in industries such as healthcare, fintech, or e-commerce is a plus.

Soft Skills:
• Strong problem-solving and debugging skills.
• Excellent communication and collaboration abilities.
• Ability to manage multiple priorities and deliver in a fast-paced environment.
• Attention to detail and a commitment to delivering high-quality work.
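To give a concrete feel for the batch side of the role, here is a minimal Python sketch of an ETL step with validation and deduplication, the validate-cleanse-load shape named in the responsibilities. It is purely illustrative: the record schema, the customers table, and the use of SQLite as a stand-in target are assumptions, not details from this posting.

```python
import json
import sqlite3

# Hypothetical upstream payload; the schema is assumed for illustration only.
RAW_RECORDS = """
[
  {"id": "c-1", "email": "a@example.com", "amount": "19.99"},
  {"id": "c-1", "email": "a@example.com", "amount": "19.99"},
  {"id": "c-2", "email": "", "amount": "5.00"}
]
"""

def validate(record):
    # Reject records missing required fields; coerce amount to a float.
    if not record.get("id") or not record.get("email"):
        return None
    try:
        record["amount"] = float(record["amount"])
    except (TypeError, ValueError):
        return None
    return record

def run_pipeline(raw_json, conn):
    seen, rows = set(), []
    for rec in json.loads(raw_json):
        rec = validate(rec)
        if rec is None or rec["id"] in seen:  # drop invalid rows and duplicates
            continue
        seen.add(rec["id"])
        rows.append((rec["id"], rec["email"], rec["amount"]))
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id TEXT PRIMARY KEY, email TEXT, amount REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(f"loaded {run_pipeline(RAW_RECORDS, conn)} clean rows")  # expects 1
```

In practice this logic would usually live inside one of the tools the posting names (Dataflow, Talend, NiFi) rather than a standalone script, but the validate-deduplicate-load sequence is the same.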
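For the streaming side, a minimal consumer loop using the kafka-python client might look like the following. The topic name, broker address, and order schema are hypothetical, and a real pipeline would hand validated messages to a warehouse loader instead of printing them; this only sketches the consume-validate-forward pattern.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; adjust to the actual environment.
consumer = KafkaConsumer(
    "orders.raw",
    bootstrap_servers="localhost:9092",
    group_id="integration-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    order = message.value
    # Minimal validation before handing off to the downstream loader.
    if order.get("order_id"):
        print(f"would load order {order['order_id']} from partition {message.partition}")
```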