Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Integration Engineer with 3+ years of Astronomer/Airflow experience. It is a hybrid position in Metro Park, NJ, offering a competitive pay rate. Key skills include Python, PySpark, SQL, and cloud data warehousing experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 23, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New Jersey, United States
🧠 - Skills detailed
#Data Engineering #Data Modeling #Airflow #Data Bricks #Data Ingestion #ETL (Extract, Transform, Load) #PySpark #Spark SQL #Data Integration #SQL Queries #Databricks #Python #Synapse #Consulting #EDW (Enterprise Data Warehouse) #Big Data #Data Architecture #Migration #Scala #Redshift #Programming #Data Warehouse #Data Management #Data Quality #Spark (Apache Spark) #SQL (Structured Query Language) #Data Pipeline #Datasets #Security #Business Analysis #Delta Lake #Snowflake #Code Reviews #Cloud #SQL Server
Role description

Title - Data Integration Engineer - specifically with Astronomer Airflow experience

Work - Hybrid

Location - Metro Park, NJ

Company Description

NInfo Systems Inc. is a certified minority-owned national IT recruiting and solutions provider with two decades of experience. It works with Fortune 500 corporations, mid-sized companies, boutique consulting firms, startups, SME-level organizations, Federal/State agencies, and tier-one vendors.

We are looking for an experienced Integration Engineer with a background in Airflow, Python, PySpark, SQL, and data warehousing for enterprise-level systems.

The position calls for someone who is comfortable working directly with business users and who brings business-analyst expertise.

Skills -

   • 3+ years Astronomer/Airflow DAG development

   • 5+ years Python coding experience.

   • 5+ years of SQL Server-based development with large datasets

   • 5+ years of experience developing and deploying ETL pipelines using Databricks PySpark.

   • Experience with a cloud data warehouse such as Synapse, BigQuery, Redshift, or Snowflake.

   • Experience in data warehousing: OLTP, dimensions, facts, and data modeling.

   • Previous experience leading an enterprise-wide Cloud Data Platform migration with strong architectural and design skills.

   • Experience with Cloud based data architectures, messaging, and analytics.

   • Cloud certification(s).
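As an informal illustration of the dimensional-modeling and SQL skills listed above, here is a minimal star-schema sketch using Python's built-in sqlite3 module. The table and column names are hypothetical examples, not taken from the posting; in this role the same pattern would apply at enterprise scale in SQL Server or a cloud warehouse.

```python
import sqlite3

# Hypothetical star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (
        product_key  INTEGER PRIMARY KEY,
        product_name TEXT
    );
    CREATE TABLE fact_sales (
        sale_id     INTEGER PRIMARY KEY,
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "widget"), (2, "gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5)])

# Typical warehouse query: aggregate the fact table by a dimension attribute.
rows = conn.execute("""
    SELECT p.product_name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_product p USING (product_key)
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
print(rows)  # [('gadget', 7.5), ('widget', 25.0)]
```

The split between narrow fact tables and descriptive dimension tables is what the "dimensions, facts, and data modeling" requirement refers to.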

Responsibilities:

   • Build and optimize data pipelines for efficient data ingestion, transformation and loading from various sources while ensuring data quality and integrity.

   • Design, develop, and deploy Spark programs in the Databricks environment to process and analyze large volumes of data.

   • Experience with Delta Lake, data warehousing, data integration, cloud, design, and data modeling.

   • Proficient in developing programs in Python and SQL.

   • Experience with data warehouse dimensional data modeling.

   • Working with event-based/streaming technologies to ingest and process data.

   • Working with structured, semi-structured, and unstructured data.

   • Optimize Databricks jobs for performance and scalability to handle big data workloads.

   • Monitor and troubleshoot Databricks jobs, identifying and resolving issues or bottlenecks.

   • Implement best practices for data management, security, and governance within the Databricks environment.

   • Experience designing and developing Enterprise Data Warehouse solutions.

   • Proficient in writing SQL queries and program logic, including stored procedures, and in reverse-engineering existing processes.

   • Perform code reviews to ensure fit to requirements, optimal execution patterns, and adherence to established standards.
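To make the pipeline responsibilities above concrete, here is a minimal extract-transform-load sketch with a simple data-quality gate, written in plain Python so it runs anywhere. In practice this logic would live in an Airflow DAG running Databricks PySpark jobs; all function names, field names, and sample records here are hypothetical.

```python
def extract():
    # Placeholder for ingestion from a source system (API, file, database).
    return [{"id": 1, "amount": "10.5"},
            {"id": 2, "amount": "bad"},
            {"id": 3, "amount": "4.0"}]

def transform(rows):
    # Cast types and quarantine rows that fail a basic quality check,
    # rather than letting bad records reach the warehouse.
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
        except (ValueError, TypeError):
            rejected.append(row)
    return clean, rejected

def load(rows, target):
    # Placeholder load step: append to an in-memory target table.
    target.extend(rows)
    return len(rows)

warehouse = []
clean, rejected = transform(extract())
loaded = load(clean, warehouse)
print(loaded, len(rejected))  # 2 1
```

Separating the quality gate from the load step, and keeping rejected rows for inspection, is the kind of "data quality and integrity" safeguard the responsibilities call for.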

Additional Information:

Join a diverse team at NInfo Systems, where innovation meets expertise in IT solutions. Bring your talent to diverse projects in a dynamic, growth-oriented environment, and thrive in an inclusive culture built on two decades of industry expertise. Let's innovate together.