

TEKFORTUNE INC
Azure Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer with a 12+ year background in data architecture, focusing on banking, healthcare, and retail. Contract length is unspecified, the day rate is $560, and the role requires expertise in ETL/ELT, Python, and Azure Data Factory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
560
-
🗓️ - Date
April 23, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Fremont, CA
-
🧠 - Skills detailed
#Scala #SQL (Structured Query Language) #Spark (Apache Spark) #ADF (Azure Data Factory) #PostgreSQL #PySpark #Airflow #AWS Glue #Apache Kafka #AWS (Amazon Web Services) #AWS Kinesis #Data Engineering #Synapse #S3 (Amazon Simple Storage Service) #Oracle #Physical Data Model #Data Processing #ADLS (Azure Data Lake Storage) #Azure Data Factory #Azure #Indexing #Triggers #ETL (Extract, Transform, Load) #Snowflake #Delta Lake #Apache Airflow #Redshift #Apache Spark #Python #Databricks #Kafka (Apache Kafka) #HDFS (Hadoop Distributed File System) #Monitoring #Data Architecture #Cloud
Role description
Note: Only US Citizens and Green Card holders may apply.
JD:
Accomplished Data Architect / Senior Data Engineer with 12+ years of experience designing and modernizing enterprise data platforms across the banking, healthcare, and retail domains.
Architected scalable ETL/ELT pipelines using Python, PySpark, Databricks, AWS Glue, and Azure Data Factory, supporting high-volume transactional and regulatory data processing.
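For candidates unfamiliar with the pattern, the extract-transform-load flow such pipelines implement can be sketched in plain Python. This is a minimal illustration; the stage names, sample records, and filtering rules are invented for the example, not taken from the posting.

```python
# Minimal ETL sketch: extract raw records, transform (cast + filter), load.
# All stage names and sample data are illustrative.

def extract():
    # A real pipeline would read from a source system (files, APIs, databases).
    return [
        {"txn_id": 1, "amount": "120.50", "status": "settled"},
        {"txn_id": 2, "amount": "75.00", "status": "pending"},
        {"txn_id": 3, "amount": "bad", "status": "settled"},
    ]

def transform(rows):
    # Cast amounts, drop unparseable records, keep settled transactions only.
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine malformed records
        if row["status"] == "settled":
            out.append({"txn_id": row["txn_id"], "amount": amount})
    return out

def load(rows, target):
    # Stand-in for a warehouse write (e.g. Snowflake or Synapse).
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
```

In a production setting each stage would be a separate, restartable task; the structure stays the same.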
Led enterprise-scale Data Architecture initiatives, defining logical and physical data models, governance standards, and cloud-native platform blueprints across AWS and Azure environments.
Designed and implemented Medallion (Bronze/Silver/Gold) Lakehouse architectures using Delta Lake, S3, ADLS Gen2, Snowflake, Redshift, and Synapse Analytics.
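The Medallion layering named above follows a simple idea: land data raw (Bronze), clean and type it (Silver), then aggregate it for consumers (Gold). A toy sketch in plain Python, with in-memory lists standing in for Delta/ADLS tables and entirely invented sample data:

```python
# Medallion layering sketch: Bronze = raw as-landed, Silver = cleaned/typed,
# Gold = business-level aggregate. Data and schema are illustrative.

bronze = [
    {"store": "A", "sales": "100"},
    {"store": "A", "sales": "50"},
    {"store": "B", "sales": None},   # bad record stays in Bronze, filtered later
    {"store": "B", "sales": "200"},
]

# Silver: enforce the schema and drop records that fail it.
silver = [
    {"store": r["store"], "sales": int(r["sales"])}
    for r in bronze
    if r["sales"] is not None
]

# Gold: aggregate for analytics consumers (total sales per store).
gold = {}
for r in silver:
    gold[r["store"]] = gold.get(r["store"], 0) + r["sales"]
```

Keeping the bad record in Bronze is the point of the pattern: the raw layer is never mutated, so Silver logic can be fixed and replayed.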
Engineered large-scale distributed processing workloads using Apache Spark, PySpark, Databricks, EMR, Hive, and HDFS, processing billions of records for enterprise analytics.
Orchestrated complex data workflows using Apache Airflow, Databricks Workflows, AWS Step Functions, and Azure Data Factory triggers, ensuring SLA-driven pipeline execution.
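The core service an orchestrator provides is dependency-ordered execution: a task runs only after its upstream tasks succeed. That can be sketched with the standard library's `graphlib` (Python 3.9+); the task names and graph here are illustrative, not an Airflow API:

```python
# Dependency-ordered task execution sketch, the pattern behind orchestrators
# like Airflow, Step Functions, and ADF triggers. Task names are illustrative.
from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

# static_order() yields tasks so every task appears after its upstreams.
order = list(TopologicalSorter(dag).static_order())
```

Real orchestrators add retries, SLAs, and scheduling on top, but the dependency graph is the same abstraction.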
Strong hands-on experience in advanced SQL, including complex joins, CTEs, window functions, stored procedures, indexing strategies, partitioning, and execution-plan optimization across Snowflake, PostgreSQL, and Oracle.
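Two of the SQL features listed, CTEs and window functions, compose naturally: a CTE scopes an intermediate result, and a window function ranks rows within a partition without collapsing them. A small self-contained example via `sqlite3` (window functions require SQLite 3.25+); the table and data are invented for illustration:

```python
# CTE + window function sketch using sqlite3. Schema and data are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 100), ("alice", 300), ("bob", 50), ("bob", 250)],
)

# The CTE keeps only large orders; ROW_NUMBER() then ranks each customer's
# remaining orders by amount, highest first.
rows = con.execute("""
    WITH big_orders AS (
        SELECT customer, amount FROM orders WHERE amount >= 100
    )
    SELECT customer, amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY amount DESC) AS rn
    FROM big_orders
    ORDER BY customer, rn
""").fetchall()
```

The same query shape runs on Snowflake, PostgreSQL, and Oracle, which all implement standard window-function syntax.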
Built real-time streaming architectures using Apache Kafka, AWS Kinesis, Azure Event Hub, and Service Bus, supporting fraud detection, claims monitoring, and operational telemetry.
