Intellectt Inc
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a 6-month remote contract for a Data Engineer, offering a pay rate of "XX" per hour. Key skills include GCP, Azure, Python, SQL, and experience with AI/ML integration. Certifications in GCP or Azure are preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 5, 2025
🕒 - Duration
6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Data Lake #ML (Machine Learning) #Data Governance #Batch #IoT (Internet of Things) #Automation #Databricks #AI (Artificial Intelligence) #Scala #Langchain #SQL (Structured Query Language) #DevOps #Kafka (Apache Kafka) #Data Architecture #Microsoft Azure #GCP (Google Cloud Platform) #Airflow #ETL (Extract, Transform, Load) #Dataflow #BigQuery #Python #Cloud #Data Science #Version Control #GIT #Synapse #Spark (Apache Spark) #Data Pipeline #Security #Deployment #Infrastructure as Code (IaC) #Docker #Azure #Data Engineering #Kubernetes #DataOps #Databases #Data Processing
Role description
We are seeking an experienced and forward-thinking Data Engineer with hands-on expertise across Google Cloud Platform (GCP), Microsoft Azure, and AI-driven data solutions. The ideal candidate will design, build, and optimize scalable data architectures and pipelines that leverage cutting-edge technologies in cloud computing, machine learning, and automation.
You will work closely with cross-functional teams (Data Science, AI/ML, and Cloud Infrastructure) to shape the next generation of data systems that enable intelligent, real-time, and predictive decision-making.
Key Responsibilities:
• Design, develop, and manage data pipelines, ETL/ELT processes, and data models across GCP and Azure environments (a minimal pipeline sketch follows this list).
• Build real-time and batch data processing systems using modern frameworks and cloud-native services.
• Collaborate with AI/ML teams to operationalize models and enable automated data-driven insights.
• Implement data governance, quality, and security best practices across multiple cloud platforms.
• Integrate and optimize data from diverse sources including APIs, IoT, streaming data, and unstructured formats.
• Explore and adopt emerging technologies such as GenAI, MLOps, DataOps, Edge Computing, and LLM-based data engineering tools.
• Leverage infrastructure as code (IaC) and CI/CD pipelines for scalable and automated deployments.
• Continuously evaluate new technologies to enhance data platform capabilities and efficiency.
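To ground the pipeline responsibilities above, here is a minimal, hypothetical sketch of the kind of batch Apache Beam job that runs on GCP Dataflow. The bucket paths, event fields, and the parse_event helper are illustrative placeholders, not details of this role's actual stack.

```python
# Hedged sketch: a small Apache Beam batch pipeline. All paths and field
# names below are placeholders invented for illustration.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line: str) -> tuple:
    """Parse one JSON event and key it by user for aggregation."""
    event = json.loads(line)
    return event["user_id"], float(event["amount"])


def run() -> None:
    # Runs on the local DirectRunner by default; the same code targets
    # Dataflow with --runner=DataflowRunner plus project/region/temp_location.
    with beam.Pipeline(options=PipelineOptions()) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Parse" >> beam.Map(parse_event)
            | "SumPerUser" >> beam.CombinePerKey(sum)
            | "Format" >> beam.MapTuple(lambda user, total: f"{user},{total}")
            | "Write" >> beam.io.WriteToText("gs://example-bucket/output/user_totals")
        )


if __name__ == "__main__":
    run()
```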
Required Skills & Experience:
• Strong proficiency in GCP (BigQuery, Dataflow, Pub/Sub, AI Platform) and Azure (Synapse, Data Factory, Databricks, Azure ML).
• Hands-on experience with Python, SQL, and Spark for large-scale data processing (see the PySpark sketch after this list).
• Solid understanding of data warehousing, data lakes, and modern lakehouse architectures.
• Experience with AI/ML integration, MLOps, and working with AI APIs or LLMs.
• Proficiency with containerization and orchestration tools (Docker, Kubernetes, Airflow).
• Familiarity with DevOps/DataOps principles and version control (Git, CI/CD pipelines).
• Experience working in multi-cloud or hybrid environments.
• Strong problem-solving skills and ability to work with emerging, fast-evolving technologies.
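As one way to picture the Python + SQL + Spark requirement, the hedged sketch below computes the same daily aggregate twice, once through the DataFrame API and once as Spark SQL; the lake paths and column names are invented for the example.

```python
# Illustrative PySpark sketch; paths, table, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-revenue").getOrCreate()

# Load raw orders from a (placeholder) data-lake path.
orders = spark.read.parquet("s3://example-lake/raw/orders/")

# Daily revenue via the DataFrame API...
daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# ...and the same aggregate in plain SQL, which Spark plans identically.
orders.createOrReplaceTempView("orders")
daily_sql = spark.sql("""
    SELECT to_date(created_at) AS order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY to_date(created_at)
""")

daily.write.mode("overwrite").parquet("s3://example-lake/curated/daily_revenue/")
```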
Preferred / Nice-to-Have Skills:
• Experience with GenAI platforms (Vertex AI, Azure OpenAI, LangChain, Vector Databases).
• Knowledge of real-time analytics tools (Kafka, Flink, Kinesis); a small consumer sketch follows this list.
• Exposure to Edge AI, Serverless Computing, and Quantum-inspired data systems.
• Certifications in GCP or Azure (e.g., Professional Data Engineer, Azure Data Engineer Associate).
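And for the real-time tools in the preferred list, a minimal consumer sketch using the kafka-python client; the topic name, broker address, and alert rule are assumptions made up for this example.

```python
# Hedged sketch of a Kafka consumer loop; topic, brokers, and the alert
# threshold are placeholders, not details from this posting.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "sensor-readings",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],       # placeholder broker
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
)

# In practice this loop would feed a feature store, dashboard, or sink;
# here it just flags readings over an arbitrary threshold.
for message in consumer:
    reading = message.value
    if reading.get("temperature", 0) > 90:
        print(f"High temperature on device {reading.get('device_id')}")
```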