Tekleads LLC

Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 7+ years of experience, offering a 40-hour-per-week contract at $40.00 - $50.00 per hour, on-site. Key skills include SQL, Python, ETL/ELT, and cloud infrastructure (AWS, Azure, GCP).
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
๐Ÿ—“๏ธ - Date
November 9, 2025
🕒 - Duration
Unknown
-
๐Ÿ๏ธ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
California City, CA 93505
-
🧠 - Skills detailed
#Data Processing #Luigi #Data Modeling #Data Governance #Infrastructure as Code (IaC) #GCP (Google Cloud Platform) #Data Quality #Python #Kafka (Apache Kafka) #Scala #Data Architecture #Programming #Azure #Databricks #ML (Machine Learning) #Redshift #Schema Design #Terraform #Cloud #Data Science #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Snowflake #BigQuery #Airflow #Data Warehouse #Compliance #AWS (Amazon Web Services) #Data Engineering #Data Pipeline #Documentation #Java #Data Lake #dbt (data build tool)
Role description
About the Role
We're seeking an experienced Data Engineer (7+ years) to lead the design and development of scalable, high-performance data systems. In this role, you'll architect end-to-end data solutions, mentor junior engineers, and ensure that our data infrastructure supports strategic business and analytics goals.

Key Responsibilities
• Architect and implement robust ETL/ELT pipelines for large-scale data processing.
• Design and optimize data lakes, data warehouses, and data models to support analytics, reporting, and ML workloads.
• Ensure data quality, integrity, and governance across all data platforms.
• Collaborate with cross-functional teams, including analytics, engineering, and product, to define data requirements and deliver insights.
• Drive best practices in data engineering, CI/CD, testing, and documentation.
• Evaluate and integrate new data technologies to improve scalability and performance.
• Mentor and guide junior data engineers in technical design and development.

Qualifications
• 7+ years of experience in data engineering, data architecture, or related fields.
• Strong expertise in SQL and at least one major programming language (Python, Java, or Scala).
• Proven experience building data pipelines using tools such as Airflow, dbt, Luigi, or Prefect.
• Hands-on experience with modern data warehouses (Snowflake, BigQuery, Redshift, Databricks, etc.).
• Strong understanding of cloud infrastructure (AWS, Azure, or GCP).
• Knowledge of data modeling, schema design, and performance tuning.
• Experience with streaming and real-time data (Kafka, Kinesis, or Pub/Sub) is a plus.
• Excellent problem-solving, communication, and collaboration skills.

Nice to Have
• Experience supporting data science or ML pipelines.
• Familiarity with infrastructure as code (Terraform, CloudFormation).
• Understanding of data governance, lineage, and compliance frameworks.
• Contributions to open-source or community data projects.

Job Type: Contract
Pay: $40.00 - $50.00 per hour
Expected hours: 40 per week
Work Location: In person