

Sr. Data Engineer With Strong GCP Background
Role: Sr. Data Engineer With strong GCP Background
Location: Remote
Mode: Contract
Primary Skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka, and BigQuery
Good to Have: GFO, Google Analytics
GCP data engineering and e-commerce experience is a must
Job Description:
• 13 years' experience as a data warehouse engineer/architect designing and deploying data systems in a startup environment
• Mastery of database and data warehouse methodologies and techniques, from transactional databases to dimensional data modeling to wide, denormalized data marts
• Deep understanding of SQL-based Big Data systems and experience with modern ETL tools
• Expertise in designing data warehouses using Google BigQuery
• Experience developing data pipelines in Python
• A firm believer in data-driven decision-making, with extensive experience developing highly and elastically scalable, 24x7x365 high-availability digital marketing or e-commerce systems
• Hands-on experience with data computing, storage, and security components, and with Big Data technologies provided by cloud platforms (preferably GCP)
• Hands-on experience with real-time stream processing as well as high-volume batch processing; skilled in advanced SQL, GCP BigQuery, Apache Kafka, data lakes, etc.
• Hands-on experience with Big Data technologies (Hadoop, Hive, and Spark) and with an enterprise-scale Customer Data Platform (CDP)
• Experience in at least one programming language (Python strongly preferred), cloud computing platforms (e.g., GCP), big data tools such as Spark/PySpark, columnar datastores (BigQuery preferred), DevOps processes/tooling (CI/CD, GitHub Actions), infrastructure-as-code frameworks (Terraform), BI tools (e.g., DOMO, Tableau, Looker), and pipeline orchestration (e.g., Airflow)
• Fluency in data science/machine learning basics (model types, data prep, training process, etc.)
• Experience using version control systems (Git is strongly preferred)
• Experience with data governance and data security
• Strong analytical, problem-solving, and interpersonal skills, a hunger to learn, and the ability to operate in a self-guided manner in a fast-paced, rapidly changing environment
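To give candidates a concrete flavor of the pipeline and modeling skills listed above, here is a minimal, hypothetical sketch of one batch ETL step in plain Python: joining transactional rows with a dimension table into wide, denormalized data-mart records. All table and field names are invented for illustration; in this role the equivalent work would typically be done with Apache Beam/Dataflow and loaded into BigQuery.

```python
# Hypothetical illustration only: a tiny denormalization step of the kind
# used when building wide data-mart tables from transactional sources.
# Field names (order_id, customer_id, segment, etc.) are made up.

def denormalize_orders(orders, customers):
    """Join transactional order rows with customer dimension rows
    into wide, denormalized data-mart records."""
    by_id = {c["customer_id"]: c for c in customers}
    mart = []
    for o in orders:
        c = by_id.get(o["customer_id"], {})
        mart.append({
            "order_id": o["order_id"],
            "amount": o["amount"],
            # Dimension attributes are copied onto each fact row,
            # trading storage for simpler, faster analytical queries.
            "customer_name": c.get("name"),
            "customer_segment": c.get("segment"),
        })
    return mart

orders = [{"order_id": 1, "customer_id": "a", "amount": 42.0}]
customers = [{"customer_id": "a", "name": "Acme", "segment": "retail"}]
print(denormalize_orders(orders, customers))
```

At production scale this join would run as a distributed operation (e.g., a Beam `CoGroupByKey` on Dataflow or a SQL join in BigQuery) rather than an in-memory dictionary lookup.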