

Subject Matter Expert (SME) in Data Programming
Featured Role | Apply direct with Data Freelance Hub
Country: United States
Currency: $ USD
Day rate: $280
Date discovered: September 9, 2025
Project duration: Unknown
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Remote
Skills detailed: #SQL (Structured Query Language) #Azure #GIT #Snowflake #Data Management #BigQuery #Data Warehouse #ML (Machine Learning) #Spark (Apache Spark) #Data Storytelling #Python #Metadata #Databases #Data Governance #Anomaly Detection #Data Quality #BI (Business Intelligence) #Pandas #Data Cleaning #Retool #Programming #Tableau #Airflow #Data Manipulation #dbt (data build tool) #ETL (Extract, Transform, Load) #Storage #Ansible #Data Analysis #Docker #Storytelling #Stories #Data Science #Visualization #Microsoft Power BI #JSON (JavaScript Object Notation) #Data Engineering #Security #AI (Artificial Intelligence) #Complex Queries
Role description
Role Overview:
We are seeking a highly skilled Data Programming Subject Matter Expert (SME) to join our content development and learning design team. The SME will play a critical role in reviewing, curating, and designing high-quality learning content for data programming courses. This position requires deep expertise in programming for data analysis, data engineering, and data science applications, along with strong instructional design sensibilities to ensure content meets industry standards and learner expectations.
SME Scope of Work:
Analyze/create learning objectives for each course.
Review/create the course outline for each course.
Review video scripts (7-9 per course), confirm the technical accuracy of the content, and suggest edits and updates as required. Incorporate one round of internal and client feedback.
Provide relevant static or recorded demos/screencasts to be integrated into the videos. Incorporate one round of internal and client feedback.
For AI/software/tool-based courses, suggest relevant freeware, and write/review and test the code to verify it.
Review readings (4-6 per course, each up to 1,200 words), confirm the technical accuracy of the content, and suggest edits and updates as required. Incorporate one round of internal and client feedback.
Create hands-on activities (1-2 labs, or another client-preferred format) per course. Incorporate one round of internal and client feedback.
Review practice quizzes and graded assessments (5 files, each comprising 5-10 questions), suggest suitable edits, and confirm technical accuracy. Incorporate one round of internal and client feedback.
Record talking-head videos (onsite or virtually via Zoom) for each course; approximately 20-25 minutes of video recording per course. Incorporate one round of internal and client feedback.
Provide digital signatures to be included on the client platform.
For all reviews: validate content accuracy, provide recommendations/suggestions, write/rewrite to fill content gaps as necessary, write/test code and labs, and incorporate one round of internal feedback and two rounds of client feedback.
Be available for client and content discussions as and when required.
Courses covered:
Map Data Flows Fast
Optimize & Automate Pipelines
Unify Data Sources
Model SCD2 in dbt
Orchestrate with Airflow
Tune Spark Performance
Fix Distributed Skews
Enable Delta Time-Travel
Choose Storage Formats
Design Star Schemas
Normalize to 3NF
Master SQL Windows
Replicate Postgres Slots
Load Snowflake Incrementals
Resolve & Bisect in Git
Dockerize Spark & dbt
Install Airflow via Ansible
Ship Pipelines via CI/CD
Speed Up Queries
Assert Data Expectations
Trace Data Anomalies
Debug Python Pipelines
Build & Trace Pipelines
Optimize & Automate Pipelines
Master SQL Windows
Validate & Automate Transforms
Unify Data Sources
Tune Service Performance Insights
Master Table Transforms
Design Robust Data Models
Craft Efficient Schemas
Scale Data Warehouses
Build Lakehouse Architecture
Speed Up Queries
Test & Advance SQL
Automate DB Development
Safe Data Manipulation
Manage Warehouse Resources
Administer Databases
Administer Databases
Engineer Data Infrastructure
Ensure Data Validity
Improve Data Quality
Secure Data Assets
Drive Decisions with Data
Automate ML Pipelines Safely
Design Robust Data Infrastructure
Strengthen Data Governance Program
Secure Data with Zero Trust
Harden Application Security Practices
Standardize Enterprise Data Management
Build Multimodal ETL
Ensure Data Quality
Guard PHI & Message Securely
Protect Charts & Privacy
Grasp Coding Basics
SQL Flow Mastery
Wrangle JSON Quickly
Spark & Pandas Queries
Build Star Schemas Fast
Engage Metric Essentials
Data Decisions That Stick
Think Analytically Fast
Tell Stories with Data
Query SQL Like Pro
Manage Metadata Easily
Clean Data & Outliers
Enter Data Error-Free
Decide with Data
Craft Data Stories
Govern Data Correctly
Secure Analytics Assets
Map Audience Demographics
Target Winning Demographics
Storytell with Data
Requirements
Technical Expertise:
Advanced proficiency in SQL: window functions, complex queries, optimization (see the first sketch after this list).
Strong knowledge of data cleaning, anomaly detection, and outlier treatment (see the second sketch after this list).
Hands-on experience in wrangling JSON and semi-structured data (see the third sketch after this list).
Expertise in data storytelling, visualization, and narrative-driven analytics.
Ability to design analytical workflows and trace anomalies quickly.
Strong understanding of data quality, integrity, and governance.
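To make the SQL expectation concrete, here is a minimal sketch of the window-function patterns named above (ranking and running totals with PARTITION BY / ORDER BY), run from Python against an in-memory SQLite database. The sales table and its columns are hypothetical examples, not part of the posting.

```python
# Minimal window-function sketch using Python's built-in sqlite3.
# Assumes SQLite 3.25+ (window-function support); table/columns are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('east', '2025-01', 100), ('east', '2025-02', 140),
        ('west', '2025-01', 90),  ('west', '2025-02', 60);
""")

# Rank months within each region by revenue, and keep a per-region
# running total -- the PARTITION BY / ORDER BY patterns the role names.
query = """
    SELECT region, month, revenue,
           RANK() OVER (PARTITION BY region ORDER BY revenue DESC) AS rev_rank,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month;
"""
for row in conn.execute(query):
    print(row)
```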
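Likewise, a minimal sketch of cleaning and outlier treatment, assuming a simple IQR rule in pandas; the order_value column is a made-up example, and real course content might equally use z-scores, MAD, or model-based anomaly detection.

```python
# IQR-based outlier flagging sketch in pandas; data is invented.
import pandas as pd

df = pd.DataFrame({"order_value": [12.0, 14.5, 13.2, 11.8, 250.0, 12.9]})

q1, q3 = df["order_value"].quantile([0.25, 0.75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

# Flag rather than drop, so the cleaning step stays auditable.
df["is_outlier"] = ~df["order_value"].between(lower, upper)
print(df)
```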
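And a minimal sketch of wrangling semi-structured JSON with pandas.json_normalize; the record shape here is invented for illustration.

```python
# Flattening nested JSON into tabular frames with pandas; payload is invented.
import pandas as pd

records = [
    {"id": 1, "user": {"name": "Ana", "plan": "pro"},
     "events": [{"type": "login"}, {"type": "export"}]},
    {"id": 2, "user": {"name": "Raj", "plan": "free"}, "events": []},
]

# Nested objects become dotted columns, one row per record.
flat = pd.json_normalize(records, sep=".")
print(flat[["id", "user.name", "user.plan"]])

# The nested event list explodes into one row per event, keyed by id.
events = pd.json_normalize(records, record_path="events", meta=["id"])
print(events)
```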
Content Development Skills:
Prior experience in curriculum design, training, or instructional content creation.
Ability to translate complex technical concepts into clear, learner-friendly explanations.
Experience in creating labs, case studies, and practical assessments.
Skilled in delivering educational demos, screencasts, and storytelling with data.
General Competencies:
Excellent problem-solving and analytical skills.
Strong communication and technical writing ability.
Comfortable collaborating with cross-functional teams.
Adaptable to iterative feedback cycles with clients and instructional designers.
Preferred Certifications:
Microsoft Certified: Data Analyst Associate or Azure Data Engineer Associate.
Google Data Analytics Professional Certificate.
Snowflake or BigQuery Certification (preferred for data querying at scale).
Certifications in SQL (advanced level) or data storytelling/visualization tools (Tableau, Power BI).
Timelines and Payout:
Expected availability: 25 hours per course (at $35.00 per hour, roughly $875 per course).
Expected project start date: Immediate.
Job Types: Part-time, Contract, Temporary
Pay: $35.00 per hour
Application Question(s):
Are you camera-friendly?
Do you have online or virtual teaching experience in the field?
Work Location: Remote