

SME - Big Data & Analytics
Featured Role | Apply direct with Data Freelance Hub
This role is for a Subject Matter Expert in Big Data & Analytics, offering a 6-month remote contract at a competitive pay rate. Requires 8+ years of experience, proficiency in Hadoop, Apache Spark, SQL, and cloud platforms, plus teaching and video editing skills.
Country: United States
Currency: Unknown
Day rate: -
Date discovered: June 14, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Remote
Skills detailed:
#SQL (Structured Query Language) #Spark (Apache Spark) #Data Pipeline #Data Architecture #BI (Business Intelligence) #PyTorch #Compliance #Python #Talend #TensorFlow #HBase #Tableau #Data Modeling #Retool #ML (Machine Learning) #Looker #Data Engineering #Airflow #Apache Spark #Cloud #Synapse #GCP (Google Cloud Platform) #NiFi (Apache NiFi) #Predictive Modeling #Computer Science #Azure #Storage #Scala #AI (Artificial Intelligence) #Kafka (Apache Kafka) #HDFS (Hadoop Distributed File System) #Apache NiFi #BigQuery #AWS (Amazon Web Services) #Data Integration #Microsoft Power BI #Hadoop #Data Lake #Java #Pig #Redshift #Data Science #Big Data
Role description
Background:
We are seeking a Subject Matter Expert (SME) in Big Data & Analytics to lead the design, development, and optimization of data-driven solutions. The ideal candidate will have deep experience in big data technologies, data pipelines, and advanced analytics to drive business intelligence, predictive modeling, and strategic decision-making.
Scope of Work:
• Create a course structure for a certificate program with 4-5 courses (the number of courses to be based on scoping). Each course is likely to have 4-5 modules and a total of 25 lessons, so a 4-course program could have up to 100 lessons.
• Work closely with the client in a rigorous scoping process to create a Job Task Analysis document and a content structure for each program and course.
• Create program-level learning objectives for professional certificate courses. The number of objectives will depend on the level (beginner, intermediate, or advanced) and the type of certification course.
• Create course-level learning objectives aligned with the overall certification goal.
• Create module-level learning objectives based on skill development relevant to the TG's career track.
• Review/create course outlines for each of the courses.
• Review video scripts and confirm the technical accuracy of the content; suggest edits and updates as required. Rewrite content and code as needed. Incorporate one round of internal and client feedback.
• Record talking-head videos (onsite or virtually via Zoom) for each course. Incorporate one round of internal and client feedback.
• Provide relevant recorded demos/screencasts to be integrated into the videos. Check the code and technical accuracy before providing the demos for integration. Incorporate one round of internal and client feedback.
• For AI/software/tool-based courses, suggest relevant freeware. Write/review and test the code to verify it works.
• Create/review 2-3 readings per lesson (Why and What readings, 1,500 words maximum per reading). How readings should include detailed instructions and screenshots, with short code-block practice that learners can complete in their local environment.
• Create one Coach item per lesson to review/reflect on key ideas.
• Create/review an ungraded lab per lesson: an in-depth activity to apply skills in the learner's local environment.
• Create/review practice quizzes for each lesson, suggest suitable edits, and confirm technical accuracy. Incorporate one round of internal and client feedback.
• Create module-level and course-level graded assignments that meet ACE recommendation requirements, with two additional variations of each item in an assessment bank for each course.
• Create hands-on activities (3-4 labs or another client-preferred format) per course. Incorporate one round of internal and client feedback.
• Create a minimum of one 3-5 minute career-resources video per course that showcases career path planning.
• For all reviews: validate content accuracy and provide recommendations/suggestions, write/rewrite to fill content gaps as necessary, write/test code and labs, and incorporate one round of internal feedback and two rounds of client feedback.
• Be available for client and content discussions as and when required.
Requirements:
• 8+ years of experience in data engineering, big data architecture, or analytics roles.
• Strong expertise in the Hadoop ecosystem (HDFS, Hive, Pig, HBase) and Apache Spark.
• Proficiency in data integration tools and frameworks such as Apache NiFi, Airflow, or Talend.
• Experience with cloud platforms (AWS Redshift, Azure Synapse, Google BigQuery) and data lake/storage solutions.
• Hands-on experience with SQL, Python, Scala, or Java.
• Solid understanding of data warehousing, data modeling, and real-time data streaming (e.g., Kafka, Flink).
• Familiarity with BI tools such as Power BI, Tableau, or Looker.
• Strong problem-solving and communication skills, with the ability to explain technical concepts to non-technical stakeholders.
Preferred Qualifications:
• Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field.
• Experience working in regulated industries (e.g., finance, healthcare) with a focus on data compliance and privacy.
• Familiarity with AI/ML frameworks such as TensorFlow, PyTorch, or MLlib.
• Certifications in cloud platforms or big data technologies (e.g., AWS Big Data Specialty, GCP Data Engineer).
Timelines and Payout:
Project start date: Immediate
Project duration: 6 months
Time availability: 25 hours per course
Job type: Contract
Work location: Remote
Job types: Part-time, Contract, Temporary
Application Question(s):
Are you a US citizen?
How many years of experience do you have with Big Data?
Do you have teaching experience?
Do you have video editing experience? Are you camera-friendly?