https://github.com/interactive-video-labs/interactive-video-core
💡 Framework-agnostic TypeScript engine for building cue-based interactive video experiences. Lightweight, extendable, and works with any web framework — React, Vue, or plain JS.
- Host: GitHub
- URL: https://github.com/interactive-video-labs/interactive-video-core
- Owner: interactive-video-labs
- License: MIT
- Created: 2025-07-25T16:28:55.000Z (7 months ago)
- Default Branch: main
- Last Pushed: 2025-07-31T18:46:42.000Z (7 months ago)
- Last Synced: 2025-07-31T19:58:29.883Z (7 months ago)
- Topics: framework-agnostic, front-end-library, html5-video, interactive-video, learning-platform, media-framework, npm-package, state-machine, typescript, video-cues, video-interaction, video-player, web-video
- Language: TypeScript
- Homepage: https://www.npmjs.com/package/@interactive-video-labs/core
- Size: 302 KB
- Stars: 1
- Watchers: 0
- Forks: 2
- Open Issues: 2
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
# @interactive-video-labs/core
Welcome to `@interactive-video-labs/core` — the framework-agnostic TypeScript module powering interactive video experiences. This engine is designed to be lightweight, extendable, and usable across React, Vue, or plain JavaScript apps.
---
## 🚀 Getting Started
To get started with `@interactive-video-labs/core`, follow these steps:
### Installation
If you're integrating this module into an existing project, you can install it via `pnpm`, `npm`, or `yarn`:
```bash
pnpm add @interactive-video-labs/core
# or
npm install @interactive-video-labs/core
# or
yarn add @interactive-video-labs/core
```
### Basic Usage
For a basic working example, refer to the [`examples/index.html`](examples/index.html) file. It demonstrates how to initialize and use the `IVLabsPlayer` in a plain HTML setup.
Here's a simplified snippet of how you might set up the player:
```typescript
import { IVLabsPlayer, InMemoryDecisionAdapter, LocalStorageDecisionAdapter } from '@interactive-video-labs/core';

// Initialize a decision adapter (e.g., for in-memory storage or local storage)
// const decisionAdapter = new InMemoryDecisionAdapter();
const decisionAdapter = new LocalStorageDecisionAdapter();

const playerConfig = {
  videoUrl: 'https://interactive-video-labs.github.io/assets/sample-video.mp4',
  cues: [
    {
      id: 'intro',
      time: 2,
      label: 'Introduction',
      payload: { message: 'Welcome!' }
    },
    {
      id: 'question1',
      time: 5,
      label: 'Question 1',
      payload: {
        interaction: {
          type: 'choice',
          title: 'Quick Question',
          description: 'Please select the correct answer.',
          question: 'What is 2+2?',
          options: ['3', '4', '5'],
          correctAnswer: '4'
        }
      }
    },
    {
      id: 'midpoint',
      time: 8,
      label: 'Midpoint',
      payload: { message: 'Halfway there!' }
    },
    {
      id: 'segmentChange',
      time: 10,
      label: 'Segment Change',
      payload: {
        interaction: {
          type: 'choice-video-segment-change',
          title: 'Choose your path',
          description: 'Select a video segment to jump to.',
          question: 'Where would you like to go?',
          options: [
            {
              level: 'Segment A',
              video: 'https://interactive-video-labs.github.io/assets/sample-interaction-1.mp4'
            },
            {
              level: 'Segment B',
              video: 'https://interactive-video-labs.github.io/assets/sample-interaction-2.mp4'
            }
          ]
        }
      }
    },
    {
      id: 'question2',
      time: 12,
      label: 'Question 2',
      payload: {
        interaction: {
          type: 'text',
          title: 'Your Information',
          description: 'Please enter your name below.',
          question: 'Your name?'
        }
      }
    },
    {
      id: 'end',
      time: 15,
      label: 'End',
      payload: { message: 'Thanks for watching!' }
    }
  ],
  initialState: 'idle',
  decisionAdapter: decisionAdapter // Pass the decision adapter to the player config
};

// Assuming you have a div with id="player-container" in your HTML
const player = new IVLabsPlayer('player-container', playerConfig);
player.init();
// player.play(); // Start playback
```
### Decision Tracking and Persistence
The `@interactive-video-labs/core` module supports tracking and persisting user decisions made during interactions. This is achieved through the `DecisionAdapter` interface, which defines how decisions are stored and retrieved. Two concrete implementations are provided:
* **`InMemoryDecisionAdapter`**: Stores decisions in memory. This is useful for temporary tracking within a single session.
* **`LocalStorageDecisionAdapter`**: Stores decisions in the browser's local storage, providing persistence across sessions.
To use decision tracking, simply instantiate one of the adapters and pass it to the `IVLabsPlayer` configuration:
```typescript
import { IVLabsPlayer, LocalStorageDecisionAdapter } from '@interactive-video-labs/core';

const decisionAdapter = new LocalStorageDecisionAdapter();

const playerConfig = {
  // ... other player configurations
  decisionAdapter: decisionAdapter,
};

const player = new IVLabsPlayer('player-container', playerConfig);
player.init();

// To retrieve decision history:
async function getHistory() {
  const history = await decisionAdapter.getDecisionHistory();
  console.log('Decision History:', history);
}

// To clear decision history:
async function clearHistory() {
  await decisionAdapter.clearDecisionHistory();
  console.log('Decision History cleared.');
}
```
This allows you to easily integrate decision tracking into your interactive video experiences, enabling features like personalized content, progress saving, or analytics based on user choices.
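If neither built-in adapter fits your needs, a custom adapter can be written against the same interface. The sketch below is illustrative only: the read methods (`getDecisionHistory`, `clearDecisionHistory`) mirror the usage shown above, but the write-method name (`saveDecision`), the `Decision` fields, and the interface declaration itself are assumptions, not the module's published typings. It shows a capped in-memory adapter that retains only the most recent N decisions:

```typescript
// Assumed shape of a recorded decision (not the module's actual typings).
interface Decision {
  cueId: string;
  value: unknown;
  timestamp: number;
}

// Assumed adapter interface; the read methods mirror the README example,
// the write-method name is hypothetical.
interface DecisionAdapter {
  saveDecision(decision: Decision): Promise<void>;
  getDecisionHistory(): Promise<Decision[]>;
  clearDecisionHistory(): Promise<void>;
}

// A custom adapter that keeps only the most recent `maxEntries` decisions,
// useful when you want session tracking without unbounded growth.
class CappedDecisionAdapter implements DecisionAdapter {
  private decisions: Decision[] = [];

  constructor(private maxEntries: number = 100) {}

  async saveDecision(decision: Decision): Promise<void> {
    this.decisions.push(decision);
    if (this.decisions.length > this.maxEntries) {
      this.decisions.shift(); // drop the oldest decision
    }
  }

  async getDecisionHistory(): Promise<Decision[]> {
    return [...this.decisions]; // return a copy so callers cannot mutate state
  }

  async clearDecisionHistory(): Promise<void> {
    this.decisions = [];
  }
}
```

An instance of such a class would then be passed as `decisionAdapter` in the player configuration, exactly like the built-in adapters.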
## ✨ Features
The `@interactive-video-labs/core` engine includes the following core functionalities:
* **Core Player (`IVLabsPlayer`):** Orchestrates video playback, cue handling, interactions, and analytics.
* **State Machine:** Manages player state transitions (e.g., idle, playing, waitingForInteraction).
* **Cue Handler:** Manages and triggers cue points based on video time updates.
* **Interaction Manager:** Renders and manages various interaction types (choice, text, video segment changes) and handles user responses.
* **Analytics System:** Provides a flexible hook system for tracking player and interaction events.
* **Segment Manager:** Handles seamless transitions between different video segments.
* **Centralized Types:** Defines all core types and interfaces for consistency across the module.
* **Basic HTML Demo:** A raw HTML demo (`examples/index.html`) to validate core functionality without frameworks.
* **Testing Framework:** Utilizes Vitest for comprehensive unit testing.
* **Localization Support:** Enabled for broader audience reach.
* **Subtitle-based Cue Generation:** Supports generating cues from subtitles.
* **Multi-segment Video Lessons:** Capability to support video content divided into multiple segments.
* **Build and Publish Pipeline:** Configured for NPM publishing.
* **Decision Tracking and Persistence:** Supports tracking and persisting user decisions using configurable adapters (e.g., `InMemoryDecisionAdapter`, `LocalStorageDecisionAdapter`).
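The state machine mentioned above can be pictured as a small transition table. The sketch below is not the module's actual implementation; the states `idle`, `playing`, and `waitingForInteraction` come from the feature list, while the `paused` state and all event names are assumptions for illustration:

```typescript
type PlayerState = 'idle' | 'playing' | 'paused' | 'waitingForInteraction';
type PlayerEvent = 'PLAY' | 'PAUSE' | 'CUE_REACHED' | 'INTERACTION_COMPLETE';

// Transition table: current state -> event -> next state.
const transitions: Record<PlayerState, Partial<Record<PlayerEvent, PlayerState>>> = {
  idle: { PLAY: 'playing' },
  playing: { PAUSE: 'paused', CUE_REACHED: 'waitingForInteraction' },
  paused: { PLAY: 'playing' },
  waitingForInteraction: { INTERACTION_COMPLETE: 'playing' },
};

class PlayerStateMachine {
  constructor(public state: PlayerState = 'idle') {}

  // Apply an event; events with no defined transition leave the state unchanged,
  // e.g. PAUSE is ignored while waiting for an interaction.
  send(event: PlayerEvent): PlayerState {
    const next = transitions[this.state][event];
    if (next) this.state = next;
    return this.state;
  }
}
```

Modeling transitions as data rather than branching logic makes it easy to audit which events are valid in each state and to reject everything else by default.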
---
## 📁 Examples
```
/examples/
└── index.html # A raw HTML demo using the built UMD bundle
```
This example helps test core functionality without frameworks like React/Vue.
---
## 🧑‍💻 For Developers
For detailed development setup, project structure, testing, build, and publishing instructions, please refer to our [Developer Guide](DEVELOPER.md).
---
## 🤝 Contributing
We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for guidelines.
---
## 📄 License
This project is licensed under the [MIT License](LICENSE).