Zero-Shot Learning for Multimodal Nudging
https://github.com/paxnea/llm-multimodal-nudging
- Host: GitHub
- URL: https://github.com/paxnea/llm-multimodal-nudging
- Owner: paxnea
- Created: 2023-05-27T19:52:11.000Z (over 2 years ago)
- Default Branch: master
- Last Pushed: 2023-09-06T17:18:00.000Z (about 2 years ago)
- Last Synced: 2025-04-13T17:04:17.024Z (6 months ago)
- Topics: content-creation, generative-ai, llm, multimodal, recommender-system, zero-shot-learning
- Language: Jupyter Notebook
- Homepage:
- Size: 94.7 MB
- Stars: 3
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Zero-Shot Learning for Multimodal Nudging
Source code for the paper "[Zero-Shot Recommendations with Pre-Trained Large Language Models for Multimodal Nudging](https://arxiv.org/abs/2309.01026)".
## File Structure
* directory `./content_generation/` contains files used for synthetic data generation
* directory `./data/` contains the generated dataset: user descriptions, messages, and image captions
* `main.ipynb` loads the generated data and computes recommendations via the proposed zero-shot approach

## Examples of Multimodal Recommendations





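The core of the zero-shot approach is matching users to content items in a shared embedding space, with no task-specific training. A minimal sketch of that matching step is below; the toy 3-d vectors stand in for real embeddings of user descriptions, messages, and image captions (the embedding model, vector dimensions, and `recommend` helper here are illustrative assumptions, not the paper's exact pipeline):

```python
import numpy as np

def cosine_sim(a, b):
    """Row-wise cosine similarity between two 2-d arrays of vectors."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def recommend(user_emb, item_embs, k=1):
    """Return indices of the top-k items by cosine similarity to the user."""
    scores = cosine_sim(user_emb[None, :], item_embs)[0]
    return np.argsort(scores)[::-1][:k]

# Toy 3-d "embeddings" standing in for pre-trained model outputs
user = np.array([1.0, 0.0, 0.0])
items = np.array([
    [0.9, 0.1, 0.0],  # closely aligned with the user
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
print(recommend(user, items, k=1))  # -> [0]
```

Because both users and items (regardless of modality, once captioned as text) live in the same embedding space, the same similarity ranking works for messages and images alike.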