https://github.com/gromgit/yt_podoff
A simple bash YouTube podcatcher
bash-script podcatcher youtube-dl
- Host: GitHub
- URL: https://github.com/gromgit/yt_podoff
- Owner: gromgit
- License: MIT
- Created: 2018-12-05T16:36:08.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2019-07-03T08:42:44.000Z (almost 7 years ago)
- Last Synced: 2025-04-13T04:38:23.313Z (about 1 year ago)
- Topics: bash-script, podcatcher, youtube-dl
- Language: Shell
- Homepage:
- Size: 2.93 KB
- Stars: 10
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# yt_podoff
Create offline caches of YouTube podcast feeds
## USAGE
1. Create one or more directories to contain your desired feeds.
2. Create a file in each directory containing the YouTube user/channel/playlist URLs for that feed.
3. Run `yt_podoff` on those files:
```
$ yt_podoff [<youtube-dl_options>... --] <url_file>...
```
## EXAMPLE
```
$ mkdir -p ${HOME}/podcasts/Hak5
$ echo "https://www.youtube.com/playlist?list=PLW5y1tjAOzI0w_GbtiEbYS5PGJ2pmxAIX" > ${HOME}/podcasts/Hak5/.url
$ yt_podoff --write-info-json -- ${HOME}/podcasts/Hak5/.url
```
## INSPIRATION
I'd already been using the venerable [youtube-dl](https://rg3.github.io/youtube-dl/) to catch up on several podcasts that had moved to YouTube.
Then a [stray Reddit question](https://www.reddit.com/r/commandline/comments/a31eos/using_newsboat_automatic_download_of_new_youtube/) made me take a second look at my setup, and realize that `youtube-dl` was actually quite inefficient at determining which videos to download.
So I rewrote my caching script to process the YouTube RSS feeds instead, and pass the necessary URLs to `youtube-dl`.
And I'm now a *much* happier camper.
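The feed-based approach described above can be sketched in shell: YouTube publishes an Atom feed per channel or playlist (e.g. `https://www.youtube.com/feeds/videos.xml?playlist_id=<ID>`), so extracting the entry URLs and dropping anything already cached leaves only new work for `youtube-dl`. This is a minimal illustration, not `yt_podoff` itself; the canned feed snippet and the `seen` archive variable are assumptions for the sake of a self-contained example.

```shell
#!/usr/bin/env bash
# Sketch: pick out video URLs from a YouTube Atom feed, keep only unseen ones.
# The feed XML below is a canned sample; in practice it would be fetched with
#   curl -s "https://www.youtube.com/feeds/videos.xml?playlist_id=${PLAYLIST_ID}"
feed='<feed>
<entry><link rel="alternate" href="https://www.youtube.com/watch?v=abc123"/></entry>
<entry><link rel="alternate" href="https://www.youtube.com/watch?v=def456"/></entry>
</feed>'

# "seen" plays the role of a download archive: one already-fetched URL per line.
seen="https://www.youtube.com/watch?v=abc123"

# Extract watch URLs from the feed, then filter out archived ones
# (-F fixed strings, -x whole-line match, -v invert).
new_urls=$(grep -o 'https://www\.youtube\.com/watch?v=[A-Za-z0-9_-]*' <<<"$feed" \
  | grep -vxF -f <(printf '%s\n' "$seen"))

printf '%s\n' "$new_urls"
```

Only the unseen URL survives the filter; those remaining URLs are what would then be handed to `youtube-dl`, so it never has to probe videos it has already downloaded.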