Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/thetardigrade/golang-cachedpagedownloader
Go package to download webpages, or access previously cached versions of them.
- Host: GitHub
- URL: https://github.com/thetardigrade/golang-cachedpagedownloader
- Owner: theTardigrade
- License: gpl-3.0
- Created: 2023-02-12T16:50:37.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-02-28T03:19:19.000Z (almost 2 years ago)
- Last Synced: 2024-11-08T11:13:39.190Z (about 2 months ago)
- Topics: downloader, go, golang, web-scraper, web-scraping, webpage-capture
- Language: Go
- Homepage:
- Size: 58.6 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
# golang-cachedPageDownloader
This package makes it easy to download — or access previously cached versions of — webpages.
[![Go Reference](https://pkg.go.dev/badge/github.com/theTardigrade/golang-cachedPageDownloader.svg)](https://pkg.go.dev/github.com/theTardigrade/golang-cachedPageDownloader) [![Go Report Card](https://goreportcard.com/badge/github.com/theTardigrade/golang-cachedPageDownloader)](https://goreportcard.com/report/github.com/theTardigrade/golang-cachedPageDownloader)
## Example
```golang
package main

import (
	"fmt"
	"time"

	cachedPageDownloader "github.com/theTardigrade/golang-cachedPageDownloader"
)

const (
	exampleURL = "https://google.com/"
)

func main() {
	downloader, err := cachedPageDownloader.NewDownloader(&cachedPageDownloader.Options{
		CacheDir:               "./cache",
		MaxCacheDuration:       time.Minute * 5,
		ShouldKeepCacheOnClose: false,
	})
	if err != nil {
		panic(err)
	}
	defer downloader.Close()

	// Calling the function below will retrieve the content of the webpage from the internet.
	content, isFromCache, err := downloader.Download(exampleURL)
	if err != nil {
		panic(err)
	}

	fmt.Println(len(content))
	fmt.Println(isFromCache) // false

	fmt.Println("*****")

	// Calling the function again will retrieve the content of the webpage from our cache.
	content, isFromCache, err = downloader.Download(exampleURL)
	if err != nil {
		panic(err)
	}

	fmt.Println(len(content))
	fmt.Println(isFromCache) // true
}
```

## Support
If you use this package, or find any value in it, please consider donating:
[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/S6S2EIRL0)