Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pacuna/discogs-bq-importer
Scio job to import discogs xml dumps into BigQuery
- Host: GitHub
- URL: https://github.com/pacuna/discogs-bq-importer
- Owner: pacuna
- Created: 2023-07-08T16:39:26.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-07-10T00:06:54.000Z (over 1 year ago)
- Last Synced: 2024-10-24T22:40:32.774Z (2 months ago)
- Topics: apache, beam, discogs, discogs-dump, discogs-importer, scala, scio, xml
- Language: Java
- Homepage:
- Size: 43 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
README
# Discogs BigQuery importer
## Raison d'être
A Scio Dataflow job to load the [Discogs XML dumps](https://discogs-data-dumps.s3.us-west-2.amazonaws.com/index.html) into BigQuery tables.
## Running
Currently available jobs:
- [x] Labels
- [x] Artists
- [x] Masters
- [x] Releases

To run the jobs, download the compressed XML dumps and upload them to your own GCP bucket (an example upload is shown below). Follow the [Scio instructions](https://spotify.github.io/scio/Getting-Started.html) to set up your GCP project.

All run times below were measured using the default Dataflow run arguments.
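The upload step might look like this, assuming the dumps have already been downloaded locally and the `gsutil` CLI is available; bucket and file names are placeholders.

```
# Copy the compressed dumps into your own GCS bucket (names are examples).
gsutil cp releases.xml.gz gs://your-bucket/releases.xml.gz
gsutil cp artists.xml.gz gs://your-bucket/artists.xml.gz
gsutil cp masters.xml.gz gs://your-bucket/masters.xml.gz
# labels.xml.gz needs the renaming step described in the Labels section before upload.
```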
### Releases (~ 1 hour)
```
sbt "runMain discogs.ReleasesJob
--project=your-gcp-project-id
--runner=DataflowRunner
--region=us-central1
--input=gs://your-bucket/releases.xml.gz
--output=your-project.bq-dataset.releases-bq-table"
```

### Artists (~ 11 minutes)
```
sbt "runMain discogs.ArtistsJob
--project=your-gcp-project-id
--runner=DataflowRunner
--region=us-central1
--input=gs://your-bucket/artists.xml.gz
--output=your-project.bq-dataset.artists-bq-table"
```

### Masters (~ 6 minutes)
```
sbt "runMain discogs.MastersJob
--project=your-gcp-project-id
--runner=DataflowRunner
--region=us-central1
--input=gs://your-bucket/masters.xml.gz
--output=your-project.bq-dataset.masters-bq-table"
```

### Labels (~ 6 minutes)
Because of an Apache Beam [XmlIO](https://beam.apache.org/releases/javadoc/2.3.0/org/apache/beam/sdk/io/xml/XmlIO.html) limitation with nested tags that share the name of the outer record tag, the original labels file cannot be processed as is. A small converter in `src/main/java/utils/LabelRenamer.java` renames the nested tags. Download the original file, decompress it, and run the converter; then compress the output file, upload it to your GCP bucket, and use that file as the input for the job (see the sketch below).
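End to end, the preparation might look like the following sketch. The `LabelRenamer` command-line arguments are an assumption here (check `src/main/java/utils/LabelRenamer.java` for its actual interface), and the file names are placeholders.

```
# Hypothetical preparation sequence; LabelRenamer's arguments are assumed, not documented.
gunzip labels.xml.gz                                             # decompress the original dump
sbt "runMain utils.LabelRenamer labels.xml labels-renamed.xml"   # rename the nested tags
gzip labels-renamed.xml                                          # recompress the converted file
gsutil cp labels-renamed.xml.gz gs://your-bucket/                # upload to your GCP bucket
```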
sbt "runMain discogs.MastersJob
--project=your-gcp-project-id
--runner=DataflowRunner
--region=us-central1
--input=gs://your-bucket/labels-renamed.xml.gz
--output=your-project.bq-dataset.labels-bq-table"
```
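After a job completes, a quick row count with the `bq` CLI can confirm the import; the table reference below is a placeholder matching the examples above.

```
# Sanity check: count the rows loaded into the target table (names are examples).
bq query --use_legacy_sql=false \
  'SELECT COUNT(*) AS row_count FROM `your-project.bq-dataset.labels-bq-table`'
```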