https://github.com/jmandel/fhir-bulk-data-to-bigquery
Bulk Data FHIR API client, with metadata-based schema generation, staging, and push to cloud storage + BigQuery
- Host: GitHub
- URL: https://github.com/jmandel/fhir-bulk-data-to-bigquery
- Owner: jmandel
- Created: 2018-01-28T18:31:00.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2018-05-12T09:57:40.000Z (about 7 years ago)
- Last Synced: 2025-04-16T19:36:43.534Z (2 months ago)
- Language: Python
- Homepage:
- Size: 17.6 KB
- Stars: 7
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# fhir-bulk-data-to-bigquery
Load data from a FHIR Bulk Data API into a BigQuery dataset
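The project description mentions "metadata-based schema generation" — deriving BigQuery table schemas from the FHIR server's metadata rather than hand-writing them. As a rough illustration (not the repo's actual code; the function name, type map, and input shape here are all hypothetical), one element of such a mapping might look like:

```python
# Hypothetical sketch: map a FHIR ElementDefinition-like dict (as found in
# StructureDefinition metadata) to a BigQuery schema field dict.
# The type mapping and field names below are illustrative assumptions,
# not the repository's actual implementation.

FHIR_TO_BQ = {
    "string": "STRING",
    "boolean": "BOOLEAN",
    "integer": "INTEGER",
    "decimal": "FLOAT",
    "instant": "TIMESTAMP",
}

def element_to_bq_field(element):
    """Convert one FHIR element definition to a BigQuery field description."""
    name = element["path"].split(".")[-1]          # e.g. "Patient.name.given" -> "given"
    fhir_type = element["type"][0]["code"]         # first declared FHIR type
    return {
        "name": name,
        "type": FHIR_TO_BQ.get(fhir_type, "STRING"),   # fall back to STRING
        "mode": "REPEATED" if element.get("max") == "*" else "NULLABLE",
    }

if __name__ == "__main__":
    el = {"path": "Patient.name.given", "type": [{"code": "string"}], "max": "*"}
    print(element_to_bq_field(el))
```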
# Try it
## Prerequisites
* Docker
* Docker Compose
* gcloud, with credentials for a user account or service account configured

## Setup
```
git clone https://github.com/jmandel/fhir-bulk-data-to-bigquery
cd fhir-bulk-data-to-bigquery
docker-compose build
```

Edit `config/servers.json` to add details for any server(s) you want to connect to.
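The exact schema for `config/servers.json` is defined by the repository itself; as a purely hypothetical illustration of what an entry might contain (all field names and values below are assumptions, not documented by this README), a server entry could look something like:

```json
{
  "smart-bulk-data": {
    "base_url": "https://example.org/fhir",
    "client_id": "my-client-id",
    "token_url": "https://example.org/auth/token"
  }
}
```

Check the repository's sample config for the actual required fields before editing.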
Run the loader, specifying a:
* `--source` matching the name of an entry in your `config/servers.json`
* `--bigquery-dataset` specifying the dataset in which you'll create tables
* `--gcs-bucket` specifying the bucket to which you'll write the data for storage (**Warning**: existing `.ndjson` files will be deleted from your bucket!)

## Example
```
docker-compose run loader \
--source smart-bulk-data \
--bigquery-dataset fhir-org-starter-project:bulk_data_smart_100 \
--gcs-bucket fhir-bulk-data
```