Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/bartolomej/wikinet
Visualization of wikipedia knowledge graph.
- Host: GitHub
- URL: https://github.com/bartolomej/wikinet
- Owner: bartolomej
- License: MIT
- Created: 2019-06-21T10:12:50.000Z (over 5 years ago)
- Default Branch: old-improved
- Last Pushed: 2023-01-06T04:35:37.000Z (almost 2 years ago)
- Last Synced: 2023-03-06T23:09:16.551Z (over 1 year ago)
- Topics: knowledge-graph, scrapes, visualization, wikipedia-knowledge-graph
- Language: JavaScript
- Homepage:
- Size: 1.47 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 10
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# wikinet
Visualize how pages on Wikipedia are connected (up to 3 degrees of depth).
![screenshot](https://i.ibb.co/8P0byVb/Screenshot-2019-11-16-at-22-17-29.png)
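Conceptually, building the graph starts with extracting the `/wiki/` links a page contains. A minimal sketch of that step over raw HTML; the function name and sample markup are illustrative, not the project's actual scraper:

```javascript
// Extract linked Wikipedia page names from a chunk of HTML.
// Skips anchors with fragments (#) and namespaced pages (File:, Category:, ...).
function extractWikiLinks(html) {
  const links = new Set();
  const re = /href="\/wiki\/([^"#:]+)"/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    links.add(decodeURIComponent(m[1]));
  }
  return [...links];
}

const sampleHtml =
  '<p><a href="/wiki/Logic">Logic</a> and <a href="/wiki/Ethics">Ethics</a>' +
  ' and <a href="/wiki/Logic">Logic again</a></p>';
```

Using a `Set` deduplicates repeated links to the same page, so each neighbour is recorded once.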
## Similar projects
- [Six Degrees of Wikipedia](https://www.sixdegreesofwikipedia.com/)
- [All pages lead to Philosophy](https://www.xefer.com/wikipedia)

## Run with Docker

1. [Install docker-compose](https://docs.docker.com/compose/install/).
2. Create a `.env` file with the required environment variables defined below.
3. Run the app with: `docker-compose up`

## Running manually

In the project directory, you can run:

1. Install dependencies with `npm i` or `yarn install`
2. Run the server app with `npm start` or `yarn start`
## Setup environment vars
Provide MySQL database credentials manually:
```dotenv
DB_HOST = localhost
DB_PORT = 3306
DB_USER = someuser
DB_NAME = databasename
DB_PASSWORD = yourpassword
```

### CLI commands
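Before running any of the CLI commands below, these variables must be available to the process. A minimal hand-rolled sketch of parsing the `.env` format shown above into a MySQL connection config; a real setup would typically use the `dotenv` package and a driver such as `mysql2` (both assumptions, not confirmed by this repo):

```javascript
// Sample text in the same KEY = value format as the `.env` file above.
const sample = [
  "DB_HOST = localhost",
  "DB_PORT = 3306",
  "DB_USER = someuser",
  "DB_NAME = databasename",
  "DB_PASSWORD = yourpassword",
].join("\n");

// Parse KEY = value lines, ignoring anything that doesn't match.
function parseDotenv(text) {
  const vars = {};
  for (const line of text.split("\n")) {
    const m = line.match(/^\s*([A-Z_][A-Z0-9_]*)\s*=\s*(.*)$/);
    if (m) vars[m[1]] = m[2].trim();
  }
  return vars;
}

const env = parseDotenv(sample);
const dbConfig = {
  host: env.DB_HOST,
  port: Number(env.DB_PORT),
  user: env.DB_USER,
  database: env.DB_NAME,
  password: env.DB_PASSWORD,
};
```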
- List of available commands: `node src/bin/cli --help`
- Start with this command; it scrapes all neighbours of a given page URL: `node src/bin/cli scrape --link `
- Scrapes 3 degrees from a given page URL (breadth-first search): `node src/bin/cli bfs --initial `
- Scrapes degrees from a given URL (depth-first scrape): `node src/bin/cli dbs --initial --degrees `
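The breadth-first expansion behind the `bfs` command can be sketched as follows. The `linkGraph`, `bfs` name, and page names are illustrative stand-ins for live Wikipedia data, not the project's actual code:

```javascript
// page -> pages it links to (a tiny in-memory stand-in for scraped data)
const linkGraph = {
  Philosophy: ["Logic", "Ethics"],
  Logic: ["Mathematics"],
  Ethics: ["Morality"],
  Mathematics: ["Number"],
  Morality: [],
  Number: [],
};

// Visit every page reachable from `start` within `maxDegrees` link hops,
// expanding one full frontier (degree) at a time.
function bfs(start, maxDegrees) {
  const visited = new Set([start]);
  let frontier = [start];
  for (let degree = 0; degree < maxDegrees; degree++) {
    const next = [];
    for (const page of frontier) {
      for (const neighbour of linkGraph[page] || []) {
        if (!visited.has(neighbour)) {
          visited.add(neighbour);
          next.push(neighbour);
        }
      }
    }
    frontier = next;
  }
  return visited;
}
```

Because the frontier is expanded level by level, stopping after three iterations gives exactly the "up to 3 degrees of depth" behaviour the README describes.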