Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/metalaka/worldoflogs-parser
- Host: GitHub
- URL: https://github.com/metalaka/worldoflogs-parser
- Owner: Metalaka
- License: mit
- Created: 2021-06-02T13:14:40.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2021-12-14T23:15:53.000Z (about 3 years ago)
- Last Synced: 2023-08-10T08:23:03.854Z (over 1 year ago)
- Language: C#
- Size: 34.2 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# WorldOfLogs-Parser
## Description
This tool allows you to store in a database the JavaScript data gathered from a World of Logs expression editor backup.
## Run/config
Use `Data/appsettings.sample.json` to create the configuration file `appsettings.json`.
Build, then run with the path argument: `dotnet FileImporter.dll --path="/path/to/backup/"`
Every HTML file in that folder will be parsed and imported into the database.
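The keys of the configuration file are not shown in this index; the actual ones live in `Data/appsettings.sample.json`. As a purely hypothetical sketch, an importer like this typically needs little more than a database connection string:

```json
{
  "ConnectionStrings": {
    "Default": "Server=localhost;Database=worldoflogs;User=wol;Password=changeme"
  }
}
```

Copy the real keys from the sample file rather than this sketch.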
## Context
World of Logs was an online service for analysing combat logs of World of Warcraft fights.
http://www.worldoflogs.com/ was shut down on 2021-05-17. It was the last place where logs from the WoW: WotLK expansion could still be uploaded and accessed.
Both capabilities are necessary to compare the behaviour of different versions of a similar fight.
### Analyse
WOL allowed uploading *reports* organised by *guilds*, but not downloading them.
There were many ways to analyse each report:
- by actor (player, creature, boss)
- by spell
- with death log
- expression editor: the ability to query the whole log with conditions/expressions
The expression editor can be used without conditions to see every line of the report.
So, given some time, a complete report can be downloaded with a web crawler. Moreover, the expression editor UI was mainly built on the client side, with the raw data injected into the HTML sources.
The raw JSON data is therefore easy to reach, and there is no need to parse formatted HTML to rebuild it. After backing up a report, the data has to be extracted from the HTML files.
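The extraction step can be sketched language-agnostically. The project itself is C#, and the variable name below is hypothetical (inspect the saved pages to find the actual assignment), but the idea is the same: locate the JavaScript assignment embedded in the page and decode its JSON value directly, with no HTML parsing.

```python
import json
import re

def extract_embedded_json(html: str, var_name: str = "rawData"):
    """Pull a JSON object assigned to a JS variable out of an HTML page.

    `rawData` is a placeholder name, not the one World of Logs used.
    Returns the decoded object, or None if the assignment is absent.
    """
    # Match `var rawData = { ... };` lazily, up to the first `};`
    pattern = re.compile(
        r"var\s+" + re.escape(var_name) + r"\s*=\s*(\{.*?\});",
        re.DOTALL,
    )
    match = pattern.search(html)
    if match is None:
        return None
    return json.loads(match.group(1))

# A miniature stand-in for one backed-up report page.
page = """
<html><body>
<script type="text/javascript">
var rawData = {"events": [{"time": 12, "spell": "Frostbolt"}]};
</script>
</body></html>
"""
data = extract_embedded_json(page)
```

Once decoded, the object can be mapped straight onto database rows.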
I want to exploit the data through a database, which is why this project targets that output.