{"id":26279721,"url":"https://github.com/alephdata/cronodump","last_synced_at":"2025-05-07T03:04:18.234Z","repository":{"id":41522708,"uuid":"384341341","full_name":"alephdata/cronodump","owner":"alephdata","description":"A Cronos database converter","archived":false,"fork":false,"pushed_at":"2023-10-02T21:12:21.000Z","size":138,"stargazers_count":79,"open_issues_count":10,"forks_count":18,"subscribers_count":13,"default_branch":"master","last_synced_at":"2025-05-07T03:03:43.529Z","etag":null,"topics":["cronos","cronospro","database"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/alephdata.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null}},"created_at":"2021-07-09T06:19:55.000Z","updated_at":"2025-03-31T19:14:30.000Z","dependencies_parsed_at":"2023-10-03T02:55:18.889Z","dependency_job_id":null,"html_url":"https://github.com/alephdata/cronodump","commit_stats":{"total_commits":77,"total_committers":3,"mean_commits":"25.666666666666668","dds":"0.35064935064935066","last_synced_commit":"157fd7560d6d7c4064240ea324577f86c9e8d8cd"},"previous_names":[],"tags_count":1,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alephdata%2Fcronodump","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alephdata%2Fcronodump/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alephdata%2Fcronodump/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/alephdata%2Fcronodump/manifests","owner_url":"https://re
pos.ecosyste.ms/api/v1/hosts/GitHub/owners/alephdata","download_url":"https://codeload.github.com/alephdata/cronodump/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":252804206,"owners_count":21806769,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["cronos","cronospro","database"],"created_at":"2025-03-14T14:16:06.538Z","updated_at":"2025-05-07T03:04:18.202Z","avatar_url":"https://github.com/alephdata.png","language":"Python","readme":"# cronodump\n\nThe cronodump utility can parse most databases created by the [CronosPro](https://www.cronos.ru/) database software\nand dump them to several output formats.\n\nThe software is popular among Russian public offices, companies and police agencies.\n\n\n# Quick start\n\nIn its simplest form, without any dependencies, the croconvert command creates a [CSV](https://en.wikipedia.org/wiki/Comma-separated_values) representation of all the database's tables and a copy of all files contained in the database:\n\n```bash\nbin/croconvert --csv test_data/all_field_types\n```\n\nBy default it creates a `cronodump-YYYY-mm-DD-HH-MM-SS-ffffff/` directory containing a CSV file for each table found. Under this directory it will also create a `Files-FL/` directory containing all the files stored in the database, regardless of whether they are (still) referenced in any data table. All files that are actually referenced (and thus are known by their filename) will be stored under the `Files-Referenced` directory. 
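The per-table CSV files in that directory are plain CSV, so they can be post-processed with any standard CSV reader. A minimal Python sketch, assuming a hypothetical dump directory and table name (the file and column names below are illustrative, not taken from a real dump):

```python
import csv
from pathlib import Path

# Hypothetical croconvert output directory and table file name.
dump_dir = Path("cronodump-2023-10-02-21-12-21-000000")

# Write a small stand-in CSV so this sketch is self-contained;
# in practice croconvert creates these files for you.
dump_dir.mkdir(exist_ok=True)
sample = dump_dir / "Persons.csv"
sample.write_text("recid,Name\n1,Ivanov\n2,Petrov\n", encoding="utf-8")

with sample.open(newline="", encoding="utf-8") as fh:
    rows = list(csv.reader(fh))

header, records = rows[0], rows[1:]
print(header)        # the table's column names
print(len(records))  # number of records in the table
```
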
With the `--outputdir` option you can choose your own dump location.\n\nIf you get an error message, or just unreadable data, chances are your database is password protected. You may need to look into the `--dbcrack` or `--strucrack` options, explained below.\n\n\n# Templates\n\nThe croconvert command can use the powerful [jinja templating framework](https://jinja.palletsprojects.com/en/3.0.x/) to render further output formats such as PostgreSQL and HTML.\nThe default action for `croconvert` is to convert the database using the `html` template.\nUse\n\n```bash\npython3 -m venv ./venv\n. venv/bin/activate\npip install jinja2\nbin/croconvert test_data/all_field_types \u003e test_data.html\n```\n\nto dump an HTML file with all tables found in the database, all files listed and ready for download as inlined [data URIs](https://en.wikipedia.org/wiki/Data_URI_scheme), and all table images inlined as well. Note that the resulting HTML file can be huge for large databases, putting a lot of load on browsers trying to open it.\n\n\nThe `-t postgres` option will dump the table schemas and records as valid `CREATE TABLE` and `INSERT INTO` statements to stdout. This dump can then be imported into a PostgreSQL database. Note that the backslash character is not escaped, so the [`standard_conforming_strings`](https://www.postgresql.org/docs/current/runtime-config-compatible.html#GUC-STANDARD-CONFORMING-STRINGS) option should be off.\n\nPull requests for [more templates supporting other output types](/templates) are welcome.\n\n\n# Inspection\n\nThere's a `bin/crodump` tool to investigate databases further. This might be useful for extracting metadata like the path names of table image files or input and output forms. 
Not all metadata has been completely reverse engineered yet, so some experience with reading binary dumps might be required.\n\nThe crodump script has a plethora of options, but in its most basic form the `strudump` subcommand will provide a rich variety of metadata to look into further:\n\n```bash\nbin/crodump strudump -v -a test_data/all_field_types/\n```\nThe `-a` option tells strudump to output ASCII instead of a hexdump.\n\nFor a low-level dump of the database contents, use:\n```bash\nbin/crodump crodump -v test_data/all_field_types/\n```\nThe `-v` option tells crodump to include all unused byte ranges; this may be useful for identifying deleted records.\n\nFor a slightly higher-level dump of the database contents, use:\n```bash\nbin/crodump recdump test_data/all_field_types/\n```\nThis will print a hexdump of all records for all tables.\n\n\n## decoding password protected databases\n\nCronos v4 and higher can password protect databases; the protection works\nby modifying the KOD sbox. `cronodump` has two methods of deriving the KOD sbox from\na database:\n\nBoth methods are statistics-based operations, so they may not always\nyield the correct KOD sbox.\n\n\n### 1. strudump\n\nWhen the database has a sufficiently large CroStru.dat file,\nit is easy to derive the modified KOD sbox from the CroStru file; the `--strucrack` option\nwill do this.\n\n    crodump --strucrack recdump \u003cdbpath\u003e\n\n### 2. dbdump\n\nWhen the Bank and Index files are compressed, we can derive the KOD sbox by inspecting\nthe fourth byte of each record, which should decode to zero.\n\nThe `--dbcrack` option will do this.\n\n    crodump --dbcrack recdump \u003cdbpath\u003e\n\n\n# Installing\n\n`cronodump` requires Python 3.7 or later. 
It has been tested on Linux, macOS and Windows.\nThere is one optional requirement, the `Jinja2` templating engine, but cronodump will install fine without it.\n\nThere are several ways of installing `cronodump`:\n\n * You can run `cronodump` directly from the cloned git repository, using the shell scripts in the `bin` subdirectory.\n * You can install `cronodump` into your Python environment by running `python setup.py build install`.\n * You can install `cronodump` from the public [pypi repository](https://pypi.org/project/cronodump/) with `pip install cronodump`.\n * You can install `cronodump` together with the `Jinja2` templating engine from the public [pypi repository](https://pypi.org/project/cronodump/) with `pip install cronodump[templates]`.\n\n\n# Terminology\n\nWe decided to use the more common terminology for databases, tables, records, etc.\nHere is a table showing what Cronos calls these:\n\n| what | cronos english | cronos russian\n|:------ |:------ |:------\n| Database  |  Bank   | Банк\n| Table     |  Base   | Базы\n| Record    |  Record | Записи\n| Field     |  Field  | поля\n| recid     |  System Number | Системный номер\n\n\n# License\n\ncronodump is released under the [MIT license](LICENSE).\n\n\n# References\n\ncronodump builds upon [documentation of the file format found in older versions of Cronos](http://sergsv.narod.ru/cronos.htm) and\nthe [subsequent implementation of a parser for the old file format](https://github.com/occrp/cronosparser), but dropped the heuristic\napproach of guessing offsets and obfuscation parameters in favor of a stricter parser. 
Refer to [the docs](docs/cronos-research.md) for further\ndetails.\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falephdata%2Fcronodump","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Falephdata%2Fcronodump","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Falephdata%2Fcronodump/lists"}