{"id":13624435,"url":"https://github.com/lifthrasiir/roadroller","last_synced_at":"2025-04-05T15:05:04.201Z","repository":{"id":59490491,"uuid":"395492135","full_name":"lifthrasiir/roadroller","owner":"lifthrasiir","description":"Roadroller: Flattens Your JavaScript Demo","archived":false,"fork":false,"pushed_at":"2022-07-24T16:10:28.000Z","size":590,"stargazers_count":336,"open_issues_count":36,"forks_count":12,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-03-29T14:09:31.452Z","etag":null,"topics":["compression","compressor","javascript","js","js13kgames"],"latest_commit_sha":null,"homepage":"https://lifthrasiir.github.io/roadroller/","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"other","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/lifthrasiir.png","metadata":{"files":{"readme":"README.md","changelog":"CHANGELOG.md","contributing":null,"funding":null,"license":"LICENSE.txt","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2021-08-13T01:56:11.000Z","updated_at":"2025-02-09T07:24:22.000Z","dependencies_parsed_at":"2022-09-18T00:01:27.506Z","dependency_job_id":null,"html_url":"https://github.com/lifthrasiir/roadroller","commit_stats":null,"previous_names":[],"tags_count":6,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lifthrasiir%2Froadroller","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lifthrasiir%2Froadroller/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lifthrasiir%2Froadroller/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lifthrasiir%2Froadroller/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/lifthrasiir","download_url":"https://codeload.
github.com/lifthrasiir/roadroller/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247353729,"owners_count":20925329,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["compression","compressor","javascript","js","js13kgames"],"created_at":"2024-08-01T21:01:42.565Z","updated_at":"2025-04-05T15:05:04.182Z","avatar_url":"https://github.com/lifthrasiir.png","language":"JavaScript","readme":"# Roadroller: Flattens Your JavaScript Demo\n\n**Roadroller** is a heavyweight JavaScript packer for large [demos][demo]. It was originally designed for [js13kGames], but it remains usable for demos as small as 4KB. Depending on the input it can provide up to 15% additional compression compared to the best ZIP/gzip recompressors. **[Try it online][online]!**\n\nRoadroller is considered \"heavyweight\", unlike typical JS packers such as [JSCrush] or [RegPack], because it is quite resource-intensive: it requires both a considerable amount of memory and a non-negligible run time. 
The default should work for most devices, but you can configure both aspects as you need.\n\n## Quick Start\n\nIn addition to the [online demo][online], Roadroller is available as an [NPM package][npm]:\n\n```\n$ npx roadroller input.js -o output.js\n```\n\nYou can also use Roadroller as a library to integrate with your build pipeline.\n\n```javascript\nimport { Packer } from 'roadroller';\n\nconst inputs = [\n    {\n        data: 'console.log(\"Hello, world!\");',\n        type: 'js',\n        action: 'eval',\n    },\n];\n\nconst options = {\n    // see the Usage for available options.\n};\n\nconst packer = new Packer(inputs, options);\nawait packer.optimize(); // takes less than 10 seconds by default\n\nconst { firstLine, secondLine } = packer.makeDecoder();\nconsole.log(firstLine + secondLine);\n```\n\nRoadroller as a library or a CLI command requires Node.js 14 or later. Node.js 16 is strongly recommended because Roadroller is substantially faster in 16 than in 14.\n\n## Usage\n\nBy default Roadroller receives your JS code and returns a compressed JS code that should be further compressed with ZIP, gzip or PNG bootstrap (or more accurately, [DEFLATE]). Ideally your JS code should be already minified, probably using [Terser] or [Closure Compiler]; Roadroller only does a minimal whitespace and comment suppression.\n\nThe resulting code will look like this: (the newline is mostly for the explanation and can be removed)\n\n```javascript\neval(Function(\"[M='Zos~ZyF_sTdvfgJ^bIq\u001f_wJWLGSIz}\u001fChb?rMch}...'\"\n,...']charCodeAtUinyxp',\"for(;e\u003c12345;c[e++]=p-128)/* omitted */;return o\")([],[],12345678,/* omitted */))\n```\n\nThe first line is a compressed data. It can contain control characters like `\u001c` (U+001C) that might not render in certain environments. Nevertheless you should make sure that they are all copied in verbatim.\n\nThe second line is a compressor tuned for this particular input. 
By default the decompressed data immediately goes through `eval`, but you can configure what to do with it.\n\nUnlike the second line, the first line is largely incompressible, so ideally you should compress the two lines separately. This is best done with ADVZIP from [AdvanceCOMP] or [ECT]. The first and second lines may form a single statement as above, so they should not be separated; you can only put whitespace between them.\n\n\u003c!--\n### Multiple Inputs\n\nYou can also give multiple inputs to Roadroller; for example you can put shaders and map data. The executed code will receive all decompressed inputs. This is more efficient than putting them into strings because each input can be separately modelled.\n--\u003e\n\n### Input Configuration\n\nEach input can be further configured by input type and action. In the CLI you put the corresponding options *before* the file path.\n\n**Input type** (CLI `-t|--type TYPE`, API `type` in the input object) determines the preprocessing step to improve the compression. \u003c!--Dropping a file to the input window also tries to detect the correct input type.--\u003e\n\n* **JavaScript** (`js`) assumes valid JS code. Automatically removes all redundant whitespace and comments and enables a separate modelling for embedded strings. This also works for JSON.\n\n\u003c!--* **GLSL** (`glsl`) assumes a valid GLSL code. Automatically removes all redundant whitespace and comments.--\u003e\n\n\u003c!--* **HTML** (`html`) assumes a valid HTML. (TODO)--\u003e\n\n* **Text** (`text`) assumes human-readable Unicode text that can be encoded in UTF-8. This can also be used for JavaScript code that should not undergo preprocessing.\n\n\u003c!--* **Binary** (`binary`) does nothing. You can choose base64 (`binary:base64`) or hex (`binary:hex`) for the input encoding.--\u003e\n\n**Input action** (CLI `-a|--action ACTION`, API `action` in the input object) determines what to do with the decompressed data. 
\u003c!--All actions except for evaluate produce a value bound to the variable named `_` by default, which is either the value itself for a single input or an array of values for multiple inputs.--\u003e\n\n* \u003c!--*(JS, text only)*--\u003e\n  **Evaluate** (`eval`) evaluates the decompressed JavaScript code. If there are multiple inputs, there should be exactly one JavaScript input with the evaluate action, since subsequent inputs will be decompressed in that code. The resulting value is always a code string, which may include decoders for subsequent inputs.\n\n\u003c!--* *(JS, text only)* **JSON decode** (`json`) parses and returns a JSON value with `JSON.parse`.--\u003e\n\n\u003c!--* *(No binary)* **String** (`string`) returns a string.--\u003e\n\n* \u003c!--*(No binary)*--\u003e\n  **Write to document** (`write`) writes a decompressed string to `document`. Typically used with HTML.\n\n\u003c!--* **Array** (`array`) returns an array of bytes.--\u003e\n\n\u003c!--* **Typed array** (`u8array`) returns a `Uint8Array` value.--\u003e\n\n\u003c!--* **Base64** (`base64`) returns a base64-encoded string. Handy for data URIs.--\u003e\n\n\u003c!--\n**Input name** (CLI `-n|--name NAME`, API `name`) is required for accessing each input from the decompressed code. This is required if the input produces an output value.\n\n**Extract inputs** (CLI `-x|--extract`, not available in API) can be used for the JavaScript input with the evaluate action. This will try to extract long embedded strings and determine the best type for each input. This assumes that the compressed code doesn't make use of the output variable elsewhere; you can change the variable name from the configuration.\n--\u003e\n\n### Output Configuration\n\n**Number of contexts** (CLI `-S|--selectors xCOUNT`) relates to the complexity of modelling. A larger number of contexts will compress better, but at the expense of a linear increase in both time and memory usage. 
The default is 12, which targets at most 1 second of latency for a typical 30 KB input.\n\n**Maximum memory usage** (CLI `-M|--max-memory MEGABYTES`, API `maxMemoryMB` in the options object) configures the maximum memory to be used for decompression. Increasing or decreasing memory usage mostly affects the compression ratio and not the run time. The default is 150 MB, and a larger value is not recommended for several reasons:\n\n* Any further gain from larger memory use is negligible for typical inputs smaller than 100 KB.\n\n* The compression may use more memory than the decompression: a one-shot compression may use up to 50% more memory, and the optimizer will use 50% more on top of that.\n\n* It takes time to allocate and initialize larger memory (~500 ms for 1 GB), so it is not a good choice for small inputs.\n\nThe actual memory usage can be as low as half of the specified amount due to the internal architecture; `-v` will print the actual memory usage to stderr.\n\n**Allowing the decoder to pollute the global scope** (CLI `-D|--dirty`, API `allowFreeVars` in the options object) is turned off by default because it is unsafe, especially when the Roadroller output has to coexist with other code or there are elements with single-letter `id` attributes. But if you control your environment (typical for demos), you can turn it on for a smaller decoder.\n\n**Optimize parameters** (CLI `-O|--optimize LEVEL`, API `Packer.optimize`) searches for better modelling parameters. If parameters are already given, the optimizer will try to improve upon them, and it prints the best parameters at the end, which can be reused for faster iteration. Parameters only affect the compression ratio, so you can run the optimizer as many times as you can afford. Each level does the following:\n\n* Level 0 does absolutely nothing and uses the given parameters, or the default parameters if none are given. 
This is the default when any optimizable parameters are given.\n\n* Level 1 runs a quick search with about 30 sets of parameters and takes less than 10 seconds for a typical 30 KB input. This is the default when no optimizable parameters are given, and is intended for the typical build process.\n\n* Level 2 runs a thorough search with about 300 sets of parameters and takes a minute or two. This is most useful for the release build, where you would like to save the best parameters for later use.\n\n* Level ∞ is a special option only available in the CLI (`-OO`, with two capital Ohs) and runs increasingly slower optimizations in a single run. Once the highest level is reached it runs that level forever. You need to explicitly terminate the search (e.g. CTRL-C); it will then proceed with the best parameters found so far.\n\n### Advanced Configuration\n\n**Number of context bits** (CLI `-Zco|--context-bits BITS`, API `contextBits` in the options object) sets the size of an individual model, as opposed to the total memory use (`-M`), which is a product of the number of contexts and the size of each model. This explicit option is most useful for fair benchmarking, since some parameters like `-Zpr` or `-Zmc` affect the memory use and therefore this parameter.\n\n\u003c!--**Optimize for uncompressed size** (CLI `--uncompressed-only`) assumes the absence of an outer compression algorithm like DEFLATE. This is *bad* for the compression since the compressor has to work strictly within the limits of JS source code including escape sequences. This should be the last resort where you can't even use the PNG-based self extraction and everything has to be in a single file.--\u003e\n\n\u003e The following parameters can be automatically optimized, and normally you don't have to touch them unless you want to reproduce a particular set of parameters. 
As such, the default optimization (`-O1`) is disabled if any of these arguments are given in the CLI.\n\n**Chosen contexts** (CLI `-S|--selectors SELECTOR,SELECTOR,...`, API `sparseSelectors` in the options object) determine which byte contexts are used for each model. The \u003ci\u003eK\u003c/i\u003eth bit of the number (where K \u003e 0) is set if the context contains the \u003ci\u003eK\u003c/i\u003eth-to-last byte: 5 = 101\u003csub\u003e(2)\u003c/sub\u003e for example would correspond to the context of the last byte and the third-to-last byte, also called a sparse context (0,2). There is no particular limit for the number, but Roadroller only considers up to the 9th order in the optimization process.\n\n**Precision** (CLI `-Zpr|--precision BITS`, API `precision` in the options object) is the number of fractional bits used in the internal fixed-point representation. This is shared between the entropy coder and context models and can't be decoupled. The default of 16 should be enough, but you can also try decreasing it.\n\n**Learning rate** (CLI `-Zlr|--learning-rate RATE`, API `recipLearningRate` in the options object) adjusts how fast the context mixer adapts, where smaller is faster. The default is 500, which should be fine for long enough inputs. If your demo is smaller than 10 KB you can also try smaller numbers.\n\n**Model max count** (CLI `-Zmc|--model-max-count COUNT`, API `modelMaxCount` in the options object) adjusts how fast individual contexts adapt, where smaller is faster. The model adapts fastest when a particular context is first seen, but the process becomes slower as the context is seen multiple times. This parameter limits how slow the adaptation process can get. The default of 5 is specifically tuned for JS code inputs.\n\n**Model base divisor** (CLI `-Zmd|--model-base-divisor DIVISOR`, API `modelRecipBaseCount` in the options object) adjusts how fast individual contexts adapt *initially*, where larger is faster. 
The optimal value typically ranges from 10 to 100 for JS code inputs.\n\n**Dynamic model flags** (CLI `-Zdy|--dynamic-models FLAGS`, API `dynamicModels` in the options object) enable or disable specific dynamic models, where each bit is turned on if the corresponding model is in use. There is currently one supported model:\n\n* Bit 0 (value 1) models quoted strings (', \" or \\`) and works well for source code. It assumes that all quotes are paired, so it can't be used for English text with contractions (e.g. isn't); it is turned off by default for non-JS inputs.\n\n**Number of abbreviations** (CLI `-Zab|--num-abbreviations NUM`, API `numAbbreviations` in the options object) affects the preprocessing for JS code inputs. Common identifiers and reserved words can be abbreviated to single otherwise unused bytes during the preprocessing; this lessens the burden on context modelling, which can only look at a limited number of past bytes. If this parameter is less than the number of allowable abbreviations, some identifiers will be left as is, which can sometimes improve the compression.\n\n### Tips and Tricks\n\n* The current algorithm slightly prefers 7-bit and 8-bit inputs for decoder simplicity. You can still use emojis and other tricks that stuff many bits into Unicode code points, but the compression ratio might decrease. Keep in mind that Roadroller is already doing the hard work for you, so you might not need to repeat it.\n\n* The compressed JS code doesn't do anything beyond the computation and the final action, so you can do anything before or after that. The [online demo][online], for example, inserts a sort of splash screen as a fallback.\n\n* Roadroller, while very effective for many inputs, is not a panacea. Roadroller is weaker than DEFLATE at exploiting duplication at a distance. 
Make sure to check out ADVZIP or ECT.\n\nSee also the [wiki] for more information.\n\n## Compatibility\n\nRoadroller itself and the resulting packed code are ECMAScript 2015 (ES6) compatible and should run in every modern Web browser and JS implementation. Implementations are assumed to be reasonably fast, but otherwise the code can run in slower interpreters. MSIE is not directly supported, but it works fine (slowly) after simple transpiling.\n\nRoadroller and packed codes extensively use `Math.exp` and `Math.log`, which are [implementation-approximated](https://262.ecma-international.org/#implementation-approximated), so there is a small but real possibility that they behave differently in different implementations. This is known to be a non-issue for browser JS engines as well as V8 and Node.js, as they use the same math library (fdlibm) for those functions, but you have been warned.\n\n## Internals\n\nRoadroller is made possible largely by progress in data compression algorithms as recent as the 2010s:\n\n* Bytewise [rANS] coder, adapted from Fabien Giesen's [public domain code][ryg_rans].\n\n* [Logistic context mixing], which is a type of neural network specifically designed for data compression.\n\n* Sparse context models up to the 9th order. Models are tuned for each input with simulated annealing. (You may have noticed that this entire architecture is similar to [Crinkler], but Roadroller uses a faster and possibly better parameter search algorithm.)\n\nThe minimal JS code for this algorithm was initially adapted from a [golf.horse submission](http://golf.horse/wordlist.asc/contextually-F81hkL3e5HgGOj4bhaSfXIGSI0DSTkb5n58Qqc6NFmc) by Hasegawa Sayuri (public domain). The main difference is that Roadroller implements hashed contexts and thus order-3+ context models.\n\n## License\n\nThe Roadroller compressor proper is licensed under the MIT license. 
In addition, any decoder code produced by Roadroller (that is, everything in the second line) is in the public domain.\n\n[npm]: https://www.npmjs.com/package/roadroller\n[online]: https://lifthrasiir.github.io/roadroller/\n[wiki]: https://github.com/lifthrasiir/roadroller/wiki\n\n[js13kGames]: https://js13kgames.com/\n\n[JSCrush]: http://www.iteral.com/jscrush/\n[RegPack]: https://siorki.github.io/regPack.html\n[Terser]: https://terser.org/\n[Closure Compiler]: https://closure-compiler.appspot.com/home\n[AdvanceCOMP]: http://www.advancemame.it/comp-readme\n[ECT]: https://github.com/fhanau/Efficient-Compression-Tool/\n[Crinkler]: https://github.com/runestubbe/Crinkler\n[ryg_rans]: https://github.com/rygorous/ryg_rans/\n\n[demo]: https://en.wikipedia.org/wiki/Demoscene\n[DEFLATE]: https://en.wikipedia.org/wiki/Deflate\n[Logistic context mixing]: https://en.wikipedia.org/wiki/Context_mixing#Logistic_Mixing\n[rANS]: https://en.wikipedia.org/wiki/Asymmetric_numeral_systems#Range_variants_(rANS)_and_streaming\n\n","funding_links":[],"categories":["javascript","JavaScript","js"],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flifthrasiir%2Froadroller","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Flifthrasiir%2Froadroller","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Flifthrasiir%2Froadroller/lists"}