{"id":15314097,"url":"https://github.com/dennissmolek/denoiser","last_synced_at":"2025-06-16T06:07:20.018Z","repository":{"id":247548706,"uuid":"819885548","full_name":"DennisSmolek/Denoiser","owner":"DennisSmolek","description":"AI Denoising in the browser. Based on OIDN, powered by tensorflow.js","archived":false,"fork":false,"pushed_at":"2024-09-24T03:00:49.000Z","size":25532,"stargazers_count":33,"open_issues_count":12,"forks_count":0,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-06-10T00:38:15.315Z","etag":null,"topics":["denoising","oidn","tensorflowjs","typescript"],"latest_commit_sha":null,"homepage":"https://webgpu-basic.vercel.app/","language":"TypeScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/DennisSmolek.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null}},"created_at":"2024-06-25T11:22:45.000Z","updated_at":"2025-04-30T00:07:34.000Z","dependencies_parsed_at":"2025-04-13T12:33:38.680Z","dependency_job_id":null,"html_url":"https://github.com/DennisSmolek/Denoiser","commit_stats":null,"previous_names":["dennissmolek/oidnflow.js","dennissmolek/denoiser"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/DennisSmolek/Denoiser","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DennisSmolek%2FDenoiser","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DennisSmolek%2FDenoiser/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DennisSmolek%2FDenoiser/releases","manifests_url":"https://repos.ecosys
te.ms/api/v1/hosts/GitHub/repositories/DennisSmolek%2FDenoiser/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/DennisSmolek","download_url":"https://codeload.github.com/DennisSmolek/Denoiser/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/DennisSmolek%2FDenoiser/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":260109505,"owners_count":22960033,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["denoising","oidn","tensorflowjs","typescript"],"created_at":"2024-10-01T08:44:26.690Z","updated_at":"2025-06-16T06:07:19.982Z","avatar_url":"https://github.com/DennisSmolek.png","language":"TypeScript","readme":"![title-card-resized](https://github.com/DennisSmolek/Denoiser/assets/1397052/ffe87fd5-00e6-464e-b8a2-ba80402b9d2f)\n\n## AI Denoising that runs in the browser.\n\n#### Based on Open Image Denoise and powered by TensorFlow.js\n\n\n\nBecause of TensorFlow’s broad support, Denoiser works in almost any JavaScript environment, but it runs fastest where it can use WebGL and is most advanced on WebGPU.\n\n\n### Basic JavaScript Example\n```ts\nimport { Denoiser } from \"denoiser\";\n//* Elements\nconst noisey = document.getElementById(\"noisey-img\");\nconst outputCanvas = document.getElementById(\"output-canvas\");\nconst button = document.getElementById(\"execute-button\");\n\nconst denoiser = new Denoiser();\n// set the canvas for quick denoising output\ndenoiser.setCanvas(outputCanvas);\n\n// function to denoise the image when clicked\nasync function doDenoise() 
{\n\tawait denoiser.execute(noisey);\n}\n\n//* add a click listener to the button\nbutton.addEventListener(\"click\", doDenoise);\n```\n[See This Example Live](example.com/link)\n\n## Overview\n\n[Open Image Denoise (OIDN)](https://github.com/RenderKit/oidn) is a native neural-network denoiser designed specifically for path tracing, to get the highest-quality results from the lowest number of samples. \n\nDenoiser uses the same model structure (UNet) and OIDN’s pre-trained weights, with TensorFlow.js, to offer high-quality denoising in the browser.\n\nWe take a variety of different inputs (Images, ImgData, Canvas Data, WebGLTextures, GPUBuffers, ArrayBuffers, etc.) and convert them into TensorFlow tensors.\n\nWe then pre-process and bundle the inputs to ready them for execution.\n\nThen we build a custom model at runtime based on the inputs and configuration, run the model, and ready the results.\n\nIf the size of the images/data is too large, we automatically tile/batch the input to reduce GPU memory load.\n\nWith the results from the model, Denoiser can immediately render them to a canvas, return a new image, output a texture, a WebGPU Buffer, etc., 
based on your choices.\n\n\nUsing TensorFlow, we keep as many of these operations as possible on the GPU, making things incredibly fast.\n\nDenoiser has a ton of options and configurations for different needs and platforms; I’ll do my best to document them.\n\n### Install:\n```sh\nyarn add denoiser\n```\n#### Including the weights\n\nI tried many things to do this automatically, but my options were to make the library size huge with the weights bundled, or to have the user load the weights on their own.\n\nI have bundled the weights in `node_modules`, and you can make a [script to copy them]() in your build process.\n\nOr you can just copy the weights you will use [from here]() and put them into a folder named `tzas`.\n\nBy default, `Denoiser` will look for these in the root path, so `https://yourapp.com/tzas`.\n\nIf you are using Vite, this just means creating a `tzas` folder in your `public` folder and putting whatever weights you might use there. \n*(all of the examples use this method)*\n\n#### If your path isn't on the root, or you want to load weights from a URL, you can do that too:\n```ts\ndenoiser.weightPath = 'path/from/root/tzas';\n// or override completely with a URL\ndenoiser.weightUrl = 'https://whereyourweightsare.com/tzas';\n```\n### Getting Set Up\nThere are a [ton of options](#denoiser) and inputs that control how things flow through the denoiser. You probably need none of them.\n\n1. [Install](#install) and put the weights in the root folder.\n2. Import and create the denoiser:\n```ts\nimport { Denoiser } from 'denoiser';\nconst denoiser = new Denoiser();\n```\n**DONE**\n\nSeriously, most other things are taken care of automatically.\nThe only adjustment you might want to make is the quality.\n\nThe default `quality` is `'fast'`, which will work for 90% of what you do. You can also set `'balanced'`, which is 10-200% slower, without that much difference in quality.\n\nAs of 7/24 we don't support `'high'`, as OIDN lies about this anyway. 
Even if you set it, unless all the other props are set, it actually just runs `'balanced'`. I'll start supporting it [when I add other UNets](#Unets).\n\nThe other possible option you could use is `hdr`. This changes how the model loads and is mostly used with textures/buffers if you know what you are doing. If you're using HDR data, be sure to set this. *(note: if you are using images as input, you absolutely don't need hdr)*\n\n#### Standard Options example\n```ts\ndenoiser.quality = 'balanced';\ndenoiser.hdr = true;\n// these can be set too, to help with WebGL/WebGPU modes\ndenoiser.height = 720;\ndenoiser.width = 1280;\n```\nCheck out setting up with [WebGL](#webgl) and [WebGPU](#webgpu) for more advanced users.\n\nFull list of [Denoiser Props](#denoiser)\n\n***\n\n### Getting Data In\nThere are two ways of getting data into the denoiser: setting an input **BEFORE** execution with an explicit set function, or setting the input **DURING** execution, using `inputMode` to indicate how you want the inputs handled.\n\n#### execute(*color?, albedo?, normal?*)\n`Denoiser.execute()` is the most common input and output method, combining the explicit input/output handlers into one action.\n\nUsing `inputMode` and `outputMode`, you can inform the denoiser what kind of data you will send and expect when you run `execute()`.\n\n*By default, `inputMode` and `outputMode` are set to `imgData`.*\n\n```ts\nconst noisey = document.getElementById(\"noisey-img\");\nconst albedo = document.getElementById(\"albedo-img\");\n\ndenoiser.outputMode = 'webgpu';\nconst outputBuffer = await denoiser.execute(noisey, albedo);\n// send the buffer to the renderer directly\nrenderer.renderBuffer(outputBuffer);\n```\n#### inputMode/outputMode\n```ts\ninputMode: 'imgData' | 'webgl' | 'webgpu' | 'tensor';\noutputMode: 'imgData' | 'webgl' | 'webgpu' | 'tensor' | 'float32';\n```\n---\n\n#### ImageData *(default)*\n\nThe default way to get data in is using `ImgData` from a canvas, or even just passing an HTML 
`Image` object directly.\n\n```ts\nconst noisey = document.getElementById(\"noisey-img\");\n\n// using the execute function\nconst output = await denoiser.execute(noisey);\n\n// using setImage\ndenoiser.setImage('color', noisey);\nconst output2 = await denoiser.execute();\n\n```\n\n#### setImage(name, data)\nname: `'color' | 'albedo' | 'normal'`\n\ndata: `PixelData|ImageData|HTMLImageElement|HTMLCanvasElement|HTMLVideoElement|ImageBitmap`\n\n`color` and `albedo` inputs are regular (non-normalized) RGB(A) inputs (sRGB colorspace).\n\n`normal` is linear, and as a saved image is assumed to be in 0-255. Note: OIDN expects normals normalized to [-1, 1], so we transform these for you.\n\n*NOTE: TensorFlow automatically strips all alpha channels from image data. When we return it, all alpha will be set to 100%. If you need alpha output, consider a different method of input.*\n\n---\n#### WebGL Texture\n##### *Make sure to read: [Running in a shared WebGL Context]()*\n*(Note: WebGL is being weird at the moment)*\n#### setInputTexture(`name, texture, height, width, channels?`)\nIf the denoiser is running in the same context, you can pass a `WebGLTexture` directly as input without needing to sync the CPU/GPU.\n\nTextures are expected to already be normalized: [0, 1] for `color` and `albedo`, and [-1, 1] for `normal`.\n\nBe sure to set the height and width of the texture; we can't determine this from the data alone. 
You can also set these directly on the denoiser beforehand.\n\nWe assume the data will have an alpha channel, and will actually parse this data back onto the texture when returning it (it will NOT be denoised). If your texture doesn't have alpha, set the channels to 3.\n\n```ts\n// ** somewhere inside a loop **\nconst texture = frameBuffer.texture;\ndenoiser.setInputTexture('color', texture, renderer.height, renderer.width);\n\n// If you want to use textures with execute\ndenoiser.inputMode = 'webgl'; // only has to be set once, anytime before execution\nconst { colorTexture, albedoTexture, normalTexture } = renderer.getMRTOutput();\nconst outputTexture = await denoiser.execute(colorTexture, albedoTexture, normalTexture);\n// render the results\nrenderer.renderTexture(outputTexture);\n```\n---\n#### WebGPU Buffer\n##### *Make sure to read: [Running WebGPU](#webgpu)*\nWebGPU is the easiest advanced setup to get data in/out of the neural network without causing a CPU/GPU sync.\n\nBuffers are expected to already be normalized: [0, 1] for `color` and `albedo`, and [-1, 1] for `normal`.\n\nBe sure to set the height and width, as we can't determine that from the data. You can also set these directly on the denoiser beforehand.\n\n\u003e [!WARNING]\n\u003e Also a note about `usage`: I got errors until I made sure `GPUBufferUsage.COPY_DST` was set. I'm not sure what TensorFlow is doing with the buffer that it needs to write to it, but it will throw errors without this flag.\n\n\n\n```ts\n// setup code\nconst denoiser = new Denoiser('webgpu', renderer.device);\ndenoiser.height = 720;\ndenoiser.width = 1280;\n\n/* Somewhere deep in your render code */\n// Create an output buffer\nconst outputBuffer = device.createBuffer({\n    size: 1280 * 720 * 4 * 4, // 1280x720 pixels, 4 channels (RGBA), 4 bytes per float\n    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC | GPUBufferUsage.COPY_DST,\n    label: 'processBuffer'\n});\n/* Do Fancy WebGPU Stuff */\n// all done on the GPU, send it to denoise. 
(note: no need to set height/width if pre-set)\ndenoiser.setInputBuffer('color', outputBuffer);\n\n//* Different example with execute -----------------------------\n\ndenoiser.inputMode = 'webgpu'; // only needs to be set once\nconst inBuffer = renderer.getLatestBuffer();\ndenoiser.execute(inBuffer);\n```\n\n\n---\n#### Data Buffer\n##### setData(`name, Float32Array | Uint8Array, height, width, channels?`)\nYou probably don't actually need this.\nA case where you would is loading HDR data on the CPU, like with three.js's [RGBELoader](https://github.com/mrdoob/three.js/blob/master/examples/jsm/loaders/RGBELoader.js), and then getting the data out as a `Float32Array`; or if you are using some sort of camera input. \n```ts\nconst rawData = system.getRawOutput();\ndenoiser.setData('color', rawData, system.height, system.width);\n\n// the full signature:\n// setData(name: 'color' | 'albedo' | 'normal', data: Float32Array | Uint8Array, height: number, width: number, channels = 4)\n```\n---\n\n### Getting Data Out\nJust like with input, there are two main methods for getting data out of the denoiser. You can use the output of the `execute()` function with `outputMode` set to whatever you like, or you can create an `executionListener` that will fire on every execution and output whichever way you set, regardless of `outputMode`.\n\nThere is also the super easy `setCanvas()`, which is fine for many cases.\n\n#### setCanvas(`HTMLCanvasElement`)\nThis dumps the outputTensor directly to a canvas on every execution. This is faster than pulling the data and drawing it yourself, as TensorFlow draws the canvas directly with the tensor data.\n\nI use this often for debugging, as it's guaranteed to draw exactly what the `outputTensor` holds. Very useful for testing renderers. 
\n\nIt also runs regardless of `outputMode`.\n```ts\nconst noisey = document.getElementById(\"noisey-img\");\nconst outputCanvas = document.getElementById(\"output-canvas\");\n\nconst denoiser = new Denoiser();\n// set the canvas for quick denoising output\ndenoiser.setCanvas(outputCanvas);\n\n// we don't care about the output data, just be sure noisey is loaded. You might wrap this in an onLoaded to be 100% sure\ndenoiser.execute(noisey);\n\n// To deactivate it, you have to set:\ndenoiser.outputToCanvas = false;\n```\n---\n#### execute()\nExecute is the primary way to get data in/out of the denoiser, but not the only way.\nIf you are only calling the denoiser once and you are handling input/output in the same place, execute is the way to go.\n\nIt's particularly useful for things that need to bracket the denoise step, like timers or saving/restoring state around the denoiser run.\n\n```ts\n//* Basic Example ===\n// load\nconst noisy = renderer.getCanvasInfo();\n// denoise\nconst imgData = await denoiser.execute(noisy);\n// draw\nrenderer.drawToCanvas(imgData);\n\n//* Advanced Bracketing Example ===\ndenoiser.outputMode = 'webgl';\ndenoiser.height = 720;\ndenoiser.width = 1280;\nconst startTime = performance.now();\n// load\nconst colorTexture = renderer.getColorTexture();\n// advanced thing, don't worry about it\ndenoiser.restoreWebGLState();\n// denoise\nconst outputTexture = await denoiser.execute(colorTexture);\ndenoiser.saveWebGLState();\n// draw\nrenderer.drawTexture(outputTexture);\nstatsOutput('renderDenoised', startTime, performance.now());\n```\n---\n\n#### onExecute(`callback(output), outputMode`)\nAttaching a listener to the denoiser lets you decouple input, execution, and output in a much more flexible way.\n\nYou can add as many listeners as you want, each with its own `outputMode`, meaning you could, for example, send one to a compute shader and another to a renderer to compare results.\n\nAdding a listener returns a function which will 
remove the listener.\n\nNOTE: Using listeners disables the output when you call `execute()`, to avoid costly output handling.\n\n#### Decoupled Handling Example:\n```ts\nconst device = renderer.getDevice();\nconst denoiser = new Denoiser('webgpu', device);\n// attach execution listener now\ndenoiser.onExecute((outputBuffer) =\u003e {\n    // draw\n    renderer.renderBuffer(outputBuffer);\n}, 'webgpu');\n\n\nasync function whenSomethingHappens(url: string) {\n    const buffer = await renderer.makeBuffer(url);\n    // set the input on the denoiser\n    denoiser.setInputBuffer('color', buffer, renderer.height, renderer.width);\n}\n\nfunction doDenoise() {\n    denoiser.execute();\n}\n\nloadButton.addEventListener('pointerdown', () =\u003e whenSomethingHappens(input.value));\ndenoiseButton.addEventListener('pointerdown', doDenoise);\n```\n\n---\n\n### Other UNets\nWhy did I change the name, and why isn't this just an OIDN for the web?\n\nNow that I have a UNet operating and am much more comfortable with TensorFlow, there are [other more advanced/modern UNets](https://github.com/cszn/SCUNet) that I think could be used in conjunction with the OIDN UNet.\n\nOIDN is specifically designed for pathtracers and 3D renderers with MRT outputs of albedo and normal data. The noise is almost always uniformly black, which means noise due to sensor data, compression, or other sources is often replicated. \n\nThere are also many methods to potentially speed up the denoising/handling that would be very different from the core OIDN UNets. 
Therefore, I didn't feel right calling it OIDNFlow, OIWDN, etc., as it would be fundamentally different.\n\nOnce I have other UNets/models in the works, I will add the \"Large\" UNet that is required for \"high\" quality.\n\n\n\n\nThe problem is that, as of right now, it has almost no mobile support and limited desktop support.\n\nAlso *(although untested)*, it was reported that for TensorFlow, WebGL is actually still slightly faster to execute.\n\n\nSuper advanced future stuff:\n\n\n(I haven't exposed the option to override with a single `tensorMap`)","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdennissmolek%2Fdenoiser","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdennissmolek%2Fdenoiser","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdennissmolek%2Fdenoiser/lists"}