{"id":16321469,"url":"https://github.com/doodlewind/beam","last_synced_at":"2025-04-04T08:03:20.227Z","repository":{"id":41570870,"uuid":"198213783","full_name":"doodlewind/beam","owner":"doodlewind","description":"✨ Expressive WebGL","archived":false,"fork":false,"pushed_at":"2022-11-02T10:52:10.000Z","size":2041,"stargazers_count":520,"open_issues_count":3,"forks_count":44,"subscribers_count":16,"default_branch":"master","last_synced_at":"2024-10-11T22:48:18.168Z","etag":null,"topics":["3d","frontend","graphics","image-processing","library","particles","pbr","renderer","shadow-mapping","webgl","webgl-library"],"latest_commit_sha":null,"homepage":"https://doodlewind.github.io/beam/examples.html","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/doodlewind.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2019-07-22T11:53:56.000Z","updated_at":"2024-09-30T12:20:20.000Z","dependencies_parsed_at":"2023-01-21T04:19:16.615Z","dependency_job_id":null,"html_url":"https://github.com/doodlewind/beam","commit_stats":null,"previous_names":[],"tags_count":2,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/doodlewind%2Fbeam","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/doodlewind%2Fbeam/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/doodlewind%2Fbeam/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/doodlewind%2Fbeam/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/doodlewind","download_url":"https://codeload.github.com/doodlewin
d/beam/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":247139257,"owners_count":20890196,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["3d","frontend","graphics","image-processing","library","particles","pbr","renderer","shadow-mapping","webgl","webgl-library"],"created_at":"2024-10-10T22:47:52.853Z","updated_at":"2025-04-04T08:03:20.204Z","avatar_url":"https://github.com/doodlewind.png","language":"JavaScript","readme":"# Beam\nExpressive WebGL\n\n\u003ca href=\"./package.json\"\u003e\n  \u003cimg src=\"https://img.shields.io/npm/v/beam-gl.svg?maxAge=300\u0026color=2254f4\"/\u003e\n\u003c/a\u003e\n\u003ca href=\"./package.json\"\u003e\n  \u003cimg src=\"https://img.shields.io/bundlephobia/min/beam-gl\"/\u003e\n\u003c/a\u003e\n\u003ca href=\"./src/index.d.ts\"\u003e\n  \u003cimg src=\"https://img.shields.io/npm/types/beam-gl\"/\u003e\n\u003c/a\u003e\n\u003ca href=\"./package.json\"\u003e\n  \u003cimg src=\"https://img.shields.io/maintenance/yes/2023\"/\u003e\n\u003c/a\u003e\n\n![beam-logo](./gallery/assets/images/beam-logo.png)\n\nArticles in Chinese:\n\n* [How to Design a WebGL Base Library](https://zhuanlan.zhihu.com/p/93500639)\n* [A Practical Introduction to WebGL Image Processing](https://zhuanlan.zhihu.com/p/100388037)\n* [Conway's Game of Life in WebGL](https://zhuanlan.zhihu.com/p/197675822)\n\n## Introduction\nBeam is a tiny (~10KB) WebGL library. It's **NOT** a renderer or 3D engine by itself. 
Instead, Beam provides some essential abstractions, allowing you to build WebGL infrastructures within a very small and easy-to-use API surface.\n\nThe WebGL API is known to be verbose, with a steep learning curve. Just like how jQuery simplifies DOM operations, Beam wraps WebGL in a succinct way, making it easier to build WebGL renderers with clean and terse code.\n\nHow is this possible? Instead of just reorganizing boilerplate code, Beam defines some essential concepts on top of WebGL, which are much easier to understand and use. These highly simplified concepts include:\n\n* **Shaders** - Objects containing graphics algorithms. In contrast to JavaScript, which runs single-threaded on the CPU, shaders run in parallel on the GPU, computing colors for millions of pixels every frame.\n* **Resources** - Objects containing graphics data. Just like JSON in your web app, resources are the data passed to shaders, which mainly include triangle arrays (aka buffers), image textures, and global options (uniforms).\n* **Draw** - Requests for running shaders with resources. To render a scene, different shaders and resources may be used. You are free to combine them, firing multiple draw calls that eventually compose a frame. In fact, each draw call runs the graphics render pipeline once.\n\nSo there are only 3 concepts to learn, represented by 3 core APIs in Beam: **beam.shader**, **beam.resource** and **beam.draw**. With only these 3 methods, you can render a frame with WebGL.\n\nIf you are a beginner, you can check out the tutorial below to get started. For API definitions, please refer to [index.d.ts](./src/index.d.ts).\n\n## Installation\n``` bash\nnpm install beam-gl\n```\n\nOr you can clone this repository and start a static HTTP server to try it out. 
Beam runs directly in modern browsers, without any need to build or compile.\n\n## Hello World with Beam\nNow we are going to write the simplest possible WebGL app with Beam, which renders a colorful triangle:\n\n![beam-hello-world](./gallery/assets/images/beam-hello-world.png)\n\nHere is the code snippet:\n\n``` js\nimport { Beam, ResourceTypes } from 'beam-gl'\nimport { MyShader } from './my-shader.js'\nconst { VertexBuffers, IndexBuffer } = ResourceTypes\n\n// Remember to create a `\u003ccanvas\u003e` element in HTML\nconst canvas = document.querySelector('canvas')\n// Init Beam instance\nconst beam = new Beam(canvas)\n\n// Init shader for triangle rendering\nconst shader = beam.shader(MyShader)\n\n// Init vertex buffer resource\nconst vertexBuffers = beam.resource(VertexBuffers, {\n  position: [\n    -1, -1, 0, // vertex 0, bottom left\n    0, 1, 0, // vertex 1, top middle\n    1, -1, 0 // vertex 2, bottom right\n  ],\n  color: [\n    1, 0, 0, // vertex 0, red\n    0, 1, 0, // vertex 1, green\n    0, 0, 1 // vertex 2, blue\n  ]\n})\n// Init index buffer resource with 3 indices\nconst indexBuffer = beam.resource(IndexBuffer, {\n  array: [0, 1, 2]\n})\n\n// Clear the screen, then draw with shader and resources\nbeam\n  .clear()\n  .draw(shader, vertexBuffers, indexBuffer)\n```\n\nNow let's take a look at some pieces of code in this example. First we need to init a Beam instance with a canvas:\n\n``` js\nconst canvas = document.querySelector('canvas')\nconst beam = new Beam(canvas)\n```\n\nThen we can init a shader with `beam.shader`. The content of `MyShader` will be explained later:\n\n``` js\nconst shader = beam.shader(MyShader)\n```\n\nFor the triangle, use the `beam.resource` API to create its data, which is contained in different buffers. Beam uses the `VertexBuffers` type to represent them. There are 3 vertices in the triangle, and each vertex has two attributes: **position** and **color**. 
Every vertex attribute has its own vertex buffer, which can be declared as a flat, plain JavaScript array (or TypedArray). Beam will upload this data to the GPU behind the scenes:\n\n``` js\nconst vertexBuffers = beam.resource(VertexBuffers, {\n  position: [\n    -1, -1, 0, // vertex 0, bottom left\n    0, 1, 0, // vertex 1, top middle\n    1, -1, 0 // vertex 2, bottom right\n  ],\n  color: [\n    1, 0, 0, // vertex 0, red\n    0, 1, 0, // vertex 1, green\n    0, 0, 1 // vertex 2, blue\n  ]\n})\n```\n\nVertex buffers usually contain a compact dataset. We can define a subset or superset of it to render, which reduces redundancy and lets us reuse more vertices. To do that we need to introduce another type of buffer called `IndexBuffer`, which contains indices of the vertices in `vertexBuffers`:\n\n``` js\nconst indexBuffer = beam.resource(IndexBuffer, {\n  array: [0, 1, 2]\n})\n```\n\n\u003e In this example, each index refers to a span of 3 elements in the vertex array.\n\nFinally we can render with WebGL. `beam.clear` clears the frame, then the chainable `beam.draw` draws with **one shader object and multiple resource objects**:\n\n``` js\nbeam\n  .clear()\n  .draw(shader, vertexBuffers, indexBuffer)\n```\n\nThe `beam.draw` API is flexible: if you have multiple shaders and resources, just combine them to make draw calls as you wish, composing a complex scene:\n\n``` js\nbeam\n  .draw(shaderX, ...resourcesA)\n  .draw(shaderY, ...resourcesB)\n  .draw(shaderZ, ...resourcesC)\n```\n\nThere's one missing piece: how do we specify the render algorithm for the triangle? 
This is done in the `MyShader` variable, which is a schema for the shader object, and it looks like this:\n\n``` js\nimport { SchemaTypes } from 'beam-gl'\n\nconst vertexShader = `\nattribute vec4 position;\nattribute vec4 color;\nvarying highp vec4 vColor;\nvoid main() {\n  vColor = color;\n  gl_Position = position;\n}\n`\nconst fragmentShader = `\nvarying highp vec4 vColor;\nvoid main() {\n  gl_FragColor = vColor;\n}\n`\n\nconst { vec4 } = SchemaTypes\nexport const MyShader = {\n  vs: vertexShader,\n  fs: fragmentShader,\n  buffers: {\n    position: { type: vec4, n: 3 },\n    color: { type: vec4, n: 3 }\n  }\n}\n```\n\nThis shows a simple shader schema in Beam, made of a vertex shader string, a fragment shader string, and other schema fields. In brief, the vertex shader is executed once per vertex, and the fragment shader is executed once per pixel. They are written in the GLSL shader language. In WebGL, the vertex shader always writes to `gl_Position` as its output, and the fragment shader writes to `gl_FragColor` for the final pixel color. The `vColor` varying variable is interpolated and passed from the vertex shader to the fragment shader, and the `position` and `color` vertex attribute variables correspond to the buffer keys in `vertexBuffers`. That's a convention to simplify boilerplate.\n\n## Build Something Bigger\nNow we know how to render a triangle with Beam. What's next? Here is a very brief guide showing how we can use Beam to handle more complex WebGL scenarios:\n\n### Render 3D Graphics\nThe \"Hello World\" triangle we have drawn is just a 2D shape. How about boxes, balls, and other complex 3D models? They just need a few more vertices and some shader setup. Let's see how to render the following 3D ball in Beam:\n\n![basic-ball](./gallery/assets/images/basic-ball.png)\n\n3D graphics are composed of triangles, which are in turn composed of vertices. 
In the triangle example, every vertex had two attributes: **position** and **color**. For a basic 3D ball, we need **position** and **normal** instead. The normal attribute contains the vector perpendicular to the ball's surface at that position, which is critical for computing lighting.\n\nMoreover, to transform a vertex from 3D space to 2D screen coordinates, we need a \"camera\", which is composed of matrices. For each vertex passed to the vertex shader, we should apply the same transform matrices. These matrix variables are \"global\" to all shaders running in parallel, and are called **uniforms** in WebGL. `Uniforms` is also a resource type in Beam, containing multiple global options for shaders, like camera positions, line colors, effect strength factors and so on.\n\nSo to render the simplest ball, we can reuse exactly the same fragment shader as in the triangle example, and just update the vertex shader string as follows:\n\n``` glsl\nattribute vec4 position;\nattribute vec4 normal;\n\n// Transform matrices\nuniform mat4 modelMat;\nuniform mat4 viewMat;\nuniform mat4 projectionMat;\n\nvarying highp vec4 vColor;\n\nvoid main() {\n  gl_Position = projectionMat * viewMat * modelMat * position;\n  vColor = normal; // visualize normal vector\n}\n```\n\nSince we have added uniform variables in the shader, the schema should also be updated, with a new `uniforms` field:\n\n``` js\nconst identityMat = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1]\nconst { vec4, mat4 } = SchemaTypes\n\nexport const MyShader = {\n  vs: vertexShader,\n  fs: fragmentShader,\n  buffers: {\n    position: { type: vec4, n: 3 },\n    normal: { type: vec4, n: 3 }\n  },\n  uniforms: {\n    // The default field is handy for reducing boilerplate\n    modelMat: { type: mat4, default: identityMat },\n    viewMat: { type: mat4 },\n    projectionMat: { type: mat4 }\n  }\n}\n```\n\nThen we can still write expressive WebGL render code:\n\n``` js\nconst beam = new Beam(canvas)\n\nconst shader = 
beam.shader(NormalColor)\nconst cameraMats = createCamera({ eye: [0, 10, 10] })\nconst ball = createBall()\n\nbeam.clear().draw(\n  shader,\n  beam.resource(VertexBuffers, ball.vertex),\n  beam.resource(IndexBuffer, ball.index),\n  beam.resource(Uniforms, cameraMats)\n)\n```\n\nAnd that's all. See the [Basic Ball](./gallery/pages/basic-graphics/basic-ball.js) page for a working example.\n\n\u003e Beam is a WebGL library without 3D assumptions, so graphics objects and matrix algorithms are not part of it. For convenience some related utils ship with the Beam examples, but don't expect them to be rigorous.\n\n### Animate Graphics\nHow do we move a graphics object in WebGL? Certainly you can update the buffers with new positions, but that can be quite slow. Another solution is to just update the transform matrices we mentioned above, which are uniforms: very small pieces of data.\n\nWith the `requestAnimationFrame` API, we can easily zoom the ball we rendered before:\n\n``` js\nconst beam = new Beam(canvas)\n\nconst shader = beam.shader(NormalColor)\nconst ball = createBall()\nconst buffers = [\n  beam.resource(VertexBuffers, ball.vertex),\n  beam.resource(IndexBuffer, ball.index)\n]\nlet i = 0; let d = 10\nconst cameraMats = createCamera({ eye: [0, d, d] })\nconst camera = beam.resource(Uniforms, cameraMats)\n\nconst tick = () =\u003e {\n  i += 0.02\n  d = 10 + Math.sin(i) * 5\n  const { viewMat } = createCamera({ eye: [0, d, d] })\n\n  // Update uniform resource\n  camera.set('viewMat', viewMat)\n\n  beam.clear().draw(shader, ...buffers, camera)\n  requestAnimationFrame(tick)\n}\ntick() // Begin render loop\n```\n\nThe `camera` variable is a `Uniforms` resource instance in Beam, whose data is stored in key-value pairs. You are free to add or modify different uniform keys. 
When `beam.draw` is fired, only the keys that match the shader will be uploaded to the GPU.\n\nSee the [Zooming Ball](./gallery/pages/basic-graphics/zooming-ball.js) page for a working example.\n\n\u003e Buffer resources also support `set()` in a similar way. Make sure you know what you are doing, since this can be slow for heavy workloads in WebGL.\n\n### Render Images\nWe have met the `VertexBuffers`, `IndexBuffer` and `Uniforms` resource types in Beam. If we want to render an image, we need the last critical resource type, which is `Textures`. A basic example would be a 3D box with an image, like this:\n\n![basic-texture](./gallery/assets/images/basic-texture.png)\n\nFor graphics with a texture, besides **position** and **normal**, we need an extra **texCoord** attribute, which aligns the image to the graphics at that position and is also interpolated in the fragment shader. See the new vertex shader:\n\n``` glsl\nattribute vec4 position;\nattribute vec4 normal;\nattribute vec2 texCoord;\n\nuniform mat4 modelMat;\nuniform mat4 viewMat;\nuniform mat4 projectionMat;\n\nvarying highp vec2 vTexCoord;\n\nvoid main() {\n  vTexCoord = texCoord;\n  gl_Position = projectionMat * viewMat * modelMat * position;\n}\n```\n\nAnd the new fragment shader:\n\n``` glsl\nuniform sampler2D img;\n\nvarying highp vec2 vTexCoord;\n\nvoid main() {\n  gl_FragColor = texture2D(img, vTexCoord);\n}\n```\n\nNow we need a new shader schema with a `textures` field:\n\n``` js\nconst { vec4, vec2, mat4, tex2D } = SchemaTypes\nexport const MyShader = {\n  vs: vertexShader,\n  fs: fragmentShader,\n  buffers: {\n    position: { type: vec4, n: 3 },\n    texCoord: { type: vec2 }\n  },\n  uniforms: {\n    modelMat: { type: mat4, default: identityMat },\n    viewMat: { type: mat4 },\n    projectionMat: { type: mat4 }\n  },\n  textures: {\n    img: { type: tex2D }\n  }\n}\n```\n\nAnd finally let's check out the render logic:\n\n``` js\nconst beam = new 
Beam(canvas)\n\nconst shader = beam.shader(MyShader)\nconst cameraMats = createCamera({ eye: [10, 10, 10] })\nconst box = createBox()\n\nloadImage('prague.jpg').then(image =\u003e {\n  const imageState = { image, flip: true }\n  beam.clear().draw(\n    shader,\n    beam.resource(VertexBuffers, box.vertex),\n    beam.resource(IndexBuffer, box.index),\n    beam.resource(Uniforms, cameraMats),\n    // The 'img' key is defined to match the shader\n    beam.resource(Textures, { img: imageState })\n  )\n})\n```\n\nThat's all for basic texture resource usage. Since we have direct access to the image shaders, we can also easily add image processing effects with Beam.\n\nSee the [Image Box](./gallery/pages/basic-graphics/image-box.js) page for a working example.\n\n\u003e You are free to replace `createBox` with `createBall` and see the difference.\n\n### Render Multi Objects\nHow do we render different graphics objects? Let's see the flexibility of the `beam.draw` API:\n\n![multi-graphics](./gallery/assets/images/multi-graphics.png)\n\nTo render multiple balls and boxes, we only need 2 groups of `VertexBuffers` and `IndexBuffer`: one for the ball and one for the box:\n\n``` js\nconst shader = beam.shader(MyShader)\nconst ball = createBall()\nconst box = createBox()\nconst ballBuffers = [\n  beam.resource(VertexBuffers, ball.vertex),\n  beam.resource(IndexBuffer, ball.index)\n]\nconst boxBuffers = [\n  beam.resource(VertexBuffers, box.vertex),\n  beam.resource(IndexBuffer, box.index)\n]\n```\n\nThen in a `for` loop, we can easily draw them with different uniform options. 
By changing `modelMat` before `beam.draw`, we can update an object's position in world space, so that the box and ball can both appear on screen multiple times:\n\n``` js\nconst cameraMats = createCamera(\n  { eye: [0, 50, 50], center: [10, 10, 0] }\n)\nconst camera = beam.resource(Uniforms, cameraMats)\nconst baseMat = mat4.create()\n\nconst render = () =\u003e {\n  beam.clear()\n  for (let i = 1; i \u003c 10; i++) {\n    for (let j = 1; j \u003c 10; j++) {\n      const modelMat = mat4.translate(\n        [], baseMat, [i * 2, j * 2, 0]\n      )\n      camera.set('modelMat', modelMat)\n      const resources = (i + j) % 2\n        ? ballBuffers\n        : boxBuffers\n\n      beam.draw(shader, ...resources, camera)\n    }\n  }\n}\n\nrender()\n```\n\nThe `render` function begins with `beam.clear`, then we're free to use `beam.draw` calls to build up complex render logic.\n\nSee the [Multi Graphics](./gallery/pages/basic-graphics/multi-graphics.js) page for a working example.\n\n### Offscreen Rendering\nIn WebGL we use a framebuffer object for offscreen rendering, which renders the output to a texture. To do that, Beam provides a corresponding `beam.target` API, which automatically creates such a target with a texture attached. 
We can explicitly `use` this target and smoothly make any `beam.draw` call render into this texture.\n\nSay the default render logic looks something like this:\n\n``` js\nbeam\n  .clear()\n  .draw(shaderX, ...resourcesA)\n  .draw(shaderY, ...resourcesB)\n  .draw(shaderZ, ...resourcesC)\n```\n\nWith the `target.use` method, this render logic can simply be nested in a function scope like this:\n\n``` js\n// Prepare an offscreen target with a 2048x2048 color texture attached\nconst target = beam.target(2048, 2048)\nbeam.clear()\n// Draw into this texture\ntarget.use(() =\u003e {\n  beam\n    .draw(shaderX, ...resourcesA)\n    .draw(shaderY, ...resourcesB)\n    .draw(shaderZ, ...resourcesC)\n})\n\n// The texture attached to the target can now be used in subsequent draws\nmyTextures.set('img', target.texture)\n// ...\n```\n\nThis redirects the render output to the offscreen texture resource.\n\nSee the [Basic Mesh](./gallery/pages/offscreen/basic-mesh.js) page for a working example.\n\n### Advanced Render Techniques\nFor realtime rendering, physically based rendering (PBR) and shadow mapping are two major advanced techniques. Beam demonstrates basic support for them in its examples, like these PBR material balls:\n\n![pbr-balls](./gallery/assets/images/pbr-balls.png)\n\nThese examples focus on readability rather than completeness. 
To get started, check out:\n\n* The [Material Ball](./gallery/pages/3d-models/material-ball.js) page for a working PBR example.\n* The [Basic Shadow](./gallery/pages/offscreen/basic-shadow.js) page for a working shadow mapping example.\n\n## More Examples\nSee [Beam Examples](https://doodlewind.github.io/beam/examples.html) for more versatile WebGL snippets based on Beam, including:\n\n* Render multiple 3D objects\n* Mesh loading\n* Texture config\n* Classic lighting\n* Physically based rendering (PBR)\n* Chainable image filters\n* Offscreen rendering (using FBO)\n* Shadow mapping\n* Basic particles\n* WebGL extension config\n* Customize your renderers\n\nPull requests for new examples are also welcome :)\n\n## License\nMIT\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdoodlewind%2Fbeam","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdoodlewind%2Fbeam","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdoodlewind%2Fbeam/lists"}