{"id":15468149,"url":"https://github.com/lesnitsky/webgl-month","last_synced_at":"2025-04-09T21:20:14.569Z","repository":{"id":85541865,"uuid":"194667033","full_name":"lesnitsky/webgl-month","owner":"lesnitsky","description":"🎓 Daily WebGL tutorials","archived":false,"fork":false,"pushed_at":"2024-08-28T07:21:47.000Z","size":944,"stargazers_count":224,"open_issues_count":4,"forks_count":14,"subscribers_count":10,"default_branch":"dev","last_synced_at":"2025-04-02T13:58:06.389Z","etag":null,"topics":["3d-rendering","beginner","code-samples","javascript","tutorials","webgl"],"latest_commit_sha":null,"homepage":"https://dev.to/lesnitsky","language":"JavaScript","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":null,"status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/lesnitsky.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":null,"code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-07-01T12:18:08.000Z","updated_at":"2024-12-16T20:30:04.000Z","dependencies_parsed_at":null,"dependency_job_id":"c70dc012-c40d-47db-ac92-42046358f74d","html_url":"https://github.com/lesnitsky/webgl-month","commit_stats":{"total_commits":39,"total_committers":1,"mean_commits":39.0,"dds":0.0,"last_synced_commit":"71bc84d4c78482ce9eb475cf65123943303cdd06"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lesnitsky%2Fwebgl-month","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lesnitsky%2Fwebgl-month/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/lesnitsky%2Fwebgl-month/releases","manifests_url":"https://repos.ecosyste.
ms/api/v1/hosts/GitHub/repositories/lesnitsky%2Fwebgl-month/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/lesnitsky","download_url":"https://codeload.github.com/lesnitsky/webgl-month/tar.gz/refs/heads/dev","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":248112226,"owners_count":21049620,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["3d-rendering","beginner","code-samples","javascript","tutorials","webgl"],"created_at":"2024-10-02T01:40:18.149Z","updated_at":"2025-04-09T21:20:14.539Z","avatar_url":"https://github.com/lesnitsky.png","language":"JavaScript","readme":"# WebGL month\n\nHi 👋 My name is Andrei. I have some fun experience with WebGL and I want to share it. I'm starting a month of WebGL: each day I will post a WebGL-related tutorial. Not Three.js, not pixi.js, the WebGL API itself.\n\n[Follow me on twitter](https://twitter.com/lesnitsky_a) to get WebGL month updates or [join WebGL month mailing list](http://eepurl.com/gwiSeH)\n\n\n## Day 1. Intro\n\nThis is a series of blog posts related to WebGL. A new post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\nWelcome to day 1 of WebGL month. 
In this article we'll get into the high-level concepts of rendering which are important to understand before approaching the actual WebGL API.\n\nThe WebGL API is often treated as a 3D rendering API, which is a wrong assumption. So what does WebGL do?\nTo answer this question, let's try to render something with canvas 2d.\n\n\nWe'll need a simple HTML page\n\n📄 index.html\n```html\n\u003c!DOCTYPE html\u003e\n\u003chtml lang=\"en\"\u003e\n  \u003chead\u003e\n    \u003cmeta charset=\"UTF-8\" /\u003e\n    \u003cmeta name=\"viewport\" content=\"width=device-width, initial-scale=1.0\" /\u003e\n    \u003cmeta http-equiv=\"X-UA-Compatible\" content=\"ie=edge\" /\u003e\n    \u003ctitle\u003eWebGL Month\u003c/title\u003e\n  \u003c/head\u003e\n  \u003cbody\u003e\u003c/body\u003e\n\u003c/html\u003e\n\n```\nand a canvas\n\n📄 index.html\n```diff\n      \u003cmeta http-equiv=\"X-UA-Compatible\" content=\"ie=edge\" /\u003e\n      \u003ctitle\u003eWebGL Month\u003c/title\u003e\n    \u003c/head\u003e\n-   \u003cbody\u003e\u003c/body\u003e\n+   \u003cbody\u003e\n+     \u003ccanvas\u003e\u003c/canvas\u003e\n+   \u003c/body\u003e\n  \u003c/html\u003e\n\n```\nDon't forget beloved JS\n\n📄 index.html\n```diff\n    \u003c/head\u003e\n    \u003cbody\u003e\n      \u003ccanvas\u003e\u003c/canvas\u003e\n+     \u003cscript src=\"./src/canvas2d.js\"\u003e\u003c/script\u003e\n    \u003c/body\u003e\n  \u003c/html\u003e\n\n```\n📄 src/canvas2d.js\n```js\nconsole.log('Hello WebGL month');\n```\nLet's grab a reference to the canvas and get a 2d context\n\n📄 src/canvas2d.js\n```diff\n- console.log('Hello WebGL month');\n+ console.log('Hello WebGL month');\n+ \n+ const canvas = document.querySelector('canvas');\n+ const ctx = canvas.getContext('2d');\n\n```\nand do something pretty simple, like drawing a black rectangle\n\n📄 src/canvas2d.js\n```diff\n  \n  const canvas = document.querySelector('canvas');\n  const ctx = canvas.getContext('2d');\n+ \n+ ctx.fillRect(0, 0, 100, 50);\n\n```\nOk, this is pretty simple, right?\nBut let's think about what 
this single line of code actually did.\nIt filled every pixel inside the rectangle with black color.\n\nAre there any ways to do the same but without `fillRect`?\nThe answer is yes\n\n\nLet's implement our own version of `fillRect`\n\n📄 src/canvas2d.js\n```diff\n  const canvas = document.querySelector('canvas');\n  const ctx = canvas.getContext('2d');\n  \n- ctx.fillRect(0, 0, 100, 50);\n+ function fillRect(top, left, width, height) {\n+ \n+ }\n\n```\nSo basically each pixel is just a color encoded in 4 integers: R, G, B channels and Alpha.\nTo store info about each pixel of the canvas we'll need a `Uint8ClampedArray`.\nThe size of this array is `canvas.width * canvas.height` (pixels count) `* 4` (each pixel has 4 channels).\n\n📄 src/canvas2d.js\n```diff\n  const ctx = canvas.getContext('2d');\n  \n  function fillRect(top, left, width, height) {\n- \n+     const pixelStore = new Uint8ClampedArray(canvas.width * canvas.height * 4);\n  }\n\n```\nNow we can fill the pixel storage with colors. Note that the alpha component is also in the `[0..255]` range, unlike CSS\n\n📄 src/canvas2d.js\n```diff\n  \n  function fillRect(top, left, width, height) {\n      const pixelStore = new Uint8ClampedArray(canvas.width * canvas.height * 4);\n+ \n+     for (let i = 0; i \u003c pixelStore.length; i += 4) {\n+         pixelStore[i] = 0; // r\n+         pixelStore[i + 1] = 0; // g\n+         pixelStore[i + 2] = 0; // b\n+         pixelStore[i + 3] = 255; // alpha\n+     }\n  }\n\n```\nBut how do we render these pixels? There is a special canvas-renderable class – `ImageData`\n\n📄 src/canvas2d.js\n```diff\n          pixelStore[i + 2] = 0; // b\n          pixelStore[i + 3] = 255; // alpha\n      }\n+ \n+     const imageData = new ImageData(pixelStore, canvas.width, canvas.height);\n+     ctx.putImageData(imageData, 0, 0);\n  }\n+ \n+ fillRect();\n\n```\nWhoa 🎉 We filled the canvas with a color, manually iterating over each pixel! 
But we're not taking the passed arguments into account; let's fix that.\n\n\nCalculate pixel indices inside the rectangle\n\n📄 src/canvas2d.js\n```diff\n  const canvas = document.querySelector('canvas');\n  const ctx = canvas.getContext('2d');\n  \n+ function calculatePixelIndices(top, left, width, height) {\n+     const pixelIndices = [];\n+ \n+     for (let x = left; x \u003c left + width; x++) {\n+         for (let y = top; y \u003c top + height; y++) {\n+             const i =\n+                 y * canvas.width * 4 + // pixels to skip from top\n+                 x * 4; // pixels to skip from left\n+ \n+             pixelIndices.push(i);\n+         }\n+     }\n+ \n+     return pixelIndices;\n+ }\n+ \n  function fillRect(top, left, width, height) {\n      const pixelStore = new Uint8ClampedArray(canvas.width * canvas.height * 4);\n  \n\n```\nand iterate over these pixels instead of the whole canvas\n\n📄 src/canvas2d.js\n```diff\n  \n  function fillRect(top, left, width, height) {\n      const pixelStore = new Uint8ClampedArray(canvas.width * canvas.height * 4);\n+     \n+     const pixelIndices = calculatePixelIndices(top, left, width, height);\n  \n-     for (let i = 0; i \u003c pixelStore.length; i += 4) {\n+     pixelIndices.forEach((i) =\u003e {\n          pixelStore[i] = 0; // r\n          pixelStore[i + 1] = 0; // g\n          pixelStore[i + 2] = 0; // b\n          pixelStore[i + 3] = 255; // alpha\n-     }\n+     });\n  \n      const imageData = new ImageData(pixelStore, canvas.width, canvas.height);\n      ctx.putImageData(imageData, 0, 0);\n  }\n  \n- fillRect();\n+ fillRect(10, 10, 100, 50);\n\n```\nCool 😎 We've just reimplemented `fillRect`! 
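The index math in `calculatePixelIndices` can be sanity-checked outside the browser. A minimal sketch (the helper name `pixelIndex` is ours, not the tutorial's), assuming the default 300×150 canvas size:

```javascript
// Sketch: the starting index of the pixel at column x, row y,
// matching the formula `y * canvas.width * 4 + x * 4` used above.
function pixelIndex(x, y, canvasWidth) {
  return (y * canvasWidth + x) * 4; // 4 channels (r, g, b, a) per pixel
}

console.log(pixelIndex(0, 0, 300)); // 0 – the very first pixel
console.log(pixelIndex(10, 10, 300)); // 12040 – top-left pixel of fillRect(10, 10, ...)
```
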
But what does it have in common with WebGL?\n\n![Everything](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/thanos-everyhting.jpg)\n\nThat's exactly what the WebGL API does – _it calculates the color of each pixel and fills it with the calculated color_\n\n### What's next?\n\nIn the next article we'll start working with the WebGL API and render a WebGL \"Hello world\". See you tomorrow!\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n### Homework\n\nExtend the custom `fillRect` to support custom colors\n\n\n## Day 2. Simple shader and triangle\n\nThis is a series of blog posts related to WebGL. A new post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n---\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-1-19ha) we learned what WebGL does – it calculates the color of each pixel inside the renderable area. But how does it actually do that?\n\n\nWebGL is an API which works with your GPU to render stuff. While JavaScript is executed by V8 on a CPU, the GPU can't execute JavaScript, but it is still programmable\n\nOne of the languages the GPU \"understands\" is [GLSL](https://en.wikipedia.org/wiki/OpenGL_Shading_Language), so we'll familiarize ourselves not only with the WebGL API, but also with this new language.\n\nGLSL is a C-like programming language, so it is easy to learn and write for JavaScript developers.\n\nBut where do we write GLSL code? 
How do we pass it to the GPU to execute it?\n\nLet's write some code\n\n\nLet's create a new JS file and get a reference to the WebGL rendering context\n\n📄 index.html\n```diff\n    \u003c/head\u003e\n    \u003cbody\u003e\n      \u003ccanvas\u003e\u003c/canvas\u003e\n-     \u003cscript src=\"./src/canvas2d.js\"\u003e\u003c/script\u003e\n+     \u003cscript src=\"./src/webgl-hello-world.js\"\u003e\u003c/script\u003e\n    \u003c/body\u003e\n  \u003c/html\u003e\n\n```\n📄 src/webgl-hello-world.js\n```js\nconst canvas = document.querySelector('canvas');\nconst gl = canvas.getContext('webgl');\n\n```\nThe program executable by the GPU is created by the `createProgram` method of the WebGL rendering context\n\n📄 src/webgl-hello-world.js\n```diff\n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n+ \n+ const program = gl.createProgram();\n\n```\nA GPU program consists of two \"functions\"\nThese functions are called `shaders`\nWebGL supports several types of shaders\n\nIn this example we'll work with `vertex` and `fragment` shaders.\nBoth can be created with the `createShader` method\n\n📄 src/webgl-hello-world.js\n```diff\n  const gl = canvas.getContext('webgl');\n  \n  const program = gl.createProgram();\n+ \n+ const vertexShader = gl.createShader(gl.VERTEX_SHADER);\n+ const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);\n\n```\nNow let's write the simplest possible shader\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const vertexShader = gl.createShader(gl.VERTEX_SHADER);\n  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ const vShaderSource = `\n+ void main() {\n+     \n+ }\n+ `;\n\n```\nThis should look pretty familiar to those who have some C/C++ experience\n\n\nUnlike C or C++, `main` doesn't return anything; it assigns a value to the global variable `gl_Position` instead\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const vShaderSource = `\n  void main() {\n-     \n+     gl_Position = vec4(0, 0, 0, 1);\n  }\n  `;\n\n```\nNow let's take 
a closer look at what is being assigned.\n\nThere is a bunch of functions available in shaders.\n\nThe `vec4` function creates a vector of 4 components.\n\n`gl_Position = vec4(0, 0, 0, 1);`\n\nLooks weird... We live in a 3-dimensional world; what on earth is the 4th component? Is it `time`? 😕\n\nNot really\n\n[Quote from MDN](https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/WebGL_model_view_projection#Homogeneous_Coordinates)\n\n\u003e It turns out that this addition allows for lots of nice techniques for manipulating 3D data.\n\u003e A three dimensional point is defined in a typical Cartesian coordinate system. The added 4th dimension changes this point into a homogeneous coordinate. It still represents a point in 3D space and it can easily be demonstrated how to construct this type of coordinate through a pair of simple functions.\n\nFor now we can just ignore the 4th component and set it to `1.0`, just because\n\n\nAlright, we have a shader variable and the shader source in another variable. How do we connect these two? With the `shaderSource` method\n\n📄 src/webgl-hello-world.js\n```diff\n      gl_Position = vec4(0, 0, 0, 1);\n  }\n  `;\n+ \n+ gl.shaderSource(vertexShader, vShaderSource);\n\n```\nA GLSL shader should be compiled in order to be executed\n\n📄 src/webgl-hello-world.js\n```diff\n  `;\n  \n  gl.shaderSource(vertexShader, vShaderSource);\n+ gl.compileShader(vertexShader);\n\n```\nThe compilation result can be retrieved with `getShaderInfoLog`. This method returns the \"compiler\" output. 
If it is an empty string – everything is good\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.shaderSource(vertexShader, vShaderSource);\n  gl.compileShader(vertexShader);\n+ \n+ console.log(gl.getShaderInfoLog(vertexShader));\n\n```\nWe'll need to do the same with the fragment shader, so let's implement a helper function which we'll use for the fragment shader as well\n\n📄 src/webgl-hello-world.js\n```diff\n  }\n  `;\n  \n- gl.shaderSource(vertexShader, vShaderSource);\n- gl.compileShader(vertexShader);\n+ function compileShader(shader, source) {\n+     gl.shaderSource(shader, source);\n+     gl.compileShader(shader);\n  \n- console.log(gl.getShaderInfoLog(vertexShader));\n+     const log = gl.getShaderInfoLog(shader);\n+ \n+     if (log) {\n+         throw new Error(log);\n+     }\n+ }\n+ \n+ compileShader(vertexShader, vShaderSource);\n\n```\nWhat does the simplest fragment shader look like? Exactly the same\n\n📄 src/webgl-hello-world.js\n```diff\n  }\n  `;\n  \n+ const fShaderSource = `\n+     void main() {\n+         \n+     }\n+ `;\n+ \n  function compileShader(shader, source) {\n      gl.shaderSource(shader, source);\n      gl.compileShader(shader);\n\n```\nThe computation result of a fragment shader is a color, which is also a vector of 4 components (r, g, b, a). Unlike CSS, values are in the range `[0..1]` instead of `[0..255]`. 
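Converting a CSS-style channel to a GLSL-style channel is just a division by 255; a quick sketch (the helper name is ours, not part of the original tutorial):

```javascript
// Sketch: map a CSS color channel [0..255] to a GLSL color channel [0..1].
function cssChannelToGl(value) {
  return value / 255;
}

console.log(cssChannelToGl(255)); // 1 – full intensity
console.log(cssChannelToGl(0)); // 0 – no intensity
```
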
The fragment shader's computation result should be assigned to the variable `gl_FragColor`\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const fShaderSource = `\n      void main() {\n-         \n+         gl_FragColor = vec4(1, 0, 0, 1);\n      }\n  `;\n  \n  }\n  \n  compileShader(vertexShader, vShaderSource);\n+ compileShader(fragmentShader, fShaderSource);\n\n```\nNow we should connect `program` with our shaders\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  compileShader(vertexShader, vShaderSource);\n  compileShader(fragmentShader, fShaderSource);\n+ \n+ gl.attachShader(program, vertexShader);\n+ gl.attachShader(program, fragmentShader);\n\n```\nNext step – linking the program. This phase is required to verify that the vertex and fragment shaders are compatible with each other (we'll get into more details later)\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.attachShader(program, vertexShader);\n  gl.attachShader(program, fragmentShader);\n+ \n+ gl.linkProgram(program);\n\n```\nOur application could have several programs, so we should tell the GPU which program we want to use before issuing a draw call\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.attachShader(program, fragmentShader);\n  \n  gl.linkProgram(program);\n+ \n+ gl.useProgram(program);\n\n```\nOk, we're ready to draw something\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.linkProgram(program);\n  \n  gl.useProgram(program);\n+ \n+ gl.drawArrays();\n\n```\nWebGL can render several types of \"primitives\"\n\n-   Points\n-   Lines\n-   Triangles\n\nWe should pass the primitive type we want to render\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.useProgram(program);\n  \n- gl.drawArrays();\n+ gl.drawArrays(gl.POINTS);\n\n```\nThere is a way to pass input data containing info about the positions of our primitives to the vertex shader, so we need to pass the index of the first primitive we want to render\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.useProgram(program);\n  \n- gl.drawArrays(gl.POINTS);\n+ gl.drawArrays(gl.POINTS, 
0);\n\n```\nand the primitives count\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.useProgram(program);\n  \n- gl.drawArrays(gl.POINTS, 0);\n+ gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nNothing rendered 😢\nWhat is wrong?\n\nActually, to render a point, we should also specify a point size inside the vertex shader\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const vShaderSource = `\n  void main() {\n+     gl_PointSize = 20.0;\n      gl_Position = vec4(0, 0, 0, 1);\n  }\n  `;\n\n```\nWhoa 🎉 We have a point!\n\n![WebGL Point](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/webgl-point.png)\n\nIt is rendered in the center of the canvas because `gl_Position` is `vec4(0, 0, 0, 1)` =\u003e `x == 0` and `y == 0`\nThe WebGL coordinate system is different from `canvas2d`\n\n`canvas2d`\n\n```\n0.0\n-----------------------→ width (px)\n|\n|\n|\n↓\nheight (px)\n```\n\n`webgl`\n\n```\n                    (0, 1)\n                      ↑\n                      |\n                      |\n                      |\n(-1, 0) ------ (0, 0)-·---------\u003e (1, 0)\n                      |\n                      |\n                      |\n                      |\n                    (0, -1)\n```\n\n\nNow let's pass the point coordinate from JS instead of hardcoding it inside the shader\n\nThe input data of a vertex shader is called an `attribute`\nLet's define a `position` attribute\n\n📄 src/webgl-hello-world.js\n```diff\n  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n  const vShaderSource = `\n+ attribute vec2 position;\n+ \n  void main() {\n      gl_PointSize = 20.0;\n-     gl_Position = vec4(0, 0, 0, 1);\n+     gl_Position = vec4(position.x, position.y, 0, 1);\n  }\n  `;\n  \n\n```\nIn order to fill the attribute with data we need to get the attribute location. 
Think of it as a unique identifier of the attribute in the JavaScript world\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.useProgram(program);\n  \n+ const positionPointer = gl.getAttribLocation(program, 'position');\n+ \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nThe GPU accepts only typed arrays as input, so let's define a `Float32Array` as storage for our point position\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n  \n+ const positionData = new Float32Array([0, 0]);\n+ \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nBut this array can't be passed to the GPU as-is; the GPU needs its own buffer.\nThere are different kinds of \"buffers\" in the GPU world; in this case we need `ARRAY_BUFFER`\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const positionData = new Float32Array([0, 0]);\n  \n+ const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n+ \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nTo make any changes to a GPU buffer, we need to \"bind\" it. After a buffer is bound, it is treated as \"current\", and any buffer modification operation will be performed on the \"current\" buffer.\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n+ gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n+ \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nTo fill the buffer with some data, we need to call the `bufferData` method\n\n📄 src/webgl-hello-world.js\n```diff\n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n+ gl.bufferData(gl.ARRAY_BUFFER, positionData);\n  \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nTo optimize buffer operations (memory management) on the GPU side, we should pass a \"hint\" to the GPU indicating how this buffer will be used. 
[There are several ways to use buffers](https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/bufferData#Parameters)\n\n-   `gl.STATIC_DRAW`: Contents of the buffer are likely to be used often and not change often. Contents are written to the buffer, but not read.\n-   `gl.DYNAMIC_DRAW`: Contents of the buffer are likely to be used often and change often. Contents are written to the buffer, but not read.\n-   `gl.STREAM_DRAW`: Contents of the buffer are likely to not be used often. Contents are written to the buffer, but not read.\n\n    When using a WebGL 2 context, the following values are available additionally:\n\n-   `gl.STATIC_READ`: Contents of the buffer are likely to be used often and not change often. Contents are read from the buffer, but not written.\n-   `gl.DYNAMIC_READ`: Contents of the buffer are likely to be used often and change often. Contents are read from the buffer, but not written.\n-   `gl.STREAM_READ`: Contents of the buffer are likely to not be used often. Contents are read from the buffer, but not written.\n-   `gl.STATIC_COPY`: Contents of the buffer are likely to be used often and not change often. Contents are neither written nor read by the user.\n-   `gl.DYNAMIC_COPY`: Contents of the buffer are likely to be used often and change often. Contents are neither written nor read by the user.\n-   `gl.STREAM_COPY`: Contents of the buffer are likely to not be used often. Contents are neither written nor read by the user.\n\n📄 src/webgl-hello-world.js\n```diff\n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n- gl.bufferData(gl.ARRAY_BUFFER, positionData);\n+ gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n  \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nNow we need to tell the GPU how it should read the data from our buffer\n\nRequired info:\n\nAttribute size (2 in case of `vec2`, 3 in case of `vec3` etc)\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n  \n+ const attributeSize = 2;\n+ \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nthe type of data in the buffer\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n  \n  const attributeSize = 2;\n+ const type = gl.FLOAT;\n  \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nnormalized – indicates if data values should be clamped to a certain range\n\nfor `gl.BYTE` and `gl.SHORT`, clamps the values to `[-1, 1]` if true\n\nfor `gl.UNSIGNED_BYTE` and `gl.UNSIGNED_SHORT`, clamps the values to `[0, 1]` if true\n\nfor types `gl.FLOAT` and `gl.HALF_FLOAT`, this parameter has no effect.\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const attributeSize = 2;\n  const type = gl.FLOAT;\n+ const nomralized = false;\n  \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nWe'll talk about these two later 😉\n\n📄 src/webgl-hello-world.js\n```diff\n  const attributeSize = 2;\n  const type = gl.FLOAT;\n  const nomralized = false;\n+ const stride = 0;\n+ const offset = 0;\n  \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nNow we need to call `vertexAttribPointer` to set up our `position` attribute\n\n📄 src/webgl-hello-world.js\n```diff\n  const stride = 0;\n  const offset = 0;\n  \n+ gl.vertexAttribPointer(positionPointer, attributeSize, type, nomralized, stride, offset);\n+ \n  gl.drawArrays(gl.POINTS, 
0, 1);\n\n```\nLet's try to change the position of the point\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n  \n- const positionData = new Float32Array([0, 0]);\n+ const positionData = new Float32Array([1.0, 0.0]);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n\n```\nNothing changed 😢 But why?\n\nTurns out – all attributes are disabled by default (filled with 0), so we need to `enable` our position attribute\n\n📄 src/webgl-hello-world.js\n```diff\n  const stride = 0;\n  const offset = 0;\n  \n+ gl.enableVertexAttribArray(positionPointer);\n  gl.vertexAttribPointer(positionPointer, attributeSize, type, nomralized, stride, offset);\n  \n  gl.drawArrays(gl.POINTS, 0, 1);\n\n```\nNow we can render more points!\nLet's mark every corner of the canvas with a point\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n  \n- const positionData = new Float32Array([1.0, 0.0]);\n+ const positionData = new Float32Array([\n+     -1.0, // point 1 x\n+     -1.0, // point 1 y\n+ \n+     1.0, // point 2 x\n+     1.0, // point 2 y\n+ \n+     -1.0, // point 3 x\n+     1.0, // point 3 y\n+ \n+     1.0, // point 4 x\n+     -1.0, // point 4 y\n+ ]);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n  gl.enableVertexAttribArray(positionPointer);\n  gl.vertexAttribPointer(positionPointer, attributeSize, type, nomralized, stride, offset);\n  \n- gl.drawArrays(gl.POINTS, 0, 1);\n+ gl.drawArrays(gl.POINTS, 0, positionData.length / 2);\n\n```\nLet's get back to our shader\n\nWe don't necessarily need to explicitly pass `position.x` and `position.y` to the `vec4` constructor; there is a `vec4(vec2, float, float)` override\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  void main() {\n      gl_PointSize = 20.0;\n-     gl_Position = vec4(position.x, position.y, 0, 1);\n+     gl_Position = vec4(position, 0, 1);\n  }\n  `;\n  \n  const 
positionPointer = gl.getAttribLocation(program, 'position');\n  \n  const positionData = new Float32Array([\n-     -1.0, // point 1 x\n-     -1.0, // point 1 y\n+     -1.0, // top left x\n+     -1.0, // top left y\n  \n      1.0, // point 2 x\n      1.0, // point 2 y\n\n```\nNow let's move all points closer to the center by dividing each position by 2.0\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  void main() {\n      gl_PointSize = 20.0;\n-     gl_Position = vec4(position, 0, 1);\n+     gl_Position = vec4(position / 2.0, 0, 1);\n  }\n  `;\n  \n\n```\nResult:\n\n![Result](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/4points.png)\n\n### Conclusion\n\nWe now have a better understanding of how the GPU and WebGL work and can render something very basic\nWe'll explore more primitive types tomorrow!\n\n### Homework\n\nRender a `Math.cos` graph with dots\nHint: all you need is to fill `positionData` with valid values\n\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 3. Shader uniforms, lines and triangles\n\nThis is a series of blog posts related to WebGL. A new post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n[Yesterday](https://dev.to/lesnitsky/shaders-and-points-3h2c) we drew the simplest primitive possible – a point. 
Let's first solve the \"homework\"\n\n\nWe need to remove the hardcoded points data\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n  \n- const positionData = new Float32Array([\n-     -1.0, // top left x\n-     -1.0, // top left y\n- \n-     1.0, // point 2 x\n-     1.0, // point 2 y\n- \n-     -1.0, // point 3 x\n-     1.0, // point 3 y\n- \n-     1.0, // point 4 x\n-     -1.0, // point 4 y\n- ]);\n+ const points = [];\n+ const positionData = new Float32Array(points);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n\n```\nIterate over each vertical line of pixels of the canvas `[0..width]`\n\n📄 src/webgl-hello-world.js\n```diff\n  const positionPointer = gl.getAttribLocation(program, 'position');\n  \n  const points = [];\n+ \n+ for (let i = 0; i \u003c canvas.width; i++) {\n+ \n+ }\n+ \n  const positionData = new Float32Array(points);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n\n```\nTransform the value from `[0..width]` to `[-1..1]` (remember the WebGL coordinate grid? 
these are the leftmost and rightmost coordinates)\n\n📄 src/webgl-hello-world.js\n```diff\n  const points = [];\n  \n  for (let i = 0; i \u003c canvas.width; i++) {\n- \n+     const x = i / canvas.width * 2 - 1;\n  }\n  \n  const positionData = new Float32Array(points);\n\n```\nCalculate `cos` and add both x and y to the `points` array\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  for (let i = 0; i \u003c canvas.width; i++) {\n      const x = i / canvas.width * 2 - 1;\n+     const y = Math.cos(x * Math.PI);\n+ \n+     points.push(x, y);\n  }\n  \n  const positionData = new Float32Array(points);\n\n```\nThe graph looks a bit weird; let's fix our vertex shader\n\n📄 src/webgl-hello-world.js\n```diff\n  attribute vec2 position;\n  \n  void main() {\n-     gl_PointSize = 20.0;\n-     gl_Position = vec4(position / 2.0, 0, 1);\n+     gl_PointSize = 2.0;\n+     gl_Position = vec4(position, 0, 1);\n  }\n  `;\n  \n\n```\nNiiiice 😎 We now have a fancy cos graph!\n\n![Cos graph](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/cos-graph.png)\n\n\nWe calculated `cos` with JavaScript, but if we need to calculate something for a large dataset, JavaScript may block the rendering thread. 
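The `[0..width]` → `[-1..1]` transform used in the loop above can be checked in plain JavaScript; a small sketch with a hypothetical width of 800 (the helper name is ours):

```javascript
// Sketch: map a pixel column i in [0..width] to WebGL clip space [-1..1],
// the same formula as `const x = i / canvas.width * 2 - 1;` above.
function pixelToClipSpace(i, width) {
  return (i / width) * 2 - 1;
}

console.log(pixelToClipSpace(0, 800)); // -1 – left edge
console.log(pixelToClipSpace(400, 800)); // 0 – center
console.log(pixelToClipSpace(800, 800)); // 1 – right edge
```
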
Why not use the computing power of the GPU instead (`cos` will be calculated for each point in parallel)?\n\nGLSL doesn't have a `Math` namespace, so we'll need to define an `M_PI` variable\nThe `cos` function is there though 😏\n\n📄 src/webgl-hello-world.js\n```diff\n  const vShaderSource = `\n  attribute vec2 position;\n  \n+ #define M_PI 3.1415926535897932384626433832795\n+ \n  void main() {\n      gl_PointSize = 2.0;\n-     gl_Position = vec4(position, 0, 1);\n+     gl_Position = vec4(position.x, cos(position.y * M_PI), 0, 1);\n  }\n  `;\n  \n  \n  for (let i = 0; i \u003c canvas.width; i++) {\n      const x = i / canvas.width * 2 - 1;\n-     const y = Math.cos(x * Math.PI);\n- \n-     points.push(x, y);\n+     points.push(x, x);\n  }\n  \n  const positionData = new Float32Array(points);\n\n```\nWe have another JavaScript computation inside the loop, where we transform pixel coordinates to the `[-1..1]` range\nHow do we move this to the GPU?\nWe've learned that we can pass some data to a shader with an `attribute`, but `width` is a constant; it doesn't change between points.\n\nThere is a special kind of variable – a `uniform`. Treat a uniform as a global variable which can be assigned only once before a draw call and stays the same for all \"points\"\n\n\nLet's define a `uniform`\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const vShaderSource = `\n  attribute vec2 position;\n+ uniform float width;\n  \n  #define M_PI 3.1415926535897932384626433832795\n  \n\n```\nTo assign a value to a uniform, we'll need to do something similar to what we did with the attribute. 
We need to get the location of the uniform.\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.useProgram(program);\n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n+ const widthUniformLocation = gl.getUniformLocation(program, 'width');\n  \n  const points = [];\n  \n\n```\nThere's a bunch of methods which can assign different types of values to uniforms; each takes the uniform location as its first argument\n\n* `gl.uniform1f` – assigns a number to a float uniform (`gl.uniform1f(location, 0.0)`)\n* `gl.uniform1fv` – assigns an array of length 1 to a float uniform (`gl.uniform1fv(location, [0.0])`)\n* `gl.uniform2f` – assigns two numbers to a vec2 uniform (`gl.uniform2f(location, 0.0, 1.0)`)\n* `gl.uniform2fv` – assigns an array of length 2 to a vec2 uniform (`gl.uniform2fv(location, [0.0, 1.0])`)\n\netc\n\n📄 src/webgl-hello-world.js\n```diff\n  const positionPointer = gl.getAttribLocation(program, 'position');\n  const widthUniformLocation = gl.getUniformLocation(program, 'width');\n  \n+ gl.uniform1f(widthUniformLocation, canvas.width);\n+ \n  const points = [];\n  \n  for (let i = 0; i \u003c canvas.width; i++) {\n\n```\nAnd finally let's move our JS computation to the shader\n\n📄 src/webgl-hello-world.js\n```diff\n  #define M_PI 3.1415926535897932384626433832795\n  \n  void main() {\n+     float x = position.x / width * 2.0 - 1.0;\n      gl_PointSize = 2.0;\n-     gl_Position = vec4(position.x, cos(position.y * M_PI), 0, 1);\n+     gl_Position = vec4(x, cos(x * M_PI), 0, 1);\n  }\n  `;\n  \n  const points = [];\n  \n  for (let i = 0; i \u003c canvas.width; i++) {\n-     const x = i / canvas.width * 2 - 1;\n-     points.push(x, x);\n+     points.push(i, i);\n  }\n  \n  const positionData = new Float32Array(points);\n\n```\n### Rendering lines\n\nNow let's try to render lines\n\nWe need to fill our position data with the start and end coordinates of each line segment\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.uniform1f(widthUniformLocation, canvas.width);\n  \n- const points = [];\n+ const lines = [];\n+ let prevLineY = 0;\n  \n- for (let i = 0; 
i \u003c canvas.width; i++) {\n-     points.push(i, i);\n+ for (let i = 0; i \u003c canvas.width - 5; i += 5) {\n+     lines.push(i, prevLineY);\n+     const y =  Math.random() * canvas.height;\n+     lines.push(i + 5, y);\n+ \n+     prevLineY = y;\n  }\n  \n- const positionData = new Float32Array(points);\n+ const positionData = new Float32Array(lines);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n\n```\nWe'll also need to transform `y` to a WebGL clipspace, so let's pass a resolution of canvas, not just width\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const vShaderSource = `\n  attribute vec2 position;\n- uniform float width;\n+ uniform vec2 resolution;\n  \n  #define M_PI 3.1415926535897932384626433832795\n  \n  void main() {\n-     float x = position.x / width * 2.0 - 1.0;\n+     vec2 transformedPosition = position / resolution * 2.0 - 1.0;\n      gl_PointSize = 2.0;\n-     gl_Position = vec4(x, cos(x * M_PI), 0, 1);\n+     gl_Position = vec4(transformedPosition, 0, 1);\n  }\n  `;\n  \n  gl.useProgram(program);\n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n- const widthUniformLocation = gl.getUniformLocation(program, 'width');\n+ const resolutionUniformLocation = gl.getUniformLocation(program, 'resolution');\n  \n- gl.uniform1f(widthUniformLocation, canvas.width);\n+ gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  \n  const lines = [];\n  let prevLineY = 0;\n\n```\nThe final thing – we need to change primitive type to `gl.LINES`\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.enableVertexAttribArray(positionPointer);\n  gl.vertexAttribPointer(positionPointer, attributeSize, type, nomralized, stride, offset);\n  \n- gl.drawArrays(gl.POINTS, 0, positionData.length / 2);\n+ gl.drawArrays(gl.LINES, 0, positionData.length / 2);\n\n```\nCool! 
We can render lines now 👍\n\n![Lines](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/line-graph.png)\n\nLet's try to make the line a bit thicker\n\n\nUnlike point size, line width should be set from JavaScript. There is a method `gl.lineWidth(width)`\n\nLet's try to use it\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n+ gl.lineWidth(10);\n  \n  const attributeSize = 2;\n  const type = gl.FLOAT;\n\n```\nNothing changed 😢 But why??\n\nThat's why 😂\n\n![Line browser support](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/line-width-support.png)\n\nNobody cares.\n\nSo if you need a fancy line with a custom line cap – `gl.LINES` is not for you\n\n\nBut how do we render a fancy line?\n\nTurns out – everything can be rendered with the help of the next WebGL primitive – the triangle.\nThis is the last primitive type which can be rendered with WebGL\n\nBuilding a line of custom width from triangles might seem like a tough task, but don't worry, there are a lot of packages that can help you render custom 2d shapes (and even svg)\n\nSome of these tools:\n\n- [svg-path-contours](https://github.com/mattdesl/svg-path-contours)\n- [cdt2d](https://www.npmjs.com/package/cdt2d)\n- [adaptive-bezier-curve](https://www.npmjs.com/package/adaptive-bezier-curve)\n\nand others\n\nFrom now on, remember: EVERYTHING can be built with triangles, and that's how rendering works\n\n1. Input – triangle vertices\n2. Vertex shader – transforms vertices to WebGL clipspace\n3. Rasterization – calculates which pixels are inside a certain triangle\n4. 
Calculate the color of each pixel\n\nHere's an illustration of this process from [https://opentechschool-brussels.github.io/intro-to-webGL-and-shaders/log1_graphic-pipeline](https://opentechschool-brussels.github.io/intro-to-webGL-and-shaders/log1_graphic-pipeline)\n\n![WebGL pipeline](https://opentechschool-brussels.github.io/intro-to-webGL-and-shaders/assets/log1_graphicPipeline.jpg)\n\n\u003e Disclaimer: this is a simplified version of what's going on under the hood, [read this](https://www.khronos.org/opengl/wiki/Rendering_Pipeline_Overview) for a more detailed explanation\n\n\nSo let's finally render a triangle\n\nAgain – we need to update our position data\n\n\nand change the primitive type\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  \n- const lines = [];\n- let prevLineY = 0;\n+ const triangles = [\n+     0, 0, // v1 (x, y)\n+     canvas.width / 2, canvas.height, // v2 (x, y)\n+     canvas.width, 0, // v3 (x, y)\n+ ];\n  \n- for (let i = 0; i \u003c canvas.width - 5; i += 5) {\n-     lines.push(i, prevLineY);\n-     const y =  Math.random() * canvas.height;\n-     lines.push(i + 5, y);\n- \n-     prevLineY = y;\n- }\n- \n- const positionData = new Float32Array(lines);\n+ const positionData = new Float32Array(triangles);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n  gl.enableVertexAttribArray(positionPointer);\n  gl.vertexAttribPointer(positionPointer, attributeSize, type, nomralized, stride, offset);\n  \n- gl.drawArrays(gl.LINES, 0, positionData.length / 2);\n+ gl.drawArrays(gl.TRIANGLES, 0, positionData.length / 2);\n\n```\nAnd one more thing... 
Let's pass a color from JavaScript instead of hardcoding it inside the fragment shader.\n\nWe'll need to go through the same steps as for the resolution uniform, but declare this uniform in the fragment shader\n\n📄 src/webgl-hello-world.js\n```diff\n  `;\n  \n  const fShaderSource = `\n+     uniform vec4 color;\n+ \n      void main() {\n-         gl_FragColor = vec4(1, 0, 0, 1);\n+         gl_FragColor = color / 255.0;\n      }\n  `;\n  \n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n  const resolutionUniformLocation = gl.getUniformLocation(program, 'resolution');\n+ const colorUniformLocation = gl.getUniformLocation(program, 'color');\n  \n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n+ gl.uniform4fv(colorUniformLocation, [255, 0, 0, 255]);\n  \n  const triangles = [\n      0, 0, // v1 (x, y)\n\n```\nWait, what? An Error 🛑 😱\n\n```\nNo precision specified for (float)\n```\n\nWhat is that?\n\nTurns out that GLSL shaders support different float precisions, and you need to specify which one to use.\nUsually `mediump` is both performant and precise, but sometimes you might want to use `lowp` or `highp`. But be careful: `highp` is not supported by some mobile GPUs, and there is no guarantee you won't get any weird rendering artifacts with high precision\n\n📄 src/webgl-hello-world.js\n```diff\n  `;\n  \n  const fShaderSource = `\n+     precision mediump float;\n      uniform vec4 color;\n  \n      void main() {\n\n```\n### Homework\n\nRender different shapes using triangles:\n\n* rectangle\n* hexagon\n* circle\n\n\nSee you tomorrow 👋\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 4. 
Shader varyings\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-3-shader-uniforms-lines-and-triangles-5dof) we learned how to render lines and triangles, so let's get started with the homework\n\nHow do we draw a rectangle if WebGL can only render triangles? We should split the rectangle into two triangles\n\n```\n-------\n|    /|\n|  /  |\n|/    |\n-------\n```\n\nPretty simple, right?\n\n\nLet's define the coordinates of the triangle vertices\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.uniform4fv(colorUniformLocation, [255, 0, 0, 255]);\n  \n  const triangles = [\n-     0, 0, // v1 (x, y)\n-     canvas.width / 2, canvas.height, // v2 (x, y)\n-     canvas.width, 0, // v3 (x, y)\n+     // first triangle\n+     0, 150, // top left\n+     150, 150, // top right\n+     0, 0, // bottom left\n+     \n+     // second triangle\n+     0, 0, // bottom left\n+     150, 150, // top right\n+     150, 0, // bottom right\n  ];\n  \n  const positionData = new Float32Array(triangles);\n\n```\n![Rectangle](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rectangle.png)\n\nGreat, we can render rectangles now!\n\n\nNow let's draw a hexagon. 
This is somewhat harder to draw manually, so let's create a helper function\n\n📄 src/webgl-hello-world.js\n```diff\n      150, 0, // bottom right\n  ];\n  \n+ function createHexagon(center, radius, segmentsCount) {\n+     \n+ }\n+ \n  const positionData = new Float32Array(triangles);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n\n```\nWe need to iterate over (360 - segment angle) degrees with a step of a single segment angle\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  gl.uniform4fv(colorUniformLocation, [255, 0, 0, 255]);\n  \n- const triangles = [\n-     // first triangle\n-     0, 150, // top left\n-     150, 150, // top right\n-     0, 0, // bottom left\n-     \n-     // second triangle\n-     0, 0, // bottom left\n-     150, 150, // top right\n-     150, 0, // bottom right\n- ];\n- \n- function createHexagon(center, radius, segmentsCount) {\n-     \n+ const triangles = [createHexagon()];\n+ \n+ function createHexagon(centerX, centerY, radius, segmentsCount) {\n+     const vertices = [];\n+ \n+     for (let i = 0; i \u003c Math.PI * 2; i += Math.PI * 2 / (segmentsCount - 1)) {\n+         \n+     }\n+ \n+     return vertices;\n  }\n  \n  const positionData = new Float32Array(triangles);\n\n```\nAnd apply some simple school math\n\n![Hexagon](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/hexagon.png)\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  gl.uniform4fv(colorUniformLocation, [255, 0, 0, 255]);\n  \n- const triangles = [createHexagon()];\n+ const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 6);\n  \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n      const vertices = [];\n+     const segmentAngle =  Math.PI * 2 / (segmentsCount - 1);\n  \n-     for (let i = 0; i \u003c Math.PI * 2; i += Math.PI * 2 / (segmentsCount - 1)) {\n-       
  \n+     for (let i = 0; i \u003c Math.PI * 2; i += segmentAngle) {\n+         const from = i;\n+         const to = i + segmentAngle;\n+ \n+         vertices.push(centerX, centerY);\n+         vertices.push(centerX + Math.cos(from) * radius, centerY + Math.sin(from) * radius);\n+         vertices.push(centerX + Math.cos(to) * radius, centerY + Math.sin(to) * radius);\n      }\n  \n      return vertices;\n\n```\nNow how do we render a circle?\nActually, a circle can be built with the same function – we just need to increase the number of \"segments\"\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  gl.uniform4fv(colorUniformLocation, [255, 0, 0, 255]);\n  \n- const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 6);\n+ const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 360);\n  \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n      const vertices = [];\n\n```\n![Circle](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/circle.png)\n\n\n## Varyings\n\nOk, what's next? Let's add some color 🎨\nAs we already know, we can pass a color to a fragment shader via a `uniform`\nBut that's not the only way.\nThe vertex shader can pass a `varying` to the fragment shader for each vertex, and the value will be interpolated between vertices\n\nSounds a bit complicated, so let's see how it works\n\n\nWe need to define a `varying` in both the vertex and fragment shaders.\nMake sure the types match: if e.g. the varying is `vec3` in the vertex shader and `vec4` in the fragment shader – `gl.linkProgram(program)` will fail. 
You can check whether the program was linked successfully with `gl.getProgramParameter(program, gl.LINK_STATUS)`, and if it is false – call `gl.getProgramInfoLog(program)` to see what went wrong\n\n📄 src/webgl-hello-world.js\n```diff\n  attribute vec2 position;\n  uniform vec2 resolution;\n  \n+ varying vec4 vColor;\n+ \n  #define M_PI 3.1415926535897932384626433832795\n  \n  void main() {\n      vec2 transformedPosition = position / resolution * 2.0 - 1.0;\n      gl_PointSize = 2.0;\n      gl_Position = vec4(transformedPosition, 0, 1);\n+ \n+     vColor = vec4(255, 0, 0, 255);\n  }\n  `;\n  \n  const fShaderSource = `\n      precision mediump float;\n-     uniform vec4 color;\n+ \n+     varying vec4 vColor;\n  \n      void main() {\n-         gl_FragColor = color / 255.0;\n+         gl_FragColor = vColor / 255.0;\n      }\n  `;\n  \n  \n  const positionPointer = gl.getAttribLocation(program, 'position');\n  const resolutionUniformLocation = gl.getUniformLocation(program, 'resolution');\n- const colorUniformLocation = gl.getUniformLocation(program, 'color');\n  \n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n- gl.uniform4fv(colorUniformLocation, [255, 0, 0, 255]);\n  \n  const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 360);\n  \n\n```\nNow let's try to colorize our circle based on `gl_Position`\n\n📄 src/webgl-hello-world.js\n```diff\n      gl_PointSize = 2.0;\n      gl_Position = vec4(transformedPosition, 0, 1);\n  \n-     vColor = vec4(255, 0, 0, 255);\n+     vColor = vec4((gl_Position.xy + 1.0 / 2.0) * 255.0, 0, 255);\n  }\n  `;\n  \n\n```\n![Colorized circle](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/colorized-circle.png)\n\nLooks cool, right?\n\nBut how do we pass some specific colors from JS?\n\n\nWe need to create another attribute\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const vShaderSource = `\n  attribute vec2 position;\n+ attribute vec4 color;\n  uniform vec2 resolution;\n  \n  varying 
vec4 vColor;\n      gl_PointSize = 2.0;\n      gl_Position = vec4(transformedPosition, 0, 1);\n  \n-     vColor = vec4((gl_Position.xy + 1.0 / 2.0) * 255.0, 0, 255);\n+     vColor = color;\n  }\n  `;\n  \n  \n  gl.useProgram(program);\n  \n- const positionPointer = gl.getAttribLocation(program, 'position');\n+ const positionLocation = gl.getAttribLocation(program, 'position');\n+ const colorLocation = gl.getAttribLocation(program, 'color');\n+ \n  const resolutionUniformLocation = gl.getUniformLocation(program, 'resolution');\n  \n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  const stride = 0;\n  const offset = 0;\n  \n- gl.enableVertexAttribArray(positionPointer);\n- gl.vertexAttribPointer(positionPointer, attributeSize, type, nomralized, stride, offset);\n+ gl.enableVertexAttribArray(positionLocation);\n+ gl.vertexAttribPointer(positionLocation, attributeSize, type, nomralized, stride, offset);\n  \n  gl.drawArrays(gl.TRIANGLES, 0, positionData.length / 2);\n\n```\nSetup buffer for this attribute\n\n📄 src/webgl-hello-world.js\n```diff\n  }\n  \n  const positionData = new Float32Array(triangles);\n+ const colorData = new Float32Array(colors);\n  \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n+ const colorBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n+ \n+ gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);\n+ gl.bufferData(gl.ARRAY_BUFFER, colorData, gl.STATIC_DRAW);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n\n```\nFill buffer with data\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  \n  const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 360);\n+ const colors = fillWithColors(360);\n  \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n      const vertices = [];\n      return vertices;\n  }\n  \n+ function 
fillWithColors(segmentsCount) {\n+     const colors = [];\n+ \n+     for (let i = 0; i \u003c segmentsCount; i++) {\n+         for (let j = 0; j \u003c 3; j++) {\n+             if (j == 0) { // vertex in center of circle\n+                 colors.push(0, 0, 0, 255);\n+             } else {\n+                 colors.push(i / 360 * 255, 0, 0, 255);\n+             }\n+         }\n+     }\n+ \n+     return colors;\n+ }\n+ \n  const positionData = new Float32Array(triangles);\n  const colorData = new Float32Array(colors);\n  \n\n```\nAnd set up the attribute pointer (the way the attribute reads data from the buffer).\n\n📄 src/webgl-hello-world.js\n```diff\n  gl.enableVertexAttribArray(positionLocation);\n  gl.vertexAttribPointer(positionLocation, attributeSize, type, nomralized, stride, offset);\n  \n+ gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);\n+ \n+ gl.enableVertexAttribArray(colorLocation);\n+ gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, offset);\n+ \n  gl.drawArrays(gl.TRIANGLES, 0, positionData.length / 2);\n\n```\nNotice the `gl.bindBuffer` before the attribute-related calls. `gl.vertexAttribPointer` points the attribute to the buffer which was most recently bound. Don't forget this step – it's a common mistake\n\n\n![Colored circle](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/colored-circle-2.png)\n\n### Conclusion\n\nWe've learned another way to pass data to a fragment shader.\nThis is useful for per-vertex colors and textures (we'll work with textures later)\n\n### Homework\n\nRender a 7-gon and colorize each triangle with the colors of the rainbow 🌈\n\nSee you tomorrow 👋\n\n---\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 5. 
Interleaved buffers\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\nHey 👋 Welcome to a WebGL month. [Yesterday](https://dev.to/lesnitsky/shader-varyings-2p0f) we've learned how to use varyings. Today we're going to explore one more concept, but let's solve the homework from yesterday first\n\n\nWe need to define the rainbow colors first\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  \n+ const rainbowColors = [\n+     [255, 0.0, 0.0, 255], // red\n+     [255, 165, 0.0, 255], // orange\n+     [255, 255, 0.0, 255], // yellow\n+     [0.0, 255, 0.0, 255], // green\n+     [0.0, 101, 255, 255], // skyblue\n+     [0.0, 0.0, 255, 255], // blue,\n+     [128, 0.0, 128, 255], // purple\n+ ];\n+ \n  const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 360);\n  const colors = fillWithColors(360);\n  \n\n```\nRender a 7-gon\n\n📄 src/webgl-hello-world.js\n```diff\n      [128, 0.0, 128, 255], // purple\n  ];\n  \n- const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 360);\n- const colors = fillWithColors(360);\n+ const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 7);\n+ const colors = fillWithColors(7);\n  \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n      const vertices = [];\n\n```\nFill the colors buffer with rainbow colors\n\n📄 src/webgl-hello-world.js\n```diff\n  \n      for (let i = 0; i \u003c segmentsCount; i++) {\n          for (let j = 0; j \u003c 3; j++) {\n-             if (j == 0) { 
// vertex in center of circle\n-                 colors.push(0, 0, 0, 255);\n-             } else {\n-                 colors.push(i / 360 * 255, 0, 0, 255);\n-             }\n+             colors.push(...rainbowColors[i]);\n          }\n      }\n  \n\n```\n![Rainbow](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rainbow.png)\n\nWhere's the red? Well, to render 7 polygons, we need an 8-gon 🤦 My bad, sorry.\n\n\nNow we have a colored 8-gon, and we store vertex coordinates and colors in two separate buffers.\nHaving two separate buffers allows us to update them separately (imagine we need to change colors, but not positions)\n\nOn the other hand, if positions and colors are always updated together – we can store this data in a single buffer.\n\nLet's refactor the code to achieve this\n\n\nWe need to structure our buffer data by attribute.\n\n```\nx1, y1, color.r, color.g, color.b, color.a\nx2, y2, color.r, color.g, color.b, color.a\nx3, y3, color.r, color.g, color.b, color.a\n...\n```\n\n📄 src/webgl-hello-world.js\n```diff\n  ];\n  \n  const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 7);\n- const colors = fillWithColors(7);\n  \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n-     const vertices = [];\n+     const vertexData = [];\n      const segmentAngle =  Math.PI * 2 / (segmentsCount - 1);\n  \n      for (let i = 0; i \u003c Math.PI * 2; i += segmentAngle) {\n          const from = i;\n          const to = i + segmentAngle;\n  \n-         vertices.push(centerX, centerY);\n-         vertices.push(centerX + Math.cos(from) * radius, centerY + Math.sin(from) * radius);\n-         vertices.push(centerX + Math.cos(to) * radius, centerY + Math.sin(to) * radius);\n+         const color = rainbowColors[i / segmentAngle];\n+ \n+         vertexData.push(centerX, centerY);\n+         vertexData.push(...color);\n+ \n+         vertexData.push(centerX + Math.cos(from) * radius, centerY + Math.sin(from) * radius);\n+         
vertexData.push(...color);\n+ \n+         vertexData.push(centerX + Math.cos(to) * radius, centerY + Math.sin(to) * radius);\n+         vertexData.push(...color);\n      }\n  \n-     return vertices;\n+     return vertexData;\n  }\n  \n  function fillWithColors(segmentsCount) {\n\n```\nWe don't need the color buffer anymore\n\n📄 src/webgl-hello-world.js\n```diff\n  }\n  \n  const positionData = new Float32Array(triangles);\n- const colorData = new Float32Array(colors);\n- \n  const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n- const colorBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n- \n- gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);\n- gl.bufferData(gl.ARRAY_BUFFER, colorData, gl.STATIC_DRAW);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n\n```\nand it also makes sense to rename `positionData` and `positionBuffer` to `vertexData` and `vertexBuffer`\n\n📄 src/webgl-hello-world.js\n```diff\n      return colors;\n  }\n  \n- const positionData = new Float32Array(triangles);\n- const positionBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n+ const vertexData = new Float32Array(triangles);\n+ const vertexBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n- gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);\n- gl.bufferData(gl.ARRAY_BUFFER, positionData, gl.STATIC_DRAW);\n+ gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);\n+ gl.bufferData(gl.ARRAY_BUFFER, vertexData, gl.STATIC_DRAW);\n  gl.lineWidth(10);\n  \n  const attributeSize = 2;\n\n```\nBut how do we specify the way this data should be read from the buffer and passed to the right shader attributes?\n\nWe can do this with the `stride` and `offset` arguments of `vertexAttribPointer`\n\n`stride` tells how many bytes should be read for each vertex\n\nEach vertex contains:\n\n- position (x, y, 2 floats)\n- color (r, g, b, a, 4 floats)\n\nSo we have a total of `6` floats, `4` bytes each\nThis means the stride is `6 * 4`\n\n\nOffset specifies how much data should be 
skipped at the beginning of each chunk\n\nColor data goes right after position data; position is 2 floats, so the offset for color is `2 * 4`\n\n📄 src/webgl-hello-world.js\n```diff\n  const attributeSize = 2;\n  const type = gl.FLOAT;\n  const nomralized = false;\n- const stride = 0;\n+ const stride = 24;\n  const offset = 0;\n  \n  gl.enableVertexAttribArray(positionLocation);\n  gl.vertexAttribPointer(positionLocation, attributeSize, type, nomralized, stride, offset);\n  \n- gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);\n- \n  gl.enableVertexAttribArray(colorLocation);\n- gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, offset);\n+ gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, 8);\n  \n- gl.drawArrays(gl.TRIANGLES, 0, positionData.length / 2);\n+ gl.drawArrays(gl.TRIANGLES, 0, vertexData.length / 6);\n\n```\nAnd voila, we have the same result, but with a single buffer 🎉\n\n\n### Conclusion\n\nLet's summarize how the `vertexAttribPointer(location, size, type, normalized, stride, offset)` method works for a single buffer (such a buffer is called interleaved)\n\n- `location`: specifies which attribute we want to set up\n- `size`: how much data should be read for this exact attribute\n- `type`: the type of data being read\n- `normalized`: whether the data should be \"normalized\" (clamped to `[-1..1]` for gl.BYTE and gl.SHORT, and to `[0..1]` for gl.UNSIGNED_BYTE and gl.UNSIGNED_SHORT)\n- `stride`: how much data there is for each vertex in total (in bytes)\n- `offset`: how much data should be skipped at the beginning of each chunk of data\n\nSo now you can use different combinations of buffers to fill your attributes with data\n\nSee you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 6. Index buffer\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\nHey 👋 Welcome back to WebGL month. [Yesterday](https://dev.to/lesnitsky/webgl-month-day-5-interleaved-buffers-2k9a) we've learned how to use interleaved buffers. 
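As a quick recap, the stride and offset numbers from yesterday come from simple byte arithmetic. A plain-JavaScript sketch (the constant names here are made up for illustration):

```javascript
// Yesterday's interleaved layout: 2 position floats + 4 color floats per vertex.
const FLOAT_BYTES = Float32Array.BYTES_PER_ELEMENT; // 4 bytes per float

const positionComponents = 2;
const colorComponents = 4;

// stride – total bytes per vertex; colorOffset – where color starts within a vertex
const stride = (positionComponents + colorComponents) * FLOAT_BYTES;
const colorOffset = positionComponents * FLOAT_BYTES;

console.log(stride);      // 24 – passed as `stride` to vertexAttribPointer
console.log(colorOffset); // 8 – passed as `offset` for the color attribute
```
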
However our buffer contains a lot of duplicate data, because some triangles share the same vertices\n\n\nLet's get back to the simple example of a rectangle\n\n📄 src/webgl-hello-world.js\n```diff\n      [128, 0.0, 128, 255], // purple\n  ];\n  \n- const triangles = createHexagon(canvas.width / 2, canvas.height / 2, canvas.height / 2, 7);\n+ const triangles = createRect(0, 0, canvas.height, canvas.height);\n  \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n      const vertexData = [];\n\n```\nand fill it only with unique vertex coordinates\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const triangles = createRect(0, 0, canvas.height, canvas.height);\n  \n+ function createRect(top, left, width, height) {\n+     return [\n+         left, top, // x1 y1\n+         left + width, top, // x2 y2\n+         left, top + height, // x3 y3\n+         left + width, top + height, // x4 y4\n+     ];\n+ }\n+ \n  function createHexagon(centerX, centerY, radius, segmentsCount) {\n      const vertexData = [];\n      const segmentAngle =  Math.PI * 2 / (segmentsCount - 1);\n\n```\nLet's also disable the color attribute for now\n\n📄 src/webgl-hello-world.js\n```diff\n  const attributeSize = 2;\n  const type = gl.FLOAT;\n  const nomralized = false;\n- const stride = 24;\n+ const stride = 0;\n  const offset = 0;\n  \n  gl.enableVertexAttribArray(positionLocation);\n  gl.vertexAttribPointer(positionLocation, attributeSize, type, nomralized, stride, offset);\n  \n- gl.enableVertexAttribArray(colorLocation);\n- gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, 8);\n+ // gl.enableVertexAttribArray(colorLocation);\n+ // gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, 8);\n  \n  gl.drawArrays(gl.TRIANGLES, 0, vertexData.length / 6);\n\n```\nOk, so our buffer contains 4 vertices, but how does WebGL render 2 triangles with only 4 vertices?\nThere's a special type of buffer which can specify how to fetch data from the vertex buffer and build primitives 
(in our case triangles)\n\n\nThis buffer is called an `index buffer`, and it contains indices of vertex data chunks in the vertex buffer.\nSo we need to specify the indices of the triangle vertices.\n\n📄 src/webgl-hello-world.js\n```diff\n  const vertexData = new Float32Array(triangles);\n  const vertexBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n+ const indexBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n+ \n+ const indexData = new Uint6Array([\n+     0, 1, 2, // first triangle\n+     1, 2, 3, // second triangle\n+ ]);\n+ \n  gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, vertexData, gl.STATIC_DRAW);\n  gl.lineWidth(10);\n\n```\nNext step – upload the data to a WebGL buffer.\nTo tell the GPU that we're using an `index buffer` we need to pass `gl.ELEMENT_ARRAY_BUFFER` as the first argument of `gl.bindBuffer` and `gl.bufferData`\n\n📄 src/webgl-hello-world.js\n```diff\n      1, 2, 3, // second triangle\n  ]);\n  \n+ gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n+ gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, indexData, gl.STATIC_DRAW);\n+ \n  gl.bindBuffer(gl.ARRAY_BUFFER, vertexBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, vertexData, gl.STATIC_DRAW);\n  gl.lineWidth(10);\n\n```\nAnd the final step: to render indexed vertices we need to call a different method – `drawElements` instead of `drawArrays`\n\n📄 src/webgl-hello-world.js\n```diff\n  \n  const indexBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n- const indexData = new Uint6Array([\n+ const indexData = new Uint8Array([\n      0, 1, 2, // first triangle\n      1, 2, 3, // second triangle\n  ]);\n  // gl.enableVertexAttribArray(colorLocation);\n  // gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, 8);\n  \n- gl.drawArrays(gl.TRIANGLES, 0, vertexData.length / 6);\n+ gl.drawElements(gl.TRIANGLES, indexData.length, gl.UNSIGNED_BYTE, 0);\n\n```\nWait, why is nothing rendered?\n\n\nThe reason is that we've disabled the color attribute, so it got filled with zeros. 
(0, 0, 0, 0) – transparent black.\nLet's fix that\n\n📄 src/webgl-hello-world.js\n```diff\n  \n      void main() {\n          gl_FragColor = vColor / 255.0;\n+         gl_FragColor.a = 1.0;\n      }\n  `;\n  \n\n```\n### Conclusion\n\nWe now know how to use an index buffer to reduce the number of vertices we need to upload to the GPU.\nThe rectangle example is very simple (only 2 of 6 vertices are duplicated), but that is still 33%, so with larger amounts of rendered data this can be quite a performance improvement, especially if you update vertex data frequently and re-upload buffer contents\n\n### Homework\n\nRender an n-gon using an index buffer\n\nSee you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## WebGL month. Day 7. Tooling and refactor\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to the WebGL month.\n\nSince our codebase grows and will keep getting more complicated, we need some tooling and cleanup.\n\n\nWe'll need webpack, so let's create `package.json` and install it\n\n📄 package.json\n```json\n{\n  \"name\": \"webgl-month\",\n  \"version\": \"1.0.0\",\n  \"description\": \"Daily WebGL tutorials\",\n  \"main\": \"index.js\",\n  \"scripts\": {\n    \"test\": \"echo \\\"Error: no test specified\\\" \u0026\u0026 exit 1\"\n  },\n  \"repository\": {\n    \"type\": \"git\",\n    \"url\": \"git+https://github.com/lesnitsky/webgl-month.git\"\n  },\n  \"author\": \"\",\n  \"license\": \"ISC\",\n  \"bugs\": {\n    \"url\": \"https://github.com/lesnitsky/webgl-month/issues\"\n  },\n  \"homepage\": \"https://github.com/lesnitsky/webgl-month#readme\",\n  \"devDependencies\": {\n    \"webpack\": \"^4.35.2\",\n    \"webpack-cli\": \"^3.3.5\"\n  }\n}\n\n```\nWe'll need a simple webpack config\n\n📄 webpack.config.js\n```js\nconst path = require('path');\n\nmodule.exports = {\n    entry: {\n        'week-1': './src/week-1.js',\n    },\n\n    output: {\n        path: path.resolve(__dirname, 'dist'),\n        filename: '[name].js',\n    },\n\n    mode: 'development',\n};\n\n```\nand update script source\n\n📄 index.html\n```diff\n    \u003c/head\u003e\n    \u003cbody\u003e\n      \u003ccanvas\u003e\u003c/canvas\u003e\n-     \u003cscript src=\"./src/webgl-hello-world.js\"\u003e\u003c/script\u003e\n+     \u003cscript src=\"./dist/week-1.js\"\u003e\u003c/script\u003e\n    \u003c/body\u003e\n  \u003c/html\u003e\n\n```\nSince shaders are raw strings, we can 
store shader source in separate file and use `raw-loader` for `webpack`.\n\n📄 package.json\n```diff\n    },\n    \"homepage\": \"https://github.com/lesnitsky/webgl-month#readme\",\n    \"devDependencies\": {\n+     \"raw-loader\": \"^3.0.0\",\n      \"webpack\": \"^4.35.2\",\n      \"webpack-cli\": \"^3.3.5\"\n    }\n\n```\n📄 webpack.config.js\n```diff\n          filename: '[name].js',\n      },\n  \n+     module: {\n+         rules: [\n+             {\n+                 test: /\\.glsl$/,\n+                 use: 'raw-loader',\n+             },\n+         ],\n+     },\n+ \n      mode: 'development',\n  };\n\n```\nand let's actually move shaders to separate files\n\n📄 src/shaders/fragment.glsl\n```glsl\nprecision mediump float;\n\nvarying vec4 vColor;\n\nvoid main() {\n    gl_FragColor = vColor / 255.0;\n    gl_FragColor.a = 1.0;\n}\n\n```\n📄 src/shaders/vertex.glsl\n```glsl\nattribute vec2 position;\nattribute vec4 color;\nuniform vec2 resolution;\n\nvarying vec4 vColor;\n\n#define M_PI 3.1415926535897932384626433832795\n\nvoid main() {\n    vec2 transformedPosition = position / resolution * 2.0 - 1.0;\n    gl_PointSize = 2.0;\n    gl_Position = vec4(transformedPosition, 0, 1);\n\n    vColor = color;\n}\n\n```\n📄 src/week-1.js\n```diff\n+ import vShaderSource from './shaders/vertex.glsl';\n+ import fShaderSource from './shaders/fragment.glsl';\n+ \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  const vertexShader = gl.createShader(gl.VERTEX_SHADER);\n  const fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n- const vShaderSource = `\n- attribute vec2 position;\n- attribute vec4 color;\n- uniform vec2 resolution;\n- \n- varying vec4 vColor;\n- \n- #define M_PI 3.1415926535897932384626433832795\n- \n- void main() {\n-     vec2 transformedPosition = position / resolution * 2.0 - 1.0;\n-     gl_PointSize = 2.0;\n-     gl_Position = vec4(transformedPosition, 0, 1);\n- \n-     vColor = color;\n- }\n- `;\n- \n- 
const fShaderSource = `\n-     precision mediump float;\n- \n-     varying vec4 vColor;\n- \n-     void main() {\n-         gl_FragColor = vColor / 255.0;\n-         gl_FragColor.a = 1.0;\n-     }\n- `;\n- \n  function compileShader(shader, source) {\n      gl.shaderSource(shader, source);\n      gl.compileShader(shader);\n\n```\nWe can also move the functions which create vertex positions to a separate file\n\n📄 src/shape-helpers.js\n```js\nexport function createRect(top, left, width, height) {\n    return [\n        left, top, // x1 y1\n        left + width, top, // x2 y2\n        left, top + height, // x3 y3\n        left + width, top + height, // x4 y4\n    ];\n}\n\nexport function createHexagon(centerX, centerY, radius, segmentsCount) {\n    const vertexData = [];\n    const segmentAngle =  Math.PI * 2 / (segmentsCount - 1);\n\n    for (let i = 0; i \u003c Math.PI * 2; i += segmentAngle) {\n        const from = i;\n        const to = i + segmentAngle;\n\n        const color = rainbowColors[i / segmentAngle];\n\n        vertexData.push(centerX, centerY);\n        vertexData.push(...color);\n\n        vertexData.push(centerX + Math.cos(from) * radius, centerY + Math.sin(from) * radius);\n        vertexData.push(...color);\n\n        vertexData.push(centerX + Math.cos(to) * radius, centerY + Math.sin(to) * radius);\n        vertexData.push(...color);\n    }\n\n    return vertexData;\n}\n\n```\n📄 src/week-1.js\n```diff\n  import vShaderSource from './shaders/vertex.glsl';\n  import fShaderSource from './shaders/fragment.glsl';\n  \n+ import { createRect } from './shape-helpers';\n+ \n+ \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  \n  const triangles = createRect(0, 0, canvas.height, canvas.height);\n  \n- function createRect(top, left, width, height) {\n-     return [\n-         left, top, // x1 y1\n-         left + width, top, // x2 y2\n-         left, top + height, // x3 y3\n-         left + width, top + 
height, // x4 y4\n-     ];\n- }\n- \n- function createHexagon(centerX, centerY, radius, segmentsCount) {\n-     const vertexData = [];\n-     const segmentAngle =  Math.PI * 2 / (segmentsCount - 1);\n- \n-     for (let i = 0; i \u003c Math.PI * 2; i += segmentAngle) {\n-         const from = i;\n-         const to = i + segmentAngle;\n- \n-         const color = rainbowColors[i / segmentAngle];\n- \n-         vertexData.push(centerX, centerY);\n-         vertexData.push(...color);\n- \n-         vertexData.push(centerX + Math.cos(from) * radius, centerY + Math.sin(from) * radius);\n-         vertexData.push(...color);\n- \n-         vertexData.push(centerX + Math.cos(to) * radius, centerY + Math.sin(to) * radius);\n-         vertexData.push(...color);\n-     }\n- \n-     return vertexData;\n- }\n- \n  function fillWithColors(segmentsCount) {\n      const colors = [];\n  \n\n```\nSince we're no longer using the color attribute, we can drop everything related to it\n\n📄 src/shaders/fragment.glsl\n```diff\n  precision mediump float;\n  \n- varying vec4 vColor;\n- \n  void main() {\n-     gl_FragColor = vColor / 255.0;\n-     gl_FragColor.a = 1.0;\n+     gl_FragColor = vec4(1, 0, 0, 1);\n  }\n\n```\n📄 src/shaders/vertex.glsl\n```diff\n  attribute vec2 position;\n- attribute vec4 color;\n  uniform vec2 resolution;\n  \n- varying vec4 vColor;\n- \n  #define M_PI 3.1415926535897932384626433832795\n  \n  void main() {\n      vec2 transformedPosition = position / resolution * 2.0 - 1.0;\n      gl_PointSize = 2.0;\n      gl_Position = vec4(transformedPosition, 0, 1);\n- \n-     vColor = color;\n  }\n\n```\n📄 src/week-1.js\n```diff\n  \n  import { createRect } from './shape-helpers';\n  \n- \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  gl.useProgram(program);\n  \n  const positionLocation = gl.getAttribLocation(program, 'position');\n- const colorLocation = gl.getAttribLocation(program, 'color');\n- \n  const 
resolutionUniformLocation = gl.getUniformLocation(program, 'resolution');\n  \n  gl.uniform2fv(resolutionUniformLocation, [canvas.width, canvas.height]);\n  \n- const rainbowColors = [\n-     [255, 0.0, 0.0, 255], // red\n-     [255, 165, 0.0, 255], // orange\n-     [255, 255, 0.0, 255], // yellow\n-     [0.0, 255, 0.0, 255], // green\n-     [0.0, 101, 255, 255], // skyblue\n-     [0.0, 0.0, 255, 255], // blue,\n-     [128, 0.0, 128, 255], // purple\n- ];\n- \n  const triangles = createRect(0, 0, canvas.height, canvas.height);\n  \n- function fillWithColors(segmentsCount) {\n-     const colors = [];\n- \n-     for (let i = 0; i \u003c segmentsCount; i++) {\n-         for (let j = 0; j \u003c 3; j++) {\n-             colors.push(...rainbowColors[i]);\n-         }\n-     }\n- \n-     return colors;\n- }\n- \n  const vertexData = new Float32Array(triangles);\n  const vertexBuffer = gl.createBuffer(gl.ARRAY_BUFFER);\n  \n  gl.enableVertexAttribArray(positionLocation);\n  gl.vertexAttribPointer(positionLocation, attributeSize, type, nomralized, stride, offset);\n  \n- // gl.enableVertexAttribArray(colorLocation);\n- // gl.vertexAttribPointer(colorLocation, 4, type, nomralized, stride, 8);\n- \n  gl.drawElements(gl.TRIANGLES, indexData.length, gl.UNSIGNED_BYTE, 0);\n\n```\nWebpack will help us keep our codebase cleaner in the future, but we're good for now\n\nSee you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 8. Textures\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋 Welcome back to WebGL month.\n\nWe've already learned several ways to pass color data to shader, but there is one more and it is very powerful. Today we'll learn about textures\n\n\nLet's create simple shaders\n\n📄 src/shaders/texture.f.glsl\n```glsl\nprecision mediump float;\n\nvoid main() {\n    gl_FragColor = vec4(1, 0, 0, 1);\n}\n\n```\n📄 src/shaders/texture.v.glsl\n```glsl\nattribute vec2 position;\n\nvoid main() {\n    gl_Position = vec4(position, 0, 1);\n}\n\n```\n📄 src/texture.js\n```js\nimport vShaderSource from './shaders/texture.v.glsl';\nimport fShaderSource from './shaders/texture.f.glsl';\n\n```\nGet the webgl context\n\n📄 src/texture.js\n```diff\n  import vShaderSource from './shaders/texture.v.glsl';\n  import fShaderSource from './shaders/texture.f.glsl';\n+ \n+ const canvas = document.querySelector('canvas');\n+ const gl = canvas.getContext('webgl');\n\n```\nCreate shaders\n\n📄 src/texture.js\n```diff\n  import vShaderSource from './shaders/texture.v.glsl';\n  import fShaderSource from './shaders/texture.f.glsl';\n+ import { compileShader } from './gl-helpers';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n+ \n+ const vShader = gl.createShader(gl.VERTEX_SHADER);\n+ const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ compileShader(gl, vShader, vShaderSource);\n+ compileShader(gl, fShader, fShaderSource);\n\n```\nand program\n\n📄 src/texture.js\n```diff\n  \n  compileShader(gl, vShader, vShaderSource);\n  compileShader(gl, fShader, fShaderSource);\n+ \n+ const program = gl.createProgram();\n+ 
\n+ gl.attachShader(program, vShader);\n+ gl.attachShader(program, fShader);\n+ \n+ gl.linkProgram(program);\n+ gl.useProgram(program);\n\n```\nCreate a vertex position buffer and fill it with data\n\n📄 src/texture.js\n```diff\n  import vShaderSource from './shaders/texture.v.glsl';\n  import fShaderSource from './shaders/texture.f.glsl';\n  import { compileShader } from './gl-helpers';\n+ import { createRect } from './shape-helpers';\n+ \n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  gl.linkProgram(program);\n  gl.useProgram(program);\n+ \n+ const vertexPosition = new Float32Array(createRect(-1, -1, 2, 2));\n+ const vertexPositionBuffer = gl.createBuffer();\n+ \n+ gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);\n+ gl.bufferData(gl.ARRAY_BUFFER, vertexPosition, gl.STATIC_DRAW);\n\n```\nSetup position attribute\n\n📄 src/texture.js\n```diff\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, vertexPosition, gl.STATIC_DRAW);\n+ \n+ const attributeLocations = {\n+     position: gl.getAttribLocation(program, 'position'),\n+ };\n+ \n+ gl.enableVertexAttribArray(attributeLocations.position);\n+ gl.vertexAttribPointer(attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n\n```\nsetup index buffer\n\n📄 src/texture.js\n```diff\n  \n  gl.enableVertexAttribArray(attributeLocations.position);\n  gl.vertexAttribPointer(attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n+ \n+ const vertexIndices = new Uint8Array([0, 1, 2, 1, 2, 3]);\n+ const indexBuffer = gl.createBuffer();\n+ \n+ gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n+ gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, vertexIndices, gl.STATIC_DRAW);\n\n```\nand issue a draw call\n\n📄 src/texture.js\n```diff\n  \n  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n  gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, vertexIndices, gl.STATIC_DRAW);\n+ \n+ gl.drawElements(gl.TRIANGLES, vertexIndices.length, 
gl.UNSIGNED_BYTE, 0);\n\n```\nSo now we can proceed to textures.\n\nYou can upload an image to the GPU and use it to calculate pixel colors. In a simple case, when the canvas size is the same as (or at least proportional to) the image size, we can render the image pixel by pixel, reading each pixel color of the image and using it as `gl_FragColor`\n\nLet's make a helper to load images\n\n📄 src/gl-helpers.js\n```diff\n          throw new Error(log);\n      }\n  }\n+ \n+ export async function loadImage(src) {\n+     const img = new Image();\n+ \n+     let _resolve;\n+     const p = new Promise((resolve) =\u003e _resolve = resolve);\n+ \n+     img.onload = () =\u003e {\n+         _resolve(img);\n+     }\n+ \n+     img.src = src;\n+ \n+     return p;\n+ }\n\n```\nLoad the image and create a webgl texture\n\n📄 src/texture.js\n```diff\n  import vShaderSource from './shaders/texture.v.glsl';\n  import fShaderSource from './shaders/texture.f.glsl';\n- import { compileShader } from './gl-helpers';\n+ import { compileShader, loadImage } from './gl-helpers';\n  import { createRect } from './shape-helpers';\n  \n+ import textureImageSrc from '../assets/images/texture.jpg';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n  gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, vertexIndices, gl.STATIC_DRAW);\n  \n- gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n+ loadImage(textureImageSrc).then((textureImg) =\u003e {\n+     const texture = gl.createTexture();\n+ \n+     gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n+ });\n\n```\nAdd the image itself\n\n📄 assets/images/texture.jpg\n```jpg\n\n```\nwe also need an appropriate webpack loader\n\n📄 package.json\n```diff\n    \"homepage\": \"https://github.com/lesnitsky/webgl-month#readme\",\n    \"devDependencies\": {\n      \"raw-loader\": \"^3.0.0\",\n+     \"url-loader\": \"^2.0.1\",\n      \"webpack\": \"^4.35.2\",\n      
\"webpack-cli\": \"^3.3.5\"\n    }\n\n```\n📄 webpack.config.js\n```diff\n                  test: /\\.glsl$/,\n                  use: 'raw-loader',\n              },\n+ \n+             {\n+                 test: /\\.jpg$/,\n+                 use: 'url-loader',\n+             },\n          ],\n      },\n  \n\n```\nto operate with textures we need to do the same as with buffers – bind it\n\n📄 src/texture.js\n```diff\n  loadImage(textureImageSrc).then((textureImg) =\u003e {\n      const texture = gl.createTexture();\n  \n+     gl.bindTexture(gl.TEXTURE_2D, texture);\n+ \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nand upload image to a bound texture\n\n📄 src/texture.js\n```diff\n  \n      gl.bindTexture(gl.TEXTURE_2D, texture);\n  \n+     gl.texImage2D(\n+         gl.TEXTURE_2D,\n+     );\n+ \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nLet's ignore the 2nd argument for now, we'll speak about it later\n\n📄 src/texture.js\n```diff\n  \n      gl.texImage2D(\n          gl.TEXTURE_2D,\n+         0,\n      );\n  \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n\n```\nthe 3rd and the 4th argumetns specify internal texture format and source (image) format. For our image it is gl.RGBA. 
[Check out this page for more details about formats](https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/texImage2D)\n\n📄 src/texture.js\n```diff\n      gl.texImage2D(\n          gl.TEXTURE_2D,\n          0,\n+         gl.RGBA,\n+         gl.RGBA,\n      );\n  \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n\n```\nThe next argument specifies the source data type (`gl.UNSIGNED_BYTE` for values in the 0..255 range)\n\n📄 src/texture.js\n```diff\n          0,\n          gl.RGBA,\n          gl.RGBA,\n+         gl.UNSIGNED_BYTE,\n      );\n  \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n\n```\nand the image itself\n\n📄 src/texture.js\n```diff\n          gl.RGBA,\n          gl.RGBA,\n          gl.UNSIGNED_BYTE,\n+         textureImg,\n      );\n  \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n\n```\nWe also need to specify several texture parameters. We'll talk about these parameters in the next tutorials.\n\n📄 src/texture.js\n```diff\n          textureImg,\n      );\n  \n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);\n+ \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nTo be able to work with the texture in a shader we need to specify a uniform of `sampler2D` type\n\n📄 src/shaders/texture.f.glsl\n```diff\n  precision mediump float;\n  \n+ uniform sampler2D texture;\n+ \n  void main() {\n      gl_FragColor = vec4(1, 0, 0, 1);\n  }\n\n```\nand specify the value of this uniform. 
There is a way to use multiple textures, we'll talk about it in next tutorials\n\n📄 src/texture.js\n```diff\n      position: gl.getAttribLocation(program, 'position'),\n  };\n  \n+ const uniformLocations = {\n+     texture: gl.getUniformLocation(program, 'texture'),\n+ };\n+ \n  gl.enableVertexAttribArray(attributeLocations.position);\n  gl.vertexAttribPointer(attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n  \n      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);\n  \n+     gl.activeTexture(gl.TEXTURE0);\n+     gl.uniform1i(uniformLocations.texture, 0);\n+ \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nLet's also pass canvas resolution to a shader\n\n📄 src/shaders/texture.f.glsl\n```diff\n  precision mediump float;\n  \n  uniform sampler2D texture;\n+ uniform vec2 resolution;\n  \n  void main() {\n      gl_FragColor = vec4(1, 0, 0, 1);\n\n```\n📄 src/texture.js\n```diff\n  \n  const uniformLocations = {\n      texture: gl.getUniformLocation(program, 'texture'),\n+     resolution: gl.getUniformLocation(program, 'resolution'),\n  };\n  \n  gl.enableVertexAttribArray(attributeLocations.position);\n      gl.activeTexture(gl.TEXTURE0);\n      gl.uniform1i(uniformLocations.texture, 0);\n  \n+     gl.uniform2fv(uniformLocations.resolution, [canvas.width, canvas.height]);\n+ \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nThere is a special `gl_FragCoord` variable which contains coordinate of each pixel. Together with `resolution` uniform we can get a `texture coordinate` (coordinate of the pixel in image). 
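To make that mapping concrete, here is the same computation in plain JS (a sketch; the 800×600 resolution is just an assumed example, and gl_FragCoord's half-pixel offset is ignored for simplicity):

```javascript
// What the shader computes: gl_FragCoord.xy / resolution
// maps a pixel position into texture space
function toTexCoord(fragCoord, resolution) {
    return [fragCoord[0] / resolution[0], fragCoord[1] / resolution[1]];
}

const resolution = [800, 600]; // assumed canvas size

toTexCoord([0, 0], resolution);     // bottom-left corner → [0, 0]
toTexCoord([800, 600], resolution); // top-right corner → [1, 1]
toTexCoord([400, 300], resolution); // center → [0.5, 0.5]
```
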
Texture coordinates are in range `[0..1]`.\n\n📄 src/shaders/texture.f.glsl\n```diff\n  uniform vec2 resolution;\n  \n  void main() {\n+     vec2 texCoord = gl_FragCoord.xy / resolution;\n      gl_FragColor = vec4(1, 0, 0, 1);\n  }\n\n```\nand use `texture2D` to render the whole image.\n\n📄 src/shaders/texture.f.glsl\n```diff\n  \n  void main() {\n      vec2 texCoord = gl_FragCoord.xy / resolution;\n-     gl_FragColor = vec4(1, 0, 0, 1);\n+     gl_FragColor = texture2D(texture, texCoord);\n  }\n\n```\nCool 😎 We can now render images, but there is much more to learn about textures, so see you tomorrow\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## WebGL Month. Day 9. Image filters\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\nHey 👋 Welcome back to WebGL month\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-8-textures-1mk8) we learned how to use textures in webgl, so let's take advantage of that knowledge to build something fun.\n\nToday we're going to explore how to implement simple image filters\n\n\n### Inverse\n\nThe very first and simplest filter might be to invert all the colors of the image.\n\nHow do we invert colors?\n\nThe original values are in range `[0..1]`\n\nIf we subtract `1` from each component we'll get negative values; luckily, there's an `abs` function in glsl\n\nYou can also define functions other than `void main` in glsl, pretty much like in C/C++, so let's create an `inverse` function\n\n📄 src/shaders/texture.f.glsl\n```diff\n  uniform sampler2D texture;\n  uniform vec2 resolution;\n  \n+ vec4 inverse(vec4 color) {\n+     return abs(vec4(color.rgb - 1.0, color.a));\n+ }\n+ \n  void main() {\n      vec2 texCoord = gl_FragCoord.xy / resolution;\n      gl_FragColor = texture2D(texture, texCoord);\n\n```\nand let's actually use it\n\n📄 src/shaders/texture.f.glsl\n```diff\n  void main() {\n      vec2 texCoord = gl_FragCoord.xy / resolution;\n      gl_FragColor = texture2D(texture, texCoord);\n+ \n+     gl_FragColor = inverse(gl_FragColor);\n  }\n\n```\nVoila, we have an inverse filter with just 4 lines of code\n\n![Inverse](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/inverse-filter.png)\n\n\n### Black and White\n\nLet's think of how to implement a black and white filter.\n\nWhite color is `vec4(1, 1, 1, 1)`\n\nBlack is `vec4(0, 0, 0, 1)`\n\nWhat are shades of gray? 
Apparently we need to use the same value for each color component.\n\nSo basically we need to calculate the \"brightness\" value of each pixel. In a very naive implementation we can just add all color components and divide by 3 (the arithmetic mean).\n\n\u003e Note: this is not the best approach, as different colors will give the same result (e.g. vec3(0.5, 0, 0) and vec3(0, 0.5, 0)), but in reality these colors have different \"brightness\"; I'm just trying to keep these examples simple to understand\n\nOk, let's try to implement this\n\n📄 src/shaders/texture.f.glsl\n```diff\n      return abs(vec4(color.rgb - 1.0, color.a));\n  }\n  \n+ vec4 blackAndWhite(vec4 color) {\n+     return vec4(vec3(1.0, 1.0, 1.0) * (color.r + color.g + color.b) / 3.0, color.a);\n+ }\n+ \n  void main() {\n      vec2 texCoord = gl_FragCoord.xy / resolution;\n      gl_FragColor = texture2D(texture, texCoord);\n  \n-     gl_FragColor = inverse(gl_FragColor);\n+     gl_FragColor = blackAndWhite(gl_FragColor);\n  }\n\n```\nWhoa! Looks nice\n\n![Black and white](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/black-and-white.png)\n\n\n### Sepia\n\nOk, one more fancy effect is \"old-fashioned\" photos with a sepia filter.\n\n[Sepia is a reddish-brown color](https://en.wikipedia.org/wiki/Sepia_%28color%29). Its RGB values are `112, 66, 20`\n\n\nLet's define a `sepia` function and color\n\n📄 src/shaders/texture.f.glsl\n```diff\n      return vec4(vec3(1.0, 1.0, 1.0) * (color.r + color.g + color.b) / 3.0, color.a);\n  }\n  \n+ vec4 sepia(vec4 color) {\n+     vec3 sepiaColor = vec3(112, 66, 20) / 255.0;\n+ }\n+ \n  void main() {\n      vec2 texCoord = gl_FragCoord.xy / resolution;\n      gl_FragColor = texture2D(texture, texCoord);\n\n```\nA naive and simple implementation is to interpolate the original color with the sepia color by a certain factor. 
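That interpolation is just a per-component linear blend, which can be sketched in plain JS (the `0.4` factor below is only an illustrative choice):

```javascript
// Linear interpolation: blend a towards b by factor t (t = 0 → a, t = 1 → b).
// This is the same formula glsl's built-in mix(a, b, t) computes per component.
const mix = (a, b, t) => a * (1 - t) + b * t;

// Sepia color normalized to the [0..1] range, as in the shader
const sepiaColor = [112 / 255, 66 / 255, 20 / 255];

// Blend an [r, g, b] color towards sepia
function applySepia(rgb, factor = 0.4) {
    return rgb.map((channel, i) => mix(channel, sepiaColor[i], factor));
}

applySepia([1, 1, 1]); // pure white shifts towards reddish-brown
```
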
There is a `mix` function for this\n\n📄 src/shaders/texture.f.glsl\n```diff\n  \n  vec4 sepia(vec4 color) {\n      vec3 sepiaColor = vec3(112, 66, 20) / 255.0;\n+     return vec4(\n+         mix(color.rgb, sepiaColor, 0.4),\n+         color.a\n+     );\n  }\n  \n  void main() {\n      vec2 texCoord = gl_FragCoord.xy / resolution;\n      gl_FragColor = texture2D(texture, texCoord);\n  \n-     gl_FragColor = blackAndWhite(gl_FragColor);\n+     gl_FragColor = sepia(gl_FragColor);\n  }\n\n```\nResult:\n\n![Sepia](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/sepia.png)\n\n\nThis should give you a better idea of what can be done in fragment shader.\n\nTry to implement some other filters, like saturation or vibrance\n\nSee you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## WebGL Month. Day 10. Multiple textures\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋 Welcome back to WebGL month.\nWe already know how to use a single image as a texture, but what if we want to render multiple images?\n\nWe'll learn how to do this today.\n\n\nFirst we need to define another `sampler2D` in fragment shader\n\n📄 src/shaders/texture.f.glsl\n```diff\n  precision mediump float;\n  \n  uniform sampler2D texture;\n+ uniform sampler2D otherTexture;\n  uniform vec2 resolution;\n  \n  vec4 inverse(vec4 color) {\n\n```\nAnd render 2 rectangles instead of a single one. Left rectangle will use already existing texture, right – new one.\n\n📄 src/texture.js\n```diff\n  gl.linkProgram(program);\n  gl.useProgram(program);\n  \n- const vertexPosition = new Float32Array(createRect(-1, -1, 2, 2));\n+ const vertexPosition = new Float32Array([\n+     ...createRect(-1, -1, 1, 2), // left rect\n+     ...createRect(-1, 0, 1, 2), // right rect\n+ ]);\n  const vertexPositionBuffer = gl.createBuffer();\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);\n  gl.enableVertexAttribArray(attributeLocations.position);\n  gl.vertexAttribPointer(attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n  \n- const vertexIndices = new Uint8Array([0, 1, 2, 1, 2, 3]);\n+ const vertexIndices = new Uint8Array([\n+     // left rect\n+     0, 1, 2, \n+     1, 2, 3, \n+     \n+     // right rect\n+     4, 5, 6, \n+     5, 6, 7,\n+ ]);\n  const indexBuffer = gl.createBuffer();\n  \n  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n\n```\nWe'll also need a way to specify texture coordinates for each rectangle, as we can't use `gl_FragCoord` any longer, so we need to 
define another attribute (`texCoord`)\n\n📄 src/shaders/texture.v.glsl\n```diff\n  attribute vec2 position;\n+ attribute vec2 texCoord;\n  \n  void main() {\n      gl_Position = vec4(position, 0, 1);\n\n```\nThe content of this attribute should be coordinates of 2 rectangles. Top left is `0,0`, width and height are `1.0`\n\n📄 src/texture.js\n```diff\n  gl.linkProgram(program);\n  gl.useProgram(program);\n  \n+ const texCoords = new Float32Array([\n+     ...createRect(0, 0, 1, 1), // left rect\n+     ...createRect(0, 0, 1, 1), // right rect\n+ ]);\n+ const texCoordsBuffer = gl.createBuffer();\n+ \n+ gl.bindBuffer(gl.ARRAY_BUFFER, texCoordsBuffer);\n+ gl.bufferData(gl.ARRAY_BUFFER, texCoords, gl.STATIC_DRAW);\n+ \n  const vertexPosition = new Float32Array([\n      ...createRect(-1, -1, 1, 2), // left rect\n      ...createRect(-1, 0, 1, 2), // right rect\n\n```\nWe also need to setup texCoord attribute in JS\n\n📄 src/texture.js\n```diff\n  \n  const attributeLocations = {\n      position: gl.getAttribLocation(program, 'position'),\n+     texCoord: gl.getAttribLocation(program, 'texCoord'),\n  };\n  \n  const uniformLocations = {\n  gl.enableVertexAttribArray(attributeLocations.position);\n  gl.vertexAttribPointer(attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n  \n+ gl.bindBuffer(gl.ARRAY_BUFFER, texCoordsBuffer);\n+ \n+ gl.enableVertexAttribArray(attributeLocations.texCoord);\n+ gl.vertexAttribPointer(attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n+ \n  const vertexIndices = new Uint8Array([\n      // left rect\n      0, 1, 2, \n\n```\nand pass this data to fragment shader via varying\n\n📄 src/shaders/texture.f.glsl\n```diff\n      );\n  }\n  \n+ varying vec2 vTexCoord;\n+ \n  void main() {\n-     vec2 texCoord = gl_FragCoord.xy / resolution;\n+     vec2 texCoord = vTexCoord;\n      gl_FragColor = texture2D(texture, texCoord);\n  \n      gl_FragColor = sepia(gl_FragColor);\n\n```\n📄 src/shaders/texture.v.glsl\n```diff\n  attribute vec2 position;\n  
attribute vec2 texCoord;\n  \n+ varying vec2 vTexCoord;\n+ \n  void main() {\n      gl_Position = vec4(position, 0, 1);\n+ \n+     vTexCoord = texCoord;\n  }\n\n```\nOk, we rendered two rectangles, but they use the same texture. Let's add one more attribute which will specify which texture to use, and pass this data to the fragment shader via another varying\n\n📄 src/shaders/texture.v.glsl\n```diff\n  attribute vec2 position;\n  attribute vec2 texCoord;\n+ attribute float texIndex;\n  \n  varying vec2 vTexCoord;\n+ varying float vTexIndex;\n  \n  void main() {\n      gl_Position = vec4(position, 0, 1);\n  \n      vTexCoord = texCoord;\n+     vTexIndex = texIndex;\n  }\n\n```\nSo now the fragment shader will know which texture to use\n\n\u003e DISCLAIMER: this is not the perfect way to use multiple textures in a fragment shader, but rather an example of how to achieve this\n\n📄 src/shaders/texture.f.glsl\n```diff\n  }\n  \n  varying vec2 vTexCoord;\n+ varying float vTexIndex;\n  \n  void main() {\n      vec2 texCoord = vTexCoord;\n-     gl_FragColor = texture2D(texture, texCoord);\n  \n-     gl_FragColor = sepia(gl_FragColor);\n+     if (vTexIndex == 0.0) {\n+         gl_FragColor = texture2D(texture, texCoord);\n+     } else {\n+         gl_FragColor = texture2D(otherTexture, texCoord);\n+     }\n  }\n\n```\nTexture indices are `0` for the left rectangle and `1` for the right\n\n📄 src/texture.js\n```diff\n  gl.bindBuffer(gl.ARRAY_BUFFER, texCoordsBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, texCoords, gl.STATIC_DRAW);\n  \n+ const texIndicies = new Float32Array([\n+     ...Array.from({ length: 4 }).fill(0), // left rect\n+     ...Array.from({ length: 4 }).fill(1), // right rect\n+ ]);\n+ const texIndiciesBuffer = gl.createBuffer();\n+ \n+ gl.bindBuffer(gl.ARRAY_BUFFER, texIndiciesBuffer);\n+ gl.bufferData(gl.ARRAY_BUFFER, texIndicies, gl.STATIC_DRAW);\n+ \n  const vertexPosition = new Float32Array([\n      ...createRect(-1, -1, 1, 2), // left rect\n      ...createRect(-1, 0, 1, 2), // 
right rect\n\n```\nand again, we need to set up the vertex attribute\n\n📄 src/texture.js\n```diff\n  const attributeLocations = {\n      position: gl.getAttribLocation(program, 'position'),\n      texCoord: gl.getAttribLocation(program, 'texCoord'),\n+     texIndex: gl.getAttribLocation(program, 'texIndex'),\n  };\n  \n  const uniformLocations = {\n  gl.enableVertexAttribArray(attributeLocations.texCoord);\n  gl.vertexAttribPointer(attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n  \n+ gl.bindBuffer(gl.ARRAY_BUFFER, texIndiciesBuffer);\n+ \n+ gl.enableVertexAttribArray(attributeLocations.texIndex);\n+ gl.vertexAttribPointer(attributeLocations.texIndex, 1, gl.FLOAT, false, 0, 0);\n+ \n  const vertexIndices = new Uint8Array([\n      // left rect\n      0, 1, 2, \n\n```\nNow let's load our second texture image\n\n📄 src/texture.js\n```diff\n  import { createRect } from './shape-helpers';\n  \n  import textureImageSrc from '../assets/images/texture.jpg';\n+ import textureGreenImageSrc from '../assets/images/texture-green.jpg';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n  gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, vertexIndices, gl.STATIC_DRAW);\n  \n- loadImage(textureImageSrc).then((textureImg) =\u003e {\n+ Promise.all([\n+     loadImage(textureImageSrc),\n+     loadImage(textureGreenImageSrc),\n+ ]).then(([textureImg, textureGreenImg]) =\u003e {\n      const texture = gl.createTexture();\n  \n      gl.bindTexture(gl.TEXTURE_2D, texture);\n\n```\nAs we'll have to create another texture, let's extract the common code into separate helper functions\n\n📄 src/gl-helpers.js\n```diff\n  \n      return p;\n  }\n+ \n+ export function createTexture(gl) {\n+     const texture = gl.createTexture();\n+     \n+     gl.bindTexture(gl.TEXTURE_2D, texture);\n+     \n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n+     
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n+     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);\n+ \n+     return texture;\n+ }\n+ \n+ export function setImage(gl, texture, img) {\n+     gl.bindTexture(gl.TEXTURE_2D, texture);\n+ \n+     gl.texImage2D(\n+         gl.TEXTURE_2D,\n+         0,\n+         gl.RGBA,\n+         gl.RGBA,\n+         gl.UNSIGNED_BYTE,\n+         img,\n+     );\n+ }\n\n```\n📄 src/texture.js\n```diff\n      loadImage(textureImageSrc),\n      loadImage(textureGreenImageSrc),\n  ]).then(([textureImg, textureGreenImg]) =\u003e {\n-     const texture = gl.createTexture();\n- \n-     gl.bindTexture(gl.TEXTURE_2D, texture);\n- \n-     gl.texImage2D(\n-         gl.TEXTURE_2D,\n-         0,\n-         gl.RGBA,\n-         gl.RGBA,\n-         gl.UNSIGNED_BYTE,\n-         textureImg,\n-     );\n- \n-     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n-     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n-     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n-     gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);\n+ \n  \n      gl.activeTexture(gl.TEXTURE0);\n      gl.uniform1i(uniformLocations.texture, 0);\n\n```\nNow let's use our newly created helpers\n\n📄 src/texture.js\n```diff\n  import vShaderSource from './shaders/texture.v.glsl';\n  import fShaderSource from './shaders/texture.f.glsl';\n- import { compileShader, loadImage } from './gl-helpers';\n+ import { compileShader, loadImage, createTexture, setImage } from './gl-helpers';\n  import { createRect } from './shape-helpers';\n  \n  import textureImageSrc from '../assets/images/texture.jpg';\n      loadImage(textureImageSrc),\n      loadImage(textureGreenImageSrc),\n  ]).then(([textureImg, textureGreenImg]) =\u003e {\n+     const texture = createTexture(gl);\n+     setImage(gl, texture, 
textureImg);\n  \n+     const otherTexture = createTexture(gl);\n+     setImage(gl, otherTexture, textureGreenImg);\n  \n      gl.activeTexture(gl.TEXTURE0);\n      gl.uniform1i(uniformLocations.texture, 0);\n\n```\nget the uniform location\n\n📄 src/texture.js\n```diff\n  \n  const uniformLocations = {\n      texture: gl.getUniformLocation(program, 'texture'),\n+     otherTexture: gl.getUniformLocation(program, 'otherTexture'),\n      resolution: gl.getUniformLocation(program, 'resolution'),\n  };\n  \n\n```\nand bind each texture to the corresponding uniform\n\nto set a texture to a uniform you should:\n\n* set the active texture unit in the range `[gl.TEXTURE0..gl.TEXTURE31]` (the number of texture units depends on the GPU and can be retrieved with `gl.getParameter`)\n* bind the texture to a texture unit\n* set the texture unit \"index\" to a `sampler2D` uniform\n\n📄 src/texture.js\n```diff\n      setImage(gl, otherTexture, textureGreenImg);\n  \n      gl.activeTexture(gl.TEXTURE0);\n+     gl.bindTexture(gl.TEXTURE_2D, texture);\n      gl.uniform1i(uniformLocations.texture, 0);\n  \n+     gl.activeTexture(gl.TEXTURE1);\n+     gl.bindTexture(gl.TEXTURE_2D, otherTexture);\n+     gl.uniform1i(uniformLocations.otherTexture, 1);\n+ \n      gl.uniform2fv(uniformLocations.resolution, [canvas.width, canvas.height]);\n  \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n\n```\nThat's it, we can now render multiple textures\n\nSee you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[Subscribe](https://twitter.com/lesnitsky_a) for updates or [join mailing list](http://eepurl.com/gwiSeH)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day-9)\n\n\u003e Built with [GitTutor](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 11. 
Reducing boilerplate\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-10-multiple-textures-gf3) we learned how to use multiple textures. This required changes to both the shader and the javascript, but some of these changes can be automated\n\nThere is a package [glsl-extract-sync](https://www.npmjs.com/package/glsl-extract-sync) which can extract info about shader attributes and uniforms\n\n\nInstall this package with\n\n```sh\nnpm i glsl-extract-sync\n```\n\n📄 package.json\n```diff\n      \"url-loader\": \"^2.0.1\",\n      \"webpack\": \"^4.35.2\",\n      \"webpack-cli\": \"^3.3.5\"\n+   },\n+   \"dependencies\": {\n+     \"glsl-extract-sync\": \"0.0.0\"\n    }\n  }\n\n```\nNow let's create a helper function which will get all references to attributes and uniforms with the help of this package\n\n📄 src/gl-helpers.js\n```diff\n+ import extract from 'glsl-extract-sync';\n+ \n  export function compileShader(gl, shader, source) {\n      gl.shaderSource(shader, source);\n      gl.compileShader(shader);\n          img,\n      );\n  }\n+ \n+ export function setupShaderInput(gl, program, vShaderSource, fShaderSource) {\n+ \n+ }\n\n```\nWe need to extract info about both vertex and fragment shaders\n\n📄 src/gl-helpers.js\n```diff\n  }\n  \n  export function setupShaderInput(gl, program, vShaderSource, fShaderSource) {\n- \n+   
 const vShaderInfo = extract(vShaderSource);\n+     const fShaderInfo = extract(fShaderSource);\n  }\n\n```\n📄 src/texture.js\n```diff\n  import vShaderSource from './shaders/texture.v.glsl';\n  import fShaderSource from './shaders/texture.f.glsl';\n- import { compileShader, loadImage, createTexture, setImage } from './gl-helpers';\n+ import { compileShader, loadImage, createTexture, setImage, setupShaderInput } from './gl-helpers';\n  import { createRect } from './shape-helpers';\n  \n  import textureImageSrc from '../assets/images/texture.jpg';\n  gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, vertexPosition, gl.STATIC_DRAW);\n  \n+ console.log(setupShaderInput(gl, program, vShaderSource, fShaderSource));\n+ \n  const attributeLocations = {\n      position: gl.getAttribLocation(program, 'position'),\n      texCoord: gl.getAttribLocation(program, 'texCoord'),\n\n```\nOnly vertex shader might have attributes, but uniforms may be defined in both shaders\n\n📄 src/gl-helpers.js\n```diff\n  export function setupShaderInput(gl, program, vShaderSource, fShaderSource) {\n      const vShaderInfo = extract(vShaderSource);\n      const fShaderInfo = extract(fShaderSource);\n+ \n+     const attributes = vShaderInfo.attributes;\n+     const uniforms = [\n+         ...vShaderInfo.uniforms,\n+         ...fShaderInfo.uniforms,\n+     ];\n  }\n\n```\nNow we can get all attribute locations\n\n📄 src/gl-helpers.js\n```diff\n          ...vShaderInfo.uniforms,\n          ...fShaderInfo.uniforms,\n      ];\n+ \n+     const attributeLocations = attributes.reduce((attrsMap, attr) =\u003e {\n+         attrsMap[attr.name] = gl.getAttribLocation(program, attr.name);\n+         return attrsMap;\n+     }, {});\n  }\n\n```\nand enable all attributes\n\n📄 src/gl-helpers.js\n```diff\n          attrsMap[attr.name] = gl.getAttribLocation(program, attr.name);\n          return attrsMap;\n      }, {});\n+ \n+     attributes.forEach((attr) =\u003e {\n+         
gl.enableVertexAttribArray(attributeLocations[attr.name]);\n+     });\n  }\n\n```\nWe should also get all uniform locations\n\n📄 src/gl-helpers.js\n```diff\n      attributes.forEach((attr) =\u003e {\n          gl.enableVertexAttribArray(attributeLocations[attr.name]);\n      });\n+ \n+     const uniformLocations = uniforms.reduce((uniformsMap, uniform) =\u003e {\n+         uniformsMap[uniform.name] = gl.getUniformLocation(program, uniform.name);\n+         return uniformsMap;\n+     }, {});\n  }\n\n```\nand finally return attribute and uniform locations\n\n📄 src/gl-helpers.js\n```diff\n          uniformsMap[uniform.name] = gl.getUniformLocation(program, uniform.name);\n          return uniformsMap;\n      }, {});\n+ \n+     return {\n+         attributeLocations,\n+         uniformLocations,\n+     }\n  }\n\n```\nOk, let's take advantage of our new sweet helper\n\n📄 src/texture.js\n```diff\n  gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);\n  gl.bufferData(gl.ARRAY_BUFFER, vertexPosition, gl.STATIC_DRAW);\n  \n- console.log(setupShaderInput(gl, program, vShaderSource, fShaderSource));\n+ const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const attributeLocations = {\n-     position: gl.getAttribLocation(program, 'position'),\n-     texCoord: gl.getAttribLocation(program, 'texCoord'),\n-     texIndex: gl.getAttribLocation(program, 'texIndex'),\n- };\n- \n- const uniformLocations = {\n-     texture: gl.getUniformLocation(program, 'texture'),\n-     otherTexture: gl.getUniformLocation(program, 'otherTexture'),\n-     resolution: gl.getUniformLocation(program, 'resolution'),\n- };\n- \n- gl.enableVertexAttribArray(attributeLocations.position);\n- gl.vertexAttribPointer(attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, texCoordsBuffer);\n- \n- 
gl.enableVertexAttribArray(attributeLocations.texCoord);\n- gl.vertexAttribPointer(attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, texIndiciesBuffer);\n- \n- gl.enableVertexAttribArray(attributeLocations.texIndex);\n- gl.vertexAttribPointer(attributeLocations.texIndex, 1, gl.FLOAT, false, 0, 0);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.texIndex, 1, gl.FLOAT, false, 0, 0);\n  \n  const vertexIndices = new Uint8Array([\n      // left rect\n  \n      gl.activeTexture(gl.TEXTURE0);\n      gl.bindTexture(gl.TEXTURE_2D, texture);\n-     gl.uniform1i(uniformLocations.texture, 0);\n+     gl.uniform1i(programInfo.uniformLocations.texture, 0);\n  \n      gl.activeTexture(gl.TEXTURE1);\n      gl.bindTexture(gl.TEXTURE_2D, otherTexture);\n-     gl.uniform1i(uniformLocations.otherTexture, 1);\n+     gl.uniform1i(programInfo.uniformLocations.otherTexture, 1);\n  \n-     gl.uniform2fv(uniformLocations.resolution, [canvas.width, canvas.height]);\n+     gl.uniform2fv(programInfo.uniformLocations.resolution, [canvas.width, canvas.height]);\n  \n      gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nLooks quite like a cleanup 😎\n\n\nOne more thing we use often is buffers.\nLet's create a helper class\n\n📄 src/GLBuffer.js\n```js\nexport class GLBuffer {\n    constructor(gl, target, data) {\n\n    }\n}\n\n```\nWe'll need the data, the buffer target and the actual gl buffer, so let's assign everything passed from outside and create a gl buffer.\n\n📄 src/GLBuffer.js\n```diff\n  export class GLBuffer {\n      constructor(gl, target, data) {\n- \n+         this.target = target;\n+         this.data = data;\n+         this.glBuffer = gl.createBuffer();\n      }\n  }\n\n```\nWe didn't assign `gl` to the instance because it might cause a memory leak, so we'll need to pass it from outside\n\n\nLet's implement 
an alternative to a `gl.bindBuffer`\n\n📄 src/GLBuffer.js\n```diff\n          this.data = data;\n          this.glBuffer = gl.createBuffer();\n      }\n+ \n+     bind(gl) {\n+         gl.bindBuffer(this.target, this.glBuffer);\n+     }\n  }\n\n```\nand a convenient way to set buffer data\n\n📄 src/GLBuffer.js\n```diff\n      bind(gl) {\n          gl.bindBuffer(this.target, this.glBuffer);\n      }\n+ \n+     setData(gl, data, usage) {\n+         this.data = data;\n+         this.bind(gl);\n+         gl.bufferData(this.target, this.data, usage);\n+     }\n  }\n\n```\nNow let's make the `data` constructor argument optional and add a `usage` argument, so that a single constructor call can do everything we need\n\n📄 src/GLBuffer.js\n```diff\n  export class GLBuffer {\n-     constructor(gl, target, data) {\n+     constructor(gl, target, data, usage) {\n          this.target = target;\n          this.data = data;\n          this.glBuffer = gl.createBuffer();\n+ \n+         if (typeof data !== 'undefined') {\n+             this.setData(gl, data, usage);\n+         }\n      }\n  \n      bind(gl) {\n\n```\nCool, now we can replace the texCoords buffer with our thin wrapper\n\n📄 src/texture.js\n```diff\n  \n  import textureImageSrc from '../assets/images/texture.jpg';\n  import textureGreenImageSrc from '../assets/images/texture-green.jpg';\n+ import { GLBuffer } from './GLBuffer';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.linkProgram(program);\n  gl.useProgram(program);\n  \n- const texCoords = new Float32Array([\n+ const texCoordsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array([\n      ...createRect(0, 0, 1, 1), // left rect\n      ...createRect(0, 0, 1, 1), // right rect\n- ]);\n- const texCoordsBuffer = gl.createBuffer();\n- \n- gl.bindBuffer(gl.ARRAY_BUFFER, texCoordsBuffer);\n- gl.bufferData(gl.ARRAY_BUFFER, texCoords, gl.STATIC_DRAW);\n+ ]), gl.STATIC_DRAW);\n  \n  const texIndicies = new Float32Array([\n 
     ...Array.from({ length: 4 }).fill(0), // left rect\n  \n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n  \n- gl.bindBuffer(gl.ARRAY_BUFFER, texCoordsBuffer);\n+ texCoordsBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n  \n  gl.bindBuffer(gl.ARRAY_BUFFER, texIndiciesBuffer);\n\n```\nDo the same for texIndices buffer\n\n📄 src/texture.js\n```diff\n      ...createRect(0, 0, 1, 1), // right rect\n  ]), gl.STATIC_DRAW);\n  \n- const texIndicies = new Float32Array([\n+ const texIndiciesBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array([\n      ...Array.from({ length: 4 }).fill(0), // left rect\n      ...Array.from({ length: 4 }).fill(1), // right rect\n- ]);\n- const texIndiciesBuffer = gl.createBuffer();\n- \n- gl.bindBuffer(gl.ARRAY_BUFFER, texIndiciesBuffer);\n- gl.bufferData(gl.ARRAY_BUFFER, texIndicies, gl.STATIC_DRAW);\n+ ]), gl.STATIC_DRAW);\n  \n  const vertexPosition = new Float32Array([\n      ...createRect(-1, -1, 1, 2), // left rect\n  texCoordsBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n  \n- gl.bindBuffer(gl.ARRAY_BUFFER, texIndiciesBuffer);\n+ texIndiciesBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.texIndex, 1, gl.FLOAT, false, 0, 0);\n  \n  const vertexIndices = new Uint8Array([\n\n```\nvertex positions\n\n📄 src/texture.js\n```diff\n      ...Array.from({ length: 4 }).fill(1), // right rect\n  ]), gl.STATIC_DRAW);\n  \n- const vertexPosition = new Float32Array([\n+ const vertexPositionBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array([\n      ...createRect(-1, -1, 1, 2), // left rect\n      ...createRect(-1, 0, 1, 2), // right rect\n- ]);\n- const vertexPositionBuffer = gl.createBuffer();\n+ ]), gl.STATIC_DRAW);\n  \n- gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);\n- gl.bufferData(gl.ARRAY_BUFFER, vertexPosition, 
gl.STATIC_DRAW);\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n+ vertexPositionBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n  \n  texCoordsBuffer.bind(gl);\n\n```\nand index buffer\n\n📄 src/texture.js\n```diff\n  texIndiciesBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.texIndex, 1, gl.FLOAT, false, 0, 0);\n  \n- const vertexIndices = new Uint8Array([\n+ const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, new Uint8Array([\n      // left rect\n      0, 1, 2, \n      1, 2, 3, \n      // right rect\n      4, 5, 6, \n      5, 6, 7,\n- ]);\n- const indexBuffer = gl.createBuffer();\n- \n- gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, indexBuffer);\n- gl.bufferData(gl.ELEMENT_ARRAY_BUFFER, vertexIndices, gl.STATIC_DRAW);\n+ ]), gl.STATIC_DRAW);\n  \n  Promise.all([\n      loadImage(textureImageSrc),\n  \n      gl.uniform2fv(programInfo.uniformLocations.resolution, [canvas.width, canvas.height]);\n  \n-     gl.drawElements(gl.TRIANGLES, vertexIndices.length, gl.UNSIGNED_BYTE, 0);\n+     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nNow we can work with shaders more productively and with less code!\n\nSee you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 12. 
Highdpi displays and webgl viewport\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\nHey 👋 Welcome back to WebGL month\n\nAll previous tutorials were done on a default-size canvas, let's make the picture bigger!\n\n\nWe'll need to tweak the css a bit first to make the body fill the screen\n\n📄 index.html\n```diff\n      \u003cmeta name=\"viewport\" content=\"width=device-width, initial-scale=1.0\" /\u003e\n      \u003cmeta http-equiv=\"X-UA-Compatible\" content=\"ie=edge\" /\u003e\n      \u003ctitle\u003eWebGL Month\u003c/title\u003e\n+ \n+     \u003cstyle\u003e\n+     html, body {\n+       height: 100%;\n+     }\n+ \n+     body {\n+       margin: 0;\n+     }\n+     \u003c/style\u003e\n    \u003c/head\u003e\n    \u003cbody\u003e\n      \u003ccanvas\u003e\u003c/canvas\u003e\n\n```\nNow we can read body dimensions\n\n📄 src/texture.js\n```diff\n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n+ const width = document.body.offsetWidth;\n+ const height = document.body.offsetHeight;\n+ \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n\n```\nAnd set canvas dimensions\n\n📄 src/texture.js\n```diff\n  const width = document.body.offsetWidth;\n  const height = document.body.offsetHeight;\n  \n+ canvas.width = width;\n+ canvas.height = height;\n+ \n  const vShader = 
gl.createShader(gl.VERTEX_SHADER);\n  const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n\n```\nOk, canvas size changed, but our picture isn't full screen, why?\n\n\nTurns out that changing the canvas size isn't enough, we also need to specify a viewport. Treat the viewport as a rectangle which will be used as the drawing area and mapped to the `[-1...1]` clipspace\n\n📄 src/texture.js\n```diff\n  \n      gl.uniform2fv(programInfo.uniformLocations.resolution, [canvas.width, canvas.height]);\n  \n+     gl.viewport(0, 0, canvas.width, canvas.height);\n+ \n      gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nNow our picture fills the whole document, but it is a bit blurry. The obvious reason: our texture is not big enough, so it has to be stretched and loses quality. That's correct, but there is another reason.\n\n\nModern displays fit a higher number of actual pixels into the same physical size (Apple calls it Retina). There is a global variable `devicePixelRatio` which might help us.\n\n📄 src/texture.js\n```diff\n  const width = document.body.offsetWidth;\n  const height = document.body.offsetHeight;\n  \n- canvas.width = width;\n- canvas.height = height;\n+ canvas.width = width * devicePixelRatio;\n+ canvas.height = height * devicePixelRatio;\n  \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n\n```\nOk, now our canvas has an appropriate size, but it is bigger than the body on retina displays. 
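To make the numbers concrete, here is a tiny pure helper (just a sketch for illustration, not part of the tutorial's code) that computes the drawing buffer size from the layout size:\n\n```js\n// returns the size the drawing buffer should have for a given layout size\n// (dpr is devicePixelRatio, e.g. 2 on many retina displays)\nfunction getDrawingBufferSize(layoutWidth, layoutHeight, dpr) {\n    return {\n        width: layoutWidth * dpr,\n        height: layoutHeight * dpr,\n    };\n}\n\nconst size = getDrawingBufferSize(800, 600, 2);\n// size.width === 1600, size.height === 1200: twice the layout size per side\n```\n\nSo on a `devicePixelRatio === 2` display the canvas element now takes up twice as much layout space per side as the body gives it.\n\n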
How do we fix it?\nWe can downscale the canvas to its physical size with the css `width` and `height` properties\n\n📄 src/texture.js\n```diff\n  canvas.width = width * devicePixelRatio;\n  canvas.height = height * devicePixelRatio;\n  \n+ canvas.style.width = `${width}px`;\n+ canvas.style.height = `${height}px`;\n+ \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n\n```\nJust to summarize: the `width` and `height` attributes of the canvas specify the actual size in pixels, but to make the picture sharp on highdpi displays we need to multiply width and height by `devicePixelRatio` and downscale the canvas back with css\n\n\nNow we can also make our canvas resizable\n\n📄 src/texture.js\n```diff\n  \n      gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n  });\n+ \n+ \n+ window.addEventListener('resize', () =\u003e {\n+     const width = document.body.offsetWidth;\n+     const height = document.body.offsetHeight;\n+ \n+     canvas.width = width * devicePixelRatio;\n+     canvas.height = height * devicePixelRatio;\n+ \n+     canvas.style.width = `${width}px`;\n+     canvas.style.height = `${height}px`;\n+ \n+     gl.viewport(0, 0, canvas.width, canvas.height);\n+ });\n\n```\nOops, canvas clears after resize. 
Turns out that modifying the `width` or `height` attribute forces the browser to clear the canvas (the same goes for the `2d` context), so we need to issue a draw call again.\n\n📄 src/texture.js\n```diff\n      canvas.style.height = `${height}px`;\n  \n      gl.viewport(0, 0, canvas.width, canvas.height);\n+ \n+     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n  });\n\n```\nThat's it for today, see you tomorrow 👋\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day12)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day12)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n## Day 13. Simple animation\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\nHey 👋 Welcome to WebGL month.\n\nAll previous tutorials were based on static images, let's add some motion!\n\n\nWe'll need a simple vertex shader\n\n📄 src/shaders/rotating-square.v.glsl\n```glsl\nattribute vec2 position;\nuniform vec2 resolution;\n\nvoid main() {\n    gl_Position = vec4(position / resolution * 2.0 - 1.0, 0, 1);\n}\n\n```\nfragment shader\n\n📄 src/shaders/rotating-square.f.glsl\n```glsl\nprecision mediump float;\n\nvoid main() {\n    gl_FragColor = vec4(1, 0, 0, 1);\n}\n\n```\nNew entry point\n\n📄 index.html\n```diff\n    \u003c/head\u003e\n    \u003cbody\u003e\n      \u003ccanvas\u003e\u003c/canvas\u003e\n-     \u003cscript src=\"./dist/texture.js\"\u003e\u003c/script\u003e\n+     \u003cscript src=\"./dist/rotating-square.js\"\u003e\u003c/script\u003e\n    \u003c/body\u003e\n  \u003c/html\u003e\n\n```\n📄 src/rotating-square.js\n```js\nimport vShaderSource from './shaders/rotating-square.v.glsl';\nimport fShaderSource from './shaders/rotating-square.f.glsl';\n\n```\n📄 webpack.config.js\n```diff\n      entry: {\n          'week-1': './src/week-1.js',\n          'texture': './src/texture.js',\n+         'rotating-square': './src/rotating-square.js',\n      },\n  \n      output: {\n\n```\nGet WebGL context\n\n📄 src/rotating-square.js\n```diff\n  import vShaderSource from './shaders/rotating-square.v.glsl';\n  import fShaderSource from './shaders/rotating-square.f.glsl';\n+ 
\n+ const canvas = document.querySelector('canvas');\n+ const gl = canvas.getContext('webgl');\n+ \n\n```\nMake canvas fullscreen\n\n📄 src/rotating-square.js\n```diff\n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n+ const width = document.body.offsetWidth;\n+ const height = document.body.offsetHeight;\n+ \n+ canvas.width = width * devicePixelRatio;\n+ canvas.height = height * devicePixelRatio;\n+ \n+ canvas.style.width = `${width}px`;\n+ canvas.style.height = `${height}px`;\n\n```\nCreate shaders\n\n📄 src/rotating-square.js\n```diff\n  import vShaderSource from './shaders/rotating-square.v.glsl';\n  import fShaderSource from './shaders/rotating-square.f.glsl';\n+ import { compileShader } from './gl-helpers';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  canvas.style.width = `${width}px`;\n  canvas.style.height = `${height}px`;\n+ \n+ const vShader = gl.createShader(gl.VERTEX_SHADER);\n+ const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ compileShader(gl, vShader, vShaderSource);\n+ compileShader(gl, fShader, fShaderSource);\n\n```\nCreate program\n\n📄 src/rotating-square.js\n```diff\n  \n  compileShader(gl, vShader, vShaderSource);\n  compileShader(gl, fShader, fShaderSource);\n+ \n+ const program = gl.createProgram();\n+ \n+ gl.attachShader(program, vShader);\n+ gl.attachShader(program, fShader);\n+ \n+ gl.linkProgram(program);\n+ gl.useProgram(program);\n\n```\nGet attribute and uniform locations\n\n📄 src/rotating-square.js\n```diff\n  import vShaderSource from './shaders/rotating-square.v.glsl';\n  import fShaderSource from './shaders/rotating-square.f.glsl';\n- import { compileShader } from './gl-helpers';\n+ import { setupShaderInput, compileShader } from './gl-helpers';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  gl.linkProgram(program);\n  gl.useProgram(program);\n+ \n+ const 
programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n\n```\nCreate vertices to draw a square\n\n📄 src/rotating-square.js\n```diff\n  import vShaderSource from './shaders/rotating-square.v.glsl';\n  import fShaderSource from './shaders/rotating-square.f.glsl';\n  import { setupShaderInput, compileShader } from './gl-helpers';\n+ import { createRect } from './shape-helpers';\n+ import { GLBuffer } from './GLBuffer';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.useProgram(program);\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n+ \n+ const vertexPositionBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array([\n+     ...createRect(canvas.width / 2 - 100, canvas.height / 2 - 100, 200, 200),\n+ ]), gl.STATIC_DRAW);\n\n```\nSetup attribute pointer\n\n📄 src/rotating-square.js\n```diff\n  const vertexPositionBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array([\n      ...createRect(canvas.width / 2 - 100, canvas.height / 2 - 100, 200, 200),\n  ]), gl.STATIC_DRAW);\n+ \n+ gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n\n```\nCreate index buffer\n\n📄 src/rotating-square.js\n```diff\n  ]), gl.STATIC_DRAW);\n  \n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n+ \n+ const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, new Uint8Array([\n+     0, 1, 2, \n+     1, 2, 3, \n+ ]), gl.STATIC_DRAW);\n\n```\nPass resolution and setup viewport\n\n📄 src/rotating-square.js\n```diff\n      0, 1, 2, \n      1, 2, 3, \n  ]), gl.STATIC_DRAW);\n+ \n+ gl.uniform2fv(programInfo.uniformLocations.resolution, [canvas.width, canvas.height]);\n+ \n+ gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nAnd finally issue a draw call\n\n📄 src/rotating-square.js\n```diff\n  gl.uniform2fv(programInfo.uniformLocations.resolution, [canvas.width, canvas.height]);\n  \n  
gl.viewport(0, 0, canvas.width, canvas.height);\n+ gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n\n```\nNow let's think about how we can rotate this square\n\nActually, we can fit it in a circle: each vertex position can be calculated from `radius`, `cos` and `sin`, and all we'll need to do is add some delta angle to each vertex\n\n![Rotation](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rotation.png)\n\n\nLet's refactor our createRect helper to take the angle into account\n\n📄 src/rotating-square.js\n```diff\n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n  const vertexPositionBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array([\n-     ...createRect(canvas.width / 2 - 100, canvas.height / 2 - 100, 200, 200),\n+     ...createRect(canvas.width / 2 - 100, canvas.height / 2 - 100, 200, 200, 0),\n  ]), gl.STATIC_DRAW);\n  \n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n\n```\n📄 src/shape-helpers.js\n```diff\n- export function createRect(top, left, width, height) {\n+ const Pi_4 = Math.PI / 4;\n+ \n+ export function createRect(top, left, width, height, angle = 0) {\n+     const centerX = width / 2;\n+     const centerY = height / 2;\n+ \n+     const diagonalLength = Math.sqrt(centerX ** 2 + centerY ** 2);\n+ \n+     const x1 = centerX + diagonalLength * Math.cos(angle + Pi_4);\n+     const y1 = centerY + diagonalLength * Math.sin(angle + Pi_4);\n+ \n+     const x2 = centerX + diagonalLength * Math.cos(angle + Pi_4 * 3);\n+     const y2 = centerY + diagonalLength * Math.sin(angle + Pi_4 * 3);\n+ \n+     const x3 = centerX + diagonalLength * Math.cos(angle - Pi_4);\n+     const y3 = centerY + diagonalLength * Math.sin(angle - Pi_4);\n+ \n+     const x4 = centerX + diagonalLength * Math.cos(angle - Pi_4 * 3);\n+     const y4 = centerY + diagonalLength * Math.sin(angle - Pi_4 * 3);\n+ \n      return [\n-         left, top, // x1 y1\n-         left 
+ width, top, // x2 y2\n-         left, top + height, // x3 y3\n-         left + width, top + height, // x4 y4\n+         x1 + left, y1 + top,\n+         x2 + left, y2 + top,\n+         x3 + left, y3 + top,\n+         x4 + left, y4 + top,\n      ];\n  }\n  \n\n```\nNow we need to define an initial angle\n\n📄 src/rotating-square.js\n```diff\n  gl.uniform2fv(programInfo.uniformLocations.resolution, [canvas.width, canvas.height]);\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n- gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n+ \n+ let angle = 0;\n\n```\nand a function which will be called each frame\n\n📄 src/rotating-square.js\n```diff\n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  let angle = 0;\n+ \n+ function frame() {\n+     requestAnimationFrame(frame);\n+ }\n+ \n+ frame();\n\n```\nEach frame WebGL just goes through the vertex data and renders it. In order to make it render something different we need to update this data\n\n📄 src/rotating-square.js\n```diff\n  let angle = 0;\n  \n  function frame() {\n+     vertexPositionBuffer.setData(\n+         gl, \n+         new Float32Array(\n+             createRect(canvas.width / 2 - 100, canvas.height / 2 - 100, 200, 200, angle)\n+         ), \n+         gl.STATIC_DRAW,\n+     );\n+ \n      requestAnimationFrame(frame);\n  }\n  \n\n```\nWe also need to update the rotation angle each frame\n\n📄 src/rotating-square.js\n```diff\n          gl.STATIC_DRAW,\n      );\n  \n+     angle += Math.PI / 60;\n+ \n      requestAnimationFrame(frame);\n  }\n  \n\n```\nand issue a draw call\n\n📄 src/rotating-square.js\n```diff\n  \n      angle += Math.PI / 60;\n  \n+     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n      requestAnimationFrame(frame);\n  }\n  \n\n```\nCool! We now have a rotating square! 
🎉\n\n![Rotating circle gif](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rotation.gif)\n\n\nWhat we've just done could be simplified with [rotation matrix](https://en.wikipedia.org/wiki/Rotation_matrix)\n\n\nDon't worry if you're not fluent in linear algebra, me neither, there is a special package 😉\n\n📄 package.json\n```diff\n      \"webpack-cli\": \"^3.3.5\"\n    },\n    \"dependencies\": {\n+     \"gl-matrix\": \"^3.0.0\",\n      \"glsl-extract-sync\": \"0.0.0\"\n    }\n  }\n\n```\nWe'll need to define a rotation matrix uniform\n\n📄 src/shaders/rotating-square.v.glsl\n```diff\n  attribute vec2 position;\n  uniform vec2 resolution;\n  \n+ uniform mat2 rotationMatrix;\n+ \n  void main() {\n      gl_Position = vec4(position / resolution * 2.0 - 1.0, 0, 1);\n  }\n\n```\nAnd multiply vertex positions\n\n📄 src/shaders/rotating-square.v.glsl\n```diff\n  uniform mat2 rotationMatrix;\n  \n  void main() {\n-     gl_Position = vec4(position / resolution * 2.0 - 1.0, 0, 1);\n+     gl_Position = vec4((position / resolution * 2.0 - 1.0) * rotationMatrix, 0, 1);\n  }\n\n```\nNow we can get rid of vertex position updates\n\n📄 src/rotating-square.js\n```diff\n  import { setupShaderInput, compileShader } from './gl-helpers';\n  import { createRect } from './shape-helpers';\n  import { GLBuffer } from './GLBuffer';\n+ import { mat2 } from 'gl-matrix';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n- let angle = 0;\n+ const rotationMatrix = mat2.create();\n  \n  function frame() {\n-     vertexPositionBuffer.setData(\n-         gl, \n-         new Float32Array(\n-             createRect(canvas.width / 2 - 100, canvas.height / 2 - 100, 200, 200, angle)\n-         ), \n-         gl.STATIC_DRAW,\n-     );\n- \n-     angle += Math.PI / 60;\n  \n      gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n      
requestAnimationFrame(frame);\n\n```\nand use the rotation matrix instead\n\n📄 src/rotating-square.js\n```diff\n  const rotationMatrix = mat2.create();\n  \n  function frame() {\n+     gl.uniformMatrix2fv(programInfo.uniformLocations.rotationMatrix, false, rotationMatrix);\n+ \n+     mat2.rotate(rotationMatrix, rotationMatrix, -Math.PI / 60);\n  \n      gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n      requestAnimationFrame(frame);\n\n```\n### Conclusion\n\nWhat seemed like complex math in our shape helper refactor turned out to be easily doable with matrix math. The GPU performs matrix multiplication very fast (it has special hardware-level optimisations for this kind of operation), so a lot of transformations can be done with a transform matrix. This is a very important concept, especially in the 3d rendering world.\n\nThat's it for today, see you tomorrow! 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 14. Intro to 3d\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋 Welcome to WebGL month. Today we're going to explore some important topics before starting to work with 3D.\n\nLet's talk about projections first. As you might know, a point in 2d space has a projection on the X and Y axes.\n\nIn 3d space we can project a point not only on the axes, but also on a plane\n\nThis is how a point in space will be projected on a plane\n\n![Point projection](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/projection-point.jpg)\n\nA display is also a flat plane, so basically every point in 3d space can be projected on it.\n\nAs we know, WebGL can render only triangles, so every 3d object should be built from a lot of triangles. The more triangles an object consists of, the more precise the object will look.\n\nThat's how a triangle in 3d space can be projected on a plane\n\n![Triangle projection](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/projection-triangle.jpg)\n\nNotice that on the plane the triangle looks a bit different, because the vertices of the triangle are not in a plane parallel to the one we project the triangle on (it is rotated).\n\nYou can build 3D models in editors like [Blender](https://www.blender.org/) or [3ds Max](https://www.autodesk.com/products/3ds-max/overview). There are special file formats which describe 3d objects, so in order to render these objects we'll need to parse these files and build triangles. 
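To build some intuition for how a point lands on a plane, here is a minimal hand-rolled sketch (not part of the tutorial's source code — `gl-matrix` does this properly with a projection matrix later on). The `projectPoint` helper and the plane distance `d` are illustrative assumptions:

```javascript
// Project a 3d point onto the plane z = d, with the "eye" at the origin.
// Similar triangles give: x' = x * d / z, y' = y * d / z.
function projectPoint([x, y, z], d = 1) {
    return [(x * d) / z, (y * d) / z];
}

// Two points with the same x/y but different depth project to different
// positions: the farther point lands closer to the center of the plane.
console.log(projectPoint([2, 2, 2])); // [1, 1]
console.log(projectPoint([2, 2, 4])); // [0.5, 0.5]
```

The division by `z` is exactly what produces the perspective effect discussed in the next section: the larger the depth, the smaller the projected coordinates.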
We'll explore how to do this in upcoming tutorials.\n\nOne more important concept of 3d is perspective. Farther objects seem smaller\n\nImagine we're looking at some objects from the top\n\n![Perspective](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/perspective.jpg)\n\nNotice that the projected faces of the cubes are different in size (the bottom one is larger than the top one) because of perspective.\n\nAnother variable in this complex \"how to render 3d\" equation is the field of view (how wide the camera lens is, i.e. the max angle visible to the camera) and how much of the scene fits into the \"camera lens\".\n\nTaking into account all these specifics seems like a lot of work to do, but luckily this work was already done, and that's where we need linear algebra and matrix multiplication. Again, if you're not fluent in linear algebra – don't worry, there is an awesome package [gl-matrix](http://glmatrix.net/) which will help you with all this stuff.\n\nIt turns out that in order to get the 2d screen coordinates of a point in 3d space we just need to multiply the [homogeneous coordinates](https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/WebGL_model_view_projection#Homogeneous_Coordinates) of the point by a special \"projection matrix\". This matrix describes the field of view and the near and far bounds of the [camera frustum](https://en.wikipedia.org/wiki/Viewing_frustum) (the region of space in the modeled world which may appear on the screen). Perspective projection looks more realistic, because it takes into account the distance to the object, so we'll use the [mat4.perspective](http://glmatrix.net/docs/module-mat4.html#.perspective) method from `gl-matrix`.\n\nThere are two more matrices which simplify life in the 3d rendering world\n\n1. Model matrix – a matrix containing the object transforms (translation, rotation, scale). If no transforms are applied – this is an identity matrix\n\n```\n1, 0, 0, 0,\n0, 1, 0, 0,\n0, 0, 1, 0,\n0, 0, 0, 1,\n```\n\n2. 
[View matrix](http://glmatrix.net/docs/module-mat4.html#.lookAt) – a matrix describing the position and direction of the \"camera\"\n\nThere's also a [good article on MDN explaining these concepts](https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/WebGL_model_view_projection)\n\n---\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 15. Rendering a cube\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋 Welcome to WebGL month.\n[Yesterday] we've explored some concepts required for 3d rendering, so let's finally render something 💪\n\n\nWe'll need a new entry point\n\n📄 index.html\n```diff\n      \u003c/head\u003e\n      \u003cbody\u003e\n          \u003ccanvas\u003e\u003c/canvas\u003e\n-         \u003cscript src=\"./dist/rotating-square.js\"\u003e\u003c/script\u003e\n+         \u003cscript src=\"./dist/3d.js\"\u003e\u003c/script\u003e\n      \u003c/body\u003e\n  
\u003c/html\u003e\n\n```\n📄 src/3d.js\n```js\nconsole.log('Hello 3d!');\n\n```\n📄 webpack.config.js\n```diff\n          'week-1': './src/week-1.js',\n          texture: './src/texture.js',\n          'rotating-square': './src/rotating-square.js',\n+         '3d': './src/3d.js',\n      },\n  \n      output: {\n\n```\nSimple vertex and fragment shaders. Notice that we use `vec3` for position now as we'll work in 3-dimensional clipspace.\n\n📄 src/shaders/3d.f.glsl\n```glsl\nprecision mediump float;\n\nvoid main() {\n    gl_FragColor = vec4(1, 0, 0, 1);\n}\n\n```\n📄 src/shaders/3d.v.glsl\n```glsl\nattribute vec3 position;\n\nvoid main() {\n    gl_Position = vec4(position, 1.0);\n}\n\n```\nWe'll also need the boilerplate for our WebGL program, familiar from previous tutorials\n\n📄 src/3d.js\n```diff\n- console.log('Hello 3d!');\n+ import vShaderSource from './shaders/3d.v.glsl';\n+ import fShaderSource from './shaders/3d.f.glsl';\n+ import { compileShader, setupShaderInput } from './gl-helpers';\n+ \n+ const canvas = document.querySelector('canvas');\n+ const gl = canvas.getContext('webgl');\n+ \n+ const width = document.body.offsetWidth;\n+ const height = document.body.offsetHeight;\n+ \n+ canvas.width = width * devicePixelRatio;\n+ canvas.height = height * devicePixelRatio;\n+ \n+ canvas.style.width = `${width}px`;\n+ canvas.style.height = `${height}px`;\n+ \n+ const vShader = gl.createShader(gl.VERTEX_SHADER);\n+ const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ compileShader(gl, vShader, vShaderSource);\n+ compileShader(gl, fShader, fShaderSource);\n+ \n+ const program = gl.createProgram();\n+ \n+ gl.attachShader(program, vShader);\n+ gl.attachShader(program, fShader);\n+ \n+ gl.linkProgram(program);\n+ gl.useProgram(program);\n+ \n+ const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n\n```\nNow let's define cube vertices for each face. 
We'll start with front face\n\n📄 src/3d.js\n```diff\n  gl.useProgram(program);\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n+ \n+ const cubeVertices = new Float32Array([\n+     // Front face\n+     -1.0, -1.0, 1.0,\n+     1.0, -1.0, 1.0,\n+     1.0, 1.0, 1.0,\n+     -1.0, 1.0, 1.0,\n+ ]);\n\n```\nback face\n\n📄 src/3d.js\n```diff\n      1.0, -1.0, 1.0,\n      1.0, 1.0, 1.0,\n      -1.0, 1.0, 1.0,\n+ \n+     // Back face\n+     -1.0, -1.0, -1.0,\n+     -1.0, 1.0, -1.0,\n+     1.0, 1.0, -1.0,\n+     1.0, -1.0, -1.0,\n  ]);\n\n```\ntop face\n\n📄 src/3d.js\n```diff\n      -1.0, 1.0, -1.0,\n      1.0, 1.0, -1.0,\n      1.0, -1.0, -1.0,\n+ \n+     // Top face\n+     -1.0, 1.0, -1.0,\n+     -1.0, 1.0, 1.0,\n+     1.0, 1.0, 1.0,\n+     1.0, 1.0, -1.0,\n  ]);\n\n```\nbottom face\n\n📄 src/3d.js\n```diff\n      -1.0, 1.0, 1.0,\n      1.0, 1.0, 1.0,\n      1.0, 1.0, -1.0,\n+ \n+     // Bottom face\n+     -1.0, -1.0, -1.0,\n+     1.0, -1.0, -1.0,\n+     1.0, -1.0, 1.0,\n+     -1.0, -1.0, 1.0,\n  ]);\n\n```\nright face\n\n📄 src/3d.js\n```diff\n      1.0, -1.0, -1.0,\n      1.0, -1.0, 1.0,\n      -1.0, -1.0, 1.0,\n+ \n+     // Right face\n+     1.0, -1.0, -1.0,\n+     1.0, 1.0, -1.0,\n+     1.0, 1.0, 1.0,\n+     1.0, -1.0, 1.0,\n  ]);\n\n```\nleft face\n\n📄 src/3d.js\n```diff\n      1.0, 1.0, -1.0,\n      1.0, 1.0, 1.0,\n      1.0, -1.0, 1.0,\n+ \n+     // Left face\n+     -1.0, -1.0, -1.0,\n+     -1.0, -1.0, 1.0,\n+     -1.0, 1.0, 1.0,\n+     -1.0, 1.0, -1.0,\n  ]);\n\n```\nNow we need to define vertex indices\n\n📄 src/3d.js\n```diff\n      -1.0, 1.0, 1.0,\n      -1.0, 1.0, -1.0,\n  ]);\n+ \n+ const indices = new Uint8Array([\n+     0, 1, 2, 0, 2, 3,       // front\n+     4, 5, 6, 4, 6, 7,       // back\n+     8, 9, 10, 8, 10, 11,    // top\n+     12, 13, 14, 12, 14, 15, // bottom\n+     16, 17, 18, 16, 18, 19, // right\n+     20, 21, 22, 20, 22, 23, // left\n+ ]);\n\n```\nand create gl buffers\n\n📄 src/3d.js\n```diff\n  import 
vShaderSource from './shaders/3d.v.glsl';\n  import fShaderSource from './shaders/3d.f.glsl';\n  import { compileShader, setupShaderInput } from './gl-helpers';\n+ import { GLBuffer } from './GLBuffer';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n      16, 17, 18, 16, 18, 19, // right\n      20, 21, 22, 20, 22, 23, // left\n  ]);\n+ \n+ const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, gl.STATIC_DRAW);\n+ const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n\n```\nSetup vertex attribute pointer\n\n📄 src/3d.js\n```diff\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, gl.STATIC_DRAW);\n  const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n+ \n+ vertexBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n\n```\nsetup viewport\n\n📄 src/3d.js\n```diff\n  \n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n+ \n+ gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nand issue a draw call\n\n📄 src/3d.js\n```diff\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n+ \n+ gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n\n```\nOk, we did everything right, but we just see a red canvas? 
That's the expected result, because every face of the cube has a length of `2`, with the left-most vertices at `-1` and the right-most at `1`, so we need to add some matrix magic from yesterday.\n\n\nLet's define uniforms for each matrix\n\n📄 src/shaders/3d.v.glsl\n```diff\n  attribute vec3 position;\n  \n+ uniform mat4 modelMatrix;\n+ uniform mat4 viewMatrix;\n+ uniform mat4 projectionMatrix;\n+ \n  void main() {\n      gl_Position = vec4(position, 1.0);\n  }\n\n```\nand multiply the vertex position by every matrix.\n\n📄 src/shaders/3d.v.glsl\n```diff\n  uniform mat4 projectionMatrix;\n  \n  void main() {\n-     gl_Position = vec4(position, 1.0);\n+     gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n  }\n\n```\nNow we need to define JS representations of the same matrices\n\n📄 src/3d.js\n```diff\n+ import { mat4 } from 'gl-matrix';\n+ \n  import vShaderSource from './shaders/3d.v.glsl';\n  import fShaderSource from './shaders/3d.f.glsl';\n  import { compileShader, setupShaderInput } from './gl-helpers';\n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  \n+ const modelMatrix = mat4.create();\n+ const viewMatrix = mat4.create();\n+ const projectionMatrix = mat4.create();\n+ \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n\n```\nWe'll leave the model matrix as-is (mat4.create returns an identity matrix), meaning the cube won't have any transforms (no translation, no rotation, no scale).\n\n\nWe'll use the `lookAt` method to set up the `viewMatrix`\n\n📄 src/3d.js\n```diff\n  const viewMatrix = mat4.create();\n  const projectionMatrix = mat4.create();\n  \n+ mat4.lookAt(\n+     viewMatrix,\n+ );\n+ \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n\n```\nThe 2nd argument is the position of the viewer. 
Let's place this point on top and in front of the cube\n\n📄 src/3d.js\n```diff\n  \n  mat4.lookAt(\n      viewMatrix,\n+     [0, 7, -7],\n  );\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nThe 3rd argument is the point we want to look at. The coordinate of our cube is (0, 0, 0), and that's exactly where we want to look\n\n📄 src/3d.js\n```diff\n  mat4.lookAt(\n      viewMatrix,\n      [0, 7, -7],\n+     [0, 0, 0],\n  );\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nThe last argument is the up vector. We can set up the view matrix in a way that any vector will be treated as pointing to the top of our world, so let's make the y axis point to the top\n\n📄 src/3d.js\n```diff\n      viewMatrix,\n      [0, 7, -7],\n      [0, 0, 0],\n+     [0, 1, 0],\n  );\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nTo set up the projection matrix we'll use the perspective method\n\n📄 src/3d.js\n```diff\n      [0, 1, 0],\n  );\n  \n+ mat4.perspective(\n+     projectionMatrix,\n+ );\n+ \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n\n```\nThe view and perspective matrices together are kind of the \"camera\" parameters.\nWe already have the position and direction of the camera, so let's set up the other parameters.\n\nThe 2nd argument of the `perspective` method is the `field of view` (how wide the camera lens is). The wider the angle – the more objects will fit on the screen (you've surely heard of the \"wide angle\" cameras in recent years' phones, that's about the same).\n\n📄 src/3d.js\n```diff\n  \n  mat4.perspective(\n      projectionMatrix,\n+     Math.PI / 360 * 90,\n  );\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nThe next argument is the aspect ratio of the canvas. 
It can be calculated by a simple division\n\n📄 src/3d.js\n```diff\n  mat4.perspective(\n      projectionMatrix,\n      Math.PI / 360 * 90,\n+     canvas.width / canvas.height,\n  );\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nThe 4th and 5th arguments set up the distances to objects which are visible to the camera. Some objects might be too close, others too far, so they shouldn't be rendered. The 4th argument is the distance to the closest object to render, the 5th – to the farthest\n\n📄 src/3d.js\n```diff\n      projectionMatrix,\n      Math.PI / 360 * 90,\n      canvas.width / canvas.height,\n+     0.01,\n+     100,\n  );\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nand finally we need to pass the matrices to the shader\n\n📄 src/3d.js\n```diff\n      100,\n  );\n  \n+ gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n+ gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n+ gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n+ \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n\n```\nLooks quite like a cube 🎉\n\n![Cube](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/cube.jpg)\n\n\nNow let's implement a rotation animation with the help of the model matrix and the rotate method from gl-matrix\n\n📄 src/3d.js\n```diff\n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n+ \n+ function frame() {\n+     mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n+ \n+     requestAnimationFrame(frame);\n+ }\n+ \n+ frame();\n\n```\nWe also need to update the uniform\n\n📄 src/3d.js\n```diff\n  function frame() {\n      mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n  \n+     gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n+ \n      
requestAnimationFrame(frame);\n  }\n  \n\n```\nand issue a draw call\n\n📄 src/3d.js\n```diff\n      mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n  \n      gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n+     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n  \n      requestAnimationFrame(frame);\n  }\n\n```\nCool! We have a rotation 🎉\n\n![Rotating cube](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rotating-cube.gif)\n\nThat's it for today, see you tomorrow 👋\n\n---\n\n![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day11)\n![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day11)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 16. Colorizing cube and exploring depth buffer\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\nHey 👋\n\nWelcome to [WebGL month](https://github.com/lesnitsky/webgl-month)\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-15-rendering-a-3d-cube-190f) we've rendered a cube, but all faces are of the same color, let's change this.\n\n\nLet's define face colors\n\n📄 src/3d.js\n```diff\n      20, 21, 22, 20, 22, 23, // left\n  ]);\n  \n+ const faceColors = [\n+     [1.0, 1.0, 1.0, 1.0], // Front face: white\n+     [1.0, 0.0, 0.0, 1.0], // Back face: red\n+     [0.0, 1.0, 0.0, 1.0], // Top face: green\n+     [0.0, 0.0, 1.0, 1.0], // Bottom face: blue\n+     [1.0, 1.0, 0.0, 1.0], // Right face: yellow\n+     [1.0, 0.0, 1.0, 1.0], // Left face: purple\n+ ];\n+ \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, gl.STATIC_DRAW);\n  const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n  \n\n```\nNow we need to repeat face colors for each face vertex\n\n📄 src/3d.js\n```diff\n      [1.0, 0.0, 1.0, 1.0], // Left face: purple\n  ];\n  \n+ const colors = [];\n+ \n+ for (var j = 0; j \u003c faceColors.length; ++j) {\n+     const c = faceColors[j];\n+     colors.push(\n+         ...c, // vertex 1\n+         ...c, // vertex 2\n+         ...c, // vertex 3\n+         ...c, // vertex 4\n+     );\n+ }\n+ \n+ \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, 
gl.STATIC_DRAW);\n  const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n  \n\n```\nand create a webgl buffer\n\n📄 src/3d.js\n```diff\n  \n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, gl.STATIC_DRAW);\n+ const colorsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);\n  const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n  \n  vertexBuffer.bind(gl);\n\n```\nNext we need to define an attribute to pass color from js to the vertex shader, and a varying to pass it from the vertex to the fragment shader\n\n📄 src/shaders/3d.v.glsl\n```diff\n  attribute vec3 position;\n+ attribute vec4 color;\n  \n  uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n  \n+ varying vec4 vColor;\n+ \n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n+     vColor = color;\n  }\n\n```\nand use it instead of the hardcoded red in the fragment shader\n\n📄 src/shaders/3d.f.glsl\n```diff\n  precision mediump float;\n  \n+ varying vec4 vColor;\n+ \n  void main() {\n-     gl_FragColor = vec4(1, 0, 0, 1);\n+     gl_FragColor = vColor;\n  }\n\n```\nand finally set up the vertex attribute in js\n\n📄 src/3d.js\n```diff\n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  \n+ colorsBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.color, 4, gl.FLOAT, false, 0, 0);\n+ \n  const modelMatrix = mat4.create();\n  const viewMatrix = mat4.create();\n  const projectionMatrix = mat4.create();\n\n```\nOk, the colors are there, but something is wrong\n\n![Rotating colors cube](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rotating-colors-cube.gif)\n\n\nLet's see what is going on in more detail by rendering the faces incrementally\n\n```javascript\nlet count = 3;\n\nfunction frame() {\n    if (count \u003c= indexBuffer.data.length) 
{\n        gl.drawElements(gl.TRIANGLES, count, gl.UNSIGNED_BYTE, 0);\n        count += 3;\n\n        setTimeout(frame, 500);\n    }\n}\n```\n\n![Incremental rendering](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/incremental-rendering.gif)\n\n\nSeems like triangles which are rendered later overlap the ones which are actually closer to the viewer 😕\nHow do we fix it?\n\n📄 src/3d.js\n```diff\n  gl.linkProgram(program);\n  gl.useProgram(program);\n  \n+ gl.enable(gl.DEPTH_TEST);\n+ \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n  const cubeVertices = new Float32Array([\n\n```\nAfter vertices are assembled into primitives (triangles), the fragment shader paints each pixel inside a triangle, but before the color is calculated, each fragment passes some \"tests\". One of those tests is the depth test, and we need to enable it manually.\n\nOther types of tests are:\n\n* `gl.SCISSOR_TEST` – whether a fragment is inside a certain rectangle (don't confuse this with the viewport; there is a special [scissor](https://developer.mozilla.org/en-US/docs/Web/API/WebGLRenderingContext/scissor) method)\n* `gl.STENCIL_TEST` – similar to the depth test, but we can manually define a \"mask\" and discard some pixels (we'll work with the stencil buffer in next tutorials)\n* pixel ownership test – some pixels on the screen might belong to other OpenGL contexts (imagine your browser is overlapped by another window), so these pixels get discarded (not painted)\n\n\nCool, we now have a working 3d cube, but we're duplicating a lot of colors to fill the vertex buffer, can we do it better?\nWe're using a fixed color palette (6 colors), so we can pass these colors to the shader and use just the index of that color.\n\nLet's drop the color attribute and introduce a colorIndex instead\n\n📄 src/shaders/3d.v.glsl\n```diff\n  attribute vec3 position;\n- attribute vec4 color;\n+ attribute float colorIndex;\n  \n  uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n\n```\nShaders support \"arrays\" of uniforms, 
so we can pass our color palette to this array and use an index to get a color out of it\n\n📄 src/shaders/3d.v.glsl\n```diff\n  uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n+ uniform vec4 colors[6];\n  \n  varying vec4 vColor;\n  \n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n-     vColor = color;\n+     vColor = colors[int(colorIndex)];\n  }\n\n```\nWe need to make the appropriate changes to set up the color index attribute\n\n📄 src/3d.js\n```diff\n  const colors = [];\n  \n  for (var j = 0; j \u003c faceColors.length; ++j) {\n-     const c = faceColors[j];\n-     colors.push(\n-         ...c, // vertex 1\n-         ...c, // vertex 2\n-         ...c, // vertex 3\n-         ...c, // vertex 4\n-     );\n+     colors.push(j, j, j, j);\n  }\n  \n  \n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  \n  colorsBuffer.bind(gl);\n- gl.vertexAttribPointer(programInfo.attributeLocations.color, 4, gl.FLOAT, false, 0, 0);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.colorIndex, 1, gl.FLOAT, false, 0, 0);\n  \n  const modelMatrix = mat4.create();\n  const viewMatrix = mat4.create();\n\n```\nTo fill an array uniform, we need to set each \"item\" in this array individually, like so\n\n```javascript\ngl.uniform4fv(programInfo.uniformLocations[`colors[0]`], faceColors[0]);\ngl.uniform4fv(programInfo.uniformLocations[`colors[1]`], faceColors[1]);\ngl.uniform4fv(programInfo.uniformLocations[`colors[2]`], faceColors[2]);\n...\n```\n\nObviously this can be done in a loop.\n\n📄 src/3d.js\n```diff\n      colors.push(j, j, j, j);\n  }\n  \n+ faceColors.forEach((color, index) =\u003e {\n+     gl.uniform4fv(programInfo.uniformLocations[`colors[${index}]`], color);\n+ });\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, gl.STATIC_DRAW);\n  const colorsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array(colors), 
gl.STATIC_DRAW);\n\n```\nNice, we get the same result using 4 times less attribute data.\n\nThis might seem like an unnecessary optimization, but it helps when you have to update large buffers frequently\n\n![Rotating cube fixed](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rotating-cube-fixed.gif)\n\nThat's it for today!\n\nSee you in the next tutorial 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 17. Exploring OBJ format\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day17)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day17)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month.\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-16-colorizing-cube-depth-buffer-and-array-uniforms-4nhc) we fixed our cube example, but the vertices of the cube were defined right in our JS code. This gets more complicated when rendering more complex objects.\n\nLuckily 3D editors like [Blender](https://www.blender.org/) can export object definitions in several formats.\n\nLet's export a cube from Blender\n\n![Blender export screenshot](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-screenshot.jpg)\n\n\nLet's explore the exported file\n\nThe first two lines start with `#`, which marks a comment\n\n📄 assets/objects/cube.obj\n```diff\n+ # Blender v2.79 (sub 0) OBJ File: ''\n+ # www.blender.org\n\n```\nThe `mtllib` line references the material file of the object.\nWe'll ignore it for now\n\n📄 assets/objects/cube.obj\n```diff\n  # Blender v2.79 (sub 0) OBJ File: ''\n  # www.blender.org\n+ mtllib cube.mtl\n\n```\n`o` defines the name of the object\n\n📄 assets/objects/cube.obj\n```diff\n  # Blender v2.79 (sub 0) OBJ File: ''\n  # www.blender.org\n  mtllib cube.mtl\n+ o Cube\n\n```\nLines with `v` define vertex positions\n\n📄 assets/objects/cube.obj\n```diff\n  # www.blender.org\n  mtllib cube.mtl\n  o Cube\n+ v 1.000000 -1.000000 -1.000000\n+ v 1.000000 
-1.000000 1.000000\n+ v -1.000000 -1.000000 1.000000\n+ v -1.000000 -1.000000 -1.000000\n+ v 1.000000 1.000000 -0.999999\n+ v 0.999999 1.000000 1.000001\n+ v -1.000000 1.000000 1.000000\n+ v -1.000000 1.000000 -1.000000\n\n```\n`vn` lines define vertex normals. In this case the normals are perpendicular to the cube faces\n\n📄 assets/objects/cube.obj\n```diff\n  v 0.999999 1.000000 1.000001\n  v -1.000000 1.000000 1.000000\n  v -1.000000 1.000000 -1.000000\n+ vn 0.0000 -1.0000 0.0000\n+ vn 0.0000 1.0000 0.0000\n+ vn 1.0000 0.0000 0.0000\n+ vn -0.0000 -0.0000 1.0000\n+ vn -1.0000 -0.0000 -0.0000\n+ vn 0.0000 0.0000 -1.0000\n\n```\n`usemtl` tells which material to use for the elements (faces) following this line\n\n📄 assets/objects/cube.obj\n```diff\n  vn -0.0000 -0.0000 1.0000\n  vn -1.0000 -0.0000 -0.0000\n  vn 0.0000 0.0000 -1.0000\n+ usemtl Material\n\n```\n`f` lines define object faces, referencing vertices and normals by index\n\n📄 assets/objects/cube.obj\n```diff\n  vn 0.0000 0.0000 -1.0000\n  usemtl Material\n  s off\n+ f 1//1 2//1 3//1 4//1\n+ f 5//2 8//2 7//2 6//2\n+ f 1//3 5//3 6//3 2//3\n+ f 2//4 6//4 7//4 3//4\n+ f 3//5 7//5 8//5 4//5\n+ f 5//6 1//6 4//6 8//6\n\n```\nSo in this case the first face consists of vertices `1, 2, 3 and 4`\n\n\nOne more thing to mention: each face consists of 4 vertices, but WebGL can only render triangles. We can break these faces into triangles in JS or do it in Blender\n\nEnter edit mode (`Tab` key), and hit `Control + T` (on macOS). 
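(If you'd rather keep the quads and split them in JS instead, here is a minimal fan-triangulation sketch. It is an illustration only, not part of the tutorial's code, and it assumes each parsed face arrives as a plain array of vertex indices:)

```javascript
// Split a convex polygon face into triangles by "fanning" from the first vertex:
// [a, b, c, d] becomes [a, b, c] and [a, c, d].
function triangulate(face) {
    const triangles = [];
    for (let i = 1; i < face.length - 1; i++) {
        triangles.push(face[0], face[i], face[i + 1]);
    }
    return triangles;
}

console.log(triangulate([1, 2, 3, 4])); // [1, 2, 3, 1, 3, 4]
```

This only works for convex faces, which is what Blender exports for a cube; triangulating in Blender keeps the parser simpler.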
That's it, cube faces are now triangulated\n\n![Triangulated cube](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-triangulated-cube.png)\n\n\nNow let's load the .obj file with a raw loader\n\n📄 src/3d.js\n```diff\n  import fShaderSource from './shaders/3d.f.glsl';\n  import { compileShader, setupShaderInput } from './gl-helpers';\n  import { GLBuffer } from './GLBuffer';\n+ import cubeObj from '../assets/objects/cube.obj';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n\n```\n📄 webpack.config.js\n```diff\n      module: {\n          rules: [\n              {\n-                 test: /\\.glsl$/,\n+                 test: /\\.(glsl|obj)$/,\n                  use: 'raw-loader',\n              },\n  \n\n```\nand implement a parser to get vertices and vertex indices\n\n📄 src/3d.js\n```diff\n  \n  import vShaderSource from './shaders/3d.v.glsl';\n  import fShaderSource from './shaders/3d.f.glsl';\n- import { compileShader, setupShaderInput } from './gl-helpers';\n+ import { compileShader, setupShaderInput, parseObj } from './gl-helpers';\n  import { GLBuffer } from './GLBuffer';\n  import cubeObj from '../assets/objects/cube.obj';\n  \n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const cubeVertices = new Float32Array([\n-     // Front face\n-     -1.0, -1.0, 1.0,\n-     1.0, -1.0, 1.0,\n-     1.0, 1.0, 1.0,\n-     -1.0, 1.0, 1.0,\n- \n-     // Back face\n-     -1.0, -1.0, -1.0,\n-     -1.0, 1.0, -1.0,\n-     1.0, 1.0, -1.0,\n-     1.0, -1.0, -1.0,\n- \n-     // Top face\n-     -1.0, 1.0, -1.0,\n-     -1.0, 1.0, 1.0,\n-     1.0, 1.0, 1.0,\n-     1.0, 1.0, -1.0,\n- \n-     // Bottom face\n-     -1.0, -1.0, -1.0,\n-     1.0, -1.0, -1.0,\n-     1.0, -1.0, 1.0,\n-     -1.0, -1.0, 1.0,\n- \n-     // Right face\n-     1.0, -1.0, -1.0,\n-     1.0, 1.0, -1.0,\n-     1.0, 1.0, 1.0,\n-     1.0, -1.0, 1.0,\n- \n-     // Left face\n-     -1.0, -1.0, -1.0,\n-     -1.0, -1.0, 1.0,\n- 
    -1.0, 1.0, 1.0,\n-     -1.0, 1.0, -1.0,\n- ]);\n- \n- const indices = new Uint8Array([\n-     0, 1, 2, 0, 2, 3,       // front\n-     4, 5, 6, 4, 6, 7,       // back\n-     8, 9, 10, 8, 10, 11,    // top\n-     12, 13, 14, 12, 14, 15, // bottom\n-     16, 17, 18, 16, 18, 19, // right\n-     20, 21, 22, 20, 22, 23, // left\n- ]);\n+ const { vertices, indices } = parseObj(cubeObj);\n  \n  const faceColors = [\n      [1.0, 1.0, 1.0, 1.0], // Front face: white\n      gl.uniform4fv(programInfo.uniformLocations[`colors[${index}]`], color);\n  });\n  \n- const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cubeVertices, gl.STATIC_DRAW);\n+ const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);\n  const colorsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);\n  const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n  \n\n```\n📄 src/gl-helpers.js\n```diff\n          uniformLocations,\n      }\n  }\n+ \n+ export function parseObj(objSource) {\n+     const vertices = [];\n+     const indices = [];\n+ \n+     return { vertices, indices };\n+ }\n\n```\nWe can iterate over each line and search for those starting with `v` to get vertex coordinates\n\n📄 src/gl-helpers.js\n```diff\n      }\n  }\n  \n+ export function parseVec(string, prefix) {\n+     return string.replace(prefix, '').split(' ').map(Number);\n+ }\n+ \n  export function parseObj(objSource) {\n      const vertices = [];\n      const indices = [];\n  \n+     objSource.split('\\n').forEach(line =\u003e {\n+         if (line.startsWith('v ')) {\n+             vertices.push(...parseVec(line, 'v '));\n+         }\n+     });\n+ \n      return { vertices, indices };\n  }\n\n```\nand do the same with faces\n\n📄 src/gl-helpers.js\n```diff\n      return string.replace(prefix, '').split(' ').map(Number);\n  }\n  \n+ export function parseFace(string) {\n+     return string.replace('f ', '').split(' ').map(chunk =\u003e {\n+     
    return chunk.split('/').map(Number);\n+     })\n+ }\n+ \n  export function parseObj(objSource) {\n      const vertices = [];\n      const indices = [];\n          if (line.startsWith('v ')) {\n              vertices.push(...parseVec(line, 'v '));\n          }\n+ \n+         if (line.startsWith('f ')) {\n+             indices.push(...parseFace(line).map(face =\u003e face[0]));\n+         }\n      });\n  \n      return { vertices, indices };\n\n```\nLet's also return typed arrays\n\n📄 src/gl-helpers.js\n```diff\n          }\n      });\n  \n-     return { vertices, indices };\n+     return { \n+         vertices: new Float32Array(vertices), \n+         indices: new Uint8Array(indices),\n+     };\n  }\n\n```\nOK, everything seems fine, but we get an error\n\n```\nglDrawElements: attempt to access out of range vertices in attribute 0\n```\n\nThat's because indices in .obj files start at `1`, so we need to decrement each index\n\n📄 src/gl-helpers.js\n```diff\n          }\n  \n          if (line.startsWith('f ')) {\n-             indices.push(...parseFace(line).map(face =\u003e face[0]));\n+             indices.push(...parseFace(line).map(face =\u003e face[0] - 1));\n          }\n      });\n  \n\n```\nLet's also change the way we colorize our faces, just to make it possible to render any object with any number of faces with random colors\n\n📄 src/3d.js\n```diff\n  \n  const colors = [];\n  \n- for (var j = 0; j \u003c faceColors.length; ++j) {\n-     colors.push(j, j, j, j);\n+ for (var j = 0; j \u003c indices.length / 3; ++j) {\n+     const randomColorIndex = Math.floor(Math.random() * faceColors.length);\n+     colors.push(randomColorIndex, randomColorIndex, randomColorIndex);\n  }\n  \n  faceColors.forEach((color, index) =\u003e {\n\n```\nOne more issue with the existing code is that we use `gl.UNSIGNED_BYTE`, so the index buffer may only be a `Uint8Array`, which fits numbers up to `255`; if an object has more than 255 vertices it will be rendered 
incorrectly. Let's fix this\n\n📄 src/3d.js\n```diff\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n- gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n+ gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_SHORT, 0);\n  \n  function frame() {\n      mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n  \n      gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n-     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n+     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_SHORT, 0);\n  \n      requestAnimationFrame(frame);\n  }\n\n```\n📄 src/gl-helpers.js\n```diff\n  \n      return { \n          vertices: new Float32Array(vertices), \n-         indices: new Uint8Array(indices),\n+         indices: new Uint16Array(indices),\n      };\n  }\n\n```\nNow let's render a different object, for example a monkey (note we also move the camera so it looks at the object straight on)\n\n📄 src/3d.js\n```diff\n  import fShaderSource from './shaders/3d.f.glsl';\n  import { compileShader, setupShaderInput, parseObj } from './gl-helpers';\n  import { GLBuffer } from './GLBuffer';\n- import cubeObj from '../assets/objects/cube.obj';\n+ import monkeyObj from '../assets/objects/monkey.obj';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const { vertices, indices } = parseObj(cubeObj);\n+ const { vertices, indices } = parseObj(monkeyObj);\n  \n  const faceColors = [\n      [1.0, 1.0, 1.0, 1.0], // Front face: white\n  \n  mat4.lookAt(\n      viewMatrix,\n-     [0, 7, -7],\n+     [0, 0, -7],\n      [0, 0, 0],\n      [0, 1, 0],\n  );\n\n```\nCool! 
We can now render any object exported from Blender 🎉\n\n![Rotating monkey](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/rotating-monkey.gif)\n\nThat's it for today, see you tomorrow 👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 18. Flat shading\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day18)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day18)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month.\n\nToday we'll learn how to implement flat shading. 
But let's first talk about light itself.\n\nA typical 3D scene contains objects, a global light, and some specific light sources (a torch, a lamp, etc.)\n\nSo how do we break all of this down into something we can turn into code?\n\nHere's an example\n\n![Light illustration](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/light-illustration.png)\n\nPay attention to the red arrows coming from the cube faces. These arrows are \"normals\", and each face's color will depend on the angle between the light vector and the face normal.\n\n\nLet's change the way our object is colorized and make all faces the same color, to better see how light affects face colors\n\n📄 src/3d.js\n```diff\n  const { vertices, indices } = parseObj(monkeyObj);\n  \n  const faceColors = [\n-     [1.0, 1.0, 1.0, 1.0], // Front face: white\n-     [1.0, 0.0, 0.0, 1.0], // Back face: red\n-     [0.0, 1.0, 0.0, 1.0], // Top face: green\n-     [0.0, 0.0, 1.0, 1.0], // Bottom face: blue\n-     [1.0, 1.0, 0.0, 1.0], // Right face: yellow\n-     [1.0, 0.0, 1.0, 1.0], // Left face: purple\n+     [0.5, 0.5, 0.5, 1.0]\n  ];\n  \n  const colors = [];\n  \n  for (var j = 0; j \u003c indices.length / 3; ++j) {\n-     const randomColorIndex = Math.floor(Math.random() * faceColors.length);\n-     colors.push(randomColorIndex, randomColorIndex, randomColorIndex);\n+     colors.push(0, 0, 0, 0);\n  }\n  \n  faceColors.forEach((color, index) =\u003e {\n\n```\nWe'll also need to extract normals from our object and use `drawArrays` instead of `drawElements`: vertex coordinates and normals have different indices in the .obj file, so a vertex can no longer be referenced by a single index\n\n📄 src/3d.js\n```diff\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const { vertices, indices } = parseObj(monkeyObj);\n+ const { vertices, normals } = parseObj(monkeyObj);\n  \n  const faceColors = [\n      [0.5, 0.5, 0.5, 1.0]\n  \n  const colors = [];\n  \n- for (var j = 0; j \u003c indices.length / 3; ++j) {\n+ 
for (var j = 0; j \u003c vertices.length / 3; ++j) {\n      colors.push(0, 0, 0, 0);\n  }\n  \n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);\n  const colorsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);\n- const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n  \n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n- gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_SHORT, 0);\n+ gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n  \n  function frame() {\n      mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n  \n      gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n-     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_SHORT, 0);\n+ \n+     gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n  \n      requestAnimationFrame(frame);\n  }\n\n```\n📄 src/gl-helpers.js\n```diff\n  }\n  \n  export function parseObj(objSource) {\n-     const vertices = [];\n-     const indices = [];\n+     const _vertices = [];\n+     const _normals = [];\n+     const vertexIndices = [];\n+     const normalIndices = [];\n  \n      objSource.split('\\n').forEach(line =\u003e {\n          if (line.startsWith('v ')) {\n-             vertices.push(...parseVec(line, 'v '));\n+             _vertices.push(parseVec(line, 'v '));\n+         }\n+ \n+         if (line.startsWith('vn ')) {\n+             _normals.push(parseVec(line, 'vn '));\n          }\n  \n          if (line.startsWith('f ')) {\n-             indices.push(...parseFace(line).map(face =\u003e face[0] - 1));\n+             const parsedFace = parseFace(line);\n+ \n+             vertexIndices.push(...parsedFace.map(face =\u003e face[0] - 1));\n+             normalIndices.push(...parsedFace.map(face 
=\u003e face[2] - 1));\n          }\n      });\n  \n+     const vertices = [];\n+     const normals = [];\n+ \n+     for (let i = 0; i \u003c vertexIndices.length; i++) {\n+         const vertexIndex = vertexIndices[i];\n+         const normalIndex = normalIndices[i];\n+ \n+         const vertex = _vertices[vertexIndex];\n+         const normal = _normals[normalIndex];\n+ \n+         vertices.push(...vertex);\n+         normals.push(...normal);\n+     }\n+ \n      return { \n          vertices: new Float32Array(vertices), \n-         indices: new Uint16Array(indices),\n+         normals: new Float32Array(normals), \n      };\n  }\n\n```\nDefine normal attribute\n\n📄 src/3d.js\n```diff\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);\n  const colorsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);\n+ const normalsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, normals, gl.STATIC_DRAW);\n  \n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  colorsBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.colorIndex, 1, gl.FLOAT, false, 0, 0);\n  \n+ normalsBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.normal, 3, gl.FLOAT, false, 0, 0);\n+ \n  const modelMatrix = mat4.create();\n  const viewMatrix = mat4.create();\n  const projectionMatrix = mat4.create();\n\n```\n📄 src/shaders/3d.v.glsl\n```diff\n  attribute vec3 position;\n+ attribute vec3 normal;\n  attribute float colorIndex;\n  \n  uniform mat4 modelMatrix;\n\n```\nLet's also define a position of light and pass it to shader via uniform\n\n📄 src/3d.js\n```diff\n  gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n  gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n+ gl.uniform3fv(programInfo.uniformLocations.directionalLightVector, [0, 0, -7]);\n+ \n  
gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n\n```\n📄 src/shaders/3d.v.glsl\n```diff\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n  uniform vec4 colors[6];\n+ uniform vec3 directionalLightVector;\n  \n  varying vec4 vColor;\n  \n\n```\nNow we can use the normal vector and the directional light vector to calculate the light \"intensity\" and multiply the initial color by it\n\n📄 src/shaders/3d.v.glsl\n```diff\n  \n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n-     vColor = colors[int(colorIndex)];\n+ \n+     float intensity = dot(normal, directionalLightVector);\n+ \n+     vColor = colors[int(colorIndex)] * intensity;\n  }\n\n```\n![Lighting 1](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/lighting-1.gif)\n\nNow some faces are brighter and some are darker, so the overall approach is working, but the image seems too bright\n\nOne issue with the current implementation is that we're using a \"non-normalized\" vector for the light direction\n\n📄 src/shaders/3d.v.glsl\n```diff\n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n  \n-     float intensity = dot(normal, directionalLightVector);\n+     float intensity = dot(normal, normalize(directionalLightVector));\n  \n      vColor = colors[int(colorIndex)] * intensity;\n  }\n\n```\n![Lighting 2](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/lighting-2.gif)\n\nLooks better, but still too bright.\n\nThis is because we also multiply the `alpha` component of the color by our intensity, so darker faces become lighter because their opacity is close to `0`.\n\n📄 src/3d.js\n```diff\n- import { mat4 } from 'gl-matrix';\n+ import { mat4, vec3 } from 'gl-matrix';\n  \n  import vShaderSource from './shaders/3d.v.glsl';\n  import fShaderSource from './shaders/3d.f.glsl';\n\n```\n📄 src/shaders/3d.v.glsl\n```diff\n  \n      float intensity = dot(normal, 
normalize(directionalLightVector));\n  \n-     vColor = colors[int(colorIndex)] * intensity;\n+     vColor.rgb = vec3(0.3, 0.3, 0.3) + colors[int(colorIndex)].rgb * intensity;\n+     vColor.a = 1.0;\n  }\n\n```\n![Lighting 3](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/lighting-3.gif)\n\nNow it is too dark 😕\n\nLet's add some \"global light\"\n\n\n![Lighting 4](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/lighting-4.gif)\n\nLooks better, but still not perfect.\nIt seems like the light source rotates together with the object. This happens because we transform vertex positions, but the normals stay the same. We need to transform the normals as well. There is a special transformation matrix for this, which can be calculated as the inverse transpose of the model matrix.\n\n📄 src/3d.js\n```diff\n  const modelMatrix = mat4.create();\n  const viewMatrix = mat4.create();\n  const projectionMatrix = mat4.create();\n+ const normalMatrix = mat4.create();\n  \n  mat4.lookAt(\n      viewMatrix,\n  function frame() {\n      mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n  \n+     mat4.invert(normalMatrix, modelMatrix);\n+     mat4.transpose(normalMatrix, normalMatrix);\n+ \n      gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n+     gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, normalMatrix);\n  \n      gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n  \n\n```\n📄 src/shaders/3d.v.glsl\n```diff\n  uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n+ uniform mat4 normalMatrix;\n  uniform vec4 colors[6];\n  uniform vec3 directionalLightVector;\n  \n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n  \n-     float intensity = dot(normal, normalize(directionalLightVector));\n+     vec3 transformedNormal = (normalMatrix * vec4(normal, 1.0)).xyz;\n+     float intensity = dot(transformedNormal, 
normalize(directionalLightVector));\n  \n      vColor.rgb = vec3(0.3, 0.3, 0.3) + colors[int(colorIndex)].rgb * intensity;\n      vColor.a = 1.0;\n\n```\n![Lighting 5](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/lighting-5.gif)\n\nCool, looks good enough!\n\nThat's it for today.\n\nSee you tomorrow 👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day18)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day18)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 19. Rendering multiple objects\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day19)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day19)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month.\n\nIn previous tutorials we've been rendering only a single object, but a typical 3D scene consists of multiple objects.\nToday we're going to learn how to render many objects in a scene.\n\n\nSince we're rendering objects with a solid color, let's get rid of the colorIndex attribute and pass a single color via a uniform\n\n📄 src/3d.js\n```diff\n  \n  const { vertices, normals } = parseObj(monkeyObj);\n  \n- const faceColors = [\n-     [0.5, 0.5, 0.5, 1.0]\n- ];\n- \n- const colors = [];\n- \n- for (var j = 0; j \u003c vertices.length / 3; ++j) {\n-     colors.push(0, 0, 0, 0);\n- }\n- \n- faceColors.forEach((color, index) =\u003e {\n-     gl.uniform4fv(programInfo.uniformLocations[`colors[${index}]`], color);\n- });\n+ gl.uniform3fv(programInfo.uniformLocations.color, [0.5, 0.5, 0.5]);\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);\n- const colorsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, new Float32Array(colors), gl.STATIC_DRAW);\n  const normalsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, normals, gl.STATIC_DRAW);\n  \n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  \n- colorsBuffer.bind(gl);\n- gl.vertexAttribPointer(programInfo.attributeLocations.colorIndex, 1, 
gl.FLOAT, false, 0, 0);\n- \n  normalsBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.normal, 3, gl.FLOAT, false, 0, 0);\n  \n\n```\n📄 src/shaders/3d.v.glsl\n```diff\n  attribute vec3 position;\n  attribute vec3 normal;\n- attribute float colorIndex;\n  \n  uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n  uniform mat4 normalMatrix;\n- uniform vec4 colors[6];\n+ uniform vec3 color;\n  uniform vec3 directionalLightVector;\n  \n  varying vec4 vColor;\n      vec3 transformedNormal = (normalMatrix * vec4(normal, 1.0)).xyz;\n      float intensity = dot(transformedNormal, normalize(directionalLightVector));\n  \n-     vColor.rgb = vec3(0.3, 0.3, 0.3) + colors[int(colorIndex)].rgb * intensity;\n+     vColor.rgb = vec3(0.3, 0.3, 0.3) + color * intensity;\n      vColor.a = 1.0;\n  }\n\n```\nWe'll need a helper class to store object-related info\n\n📄 src/Object3D.js\n```js\nexport class Object3D {\n    constructor() {\n        \n    } \n}\n\n```\nEach object should contain its own vertices and normals\n\n📄 src/Object3D.js\n```diff\n+ import { parseObj } from \"./gl-helpers\";\n+ \n  export class Object3D {\n-     constructor() {\n-         \n-     } \n+     constructor(source) {\n+         const { vertices, normals } = parseObj(source);\n+ \n+         this.vertices = vertices;\n+         this.normals = normals;\n+     }\n  }\n\n```\nAs well as a model matrix to store the object's transform\n\n📄 src/Object3D.js\n```diff\n  import { parseObj } from \"./gl-helpers\";\n+ import { mat4 } from \"gl-matrix\";\n  \n  export class Object3D {\n      constructor(source) {\n  \n          this.vertices = vertices;\n          this.normals = normals;\n+ \n+         this.modelMatrix = mat4.create();\n      }\n  }\n\n```\nSince the normal matrix is computable from the model matrix, it makes sense to define a getter\n\n📄 src/Object3D.js\n```diff\n          this.normals = normals;\n  \n          this.modelMatrix = mat4.create();\n+         
this._normalMatrix = mat4.create();\n+     }\n+ \n+     get normalMatrix () {\n+         mat4.invert(this._normalMatrix, this.modelMatrix);\n+         mat4.transpose(this._normalMatrix, this._normalMatrix);\n+ \n+         return this._normalMatrix;\n      }\n  }\n\n```\nNow we can refactor our code and use new helper class\n\n📄 src/3d.js\n```diff\n  import { compileShader, setupShaderInput, parseObj } from './gl-helpers';\n  import { GLBuffer } from './GLBuffer';\n  import monkeyObj from '../assets/objects/monkey.obj';\n+ import { Object3D } from './Object3D';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const { vertices, normals } = parseObj(monkeyObj);\n+ const monkey = new Object3D(monkeyObj);\n  \n  gl.uniform3fv(programInfo.uniformLocations.color, [0.5, 0.5, 0.5]);\n  \n- const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);\n- const normalsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, normals, gl.STATIC_DRAW);\n+ const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, monkey.vertices, gl.STATIC_DRAW);\n+ const normalsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, monkey.normals, gl.STATIC_DRAW);\n  \n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n  normalsBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.normal, 3, gl.FLOAT, false, 0, 0);\n  \n- const modelMatrix = mat4.create();\n  const viewMatrix = mat4.create();\n  const projectionMatrix = mat4.create();\n- const normalMatrix = mat4.create();\n  \n  mat4.lookAt(\n      viewMatrix,\n      100,\n  );\n  \n- gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n  gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n  gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, 
projectionMatrix);\n  \n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n- gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n- \n  function frame() {\n-     mat4.rotateY(modelMatrix, modelMatrix, Math.PI / 180);\n- \n-     mat4.invert(normalMatrix, modelMatrix);\n-     mat4.transpose(normalMatrix, normalMatrix);\n+     mat4.rotateY(monkey.modelMatrix, monkey.modelMatrix, Math.PI / 180);\n  \n-     gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, modelMatrix);\n-     gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, normalMatrix);\n+     gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, monkey.modelMatrix);\n+     gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, monkey.normalMatrix);\n  \n      gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n  \n\n```\nNow let's import more objects\n\n📄 src/3d.js\n```diff\n  import { compileShader, setupShaderInput, parseObj } from './gl-helpers';\n  import { GLBuffer } from './GLBuffer';\n  import monkeyObj from '../assets/objects/monkey.obj';\n+ import torusObj from '../assets/objects/torus.obj';\n+ import coneObj from '../assets/objects/cone.obj';\n+ \n  import { Object3D } from './Object3D';\n  \n  const canvas = document.querySelector('canvas');\n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n  const monkey = new Object3D(monkeyObj);\n+ const torus = new Object3D(torusObj);\n+ const cone = new Object3D(coneObj);\n  \n  gl.uniform3fv(programInfo.uniformLocations.color, [0.5, 0.5, 0.5]);\n  \n\n```\nand store them in a collection\n\n📄 src/3d.js\n```diff\n  const torus = new Object3D(torusObj);\n  const cone = new Object3D(coneObj);\n  \n+ const objects = [\n+     monkey,\n+     torus,\n+     cone,\n+ ];\n+ \n  gl.uniform3fv(programInfo.uniformLocations.color, [0.5, 0.5, 0.5]);\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, monkey.vertices, 
gl.STATIC_DRAW);\n\n```\nand instead of issuing a draw call for just a monkey, we'll iterate over the collection\n\n📄 src/3d.js\n```diff\n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  function frame() {\n-     mat4.rotateY(monkey.modelMatrix, monkey.modelMatrix, Math.PI / 180);\n+     objects.forEach((object) =\u003e {\n+         mat4.rotateY(object.modelMatrix, object.modelMatrix, Math.PI / 180);\n  \n-     gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, monkey.modelMatrix);\n-     gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, monkey.normalMatrix);\n+         gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, object.modelMatrix);\n+         gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, object.normalMatrix);\n  \n-     gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n+         gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n+     });\n  \n      requestAnimationFrame(frame);\n  }\n\n```\nOk, but why is only the monkey still rendered?\n\n\nNo wonder: the vertex and normals buffers stay the same, so we just render the same object N times. Let's update the vertex and normals buffers each time we want to render an object\n\n📄 src/3d.js\n```diff\n          gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, object.modelMatrix);\n          gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, object.normalMatrix);\n  \n+         vertexBuffer.setData(gl, object.vertices, gl.STATIC_DRAW);\n+         normalsBuffer.setData(gl, object.normals, gl.STATIC_DRAW);\n+ \n          gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n      });\n  \n\n```\n![Multiple objects 1](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/multiple-objects-1.gif)\n\nCool, we've rendered multiple objects, but they are all in the same spot. 
Let's fix that\n\n\nEach object will have a property storing a position in space\n\n📄 src/3d.js\n```diff\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const monkey = new Object3D(monkeyObj);\n- const torus = new Object3D(torusObj);\n- const cone = new Object3D(coneObj);\n+ const monkey = new Object3D(monkeyObj, [0, 0, 0]);\n+ const torus = new Object3D(torusObj, [-3, 0, 0]);\n+ const cone = new Object3D(coneObj, [3, 0, 0]);\n  \n  const objects = [\n      monkey,\n\n```\n📄 src/Object3D.js\n```diff\n  import { mat4 } from \"gl-matrix\";\n  \n  export class Object3D {\n-     constructor(source) {\n+     constructor(source, position) {\n          const { vertices, normals } = parseObj(source);\n  \n          this.vertices = vertices;\n          this.normals = normals;\n+         this.position = position;\n  \n          this.modelMatrix = mat4.create();\n          this._normalMatrix = mat4.create();\n\n```\nand this position should be respected by model matrix\n\n📄 src/Object3D.js\n```diff\n          this.position = position;\n  \n          this.modelMatrix = mat4.create();\n+         mat4.fromTranslation(this.modelMatrix, position);\n          this._normalMatrix = mat4.create();\n      }\n  \n\n```\nAnd one more thing. 
We can also define a color specific to each object\n\n📄 src/3d.js\n```diff\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n- const monkey = new Object3D(monkeyObj, [0, 0, 0]);\n- const torus = new Object3D(torusObj, [-3, 0, 0]);\n- const cone = new Object3D(coneObj, [3, 0, 0]);\n+ const monkey = new Object3D(monkeyObj, [0, 0, 0], [1, 0, 0]);\n+ const torus = new Object3D(torusObj, [-3, 0, 0], [0, 1, 0]);\n+ const cone = new Object3D(coneObj, [3, 0, 0], [0, 0, 1]);\n  \n  const objects = [\n      monkey,\n      cone,\n  ];\n  \n- gl.uniform3fv(programInfo.uniformLocations.color, [0.5, 0.5, 0.5]);\n- \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, monkey.vertices, gl.STATIC_DRAW);\n  const normalsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, monkey.normals, gl.STATIC_DRAW);\n  \n          gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, object.modelMatrix);\n          gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, object.normalMatrix);\n  \n+         gl.uniform3fv(programInfo.uniformLocations.color, object.color);\n+ \n          vertexBuffer.setData(gl, object.vertices, gl.STATIC_DRAW);\n          normalsBuffer.setData(gl, object.normals, gl.STATIC_DRAW);\n  \n\n```\n📄 src/Object3D.js\n```diff\n  import { mat4 } from \"gl-matrix\";\n  \n  export class Object3D {\n-     constructor(source, position) {\n+     constructor(source, position, color) {\n          const { vertices, normals } = parseObj(source);\n  \n          this.vertices = vertices;\n          this.modelMatrix = mat4.create();\n          mat4.fromTranslation(this.modelMatrix, position);\n          this._normalMatrix = mat4.create();\n+ \n+         this.color = color;\n      }\n  \n      get normalMatrix () {\n\n```\n![Multiple objects 2](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/multiple-objects-2.gif)\n\nYay! 
We now can render multiple objects with individual transforms and colors 🎉\n\nThat's it for today, see you tomorrow 👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day19)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day19)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 20. Texturing 3d objects\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day20)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day20)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋 Welcome to [WebGL month](https://github.com/lesnitsky/webgl-month)\n\nToday we're going to explore how to add textures to 3d objects.\n\n\nFirst we'll need a new entry point\n\n📄 index.html\n```diff\n      \u003c/head\u003e\n      \u003cbody\u003e\n          \u003ccanvas\u003e\u003c/canvas\u003e\n-         \u003cscript src=\"./dist/3d.js\"\u003e\u003c/script\u003e\n+         \u003cscript src=\"./dist/3d-textured.js\"\u003e\u003c/script\u003e\n      \u003c/body\u003e\n  
\u003c/html\u003e\n\n```\n📄 src/3d-textured.js\n```js\nconsole.log('Hello textures');\n\n```\n📄 webpack.config.js\n```diff\n          texture: './src/texture.js',\n          'rotating-square': './src/rotating-square.js',\n          '3d': './src/3d.js',\n+         '3d-textured': './src/3d-textured.js',\n      },\n  \n      output: {\n\n```\nNow let's create simple shaders to render a 3d object with solid color. [Learn more in this tutorial](https://dev.to/lesnitsky/webgl-month-day-15-rendering-a-3d-cube-190f)\n\n📄 src/shaders/3d-textured.f.glsl\n```glsl\nprecision mediump float;\n\nvoid main() {\n    gl_FragColor = vec4(1, 0, 0, 1);\n}\n\n```\n📄 src/shaders/3d-textured.v.glsl\n```glsl\nattribute vec3 position;\n\nuniform mat4 modelMatrix;\nuniform mat4 viewMatrix;\nuniform mat4 projectionMatrix;\n\nvoid main() {\n    gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n}\n\n```\nWe'll need a canvas, webgl context and make canvas fullscreen\n\n📄 src/3d-textured.js\n```diff\n- console.log('Hello textures');\n+ const canvas = document.querySelector('canvas');\n+ const gl = canvas.getContext('webgl');\n+ \n+ const width = document.body.offsetWidth;\n+ const height = document.body.offsetHeight;\n+ \n+ canvas.width = width * devicePixelRatio;\n+ canvas.height = height * devicePixelRatio;\n+ \n+ canvas.style.width = `${width}px`;\n+ canvas.style.height = `${height}px`;\n\n```\nCreate and compile shaders. 
[Learn more here](https://dev.to/lesnitsky/shaders-and-points-3h2c)\n\n📄 src/3d-textured.js\n```diff\n+ import vShaderSource from './shaders/3d-textured.v.glsl';\n+ import fShaderSource from './shaders/3d-textured.f.glsl';\n+ import { compileShader } from './gl-helpers';\n+ \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  \n  canvas.style.width = `${width}px`;\n  canvas.style.height = `${height}px`;\n+ \n+ const vShader = gl.createShader(gl.VERTEX_SHADER);\n+ const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ compileShader(gl, vShader, vShaderSource);\n+ compileShader(gl, fShader, fShaderSource);\n\n```\nCreate, link and use webgl program\n\n📄 src/3d-textured.js\n```diff\n  \n  compileShader(gl, vShader, vShaderSource);\n  compileShader(gl, fShader, fShaderSource);\n+ \n+ const program = gl.createProgram();\n+ \n+ gl.attachShader(program, vShader);\n+ gl.attachShader(program, fShader);\n+ \n+ gl.linkProgram(program);\n+ gl.useProgram(program);\n\n```\nEnable depth test since we're rendering 3d. [Learn more here](https://dev.to/lesnitsky/webgl-month-day-16-colorizing-cube-depth-buffer-and-array-uniforms-4nhc)\n\n📄 src/3d-textured.js\n```diff\n  \n  gl.linkProgram(program);\n  gl.useProgram(program);\n+ \n+ gl.enable(gl.DEPTH_TEST);\n\n```\nSetup shader input. 
[Learn more here](https://dev.to/lesnitsky/webgl-month-day-11-3plb)\n\n📄 src/3d-textured.js\n```diff\n  import vShaderSource from './shaders/3d-textured.v.glsl';\n  import fShaderSource from './shaders/3d-textured.f.glsl';\n- import { compileShader } from './gl-helpers';\n+ import { compileShader, setupShaderInput } from './gl-helpers';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.useProgram(program);\n  \n  gl.enable(gl.DEPTH_TEST);\n+ \n+ const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n\n```\nNow let's go to [Blender](https://www.blender.org/) and create a cube, but make sure to check \"Generate UVs\" so that blender can map cube vertices to a plain image.\n\n![Blender 1](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-step-1.png)\n\n\nNext open \"UV Editing\" view\n\n![Blender 2](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-step-2.png)\n\n\nEnter edit mode\n\n![Blender 3](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-step-3.png)\n\n\nUnwrapped cube looks good already, so we can export UV layout\n\n![Blender 4](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-step-4.png)\n\n\nNow if we open exported image in some editor we'll see something like this\n\n![Photoshop 1](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/photoshop-1.png)\n\n\nCool, now we can actually fill our texture with some content\n\nLet's render a minecraft dirt block\n\n![Photoshop 2](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/photoshop-2.png)\n\n\nNext we need to export our object from blender, but don't forget to triangulate it first\n\n![Blender 5](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-step-5.png)\n\n\nAnd finally export our object\n\n![Blender 6](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/blender-step-6.png)\n\n\nNow let's import our cube and create an object. 
[Learn here about this helper class](https://dev.to/lesnitsky/webgl-month-day-19-rendering-multiple-objects-45m7)\n\n📄 src/3d-textured.js\n```diff\n  import vShaderSource from './shaders/3d-textured.v.glsl';\n  import fShaderSource from './shaders/3d-textured.f.glsl';\n  import { compileShader, setupShaderInput } from './gl-helpers';\n+ import cubeObj from '../assets/objects/textured-cube.obj';\n+ import { Object3D } from './Object3D';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.enable(gl.DEPTH_TEST);\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n+ \n+ const cube = new Object3D(cubeObj, [0, 0, 0], [1, 0, 0]);\n\n```\nIf we look into the object source, we'll see lines like the ones below\n\n```\nvt 0.625000 0.000000\nvt 0.375000 0.250000\nvt 0.375000 0.000000\nvt 0.625000 0.250000\nvt 0.375000 0.500000\n```\n\nThese are texture coordinates, referenced by the 2nd \"property\" of each face entry\n\n```\nf 2/1/1 3/2/1 1/3/1\n\n# vertexIndex / textureCoordinateIndex / normalIndex\n```\n\nso we need to update our parser to support texture coordinates\n\n📄 src/gl-helpers.js\n```diff\n  export function parseObj(objSource) {\n      const _vertices = [];\n      const _normals = [];\n+     const _texCoords = [];\n+ \n      const vertexIndices = [];\n      const normalIndices = [];\n+     const texCoordIndices = [];\n  \n      objSource.split('\\n').forEach(line =\u003e {\n          if (line.startsWith('v ')) {\n              _normals.push(parseVec(line, 'vn '));\n          }\n  \n+         if (line.startsWith('vt ')) {\n+             _texCoords.push(parseVec(line, 'vt '));\n+         }\n+ \n          if (line.startsWith('f ')) {\n              const parsedFace = parseFace(line);\n  \n              vertexIndices.push(...parsedFace.map(face =\u003e face[0] - 1));\n+             texCoordIndices.push(...parsedFace.map(face =\u003e face[1] - 1));\n              
normalIndices.push(...parsedFace.map(face =\u003e face[2] - 1));\n          }\n      });\n  \n      const vertices = [];\n      const normals = [];\n+     const texCoords = [];\n  \n      for (let i = 0; i \u003c vertexIndices.length; i++) {\n          const vertexIndex = vertexIndices[i];\n          const normalIndex = normalIndices[i];\n+         const texCoordIndex = texCoordIndices[i];\n  \n          const vertex = _vertices[vertexIndex];\n          const normal = _normals[normalIndex];\n+         const texCoord = _texCoords[texCoordIndex];\n  \n          vertices.push(...vertex);\n          normals.push(...normal);\n+ \n+         if (texCoord) {\n+             texCoords.push(...texCoord);\n+         }\n      }\n  \n      return { \n          vertices: new Float32Array(vertices), \n-         normals: new Float32Array(normals), \n+         normals: new Float32Array(normals),\n+         texCoords: new Float32Array(texCoords), \n      };\n  }\n\n```\nand add this property to Object3D\n\n📄 src/Object3D.js\n```diff\n  \n  export class Object3D {\n      constructor(source, position, color) {\n-         const { vertices, normals } = parseObj(source);\n+         const { vertices, normals, texCoords } = parseObj(source);\n  \n          this.vertices = vertices;\n          this.normals = normals;\n          this.position = position;\n+         this.texCoords = texCoords;\n  \n          this.modelMatrix = mat4.create();\n          mat4.fromTranslation(this.modelMatrix, position);\n\n```\nNow we need to define gl buffers. 
[Learn more about this helper class here](https://dev.to/lesnitsky/webgl-month-day-11-3plb)\n\n📄 src/3d-textured.js\n```diff\n  import { compileShader, setupShaderInput } from './gl-helpers';\n  import cubeObj from '../assets/objects/textured-cube.obj';\n  import { Object3D } from './Object3D';\n+ import { GLBuffer } from './GLBuffer';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n  const cube = new Object3D(cubeObj, [0, 0, 0], [1, 0, 0]);\n+ \n+ const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n+ const texCoordsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.texCoords, gl.STATIC_DRAW);\n\n```\nWe also need to define an attribute to pass tex coords to the vertex shader\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  attribute vec3 position;\n+ attribute vec2 texCoord;\n  \n  uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n\n```\nand varying to pass texture coordinate to the fragment shader. 
[Learn more here](https://dev.to/lesnitsky/shader-varyings-2p0f)\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  precision mediump float;\n  \n+ varying vec2 vTexCoord;\n+ \n  void main() {\n      gl_FragColor = vec4(1, 0, 0, 1);\n  }\n\n```\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n  \n+ varying vec2 vTexCoord;\n+ \n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n+ \n+     vTexCoord = texCoord;\n  }\n\n```\nLet's setup attributes\n\n📄 src/3d-textured.js\n```diff\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n  const texCoordsBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.texCoords, gl.STATIC_DRAW);\n+ \n+ vertexBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n+ \n+ texCoordsBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n\n```\nCreate and setup view and projection matrix. 
[Learn more here](https://dev.to/lesnitsky/webgl-month-day-15-rendering-a-3d-cube-190f)\n\n📄 src/3d-textured.js\n```diff\n+ import { mat4 } from 'gl-matrix';\n+ \n  import vShaderSource from './shaders/3d-textured.v.glsl';\n  import fShaderSource from './shaders/3d-textured.f.glsl';\n  import { compileShader, setupShaderInput } from './gl-helpers';\n  \n  texCoordsBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.texCoord, 2, gl.FLOAT, false, 0, 0);\n+ \n+ const viewMatrix = mat4.create();\n+ const projectionMatrix = mat4.create();\n+ \n+ mat4.lookAt(\n+     viewMatrix,\n+     [0, 0, -7],\n+     [0, 0, 0],\n+     [0, 1, 0],\n+ );\n+ \n+ mat4.perspective(\n+     projectionMatrix,\n+     Math.PI / 360 * 90,\n+     canvas.width / canvas.height,\n+     0.01,\n+     100,\n+ );\n\n```\nPass view and projection matrices to shader via uniforms\n\n📄 src/3d-textured.js\n```diff\n      0.01,\n      100,\n  );\n+ \n+ gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n+ gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n\n```\nSetup viewport\n\n📄 src/3d-textured.js\n```diff\n  \n  gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n  gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n+ \n+ gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nand finally render our cube\n\n📄 src/3d-textured.js\n```diff\n  gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n+ \n+ function frame() {\n+     mat4.rotateY(cube.modelMatrix, cube.modelMatrix, Math.PI / 180);\n+ \n+     gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, cube.modelMatrix);\n+     gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, cube.normalMatrix);\n+ \n+     gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 
3);\n+ \n+     requestAnimationFrame(frame);\n+ }\n+ \n+ frame();\n\n```\nbut before rendering the cube we need to load our texture image. [Learn more about loadImage helper here](https://dev.to/lesnitsky/webgl-month-day-8-textures-1mk8)\n\n📄 src/3d-textured.js\n```diff\n  \n  import vShaderSource from './shaders/3d-textured.v.glsl';\n  import fShaderSource from './shaders/3d-textured.f.glsl';\n- import { compileShader, setupShaderInput } from './gl-helpers';\n+ import { compileShader, setupShaderInput, loadImage } from './gl-helpers';\n  import cubeObj from '../assets/objects/textured-cube.obj';\n  import { Object3D } from './Object3D';\n  import { GLBuffer } from './GLBuffer';\n+ import textureSource from '../assets/images/cube-texture.png';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n      requestAnimationFrame(frame);\n  }\n  \n- frame();\n+ loadImage(textureSource).then((image) =\u003e {\n+     frame();\n+ });\n\n```\n📄 webpack.config.js\n```diff\n              },\n  \n              {\n-                 test: /\\.jpg$/,\n+                 test: /\\.(jpg|png)$/,\n                  use: 'url-loader',\n              },\n          ],\n\n```\nand create webgl texture. 
[Learn more here](https://dev.to/lesnitsky/webgl-month-day-10-multiple-textures-gf3)\n\n📄 src/3d-textured.js\n```diff\n  \n  import vShaderSource from './shaders/3d-textured.v.glsl';\n  import fShaderSource from './shaders/3d-textured.f.glsl';\n- import { compileShader, setupShaderInput, loadImage } from './gl-helpers';\n+ import { compileShader, setupShaderInput, loadImage, createTexture, setImage } from './gl-helpers';\n  import cubeObj from '../assets/objects/textured-cube.obj';\n  import { Object3D } from './Object3D';\n  import { GLBuffer } from './GLBuffer';\n  }\n  \n  loadImage(textureSource).then((image) =\u003e {\n+     const texture = createTexture(gl);\n+     setImage(gl, texture, image);\n+ \n      frame();\n  });\n\n```\nand read fragment colors from the texture\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  precision mediump float;\n+ uniform sampler2D texture;\n  \n  varying vec2 vTexCoord;\n  \n  void main() {\n-     gl_FragColor = vec4(1, 0, 0, 1);\n+     gl_FragColor = texture2D(texture, vTexCoord);\n  }\n\n```\nLet's move the camera up a bit to see the \"grass\" side\n\n📄 src/3d-textured.js\n```diff\n  \n  mat4.lookAt(\n      viewMatrix,\n-     [0, 0, -7],\n+     [0, 4, -7],\n      [0, 0, 0],\n      [0, 1, 0],\n  );\n\n```\nSomething is wrong: the top part is partially white, but why?\n\nTurns out the image is flipped vertically when read by the GPU, so we need to flip it back\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  varying vec2 vTexCoord;\n  \n  void main() {\n-     gl_FragColor = texture2D(texture, vTexCoord);\n+     gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1));\n  }\n\n```\n![Minecraft cube](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/minecraft-cube.gif)\n\nCool, we rendered a minecraft cube with WebGL 🎉\n\nThat's it for today, see you tomorrow 👋!\n\n---\n\n[![GitHub 
stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 21. Rendering a minecraft terrain\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day21)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day21)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month.\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-20-rendering-a-minecraft-dirt-cube-5ag3) we rendered a single minecraft dirt cube, let's render a terrain today!\n\n\nWe'll need to store each block's position in a separate transform matrix\n\n📄 src/3d-textured.js\n```diff\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n+ const matrices = [];\n+ \n  function frame() {\n      mat4.rotateY(cube.modelMatrix, cube.modelMatrix, Math.PI / 180);\n  \n\n```\nNow let's create 10k blocks, iterating over the x and z axes from -50 to 50\n\n📄 src/3d-textured.js\n```diff\n  \n  const matrices = [];\n  \n+ 
for (let i = -50; i \u003c 50; i++) {\n+     for (let j = -50; j \u003c 50; j++) {\n+         const matrix = mat4.create();\n+     }\n+ }\n+ \n  function frame() {\n      mat4.rotateY(cube.modelMatrix, cube.modelMatrix, Math.PI / 180);\n  \n\n```\nEach block has a size of 2 (vertex coordinates are in the [-1..1] range), so positions should be divisible by two\n\n📄 src/3d-textured.js\n```diff\n  for (let i = -50; i \u003c 50; i++) {\n      for (let j = -50; j \u003c 50; j++) {\n          const matrix = mat4.create();\n+ \n+         const position = [i * 2, (Math.floor(Math.random() * 2) - 1) * 2, j * 2];\n      }\n  }\n  \n\n```\nNow we need to create a transform matrix. Let's use `mat4.fromTranslation`\n\n📄 src/3d-textured.js\n```diff\n          const matrix = mat4.create();\n  \n          const position = [i * 2, (Math.floor(Math.random() * 2) - 1) * 2, j * 2];\n+         mat4.fromTranslation(matrix, position);\n      }\n  }\n  \n\n```\nLet's also rotate each block around the Y axis to make the terrain look more random\n\n📄 src/3d-textured.js\n```diff\n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  const matrices = [];\n+ const rotationMatrix = mat4.create();\n  \n  for (let i = -50; i \u003c 50; i++) {\n      for (let j = -50; j \u003c 50; j++) {\n  \n          const position = [i * 2, (Math.floor(Math.random() * 2) - 1) * 2, j * 2];\n          mat4.fromTranslation(matrix, position);\n+ \n+         mat4.fromRotation(rotationMatrix, Math.PI * Math.round(Math.random() * 4), [0, 1, 0]);\n+         mat4.multiply(matrix, matrix, rotationMatrix);\n      }\n  }\n  \n\n```\nand finally push each block's matrix to the matrices collection\n\n📄 src/3d-textured.js\n```diff\n  \n          mat4.fromRotation(rotationMatrix, Math.PI * Math.round(Math.random() * 4), [0, 1, 0]);\n          mat4.multiply(matrix, matrix, rotationMatrix);\n+ \n+         matrices.push(matrix);\n      }\n  }\n  \n\n```\nSince our blocks are static, we don't need a rotation transform in each frame\n\n📄 
src/3d-textured.js\n```diff\n  }\n  \n  function frame() {\n-     mat4.rotateY(cube.modelMatrix, cube.modelMatrix, Math.PI / 180);\n- \n      gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, cube.modelMatrix);\n      gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, cube.normalMatrix);\n  \n\n```\nNow we'll need to iterate over the matrices collection and issue a draw call for each cube, with its transform matrix passed to a uniform\n\n📄 src/3d-textured.js\n```diff\n  }\n  \n  function frame() {\n-     gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, cube.modelMatrix);\n-     gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, cube.normalMatrix);\n+     matrices.forEach((matrix) =\u003e {\n+         gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, matrix);\n+         gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, cube.normalMatrix);\n  \n-     gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n+         gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n+     });\n  \n      requestAnimationFrame(frame);\n  }\n\n```\nNow let's create an animation of a rotating camera. The camera has a position and a point it looks at, so to implement this we need to rotate the focus point around the camera position. 
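Rotating a point around a pivot boils down to three steps: translate so the pivot sits at the origin, rotate, then translate back. Here is a minimal sketch of that idea with plain numbers (the `rotateAroundPivotY` helper is hypothetical, not part of the tutorial source):

```javascript
// Rotate `point` around `pivot` by `angle` radians about the Y axis.
// Hypothetical helper illustrating the translate / rotate / translate-back pattern.
function rotateAroundPivotY(point, pivot, angle) {
    // 1. translate so the pivot sits at the origin
    const x = point[0] - pivot[0];
    const z = point[2] - pivot[2];

    // 2. rotate about the Y axis
    const rx = x * Math.cos(angle) + z * Math.sin(angle);
    const rz = -x * Math.sin(angle) + z * Math.cos(angle);

    // 3. translate back to the original pivot position
    return [rx + pivot[0], point[1], rz + pivot[2]];
}
```

Calling something like this once per frame with a small angle moves the focus point along a circle around the camera, which is what the matrix-based version in the tutorial achieves.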
Let's first get rid of the static view matrix\n\n📄 src/3d-textured.js\n```diff\n  const viewMatrix = mat4.create();\n  const projectionMatrix = mat4.create();\n  \n- mat4.lookAt(viewMatrix, [0, 4, -7], [0, 0, 0], [0, 1, 0]);\n- \n  mat4.perspective(projectionMatrix, (Math.PI / 360) * 90, canvas.width / canvas.height, 0.01, 100);\n  \n  gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n\n```\nDefine the camera position, the camera focus point vector, and the focus point transform matrix\n\n📄 src/3d-textured.js\n```diff\n- import { mat4 } from 'gl-matrix';\n+ import { mat4, vec3 } from 'gl-matrix';\n  \n  import vShaderSource from './shaders/3d-textured.v.glsl';\n  import fShaderSource from './shaders/3d-textured.f.glsl';\n      }\n  }\n  \n+ const cameraPosition = [0, 10, 0];\n+ const cameraFocusPoint = vec3.fromValues(30, 0, 0);\n+ const cameraFocusPointMatrix = mat4.create();\n+ \n+ mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n+ \n  function frame() {\n      matrices.forEach((matrix) =\u003e {\n          gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, matrix);\n\n```\nOur camera sits at the origin of the XZ plane, so we need to translate the camera focus point to the origin, rotate it, and translate it back to its original position\n\n📄 src/3d-textured.js\n```diff\n  mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n  \n  function frame() {\n+     mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [-30, 0, 0]);\n+     mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n+     mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [30, 0, 0]);\n+ \n      matrices.forEach((matrix) =\u003e {\n          gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, matrix);\n          gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, cube.normalMatrix);\n\n```\nFinal step – update view matrix uniform\n\n📄 src/3d-textured.js\n```diff\n      
mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [30, 0, 0]);\n  \n+     mat4.getTranslation(cameraFocusPoint, cameraFocusPointMatrix);\n+ \n+     mat4.lookAt(viewMatrix, cameraPosition, cameraFocusPoint, [0, 1, 0]);\n+     gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n+ \n      matrices.forEach((matrix) =\u003e {\n          gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, matrix);\n-         gl.uniformMatrix4fv(programInfo.uniformLocations.normalMatrix, false, cube.normalMatrix);\n  \n          gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n      });\n\n```\nThat's it!\n\nThis approach is not very performant though, as we're issuing 2 gl calls for each object, so that's 20k gl calls each frame. GL calls are expensive, so we'll need to reduce this number. We'll learn a great technique tomorrow!\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social\u0026hash=day21)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social\u0026hash=day21)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 22. Optimizing minecraft terrain by reducing number of webgl calls by 5000 times\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month\n\n[Yesterday](https://dev.to/lesnitsky/webgl-month-day-21-rendering-a-minecraft-terrain-24b5) we rendered a minecraft terrain, but the implementation wasn't optimal. We had to issue two gl calls for each block: one to update the model matrix uniform, another to issue a draw call. There is a way to render the whole scene with a SINGLE call, reducing the number of calls by 5000 times 🤯.\n\n\nThis technique is called WebGL instancing. Our cubes share the same vertex and tex coord data; the only difference is the model matrix. Instead of passing it as a uniform, we can define an attribute\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  attribute vec3 position;\n  attribute vec2 texCoord;\n+ attribute mat4 modelMatrix;\n  \n- uniform mat4 modelMatrix;\n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n  \n\n```\nMatrix attributes are actually a number of `vec4` attributes, so if `mat4` attribute location is `0`, we'll have 4 separate attributes with locations `0`, `1`, `2`, `3`. 
Our `setupShaderInput` helper doesn't support these, so we'll need to enable each attribute manually\n\n📄 src/3d-textured.js\n```diff\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  \n+ for (let i = 0; i \u003c 4; i++) {\n+     gl.enableVertexAttribArray(programInfo.attributeLocations.modelMatrix + i);\n+ }\n+ \n  const cube = new Object3D(cubeObj, [0, 0, 0], [1, 0, 0]);\n  \n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n\n```\nNow we need to define a Float32Array for matrices data. The size is `100 * 100` (size of our world) `* 4 * 4` (dimensions of the model matrix)\n\n📄 src/3d-textured.js\n```diff\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n- const matrices = [];\n+ const matrices = new Float32Array(100 * 100 * 4 * 4);\n  const rotationMatrix = mat4.create();\n  \n  for (let i = -50; i \u003c 50; i++) {\n\n```\nTo save resources we can use a single model matrix for all cubes while filling matrices array with data\n\n📄 src/3d-textured.js\n```diff\n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n  const matrices = new Float32Array(100 * 100 * 4 * 4);\n+ const modelMatrix = mat4.create();\n  const rotationMatrix = mat4.create();\n  \n  for (let i = -50; i \u003c 50; i++) {\n      for (let j = -50; j \u003c 50; j++) {\n-         const matrix = mat4.create();\n- \n          const position = [i * 2, (Math.floor(Math.random() * 2) - 1) * 2, j * 2];\n-         mat4.fromTranslation(matrix, position);\n+         mat4.fromTranslation(modelMatrix, position);\n  \n          mat4.fromRotation(rotationMatrix, Math.PI * Math.round(Math.random() * 4), [0, 1, 0]);\n-         mat4.multiply(matrix, matrix, rotationMatrix);\n+         mat4.multiply(modelMatrix, modelMatrix, rotationMatrix);\n  \n          matrices.push(matrix);\n      }\n\n```\nWe'll also need a counter to know the offset at the matrices Float32Array to write data to an appropriate location\n\n📄 
src/3d-textured.js\n```diff\n  const modelMatrix = mat4.create();\n  const rotationMatrix = mat4.create();\n  \n+ let cubeIndex = 0;\n+ \n  for (let i = -50; i \u003c 50; i++) {\n      for (let j = -50; j \u003c 50; j++) {\n          const position = [i * 2, (Math.floor(Math.random() * 2) - 1) * 2, j * 2];\n          mat4.fromRotation(rotationMatrix, Math.PI * Math.round(Math.random() * 4), [0, 1, 0]);\n          mat4.multiply(modelMatrix, modelMatrix, rotationMatrix);\n  \n-         matrices.push(matrix);\n+         modelMatrix.forEach((value, index) =\u003e {\n+             matrices[cubeIndex * 4 * 4 + index] = value;\n+         });\n+ \n+         cubeIndex++;\n      }\n  }\n  \n\n```\nNext we need a matrices gl buffer\n\n📄 src/3d-textured.js\n```diff\n      }\n  }\n  \n+ const matricesBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, matrices, gl.STATIC_DRAW);\n+ \n  const cameraPosition = [0, 10, 0];\n  const cameraFocusPoint = vec3.fromValues(30, 0, 0);\n  const cameraFocusPointMatrix = mat4.create();\n\n```\nand setup attribute pointer using stride and offset, since our buffer is interleaved. 
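Before wiring the pointers up, here is the byte arithmetic behind those stride and offset values, sketched standalone (assuming the standard 4 bytes per `gl.FLOAT`):

```javascript
// One mat4 in the buffer: 4 rows (each a vec4), each row 4 floats of 4 bytes.
const BYTES_PER_FLOAT = 4;
const rowSizeInBytes = 4 * BYTES_PER_FLOAT; // 16 bytes per vec4 row
const strideInBytes = rowSizeInBytes * 4;   // 64 bytes per whole mat4

// Byte offset of each of the 4 vec4 rows inside a single matrix
const rowOffsets = [0, 1, 2, 3].map((row) => row * rowSizeInBytes);
```

Each of the four attribute pointers gets the same stride (one whole matrix) but a different row offset.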
[Learn more about interleaved buffers here](https://dev.to/lesnitsky/webgl-month-day-5-interleaved-buffers-2k9a)\n\n📄 src/3d-textured.js\n```diff\n  \n  const matricesBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, matrices, gl.STATIC_DRAW);\n  \n+ const offset = 4 * 4; // 4 floats 4 bytes each\n+ const stride = offset * 4; // 4 rows of 4 floats\n+ \n+ for (let i = 0; i \u003c 4; i++) {\n+     gl.vertexAttribPointer(programInfo.attributeLocations.modelMatrix + i, 4, gl.FLOAT, false, stride, i * offset);\n+ }\n+ \n  const cameraPosition = [0, 10, 0];\n  const cameraFocusPoint = vec3.fromValues(30, 0, 0);\n  const cameraFocusPointMatrix = mat4.create();\n\n```\nInstancing isn't supported by webgl 1 out of the box, but it is available via an extension, so we need to get it\n\n📄 src/3d-textured.js\n```diff\n  const offset = 4 * 4; // 4 floats 4 bytes each\n  const stride = offset * 4; // 4 rows of 4 floats\n  \n+ const ext = gl.getExtension('ANGLE_instanced_arrays');\n+ \n  for (let i = 0; i \u003c 4; i++) {\n      gl.vertexAttribPointer(programInfo.attributeLocations.modelMatrix + i, 4, gl.FLOAT, false, stride, i * offset);\n  }\n\n```\nBasically, this extension helps us avoid repeating vertex positions and texture coordinates for each cube, since they are the same. With instancing we tell WebGL to render N instances of an object, reusing some attribute data across all instances while reading \"unique\" data from other attributes. 
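The extension may be unavailable on some setups. Here's a hedged detection sketch with a WebGL2 fallback (`drawArraysInstanced` and `vertexAttribDivisor` are the standard built-in WebGL2 methods; the shape of the returned object is just our own convention for this sketch):

```javascript
// Returns a uniform instancing API, or null when instancing isn't available.
function getInstancingAPI(gl) {
    if (typeof gl.drawArraysInstanced === 'function') {
        // WebGL2: instancing is built in
        return {
            drawArraysInstanced: gl.drawArraysInstanced.bind(gl),
            vertexAttribDivisor: gl.vertexAttribDivisor.bind(gl),
        };
    }
    // WebGL1: fall back to the ANGLE extension
    const ext = gl.getExtension && gl.getExtension('ANGLE_instanced_arrays');
    if (!ext) return null;
    return {
        drawArraysInstanced: ext.drawArraysInstancedANGLE.bind(ext),
        vertexAttribDivisor: ext.vertexAttribDivisorANGLE.bind(ext),
    };
}
```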
To specify which attributes contain per-object data, we need to call the extension's `vertexAttribDivisorANGLE(location, divisor)` method.\n\nThe divisor determines how per-object data is read from an attribute.\n\nOur modelMatrix attribute has one matrix per object, so the divisor should be `1`.\nIf we used modelMatrix `A` for objects `0` and `1`, and `B` for objects `2` and `3`, the divisor would be `2`.\n\nIn our case it is `1`.\n\n📄 src/3d-textured.js\n```diff\n  \n  for (let i = 0; i \u003c 4; i++) {\n      gl.vertexAttribPointer(programInfo.attributeLocations.modelMatrix + i, 4, gl.FLOAT, false, stride, i * offset);\n+     ext.vertexAttribDivisorANGLE(programInfo.attributeLocations.modelMatrix + i, 1);\n  }\n  \n  const cameraPosition = [0, 10, 0];\n\n```\nFinally we can get rid of the iteration over all matrices and use a single draw call. However, we should call it on the extension instance instead of gl itself. The last argument is the number of instances we want to render\n\n📄 src/3d-textured.js\n```diff\n      mat4.lookAt(viewMatrix, cameraPosition, cameraFocusPoint, [0, 1, 0]);\n      gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n  \n-     matrices.forEach((matrix) =\u003e {\n-         gl.uniformMatrix4fv(programInfo.uniformLocations.modelMatrix, false, matrix);\n- \n-         gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n-     });\n+     ext.drawArraysInstancedANGLE(gl.TRIANGLES, 0, vertexBuffer.data.length / 3, 100 * 100);\n  \n      requestAnimationFrame(frame);\n  }\n\n```\nThat's it! 
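As a standalone illustration of the divisor rule (not part of the project code): with divisor `d`, instance `i` reads element `floor(i / d)` of the attribute buffer.

```javascript
// Which attribute element does a given instance read?
// Divisor 0 means "advance per vertex" (the default for position/texCoord).
function attributeElementFor(instance, divisor) {
    if (divisor === 0) return 'per-vertex';
    return Math.floor(instance / divisor);
}
```

So with divisor `1` every instance gets its own matrix, and with divisor `2` instances `2` and `3` would share matrix `B` (element `1`).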
We just reduced the number of gl calls by a factor of 5000 🎉!\n\nThe WebGL instancing extension is widely supported, so don't hesitate to use it whenever it makes sense.\n\nA typical case: rendering a lot of the same objects with different locations, colors, and other per-instance \"attributes\"\n\nThanks for reading!\nSee you tomorrow 👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 23. Skybox in WebGL\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month.\n\nIn previous tutorials we've rendered objects without any surroundings, but what if we want to add a sky to our scene?\n\nThere's a special texture type which might help us with that\n\nWe can treat our scene as a giant cube with the camera always at its center.\nSo all we need is to render this cube and apply a texture, like below\n\n![Skybox](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/skybox.png)\n\n\nThe vertex shader will have a position attribute, a texCoord varying, and view and projection matrix uniforms. 
We don't need a model matrix as our \"world\" cube is static\n\n📄 src/shaders/skybox.v.glsl\n```glsl\nattribute vec3 position;\nvarying vec3 vTexCoord;\n\nuniform mat4 projectionMatrix;\nuniform mat4 viewMatrix;\n\nvoid main() {\n\n}\n\n```\nIf our cube's vertex coordinates are in the `[-1..1]` range, we can use these coordinates as texture coordinates directly\n\n📄 src/shaders/skybox.v.glsl\n```diff\n  uniform mat4 viewMatrix;\n  \n  void main() {\n- \n+     vTexCoord = position;\n  }\n\n```\nAnd to calculate the position of a transformed vertex we need to multiply the vertex position by the view and projection matrices\n\n📄 src/shaders/skybox.v.glsl\n```diff\n  \n  void main() {\n      vTexCoord = position;\n+     gl_Position = projectionMatrix * viewMatrix * vec4(position, 1.0);\n  }\n\n```\nThe fragment shader should have a vTexCoord varying to receive tex coords from the vertex shader\n\n📄 src/shaders/skybox.f.glsl\n```glsl\nprecision mediump float;\n\nvarying vec3 vTexCoord;\n\nvoid main() {\n\n}\n\n```\nand a special type of texture – sampler cube\n\n📄 src/shaders/skybox.f.glsl\n```diff\n  precision mediump float;\n  \n  varying vec3 vTexCoord;\n+ uniform samplerCube skybox;\n  \n  void main() {\n- \n  }\n\n```\nand all we need to do to calculate the fragment color is read it from the cubemap texture\n\n📄 src/shaders/skybox.f.glsl\n```diff\n  uniform samplerCube skybox;\n  \n  void main() {\n+     gl_FragColor = textureCube(skybox, vTexCoord);\n  }\n\n```\nAs usual we need to get a canvas reference and webgl context, and make the canvas fullscreen\n\n📄 src/skybox.js\n```js\nconst canvas = document.querySelector('canvas');\nconst gl = canvas.getContext('webgl');\n\nconst width = document.body.offsetWidth;\nconst height = document.body.offsetHeight;\n\ncanvas.width = width * devicePixelRatio;\ncanvas.height = height * devicePixelRatio;\n\ncanvas.style.width = `${width}px`;\ncanvas.style.height = `${height}px`;\n\n```\nSetup webgl program\n\n📄 src/skybox.js\n```diff\n+ import vShaderSource from 
'./shaders/skybox.v.glsl';\n+ import fShaderSource from './shaders/skybox.f.glsl';\n+ \n+ import { compileShader, setupShaderInput } from './gl-helpers';\n+ \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  \n  canvas.style.width = `${width}px`;\n  canvas.style.height = `${height}px`;\n+ \n+ const vShader = gl.createShader(gl.VERTEX_SHADER);\n+ const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ compileShader(gl, vShader, vShaderSource);\n+ compileShader(gl, fShader, fShaderSource);\n+ \n+ const program = gl.createProgram();\n+ \n+ gl.attachShader(program, vShader);\n+ gl.attachShader(program, fShader);\n+ \n+ gl.linkProgram(program);\n+ gl.useProgram(program);\n+ \n+ const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n\n```\nCreate cube object and setup buffer for vertex positions\n\n📄 src/skybox.js\n```diff\n  import fShaderSource from './shaders/skybox.f.glsl';\n  \n  import { compileShader, setupShaderInput } from './gl-helpers';\n+ import { Object3D } from './Object3D';\n+ import { GLBuffer } from './GLBuffer';\n+ \n+ import cubeObj from '../assets/objects/cube.obj';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  gl.useProgram(program);\n  \n  const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n+ \n+ const cube = new Object3D(cubeObj, [0, 0, 0], [0, 0, 0]);\n+ const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n\n```\nSetup position attribute\n\n📄 src/skybox.js\n```diff\n  \n  const cube = new Object3D(cubeObj, [0, 0, 0], [0, 0, 0]);\n  const vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n+ \n+ vertexBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n\n```\nSetup view, projection matrices, pass values to uniforms and set viewport\n\n📄 src/skybox.js\n```diff\n  
import { GLBuffer } from './GLBuffer';\n  \n  import cubeObj from '../assets/objects/cube.obj';\n+ import { mat4 } from 'gl-matrix';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  vertexBuffer.bind(gl);\n  gl.vertexAttribPointer(programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n+ \n+ const viewMatrix = mat4.create();\n+ const projectionMatrix = mat4.create();\n+ \n+ mat4.lookAt(viewMatrix, [0, 0, 0], [0, 0, -1], [0, 1, 0]);\n+ \n+ mat4.perspective(projectionMatrix, (Math.PI / 360) * 90, canvas.width / canvas.height, 0.01, 100);\n+ \n+ gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n+ gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n+ \n+ gl.viewport(0, 0, canvas.width, canvas.height);\n\n```\nAnd define a function which will render our scene\n\n📄 src/skybox.js\n```diff\n  gl.uniformMatrix4fv(programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n+ \n+ function frame() {\n+     gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n+ \n+     requestAnimationFrame(frame);\n+ }\n\n```\nNow the fun part. The texture for each side of the cube is stored in a separate file, so we need to load all the images. 
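Each image will end up on one face of the cube. When `textureCube` samples with our position as the direction, the face is picked by the direction's largest absolute component, which is why positions in `[-1..1]` work directly as texture coordinates. A plain-JS sketch of that selection rule (the face labels are ours):

```javascript
// Pick the cube map face a direction vector samples from:
// the axis with the largest absolute component wins, its sign picks the side.
function cubeMapFace([x, y, z]) {
    const ax = Math.abs(x);
    const ay = Math.abs(y);
    const az = Math.abs(z);
    if (ax >= ay && ax >= az) return x >= 0 ? '+x' : '-x';
    if (ay >= az) return y >= 0 ? '+y' : '-y';
    return z >= 0 ? '+z' : '-z';
}
```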
[Check out this site for other textures](http://www.custommapmakers.org/skyboxes.php)\n\n📄 src/skybox.js\n```diff\n  import vShaderSource from './shaders/skybox.v.glsl';\n  import fShaderSource from './shaders/skybox.f.glsl';\n  \n- import { compileShader, setupShaderInput } from './gl-helpers';\n+ import { compileShader, setupShaderInput, loadImage } from './gl-helpers';\n  import { Object3D } from './Object3D';\n  import { GLBuffer } from './GLBuffer';\n  \n  import cubeObj from '../assets/objects/cube.obj';\n  import { mat4 } from 'gl-matrix';\n  \n+ import rightTexture from '../assets/images/skybox/right.JPG';\n+ import leftTexture from '../assets/images/skybox/left.JPG';\n+ import upTexture from '../assets/images/skybox/up.JPG';\n+ import downTexture from '../assets/images/skybox/down.JPG';\n+ import backTexture from '../assets/images/skybox/back.JPG';\n+ import frontTexture from '../assets/images/skybox/front.JPG';\n+ \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  \n      requestAnimationFrame(frame);\n  }\n+ \n+ Promise.all([\n+     loadImage(rightTexture),\n+     loadImage(leftTexture),\n+     loadImage(upTexture),\n+     loadImage(downTexture),\n+     loadImage(backTexture),\n+     loadImage(frontTexture),\n+ ]).then((images) =\u003e {\n+     frame();\n+ });\n\n```\nNow we need to create a webgl texture\n\n📄 src/skybox.js\n```diff\n      loadImage(backTexture),\n      loadImage(frontTexture),\n  ]).then((images) =\u003e {\n+     const texture = gl.createTexture();\n+ \n      frame();\n  });\n\n```\nAnd pass a special texture type to bind method – `gl.TEXTURE_CUBE_MAP`\n\n📄 src/skybox.js\n```diff\n      loadImage(frontTexture),\n  ]).then((images) =\u003e {\n      const texture = gl.createTexture();\n+     gl.bindTexture(gl.TEXTURE_CUBE_MAP, texture);\n  \n      frame();\n  });\n\n```\nThen we need to setup texture\n\n📄 src/skybox.js\n```diff\n      const texture = gl.createTexture();\n      
gl.bindTexture(gl.TEXTURE_CUBE_MAP, texture);\n  \n+     gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n+     gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MAG_FILTER, gl.LINEAR);\n+     gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n+     gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n+ \n      frame();\n  });\n\n```\nand upload each image to gpu\n\nTargets are:\n\n-   `gl.TEXTURE_CUBE_MAP_POSITIVE_X` – right\n-   `gl.TEXTURE_CUBE_MAP_NEGATIVE_X` – left\n-   `gl.TEXTURE_CUBE_MAP_POSITIVE_Y` – top\n-   `gl.TEXTURE_CUBE_MAP_NEGATIVE_Y` – bottom\n-   `gl.TEXTURE_CUBE_MAP_POSITIVE_Z` – front\n-   `gl.TEXTURE_CUBE_MAP_NEGATIVE_Z` – back\n\nSince all these values are integers, we can iterate over all images and add image index to `TEXTURE_CUBE_MAP_POSITIVE_X` target\n\n📄 src/skybox.js\n```diff\n      gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n      gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n  \n+     images.forEach((image, index) =\u003e {\n+         gl.texImage2D(gl.TEXTURE_CUBE_MAP_POSITIVE_X + index, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);\n+     });\n+ \n      frame();\n  });\n\n```\nand finally let's reuse the code from [previous tutorial](https://dev.to/lesnitsky/webgl-month-day-21-rendering-a-minecraft-terrain-24b5) to implement camera rotation animation\n\n📄 src/skybox.js\n```diff\n  import { GLBuffer } from './GLBuffer';\n  \n  import cubeObj from '../assets/objects/cube.obj';\n- import { mat4 } from 'gl-matrix';\n+ import { mat4, vec3 } from 'gl-matrix';\n  \n  import rightTexture from '../assets/images/skybox/right.JPG';\n  import leftTexture from '../assets/images/skybox/left.JPG';\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n+ const cameraPosition = [0, 0, 0];\n+ const cameraFocusPoint = vec3.fromValues(0, 0, 1);\n+ const cameraFocusPointMatrix = mat4.create();\n+ \n+ 
mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n+ \n  function frame() {\n+     mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -1]);\n+     mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n+     mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, 1]);\n+ \n+     mat4.getTranslation(cameraFocusPoint, cameraFocusPointMatrix);\n+ \n+     mat4.lookAt(viewMatrix, cameraPosition, cameraFocusPoint, [0, 1, 0]);\n+     gl.uniformMatrix4fv(programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n+ \n      gl.drawArrays(gl.TRIANGLES, 0, vertexBuffer.data.length / 3);\n  \n      requestAnimationFrame(frame);\n\n```\nThat's it, we now have a skybox which makes the scene look more impressive 😎\n\nThanks for reading!\n\nSee you tomorrow 👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 24. Combining terrain and skybox\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month\n\nIn previous tutorials we've rendered the minecraft terrain and a skybox, but in separate examples. How do we combine them? WebGL allows using multiple programs, so we can combine both examples with a slight refactor.\n\nLet's create a new entry point file `minecraft.js` and assume `skybox.js` and `minecraft-terrain.js` export `prepare` and `render` functions\n\n```javascript\nimport { prepare as prepareSkybox, render as renderSkybox } from './skybox';\nimport { prepare as prepareTerrain, render as renderTerrain } from './minecraft-terrain';\n```\n\nNext we'll need to setup a canvas\n\n```javascript\nconst canvas = document.querySelector('canvas');\nconst gl = canvas.getContext('webgl');\n\nconst width = document.body.offsetWidth;\nconst height = document.body.offsetHeight;\n\ncanvas.width = width * devicePixelRatio;\ncanvas.height = height * devicePixelRatio;\n\ncanvas.style.width = `${width}px`;\ncanvas.style.height = `${height}px`;\n```\n\nSetup camera matrices\n\n```javascript\nconst viewMatrix = mat4.create();\nconst projectionMatrix = mat4.create();\n\nmat4.lookAt(viewMatrix, [0, 0, 0], [0, 0, -1], [0, 1, 0]);\n\nmat4.perspective(projectionMatrix, (Math.PI / 360) * 90, canvas.width / canvas.height, 0.01, 100);\n\ngl.viewport(0, 0, canvas.width, canvas.height);\n\nconst cameraPosition = [0, 5, 
0];\nconst cameraFocusPoint = vec3.fromValues(0, 0, 30);\nconst cameraFocusPointMatrix = mat4.create();\n\nmat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n```\n\nDefine a render function\n\n```javascript\nfunction render() {\n    renderSkybox(gl, viewMatrix, projectionMatrix);\n    renderTerrain(gl, viewMatrix, projectionMatrix);\n\n    requestAnimationFrame(render);\n}\n```\n\nand execute \"preparation\" code\n\n```javascript\n(async () =\u003e {\n    await prepareSkybox(gl);\n    await prepareTerrain(gl);\n\n    render();\n})();\n```\n\nNow we need to implement `prepare` and `render` functions of skybox and terrain\n\nBoth functions will require access to shared state, like WebGL program, attributes and buffers, so let's create an object\n\n```javascript\nconst State = {};\n\nexport async function prepare(gl) {\n    // initialization code goes here\n}\n```\n\nSo what's a \"preparation\" step?\n\nIt's about creating program\n\n```diff\n  export async function prepare(gl) {\n+     const vShader = gl.createShader(gl.VERTEX_SHADER);\n+     const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n\n+     compileShader(gl, vShader, vShaderSource);\n+     compileShader(gl, fShader, fShaderSource);\n\n+     const program = gl.createProgram();\n+     State.program = program;\n\n+     gl.attachShader(program, vShader);\n+     gl.attachShader(program, fShader);\n\n+     gl.linkProgram(program);\n+     gl.useProgram(program);\n\n+     State.programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n  }\n```\n\nBuffers\n\n```diff\n      gl.useProgram(program);\n\n      State.programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n\n+     const cube = new Object3D(cubeObj, [0, 0, 0], [0, 0, 0]);\n+     State.vertexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n  }\n```\n\nTextures\n\n```diff\n      const cube = new Object3D(cubeObj, [0, 0, 0], [0, 0, 0]);\n      State.vertexBuffer = new 
GLBuffer(gl, gl.ARRAY_BUFFER, cube.vertices, gl.STATIC_DRAW);\n\n+     await Promise.all([\n+         loadImage(rightTexture),\n+         loadImage(leftTexture),\n+         loadImage(upTexture),\n+         loadImage(downTexture),\n+         loadImage(backTexture),\n+         loadImage(frontTexture),\n+     ]).then((images) =\u003e {\n+         State.texture = gl.createTexture();\n+         gl.bindTexture(gl.TEXTURE_CUBE_MAP, State.texture);\n\n+         gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n+         gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_MAG_FILTER, gl.LINEAR);\n+         gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n+         gl.texParameteri(gl.TEXTURE_CUBE_MAP, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n\n+         images.forEach((image, index) =\u003e {\n+             gl.texImage2D(gl.TEXTURE_CUBE_MAP_POSITIVE_X + index, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);\n+         });\n+     });\n}\n```\n\nand setting up attributes\n\n```diff\n              gl.texImage2D(gl.TEXTURE_CUBE_MAP_POSITIVE_X + index, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);\n          });\n      });\n+     setupAttributes(gl);\n}\n```\n\nWe need a separate function to set up attributes because we'll need to do this in the render function as well. 
Attribute state is shared between different programs, so we'll need to set attributes up properly each time we switch to a different program\n\n`setupAttributes` looks like this for `skybox`\n\n```javascript\nfunction setupAttributes(gl) {\n    State.vertexBuffer.bind(gl);\n    gl.vertexAttribPointer(State.programInfo.attributeLocations.position, 3, gl.FLOAT, false, 0, 0);\n}\n```\n\nAnd now we need a render function which will pass view and projection matrices to uniforms and issue a draw call\n\n```javascript\nexport function render(gl, viewMatrix, projectionMatrix) {\n    gl.useProgram(State.program);\n\n    gl.uniformMatrix4fv(State.programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n    gl.uniformMatrix4fv(State.programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n\n    setupAttributes(gl);\n\n    gl.drawArrays(gl.TRIANGLES, 0, State.vertexBuffer.data.length / 3);\n}\n```\n\nThis refactor is pretty straightforward, as it only requires moving pieces of code into the appropriate functions, so the steps will look the same for `minecraft-terrain`, with one exception\n\nWe're using the `ANGLE_instanced_arrays` extension to render the terrain, which sets up attribute divisors. 
As attribute state is shared between programs, we'll need to \"reset\" those divisors.\n\n```javascript\nfunction resetDivisorAngles() {\n    for (let i = 0; i \u003c 4; i++) {\n        State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.modelMatrix + i, 0);\n    }\n}\n```\n\nand call this function after a draw call\n\n```javascript\nexport function render(gl, viewMatrix, projectionMatrix) {\n    gl.useProgram(State.program);\n\n    setupAttributes(gl);\n\n    gl.uniformMatrix4fv(State.programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n    gl.uniformMatrix4fv(State.programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n\n    State.ext.drawArraysInstancedANGLE(gl.TRIANGLES, 0, State.vertexBuffer.data.length / 3, 100 * 100);\n\n    resetDivisorAngles();\n}\n```\n\nDoes the resulting code actually work?\n\nUnfortunately no 😢\nThe issue is that the skybox cube we render is smaller than our terrain, but we can fix it with a single change in the skybox vertex shader\n\n```diff\n  attribute vec3 position;\n  varying vec3 vTexCoord;\n\n  uniform mat4 projectionMatrix;\n  uniform mat4 viewMatrix;\n\n  void main() {\n      vTexCoord = position;\n-     gl_Position = projectionMatrix * viewMatrix * vec4(position, 1);\n+     gl_Position = projectionMatrix * viewMatrix * vec4(position, 0.01);\n  }\n```\n\nBy changing the 4th component, we scale our skybox up 100 times (the magic of homogeneous coordinates).\n\nAfter this change the world looks ok, until we try to look at the farthest \"edge\" of our world cube. 
Skybox isn't rendered there 😢\n\nThis happens because of the `zFar` argument passed to projection matrix\n\n```diff\n  const projectionMatrix = mat4.create();\n\n  mat4.lookAt(viewMatrix, [0, 0, 0], [0, 0, -1], [0, 1, 0]);\n\n- mat4.perspective(projectionMatrix, (Math.PI / 360) * 90, canvas.width / canvas.height, 0.01, 100);\n+ mat4.perspective(projectionMatrix, (Math.PI / 360) * 90, canvas.width / canvas.height, 0.01, 142);\n\n  gl.viewport(0, 0, canvas.width, canvas.height);\n```\n\nThe distance to the farthest edge is `Math.sqrt(size ** 2 + size ** 2)`, which is `141.4213562373095`, so we can just pass `142`\n\nThat's it!\n\nThanks for reading, see you tomorrow 👋\n\n---\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 25. Mipmaps\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month\n\nToday we're going to learn one more webgl concept which might improve the quality of the final rendered image\n\nFirst we need to discuss how a color is read from a texture.\n\nLet's say we have a 1024x1024 image, but render only a 512x512 area on canvas. So each pixel of the resulting image represents 4 pixels of the original texture.\n\nHere's where `gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, filter)` comes into play\n\nThere are several algorithms for reading a color from a texture\n\n-   `gl.LINEAR` – reads the 4 closest pixels of the original image and blends their colors to calculate the final pixel color\n\n-   `gl.NEAREST` – just takes the color of the closest pixel of the original image. While more performant, this method gives lower quality\n\nBoth methods have their caveats, especially when the area being painted is much smaller than the original texture\n\nThere is a special technique which improves both the quality and the performance of rendering with textures. These special textures are called mipmaps – pre-calculated sequences of images, where each image has a progressively smaller resolution. 
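The level sizes in such a chain are easy to compute: each level halves the previous one down to 1x1, giving `log2(size) + 1` levels for a square power-of-2 texture. A quick sketch:

```javascript
// Level sizes of a full mipmap chain for a square power-of-2 texture.
function mipmapChain(size) {
    const levels = [size];
    while (size > 1) {
        size /= 2;
        levels.push(size);
    }
    return levels;
}

mipmapChain(1024); // 1024, 512, 256, ..., 1 — eleven levels in total
```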
So when the fragment shader reads a color from a texture, it takes the mipmap level closest in size and reads the color from it.\n\nIn WebGL 1.0 mipmaps can only be generated for textures with \"power-of-2\" sizes (256x256, 512x512, 1024x1024, etc.)\n\nAnd this is what the mipmap looks like for our dirt cube\n\n![Mipmap](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/mipmap.jpg)\n\n\nDon't worry, you don't need to generate such a sequence for all your textures yourself – this is done automatically as long as your texture size is a power of 2\n\n📄 src/minecraft-terrain.js\n```diff\n  \n  const State = {};\n  \n+ /**\n+  *\n+  * @param {WebGLRenderingContext} gl\n+  */\n  export async function prepare(gl) {\n      const vShader = gl.createShader(gl.VERTEX_SHADER);\n      const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n      await loadImage(textureSource).then((image) =\u003e {\n          const texture = createTexture(gl);\n          setImage(gl, texture, image);\n+ \n+         gl.generateMipmap(gl.TEXTURE_2D);\n      });\n  \n      setupAttributes(gl);\n\n```\nAnd in order to make the GPU read pixel colors from the mipmaps, we need to specify `TEXTURE_MIN_FILTER`.\n\n📄 src/minecraft-terrain.js\n```diff\n          setImage(gl, texture, image);\n  \n          gl.generateMipmap(gl.TEXTURE_2D);\n+         gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST_MIPMAP_LINEAR);\n      });\n  \n      setupAttributes(gl);\n\n```\n`NEAREST_MIPMAP_LINEAR` picks the nearest pixel within each of the two closest mipmap levels and linearly blends between those two levels to get the resulting color\n\n\nThat's it for today!\n\nThanks for reading, see you tomorrow 👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source 
code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 26. Rendering to texture\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋 Welcome to WebGL month.\n\nIn one of our previous tutorials we've built some simple image filters, like \"black and white\", \"sepia\", etc.\nCan we apply these \"post effects\" not only to an existing image, but to the whole 3d scene we're rendering?\n\nYes, we can! 
However we'll still need a texture to process, so we need to render our scene not to the canvas, but to a texture first\n\nAs we know from the very first tutorial, the canvas is just a buffer of pixel colors (4 integers: r, g, b, a)\nThere's also a depth buffer (for the Z coordinate of each pixel)\n\nSo the idea is to make WebGL render to some different \"buffer\" instead of the canvas.\n\nThere's a special type of buffer, called a `framebuffer`, which can be treated as a render target\n\n\nTo create a framebuffer we need to call `gl.createFramebuffer`\n\n📄 src/minecraft.js\n```diff\n  \n  mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n  \n+ const framebuffer = gl.createFramebuffer();\n+ \n  function render() {\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n\n```\nA framebuffer is not storage itself, but rather a set of references to \"attachments\" (color, depth)\n\nTo render colors we'll need a texture\n\n📄 src/minecraft.js\n```diff\n  \n  const framebuffer = gl.createFramebuffer();\n  \n+ const texture = gl.createTexture();\n+ \n+ gl.bindTexture(gl.TEXTURE_2D, texture);\n+ gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, canvas.width, canvas.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);\n+ \n+ gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n+ gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n+ gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n+ \n  function render() {\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n\n```\nNow we need to bind the framebuffer and setup a color attachment\n\n📄 src/minecraft.js\n```diff\n  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n  \n+ 
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n+ gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);\n+ \n  function render() {\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n\n```\nNow our canvas is white. Did we break something? No – everything is fine, but our scene is now rendered to a texture instead of the canvas\n\n\nNow we need to render from the texture to the canvas\n\n\nThe vertex shader is very simple: we just need to render a canvas-sized rectangle, so we can pass vertex positions from JS without any transformations\n\n📄 src/shaders/filter.v.glsl\n```glsl\nattribute vec2 position;\n\nvoid main() {\n    gl_Position = vec4(position, 0, 1);\n}\n\n```\nThe fragment shader needs a texture to read a color from and a resolution to transform pixel coordinates into texture coordinates\n\n📄 src/shaders/filter.f.glsl\n```glsl\nprecision mediump float;\n\nuniform sampler2D texture;\nuniform vec2 resolution;\n\nvoid main() {\n    gl_FragColor = texture2D(texture, gl_FragCoord.xy / resolution);\n}\n\n```\nNow we need to go through the program setup routine\n\n📄 src/minecraft.js\n```diff\n  import { prepare as prepareSkybox, render as renderSkybox } from './skybox';\n  import { prepare as prepareTerrain, render as renderTerrain } from './minecraft-terrain';\n  \n+ import vShaderSource from './shaders/filter.v.glsl';\n+ import fShaderSource from './shaders/filter.f.glsl';\n+ import { setupShaderInput, compileShader } from './gl-helpers';\n+ import { GLBuffer } from './GLBuffer';\n+ import { createRect } from './shape-helpers';\n+ \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);\n  \n+ const vShader = gl.createShader(gl.VERTEX_SHADER);\n+ const 
fShader = gl.createShader(gl.FRAGMENT_SHADER);\n+ \n+ compileShader(gl, vShader, vShaderSource);\n+ compileShader(gl, fShader, fShaderSource);\n+ \n+ const program = gl.createProgram();\n+ \n+ gl.attachShader(program, vShader);\n+ gl.attachShader(program, fShader);\n+ \n+ gl.linkProgram(program);\n+ gl.useProgram(program);\n+ \n+ const vertexPositionBuffer = new GLBuffer(\n+     gl,\n+     gl.ARRAY_BUFFER,\n+     new Float32Array([...createRect(-1, -1, 2, 2)]),\n+     gl.STATIC_DRAW\n+ );\n+ \n+ const indexBuffer = new GLBuffer(gl, gl.ELEMENT_ARRAY_BUFFER, new Uint8Array([0, 1, 2, 1, 2, 3]), gl.STATIC_DRAW);\n+ \n+ const programInfo = setupShaderInput(gl, program, vShaderSource, fShaderSource);\n+ \n+ vertexPositionBuffer.bind(gl);\n+ gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n+ \n+ gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n+ \n  function render() {\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n\n```\nAt the beginning of each frame we need to bind the framebuffer to tell WebGL to render to the texture\n\n📄 src/minecraft.js\n```diff\n  gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n  \n  function render() {\n+     gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n+ \n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, 30]);\n\n```\nand after we've rendered the scene to the texture, we need to use our new program\n\n📄 src/minecraft.js\n```diff\n      renderSkybox(gl, viewMatrix, projectionMatrix);\n      renderTerrain(gl, viewMatrix, projectionMatrix);\n  \n+     gl.useProgram(program);\n+ \n      requestAnimationFrame(render);\n  }\n  
\n\n```\nSetup program attributes and uniforms\n\n📄 src/minecraft.js\n```diff\n  \n      gl.useProgram(program);\n  \n+     vertexPositionBuffer.bind(gl);\n+     gl.vertexAttribPointer(programInfo.attributeLocations.position, 2, gl.FLOAT, false, 0, 0);\n+ \n+     gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n+ \n      requestAnimationFrame(render);\n  }\n  \n\n```\nBind the null framebuffer (this will make WebGL render to the canvas)\n\n📄 src/minecraft.js\n```diff\n  \n      gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n  \n+     gl.bindFramebuffer(gl.FRAMEBUFFER, null);\n+ \n      requestAnimationFrame(render);\n  }\n  \n\n```\nBind the texture to use it as a source of color data\n\n📄 src/minecraft.js\n```diff\n      gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n  \n      gl.bindFramebuffer(gl.FRAMEBUFFER, null);\n+     gl.bindTexture(gl.TEXTURE_2D, texture);\n  \n      requestAnimationFrame(render);\n  }\n\n```\nAnd issue a draw call\n\n📄 src/minecraft.js\n```diff\n      gl.bindFramebuffer(gl.FRAMEBUFFER, null);\n      gl.bindTexture(gl.TEXTURE_2D, texture);\n  \n+     gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n+ \n      requestAnimationFrame(render);\n  }\n  \n\n```\nSince we're binding a different texture after we render the terrain and skybox, we need to re-bind the textures in the terrain and skybox programs\n\n📄 src/minecraft-terrain.js\n```diff\n  \n      await loadImage(textureSource).then((image) =\u003e {\n          const texture = createTexture(gl);\n+         State.texture = texture;\n+ \n          setImage(gl, texture, image);\n  \n          gl.generateMipmap(gl.TEXTURE_2D);\n  \n      setupAttributes(gl);\n  \n+     gl.bindTexture(gl.TEXTURE_2D, State.texture);\n+ \n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n      
gl.uniformMatrix4fv(State.programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n\n```\n📄 src/skybox.js\n```diff\n  export function render(gl, viewMatrix, projectionMatrix) {\n      gl.useProgram(State.program);\n  \n+     gl.bindTexture(gl.TEXTURE_CUBE_MAP, State.texture);\n+ \n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n\n```\nWe need to create a depth buffer. A depth buffer is a renderbuffer (an object that stores data produced by the fragment shader output)\n\n📄 src/minecraft.js\n```diff\n  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);\n  \n+ const depthBuffer = gl.createRenderbuffer();\n+ gl.bindRenderbuffer(gl.RENDERBUFFER, depthBuffer);\n+ \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n\n```\nand setup the renderbuffer to store depth info\n\n📄 src/minecraft.js\n```diff\n  const depthBuffer = gl.createRenderbuffer();\n  gl.bindRenderbuffer(gl.RENDERBUFFER, depthBuffer);\n  \n+ gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, canvas.width, canvas.height);\n+ gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, depthBuffer);\n+ \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const fShader = gl.createShader(gl.FRAGMENT_SHADER);\n  \n\n```\nNow the scene looks better, but only for a single frame; subsequent frames seem to be drawn on top of the previous ones. This happens because the texture isn't cleared before the next draw call\n\n\nWe need to call `gl.clear` to clear the texture (it clears the currently bound framebuffer). This method accepts a bitmask which tells WebGL which buffers to clear. 
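A bitmask is just several flag bits combined with bitwise OR. For illustration, here are the actual numeric values of these two WebGL constants (from the OpenGL ES spec):

```javascript
// Actual values of the WebGL clear-bit constants
const COLOR_BUFFER_BIT = 0x4000;
const DEPTH_BUFFER_BIT = 0x100;

// Combine flags with bitwise OR to clear several buffers in one call
const mask = COLOR_BUFFER_BIT | DEPTH_BUFFER_BIT;

// Each flag can be tested back with bitwise AND
console.log(mask.toString(16)); // "4100"
console.log((mask & COLOR_BUFFER_BIT) !== 0); // true
```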
We need to clear both the color and depth buffers, so the mask is `gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT`\n\n📄 src/minecraft.js\n```diff\n  function render() {\n      gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n  \n+     gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);\n+ \n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, 30]);\n\n```\n\u003e NOTE: This should also be done before rendering to the canvas if the `preserveDrawingBuffer` context argument is set to true\n\n\nNow we can reuse our filter function from a previous tutorial to make the whole scene black and white\n\n📄 src/shaders/filter.f.glsl\n```diff\n  uniform sampler2D texture;\n  uniform vec2 resolution;\n  \n+ vec4 blackAndWhite(vec4 color) {\n+     return vec4(vec3(1.0, 1.0, 1.0) * (color.r + color.g + color.b) / 3.0, color.a);\n+ }\n+ \n  void main() {\n-     gl_FragColor = texture2D(texture, gl_FragCoord.xy / resolution);\n+     gl_FragColor = blackAndWhite(texture2D(texture, gl_FragCoord.xy / resolution));\n  }\n\n```\nThat's it!\n\nOffscreen rendering (rendering to a texture) might be used to apply different \"post\" effects like blur, water on the camera, etc. We'll learn another useful use case of offscreen rendering tomorrow\n\nThanks for reading! 
👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 27. Click detection. Part I\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nYesterday we've learned how to render to a texture. This enables some nice effects after the scene is completely rendered, but we can also take advantage of offscreen rendering for something else.\n\nOne important thing in interactive 3D is click detection. While it may be done in JavaScript, it involves some complex math. 
Instead we can:\n\n-   assign a unique solid color to each object\n-   render scene to a texture\n-   read pixel color under cursor\n-   match color with an object\n\n\nSince we'll need another framebuffer, let's create a helper class\n\n📄 src/RenderBuffer.js\n```js\nexport class RenderBuffer {\n    constructor(gl) {\n        this.framebuffer = gl.createFramebuffer();\n        this.texture = gl.createTexture();\n    }\n}\n\n```\nSetup framebuffer and color texture\n\n📄 src/RenderBuffer.js\n```diff\n      constructor(gl) {\n          this.framebuffer = gl.createFramebuffer();\n          this.texture = gl.createTexture();\n+ \n+         gl.bindTexture(gl.TEXTURE_2D, this.texture);\n+         gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.canvas.width, gl.canvas.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);\n+ \n+         gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n+         gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n+         gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n+ \n+         gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);\n+         gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, this.texture, 0);\n      }\n  }\n\n```\nSetup depth buffer\n\n📄 src/RenderBuffer.js\n```diff\n  \n          gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);\n          gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, this.texture, 0);\n+ \n+         this.depthBuffer = gl.createRenderbuffer();\n+         gl.bindRenderbuffer(gl.RENDERBUFFER, this.depthBuffer);\n+ \n+         gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, gl.canvas.width, gl.canvas.height);\n+         gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, this.depthBuffer);\n      }\n  }\n\n```\nImplement bind method\n\n📄 src/RenderBuffer.js\n```diff\n          gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, gl.canvas.width, 
gl.canvas.height);\n          gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, this.depthBuffer);\n      }\n+ \n+     bind(gl) {\n+         gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);\n+     }\n  }\n\n```\nand clear\n\n📄 src/RenderBuffer.js\n```diff\n      bind(gl) {\n          gl.bindFramebuffer(gl.FRAMEBUFFER, this.framebuffer);\n      }\n+ \n+     clear(gl) {\n+         this.bind(gl);\n+         gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);\n+     }\n  }\n\n```\nUse new helper class\n\n📄 src/minecraft.js\n```diff\n  import { setupShaderInput, compileShader } from './gl-helpers';\n  import { GLBuffer } from './GLBuffer';\n  import { createRect } from './shape-helpers';\n+ import { RenderBuffer } from './RenderBuffer';\n  \n  const canvas = document.querySelector('canvas');\n  const gl = canvas.getContext('webgl');\n  \n  mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n  \n- const framebuffer = gl.createFramebuffer();\n- \n- const texture = gl.createTexture();\n- \n- gl.bindTexture(gl.TEXTURE_2D, texture);\n- gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, canvas.width, canvas.height, 0, gl.RGBA, gl.UNSIGNED_BYTE, null);\n- \n- gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);\n- gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);\n- gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);\n- \n- gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n- gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, texture, 0);\n- \n- const depthBuffer = gl.createRenderbuffer();\n- gl.bindRenderbuffer(gl.RENDERBUFFER, depthBuffer);\n- \n- gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, canvas.width, canvas.height);\n- gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT, gl.RENDERBUFFER, depthBuffer);\n+ const offscreenRenderBuffer = new RenderBuffer(gl);\n  \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const 
fShader = gl.createShader(gl.FRAGMENT_SHADER);\n  gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n  \n  function render() {\n-     gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);\n- \n-     gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);\n+     offscreenRenderBuffer.clear(gl);\n  \n      mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n      mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n      gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n  \n      gl.bindFramebuffer(gl.FRAMEBUFFER, null);\n-     gl.bindTexture(gl.TEXTURE_2D, texture);\n+     gl.bindTexture(gl.TEXTURE_2D, offscreenRenderBuffer.texture);\n  \n      gl.drawElements(gl.TRIANGLES, indexBuffer.data.length, gl.UNSIGNED_BYTE, 0);\n  \n\n```\nInstead of passing the whole unique color of the object, which is a vec3, we can pass only the object index\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  attribute vec3 position;\n  attribute vec2 texCoord;\n  attribute mat4 modelMatrix;\n+ attribute float index;\n  \n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n\n```\nand convert this float to a color right in the shader\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  \n  varying vec2 vTexCoord;\n  \n+ vec3 encodeObject(float id) {\n+     int b = int(mod(id, 255.0));\n+     int r = int(id) / 255 / 255;\n+     int g = (int(id) - b - r * 255 * 255) / 255;\n+     return vec3(r, g, b) / 255.0;\n+ }\n+ \n  void main() {\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n  \n\n```\nNow we need to pass the color to the fragment shader via a varying\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  uniform sampler2D texture;\n  \n  varying vec2 vTexCoord;\n+ varying vec3 vColor;\n  \n  void main() {\n      gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1));\n\n```\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  uniform 
mat4 projectionMatrix;\n  \n  varying vec2 vTexCoord;\n+ varying vec3 vColor;\n  \n  vec3 encodeObject(float id) {\n      int b = int(mod(id, 255.0));\n      gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n  \n      vTexCoord = texCoord;\n+     vColor = encodeObject(index);\n  }\n\n```\nWe also need to specify what we want to render (a textured object or a colored one), so let's use a uniform for it\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  varying vec2 vTexCoord;\n  varying vec3 vColor;\n  \n+ uniform float renderIndices;\n+ \n  void main() {\n      gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1));\n+ \n+     if (renderIndices == 1.0) {\n+         gl_FragColor.rgb = vColor;\n+     }\n  }\n\n```\nNow let's create an indices array\n\n📄 src/minecraft-terrain.js\n```diff\n      State.modelMatrix = mat4.create();\n      State.rotationMatrix = mat4.create();\n  \n+     const indices = new Float32Array(100 * 100);\n+ \n      let cubeIndex = 0;\n  \n      for (let i = -50; i \u003c 50; i++) {\n\n```\nFill it with data and setup a GLBuffer\n\n📄 src/minecraft-terrain.js\n```diff\n                  matrices[cubeIndex * 4 * 4 + index] = value;\n              });\n  \n+             indices[cubeIndex] = cubeIndex;\n+ \n              cubeIndex++;\n          }\n      }\n  \n      State.matricesBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, matrices, gl.STATIC_DRAW);\n+     State.indexBuffer = new GLBuffer(gl, gl.ARRAY_BUFFER, indices, gl.STATIC_DRAW);\n  \n      State.offset = 4 * 4; // 4 floats 4 bytes each\n      State.stride = State.offset * 4; // 4 rows of 4 floats\n\n```\nSince we have a new attribute, we need to update the setupAttributes and resetDivisorAngles functions\n\n📄 src/minecraft-terrain.js\n```diff\n  \n          State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.modelMatrix + i, 1);\n      }\n+ \n+     State.indexBuffer.bind(gl);\n+     
gl.vertexAttribPointer(State.programInfo.attributeLocations.index, 1, gl.FLOAT, false, 0, 0);\n+     State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.index, 1);\n  }\n  \n  function resetDivisorAngles() {\n      for (let i = 0; i \u003c 4; i++) {\n          State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.modelMatrix + i, 0);\n      }\n+ \n+     State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.index, 0);\n  }\n  \n  export function render(gl, viewMatrix, projectionMatrix) {\n\n```\nAnd finally we need another argument to the render function to distinguish between \"render modes\" (either textured cubes or colored ones)\n\n📄 src/minecraft-terrain.js\n```diff\n      State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.index, 0);\n  }\n  \n- export function render(gl, viewMatrix, projectionMatrix) {\n+ export function render(gl, viewMatrix, projectionMatrix, renderIndices) {\n      gl.useProgram(State.program);\n  \n      setupAttributes(gl);\n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n+     if (renderIndices) {\n+         gl.uniform1f(State.programInfo.uniformLocations.renderIndices, 1);\n+     } else {\n+         gl.uniform1f(State.programInfo.uniformLocations.renderIndices, 0);\n+     }\n+ \n      State.ext.drawArraysInstancedANGLE(gl.TRIANGLES, 0, State.vertexBuffer.data.length / 3, 100 * 100);\n  \n      resetDivisorAngles();\n\n```\nNow we need another render buffer to render colored cubes to\n\n📄 src/minecraft.js\n```diff\n  mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n  \n  const offscreenRenderBuffer = new RenderBuffer(gl);\n+ const coloredCubesRenderBuffer = new RenderBuffer(gl);\n  \n  const vShader = gl.createShader(gl.VERTEX_SHADER);\n  const fShader = 
gl.createShader(gl.FRAGMENT_SHADER);\n\n```\nNow let's add a click listener\n\n📄 src/minecraft.js\n```diff\n      requestAnimationFrame(render);\n  }\n  \n+ document.body.addEventListener('click', () =\u003e {\n+     coloredCubesRenderBuffer.bind(gl);\n+ });\n+ \n  (async () =\u003e {\n      await prepareSkybox(gl);\n      await prepareTerrain(gl);\n\n```\nand render the colored cubes to a texture each time the user clicks the canvas\n\n📄 src/minecraft.js\n```diff\n  \n  document.body.addEventListener('click', () =\u003e {\n      coloredCubesRenderBuffer.bind(gl);\n+ \n+     renderTerrain(gl, viewMatrix, projectionMatrix, true);\n  });\n  \n  (async () =\u003e {\n\n```\nNow we need storage to read the pixel colors into\n\n📄 src/minecraft.js\n```diff\n      coloredCubesRenderBuffer.bind(gl);\n  \n      renderTerrain(gl, viewMatrix, projectionMatrix, true);\n+ \n+     const pixels = new Uint8Array(canvas.width * canvas.height * 4);\n  });\n  \n  (async () =\u003e {\n\n```\nand actually read the pixel colors\n\n📄 src/minecraft.js\n```diff\n      renderTerrain(gl, viewMatrix, projectionMatrix, true);\n  \n      const pixels = new Uint8Array(canvas.width * canvas.height * 4);\n+     gl.readPixels(0, 0, canvas.width, canvas.height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);\n  });\n  \n  (async () =\u003e {\n\n```\nThat's it, we now have the whole scene rendered to an offscreen texture, where each object has a unique color. We'll continue with click detection tomorrow\n\nThanks for reading! 
👋\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 28. Click detection. Part II\n\nThis is a series of blog posts related to WebGL. New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month\n\nYesterday we've rendered our minecraft terrain to an offscreen texture, where each object is encoded into a specific color, and learned how to read pixel colors from the texture back to JS. Now let's decode this color into an object index and highlight the selected cube\n\n\n`gl.readPixels` fills the `Uint8Array` with pixel colors starting from the bottom left corner. We need to convert client coordinates to the pixel coordinates in the array. 
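The coordinate arithmetic can be condensed into one small helper (the name `clientToPixelIndex` is mine, not the tutorial's; it assumes the framebuffer is `bufferWidth` physical pixels wide and that the resulting coordinates are integral):

```javascript
// Converts client (CSS) click coordinates into an index in the Uint8Array
// filled by gl.readPixels. readPixels starts at the bottom-left corner,
// and each pixel occupies 4 array entries (r, g, b, a).
function clientToPixelIndex(clientX, clientY, cssHeight, bufferWidth, dpr) {
    const x = clientX * dpr;
    const y = (cssHeight - clientY) * dpr; // flip Y: readPixels rows go bottom-up
    return (y * bufferWidth + x) * 4;
}

// a click at the bottom-left corner of an 800x600 canvas maps to index 0
console.log(clientToPixelIndex(0, 600, 600, 800, 1)); // 0
```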
Don't forget the pixel ratio: our offscreen framebuffer takes it into account, but event coordinates don't.\n\n📄 src/minecraft.js\n```diff\n      requestAnimationFrame(render);\n  }\n  \n- document.body.addEventListener('click', () =\u003e {\n+ document.body.addEventListener('click', (e) =\u003e {\n      coloredCubesRenderBuffer.bind(gl);\n  \n      renderTerrain(gl, viewMatrix, projectionMatrix, true);\n  \n      const pixels = new Uint8Array(canvas.width * canvas.height * 4);\n      gl.readPixels(0, 0, canvas.width, canvas.height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);\n+ \n+     const x = e.clientX * devicePixelRatio;\n+     const y = (canvas.offsetHeight - e.clientY) * devicePixelRatio;\n  });\n  \n  (async () =\u003e {\n\n```\nWe need to skip `y` rows (`y * canvas.width`) multiplied by 4 (4 integers per pixel)\n\n📄 src/minecraft.js\n```diff\n  \n      const x = e.clientX * devicePixelRatio;\n      const y = (canvas.offsetHeight - e.clientY) * devicePixelRatio;\n+ \n+     const rowsToSkip = y * canvas.width * 4;\n  });\n  \n  (async () =\u003e {\n\n```\nThe horizontal coordinate is `x * 4` (the coordinate multiplied by the number of integers per pixel)\n\n📄 src/minecraft.js\n```diff\n      const y = (canvas.offsetHeight - e.clientY) * devicePixelRatio;\n  \n      const rowsToSkip = y * canvas.width * 4;\n+     const col = x * 4;\n  });\n  \n  (async () =\u003e {\n\n```\nSo the final index of the pixel is `rowsToSkip + col`\n\n📄 src/minecraft.js\n```diff\n  \n      const rowsToSkip = y * canvas.width * 4;\n      const col = x * 4;\n+ \n+     const pixelIndex = rowsToSkip + col;\n  });\n  \n  (async () =\u003e {\n\n```\nNow we need to read each pixel color component\n\n📄 src/minecraft.js\n```diff\n      const col = x * 4;\n  \n      const pixelIndex = rowsToSkip + col;\n+ \n+     const r = pixels[pixelIndex];\n+     const g = pixels[pixelIndex + 1];\n+     const b = pixels[pixelIndex + 2];\n+     const a = pixels[pixelIndex + 3];\n  });\n  \n  (async () =\u003e {\n\n```\nNow 
we need to convert r, g and b back into an integer\n\n📄 src/minecraft.js\n```diff\n      requestAnimationFrame(render);\n  }\n  \n+ function rgbToInt(r, g, b) {\n+     return b + g * 255 + r * 255 ** 2;\n+ }\n+ \n  document.body.addEventListener('click', (e) =\u003e {\n      coloredCubesRenderBuffer.bind(gl);\n  \n\n```\nLet's drop the camera rotation code to make the scene static\n\n📄 src/minecraft.js\n```diff\n  function render() {\n      offscreenRenderBuffer.clear(gl);\n  \n-     mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, -30]);\n-     mat4.rotateY(cameraFocusPointMatrix, cameraFocusPointMatrix, Math.PI / 360);\n-     mat4.translate(cameraFocusPointMatrix, cameraFocusPointMatrix, [0, 0, 30]);\n- \n-     mat4.getTranslation(cameraFocusPoint, cameraFocusPointMatrix);\n- \n      mat4.lookAt(viewMatrix, cameraPosition, cameraFocusPoint, [0, 1, 0]);\n  \n      renderSkybox(gl, viewMatrix, projectionMatrix);\n      const g = pixels[pixelIndex + 1];\n      const b = pixels[pixelIndex + 2];\n      const a = pixels[pixelIndex + 3];\n+ \n+     const index = rgbToInt(r, g, b);\n+ \n+     console.log(index);\n  });\n  \n  (async () =\u003e {\n\n```\nand update the initial camera position to see the scene better\n\n📄 src/minecraft.js\n```diff\n  \n  gl.viewport(0, 0, canvas.width, canvas.height);\n  \n- const cameraPosition = [0, 5, 0];\n- const cameraFocusPoint = vec3.fromValues(0, 0, 30);\n+ const cameraPosition = [0, 10, 0];\n+ const cameraFocusPoint = vec3.fromValues(30, 0, 30);\n  const cameraFocusPointMatrix = mat4.create();\n  \n  mat4.fromTranslation(cameraFocusPointMatrix, cameraFocusPoint);\n\n```\nNext let's pass the selected object index into the vertex shader as a uniform\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  \n  uniform mat4 viewMatrix;\n  uniform mat4 projectionMatrix;\n+ uniform float selectedObjectIndex;\n  \n  varying vec2 vTexCoord;\n  varying vec3 vColor;\n\n```\nAnd multiply the object color if its index matches the selected object index\n\n📄 
src/shaders/3d-textured.f.glsl\n```diff\n  varying vec3 vColor;\n  \n  uniform float renderIndices;\n+ varying vec4 vColorMultiplier;\n  \n  void main() {\n-     gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1));\n+     gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1)) * vColorMultiplier;\n  \n      if (renderIndices == 1.0) {\n          gl_FragColor.rgb = vColor;\n\n```\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  \n  varying vec2 vTexCoord;\n  varying vec3 vColor;\n+ varying vec4 vColorMultiplier;\n  \n  vec3 encodeObject(float id) {\n      int b = int(mod(id, 255.0));\n  \n      vTexCoord = texCoord;\n      vColor = encodeObject(index);\n+     \n+     if (selectedObjectIndex == index) {\n+         vColorMultiplier = vec4(1.5, 1.5, 1.5, 1.0);\n+     } else {\n+         vColorMultiplier = vec4(1.0, 1.0, 1.0, 1.0);\n+     }\n  }\n\n```\nand reflect shader changes in js\n\n📄 src/minecraft-terrain.js\n```diff\n      State.ext.vertexAttribDivisorANGLE(State.programInfo.attributeLocations.index, 0);\n  }\n  \n- export function render(gl, viewMatrix, projectionMatrix, renderIndices) {\n+ export function render(gl, viewMatrix, projectionMatrix, renderIndices, selectedObjectIndex) {\n      gl.useProgram(State.program);\n  \n      setupAttributes(gl);\n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.viewMatrix, false, viewMatrix);\n      gl.uniformMatrix4fv(State.programInfo.uniformLocations.projectionMatrix, false, projectionMatrix);\n  \n+     gl.uniform1f(State.programInfo.uniformLocations.selectedObjectIndex, selectedObjectIndex);\n+ \n      if (renderIndices) {\n          gl.uniform1f(State.programInfo.uniformLocations.renderIndices, 1);\n      } else {\n\n```\n📄 src/minecraft.js\n```diff\n  \n  gl.uniform2f(programInfo.uniformLocations.resolution, canvas.width, canvas.height);\n  \n+ let selectedObjectIndex = -1;\n+ \n  function render() {\n      offscreenRenderBuffer.clear(gl);\n  \n      
mat4.lookAt(viewMatrix, cameraPosition, cameraFocusPoint, [0, 1, 0]);\n  \n      renderSkybox(gl, viewMatrix, projectionMatrix);\n-     renderTerrain(gl, viewMatrix, projectionMatrix);\n+     renderTerrain(gl, viewMatrix, projectionMatrix, false, selectedObjectIndex);\n  \n      gl.useProgram(program);\n  \n  \n      const index = rgbToInt(r, g, b);\n  \n-     console.log(index);\n+     selectedObjectIndex = index;\n  });\n  \n  (async () =\u003e {\n\n```\nThat's it! Now that we know the selected object index, we can react to clicks in JS as well as provide visual feedback!\n\nThanks for reading!\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 29. Fog\n\nThis is a series of blog posts related to WebGL. 
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month\n\nToday we're going to improve our 3D minecraft terrain scene with fog\n\nBasically, we need to \"lighten\" the color of distant cubes, based on the distance between the camera and each cube vertex\n\nTo calculate the distance between the camera and some point, we need to multiply the point's position by the model and view matrices. Since the same resulting matrix is also needed together with the projection matrix, let's extract it to a variable\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  }\n  \n  void main() {\n-     gl_Position = projectionMatrix * viewMatrix * modelMatrix * vec4(position, 1.0);\n+     mat4 modelView = viewMatrix * modelMatrix;\n+ \n+     gl_Position = projectionMatrix * modelView * vec4(position, 1.0);\n  \n      vTexCoord = texCoord;\n      vColor = encodeObject(index);\n\n```\nSince our camera looks in the negative direction of the Z axis, we need the `z` coordinate of the resulting vertex position\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  \n      gl_Position = projectionMatrix * modelView * vec4(position, 1.0);\n  \n+     float depth = (modelView * vec4(position, 1.0)).z;\n+ \n      vTexCoord = texCoord;\n      vColor = encodeObject(index);\n      \n\n```\nBut this value will be negative, while we need a positive value, so let's just negate it\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  \n     
 gl_Position = projectionMatrix * modelView * vec4(position, 1.0);\n  \n-     float depth = (modelView * vec4(position, 1.0)).z;\n+     float depth = -(modelView * vec4(position, 1.0)).z;\n  \n      vTexCoord = texCoord;\n      vColor = encodeObject(index);\n\n```\nWe can't use `depth` directly, since we need a value in the `[0..1]` range. It'd also be nice to have a smooth, \"gradient\"-like fog. We can apply the glsl [smoothstep](https://thebookofshaders.com/glossary/?search=smoothstep) function to calculate the final amount of fog. This function smoothly interpolates from 0 to 1 as its argument moves through the `[lowerBound..upperBound]` range. The max depth of our camera is `142`\n\n```javascript\nmat4.perspective(\n    projectionMatrix,\n    (Math.PI / 360) * 90,\n    canvas.width / canvas.height,\n    0.01,\n    142 // \u003c- zFar\n);\n```\n\nSo the max value of `depth` should be \u003c 142 in order to see any fog at all (objects farther than 142 won't be visible at all). Let's use the `60..100` range.\n\nOne more thing to take into account is that we don't want to see the object _completely_ white, so let's multiply the final amount by `0.9`\n\nWe'll need the final value of `fogAmount` in the fragment shader, so this should be a `varying`\n\n📄 src/shaders/3d-textured.v.glsl\n```diff\n  varying vec2 vTexCoord;\n  varying vec3 vColor;\n  varying vec4 vColorMultiplier;\n+ varying float vFogAmount;\n  \n  vec3 encodeObject(float id) {\n      int b = int(mod(id, 255.0));\n      gl_Position = projectionMatrix * modelView * vec4(position, 1.0);\n  \n      float depth = -(modelView * vec4(position, 1.0)).z;\n+     vFogAmount = smoothstep(60.0, 100.0, depth) * 0.9;\n  \n      vTexCoord = texCoord;\n      vColor = encodeObject(index);\n\n```\nLet's define this varying in the fragment shader\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  \n  uniform float renderIndices;\n  varying vec4 vColorMultiplier;\n+ varying float vFogAmount;\n  \n  void main() {\n      gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1)) 
* vColorMultiplier;\n\n```\nNow let's define the color of the fog (white). We could also pass this color as a uniform, but let's keep things simple\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n  void main() {\n      gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1)) * vColorMultiplier;\n  \n+     vec3 fogColor = vec3(1.0, 1.0, 1.0);\n+ \n      if (renderIndices == 1.0) {\n          gl_FragColor.rgb = vColor;\n      }\n\n```\nand finally we need to mix the original pixel color with the fog color. We can use glsl [mix](https://thebookofshaders.com/glossary/?search=mix)\n\n📄 src/shaders/3d-textured.f.glsl\n```diff\n      gl_FragColor = texture2D(texture, vTexCoord * vec2(1, -1) + vec2(0, 1)) * vColorMultiplier;\n  \n      vec3 fogColor = vec3(1.0, 1.0, 1.0);\n+     gl_FragColor.rgb = mix(gl_FragColor.rgb, fogColor, vFogAmount);\n  \n      if (renderIndices == 1.0) {\n          gl_FragColor.rgb = vColor;\n\n```\nThat's it, our scene is now \"foggy\". To implement the same effect \"at night\", we just need to change the fog color to black.\n\nThanks for reading!\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 30. Text rendering in WebGL\n\nThis is a series of blog posts related to WebGL. 
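New post will be available every day.

A quick aside before diving into text: the fog curve from Day 29 is easy to preview outside of GLSL. A minimal JavaScript sketch — `smoothstep` is reimplemented here via its standard Hermite definition, and `fogAmount` is an illustrative name for the same `60..100` range and `0.9` multiplier used in the shader:

```javascript
// smoothstep as specified by GLSL: clamp t to [0..1], then Hermite interpolation
function smoothstep(edge0, edge1, x) {
    const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
    return t * t * (3 - 2 * t);
}

// same bounds and multiplier as vFogAmount in the vertex shader
function fogAmount(depth) {
    return smoothstep(60, 100, depth) * 0.9;
}

console.log(fogAmount(50));  // → 0 (no fog close to the camera)
console.log(fogAmount(80));  // → 0.45 (halfway between the bounds)
console.log(fogAmount(120)); // → 0.9 (capped, so objects never turn fully white)
```

One more reminder: 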
New post will be available every day\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n---\n\nHey 👋\n\nWelcome to WebGL month.\n\nIn previous tutorials we focused on rendering 2D and 3D shapes, but never rendered text, which is an important part of any application.\n\nIn this article we'll review possible ways of rendering text.\n\n### HTML overlay\n\nThe most obvious and simple solution is to render text with HTML and place it above the webgl canvas. This works fine for 2D scenes, but for 3D content you'll need extra calculations to project the text position to screen space and position it with CSS transforms\n\n### Canvas as texture\n\nAnother technique can be applied in a wider range of cases. It requires several steps:\n\n1. create another canvas\n2. get a 2d context (`canvas.getContext('2d')`)\n3. render text with `fillText` or `strokeText`\n4. 
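use this canvas as a webgl texture with correct texture coordinates

The four steps above can be sketched as follows (browser-only; `gl` is assumed to be an existing WebGL context, and `createTextTexture`, the canvas size, and the font are illustrative):

```javascript
// build a texture containing rasterized text, following steps 1-4 above
function createTextTexture(gl, text) {
    // 1. create another canvas
    const canvas = document.createElement('canvas');
    canvas.width = 256;
    canvas.height = 64;

    // 2. get a 2d context
    const ctx = canvas.getContext('2d');

    // 3. render text
    ctx.font = '32px sans-serif';
    ctx.fillStyle = 'white';
    ctx.fillText(text, 0, 40);

    // 4. use the canvas as a webgl texture
    // (a canvas element can be passed to texImage2D directly as the pixel source)
    const texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, canvas);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

    return texture;
}
```

To recap, step 4 was to 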
use this canvas as a webgl texture with correct texture coordinates\n\nSince the texture is a rasterized image, it will lose quality as you get \"closer\" to the object\n\n### Glyphs texture\n\nEach font is actually a set of \"glyphs\" – all symbols are rendered into a single image\n\n```\nA | B | C | D | E | F | G |\n---------------------------\nH | I | J | K | L | M | N |\n...\n```\n\nEach letter has its own \"properties\", like width (`i` is thinner than `W`), height (`o` vs `L`) etc.\nThese properties affect how to build the rectangles containing each letter\n\nTypically, aside from the texture, you'll need a javascript object describing all these properties and the coordinates in the original texture image\n\n```javascript\nconst font = {\n    textureSize: {\n        width: 512,\n        height: 512,\n    },\n    height: 32,\n    glyphs: {\n        a: { x: 0, y: 0, height: 32, width: 16 },\n        b: { x: 16, y: 0, height: 32, width: 14 },\n    },\n    // ...\n};\n```\n\nand to render some text you'll need something like this\n\n```javascript\nfunction getRects(text, sizeMultiplier) {\n    let prevLetterX = 0;\n\n    const rects = text.split('').map((symbol) =\u003e {\n        const glyph = font.glyphs[symbol];\n\n        const rect = {\n            x: prevLetterX,\n            y: font.height - glyph.height,\n            width: glyph.width * sizeMultiplier,\n            height: glyph.height * sizeMultiplier,\n            texCoords: glyph,\n        };\n\n        // advance the cursor so letters don't overlap\n        prevLetterX += rect.width;\n\n        return rect;\n    });\n\n    return rects;\n}\n```\n\nLater these \"rects\" will be used to generate attribute data\n\n```javascript\nimport { createRect } from './gl-helpers';\n\nfunction generateBuffers(rects) {\n    const attributeBuffers = {\n        position: [],\n        texCoords: [],\n    };\n\n    rects.forEach((rect) =\u003e {\n        attributeBuffers.position.push(...createRect(rect.x, rect.y, rect.width, rect.height));\n        attributeBuffers.texCoords.push(\n            ...createRect(rect.texCoords.x, rect.texCoords.y, rect.texCoords.width, rect.texCoords.height)\n        );\n    });\n\n    return attributeBuffers;\n}\n```\n\nThere's a [gl-render-text](https://www.npmjs.com/package/gl-render-text) package which can render texture-based fonts\n\n### Font triangulation\n\nSince webgl is capable of drawing triangles, one more obvious solution would be to break each letter into triangles.\nThis seems to be a very complex task 😢\n\nLuckily – there's a [fontpath-gl](https://github.com/mattdesl/fontpath-gl) package, which does exactly this\n\n### Signed distance field font\n\nAnother technique for rendering text in OpenGL/WebGL\n\nFind [more info here](https://github.com/libgdx/libgdx/wiki/Distance-field-fonts)\n\n---\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Join mailing list](http://eepurl.com/gwiSeH) to get new posts right to your inbox\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\n\n## Day 31. 
WebGL Month summary\n\n[![GitHub stars](https://img.shields.io/github/stars/lesnitsky/webgl-month.svg?style=social)](https://github.com/lesnitsky/webgl-month)\n[![Twitter Follow](https://img.shields.io/twitter/follow/lesnitsky_a.svg?label=Follow%20me\u0026style=social)](https://twitter.com/lesnitsky_a)\n\n[Source code available here](https://github.com/lesnitsky/webgl-month)\n\nBuilt with\n\n[![Git Tutor Logo](https://git-tutor-assets.s3.eu-west-2.amazonaws.com/git-tutor-logo-50.png)](https://github.com/lesnitsky/git-tutor)\n\nHey 👋\n\nWelcome to the last day of WebGL month.\nThis article won't cover any new topics, but rather summarizes the previous 30 days\n\n### Previous tutorials:\n\n#### [Day 1. Intro](https://dev.to/lesnitsky/webgl-month-day-1-19ha)\n\nThis article doesn't cover any WebGL topics, but rather explains what WebGL does under the hood. TL;DR: it calculates the color of each pixel it has to draw\n\n#### [Day 2. Shaders and points](https://dev.to/lesnitsky/shaders-and-points-3h2c)\n\nIntroduction to the WebGL API and GLSL shaders with the simplest possible primitive type – the point\n\n#### [Day 3. Shader uniforms, lines and triangles](https://dev.to/lesnitsky/webgl-month-day-3-shader-uniforms-lines-and-triangles-5dof)\n\nThis article covers more ways of passing data to shaders and uses more complex primitives to render\n\n#### [Day 4. Shader varying](https://dev.to/lesnitsky/shader-varyings-2p0f)\n\nPassing data from vertex to fragment shader with varyings\n\n#### [Day 5. Interleaved buffers](https://dev.to/lesnitsky/webgl-month-day-5-interleaved-buffers-2k9a)\n\nAlternative ways of storing and passing vertex data to shaders\n\n#### [Day 6. Indexed buffer](https://dev.to/lesnitsky/webgl-month-day-6-indexed-buffer-ll6)\n\nA technique which helps reduce the number of duplicate vertices\n\n#### [Day 7. 
Cleanup and tooling](https://dev.to/lesnitsky/webgl-month-day-7-a-bit-of-cleanup-and-tooling-bd4)\n\nWebGL is fun, but it requires a bit of tooling when your project grows. Luckily we have awesome tools like webpack\n\n#### [Day 8. Textures](https://dev.to/lesnitsky/webgl-month-day-8-textures-1mk8)\n\nIntro to textures\n\n#### [Day 9. Image filters](https://dev.to/lesnitsky/webgl-month-day-9-image-filters-5g8e)\n\nTaking advantage of the fragment shader to implement simple image \"filters\" (inverse, black and white, sepia)\n\n#### [Day 10. Multiple textures](https://dev.to/lesnitsky/webgl-month-day-10-multiple-textures-gf3)\n\nHow to use multiple textures in a single webgl program\n\n#### [Day 11. Reducing WebGL boilerplate](https://dev.to/lesnitsky/webgl-month-day-11-3plb)\n\nImplementation of some utility classes and functions to reduce WebGL boilerplate\n\n#### [Day 12. Highdpi displays and WebGL viewport](https://dev.to/lesnitsky/webgl-month-day-12-highdpi-displays-and-webgl-viewport-2cg3)\n\nHow to handle retina displays with canvas and use the webgl viewport\n\n#### [Day 13. Simple animation](https://dev.to/lesnitsky/webgl-month-simple-animation-5hc3)\n\nAll previous examples were static images; this article adds some motion to the scene\n\n#### [Day 14. Intro to 3D](https://dev.to/lesnitsky/webgl-month-day-14-intro-to-3d-2ni2)\n\nTheory of 3D computations required for 3D rendering. No code\n\n#### [Day 15. Rendering a cube](https://dev.to/lesnitsky/webgl-month-day-15-rendering-a-3d-cube-190f)\n\n3D theory applied in practice to render a 3D cube\n\n#### [Day 16. Depth buffer. Cube faces colors](https://dev.to/lesnitsky/webgl-month-day-16-colorizing-cube-depth-buffer-and-array-uniforms-4nhc)\n\nThis article contains fixes for the previous example and adds more colors\n\n#### [Day 17. OBJ format](https://dev.to/lesnitsky/webgl-month-day-17-exploring-obj-format-6fn)\n\nImplementing a simple parser for the OBJ format\n\n#### [Day 18. 
Flat shading](https://dev.to/lesnitsky/webgl-month-day-18-flat-shading-3nhg)\n\nImplementation of flat shading\n\n#### [Day 19. Rendering multiple objects](https://dev.to/lesnitsky/webgl-month-day-19-rendering-multiple-objects-45m7)\n\nA typical 3D scene consists of multiple objects; this tutorial teaches you how to render more than one object\n\n#### [Day 20. Rendering a minecraft dirt cube](https://dev.to/lesnitsky/webgl-month-day-20-rendering-a-minecraft-dirt-cube-5ag3)\n\nTexturing a 3D object with Blender and WebGL\n\n#### [Day 21. Rendering a minecraft terrain](https://dev.to/lesnitsky/webgl-month-day-21-rendering-a-minecraft-terrain-24b5)\n\nWe've learned how to render multiple objects. But how do you render 10000 objects?\n\n#### [Day 22. Reducing number of webgl calls by 5000 times](https://dev.to/lesnitsky/webgl-month-day-22-reducing-number-of-webgl-calls-by-5000-times-3a4j)\n\nThe previous example worked, but wasn't really performant. This article explains _instancing_ (a technique which helps improve performance when rendering a large number of identical objects)\n\n#### [Day 23. Skybox](https://dev.to/lesnitsky/webgl-month-day-23-skybox-in-webgl-1eig)\n\nAdding an \"environment\" to the scene\n\n#### [Day 24. Combining terrain and skybox](https://dev.to/lesnitsky/webgl-month-day-24-combining-terrain-and-skybox-kgo)\n\nHow to use multiple WebGL programs together\n\n#### [Day 25. Mipmaps](https://dev.to/lesnitsky/webgl-month-day-25-mipmaps-33i)\n\nA technique which improves performance of shaders reading data from textures\n\n#### [Day 26. Rendering to texture](https://dev.to/lesnitsky/webgl-month-day-26-rendering-to-texture-4hkp)\n\nRendering to texture makes it possible to apply \"post-effects\" and can be used for a variety of use cases\n\n#### [Day 27. Click detection. Part I](https://dev.to/lesnitsky/webgl-month-day-27-click-detection-part-i-5920)\n\n#### [Day 28. 
Part II](https://dev.to/lesnitsky/webgl-month-day-28-click-detection-part-ii-367e)\n\nDetecting the object under the cursor might seem like a tough task, but it can be done without complex 3D math in JS\n\n#### [Day 29. Fog](https://dev.to/lesnitsky/webgl-month-day-29-fog-58od)\n\nImproving the scene with fog\n\n#### [Day 30. Text rendering in WebGL](https://dev.to/lesnitsky/webgl-month-day-30-text-rendering-in-webgl-3ih3)\n\nAn overview of text rendering techniques in WebGL\n\n### Useful links\n\nI started working with WebGL only a year and a half ago. My WebGL journey started with an awesome resource – [https://webglfundamentals.org/](https://webglfundamentals.org/)\n\nOne more important thing to understand: WebGL is just a wrapper of OpenGL, so almost everything from OpenGL tutorials can be used in WebGL as well: [https://learnopengl.com/](https://learnopengl.com/)\n\nExploring more glsl stuff: [https://thebookofshaders.com/](https://thebookofshaders.com/)\n\nCodepen for shaders: [https://www.shadertoy.com/](https://www.shadertoy.com/)\n\n[Getting started with WebGL tutorial on MDN](https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API/Tutorial/Getting_started_with_WebGL)\n\n### Thanks!\n\nThanks for joining WebGL month. Hope these articles helped you learn WebGL! 😉\nFeel free to submit questions, suggestions, and improvements to the [github repo](https://github.com/lesnitsky/webgl-month), or get in touch with me [via email](mailto:andrei.lesnitsky@gmail.com) or on [twitter](https://twitter.com/lesnitsky_a)\n