![Logo](./Images/Logo.png)
# Unity-Shader-Basics-Tutorial
### By Adam Leung (www.adamleung.me)

> Check out my favourite code snippets for Unity [here](https://github.com/Centribo/Useful-Unity-Assets) and my favourite Unity packages [here](https://github.com/Centribo/Centribo-Awesome-Unity)!

Welcome! This tutorial is meant to be a gentle introduction to writing shaders for Unity. It assumes you have some previous experience working with Unity, but have never touched shaders or materials.

We'll build up the shader in parts, stopping along the way to show what everything does.

> If you have any improvements, suggestions, or spot any mistakes, please contact me!

## Part 1: What's a Shader?

Shaders are part of the computer graphics rendering pipeline. They're small programs that tell the computer how to render and shade objects in a scene.
This includes calculating the colour and light values for a given object so that it can be shown on screen. On top of that, shaders are used to create many of the special and post-processing effects that you see in games today.

In modern game engines (including Unity), shaders run in a programmable GPU (Graphics Processing Unit) rendering pipeline, which allows them to run in parallel and perform many shader calculations very quickly.

Wikipedia has a great article about shaders [here.](https://en.wikipedia.org/wiki/Shader)

## Part 2: The Rendering Pipeline

For our purposes, we'll simplify the rendering pipeline. Here's an image showing what we'll discuss in this tutorial:

![Simplified Rendering Pipeline](./Images/Rendering_Pipeline.png)

I like to think of shaders as programs that transform one type of information (model data, colours, etc.) into another type of information (pixels/fragments). Object data is data that is inherent to the object: things such as points in the model, normals, triangles, UV coordinates, etc. Custom data/properties are things that we can pass into a shader to use: things such as colours, textures, numbers, etc.

The first step of the shader pipeline is the vertex function. Vertices, as you might know, are just points in 3D space. The vertex function will work with the vertices in the model (along with other data such as normals) and prepare them for the next step, the fragment function.

The fragment function takes in vertices and shades them in. Think of it like a painter and their paint brush. It ultimately outputs pixel data, in (R, G, B, A) format.

Lastly, the pixels are pushed to a frame buffer, where they may be manipulated further (even by other shaders!) until they are drawn on screen.

## Part 3: Scene Setup

Before we start writing some shader code, let's set up our scene.
Create a new project in Unity, and import all the assets:

* [Bowl model](./Assets/Models/Bowl.blend)

* [Noise texture](./Assets/Textures/Noise.png)

* [Bowl texture](./Assets/Textures/Bowl.png)

Add a cube, a sphere, and the bowl model to a new scene and save the scene. Here's what your scene should look like after:

![Setup 1](./Images/Setup_1.png)

Next, right click in the Project view (or go to Create) and add a new Unlit Shader. We'll call it "Tutorial_Shader" for now.

*If you're curious about the other kinds of shaders, I'll talk about them near the end.*

![Setup 2](./Images/Setup_2.png)

Then, right click the shader file we just made and go to Create > Material. Unity will automatically create a material that uses that shader, with the correct name.

__Note: a "Material" in Unity is just an *instance* of a shader. It saves the values & references of the custom data/properties.__

![Setup 3](./Images/Setup_3.png)

Lastly, apply the material to all the objects we've added to the scene by clicking and dragging it onto each object.

Everything in the scene should look white, without shadows or shading, like this:

![Setup 4](./Images/Setup_4.png)

## Part 4: Skeleton of an Unlit Shader

Time to start writing our shader! Let's open the Tutorial_Shader.shader file we created before. You'll see Unity automatically generates some code for us to use/build off of. For the sake of this tutorial, delete all of it so the .shader file is blank.

__Note: All shaders in Unity are written in a language called "ShaderLab." ShaderLab is a wrapper for HLSL/Cg that lets Unity cross-compile shader code for many platforms and expose properties to the inspector.__

To start, we'll add this code:

```
Shader "Unlit/Tutorial_Shader" {
	...
}
```
These lines just declare the shader and specify where to find it.
The string in quotes after the *Shader* keyword specifies to Unity where you'll find the shader in the shader selection menu.

For example:
```hlsl
Shader "A/B/C/D/E_Shader" {
	...
}
```
![Skeleton 1](./Images/Skeleton_1.png)

If you save your shader and switch back to Unity, you'll notice all our objects are now pink:

![Skeleton 2](./Images/Skeleton_2.png)

This is a fallback shader that Unity will use whenever your shader has errors in it. If you ever get pink objects, you can click on your shader file in the Project window and look at the inspector to see the corresponding errors. For now, we have pink objects because we haven't completed our shader.

Next up is the properties block:

```
Shader "Unlit/Tutorial_Shader" {
	Properties {
		...
	}
}
```

The properties block is where we can pass in that custom data we were talking about before. Anything we declare here will be shown in the Unity editor for us to change, and will be exposed to scripting as well.

Underneath our properties block we'll have our subshader:

```
Shader "Unlit/Tutorial_Shader" {
	Properties {
	}

	SubShader {
		...
	}
}
```

Every shader has one or more subshaders. If you're deploying to multiple platforms, it can be useful to add multiple subshaders; for example, you might want two subshaders: one of higher quality for PC/desktop, and one of lower quality but faster for mobile.

Then we have our pass:
```
Shader "Unlit/Tutorial_Shader" {
	Properties {
	}

	SubShader {
		Pass {
			...
		}
	}
}
```
Each subshader has at least one pass, which is where the object actually gets rendered.
Some effects require multiple passes, but we'll just focus on one for now.

Within our pass, we have the actual rendering code block:
```
Shader "Unlit/Tutorial_Shader" {
	Properties {
	}

	SubShader {
		Pass {
			CGPROGRAM
				...
			ENDCG
		}
	}
}
```
Anything between CGPROGRAM and ENDCG is where we actually write our shading code. For Unity, this is a variant of the HLSL and Cg shading languages.

Next, we'll tell Unity what our vertex and fragment functions are:
```
CGPROGRAM
	#pragma vertex vertexFunction
	#pragma fragment fragmentFunction
ENDCG
```
Here, we're saying we have a vertex function called "vertexFunction", and a fragment function called "fragmentFunction".

We'll define those functions as well:
```
CGPROGRAM
	#pragma vertex vertexFunction
	#pragma fragment fragmentFunction

	void vertexFunction () {

	}

	void fragmentFunction () {

	}
ENDCG
```
Before we start shading, we need to set up some data structures and our two functions so that we can take in the data Unity gives us and hand data back to Unity. First, we'll include *UnityCG.cginc*. This file contains a number of helper functions that we can use. If you want a full list of them, you can go [here.](https://docs.unity3d.com/Manual/SL-BuiltinFunctions.html)

We'll also add a data structure called *appdata*, and modify our vertex function so that it takes in an appdata structure:

```
CGPROGRAM
	#pragma vertex vertexFunction
	#pragma fragment fragmentFunction

	#include "UnityCG.cginc"

	struct appdata {

	};

	void vertexFunction (appdata IN) {

	}

	void fragmentFunction () {

	}
ENDCG
```
When we give Unity an argument to call the vertex function with, it will look at the structure of that argument (in this case, our *appdata* structure) and attempt to fill in its values based on the model that is being drawn.
We can define the data that we want Unity to pass in by declaring variables like this:

```
[type] [name] : [semantic];
```
So for example, we can ask Unity for the positions of the vertices of this model like this:
```
float4 vertex : POSITION;
```
For now, we'll ask Unity to give us the position of the vertices and the UV coordinates, like so:
```
struct appdata {
	float4 vertex : POSITION;
	float2 uv : TEXCOORD0;
};
```
If you want to learn more about providing vertex data to vertex functions, you can read [here.](https://docs.unity3d.com/Manual/SL-VertexProgramInputs.html)

Lastly for the vertex function setup, we'll create one more struct called *v2f* (which stands for "vertex to fragment") that will contain the data we'll be passing into our fragment function. We'll also make our vertex function return data of this type, creating and returning a blank one while we're at it:
```
CGPROGRAM
	#pragma vertex vertexFunction
	#pragma fragment fragmentFunction

	#include "UnityCG.cginc"

	struct appdata {
		float4 vertex : POSITION;
		float2 uv : TEXCOORD0;
	};

	struct v2f {
	};

	v2f vertexFunction (appdata IN) {
		v2f OUT;

		return OUT;
	}

	void fragmentFunction () {

	}
ENDCG
```
Just like before, we can define some data in v2f that we want to pass from our vertex function to our fragment function.
```
struct v2f {
	float4 position : SV_POSITION;
	float2 uv : TEXCOORD0;
};
```
*If you're curious about SV_POSITION vs POSITION: SV stands for "system value", and in our v2f struct it marks this as the final transformed vertex position used for rendering.*

Okay, we're almost ready; we just need to edit our fragment function.
First, we'll modify it to take in the v2f struct and make it return a *fixed4* value:
```
fixed4 fragmentFunction (v2f IN) {

}
```
The output of our fragment function is a colour represented by (R, G, B, A) values, hence the return type of *fixed4*.

Lastly, we're going to add an output semantic, SV_TARGET, to our fragment function like so:
```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {

}
```
This tells Unity that we're outputting a fixed4 colour to be rendered.
We're now ready to start actually coding the meat and potatoes of our vertex and fragment functions!
Here's the basic skeleton we've built up to this point:
```
Shader "Unlit/Tutorial_Shader" {
	Properties {

	}

	SubShader {
		Pass {
			CGPROGRAM
				#pragma vertex vertexFunction
				#pragma fragment fragmentFunction

				#include "UnityCG.cginc"

				struct appdata {
					float4 vertex : POSITION;
					float2 uv : TEXCOORD0;
				};

				struct v2f {
					float4 position : SV_POSITION;
					float2 uv : TEXCOORD0;
				};

				v2f vertexFunction (appdata IN) {
					v2f OUT;

					return OUT;
				}

				fixed4 fragmentFunction (v2f IN) : SV_TARGET {

				}
			ENDCG
		}
	}
}
```

## Part 5: Shading basics

The first thing we'll do is get the correct positions of the vertices. We'll do this using a function called UnityObjectToClipPos(), like so:
```
v2f vertexFunction (appdata IN) {
	v2f OUT;

	OUT.position = UnityObjectToClipPos(IN.vertex);

	return OUT;
}
```
This function takes a vertex that is represented in local object space and transforms it into the rendering camera's clip space. Notice we're passing along the transformed point by setting OUT.position's value.
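If you're curious what UnityObjectToClipPos() is doing under the hood, it's essentially a single matrix multiply. Older Unity shaders wrote this step out explicitly (newer versions of Unity automatically upgrade this older form into the function call):

```
// Roughly equivalent to UnityObjectToClipPos(IN.vertex):
// multiply the vertex by the combined Model * View * Projection matrix.
OUT.position = mul(UNITY_MATRIX_MVP, IN.vertex);
```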
If you want to learn more about this, [here](https://learnopengl.com/Getting-started/Coordinate-Systems) is a great discussion of what these spaces are and their purposes.

Next, we'll make our fragment function return a solid green colour:
```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	return fixed4(0, 1, 0, 1); //(R, G, B, A)
}
```
And now, the moment you've been waiting for! Save your shader, return to Unity, and you'll see our beautiful green objects!

![Shading Basics 1](./Images/Shading_Basics_1.png)

Okay, this is probably not that impressive to you, so let's keep building. How about, instead of returning a fixed green colour, we edit our shader to return any colour we want? To achieve this, we'll start working with custom properties.

We can add the properties we want to use in the *Properties* block, following this syntax:
```
name ("display name", type) = default value
```
So for example, we'll expose a colour value like so:
```
Properties {
	_Colour ("Totally Rad Colour!", Color) = (1, 1, 1, 1)
}
```
Here we're defining a colour for us to use, called *_Colour*, which will be shown as "Totally Rad Colour!" in the Unity inspector. We're also giving it a default value of white.
If you save and return to Unity now, when you inspect the material, you should see this:

![Shading Basics 2](./Images/Shading_Basics_2.png)

Before we can use this colour, we need to actually pass it into the CG code.
Unity does this automatically by binding it by variable name, like so:

```
CGPROGRAM
	#pragma vertex vertexFunction
	#pragma fragment fragmentFunction

	#include "UnityCG.cginc"

	struct appdata {
		float4 vertex : POSITION;
		float2 uv : TEXCOORD0;
	};

	struct v2f {
		float4 position : SV_POSITION;
		float2 uv : TEXCOORD0;
	};

	// ****************************
	// Get our properties into CG
	// ****************************
	float4 _Colour;

	v2f vertexFunction (appdata IN) {
		v2f OUT;
		OUT.position = UnityObjectToClipPos(IN.vertex);
		return OUT;
	}

	fixed4 fragmentFunction (v2f IN) : SV_TARGET {
		return fixed4(0, 1, 0, 1);
	}
ENDCG
```
*I like to put properties after my structs to keep my code organized, but you can put them anywhere, so long as they're in the top scope of the CGPROGRAM.*

We can now use our _Colour value in our fragment function. Instead of returning that green, let's return whatever colour we want:
```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	return _Colour;
}
```
Now save and return to Unity. If you inspect the material and start changing our colour value, you should see the colours of all the objects change accordingly!

![Shading Basics 3](./Images/Shading_Basics_3.png)

Since we now know how to add properties, let's try adding a standard texture map. We'll need a new property for our texture:

```
Properties {
	_Colour ("Colour", Color) = (1, 1, 1, 1)
	_MainTexture ("Main Texture", 2D) = "white" {}
}
```
Notice how it's of type *2D* (a 2D texture), and we're defaulting to a blank white texture. We also need to get the property into CG to use it:

```
float4 _Colour;
sampler2D _MainTexture;
```
Then, we need to give our fragment function the UV coordinates from the model.
We can do this by going back to our vertex function and passing them along through the v2f struct we return, like so:

```
v2f vertexFunction (appdata IN) {
	v2f OUT;
	OUT.position = UnityObjectToClipPos(IN.vertex);
	OUT.uv = IN.uv;
	return OUT;
}
```
Now, in order to use the colours from the texture in our fragment function, we need to *sample* it at certain points. Thankfully, CG has a function that does this for us, called *tex2D*.

```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	return tex2D(_MainTexture, IN.uv);
}
```
tex2D takes in the texture (i.e. the sampler2D) we want to sample and the UV coordinate we want to sample with. In this case, we're providing our main texture and the point on the model where we want to get the colour from, then returning that result as our final colour. Now, if you save, return to Unity, and inspect the material, we can select the bowl texture as our "Main Texture". You'll see the models update, and the bowl model in particular (the model the texture was made for) should look like a bowl of soup!

![Shading Basics 4](./Images/Shading_Basics_4.png)

__Note: We can change how textures in Unity are sampled by going back to the texture file and changing the filter mode in the inspector:__

![Shading Basics 5](./Images/Shading_Basics_5.png)
![Shading Basics 6](./Images/Shading_Basics_6.png)

## Part 6: Playing With Shaders

Now that we know the basics, we can start having some fun with shaders and achieve some simple effects. First, we're going to use our noise texture to achieve a sort of "dissolve" or "cutout" effect.
We'll start by adding a texture property and a float property:
```
Properties {
	_Colour ("Colour", Color) = (1, 1, 1, 1)
	_MainTexture ("Main Texture", 2D) = "white" {}
	_DissolveTexture ("Dissolve Texture", 2D) = "white" {}
	_DissolveCutoff ("Dissolve Cutoff", Range(0, 1)) = 1
}
```
Notice how we've declared _DissolveCutoff as a Range(0, 1). This represents a float value from 0 to 1 (inclusive), and this notation also lets us easily set its value using a slider in Unity's inspector. Now let's add them to our CGPROGRAM:
```
float4 _Colour;
sampler2D _MainTexture;
sampler2D _DissolveTexture;
float _DissolveCutoff;
```
Now we can sample the dissolve texture in our fragment function:
```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	float4 textureColour = tex2D(_MainTexture, IN.uv);
	float4 dissolveColour = tex2D(_DissolveTexture, IN.uv);
	return textureColour;
}
```
Notice we're using the same UV coordinates as for our main texture.
Now here's where the magic happens:
```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	float4 textureColour = tex2D(_MainTexture, IN.uv);
	float4 dissolveColour = tex2D(_DissolveTexture, IN.uv);
	clip(dissolveColour.rgb - _DissolveCutoff);
	return textureColour;
}
```
The *clip* function works by checking if the value given is less than 0. If it is, we discard the pixel and draw nothing. If it isn't, we keep the pixel and continue as normal. The way our code currently works is:

1. We sample the main texture for its colour.
2. We sample the dissolve texture for its colour.
3. We subtract the cutoff value from the "brightness" of our dissolve sample, and...
4. If it's less than 0, we draw nothing.
5. Otherwise, we return the main texture's sample colour.

Now, save your shader and return to Unity.
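As an aside before we continue: if *clip* feels like magic, it can be written as an explicit branch. Here's a rough sketch of what our fragment function is doing, using *discard* (clip discards the pixel when any component of its argument is negative):

```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	float4 textureColour = tex2D(_MainTexture, IN.uv);
	float4 dissolveColour = tex2D(_DissolveTexture, IN.uv);
	// Same effect as clip(dissolveColour.rgb - _DissolveCutoff):
	// discard the pixel if ANY colour channel falls below the cutoff
	if (dissolveColour.r < _DissolveCutoff ||
		dissolveColour.g < _DissolveCutoff ||
		dissolveColour.b < _DissolveCutoff) {
		discard;
	}
	return textureColour;
}
```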
Set the "Dissolve Texture" to our noise texture and start moving the "Dissolve Cutoff" slider; you should see an effect like this:

![Playing With Shaders 1](./Images/Playing_With_Shaders_1.gif)

Pretty cool, huh? We can do more, too. Let's try playing with the vertices before we pass them to our fragment function. We'll expose another property:

```
Properties {
	_Colour ("Colour", Color) = (1, 1, 1, 1)
	_MainTexture ("Main Texture", 2D) = "white" {}
	_DissolveTexture ("Dissolve Texture", 2D) = "white" {}
	_DissolveCutoff ("Dissolve Cutoff", Range(0, 1)) = 1
	_ExtrudeAmount ("Extrude Amount", float) = 0
}

...

float4 _Colour;
sampler2D _MainTexture;
sampler2D _DissolveTexture;
float _DissolveCutoff;
float _ExtrudeAmount;
```
We're also going to be using the normals from the model, so let's add that field to the appdata struct so we can access them:
```
struct appdata {
	float4 vertex : POSITION;
	float2 uv : TEXCOORD0;
	float3 normal : NORMAL;
};
```
Now let's add a single line to our vertex function:
```
v2f vertexFunction (appdata IN) {
	v2f OUT;
	IN.vertex.xyz += IN.normal.xyz * _ExtrudeAmount;
	OUT.position = UnityObjectToClipPos(IN.vertex);
	OUT.uv = IN.uv;
	return OUT;
}
```
What we're doing here is: before we transform our vertices out of local model space, we offset them outwards by adding their normal direction times our _ExtrudeAmount. A normal is just a vector that represents the direction a vertex is facing.
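One caveat worth noting: this assumes the model's normals are unit length (which they usually are for imported models). If you want to be defensive about non-unit normals, you can normalize before extruding; a sketch:

```
v2f vertexFunction (appdata IN) {
	v2f OUT;
	// normalize() guards against normals that aren't unit length,
	// so _ExtrudeAmount always means the same distance in model space
	IN.vertex.xyz += normalize(IN.normal.xyz) * _ExtrudeAmount;
	OUT.position = UnityObjectToClipPos(IN.vertex);
	OUT.uv = IN.uv;
	return OUT;
}
```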
Now, if you save, return to Unity, and play with the "Extrude Amount" value, you should see an effect like this:

![Playing With Shaders 2](./Images/Playing_With_Shaders_2.gif)

We can even animate these properties:

```
v2f vertexFunction (appdata IN) {
	v2f OUT;
	IN.vertex.xyz += IN.normal.xyz * _ExtrudeAmount * sin(_Time.y); // Note the use of sin(_Time.y)
	OUT.position = UnityObjectToClipPos(IN.vertex);
	OUT.uv = IN.uv;
	return OUT;
}
```
*_Time* is a variable included in *UnityCG.cginc* that represents the time, with the y component measured in seconds.
Make sure "Animated Materials" is checked in the scene view in order to preview this effect in the editor:

![Playing With Shaders 3](./Images/Playing_With_Shaders_3.gif)

Here's our final shader:

```
Shader "Unlit/Tutorial_Shader" {
	Properties {
		_Colour ("Colour", Color) = (1, 1, 1, 1)
		_MainTexture ("Main Texture", 2D) = "white" {}
		_DissolveTexture ("Dissolve Texture", 2D) = "white" {}
		_DissolveCutoff ("Dissolve Cutoff", Range(0, 1)) = 1
		_ExtrudeAmount ("Extrude Amount", float) = 0
	}

	SubShader {
		Pass {
			CGPROGRAM
				#pragma vertex vertexFunction
				#pragma fragment fragmentFunction

				#include "UnityCG.cginc"

				struct appdata {
					float4 vertex : POSITION;
					float2 uv : TEXCOORD0;
					float3 normal : NORMAL;
				};

				struct v2f {
					float4 position : SV_POSITION;
					float2 uv : TEXCOORD0;
				};

				float4 _Colour;
				sampler2D _MainTexture;
				sampler2D _DissolveTexture;
				float _DissolveCutoff;
				float _ExtrudeAmount;

				v2f vertexFunction (appdata IN) {
					v2f OUT;
					IN.vertex.xyz += IN.normal.xyz * _ExtrudeAmount * sin(_Time.y);
					OUT.position = UnityObjectToClipPos(IN.vertex);
					OUT.uv = IN.uv;
					return OUT;
				}

				fixed4
fragmentFunction (v2f IN) : SV_TARGET {
					float4 textureColour = tex2D(_MainTexture, IN.uv);
					float4 dissolveColour = tex2D(_DissolveTexture, IN.uv);
					clip(dissolveColour.rgb - _DissolveCutoff);
					return textureColour;
				}
			ENDCG
		}
	}
}
```
*(If you want to see a commented version, go [here.](./Assets/Shaders/Tutorial_Shader.shader))*

## Part 7: Scripting and Shaders

Next, we'll talk about how to control shaders from Unity scripts. For this example, we'll reuse the _Colour property we added before. First, let's use it as a colour tint in our fragment function:

```
fixed4 fragmentFunction (v2f IN) : SV_TARGET {
	float4 textureColour = tex2D(_MainTexture, IN.uv);
	float4 dissolveColour = tex2D(_DissolveTexture, IN.uv);
	clip(dissolveColour.rgb - _DissolveCutoff);
	return textureColour * _Colour;
}
```
We're just multiplying the output colour by our _Colour property to tint it. Here's what that looks like in the editor:

![Scripting and Shaders 1](./Images/Scripting_and_Shaders_1.png)

Alright, let's start scripting. We'll add a new script to all the objects, called *RainbowColour.cs*:

![Scripting and Shaders 2](./Images/Scripting_and_Shaders_2.png)

In our script, we'll start by declaring two private variables for our Renderer and our Material:

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class RainbowColour : MonoBehaviour {

	Renderer rend;
	Material material;

	void Start () {
		
	}
	
	void Update () {
		
	}
}
```

We'll also get references to them in our Start() function:
```csharp
void Start () {
	rend = GetComponent<Renderer>();
	material = rend.material;
}
```
We will use Material.SetColor(...) to set the colour in our shader. This function's first argument is a string: the name of the property we want to set.
The second argument is the colour we want to set the property to.
```csharp
void Start () {
	rend = GetComponent<Renderer>();
	material = rend.material;
	material.SetColor("_Colour", Color.magenta);
}
```

Now notice that when we start our game, the tint colour changes to magenta!

![Scripting and Shaders 3](./Images/Scripting_and_Shaders_3.gif)

*(If you want to see a commented version of the script, go [here.](./Assets/Scripts/RainbowColour.cs))*

There are many functions for getting and setting material properties from scripts, and you can find all of them [here.](https://docs.unity3d.com/ScriptReference/Material.html)

## Part 8: Shadows? Surface Shaders?

Up to this point, we've been writing *unlit* shaders. Unlit shaders don't consider lights or shadows. Unity also lets you write *surface* shaders. Surface shaders are actually just vertex/fragment shaders, except they strip away a lot of the boilerplate code that is required to make shaders interact with lighting and shadows. If you're curious about going through the process of writing that lighting and shadow code yourself, there is a great tutorial by Jasper Flick [here.](http://catlikecoding.com/unity/tutorials/rendering/part-4/)

What I'll show you in this section is how each part of the surface shader relates to our vertex/fragment shaders.
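First, though, a quick detour back to Part 7's script. It's called RainbowColour, so here's a purely illustrative sketch (not part of the original project) that makes it live up to its name by cycling the tint's hue every frame in Update():

```csharp
using UnityEngine;

public class RainbowColour : MonoBehaviour {

	Renderer rend;
	Material material;

	void Start () {
		rend = GetComponent<Renderer>();
		material = rend.material;
	}

	void Update () {
		// Sweep the hue through a full cycle roughly every 3 seconds
		float hue = Mathf.Repeat(Time.time / 3f, 1f);
		material.SetColor("_Colour", Color.HSVToRGB(hue, 1f, 1f));
	}
}
```

Now, back to surface shaders.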
If you create a new "Standard Surface Shader" in Unity, you'll get this auto-generated code:

```
Shader "Custom/NewSurfaceShader" {
	Properties {
		_Color ("Color", Color) = (1,1,1,1)
		_MainTex ("Albedo (RGB)", 2D) = "white" {}
		_Glossiness ("Smoothness", Range(0,1)) = 0.5
		_Metallic ("Metallic", Range(0,1)) = 0.0
	}
	SubShader {
		Tags { "RenderType"="Opaque" }
		LOD 200
		
		CGPROGRAM
		// Physically based Standard lighting model, and enable shadows on all light types
		#pragma surface surf Standard fullforwardshadows

		// Use shader model 3.0 target, to get nicer looking lighting
		#pragma target 3.0

		sampler2D _MainTex;

		struct Input {
			float2 uv_MainTex;
		};

		half _Glossiness;
		half _Metallic;
		fixed4 _Color;

		// Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
		// See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
		// #pragma instancing_options assumeuniformscaling
		UNITY_INSTANCING_CBUFFER_START(Props)
			// put more per-instance properties here
		UNITY_INSTANCING_CBUFFER_END

		void surf (Input IN, inout SurfaceOutputStandard o) {
			// Albedo comes from a texture tinted by color
			fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
			o.Albedo = c.rgb;
			// Metallic and smoothness come from slider variables
			o.Metallic = _Metallic;
			o.Smoothness = _Glossiness;
			o.Alpha = c.a;
		}
		ENDCG
	}
	FallBack "Diffuse"
}
```

Let's go through each section that is new and explain what it does. First, the tags:
```
SubShader {
		Tags { "RenderType"="Opaque" }
		...
}
```
Tags help you tell the rendering engine how and when the shader you're writing should be rendered.
You can learn more about tags [here.](https://docs.unity3d.com/Manual/SL-SubShaderTags.html) In this case, we're just specifying that our shader is opaque; this is especially useful for producing a depth texture/map.

```
LOD 200
```
The shader Level of Detail (LOD) helps specify which shader to use on certain hardware: the higher the LOD, the more "complex" the shader. This value has nothing to do with model LOD. You can read more about shader LOD [here.](https://docs.unity3d.com/Manual/SL-ShaderLOD.html)
```
#pragma surface surf Standard fullforwardshadows
```
Similar to how we defined the vertex and fragment functions, here we are defining a surface function called surf. "Standard" tells Unity that this shader uses the standard lighting model, and "fullforwardshadows" specifies that this shader should enable all regular shadow types.
```
#pragma target 3.0
```
This tells Unity which shader model to compile for. The higher the value, the more complex and better-looking the shader can be, at the cost of higher system requirements. You can read more about this [here.](https://docs.unity3d.com/Manual/SL-ShaderCompileTargets.html)
```
void surf (Input IN, inout SurfaceOutputStandard o) {
	// Albedo comes from a texture tinted by color
	fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
	o.Albedo = c.rgb;
	// Metallic and smoothness come from slider variables
	o.Metallic = _Metallic;
	o.Smoothness = _Glossiness;
	o.Alpha = c.a;
}
```
This is the heart of the shader. Instead of having us specify the exact colour value of each pixel, Unity defines a SurfaceOutputStandard structure with attributes such as Albedo (for colour) that we set. Since we're working with lighting and shadows now, we don't output the final colour directly; it gets calculated from the values held in SurfaceOutputStandard.
Here are all the attributes that are part of SurfaceOutputStandard:
```
struct SurfaceOutputStandard
{
	fixed3 Albedo;      // base (diffuse or specular) color
	float3 Normal;      // tangent space normal, if written
	half3 Emission;
	half Metallic;      // 0=non-metal, 1=metal
	half Smoothness;    // 0=rough, 1=smooth
	half Occlusion;     // occlusion (default 1)
	fixed Alpha;        // alpha for transparencies
};
```

__Okay, so what about vertices?__

By default, the standard surface shader doesn't expose a function for editing vertices. We can still add one though. First, we'll add to the pragma and define a vertex function:
```
#pragma surface surf Standard fullforwardshadows vertex:vert
```
And also define the function:
```
void vert(inout appdata_full v){
	
}
```
The "appdata_full" structure will automatically be filled in by Unity with the attributes of the model we're rendering. This is the same as before, except instead of explicitly creating our own structure, Unity has already defined a few for us. You can see what other structures they have defined and what attributes will be passed in [here.](https://docs.unity3d.com/Manual/SL-VertexProgramInputs.html)

Now we can edit the vertices as normal. For example, to translate the code we had before:
```
void vert(inout appdata_full v){
	v.vertex.xyz += v.normal.xyz * _ExtrudeAmount * sin(_Time.y);
}
```

__Note: If you notice that the shadows are not updated when you update the vertices, make sure to add the "addshadow" pragma like this:__

```
#pragma surface surf Standard fullforwardshadows vertex:vert addshadow
```

Surface shaders have a lot going on within them and are much more complex, but they ultimately compile down to vertex and fragment functions just like the ones we were writing before. I highly suggest reading the official documentation [here](https://docs.unity3d.com/Manual/SL-SurfaceShaders.html) to learn more about them. 
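Putting the pieces together, here is a minimal sketch of the whole CGPROGRAM block with the vertex function added (assuming an _ExtrudeAmount property declared in the Properties block, as in the earlier parts of this tutorial):

```
CGPROGRAM
#pragma surface surf Standard fullforwardshadows vertex:vert addshadow
#pragma target 3.0

sampler2D _MainTex;
fixed4 _Color;
half _Glossiness;
half _Metallic;
float _ExtrudeAmount; // assumed to be declared in Properties as well

struct Input {
	float2 uv_MainTex;
};

void vert (inout appdata_full v) {
	// Push each vertex along its normal, oscillating over time
	v.vertex.xyz += v.normal.xyz * _ExtrudeAmount * sin(_Time.y);
}

void surf (Input IN, inout SurfaceOutputStandard o) {
	fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
	o.Albedo = c.rgb;
	o.Metallic = _Metallic;
	o.Smoothness = _Glossiness;
	o.Alpha = c.a;
}
ENDCG
```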
The official documentation also has a great page of examples [here](https://docs.unity3d.com/Manual/SL-SurfaceShaderLightingExamples.html) which is a good place to start if you want to understand them better. Alan Zucconi also has a great tutorial introducing them available [here.](http://www.alanzucconi.com/2015/06/17/surface-shaders-in-unity3d/)

## Part 9: Other Shaders

So far we've talked about the *unlit* shader and the *surface* shader. Let's talk about the other types of shaders we can use in Unity.

The *Image Effect* shader is exactly what it sounds like: a shader for image effects. More specifically, image effect shaders tend to take a texture as their input and output a texture as well. They can be applied to cameras in Unity, or to any other texture, to affect its look before it is output to the screen/framebuffer. As an exercise, try creating a new one in Unity and attempting to understand the code! They are great for doing things like a "CRT" effect, or a black and white effect. Dan John Moran has a great video tutorial available [here](https://www.youtube.com/watch?v=kpBnIAPtsj8) which introduces image effect shaders and how to create/use them. (His channel in general is a great place to start learning more about shaders!)

The *Compute* shader is a type of shader that is used for computing and calculating data. Remember how I said shaders run on the GPU? For some computational tasks this can be extremely beneficial, since they will run much faster as a parallel process. For example, compute shaders can be used to calculate physics, or the positions of particles in a simulation. In general, most people will never need to touch compute shaders. 
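As a rough sketch of what one looks like, here's a hypothetical compute shader kernel (the buffer and kernel names are made up for illustration) that simply doubles every value in a buffer:

```
// Hypothetical compute shader: doubles every value in a buffer.
#pragma kernel DoubleValues

RWStructuredBuffer<float> _Values;

// Run 64 threads per thread group along the x axis
[numthreads(64, 1, 1)]
void DoubleValues (uint3 id : SV_DispatchThreadID) {
	_Values[id.x] *= 2.0;
}
```

From C#, you would fill a ComputeBuffer, bind it to the kernel, and call Dispatch on the ComputeShader asset to run it on the GPU.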
If you'd like to learn more you can check out a tutorial by Kyle Halladay available [here.](http://kylehalladay.com/blog/tutorial/2014/06/27/Compute-Shaders-Are-Nifty.html) (Admittedly I don't know too much about compute shaders myself.)

## Part 10: Further Reading

Hopefully this tutorial has helped you get started on writing your own shaders, but there is still a lot to learn! Shaders are a vital ingredient in shaping how your game looks and performs. My suggestion is to keep experimenting and keep learning. (That doesn't just apply to shaders either!) If you see a neat or notable effect in a game, chances are shaders have a part in achieving it, so try your hand at replicating it. This section is dedicated to listing some resources that have been useful to me for learning about shaders.

* __[Unity Manual, Shader Reference](https://docs.unity3d.com/Manual/SL-Reference.html)__
	* [ShaderLab Syntax](https://docs.unity3d.com/Manual/SL-Shader.html)
	* [Built-in Shader Helper Functions](https://docs.unity3d.com/Manual/SL-BuiltinFunctions.html)
	* [Writing Vertex and Fragment Shaders](https://docs.unity3d.com/Manual/SL-ShaderPrograms.html)
	* [Vertex and Fragment Shader Examples](https://docs.unity3d.com/Manual/SL-VertexFragmentShaderExamples.html)
	* [Writing Surface Shaders](https://docs.unity3d.com/Manual/SL-SurfaceShaders.html)
	* [Surface Shader Examples](https://docs.unity3d.com/Manual/SL-SurfaceShaderLightingExamples.html)
* __[Nvidia Cg Documentation](http://developer.download.nvidia.com/cg/index_stdlib.html)__
* __[Catlike Coding's/Jasper Flick's Tutorials](http://catlikecoding.com/unity/tutorials/)__
* __[Alan Zucconi's Tutorials](http://www.alanzucconi.com/tutorials/)__
* __[Makin' Stuff Look Good in Video 
Games](https://www.youtube.com/channel/UCEklP9iLcpExB8vp_fWQseg)__