Thanks to these posts and blogs: Michael Sanders, Catlike Coding, Alan Zucconi.

Unity has three types of shaders: surface shaders, vertex and fragment shaders, and the obsolete fixed-function shaders. They all share the same anatomy.

Catlike Coding

Shader "MyShader"
{
	Properties
	{
		// The properties of your shaders
		// - textures
		// - colours
		// - parameters
		// ...
		_MyTexture ("My texture", 2D) = "white" {}
		_MyNormalMap ("My normal map", 2D) = "bump" {} // Flat normal map
		
		_MyInt ("My integer", Int) = 2
		_MyFloat ("My float", Float) = 1.5
		_MyRange ("My range", Range(0.0, 1.0)) = 0.5
		
		_MyColor ("My colour", Color) = (1, 0, 0, 1) // (R, G, B, A)
		_MyVector ("My Vector4", Vector) = (0, 0, 0, 0) // (x, y, z, w)
	}
 
	SubShader
	{
		Tags
		{
			"Queue" = "Geometry"
			"RenderType" = "Opaque"
		}

		CGPROGRAM

		// CG/HLSL code

		// The code of your shaders
		sampler2D _MyTexture;
		sampler2D _MyNormalMap;
		
		int _MyInt;
		float _MyFloat;
		float _MyRange;
		
		half4 _MyColor;
		float4 _MyVector;

		ENDCG
	} 
}

One shader can contain multiple SubShaders. They hold the instructions for the GPU; Unity tries them one by one until it finds one that is compatible with the hardware. A 2D property is a texture, and the default value bump indicates that the texture is used as a normal map. Vector and Color always have 4 elements. Declare the property variables inside the SubShader before using them. Tags tell Unity certain properties of the shader: "RenderType" categorises the shader, while "Queue" controls the render order. Queue also accepts a positive integer; the smaller the value, the sooner the object is drawn.

  • Background (1000): background and skyboxes
  • Geometry (2000): default label used for most solid objects
  • Transparent (3000): materials with transparent properties
  • Overlay (4000): effects such as lens flares, GUI elements and text.

Relative orders are acceptable, such as Background+2. Messing up the Queue might cause nasty situations.
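As a minimal sketch, a relative queue value is set in the SubShader's Tags block like this (the "+1" here is illustrative):

```
SubShader
{
	Tags
	{
		// Draw just after the default geometry queue (2000 + 1 = 2001)
		"Queue" = "Geometry+1"
		"RenderType" = "Opaque"
	}
	// ... passes ...
}
```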

CGPROGRAM and ENDCG are keywords that separate the shader program from the other ShaderLab statements. They are old design decisions that made sense once; now we're stuck with them for backwards compatibility.

ZTest

An object in the Transparent queue doesn't always appear in front of an object in the Geometry queue, because the GPU performs a ZTest that stops hidden pixels from being drawn. The GPU uses an extra buffer (the depth, or Z, buffer) to store the depth of the object already drawn at each pixel; any fragment farther away than the stored depth is discarded.
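Depth testing and depth writing can be controlled per Pass in ShaderLab. A minimal sketch (the settings shown are the defaults, spelled out explicitly):

```
Pass
{
	ZTest LEqual  // keep the fragment only if it is at least as close as the stored depth (default)
	ZWrite On     // write this fragment's depth into the depth buffer (default)
	// ZTest Always would skip the depth test entirely,
	// and ZWrite Off is common for transparent materials.

	// ... CGPROGRAM block ...
}
```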

Surface Shader

CGPROGRAM
// uses the lambertian lighting model
#pragma surface surf Lambert

sampler2D _MainTex; // the input texture

struct Input
{
	float2 uv_MainTex;
};

void surf (Input IN, inout SurfaceOutput o)
{
	o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
}
ENDCG

Use a surface shader if the material should be affected by lights realistically. A surface shader also hides the calculation of how light is reflected.
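As an illustrative extension of the listing above (the `_Color` property is an assumption, not part of the original code), a surface shader can tint the sampled texture before writing the albedo:

```
// Hypothetical: tint the texture with a colour property.
// Declared in Properties as: _Color ("Tint", Color) = (1, 1, 1, 1)
half4 _Color;

void surf (Input IN, inout SurfaceOutput o)
{
	// Multiply the sampled colour by the tint
	o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _Color.rgb;
}
```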

Vertex and fragment shader

Pass {
CGPROGRAM
 
#pragma vertex vert             
#pragma fragment frag
#include "UnityCG.cginc"
 
struct vertInput {
 float4 pos : POSITION;
};  
 
struct vertOutput {
 float4 pos : SV_POSITION;
};
 
vertOutput vert(vertInput input) {
 vertOutput o;
 o.pos = mul(UNITY_MATRIX_MVP, input.pos);
 return o;
}
 
half4 frag(vertOutput output) : COLOR {
 return half4(1.0, 0.0, 0.0, 1.0); 
}
ENDCG
}

UnityCG.cginc is one of the shader include files bundled with Unity. It pulls in some other essential files and contains generic functionality.

UnityCG includes UnityInstancing and UnityShaderVariables; UnityInstancing depends on UnityShaderVariables, and UnityShaderVariables in turn includes HLSLSupport. UnityShaderVariables defines shader variables such as transformation, camera, and light data. HLSLSupport lets us write the same code no matter which platform we're targeting. UnityInstancing is specifically for instancing support. Including a file more than once copies its contents more than once, which can lead to compiler errors about duplicate definitions.
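Duplicate inclusion is conventionally avoided with an include guard. A sketch of the pattern, using a hypothetical file name and guard macro:

```
// MyHelpers.cginc (hypothetical file)
// The guard ensures the body is compiled only once,
// even if this file is included from several places.
#ifndef MY_HELPERS_INCLUDED
#define MY_HELPERS_INCLUDED

// ... shared definitions go here ...

#endif // MY_HELPERS_INCLUDED
```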

#pragma is used to issue special compiler directives.

The geometry is first passed through vert, which can alter the vertices. The rasterised fragments of each triangle are then passed through frag, which decides their final RGB colour. This is useful for 2D effects, post-processing, and special 3D effects.

Each shader consists of a list of subshaders.

Subshader { [Tags] [CommonState] Passdef [Passdef ...]}

SV_POSITION: SV stands for System Value; POSITION is the final vertex position. SV_TARGET means the default render target for a fragment shader's colour output; SV_TARGET1, SV_TARGET2, and so on indicate additional render targets, with 0 being the default.
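A fragment shader can write to several render targets at once by returning a struct. A hypothetical sketch (the struct and field names are illustrative):

```
// Write to two render targets in one pass
struct fragOutput
{
	half4 color0 : SV_Target0; // first render target (same as SV_Target)
	half4 color1 : SV_Target1; // second render target
};

fragOutput frag (vertOutput i)
{
	fragOutput o;
	o.color0 = half4(1, 0, 0, 1); // red to target 0
	o.color1 = half4(0, 1, 0, 1); // green to target 1
	return o;
}
```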

Any other semantics (those without the SV prefix) are defined entirely by the user; they serve to match the outputs of one pipeline stage to the inputs of the next.
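For example, a UV coordinate can be carried from the vertex stage to the fragment stage through a user semantic such as TEXCOORD0. A sketch building on the vertex/fragment listing above:

```
struct vertInput
{
	float4 pos      : POSITION;
	float2 texcoord : TEXCOORD0;
};

struct vertOutput
{
	float4 pos : SV_POSITION;
	float2 uv  : TEXCOORD0; // user semantic; interpolated for the fragment stage
};

vertOutput vert (vertInput v)
{
	vertOutput o;
	o.pos = mul(UNITY_MATRIX_MVP, v.pos);
	o.uv  = v.texcoord; // matched by semantic, not by field name
	return o;
}

half4 frag (vertOutput i) : SV_Target
{
	// i.uv arrives here via the matching TEXCOORD0 semantic
	return half4(i.uv, 0, 1);
}
```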

Swizzle operations: flexible access to one or more components of a vector, e.g. .x, .xy, .yx
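A few swizzles side by side:

```
float4 v = float4(1, 2, 3, 4);

float  x   = v.x;    // 1
float2 xy  = v.xy;   // (1, 2)
float2 yx  = v.yx;   // (2, 1), reordering is allowed
float3 rgb = v.rgb;  // (1, 2, 3); rgba is an alias for xyzw
v.xy = float2(5, 6); // swizzles also work on the left-hand side
```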

Texture mapping: to apply tiling and offset, use the TRANSFORM_TEX macro. It multiplies the UVs by the tiling and adds the offset, so there is no need to multiply by _MainTex_ST manually. ST stands for Scale and Translation; it's an old terminology.
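A minimal sketch of the macro in use (the `v2f` struct is illustrative; `appdata_base` and `TRANSFORM_TEX` come from UnityCG.cginc):

```
sampler2D _MainTex;
float4 _MainTex_ST; // x,y = tiling, z,w = offset; filled in by Unity

struct v2f
{
	float4 pos : SV_POSITION;
	float2 uv  : TEXCOORD0;
};

v2f vert (appdata_base v)
{
	v2f o;
	o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
	// Expands to roughly: v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw
	o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
	return o;
}
```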

Texels: the pixels of a texture.

Filter Mode:

  • Point: the nearest texel is used when a texture is sampled at some UV coordinates. (blocky appearance)
  • Bilinear: interpolation between neighbouring texels. Good when the texel density is less than the display pixel density, but not the other way around, because adjacent pixels would end up with samples that are more than one texel apart.