Practically understanding Graphics.Blit() in Unity
So, for the past few days, I’ve been reading up about the Blit() method of the Graphics class in Unity. It all started with some code making use of the method, which seemed simple enough at first glance. And for the most part, it is. But then out of curiosity, I ended up checking the Unity docs about it and well… here we are.
This article looks at the significance of the different arguments of the Blit() method in its various overloaded forms, and it does so in a practical way.
So what exactly does the Blit() method do?
In its simplest form, it copies a source texture to a destination render texture. It can also apply a shader while copying, which is why it's typically used in post-processing effects. It's normally called from OnRenderImage() or OnPostRender(), i.e., after the camera finishes rendering.
Common Blit() function declarations:
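The commonly used overloads (signatures paraphrased from the Unity scripting API; default values are approximate) look like this:

```csharp
// Common Graphics.Blit() overloads
public static void Blit(Texture source, RenderTexture dest);
public static void Blit(Texture source, RenderTexture dest, Material mat, int pass = -1);
public static void Blit(Texture source, Material mat, int pass = -1); // blits to the active render target
public static void Blit(Texture source, RenderTexture dest, Vector2 scale, Vector2 offset);
public static void Blit(Texture source, RenderTexture dest, int sourceDepthSlice, int destDepthSlice);
public static void Blit(Texture source, RenderTexture dest, Vector2 scale, Vector2 offset, int sourceDepthSlice, int destDepthSlice);
```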
The pass argument specifies which shader pass to execute. For post-processing effects we typically set it to 0 to run only the first pass; -1 (the default) runs all the passes in the shader.
To understand about it more practically, let's look at an example setup where we have the opportunity to use the Blit() method.
- We create a render texture to work with. It's a 2D render texture with no depth buffer.
- We create a material with its base map set to the render texture.
- Finally, we have a cube GameObject in the scene with the above material applied to it.
Now anything we Blit onto the render texture will appear on the cube, allowing us a way to see the results of the Blit process.
Note: Another way to view the render texture would be to use Raw Image, a 2D UI element, and set its Texture property to our render texture.
For our purposes, the source texture we use here will be the following:
We have also created a simple shader (in shader graph) that applies the Voronoi noise pattern
And created a material from the shader to use in code
We have also created a simple script (which sits on the cube) that runs the Blit code
We specify the ‘sourceTexture’, ‘sourceTexture2DArray’ (we’ll use this later), ‘destRenderTexture’ and ‘mat’ fields from the Inspector.
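A minimal sketch of what that script might look like (the field names match the ones mentioned above; the class name is an assumption):

```csharp
using UnityEngine;

// Sits on the cube; exposes the textures and material we assign in the Inspector.
public class BlitTester : MonoBehaviour // class name is an assumption
{
    public Texture sourceTexture;
    public Texture2DArray sourceTexture2DArray; // used later for the depth-slice examples
    public RenderTexture destRenderTexture;
    public Material mat; // material created from the Voronoi shader graph

    void Start()
    {
        // The Blit calls from the rest of the article go here.
        Graphics.Blit(sourceTexture, destRenderTexture);
    }
}
```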
Now that we have the entire setup, let's start with some code.
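The two simplest calls, a plain copy and a copy through our Voronoi material, would look like this (a sketch; the variable names are the fields declared on the script above):

```csharp
// 1. Plain copy: sourceTexture is rendered as-is into destRenderTexture.
Graphics.Blit(sourceTexture, destRenderTexture);

// 2. Copy through a material: the Voronoi shader processes the source
//    before the result lands in destRenderTexture.
Graphics.Blit(sourceTexture, destRenderTexture, mat);
```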
OUTPUT:
OUTPUT:
Here, ‘sourceDepthSlice’ and ‘destDepthSlice’ are used to refer to a particular slice (element) of a 2D texture array.
To see this in action, we use a texture atlas, imported into Unity as a 2D texture array.
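Matching the slice indices described below, the call might look like this (a sketch):

```csharp
// Copy slice index 5 (the 6th element) of the 2D texture array
// into the (single-slice) destination render texture.
Graphics.Blit(sourceTexture2DArray, destRenderTexture, sourceDepthSlice: 5, destDepthSlice: 0);
```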
OUTPUT:
Here the 6th element of the 2D texture array (index 5, specified via ‘sourceDepthSlice’) is rendered to the render texture.
The ‘destDepthSlice’ is 0 because the target render texture is not a 2D texture array.
Let's look at the “scale” argument of Blit()
It’s a Vector2 value (for scaling in x and y axes)
It’s simple: when the value is increased above 1, the rendered output is smaller than the original.
If the value is decreased below 1 (but kept above 0), the rendered output is bigger than the original.
Also, the source texture’s wrap mode (clamp, repeat, etc.) determines how the final render fills the remaining space.
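A couple of illustrative calls (a sketch; the exact scale values are arbitrary):

```csharp
// Scale of 2: the source appears at half size; the wrap mode decides how the
// rest of the render texture is filled (clamp stretches the edges, repeat tiles).
Graphics.Blit(sourceTexture, destRenderTexture, new Vector2(2f, 2f), Vector2.zero);

// Scale of 0.5: the source appears zoomed in to twice its size.
Graphics.Blit(sourceTexture, destRenderTexture, new Vector2(0.5f, 0.5f), Vector2.zero);
```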
OUTPUT:
OUTPUT:
OUTPUT:
OUTPUT:
Let's look at the “offset” argument of Blit()
It’s a Vector2 value (for offsetting in x and y axes)
- Positive values of x offset to the right
- Negative values of x offset to the left
- Positive values of y offset downward
- Negative values of y offset upward
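For example (a sketch; the offset value is arbitrary, and the direction follows the behavior described above):

```csharp
// Offset the source by a quarter of its width horizontally, with no vertical offset.
// The scale argument is required alongside offset; Vector2.one means no scaling.
Graphics.Blit(sourceTexture, destRenderTexture, Vector2.one, new Vector2(0.25f, 0f));
```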
OUTPUT:
OUTPUT:
OUTPUT:
Some other stuff:
1. Blitting to the screen directly in the case of a Scriptable Render Pipeline like URP or HDRP requires calling Graphics.Blit() from inside a method registered as a callback for the RenderPipelineManager.endFrameRendering event
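A minimal sketch of that setup (assuming a MonoBehaviour; passing null as the destination blits to the screen):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class BlitToScreen : MonoBehaviour // class name is an assumption
{
    public Texture sourceTexture;
    public Material mat;

    void OnEnable()  { RenderPipelineManager.endFrameRendering += OnEndFrameRendering; }
    void OnDisable() { RenderPipelineManager.endFrameRendering -= OnEndFrameRendering; }

    void OnEndFrameRendering(ScriptableRenderContext context, Camera[] cameras)
    {
        // A null destination means "blit to the screen / backbuffer".
        Graphics.Blit(sourceTexture, (RenderTexture)null, mat);
    }
}
```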
OUTPUT:
2. Using the same render texture as both source and destination can result in undefined behavior. The best ways to deal with such scenarios are to:
- Use a custom render texture with double buffering enabled, or
- Use two render textures and copy data back and forth between them
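The second option (two render textures, often called ping-pong buffering) can be sketched like this:

```csharp
// Allocate a temporary RT matching the destination, process into it,
// then copy the result back. This avoids reading and writing the same RT in one Blit.
RenderTexture temp = RenderTexture.GetTemporary(destRenderTexture.descriptor);
Graphics.Blit(destRenderTexture, temp, mat); // apply the effect
Graphics.Blit(temp, destRenderTexture);      // copy the result back
RenderTexture.ReleaseTemporary(temp);
```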