Unity shader global fog effect
2022-07-28 · Morita Rinko
Depth texture
A depth texture stores high-precision depth values. Their range is [0,1], and they are usually nonlinear.
Depth value calculation
During the vertex transformation stage, a vertex is transformed into clip space and then, by the perspective divide, into NDC. In NDC the z component is a linear value in [-1,1], so we can directly read off a depth value $z_{ndc} \in [-1,1]$.

To store it in the depth texture, we map $z_{ndc}$ to [0,1]:

$$d = 0.5 \cdot z_{ndc} + 0.5$$
Depth value acquisition
In Unity we don't need to compute the depth value ourselves; we can read it from the depth texture.
First, we set the camera's depthTextureMode through a script.
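For example, a single line is enough (the full script below does the same thing in its OnEnable):

camera.depthTextureMode |= DepthTextureMode.Depth; // ask Unity to render a depth texture for this camera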
After setting it, our shader can access the depth texture through the _CameraDepthTexture variable.
When sampling the depth texture, use the SAMPLE_DEPTH_TEXTURE macro so that platform differences are handled for us.
Because the depth value obtained by sampling is nonlinear, we need to convert it back to a linear value.
We know the projection matrix that transforms a point from view space to clip space. Suppose we transform a point from view space to clip space; for the z and w components we get (n and f are the near and far clipping plane distances):

$$z_{clip} = -z_{view}\,\frac{f+n}{f-n} - \frac{2fn}{f-n}, \qquad w_{clip} = -z_{view}$$

Next, we perform the perspective divide:

$$z_{ndc} = \frac{z_{clip}}{w_{clip}} = \frac{f+n}{f-n} + \frac{2fn}{(f-n)\,z_{view}}$$

Solving for the view-space depth gives the expression:

$$z_{view} = \frac{2fn}{z_{ndc}(f-n) - (f+n)}$$

Because the camera looks down the negative z axis in view space, $z_{view}$ is negative, so we take its opposite. Substituting the [0,1] depth value via $z_{ndc} = 2d - 1$ and simplifying:

$$z_{view}' = \frac{1}{\frac{1}{n} - d\left(\frac{1}{n} - \frac{1}{f}\right)}$$
Unity, however, provides functions that do this conversion for us:
LinearEyeDepth converts a depth texture sample to the linear depth value in view space.
Linear01Depth converts a depth texture sample to a linear depth value in view space mapped to [0,1].
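A minimal sketch of how sampling and linearization fit together in a fragment shader (i.uv_depth stands for the depth texture coordinate, as in the v2f struct used later):

// Sample the raw, nonlinear depth value, then linearize it both ways
float d = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv_depth);
float eyeDepth = LinearEyeDepth(d); // linear depth in view space
float depth01 = Linear01Depth(d);   // linear depth mapped to [0,1]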
Global fog effect
Result

(figure: global fog effect with noise)
Key points of the implementation
We need the depth value in order to reconstruct the world-space position of each pixel, which is what allows us to simulate a global fog effect:
- First, interpolate the frustum corner rays in image space to get the direction from the camera to each pixel.
- Multiply that ray by the linear view-space depth value to get the offset from the camera to the point.
- Add the camera's world-space position to this offset to get the point's world-space position.
In code, the reconstruction is a single line. _WorldSpaceCameraPos is a built-in shader variable, and linearDepth is obtained with the LinearEyeDepth function.
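This is exactly the line used in the fragment shader further below:

float3 worldPos = _WorldSpaceCameraPos + linearDepth * i.interpolatedRay.xyz;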
Calculating interpolatedRay
interpolatedRay is obtained by interpolating four vectors that correspond to the four corners of the near clipping plane.
First we compute vectors along the near clipping plane's up and right directions:

$$halfHeight = Near \times \tan\left(\frac{FOV}{2}\right)$$

$$toTop = camera.up \times halfHeight, \qquad toRight = camera.right \times halfHeight \times aspect$$

With these known vectors, the vectors from the camera to the four corners are:

$$TL = camera.forward \cdot Near + toTop - toRight$$
$$TR = camera.forward \cdot Near + toTop + toRight$$
$$BL = camera.forward \cdot Near - toTop - toRight$$
$$BR = camera.forward \cdot Near - toTop + toRight$$

Given a corner vector and the depth value, the distance from the camera to the point follows from similar triangles (taking TL as an example):

$$\frac{depth}{dist} = \frac{Near}{|TL|} \;\Rightarrow\; dist = \frac{|TL|}{Near} \times depth$$

Extracting the scale factor:

$$scale = \frac{|TL|}{Near}$$

The vectors stored for the four corners are then the normalized corner directions multiplied by this factor:

$$Ray_{TL} = \frac{TL}{|TL|} \times scale \qquad Ray_{TR} = \frac{TR}{|TR|} \times scale$$
$$Ray_{BL} = \frac{BL}{|BL|} \times scale \qquad Ray_{BR} = \frac{BR}{|BR|} \times scale$$

Interpolating these four vectors across the screen quad gives interpolatedRay.
Calculation of fog
There are three common formulas for computing the fog factor:
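For reference, these are the standard fog equations (|z| is the distance or depth used for attenuation, and the result is clamped to [0,1]):

$$f_{linear} = \frac{d_{max} - |z|}{d_{max} - d_{min}} \qquad f_{exp} = e^{-density \cdot |z|} \qquad f_{exp^2} = e^{-\left(density \cdot |z|\right)^2}$$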
Here we use the linear formula driven by world-space height, together with a noise texture, to achieve an uneven fog effect.
Script
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class FogWithNoise : PostEffectsBase
{
    public Shader fogShader;
    private Material fogMaterial;
    public Material material
    {
        get
        {
            fogMaterial = CheckShaderAndCreateMaterial(fogShader, fogMaterial);
            return fogMaterial;
        }
    }

    private Camera myCamera;
    public Camera camera
    {
        get
        {
            if (myCamera == null)
            {
                myCamera = GetComponent<Camera>();
            }
            return myCamera;
        }
    }

    private Transform myCameraTransform;
    public Transform cameraTransform
    {
        get
        {
            if (myCameraTransform == null)
            {
                myCameraTransform = camera.transform;
            }
            return myCameraTransform;
        }
    }

    // Fog density
    [Range(0.0f, 3.0f)]
    public float fogDensity = 1.0f;
    // Fog color
    public Color fogColor = Color.white;
    // Height at which the fog starts
    public float fogStart = 0.0f;
    // Height at which the fog ends
    public float fogEnd = 2.0f;
    // Noise texture
    public Texture noiseTexture;
    // Scroll speed of the noise texture in the x direction
    [Range(-0.5f, 0.5f)]
    public float fogXSpeed = 0.1f;
    // Scroll speed of the noise texture in the y direction
    [Range(-0.5f, 0.5f)]
    public float fogYSpeed = 0.1f;
    // How strongly the noise affects the fog; 0 means the fog ignores the noise
    [Range(0.0f, 3.0f)]
    public float noiseAmount = 1.0f;

    // Ask the camera to generate a depth texture
    private void OnEnable()
    {
        camera.depthTextureMode |= DepthTextureMode.Depth;
    }

    private void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (material != null)
        {
            // Matrix that carries the four corner rays to the shader
            Matrix4x4 frustumCorners = Matrix4x4.identity;

            // Camera parameters needed for the calculation
            float fov = camera.fieldOfView;
            float near = camera.nearClipPlane;
            float far = camera.farClipPlane;
            float aspect = camera.aspect;

            // Half height of the near clipping plane
            float halfHeight = near * Mathf.Tan(fov * 0.5f * Mathf.Deg2Rad);
            // Vectors along the near plane's right and up directions
            Vector3 toRight = cameraTransform.right * halfHeight * aspect;
            Vector3 toTop = cameraTransform.up * halfHeight;

            // Top left corner; also used to derive the scale factor
            Vector3 topLeft = cameraTransform.forward * near + toTop - toRight;
            // Scale factor: |TL| / Near
            float scale = topLeft.magnitude / near;
            topLeft.Normalize();
            topLeft *= scale;

            // Top right corner
            Vector3 topRight = cameraTransform.forward * near + toRight + toTop;
            topRight.Normalize();
            topRight *= scale;

            // Bottom left corner
            Vector3 bottomLeft = cameraTransform.forward * near - toTop - toRight;
            bottomLeft.Normalize();
            bottomLeft *= scale;

            // Bottom right corner
            Vector3 bottomRight = cameraTransform.forward * near + toRight - toTop;
            bottomRight.Normalize();
            bottomRight *= scale;

            // Store the rays in the matrix; the row order must match the shader's index logic
            frustumCorners.SetRow(0, bottomLeft);
            frustumCorners.SetRow(1, bottomRight);
            frustumCorners.SetRow(2, topRight);
            frustumCorners.SetRow(3, topLeft);

            // Pass the property values; the name must match the shader's _FrustumCornersRay exactly
            material.SetMatrix("_FrustumCornersRay", frustumCorners);
            material.SetMatrix("_ViewProjectionInverseMatrix", (camera.projectionMatrix * camera.worldToCameraMatrix).inverse);
            material.SetFloat("_FogDensity", fogDensity);
            material.SetColor("_FogColor", fogColor);
            material.SetFloat("_FogStart", fogStart);
            material.SetFloat("_FogEnd", fogEnd);
            material.SetTexture("_NoiseTex", noiseTexture);
            material.SetFloat("_FogXSpeed", fogXSpeed);
            material.SetFloat("_FogYSpeed", fogYSpeed);
            material.SetFloat("_NoiseAmount", noiseAmount);

            Graphics.Blit(source, destination, material);
        }
        else
        {
            Graphics.Blit(source, destination);
        }
    }
}
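The script inherits from PostEffectsBase, which is not shown in this post. A minimal sketch that would be sufficient for the script above to compile might look like the following; this is an assumption modeled on common post-effect helper classes, not necessarily the original author's exact code:

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class PostEffectsBase : MonoBehaviour
{
    // Create a material from the shader, or reuse the cached one if it already matches
    protected Material CheckShaderAndCreateMaterial(Shader shader, Material material)
    {
        if (shader == null || !shader.isSupported)
        {
            return null;
        }
        if (material != null && material.shader == shader)
        {
            return material;
        }
        material = new Material(shader);
        material.hideFlags = HideFlags.DontSave;
        return material;
    }
}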
Shader
Shader "Custom/Chapter15-FogWithNoise"
{
Properties
{
_MainTex ("Base (RGB)", 2D) = "white" {
}
_FogDensity("Fog Density",Float)=1.0
_FogColor("Fog Color",Color)=(1,1,1,1)
_FogStart("Fog Start",Float)=0.0
_FogEnd("Fog End",Float)=1.0
_NoiseTex("Noise Texture",2D)="white"{
}
_FogXSpeed("Fog Horizontal Speed",Float)=0.1
_FogYSpeed("Fog Vertical Speed",Float)=0.1
_NoiseAmount("Noise Amount",Float)=1
}
SubShader
{
CGINCLUDE
#include "unityCG.cginc"
sampler2D _MainTex;
half4 _MainTex_TexelSize;
half _FogDensity;
fixed4 _FogColor;
half _FogStart;
half _FogEnd;
sampler2D _NoiseTex;
half _FogXSpeed;
half _FogYSpeed;
half _NoiseAmount;
sampler2D _CameraDepthTexture;
float4x4 _FrustumCornersRay;
struct v2f{
float4 pos:SV_POSITION;
// Noise texture texture coordinates
half2 uv:TEXCOORD0;
// Depth texture texture coordinates
half2 uv_depth:TEXCOORD1;
// Store the vector after interpolation
float4 interpolatedRay:TEXCOORD2;
};
v2f vert (appdata_img v){
v2f o;
o.pos =UnityObjectToClipPos(v.vertex);
o.uv=v.texcoord;
o.uv_depth=v.texcoord;
#if UNITY_UV_STARTS_AT_TOP
if(_MainTex_TexelSize.y<0){
o.uv_depth.y=1-o.uv_depth.y;
}
#endif
// Calculate the index to determine the direction variable , According to index To get interpolatedRay
int index=0;
if(v.texcoord.x<0.5 && v.texcoord.y<0.5){
index=0;
}else if(v.texcoord.x>0.5 && v.texcoord.y<0.5){
index=1;
}else if(v.texcoord.x>0.5 && v.texcoord.y>0.5){
index=2;
}else if(v.texcoord.x<0.5 && v.texcoord.y>0.5){
index=3;
}
#if UNITY_UV_STARTS_AT_TOP
if(_MainTex_TexelSize.y<0){
index=3-index;
}
#endif
// from _FrustumCornersRay Get the corresponding index The direction vector of
o.interpolatedRay=_FrustumCornersRay[index];
return o;
}
fixed4 frag(v2f i):SV_Target{
// Depth value in view space
float linearDepth =LinearEyeDepth(SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture,i.uv_depth));
// World coordinates
float3 worldPos =_WorldSpaceCameraPos+linearDepth*i.interpolatedRay;
// Offset value of noise texture
float2 speed =_Time.y*float2(_FogXSpeed,_FogYSpeed);
// Sample noise texture
float noise =(tex2D(_NoiseTex,i.uv+speed).r-0.5)*_NoiseAmount;
// According to the world coordinates y Value to calculate the concentration of fog
float fogDensity =(_FogEnd-worldPos.y)/(_FogEnd-_FogStart);// In proportion to
// Add the noise value , Randomly change the concentration of fog
fogDensity =(fogDensity*_FogDensity*(1+noise));// Fog concentration
fixed4 finalColor=tex2D(_MainTex,i.uv);
// Mix the fog color with the original color according to the fog concentration
finalColor.rgb =lerp(finalColor,_FogColor,fogDensity);
return finalColor;
}
ENDCG
Pass{
ZTest Always Cull Off ZTest Off
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
ENDCG
}
}
FallBack Off
}