Source Code Analysis of the URP Workflow

URP (Universal Render Pipeline) is Unity's replacement for the traditional built-in pipeline. These are my notes from reading the source~

The entry point of URP is UniversalRenderPipelineAsset.cs, which performs a series of initialization steps. The core is the CreatePipeline function, which creates the render pipeline: the default renderer is ForwardRenderer, which inherits from ScriptableRenderer, and it initializes the default pipeline instance, UniversalRenderPipeline.


UniversalRenderPipeline
The constructor first initializes a series of rendering parameters, such as global shader property IDs.

        public UniversalRenderPipeline(UniversalRenderPipelineAsset asset)
        {
            SetSupportedRenderingFeatures(); // some settings in editor mode

            PerFrameBuffer._GlossyEnvironmentColor = Shader.PropertyToID("_GlossyEnvironmentColor");
            PerFrameBuffer._SubtractiveShadowColor = Shader.PropertyToID("_SubtractiveShadowColor");
            PerFrameBuffer._Time = Shader.PropertyToID("_Time");
            PerFrameBuffer._SinTime = Shader.PropertyToID("_SinTime");
            PerFrameBuffer._CosTime = Shader.PropertyToID("_CosTime");
            PerFrameBuffer.unity_DeltaTime = Shader.PropertyToID("unity_DeltaTime");
            PerFrameBuffer._TimeParameters = Shader.PropertyToID("_TimeParameters");

            PerCameraBuffer._InvCameraViewProj = Shader.PropertyToID("_InvCameraViewProj");
            PerCameraBuffer._ScreenParams = Shader.PropertyToID("_ScreenParams");
            PerCameraBuffer._ScaledScreenParams = Shader.PropertyToID("_ScaledScreenParams");
            PerCameraBuffer._WorldSpaceCameraPos = Shader.PropertyToID("_WorldSpaceCameraPos");

            // Let engine know we have MSAA on for cases where we support MSAA backbuffer
            if (QualitySettings.antiAliasing != asset.msaaSampleCount)
                QualitySettings.antiAliasing = asset.msaaSampleCount;

            // For compatibility reasons we also match old LightweightPipeline tag.
            Shader.globalRenderPipeline = "UniversalPipeline,LightweightPipeline";

            Lightmapping.SetDelegate(lightsDelegate);
            CameraCaptureBridge.enabled = true;
            RenderingUtils.ClearSystemInfoCache();
        }

Once initialization is done, Unity automatically calls UniversalRenderPipeline's Render() function every frame:

        protected override void Render(ScriptableRenderContext renderContext, Camera[] cameras)
        {
            BeginFrameRendering(renderContext, cameras);
            GraphicsSettings.lightsUseLinearIntensity = (QualitySettings.activeColorSpace == ColorSpace.Linear);
            GraphicsSettings.useScriptableRenderPipelineBatching = asset.useSRPBatcher;
            SetupPerFrameShaderConstants();
            SortCameras(cameras);
            foreach (Camera camera in cameras)
            {
                BeginCameraRendering(renderContext, camera);
                RenderSingleCamera(renderContext, camera);
                EndCameraRendering(renderContext, camera);
            }

            EndFrameRendering(renderContext, cameras);
        }

BeginFrameRendering, BeginCameraRendering, EndCameraRendering, and EndFrameRendering are marker callbacks that signal the start or end of each stage. The Render logic itself is simple: set up the per-frame shader constants and global settings (linear light intensity, SRP batching, etc.), sort the cameras by their depth value, and then render each camera in order. The key function is RenderSingleCamera().
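The depth sort mentioned above can be sketched as follows; this is a minimal sketch of what SortCameras does (ascending Camera.depth order), not the verbatim URP source:

```csharp
// Sketch: cameras with a smaller depth value render first.
void SortCameras(Camera[] cameras)
{
    Array.Sort(cameras, (lhs, rhs) => (int)(lhs.depth - rhs.depth));
}
```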

        /// <summary>
        /// Standalone camera rendering. Use this to render procedural cameras.
        /// This method doesn't call <c>BeginCameraRendering</c> and <c>EndCameraRendering</c> callbacks.
        /// </summary>
        /// <param name="context">Render context used to record commands during execution.</param>
        /// <param name="camera">Camera to render.</param>
        /// <seealso cref="ScriptableRenderContext"/>
        public static void RenderSingleCamera(ScriptableRenderContext context, Camera camera)
        {
            UniversalAdditionalCameraData additionalCameraData = null;
            if (IsGameCamera(camera))
                camera.gameObject.TryGetComponent(out additionalCameraData);

            if (additionalCameraData != null && additionalCameraData.renderType != CameraRenderType.Base)
            {
                Debug.LogWarning("Only Base cameras can be rendered with standalone RenderSingleCamera. Camera will be skipped.");
                return;
            }

            InitializeCameraData(camera, additionalCameraData, true, out var cameraData);
#if ADAPTIVE_PERFORMANCE_2_0_0_OR_NEWER
            if (asset.useAdaptivePerformance)
                ApplyAdaptivePerformance(ref cameraData);
#endif
            RenderSingleCamera(context, cameraData, cameraData.postProcessEnabled);
        }

At a high level, the logic of RenderSingleCamera is also straightforward: fetch the culling data, read the camera's per-camera settings (MSAA, depth texture, etc.), set camera-related global shader parameters (view/projection matrices, etc.), take a cmd from the CommandBufferPool, and then invoke the renderer configured on the asset (ForwardRenderer by default). From there it moves on to renderer setup, submits the rendering commands, and ends the current camera's rendering.

The standard Unity rendering flow:

  • Clear the renderer, set up the culling parameters, execute and clear the cmd
        renderer.Clear();
        renderer.SetupCullingParameters(ref cullingParameters, ref cameraData);
        context.ExecuteCommandBuffer(cmd);
        cmd.Clear();
  • Cull using the culling parameters
        var cullResults = context.Cull(ref cullingParameters);
  • Initialize the rendering data
        InitializeRenderingData(settings, ref cameraData, ref cullResults, out var renderingData);
  • Set up and execute the renderer
        renderer.Setup(context, ref renderingData);
        renderer.Execute(context, ref renderingData);
  • Execute the cmd, then release it
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
  • Submit the rendering commands
        context.Submit();

A quick note on cmd: in GPU rendering, every draw is a command. Unity lets you push multiple draw commands into the same CommandBuffer. For example, if I first push a command that draws the shadow map, then push commands that draw opaque objects, and finally execute the CommandBuffer, the GPU will run the shadow-related passes first and the opaque passes after. By the same token, we can use CommandBuffers to build custom effects of our own.
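As an illustration, here is a minimal sketch of batching commands into one CommandBuffer; the render-texture name "_ExampleShadowMap" is hypothetical, and `context` is assumed to be a ScriptableRenderContext available at this point:

```csharp
// Sketch (not URP source): commands execute on the GPU in push order.
CommandBuffer cmd = CommandBufferPool.Get("ExampleCommands");
int shadowMapId = Shader.PropertyToID("_ExampleShadowMap"); // hypothetical RT name
cmd.GetTemporaryRT(shadowMapId, 1024, 1024, 16, FilterMode.Bilinear, RenderTextureFormat.Shadowmap);
cmd.SetRenderTarget(shadowMapId);
cmd.ClearRenderTarget(true, true, Color.clear);
// ... push shadow-caster draws here, e.g. cmd.DrawRenderer(...)

// Switch back to the camera target and push the opaque draws:
cmd.SetRenderTarget(BuiltinRenderTextureType.CameraTarget);
// ... push opaque draws here

context.ExecuteCommandBuffer(cmd); // shadow commands run first, opaques after
cmd.ReleaseTemporaryRT(shadowMapId);
CommandBufferPool.Release(cmd);
```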

InitializeRenderingData filters the visible lights, picks the main light, and fills in the shadow settings (shadow bias, shadow map resolution, etc.), the post-processing LUT, and dynamic batching.
The two functions worth analyzing in depth are renderer.Setup(context, ref renderingData) and renderer.Execute(context, ref renderingData).
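The main-light selection can be sketched roughly as below. This is a simplified assumption about the logic (URP prefers the designated sun, otherwise the brightest visible directional light), not the verbatim source:

```csharp
// Sketch: pick the main light index from the visible lights.
// Requires Unity.Collections for NativeArray.
int GetMainLightIndex(NativeArray<VisibleLight> visibleLights)
{
    int brightestIndex = -1;
    float brightestIntensity = 0f;
    for (int i = 0; i < visibleLights.Length; ++i)
    {
        VisibleLight v = visibleLights[i];
        if (v.lightType != LightType.Directional)
            continue;
        if (v.light == RenderSettings.sun)
            return i; // the designated sun wins outright
        if (v.light.intensity > brightestIntensity)
        {
            brightestIntensity = v.light.intensity;
            brightestIndex = i;
        }
    }
    return brightestIndex; // -1 means no main light
}
```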

ForwardRenderer & ScriptableRenderer
ForwardRenderer inherits from ScriptableRenderer. Its initialization code is as follows:

        public ForwardRenderer(ForwardRendererData data): base(data)
        {
            Material blitMaterial = CoreUtils.CreateEngineMaterial(data.shaders.blitPS);
            Material copyDepthMaterial = CoreUtils.CreateEngineMaterial(data.shaders.copyDepthPS);
            Material samplingMaterial = CoreUtils.CreateEngineMaterial(data.shaders.samplingPS);
            Material screenspaceShadowsMaterial = CoreUtils.CreateEngineMaterial(data.shaders.screenSpaceShadowPS);
            StencilStateData stencilData = data.defaultStencilState;
            m_DefaultStencilState = StencilState.defaultValue;
            m_DefaultStencilState.enabled = stencilData.overrideStencilState;
            m_DefaultStencilState.SetCompareFunction(stencilData.stencilCompareFunction);
            m_DefaultStencilState.SetPassOperation(stencilData.passOperation);
            m_DefaultStencilState.SetFailOperation(stencilData.failOperation);
            m_DefaultStencilState.SetZFailOperation(stencilData.zFailOperation);

            // Note:
            // Since all custom render passes inject first and we have stable sort,
            // we inject the builtin passes in the before events.
            m_MainLightShadowCasterPass = new MainLightShadowCasterPass(RenderPassEvent.BeforeRenderingShadows);
            m_AdditionalLightsShadowCasterPass = new AdditionalLightsShadowCasterPass(RenderPassEvent.BeforeRenderingShadows);
            m_DepthPrepass = new DepthOnlyPass(RenderPassEvent.BeforeRenderingPrepasses, RenderQueueRange.opaque, data.opaqueLayerMask);
            m_ScreenSpaceShadowResolvePass = new ScreenSpaceShadowResolvePass(RenderPassEvent.BeforeRenderingPrepasses, screenspaceShadowsMaterial);
            m_ColorGradingLutPass = new ColorGradingLutPass(RenderPassEvent.BeforeRenderingOpaques, data.postProcessData);
            m_RenderOpaqueForwardPass = new DrawObjectsPass("Render Opaques", true, RenderPassEvent.BeforeRenderingOpaques, RenderQueueRange.opaque,
                data.opaqueLayerMask, m_DefaultStencilState, stencilData.stencil