Friday 11 July 2014

Shader Linker

I already posted about the shader linking graph previously, so I thought I would give a few more details about it.

Now that DirectX 11.2 is implemented in SharpDX, it was much easier to integrate, and it let me remove a small C++ library hack in the process.

Now, as much as I still don't think shader patching fits every use case, I have to admit there are a few cases where it proves really useful.

The first scenario actually doesn't use the linker, but the function reflector. You build a library of functions and do some form of signature matching, for example:

Code Snippet
float3 GravityStrength = float3(0.0f, -1.0f, 0.0f);

export float3 Gravity(float3 p, float3 v)
{
    return GravityStrength;
}

Now the library reflector will scan all functions, and if the signature matches (in this example, a particle force field needs a position/velocity tuple), it will automatically create a node. Attaching this node to a generic force field node will take the base code, include the function header, and compile a new shader.
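To make that concrete, here is a minimal sketch of what the assembled shader can look like once the reflected function is plugged in. The simulation shell, buffer names, and constant buffer below are hypothetical; only the Gravity function from the snippet above comes from the library, shown inline here for readability:

Code Snippet
// Hypothetical generic force field base code: it only needs a force
// function with the matched signature, whose body is injected from the
// reflected library function (here Gravity).
float3 GravityStrength = float3(0.0f, -1.0f, 0.0f);

float3 Force(float3 p, float3 v)
{
    return GravityStrength; // injected body of the Gravity node
}

RWStructuredBuffer<float3> PositionBuffer : register(u0);
RWStructuredBuffer<float3> VelocityBuffer : register(u1);

cbuffer cbSim : register(b0) { float dt; };

[numthreads(64, 1, 1)]
void CS(uint3 id : SV_DispatchThreadID)
{
    float3 p = PositionBuffer[id.x];
    float3 v = VelocityBuffer[id.x];
    v += Force(p, v) * dt; // the only line that touches the plugged node
    PositionBuffer[id.x] = p + v * dt;
    VelocityBuffer[id.x] = v;
}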

This is something I already did a lot (using function headers to inject code into an incomplete shader, then include and compile), so you can build a function library while keeping a single main function. Function reflection brings it to the next level though, since I can now get a whole collection of nodes in a few lines of code.

I use it for a lot of cases (geometry displacers, materials, particle interactions...) and it's great.

For geometry displacers, for example (this is overly simplified compared to the real version), I have several displace techniques (tetrahedron, tube, face), and each needs a displacement value from a position.

So as soon as I have a new displacement technique, it can be integrated with all displacer functions, which gives a great amount of flexibility (and removes a lot of boilerplate code).
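As an illustration, the displacement side of such a library could look like this. The functions and parameters below are hypothetical examples; what matters is that they share the same position-in/offset-out signature, so the reflector can turn each one into a node usable by any displacer technique:

Code Snippet
// Hypothetical displacement functions: same signature, so each one
// becomes a node that any displacer technique can consume.
float Amount = 0.1f;
float Frequency = 8.0f;

export float3 SineDisplace(float3 p)
{
    // push outwards from the origin, modulated by a sine wave along Y
    return normalize(p) * sin(p.y * Frequency) * Amount;
}

export float3 PinchDisplace(float3 p)
{
    // pull towards the Y axis, proportionally to height
    return float3(-p.x, 0.0f, -p.z) * saturate(p.y) * Amount;
}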

Now I have some cases where I need more complex functions, and that's where using the linker really shines:

  • Pixel shaders for screen space textures. I use this a lot for masks.
  • Deferred pixel materials: using noise/sine functions for color/roughness/reflectivity can be done in handlers, but I often need more complex functions.
  • Distance field builders
Here is an example for the pixel mask builder:



The great thing about all this is that I have a nice collection of pre-built functions (noise, sine displacers...), since I don't want to patch those, and I can easily combine them to create variations. If I need a new function, I can also create a custom one in code, and it will reflect the parameters and create a node for me. Having mixed support for patch/code is such a great feature.
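For example, a custom mask function written in the code editor can be as small as this (a hypothetical radial mask; the globals it declares are what gets reflected as node parameters):

Code Snippet
// Hypothetical custom mask node: reflected globals become pins on the node.
float Radius = 0.4f;
float Softness = 0.1f;

export float RadialMask(float2 uv)
{
    // distance from screen center, remapped to a soft-edged disc
    float d = length(uv - 0.5f);
    return 1.0f - smoothstep(Radius - Softness, Radius + Softness, d);
}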

Now one thing I would also like to build quite easily is distance fields.

I already have a distance field builder, which is made using a compute shader and uses an include script to build the function. This works really well, but having it as a patch could also be interesting.

So first thing, you build your small function collection (primitive distance functions, bend/twist operators, union/subtract/intersect...), and you have all your nodes ready.
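A minimal sketch of such a collection, assuming the usual primitives and operators (these are common examples, not the actual library):

Code Snippet
// Hypothetical distance function library: primitives return a signed
// distance from a point, operators combine two distances.
export float SdSphere(float3 p, float radius)
{
    return length(p) - radius;
}

export float SdBox(float3 p, float3 halfSize)
{
    float3 d = abs(p) - halfSize;
    return length(max(d, 0.0f)) + min(max(d.x, max(d.y, d.z)), 0.0f);
}

export float OpUnion(float d1, float d2)     { return min(d1, d2); }
export float OpSubtract(float d1, float d2)  { return max(d1, -d2); }
export float OpIntersect(float d1, float d2) { return max(d1, d2); }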

Now this runs into an issue: the shader linker only builds either vertex or pixel shaders.
I could use the "GenerateHLSL" function and include that in my code, but then you lose the advantage of using the linker over the compiler: speed. Link speed is REALLY fast compared to compile speed.

So let's rewrite that routine, since writing to a volume is also really easy to implement in the standard pipeline:
  • Create a single render target for your volume
  • Send a draw call with N full screen quad instances (or triangles, your choice), where N = volume depth
  • In the vertex shader, multiply the instance ID by the inverse slice count, and pass the object space position plus this depth to the geometry shader (as well as the instance ID itself)
  • In the geometry shader, assign the instance ID to SV_RenderTargetArrayIndex so each quad instance is drawn into a single slice.
  • Compute your distance function in the pixel shader :)
Here are the definitions:

Code Snippet
struct vsInput
{
    float4 p : POSITION;
    float2 uv : TEXCOORD0;
    uint ii : SV_InstanceID;
};

struct gsInput
{
    float4 p : POSITION;
    float3 xyz : SDFPOS;
    uint ii : SLICEINDEX;
};

struct gsOutput
{
    float4 p : SV_Position;
    float3 xyz : SDFPOS;
    uint rt : SV_RenderTargetArrayIndex;
};

struct psInput
{
    float4 p : SV_Position;
    float3 xyz : SDFPOS;
};

int SliceCount = 128;

Vertex Shader:

Code Snippet
gsInput VS(vsInput input)
{
    gsInput output;
    output.p = input.p;
    output.xyz = float3(input.uv, input.ii / (float)SliceCount) - 0.5f;
    output.ii = input.ii;

    return output;
}

Geometry Shader:

Code Snippet
[maxvertexcount(3)]
void GS(triangle gsInput input[3], inout TriangleStream<gsOutput> gsout)
{
    gsOutput output;
    for (int i = 0; i < 3; i++)
    {
        output.p = input[i].p;
        output.xyz = input[i].xyz;
        output.rt = input[i].ii;
        gsout.Append(output);
    }
    gsout.RestartStrip();
}

And a dummy pixel shader (that builds a sphere):

Code Snippet
float PS(psInput input) : SV_Target
{
    return length(input.xyz) - 0.25f;
}
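A patched version simply composes library nodes instead of this hard-coded sphere. A hand-written equivalent of one possible combination (using the hypothetical functions sketched earlier, so purely illustrative) would be:

Code Snippet
// Illustrative hand-written equivalent of a patched pixel shader:
// a box subtracted from a sphere, built from the library functions above.
float PS_Patched(psInput input) : SV_Target
{
    float sphere = SdSphere(input.xyz, 0.3f);
    float box    = SdBox(input.xyz, float3(0.2f, 0.2f, 0.2f));
    return OpSubtract(sphere, box);
}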

Now we can just bypass this dummy pixel shader and use a patched version of it, as in these screenshots:



Here we go, distance field patcher :)

And a version with some nodes built from the code editor; hybrid is the future ;)

