Mesh Rendering

Oct 23, 2007 at 8:36 PM
How is the engine going to organise effects/materials?

There seem to be two ways shaders are treated in games.
  • Each mesh gets its own effect instance, with the material parameters for that mesh already set.
  • Each mesh gets a material, whose parameters are set on the shader before the mesh is drawn.

Both of these can be quite efficient when states are batched correctly, but I'm not sure which is fastest. It comes down to: which is faster, calling effect.Begin() and pass.Begin() etc., or calling param.SetValue()?
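To make the two options concrete, here is a minimal sketch of what each draw loop might look like. All the Mesh/Material types and the DrawGeometry/Apply methods are hypothetical stand-ins; the Effect/pass Begin/End calls follow the XNA 2.x-era API.

```csharp
// Approach 1: each mesh owns a pre-configured Effect instance.
// Parameters were set once up front, so drawing is just Begin/End churn.
foreach (Mesh mesh in meshes)
{
    mesh.Effect.Begin();
    foreach (EffectPass pass in mesh.Effect.CurrentTechnique.Passes)
    {
        pass.Begin();
        mesh.DrawGeometry();   // hypothetical: issues the DrawIndexedPrimitives call
        pass.End();
    }
    mesh.Effect.End();
}

// Approach 2: meshes share one Effect; a material sets parameters per mesh.
sharedEffect.Begin();
foreach (Mesh mesh in meshes)
{
    mesh.Material.Apply(sharedEffect);   // hypothetical: param.SetValue(...) calls
    sharedEffect.CommitChanges();        // push parameter changes made inside Begin/End
    foreach (EffectPass pass in sharedEffect.CurrentTechnique.Passes)
    {
        pass.Begin();
        mesh.DrawGeometry();
        pass.End();
    }
}
sharedEffect.End();
```

Note that in the shared-effect case, parameter changes made between Begin() and End() need a CommitChanges() call before they take effect, which is part of the cost being compared.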
Oct 23, 2007 at 10:31 PM
Performance-wise, pooling the effects and rendering in order of effects would be ideal, so every effect is only activated once per frame. I haven't really thought about how it's all going to fit together into the engine yet, though. A lot of it will depend on how the individual scene managers are written, as they will be responsible for sending commands to the renderer. Whether the renderer or the scene manager does the actual sorting by material, I'm not sure.
Oct 23, 2007 at 10:47 PM
I would create a mesh manager that contains lists of lists of the meshes. There is a decent example of one of these in Benjamin Nitschke's book (well, the source code for the book).

In his system, you would have a list of SortedEffects, each SortedEffect contains a list of SortedTechniques, each SortedTechnique has a list of SortedMaterials, each SortedMaterial has a list of SortedMeshes, and each SortedMesh has a list of transforms.
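That nesting might be sketched as something like the following. All of these type names come straight from the description above, but the fields are my guess at the shape; Material is a hypothetical type.

```csharp
// One entry per unique effect; children narrow down to technique,
// material, mesh, and finally the per-instance transforms.
class SortedEffect
{
    public Effect Effect;
    public List<SortedTechnique> Techniques = new List<SortedTechnique>();
}

class SortedTechnique
{
    public EffectTechnique Technique;
    public List<SortedMaterial> Materials = new List<SortedMaterial>();
}

class SortedMaterial
{
    public Material Material;   // hypothetical material type holding param values
    public List<SortedMesh> Meshes = new List<SortedMesh>();
}

class SortedMesh
{
    public ModelMeshPart MeshPart;
    public List<Matrix> Transforms = new List<Matrix>();   // filled each frame
}
```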

When a model is created, it requests a SortedMesh from the mesh manager for each of its mesh parts. If the manager already has that mesh/material/effect combination in its list it will return that, else it creates a new one and returns it.

model.Draw(Matrix transform) could be called by the scene manager, which just adds the transform to the transforms list for each of its mesh parts.

When it is time to draw the models, the mesh manager can quickly iterate over the list, starting each effect and setting parameters only as often as required. The transforms lists are cleared after rendering has finished.
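The render pass over that structure might look roughly like this. It assumes the SortedEffect hierarchy described above, a "World" parameter on each effect, and a hypothetical DrawMeshPart helper; CurrentTechnique has to be set before Begin() in XNA, hence the Begin/End per technique.

```csharp
foreach (SortedEffect se in sortedEffects)
{
    foreach (SortedTechnique st in se.Techniques)
    {
        se.Effect.CurrentTechnique = st.Technique;   // must be set before Begin()
        se.Effect.Begin();
        foreach (SortedMaterial sm in st.Materials)
        {
            sm.Material.Apply(se.Effect);            // set material params once per material
            foreach (SortedMesh mesh in sm.Meshes)
            {
                EffectParameter world = se.Effect.Parameters["World"];
                foreach (Matrix transform in mesh.Transforms)
                {
                    world.SetValue(transform);
                    se.Effect.CommitChanges();       // flush changes made inside Begin/End
                    foreach (EffectPass pass in se.Effect.CurrentTechnique.Passes)
                    {
                        pass.Begin();
                        DrawMeshPart(mesh.MeshPart); // hypothetical draw helper
                        pass.End();
                    }
                }
                mesh.Transforms.Clear();             // ready for next frame
            }
        }
        se.Effect.End();
    }
}
```

Each effect and material is touched once per frame, and only the world transform changes per instance, which is the whole point of the sorting.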
Oct 26, 2007 at 4:05 PM
Here is a (very) early version of a model renderer I decided to have a go at.

As we haven't decided exactly how subsystems and such are going to work, it's in its own project at the moment.

It uses a custom model class, based on an example (the link is missing from the original post), which shouldn't be too hard to extend. I suppose the real advantages of this system are that it renders a little more efficiently than the usual model.Draw() method, and supports multiple viewports.

However, it requires that any shaders specified by the model use "World", "View" and "Projection" parameters, else it will default to using BasicEffect. I don't know any way around this, and I don't actually have any models that use any shaders other than the XNA sample ones, so it may not work with custom effects at all :P
Oct 26, 2007 at 4:36 PM
Awesome, I'll take a look at it as soon as I can....which might be tomorrow.
Oct 26, 2007 at 10:17 PM
Aphid, it's not that strange that it will revert to BasicEffect. It's impossible for the engine to know which parameters are available and what to map to them. So the code simply assumes that if the effect does not contain "World", "View" and "Projection", or is a BasicEffect, then it reverts to BasicEffect.

Imagine an effect which had the following parameters:
  1. proj
  2. backProj
  3. myView
  4. World

How would the engine determine where to map World/View/Projection? So to be safe, BasicEffect is used. You can use something else besides BasicEffect: simply exchange BasicEffect with the default effect you wish to use and modify DrawWithBasicEffect accordingly.
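The fallback test being described might be sketched like this. DrawWithBasicEffect comes from the post above; DrawWithCustomEffect and the surrounding variables are illustrative, and this assumes the XNA parameter indexer returns null for a missing name.

```csharp
Effect effect = meshPart.Effect;
bool hasStandardParams =
    effect.Parameters["World"] != null &&
    effect.Parameters["View"] != null &&
    effect.Parameters["Projection"] != null;

if (effect is BasicEffect || !hasStandardParams)
{
    // Can't safely guess where to map the matrices, so fall back.
    DrawWithBasicEffect(meshPart, world, view, projection);
}
else
{
    effect.Parameters["World"].SetValue(world);
    effect.Parameters["View"].SetValue(view);
    effect.Parameters["Projection"].SetValue(projection);
    DrawWithCustomEffect(meshPart, effect);   // hypothetical custom-effect path
}
```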
Oct 26, 2007 at 10:37 PM
Use semantics, i.e.

float4x4 myMat1 : VIEW_MATRIX;
float4x4 myMat2 : WORLD_MATRIX;

Then, it doesn't matter what the variable names are. The renderer just binds to the semantics.
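On the C# side, XNA exposes each parameter's annotation via EffectParameter.Semantic, so a lookup helper along these lines would do the binding (the helper itself and the semantic strings are illustrative, matching the HLSL above):

```csharp
// Find a parameter by its semantic, ignoring the variable's actual name.
EffectParameter FindBySemantic(Effect effect, string semantic)
{
    foreach (EffectParameter p in effect.Parameters)
    {
        if (string.Equals(p.Semantic, semantic, StringComparison.OrdinalIgnoreCase))
            return p;
    }
    return null;
}

EffectParameter viewParam = FindBySemantic(effect, "VIEW_MATRIX");
if (viewParam != null)
    viewParam.SetValue(view);   // binds whether the shader calls it myMat1 or anything else
```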
Oct 26, 2007 at 10:43 PM
Yes, but if the semantics are not present then the engine still needs a fallback.