Gray Rendering - Real-Time Fur over Arbitrary Surfaces

Please forgive my poor English translation (for reference only)
http://research.microsoft.com/en-us/um/people/hoppe/fur.pdf

Prior research: mainly the shell method
Disadvantages:
1. A global surface parameterization is required, but not every surface has one (otherwise the surface must be cut into pieces and rendered separately).
2. A lot of texture memory is needed: each shell needs a different texture covering the entire surface, and the textures must be large enough to resolve individual hairs.
3. The shell method provides an effective approximation of the volume texture only when the viewing direction is roughly perpendicular to the surface. Near silhouettes (where the shells are seen at grazing angles), the hair appears too transparent and the gaps between shells become visible.

The improved algorithm addresses these issues:
1. Use lapped textures (solves problems 1 and 2):
Lapped textures can cover a surface of arbitrary topology by pasting down small patches of an example texture (wrap mode?). We therefore only need local parameters for each individual patch rather than a global parameterization, and many small surface patches share a single texture, which solves the problem of excessive texture memory.
2. For the silhouette problem (here the silhouette means the parts of the surface viewed edge-on, nearly parallel to the line of sight):
Strictly speaking, this is not a problem specific to the shell method. In interactive settings, designers use low-polygon models to keep the frame rate high. Detailed textures help embellish these coarse models, except at silhouettes, where the tangent discontinuities of the polygonal outline become visible and the visual quality drops. When rendering hair, the appearance of the silhouette plays a very important role in perception. These polygonal artifacts can be alleviated with higher-resolution geometry, or with Sander's method of clipping against a high-resolution 2D contour.
To address this, we introduce a scheme that renders fins perpendicular to the surface near the silhouette. The fin texture simulates hair using the same volumetric model as the shell textures, but sampled along a different direction (one better suited to grazing viewing directions); alternatively, an artist can author the fin texture directly.
We provide local and global control over attributes such as direction, length, and color.

Contributions:
1. Combining lapped textures with the shell method lets us cover surfaces of arbitrary topology while reducing texture memory
2. Improved visual quality at silhouettes
3. Global and local control over the hair

Approach

Our system takes as input a creature model and hair parameters. Before real-time rendering, we perform two preprocessing steps:
1) Geometry preprocessing: compute the lapped-texture patch parameterization.

2) Texture preprocessing: create a geometric model of a block of hair and sample it into shell and fin textures.
At runtime, we render a series of textured concentric shells offset from the original surface, together with fins perpendicular to the surface (which better represent the hair near silhouettes).
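To make the overall flow concrete, here is a rough structural sketch in C++ of the two preprocessing steps and the per-frame draw order. The type and function names are my own placeholders, not the paper's code, and the bodies are stubs.

```cpp
#include <vector>

struct Mesh {};                     // placeholder for the creature surface
struct PatchLayout {};              // placeholder for the lapped-texture patch parameters
struct FurTextures {
    std::vector<unsigned> shells;   // one RGBA texture per concentric shell layer
    unsigned fin;                   // a single RGBA fin texture
};

// Offline step 1: compute the lapped-texture patch parameterization (stub).
PatchLayout preprocessGeometry(const Mesh&) { return {}; }
// Offline step 2: simulate a block of hair and sample it into shell/fin textures (stub).
FurTextures preprocessTextures(int numShells) { return {std::vector<unsigned>((size_t)numShells), 0u}; }

void renderOpaqueSurface(const Mesh&) {}                        // skin + hairless regions
void renderFins(const Mesh&, unsigned /*finTex*/) {}            // near-silhouette quads
void renderShell(const Mesh&, const PatchLayout&, unsigned) {}  // one offset layer

// Per-frame draw order described in the runtime section below.
void renderFrame(const Mesh& mesh, const PatchLayout& layout, const FurTextures& tex)
{
    renderOpaqueSurface(mesh);              // fills the Z-buffer first
    renderFins(mesh, tex.fin);              // depth-tested, no depth writes
    for (unsigned shellTex : tex.shells)    // innermost to outermost
        renderShell(mesh, layout, shellTex);
}
```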

Geometric preprocessing:
To build the patch parameterization, we use Praun's lapped-texture method. Briefly, this method repeatedly places patches at random over regions of the surface that are not yet covered, and locally parameterizes each patch onto the texture domain with an optimization.
In the original lapped-texture method, patches overlap one another and are given irregular, alpha-blended boundaries to make the seams between them less noticeable. For many kinds of hair texture, these measures are unnecessary (visible seams look natural in fur). Instead, we make patch boundaries coincide with mesh edges, which eliminates overlap between patches entirely. Since the hair texture is stochastic, the resulting texture discontinuities along mesh edges are almost invisible. One advantage of this non-overlapping parameterization is that the triangle mesh of each shell only needs to be rendered once, rather than once per overlapping patch. Another advantage is that we no longer need a texture boundary, so the patch texture can wrap around and be tiled to cover a larger surface patch.
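As an illustration of the non-overlapping variant described above, here is a minimal C++ sketch of growing patches whose boundaries follow mesh edges and giving each patch a local parameterization. It is not Praun et al.'s algorithm: the seeds are taken in a simple sweep rather than placed randomly, and the planar projection stands in for their optimized, direction-field-aligned parameterization; the data structures are placeholders.

```cpp
#include <array>
#include <queue>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Mesh {
    std::vector<Vec3> positions;                  // one position per vertex
    std::vector<std::array<int, 3>> faces;        // vertex indices per triangle
    std::vector<std::vector<int>> faceNeighbors;  // adjacent faces per face
};

struct Patch { std::vector<int> faces; };         // faces owned by this patch

// Assign every face to exactly one patch, so patch boundaries lie on mesh edges.
std::vector<Patch> growPatches(const Mesh& mesh, int maxFacesPerPatch)
{
    std::vector<int> owner(mesh.faces.size(), -1);
    std::vector<Patch> patches;
    for (int seed = 0; seed < (int)mesh.faces.size(); ++seed) {
        if (owner[seed] != -1) continue;          // face already covered
        int id = (int)patches.size();
        Patch patch;
        std::queue<int> frontier;
        frontier.push(seed);
        owner[seed] = id;
        while (!frontier.empty() && (int)patch.faces.size() < maxFacesPerPatch) {
            int f = frontier.front(); frontier.pop();
            patch.faces.push_back(f);
            for (int n : mesh.faceNeighbors[f])
                if (owner[n] == -1) { owner[n] = id; frontier.push(n); }
        }
        // Faces still queued were claimed but not added; release them so a later patch covers them.
        while (!frontier.empty()) { owner[frontier.front()] = -1; frontier.pop(); }
        patches.push_back(std::move(patch));
    }
    return patches;
}

// Crude local parameterization: project the patch onto a tangent frame. Returns one
// (u, v) per face corner, so boundary vertices can carry different coordinates in
// different patches. The paper instead optimizes these coordinates per patch.
std::vector<std::array<float, 2>> parameterizePatch(const Mesh& mesh, const Patch& patch,
                                                    Vec3 origin, Vec3 uAxis, Vec3 vAxis)
{
    std::vector<std::array<float, 2>> cornerUV;
    for (int f : patch.faces)
        for (int v : mesh.faces[f]) {
            Vec3 d = sub(mesh.positions[v], origin);
            cornerUV.push_back({dot(d, uAxis), dot(d, vAxis)});
        }
    return cornerUV;
}
```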

Even with a non-overlapping parameterization, we retain several benefits of the original scheme:
1) We still benefit from aligning the patches to a global direction field over the surface. This consistent direction helps hide the seams between patches (hair has a preferred growth direction).
2) Because of their irregular, splotch-like shape, the patches have uneven boundaries made of mesh edges, which also helps mask the seams. For some hair types, such as curly hair, we can ignore the global direction field and create an isotropic parameterization, so that a patch's local orientation is not aligned with that of its neighbors.

Texture preprocessing:
Using Lengyel's method, we create the shell textures from the geometry of the hair strands. The strands are grown with a particle system. To make the volumetric texture tileable, the simulation takes place in a small box that wraps around along the two axes parallel to the skin. By varying the parameters we obtain different kinds of hair (straight, curly, ...).
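As a hedged sketch of what such a simulation might look like, the C++ below grows strands as polylines in a unit box. The drift model (a curl term plus jitter) is invented for illustration; tiling is achieved by treating the x and y coordinates modulo 1 when the strands are later sampled into textures.

```cpp
#include <cmath>
#include <random>
#include <vector>

struct HairVertex { float x, y, z; };      // position inside the hair box (z = 0 at the skin)
using Strand = std::vector<HairVertex>;

// Grow 'numStrands' polyline strands of 'segments' segments each. Taking x and y
// modulo 1 when sampling makes the resulting volume texture tile along the two
// axes parallel to the skin. 'curl' controls how curly the strands are.
std::vector<Strand> growStrands(int numStrands, int segments, float curl, unsigned seed)
{
    const float twoPi = 6.2831853f;
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    std::normal_distribution<float> jitter(0.0f, 0.01f);

    std::vector<Strand> strands;
    for (int i = 0; i < numStrands; ++i) {
        float x = uni(rng), y = uni(rng);           // random root on the skin plane
        float phase = uni(rng) * twoPi;             // per-strand curl phase
        Strand s;
        s.push_back({x, y, 0.0f});                  // root sits on the skin
        for (int k = 1; k <= segments; ++k) {
            float t = float(k) / float(segments);   // 0 at the root, 1 at the tip
            // Lateral drift: a circular (curl) component plus a little noise.
            x += curl * std::cos(phase + twoPi * t) / segments + jitter(rng);
            y += curl * std::sin(phase + twoPi * t) / segments + jitter(rng);
            s.push_back({x, y, t});                 // grow outward along z
        }
        strands.push_back(std::move(s));
    }
    return strands;
}
```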

Shell Texture
For each shell layer, the shell method creates an RGBA image by placing a grid over the bounding box of the hair geometry. At each grid point we compute color and opacity by applying a Gaussian filter to the hair; the filter is strongly anisotropic, because the spacing between shells is much larger than the grid spacing within each shell. The innermost shell is completely opaque, since it is simply the skin surface.
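A simplified illustration of this sampling step, assuming the strand vertices from the previous sketch: instead of gathering at each grid point, it splats every hair vertex into one layer's alpha grid with a Gaussian that is far wider across shells (in z) than within the layer (in x, y), and wraps coordinates so the resulting texture tiles. Color accumulation and the fully opaque innermost layer are left out.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct HairVertex { float x, y, z; };   // hair sample inside the unit box, as above

static float gauss(float d, float sigma) { return std::exp(-0.5f * d * d / (sigma * sigma)); }
static int wrapi(int i, int n) { return ((i % n) + n) % n; }   // wrap a texel index

// Accumulate opacity for the shell layer at height 'layerZ' into a res x res grid.
// sigmaZ should be much larger than sigmaXY (inter-shell spacing >> texel spacing).
std::vector<float> splatShellLayer(const std::vector<HairVertex>& hairSamples,
                                   float layerZ, int res, float sigmaXY, float sigmaZ)
{
    std::vector<float> alpha(res * res, 0.0f);
    const int radius = (int)std::ceil(3.0f * sigmaXY * res);   // 3-sigma footprint in texels
    for (const HairVertex& h : hairSamples) {
        float wz = gauss(h.z - layerZ, sigmaZ);                // weight across shells
        if (wz < 1e-3f) continue;                              // sample too far from this layer
        int cx = (int)std::floor(h.x * res);
        int cy = (int)std::floor(h.y * res);
        for (int dy = -radius; dy <= radius; ++dy)
            for (int dx = -radius; dx <= radius; ++dx) {
                float r = std::sqrt(float(dx * dx + dy * dy)) / res;    // in-plane distance
                float w = wz * gauss(r, sigmaXY);
                int tx = wrapi(cx + dx, res), ty = wrapi(cy + dy, res); // wrap so it tiles
                alpha[ty * res + tx] = std::min(1.0f, alpha[ty * res + tx] + w);
            }
    }
    return alpha;
}
```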
To combine lapped textures with the shell method, we use each patch's parameterization when applying the shell textures at the different layers. For each layer, the patch texture is instantiated repeatedly to produce a semitransparent shell covering the entire mesh, and this process is carried out layer by layer.

Fin texture
We use a single fin texture for the whole surface. It is generated by sampling the geometric hair model through a thin slab along an arbitrary tangent direction; the slab width determines the hair density attributed to the fin. When rendering each fin, we randomly choose an interval of texture coordinates whose length matches the edge, with a scaling factor consistent with the lapped-texture parameterization of the mesh. Ideally, every edge would have its own fin texture, sampled at the right place in the hair model and accounting for local curvature and density. We found this unnecessary, however (a single generic fin texture already looks good enough), and it would increase texture memory. Since the local density of fins depends on the tessellation of the model, we manually choose the width of the sampled slab for each model so that an appropriate density results for the given model and texture.
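A small sketch of the per-edge texture-coordinate choice described above (parameter names are illustrative): a random starting u is chosen, and the interval's width matches the edge length under the same texels-per-unit-length scale used by the lapped-texture parameterization; the fin texture is assumed to wrap horizontally.

```cpp
#include <random>

struct FinUVs { float u0, u1; };   // horizontal interval in the fin texture; v runs 0..1 along the hair

FinUVs chooseFinInterval(float edgeLength, float texelsPerUnitLength,
                         float finTextureWidthInTexels, std::mt19937& rng)
{
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    // Interval width in normalized texture units, consistent with the surface scale.
    float span = edgeLength * texelsPerUnitLength / finTextureWidthInTexels;
    float u0 = uni(rng);            // random start; texture wrapping handles u0 + span > 1
    return {u0, u0 + span};
}
```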

Runtime Rendering
Surface rendering
Each frame, we first render the opaque mesh to initialize the Z-buffer. This includes the innermost shell layer (the skin) and the regions without hair.
Fin rendering
Next, we render the fins. Each fin is a quad rooted at a mesh edge and extruded along the normal direction. We found that rendering fins only near the silhouette gives better results. To maintain temporal coherence, fins fade in gradually as they approach the silhouette: for each mesh edge, we compute p, the dot product of the fin normal and the view direction, and scale the fin's opacity by max(0, 2|p| - 1). We gather the fins that are not fully transparent and render them with depth testing enabled but without writing to the Z-buffer. Note that we also render fins on some edges that are themselves hidden, because the tips of those fins may still be visible. The rendered fins account for only a fraction of all edges, and their fill cost is small compared with that of the shell layers, so fin rendering has little impact on performance while noticeably improving the visuals.
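The fade factor above is simple enough to write out; a minimal C++ version follows (the vector type is a placeholder). A fin whose factor is zero is skipped; the rest are drawn with the depth test on and depth writes off.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// p is the dot product of the (unit) fin normal and the (unit) view direction.
// The factor is 0 when the surface faces the viewer and 1 at grazing angles,
// so fins fade in smoothly as an edge approaches the silhouette.
float finAlphaScale(Vec3 finNormal, Vec3 viewDir)
{
    float p = dot(finNormal, viewDir);
    return std::max(0.0f, 2.0f * std::fabs(p) - 1.0f);
}
```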

Shell rendering
Next, we render the offset shells from the inside out. For each layer, we draw the semitransparent lapped-texture patches for that layer. During shell rendering, we keep both depth testing and depth writing enabled, and turn on alpha testing so that pixels with alpha 0 do not write depth. To allow the shell texture to tile across a large patch, we also enable texture wrapping.
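As a hedged sketch of that state setup in classic fixed-function OpenGL (the paper predates programmable shaders; drawShellGeometry() is a placeholder for submitting one offset layer with its lapped-texture coordinates):

```cpp
#include <GL/gl.h>

void drawShellGeometry(int layer);   // placeholder: draw the offset mesh for this layer

void renderShells(const unsigned* shellTextures, int numLayers)
{
    glEnable(GL_TEXTURE_2D);
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);                            // shells test and write depth...
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.0f);                   // ...except texels with alpha 0
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    for (int layer = 0; layer < numLayers; ++layer) {                  // innermost to outermost
        glBindTexture(GL_TEXTURE_2D, shellTextures[layer]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);  // let the shell texture
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);  // tile across a patch
        drawShellGeometry(layer);
    }
}
```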

Rendering Discussion
