Modcraft - The community dedicated to quality WoW modding!

Projects => Software Development => Topic started by: Pausenclown on October 02, 2017, 09:23:22 pm

Title: Questions regarding Alphamap and rendering texture layers
Post by: Pausenclown on October 02, 2017, 09:23:22 pm
Hello everyone,

I'm currently working on rendering WoW terrain. I've come to the point where I'm able to render the terrain with normals and holes, as shown in the picture. The holes in the map are randomly generated and are just there to test my ability to render them appropriately. I'm using OpenGL 3.3 and do everything with vertex buffer objects and one element buffer object per chunk, which holds the triangle indices (including degenerates) for every part of the terrain that should be rendered (excluding holes, obviously).
On my main PC this works quite well, as I get a rendering time of 0.00003 ms per tile.
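
In case it helps anyone else: below is a minimal sketch of how such an index buffer can be built, assuming holes come from the low-res 16-bit mask in the MCNK header (mine are random for now) and the usual MCVT vertex order of interleaved 9-vertex outer and 8-vertex inner rows. It uses plain GL_TRIANGLES rather than my strip-with-degenerates layout.
Code:
#include <cstdint>
#include <vector>

// Build the index buffer for one chunk, skipping holed cells.
// Outer vertex (x, y): y * 17 + x; inner (center) vertex: y * 17 + 9 + x.
std::vector<uint32_t> buildChunkIndices(uint16_t holesLowRes)
{
    std::vector<uint32_t> indices;
    indices.reserve(8 * 8 * 4 * 3);

    for (int y = 0; y < 8; ++y)
    {
        for (int x = 0; x < 8; ++x)
        {
            // Each bit of the low-res mask covers a 2x2 block of cells.
            if (holesLowRes & (1u << ((y / 2) * 4 + (x / 2))))
                continue;

            uint32_t i = y * 17 + x;        // top left
            uint32_t j = i + 1;             // top right
            uint32_t k = y * 17 + 9 + x;    // center
            uint32_t l = (y + 1) * 17 + x;  // bottom left
            uint32_t m = l + 1;             // bottom right

            // Four triangles fanning around the center vertex; flip the
            // winding if your coordinate setup needs it.
            uint32_t fan[12] = { i, j, k,  j, m, k,  m, l, k,  l, i, k };
            indices.insert(indices.end(), fan, fan + 12);
        }
    }
    return indices;
}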

Now I want to move on to rendering textures. What I've understood so far is that one texture repetition covers 1/64th of a chunk, which corresponds in my notation to one quad consisting of the vertices
i        j
    k
l       m
of which there are 64 in one chunk of the tile. As an example, I looked at one texture (ArathiHighlandsDirt; the map in the picture is from Arathi as well), which is 256 by 256 pixels. My problem now is the following: with a texture resolution of 256x256 px per quad but an alphamap resolution of 64x64 px per chunk, how can one blend the textures properly in the shader? Is the alphamap just stored at this resolution, or is it actually applied at this resolution? In the first case I suppose an interpolation function is used to scale the alphamap up to the same resolution; in the second case, how is it done then?

What I do understand from https://wowdev.wiki/ADT/v18#Rendering is:
Code:
finalColor = tex0 * (1.0 - (alpha1 + alpha2 + alpha3)) + tex1 * alpha1 + tex2 * alpha2 + tex3 * alpha3
but I don't know how to send the alpha values corresponding to my vertices to the shader.

Any hints, literature, code examples, etc. are very much appreciated.

Kind regards, and thank you for reading.
Title: Re: Questions regarding Alphamap and rendering texture layers
Post by: schlumpf on October 02, 2017, 10:34:25 pm
As written in the first note in that section, prefer using https://wowdev.wiki/ADT/v18#legion_terrain_shader_excerpt

Yes, the alpha map will be of a worse resolution than the textures used. I don't know what interpolation is usually used for the alpha map. The fact that it is so small is what has resulted in _h textures and shit, which greatly mitigate this limitation.

Please note that you should never think about pixels in shaders to begin with. Yes, the resolution will not be perfect, but all you do is upload the MCAL layers as samplers alphaX, and the layer textures as samplers texX. The entire remainder will be done by the GPU for you. You don't manually map the 64x64 to the (16*8)x(16*8) of a chunk.
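
For example (just a sketch: it assumes you have already decoded one MCAL layer to 64x64 8-bit values in a buffer, here called decodedMcal):
Code:
// Upload one decoded 64x64 MCAL layer as a single-channel texture.
// GL_LINEAR filtering is what makes the GPU interpolate the low-res
// alpha across the chunk for free; clamping avoids bleeding across
// chunk borders.
GLuint alphaTex;
glGenTextures(1, &alphaTex);
glBindTexture(GL_TEXTURE_2D, alphaTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, 64, 64, 0,
             GL_RED, GL_UNSIGNED_BYTE, decodedMcal);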

You just do texcoords on the sampler. As long as you correctly have that *8 (or take texture scaling into account) for the texcoords of the actual texture, and *1 on the texcoords of the alphamap, you're fine.
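
In shader terms, something like this (a sketch of the old four-layer formula from the wiki, not the real client shader; the sampler names are made up), with the GLSL embedded as a C++ raw string:
Code:
// Fragment shader source (GLSL 330) as a raw string.
const char* terrainFrag = R"glsl(
#version 330 core
in vec2 uv;                                // 0..1 across the chunk
uniform sampler2D tex0, tex1, tex2, tex3;  // layer textures (GL_REPEAT)
uniform sampler2D alpha1, alpha2, alpha3;  // decoded MCAL layers 1..3
out vec4 fragColor;

void main()
{
    vec2 uvDetail = uv * 8.0;          // *8: the detail texture repeats per chunk
    float a1 = texture(alpha1, uv).r;  // *1: the alphamap spans the whole chunk
    float a2 = texture(alpha2, uv).r;
    float a3 = texture(alpha3, uv).r;
    fragColor = texture(tex0, uvDetail) * (1.0 - (a1 + a2 + a3))
              + texture(tex1, uvDetail) * a1
              + texture(tex2, uvDetail) * a2
              + texture(tex3, uvDetail) * a3;
}
)glsl";
Bind each layer texture and each alpha texture to its own texture unit, point the samplers at them with glUniform1i, and the interpolation from 64x64 alpha texels to fragments happens inside texture() on its own.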

Sorry for rambling, I don't have code to link really since I'm a bit drunk and lazy. Alram has code that does this shit somewhere but I don't know where: https://github.com/Marlamin/WoWFormatTest/tree/master/OBJExporterUI

Maybe you find it somewhere. It is afaik the most accurate implementation of ADT rendering.
Title: Re: Questions regarding Alphamap and rendering texture layers
Post by: Pausenclown on October 03, 2017, 09:19:55 am
Thank you already. For whatever reason I just didn't think of using the alpha layer as a texture/2D sampler. I guess I'll get along now, since, as you said, the interpolation is done for me by the GPU.

Hope your hangover is gone already and you can enjoy our 'Tag der deutschen Einheit' (German Unity Day) ;)