Hello everyone,
I'm currently working on rendering WoW terrain. I've come to the point where I can render the terrain with normals and holes, as shown in the picture. The holes in the map are randomly generated and are just there to test my ability to render them appropriately. I'm using OpenGL 3.3 and do everything with vertex buffer objects and one element buffer object per chunk, which holds the triangle indices (including degenerates) for every part of the terrain that should be rendered (excluding holes, obviously).
On my main PC this works quite well: I get a rendering time of 0.00003 ms for one tile.
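To make the hole handling concrete, here is a simplified version of what I do per chunk. It uses plain GL_TRIANGLES instead of the strips with degenerates I actually use, and it assumes the usual 9x9 + 8x8 interleaved chunk vertex layout; isHole is just a stand-in for my random hole mask:

```cpp
#include <cstdint>
#include <functional>
#include <iterator>
#include <vector>

// Simplified sketch: build one chunk's index buffer for GL_TRIANGLES while
// skipping holed quads. Assumes 9x9 outer + 8x8 center vertices interleaved
// row by row (17 per row pair, 145 in total); isHole(x, y) is a placeholder
// for the hole mask over the 8x8 quad grid.
std::vector<std::uint16_t> buildChunkIndices(
    const std::function<bool(int, int)>& isHole)
{
    std::vector<std::uint16_t> indices;
    for (int y = 0; y < 8; ++y)
    {
        for (int x = 0; x < 8; ++x)
        {
            if (isHole(x, y))
                continue; // emit nothing for holed quads

            // Vertex ids of one quad in the i j / k / l m layout sketched below.
            auto i = static_cast<std::uint16_t>(y * 17 + x);
            auto j = static_cast<std::uint16_t>(i + 1);
            auto k = static_cast<std::uint16_t>(y * 17 + 9 + x);
            auto l = static_cast<std::uint16_t>((y + 1) * 17 + x);
            auto m = static_cast<std::uint16_t>(l + 1);

            // Four triangles fanning around the center vertex k
            // (flip the winding if your coordinate system needs it).
            const std::uint16_t tris[] = { i, j, k,  j, m, k,  m, l, k,  l, i, k };
            indices.insert(indices.end(), std::begin(tris), std::end(tris));
        }
    }
    return indices; // upload via glBufferData(GL_ELEMENT_ARRAY_BUFFER, ...)
}
```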
Now I want to move on to rendering textures. What I've understood so far is that the texture coordinates cover 1/64th of one chunk, which in my notation corresponds to one quad consisting of the vertices
i   j
  k
l   m
of which there are 64 in one chunk of the tile. As an example I looked at one texture (ArathiHighlandsDirt; the map in the picture is from Arathi as well), which is 256 by 256 pixels. My problem is now the following: with a texture resolution of 256x256 px per quad but an alphamap resolution of only 64x64 px per chunk, how can one blend the textures properly in the shader? Is the alphamap just stored at this resolution, or is it actually used at it? In the first case I suppose some interpolation is used to scale the alphamap up to the texture resolution; in the second case, how is it done then?
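My current guess, which I would be happy to have corrected, is that the 64x64 map is simply stored at that resolution and the upscaling is left to bilinear filtering when it is sampled. In OpenGL terms that would look roughly like this; packing the up-to-three alpha layers into one RGB texture per chunk is my own choice, not something the format dictates:

```cpp
#include <glad/glad.h> // or whichever GL loader you use

// Hypothetical helper: upload one chunk's alpha layers as a single RGB
// texture, so alpha1/alpha2/alpha3 come out of one sampler as r/g/b.
// 'alphaRGB' is assumed to hold 64*64*3 bytes assembled from the MCAL data.
GLuint uploadChunkAlphaMap(const unsigned char* alphaRGB)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, 64, 64, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, alphaRGB);
    // Bilinear filtering does the upscaling: the sampler interpolates
    // between the 64x64 texels when the fragment shader reads the map.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    return tex;
}
```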
What I do understand from
https://wowdev.wiki/ADT/v18#Rendering is:
finalColor = tex0 * (1.0 - (alpha1 + alpha2 + alpha3)) + tex1 * alpha1 + tex2 * alpha2 + tex3 * alpha3
but I don't know how to send my alpha values to the shader so that they end up at the right vertices.
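My best guess so far is that the alpha values are not attached to the vertices at all: the alphamap is bound as a texture of its own, the fragment shader samples it with a chunk-relative UV (0..1 across the chunk), and the detail textures use that same UV multiplied by their repeat factor. A rough fragment shader along those lines, with names like vChunkUV and detailRepeat being my own:

```cpp
// Hypothetical fragment shader (GLSL 330) implementing the wiki formula.
// vChunkUV is assumed to run 0..1 across the chunk, so the alphamap is read
// directly with it; the detail textures are tiled by scaling the same UV.
const char* terrainFragSrc = R"glsl(
#version 330 core

in  vec2 vChunkUV;              // 0..1 over the whole chunk
out vec4 fragColor;

uniform sampler2D tex0;         // base layer
uniform sampler2D tex1;
uniform sampler2D tex2;
uniform sampler2D tex3;
uniform sampler2D alphaMap;     // 64x64, alpha1/alpha2/alpha3 packed in rgb
uniform float     detailRepeat; // e.g. 8.0 if each texture repeats once per quad

void main()
{
    vec2 detailUV = vChunkUV * detailRepeat;
    vec3 a        = texture(alphaMap, vChunkUV).rgb; // bilinearly upscaled

    vec3 c0 = texture(tex0, detailUV).rgb;
    vec3 c1 = texture(tex1, detailUV).rgb;
    vec3 c2 = texture(tex2, detailUV).rgb;
    vec3 c3 = texture(tex3, detailUV).rgb;

    // finalColor = tex0 * (1 - (a1 + a2 + a3)) + tex1*a1 + tex2*a2 + tex3*a3
    vec3 finalColor = c0 * (1.0 - (a.r + a.g + a.b))
                    + c1 * a.r + c2 * a.g + c3 * a.b;

    fragColor = vec4(finalColor, 1.0);
}
)glsl";
```

If that is right, the only blending-related data per vertex would be the chunk-relative UV (or it could even be derived from the vertex position), and the per-pixel alpha just falls out of the sampler, but I would appreciate confirmation that this is actually how it is meant to be done.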
Any hints, literature, code examples, etc. are very much appreciated.
Kind regards, and thank you for reading.