If I understand you correctly, here's what I would do first:
Your problem, more or less, is how to distribute per-triangle values among the vertices. This looks a lot like normal generation on a mesh: you first have a normal for each triangle, and then you average them at each vertex. Google "normal generation" and you'll find plenty of material, but here's the gist. For each triangle adjacent to a vertex, compute a weighting coefficient (often the angle the triangle subtends at that vertex, or the triangle's surface area, or a combination of both), then add the triangle's value (whether a normal or your "strengths"), multiplied by that weight, to the accumulated result. Normalize, and you're done.
Now you have per-vertex "strength" values that you can send to your vertex shader. The modern approach would be to use samplers (or a texture array) in the pixel shader, after slightly smoothing the blend values to give you more pleasant transitions.
So, if I understood your problem correctly:
Preprocess:
foreach vertex in mesh
    vertexvalue = 0
    normalization = 0
    foreach adjacent triangle of vertex
        angle = calculateAngleBetween3Vertices(vertex, triangle.someothervertex, triangle.theotherothervertex)
        vertexvalue += triangle.value * angle
        normalization += angle
    vertexvalue /= normalization
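The preprocessing step above can be sketched in Python. This is a minimal illustration, not engine code: the mesh representation (vertex tuples, index triples, one value per triangle) and the function names are my own assumptions.

```python
import math

def angle_at_vertex(v, a, b):
    """Angle (in radians) at vertex v inside the triangle (v, a, b)."""
    ax, ay, az = (a[i] - v[i] for i in range(3))
    bx, by, bz = (b[i] - v[i] for i in range(3))
    dot = ax * bx + ay * by + az * bz
    la = math.sqrt(ax * ax + ay * ay + az * az)
    lb = math.sqrt(bx * bx + by * by + bz * bz)
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.acos(max(-1.0, min(1.0, dot / (la * lb))))

def vertex_values(vertices, triangles, tri_values):
    """Angle-weighted average of per-triangle values at each vertex.

    vertices:   list of (x, y, z) tuples
    triangles:  list of (i, j, k) index triples into `vertices`
    tri_values: one scalar "strength" per triangle
    """
    values = [0.0] * len(vertices)
    norm = [0.0] * len(vertices)
    for (i, j, k), tv in zip(triangles, tri_values):
        # Each corner of the triangle accumulates the triangle's value,
        # weighted by the angle the triangle makes at that corner.
        for v, a, b in ((i, j, k), (j, k, i), (k, i, j)):
            w = angle_at_vertex(vertices[v], vertices[a], vertices[b])
            values[v] += tv * w
            norm[v] += w
    return [val / n if n > 0 else 0.0 for val, n in zip(values, norm)]
```

For a vertex touched by only one triangle, the weighting cancels out and the vertex simply inherits that triangle's value; the averaging only matters where several triangles with different values meet.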
Rendering time:
Pass the value(s) of each vertex to the fragment shader and do this in the fragment shader:
basecolour = 0
foreach value
    basecolour = mix(basecolour, texture2D(textureSamplerForThisValue, uv), value)
Or, alternatively, take a good look at your geometry. If you have a mix of large triangles and tiny ones, your data will be unevenly distributed: since the data is per-vertex, you get more detail wherever there is more geometry. In that case, you probably want to do what everyone else does and decouple the texturing from your geometry using blend maps. These can be low resolution and shouldn't significantly increase memory consumption or shader runtime.
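To back up the memory claim with a quick sanity check (the resolution and format here are my own example numbers, not a recommendation): a blend map storing four layer weights per texel in RGBA8 is cheap even at moderate resolutions.

```python
def blendmap_bytes(width, height, channels=4, bytes_per_channel=1):
    """Uncompressed size of a blend map, e.g. RGBA8 = 4 one-byte channels."""
    return width * height * channels * bytes_per_channel

# A 256x256 RGBA8 blend map covering four texture layers:
size = blendmap_bytes(256, 256)
print(size)  # 262144 bytes = 256 KiB
```

Compared to the diffuse textures themselves (a single 2048x2048 RGBA8 texture is 16 MiB before mipmaps), a handful of low-resolution blend maps is noise in the memory budget.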