I want to create an effect similar to the wave sphere described in the Aerotwist tutorial. However, in that tutorial Paul fakes a hard-coded light in the fragment shader; instead, I want to pass the information from a three.js PointLight instance into my shaders, manipulate the vertices / normals, and then do the Phong shading myself.
My understanding of the different levels of GPU control when shading a scene in three.js looks like this (using Phong as an example):
- No GPU control: use MeshPhongMaterial and don't worry about shaders. Very simple, but it doesn't let you customize anything on the GPU side.
- Some GPU control: use the THREE.ShaderLib Phong shader. This pushes the shading calculations to the GPU, but since the shaders are prewritten, you cannot customize vertex positions, normals, or the lighting calculations.
- Full GPU control: use ShaderMaterial and write your shaders from scratch. This gives you full customization, but also means you have to explicitly supply all the attributes and uniforms your shaders need.
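To make level 3 concrete, here is roughly what I have in mind, written as plain data so it stands alone (the uniform names like `pointLightPosition` are my own invention, and the GLSL is an abridged sketch, not the full Phong math):

```javascript
// Sketch of a "level 3" setup: manually mirror a PointLight's position
// and color into uniforms for hand-written shaders. Plain objects here;
// in three.js these would feed a ShaderMaterial.

const light = { position: { x: 10, y: 20, z: 30 }, color: [1, 1, 1] };

// Uniforms I would have to supply explicitly (names are mine).
const uniforms = {
  pointLightPosition: {
    value: [light.position.x, light.position.y, light.position.z],
  },
  pointLightColor: { value: light.color },
};

// Hand-written shaders, abridged: diffuse term only, world-space lighting.
// Assumes uniform scaling so mat3(modelMatrix) is a valid normal transform.
const vertexShader = `
  uniform vec3 pointLightPosition;
  varying vec3 vNormal;
  varying vec3 vLightDir;
  void main() {
    vec4 worldPos = modelMatrix * vec4(position, 1.0);
    vNormal = mat3(modelMatrix) * normal;
    vLightDir = pointLightPosition - worldPos.xyz;
    gl_Position = projectionMatrix * viewMatrix * worldPos;
  }
`;

const fragmentShader = `
  uniform vec3 pointLightColor;
  varying vec3 vNormal;
  varying vec3 vLightDir;
  void main() {
    float diffuse = max(dot(normalize(vNormal), normalize(vLightDir)), 0.0);
    gl_FragColor = vec4(pointLightColor * diffuse, 1.0);
  }
`;
```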
Q1: Is my understanding above accurate?
Q2: Is there a way to do something between levels 2 and 3? I want to write shader code that modifies vertex positions / normals, but I don't want to write my own Phong shader when the one that ships with three.js works fine.
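The kind of thing I'm hoping is possible for Q2 is taking the stock Phong vertex shader source and splicing my own displacement into it. Sketched below with a placeholder string standing in for the library's actual Phong vertex shader, and a `waveAmplitude` uniform that is entirely my own:

```javascript
// Hypothetical "level 2.5": reuse the library's Phong shader source but
// inject a custom vertex displacement. phongVertexShader is a stand-in
// for the real prewritten Phong vertex shader source string.
const phongVertexShader = `
  void main() {
    // ...stock phong vertex code...
  }
`;

// Naive splice: declare my uniform before main(), and compute the
// displaced position at the top of main().
const customVertexShader =
  "uniform float waveAmplitude;\n" + // my own uniform, not part of the stock shader
  phongVertexShader.replace(
    "void main() {",
    "void main() {\n  vec3 displaced = position + normal * waveAmplitude;"
  );
```

I have no idea whether the library supports this kind of composition cleanly, which is exactly what I'm asking.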
Q3: If there is nothing between levels 2 and 3, then at level 3, how do I pass the PointLight's information (position / color) into my own shaders so I can implement the Phong shading myself?
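Concretely, for Q3 I imagine something like the per-frame sync below (the function name and uniform layout are my own; `light` stands in for a PointLight and `uniforms` for a ShaderMaterial's uniforms object):

```javascript
// Sketch: keep hand-written shader uniforms in sync with a moving light,
// by copying the light's position into the uniform before each render.
const light = { position: { x: 0, y: 50, z: 0 } };
const uniforms = { pointLightPosition: { value: [0, 0, 0] } };

function syncLightUniforms(light, uniforms) {
  uniforms.pointLightPosition.value[0] = light.position.x;
  uniforms.pointLightPosition.value[1] = light.position.y;
  uniforms.pointLightPosition.value[2] = light.position.z;
}

// Simulate one frame of animation, then sync.
light.position.x = 25;
syncLightUniforms(light, uniforms);
console.log(uniforms.pointLightPosition.value); // → [25, 50, 0]
```

Is this manual copying the intended pattern, or does three.js expose the scene's lights to custom shaders some other way?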
Thanks!