(from discussion with @zeux)
A glTF file contains min/max bounds for each mesh's vertex position accessor. We should use those to set .boundingBox and .boundingSphere on the loaded geometry, so that computeBoundingBox() and computeBoundingSphere() can be skipped entirely.
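A minimal sketch of what that could look like on the loader side, assuming the glTF POSITION accessor definition (`accessorDef`, with its `min`/`max` arrays) is available next to the `BufferGeometry` being built; `applyAccessorBounds` is an illustrative name, not an existing three.js or GLTFLoader function:

```js
import * as THREE from 'three';

// Seed the geometry bounds from the accessor's declared min/max so the
// per-vertex computeBoundingBox()/computeBoundingSphere() passes are
// never needed.
function applyAccessorBounds(geometry, accessorDef) {
  if (!accessorDef.min || !accessorDef.max) return;

  const box = new THREE.Box3(
    new THREE.Vector3().fromArray(accessorDef.min),
    new THREE.Vector3().fromArray(accessorDef.max)
  );

  geometry.boundingBox = box;

  // A sphere centered on the box with radius equal to half the box
  // diagonal: conservative (possibly slightly larger than the tightest
  // sphere) but always correct for frustum culling.
  const center = new THREE.Vector3();
  box.getCenter(center);
  geometry.boundingSphere = new THREE.Sphere(
    center,
    box.min.distanceTo(box.max) / 2
  );
}
```

The box-diagonal sphere is not the minimal bounding sphere, but because it encloses the box it keeps culling correct while avoiding any per-vertex work.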
I'll take a closer look and provide further details. From memory, when I looked at this before it seemed to require core changes, but I may be wrong.
Submitted #17939 to fix this - it eliminates the guaranteed attempt to compute the bounding sphere during the first frame rendered after a glTF scene loads. Also submitted #17940 - it isn't directly related, but it's why I thought this needed core changes: my viewer demo application computes the bounding box of the entire scene to position the camera, and that goes through a path that always visits every vertex and doesn't respect morph targets.
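For context, here's a sketch of the viewer-side pattern in question - framing the camera from a whole-scene bounding box built out of each mesh's precomputed geometry.boundingBox (transformed to world space) rather than a per-vertex scan. `frameScene` and the distance heuristic are illustrative, not three.js API and not part of #17940:

```js
import * as THREE from 'three';

// Position the camera so the whole scene is visible, using only the
// per-geometry bounding boxes (cost proportional to mesh count, not
// vertex count).
function frameScene(scene, camera, controls) {
  scene.updateMatrixWorld(true);

  const sceneBox = new THREE.Box3();
  const meshBox = new THREE.Box3();

  scene.traverse((node) => {
    if (!node.isMesh) return;
    const geometry = node.geometry;
    if (geometry.boundingBox === null) geometry.computeBoundingBox();
    meshBox.copy(geometry.boundingBox).applyMatrix4(node.matrixWorld);
    sceneBox.union(meshBox);
  });

  const center = new THREE.Vector3();
  const size = new THREE.Vector3();
  sceneBox.getCenter(center);
  sceneBox.getSize(size);

  // Pull the camera back far enough to see the whole box.
  const distance = size.length() * 1.5;
  camera.position.copy(center).add(new THREE.Vector3(distance, distance, distance));
  camera.lookAt(center);
  if (controls) controls.target.copy(center);
}
```

Since this only unions already-computed boxes, it stays cheap even for multi-million-vertex scenes, provided the geometries arrive with .boundingBox set (as #17939 arranges for glTF).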
After both of these changes, the only per-vertex CPU work during glTF load is uploading the WebGL buffer and normalizing skin weights (which is skipped when the weights are quantized with gltfpack). This saves 30-100 ms on moderately complex models, and hundreds of milliseconds on multi-million-triangle models.
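For reference, the skin-weight normalization mentioned above is an O(vertices) pass along these lines - an illustration of the remaining cost, not a verbatim copy of the three.js code:

```js
import * as THREE from 'three';

// Rescale each vertex's four skin weights so they sum to 1.
function normalizeSkinWeights(skinWeightAttribute) {
  const vec = new THREE.Vector4();
  for (let i = 0; i < skinWeightAttribute.count; i++) {
    vec.fromBufferAttribute(skinWeightAttribute, i);
    const scale = 1.0 / vec.manhattanLength();
    if (scale !== Infinity) {
      vec.multiplyScalar(scale);
    } else {
      vec.set(1, 0, 0, 0); // weights sum to zero: bind fully to the first bone
    }
    skinWeightAttribute.setXYZW(i, vec.x, vec.y, vec.z, vec.w);
  }
}
```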
Courtesy of Air France (long flights are long).
Thanks @zeux and Air France! 😄