Hello!

I've been looking at using Spark for a project, but I'm running into an issue. When I pass SH3 data to Spark, I get splotches of green or red in areas of my model with bright specular highlights.
My SH3 coefficients are very small: most have an absolute value well below 0.017, so they quantize to 0 when packed into a six-bit signed integer. I noticed that I can pass a splatEncoding object to the PackedSplats constructor (https://github.com/sparkjsdev/spark/blob/main/src/PackedSplats.ts#L87), and the dyno code can then, I think, pull those values when reading spherical harmonics. So I normalized my spherical harmonic coefficients and passed the normalization factors to Spark in a SplatEncoding object... and the problem got worse.
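To illustrate what I mean by the coefficients quantizing to zero, here's a sketch of the kind of packing I assume is happening (this is my own illustration, not Spark's actual code; `packSigned6` and the scale constant are hypothetical):

```typescript
// Hypothetical 6-bit signed quantization: map a value in [-1, 1]
// to an integer in [-32, 31].
function packSigned6(x: number): number {
  const q = Math.round(x * 31);
  return Math.max(-32, Math.min(31, q));
}

function unpackSigned6(q: number): number {
  return q / 31;
}

// A coefficient of 0.015 rounds to 0 and is lost entirely:
console.log(packSigned6(0.015)); // 0

// If I first normalize by my max absolute value (~0.017), the same
// coefficient maps to a usable integer:
console.log(packSigned6(0.015 / 0.017)); // 27
```

This is why I was hoping the SplatEncoding scale factors would let me reclaim that precision.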
Some debugging in the Chrome devtools suggests that the splat encoding object is being overwritten after the PackedSplats constructor runs, but I'm struggling to find where in the Spark code that happens or how to stop it.
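One generic trick I've been using to catch the overwrite (plain JS/TS, nothing Spark-specific; the field name below is made up for illustration) is to freeze the object before handing it over, so any later write throws in strict mode and the stack trace points at the mutation site:

```typescript
// Freeze the encoding object before passing it to the constructor.
// In strict mode (i.e. any ES module), a later property assignment
// throws a TypeError at the offending line instead of silently
// clobbering the value.
const splatEncoding = Object.freeze({
  sh3Scale: 1 / 0.017, // hypothetical field name, for illustration only
});

console.log(Object.isFrozen(splatEncoding)); // true
// new PackedSplats({ splatEncoding, ... }) would now throw if it
// (or anything downstream) tries to overwrite a property.
```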
So, enough rambling. Some specific questions:
- Can I somehow normalize my spherical harmonic coefficients and tell Spark about the normalization factors, in order to get better precision?
- Is there another way I might get better precision in SH3? I've seen a draft PR, Interleave spherical harmonics coefficients #146, that would make all the packed spherical harmonics use eight bits. That would help, but for my particular assets the problem may still exist.
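For context on the first question, this is roughly the pre-processing I tried before handing the data to Spark (my own code, and I'm not sure the scale factor is being consumed the way I expect; `normalizeSh` is my helper, not a Spark API):

```typescript
// Normalize an array of SH coefficients to [-1, 1] and return the
// factor the renderer would need to multiply back after decoding.
function normalizeSh(coeffs: Float32Array): {
  coeffs: Float32Array;
  scale: number; // multiply decoded values by this to undo the normalization
} {
  let maxAbs = 0;
  for (const c of coeffs) maxAbs = Math.max(maxAbs, Math.abs(c));
  const scale = maxAbs > 0 ? maxAbs : 1;
  const out = new Float32Array(coeffs.length);
  for (let i = 0; i < coeffs.length; i++) out[i] = coeffs[i] / scale;
  return { coeffs: out, scale };
}

// Example: tiny coefficients now span the full quantization range.
const { coeffs, scale } = normalizeSh(new Float32Array([0.01, -0.02]));
console.log(coeffs); // Float32Array [0.5, -1]
console.log(scale);
```

My understanding was that `scale` is what I should be communicating via SplatEncoding so the shader can undo the normalization, but given the overwriting behavior above I may be holding it wrong.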
Thanks in advance for any help y'all can provide!