Hi, I'm having a weird problem where running inference with the _fp16 version of a model results in the following error: TypeError: A bool tensor's data must be type of function Uint8Array() { [native ...
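For context, a minimal sketch of how a boolean tensor is typically fed to onnxruntime-web, assuming the error comes from how a bool input is constructed (the model path and input name below are placeholders, not from the original report): the data buffer for a 'bool' tensor must be a Uint8Array, not a plain array of booleans.

```typescript
import * as ort from 'onnxruntime-web';

async function run() {
  // Placeholder model path for illustration only.
  const session = await ort.InferenceSession.create('./model_fp16.onnx');

  // A 'bool' tensor's backing data must be a Uint8Array (0 = false, 1 = true);
  // passing a plain Array or Array<boolean> raises the TypeError quoted above.
  const mask = new ort.Tensor('bool', new Uint8Array([1, 1, 1, 1]), [1, 4]);

  // Placeholder input name for illustration only.
  const outputs = await session.run({ attention_mask: mask });
  console.log(outputs);
}

run();
```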