[BugFix]add int8 cache dtype when using attention quantization #158
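
The title describes selecting an int8 KV-cache dtype whenever attention quantization is enabled. Below is a minimal sketch of that kind of dtype-selection logic, under stated assumptions: `CacheConfig`, `resolve_kv_cache_dtype`, and the `"w8a8"` scheme name are illustrative placeholders, not identifiers taken from the actual PR diff.

```python
# Hedged sketch of the behavior the PR title describes: with an "auto"
# cache dtype, an attention-quantized model stores its KV cache as int8;
# otherwise the model dtype is used unchanged. All names are hypothetical.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CacheConfig:
    """Hypothetical cache configuration (not the real vLLM class)."""
    cache_dtype: str = "auto"           # "auto" means: derive from context
    quantization: Optional[str] = None  # e.g. an attention-quantization scheme


def resolve_kv_cache_dtype(cfg: CacheConfig, model_dtype: str = "float16") -> str:
    """Pick the KV-cache dtype for a given config."""
    if cfg.cache_dtype != "auto":
        return cfg.cache_dtype  # an explicit user choice always wins
    if cfg.quantization is not None:
        return "int8"           # the fix: quantized attention -> int8 KV cache
    return model_dtype


if __name__ == "__main__":
    print(resolve_kv_cache_dtype(CacheConfig()))                     # float16
    print(resolve_kv_cache_dtype(CacheConfig(quantization="w8a8")))  # int8
```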

Annotations

1 error

yapf (3.12): failed Feb 22, 2025 in 6s