alpaca_lora_4bit/GPTQ-for-LLaMa
Latest commit: John Smith, dc036373b2, "add more scripts and adjust code for transformer branch" (2023-03-22 04:09:04 +00:00)
autograd_4bit.py        add more scripts and adjust code for transformer branch                   2023-03-22 04:09:04 +00:00
quant_cuda.cpp          add fast_4bit_matmul and auto switch 2 methods according to bottleneck    2023-03-21 08:43:07 +00:00
quant_cuda_kernel.cu    add fast_4bit_matmul and auto switch 2 methods according to bottleneck    2023-03-21 08:43:07 +00:00
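
The quant_cuda commits mention a fast_4bit_matmul path and an automatic switch between two methods "according to bottleneck". The listing alone does not show how that dispatch works; below is a minimal pure-PyTorch sketch of the general idea. The threshold value, the packing layout, and the names matmul_4bit_auto, matmul_4bit_fused, and dequantize_4bit are assumptions made for illustration, not the repository's actual API.

    import torch

    # Assumed switch point: a fused int4 kernel tends to win for small,
    # memory-bound activation batches, while dequantize-then-GEMM tends to win
    # once the matmul becomes compute-bound. The real heuristic may differ.
    SWITCH_THRESHOLD = 8

    def dequantize_4bit(qweight, scales, zeros):
        """Unpack int4 values stored 8-per-int32 into a float weight matrix.

        Layout assumed for this sketch: qweight is (in_features // 8, out_features)
        int32, scales and zeros are (out_features,) floats.
        """
        shifts = torch.arange(0, 32, 4, device=qweight.device)            # 8 nibbles per int32
        nibbles = (qweight.unsqueeze(1) >> shifts.view(1, -1, 1)) & 0xF   # (in//8, 8, out)
        w = nibbles.reshape(-1, qweight.shape[1]).to(scales.dtype)        # (in, out)
        return (w - zeros) * scales

    def matmul_4bit_fused(x, qweight, scales, zeros):
        # Stand-in for the custom CUDA kernel path; here it simply reuses the
        # dequantization so the sketch stays runnable without the extension.
        return x @ dequantize_4bit(qweight, scales, zeros)

    def matmul_4bit_auto(x, qweight, scales, zeros):
        """Choose a matmul path based on where the bottleneck is likely to be."""
        rows = x.reshape(-1, x.shape[-1]).shape[0]
        if rows <= SWITCH_THRESHOLD:
            return matmul_4bit_fused(x, qweight, scales, zeros)   # small batch: fused kernel
        w = dequantize_4bit(qweight, scales, zeros)                # large batch: plain GEMM
        return x @ w

In this sketch a quantized linear layer would call matmul_4bit_auto once per forward pass on a (batch, seq, in_features) activation; the actual repository routes the small-batch case through the compiled quant_cuda extension rather than pure PyTorch.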