Commit Graph

16 Commits

Author SHA1 Message Date
John Smith 1abdc99675 add server 2023-04-26 17:13:00 +08:00
John Smith 3b18aa1cc6 fix bug and remove bnb 2023-04-20 09:51:57 +08:00
John Smith fb7665726e Update requirements.txt 2023-04-13 14:44:59 +08:00
    Pinned commit hash
John Smith 5ff11b5bf2 Merge pull request #77 from winglian/upstream-peft 2023-04-13 10:25:05 +08:00
    use monkey patch instead of forked peft
John Smith 4261bd8070 add xformers support 2023-04-12 12:59:44 +08:00
Wing Lian c2b33bacc9 use monkey patch instead of forked peft 2023-04-09 11:40:58 -04:00
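    The commit above swaps a forked copy of peft for a monkey patch applied at import time. A minimal sketch of that general pattern, assuming a hypothetical patch target; the attribute actually replaced by the repo is not shown here.

```python
# Hedged sketch of "monkey patch instead of forked peft": import the
# upstream package and replace one attribute at startup rather than
# maintaining a fork. The patch target below (peft.tuners.lora.Linear.forward)
# is an assumed example, not necessarily what this repo patches.
import peft.tuners.lora as lora

_original_forward = lora.Linear.forward  # keep a handle on the upstream code

def patched_forward(self, x, *args, **kwargs):
    # a 4-bit-aware implementation would go here; delegate to upstream for now
    return _original_forward(self, x, *args, **kwargs)

lora.Linear.forward = patched_forward  # applied once, before any model is built
```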
John Smith dba3773b30 add triton backend support for v2 model 2023-04-07 15:34:06 +08:00
John Smith 9351f49542 merge pull request in new branch 2023-04-07 10:40:24 +08:00
yamashi 30bf938d03 Update requirements.txt 2023-04-06 13:50:25 +02:00
Andrey Glushenkov f20570343f GPTQv2 support 2023-04-06 02:29:36 +03:00
    1. Adds dependency on `triton`
    2. Refactors autograd_4bit to include both GPTQv1 and GPTQv2
    3. Introduces new environment variable GPTQ_VERSION to select autograd_4bit version
    4. Fixes triton kernels
    5. Matrix multiplications are in fp16
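    The GPTQ_VERSION environment variable named in the entry above implies a runtime switch between the two autograd_4bit code paths. A minimal sketch of such a switch, assuming the variable defaults to v1; only the variable name comes from the commit message, the function and backend names are illustrative.

```python
import os

# Hedged sketch: choose the GPTQ autograd code path from the GPTQ_VERSION
# environment variable described in the commit above. Only the variable
# name comes from the commit message; everything else is an assumption.
def select_backend() -> str:
    version = os.environ.get("GPTQ_VERSION", "1")
    if version == "2":
        # v2 path: triton kernels, fp16 matrix multiplications
        return "gptq_v2"
    # v1 path: original kernels
    return "gptq_v1"

if __name__ == "__main__":
    print(f"Using autograd_4bit backend: {select_backend()}")
```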
Wing Lian b47da33084 fixes for most recent update 2023-03-28 10:56:35 -04:00
Wing Lian 101d314bd9 add missing dependency to train with LlamaTokenizer 2023-03-27 16:13:46 -04:00
Wing Lian 62e54ac1c7 backwards support for pre-py3.10, add datasets requirement used in train 2023-03-27 16:08:20 -04:00
Star Dorminey 96440c8717 Removing submodules actually. 2023-03-25 20:20:38 -07:00
John Smith dc036373b2 add more scripts and adjust code for transformer branch 2023-03-22 04:09:04 +00:00
John Smith 551f62a0e8 add patch for gptq and peft 2023-03-18 13:31:48 +08:00