>RolePlayBoT for RP (Smut story writing version coming soon, Stupid Sexy Llama)
>Pyggy Guide/Resources (w/ Kobold/Tavern guide)
The Previous Thread Before the Last Thread: >92473453

pip3 install torch torchvision torchaudio
conda install -c conda-forge cudatoolkit-dev
maybe more. I think I'm up to install number 7 or 8 now. Tell me if I should give a full list of all the installs I've tried so far.

── llama-7b-4bit-HF-128 -> /mnt/y/ML/0-MODELS/2-TEXT/pygmalion/LLaMA/LLaMA-HF-4bit-128g/llama-7b-4bit-128g
── alpaca7B -> /mnt/y/ML/0-MODELS/2-TEXT/pygmalion/LLaMA/v2/ggml-alpaca-7b-native-q4.bin

A user named slaren wrote code to load the model in faster with mmap (>92490260 and >92490277) and put it in a PR.
>Interestingly, slaren mentioned jart was a co-author in the PR
Slaren confirms he wrote all that code here: >92490820
Someone claiming to be slaren commented here confirming all this: >92491452

"To understand why I did that, you have to go back here: Jart initially created an implementation of mmap a couple of weeks back that was an abomination that relied on doing things like replacing malloc. Completely unworkable in a real code base. Later on, after spending some time on the llama.cpp code for other reasons, I realized that this would actually be trivial to implement properly. This only worked with 7B models, but later on I realized that with PR #545 this could be extended to work with any model without any other changes. All it required was changing maybe 10-20 lines of code, and it didn't break anything. So anyway, I joined jart's discord and talked to her about this a bit; she seemed to be interested in collaborating, and that's why I added her as co-author, even though she didn't write a line of code of the PR. Eventually, out of nowhere, she opened the PR that you all know and asked me to close mine."

Jart convinced ggerganov to dump the version system and put in a magic number with her initials in it (>92490840), which he admits was a mistake.
Jart starts taking all the credit: "my changes", "my work", "I just wrote", "I'm the author" (>92490106, >92491110). She says it was "written in collaboration with part of the changes". Jart claims the model is a "sparse graph" and uses 10x less memory too.
Jart then closes the PR and makes a new one here.
Someone screenshot all this and put it in a collage for future bakers.