From my personal experiments, multiple such things do stack. On Hugging Face, you can actually find models that are just a combination of five LoRA components applied in a particular order. One very recent example: somebody took Mistral 7B, diffed it against Llama 7B to get a delta, and then applied that delta to a 13B model, twice. And it magically worked. It shouldn't have worked, actually, but we don't know why it does. The model is called Amethyst.
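What the speaker is describing is delta (or "task vector") arithmetic: subtract one checkpoint's weights from another, then add that difference onto a third model. Here is a minimal sketch, assuming PyTorch state dicts; the function name, the `scale` parameter, and the shape-matching guard are my own illustration, not the actual recipe behind Amethyst:

```python
import torch

def apply_delta(base_sd: dict, tuned_sd: dict, target_sd: dict, scale: float = 1.0) -> dict:
    """Add (tuned - base) onto target, tensor by tensor."""
    out = {}
    for name, w in target_sd.items():
        if (
            name in base_sd
            and name in tuned_sd
            and base_sd[name].shape == w.shape
        ):
            # delta = fine-tuned weights minus original weights
            out[name] = w + scale * (tuned_sd[name] - base_sd[name])
        else:
            # shapes differ or tensor is missing: leave the target weight as-is
            out[name] = w
    return out
```

In practice most tensors in a 7B and a 13B model don't share shapes, which is why the guard above skips them, and part of why the speaker says this kind of cross-size transfer "shouldn't have worked."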
