Right, so it's an extension to an existing pretrained model. Yeah, it turns out one of the breakthroughs earlier this year was that while training the pretrained models may be expensive, tuning is not, that is to say, you tune a LoRA, a small extension to the model, rather than the model itself.
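To make that concrete, here is a minimal sketch of the LoRA idea, assuming PyTorch; the class name `LoRALinear` and the hyperparameters (rank `r`, scaling `alpha`) are illustrative choices, not anything from the talk.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen pretrained Linear layer with a trainable low-rank update.

    Effective weight: W + (alpha / r) * B @ A, where only A and B are trained.
    """
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        # Freeze the expensive pretrained weights; they are never updated.
        for p in self.base.parameters():
            p.requires_grad = False
        # The trainable "extension": two small matrices of rank r.
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))  # starts as a no-op
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen base output plus the cheap low-rank correction.
        return self.base(x) + self.scale * (x @ self.A.T) @ self.B.T
```

The point the speaker is making falls out of the parameter counts: the frozen base layer has `in_features * out_features` weights, while the adapter adds only `r * (in_features + out_features)`, so tuning touches a tiny fraction of the model.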