Also, I’ve got a local model, LLaMA 65B, running on this MacBook Pro with 96 GB of RAM, prompted with the source code, and I can replicate quite a few of the results just locally.