All different.
I'd choose the OpenAI interface stack right now.
That's fine, because my LLM uses an OpenAI-style stack.
And we also have a locally hosted LLM. Yeah, an LLM in Python.
So we need to build a self-hosted one. How do we talk to it?
One video would have all the data. It just says "Nokia"?
Oh, cool. You need to compare both.
If you only ask it to summarize, it will give a very brief summary.
So label the cluster, and also use some kind of prompt to generate the summary, right? Something like that.
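The cluster-labeling idea could be sketched as a prompt builder. A minimal sketch, assuming the cluster's documents are plain strings; the helper name, prompt wording, and document cap are all hypothetical, and the resulting prompt would be sent to whichever chat-style LLM API is chosen:

```python
def build_cluster_prompt(cluster_docs, max_docs=20):
    """Build a labeling-plus-summary prompt for one cluster of documents.

    Hypothetical helper: the wording and max_docs cap are assumptions,
    not a fixed API.
    """
    joined = "\n".join(f"- {doc}" for doc in cluster_docs[:max_docs])
    return (
        "Below are documents from one cluster.\n"
        "1. Give the cluster a short label (3-5 words).\n"
        "2. Give a one-paragraph summary.\n\n"
        f"Documents:\n{joined}"
    )

prompt = build_cluster_prompt(["llama API question", "local LLM hosting"])
print(prompt)
```

Capping the number of documents keeps the prompt inside the model's context window; for large clusters one could instead sample representative documents.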
I think if we can change the llama output layer, maybe we can get the logits.
They just don't return the likelihood.
Actually, can they return the likelihood? Because there is a generative model inside. It has probabilities; the output is a softmax distribution.
They don’t return the likelihood.
Prompt inference?
It's fine. They have a lot of clusters and everything. You can try it.