It’s fine. They have a lot of clustering built in. You can try it.
Prompt inference?
They don’t return the likelihood.
Actually, they could return the likelihood, because there is a generative model inside. It has a probability: a softmax distribution. They just don’t return the likelihood.
I think if we can change the LLaMA output layer, maybe we can get the log-probabilities.
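The point above can be sketched numerically: a generative LM already computes a softmax distribution over its vocabulary at every decoding step, so per-token log-likelihoods exist internally even when the API doesn’t expose them. The logits below are made-up example values, not outputs of a real model.

```python
import math

def log_softmax(logits):
    """Convert raw output-layer logits to log-probabilities (numerically stable)."""
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - log_z for x in logits]

# Hypothetical logits for a 4-token vocabulary at one decoding step.
logits = [2.0, 0.5, -1.0, 0.1]
logps = log_softmax(logits)

# The likelihood of the sampled token (say, index 0) is just its log-prob;
# summing these over all steps gives the sequence log-likelihood.
token_logp = logps[0]
```

Tapping the model right before sampling, as discussed, would give exactly these values per generated token.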
So we label each cluster, and also use some kind of prompt to do a summary, right? Something like that.
If you only want to summarize, then that will give a very brief summary.
Oh, cool. You need to compare both.
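The two prompt styles being compared could look like the sketch below: one asks for a brief cluster label, the other for a longer summary. The templates are illustrative assumptions, not a fixed recipe.

```python
# Two prompt templates over the same cluster of documents, so the
# brief label and the fuller summary can be compared side by side.
LABEL_PROMPT = (
    "Here are documents from one cluster:\n- {docs}\n"
    "Give a short label (3-5 words) for this cluster."
)
SUMMARY_PROMPT = (
    "Here are documents from one cluster:\n- {docs}\n"
    "Write a one-paragraph summary of this cluster."
)

def make_prompts(docs):
    """Return (label_prompt, summary_prompt) for one cluster's documents."""
    joined = "\n- ".join(docs)
    return LABEL_PROMPT.format(docs=joined), SUMMARY_PROMPT.format(docs=joined)
```

Both prompts would then be sent to the same model and the outputs compared.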
One video would have all the data. It just says Nokia?
So we need to build a self-hosted one. How do we talk to it?
And we also have a locally hosted LLM. Yeah, an LLM plus Python.
It’s okay, because my LLM runs on an open stack.
I’m choosing the OpenAI-compatible interface right now.
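Talking to a locally hosted model through an OpenAI-compatible interface could look like the sketch below (for example, a vLLM or llama.cpp server exposing `/v1/chat/completions`). The host, port, and model name are illustrative assumptions.

```python
# Sketch of an OpenAI-style chat request against a local endpoint.
BASE_URL = "http://localhost:8000/v1"  # assumed local endpoint

def build_chat_request(model, messages):
    """Build an OpenAI-style chat-completions payload."""
    return {"model": model, "messages": messages}

payload = build_chat_request(
    "local-llama",  # hypothetical name of the locally served model
    [{"role": "user", "content": "Label this cluster in one short phrase."}],
)

# Sending it would look like this (commented out so the sketch does not
# require a running server):
# import json, urllib.request
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# resp = json.load(urllib.request.urlopen(req))
```

Because the interface is OpenAI-compatible, the same payload works against either the hosted API or the local server by swapping the base URL.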
All different.