Model output is different when deployed to Inference Endpoint
by randallgann
Hi, I'm trying to understand why the model output would be different when deployed to an Inference Endpoint using the default settings. I'm using the same example prompt provided in the Colab notebook, but with the Inference Endpoint deployment the output is:
```
{
  Get {
    HistoricalEvent(
      bm25: {
```
Any insight into what I may be doing wrong when deploying the model to an Inference Endpoint would be much appreciated.
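For context, here is a minimal sketch of how I'm querying the endpoint. The URL, token, and prompt below are placeholders; I'm not passing a `parameters` block, so the endpoint's default generation settings apply, which is my best guess at where the discrepancy with the notebook could come from:

```python
import requests

# Placeholders: substitute the real endpoint URL and HF token.
API_URL = "https://<endpoint-id>.endpoints.huggingface.cloud"
HF_TOKEN = "hf_..."

headers = {
    "Authorization": f"Bearer {HF_TOKEN}",
    "Content-Type": "application/json",
}

payload = {
    # Same example prompt as in the Colab notebook (placeholder here).
    "inputs": "...",
    # No "parameters" key, so the endpoint's defaults
    # (e.g. max_new_tokens, temperature) are used; these may
    # differ from the generation settings the notebook uses.
}

response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())
```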