Perplexity
The perplexity metric in NLP captures the degree of 'uncertainty' a model has in predicting (i.e., assigning probabilities to) text: it is the exponential of the average negative log-probability the model assigns to each token, so lower perplexity means the model is less "surprised" by the text.
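As a minimal sketch of how this is computed, the helper below (a hypothetical function, not from any particular library) takes the probabilities a model assigned to each observed token and returns the exponential of the average negative log-probability:

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability.

    token_probs: the probability the model assigned to each token
    that actually occurred in the text. Lower perplexity means the
    model was less 'surprised' by the sequence.
    """
    n = len(token_probs)
    avg_neg_log_prob = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_neg_log_prob)

# A model that assigns probability 0.25 to every token is exactly as
# uncertain as a uniform choice among 4 options, so its perplexity is ~4.
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # ≈ 4.0
```

This also illustrates the intuition behind the metric: perplexity can be read as the effective number of equally likely choices the model is weighing at each step.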