Out-of-Distribution (OOD)

Describes the situation where a model encounters inputs that differ from the distribution it was trained on, i.e. the data is “out-of-distribution” relative to the training set.
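As a minimal sketch of the idea (my own toy example, not from any particular paper): summarize the training features, then flag test inputs that sit far outside those statistics.

```python
import numpy as np

# Toy OOD check: flag inputs whose features lie many standard
# deviations from the training distribution's per-feature mean.
train_X = np.random.randn(1000, 8)   # stand-in for training features
mu, sigma = train_X.mean(axis=0), train_X.std(axis=0)

def is_ood(x, threshold=4.0):
    z = np.abs((x - mu) / sigma)
    return z.max() > threshold

print(is_ood(np.zeros(8)))        # in-distribution  -> False
print(is_ood(np.full(8, 10.0)))   # far outside training range -> True
```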

This is a very common problem in Behavior Cloning in robotics: small per-step prediction errors compound on a real-world robot, steadily pushing the state away from anything seen during training. The ALOHA paper (Zhao et al., 2023) proposes Action Chunking with Transformers (ACT) as a fix, predicting a chunk of future actions at once so the policy is queried less often.
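A rough sketch of why chunking helps (a simplified stand-in, not the actual ACT architecture; `policy_chunk`, `CHUNK`, and the dummy environment are all hypothetical): the policy emits k actions per query, so there are k times fewer prediction points at which errors can accumulate and drift the robot off-distribution.

```python
import numpy as np

CHUNK = 8  # hypothetical chunk size k

def policy_chunk(obs):
    # Stand-in for an ACT-style policy: predicts the next k actions
    # from a single observation, rather than one action per step.
    return np.zeros((CHUNK, 4))   # dummy (k, action_dim) output

def env_step(action):
    # Dummy environment transition, just to make the loop runnable.
    return np.zeros(8)

def rollout(obs, horizon=64):
    # Querying the policy once per k steps gives per-step errors
    # fewer opportunities to compound than single-step cloning,
    # where every step is a fresh (possibly erroneous) prediction.
    t = 0
    while t < horizon:
        for action in policy_chunk(obs):
            obs = env_step(action)
            t += 1
            if t >= horizon:
                break
    return obs

rollout(np.zeros(8))
```

ACT additionally smooths execution with temporal ensembling across overlapping chunks, which the sketch above omits for brevity.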