From course:

Intro to AI 2

Question:

K-fold cross-validation

Author: Christian N



Answer:

- K determines the number of partitions (and iterations).
- It gives a more complete evaluation because all of the available data is used to evaluate the model.
- It is more costly than a single train/test split because K models need to be trained.
- Leave-one-out is the extreme case of cross-validation where K equals the number of data points.
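
Below is a minimal sketch of the procedure in Python, assuming NumPy is available. The `fit`, `predict`, and `score` callables are hypothetical stand-ins for whatever model and metric are being evaluated; they are not part of the original answer.

```python
import numpy as np

def k_fold_indices(n_samples, k, seed=0):
    """Shuffle sample indices and split them into k roughly equal folds."""
    rng = np.random.default_rng(seed)
    indices = rng.permutation(n_samples)
    return np.array_split(indices, k)

def cross_validate(X, y, k, fit, predict, score):
    """Train K models; each iteration holds out one fold for evaluation."""
    folds = k_fold_indices(len(X), k)
    fold_scores = []
    for i in range(k):
        test_idx = folds[i]
        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])
        model = fit(X[train_idx], y[train_idx])      # train on the other K-1 folds
        y_pred = predict(model, X[test_idx])         # predict on the held-out fold
        fold_scores.append(score(y[test_idx], y_pred))
    return np.mean(fold_scores)  # every sample is used for evaluation exactly once

# Example usage with a trivial mean-predictor "model" (purely illustrative):
X = np.arange(20, dtype=float).reshape(-1, 1)
y = 2 * X.ravel() + 1
mse = cross_validate(
    X, y, k=5,
    fit=lambda X_tr, y_tr: y_tr.mean(),
    predict=lambda model, X_te: np.full(len(X_te), model),
    score=lambda y_true, y_pred: np.mean((y_true - y_pred) ** 2),
)
print(mse)

# Leave-one-out corresponds to the special case k = len(X).
```

Note that each of the K models is trained from scratch, which is what makes the procedure roughly K times more expensive than a single train/test split.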

