Let's begin implementing the k-Nearest Neighbors algorithm. Define the `distance` function, which takes two arguments: an array of numerical features and another array of numerical features. The function should return the Euclidean distance between the two arrays. Euclidean distance is often referred to as the straight-line distance formula that you may have learned previously.

```python
def distance(arr1, arr2):
    # Euclidean (straight-line) distance between two feature arrays
    return np.sqrt(np.sum((arr1 - arr2) ** 2))

# Don't change/delete the code below in this cell
distance_example = distance(make_array(1, 2, 3), make_array(4, 5, 6))
distance_example
```

Splitting the dataset

We'll do two different kinds of things with the coordinates dataset:

1. We'll build a classifier using coordinates for which we know the associated label; this will teach it to recognize labels of similar coordinate values. This process is known as training.
2. We'll evaluate, or test, the accuracy of the classifier we build on data it hasn't seen before.

For reasons discussed in lecture and the textbook, we want to use separate datasets for these two purposes, so we split our one dataset into two.
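The split described above can be sketched by shuffling row indices and cutting them at a chosen point. This is a minimal NumPy illustration, not the notebook's own code: the dataset size (100 rows) and the 80/20 ratio are hypothetical, and the actual assignment may instead use helpers from the datascience library.

```python
import numpy as np

rng = np.random.default_rng(42)       # fixed seed so the split is reproducible
n = 100                               # hypothetical number of rows in the dataset
indices = rng.permutation(n)          # shuffle the row indices before splitting

split = int(0.8 * n)                  # illustrative 80% train / 20% test split
train_idx = indices[:split]
test_idx = indices[split:]

print(len(train_idx), len(test_idx))  # 80 20
```

Because the indices are shuffled before cutting, both parts are random samples of the original rows, and no row appears in both the training and the test set.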