Question

Indyk and Naor introduced embeddings that preserve this algorithm’s output for sets with a low doubling constant or low aspect ratio. A “condensed” version of this algorithm that uses prototypes to reduce the size of the dataset was developed by Peter Hart. (r1, r2, p1, p2)-sensitive families of functions were originally introduced to perform this algorithm using locality-sensitive hashing. Cover and Hart showed that the simplest version of this algorithm has error bounded by two times the Bayes error rate. Usage of a (*) k-d tree allows single queries in this algorithm to be computed in O(log n) time. When used for classification, this algorithm’s namesake parameter is often chosen to be odd to avoid ties. For 10 points, distance-based or simple majority voting can be used in what classification algorithm that examines close samples to an input? ■END■

ANSWER: k-nearest neighbors [or k-NN or k-nearest neighbors classification or k-nearest neighbors regression; accept approximate k-nearest neighbors; accept condensed nearest neighbors; accept 1-NN]
<Science - Other Science - Math>
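The classifier the question describes, k-nearest neighbors with majority voting, can be sketched minimally in Python. The helper `knn_classify` and the toy dataset below are illustrative only, not taken from this page; a brute-force linear scan is used rather than the k-d tree the question mentions.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; points are equal-length tuples.
    """
    # Sort training points by Euclidean distance to the query (brute force).
    by_dist = sorted(train, key=lambda pl: math.dist(pl[0], query))
    # Majority vote among the k closest labels; an odd k avoids
    # two-class ties, as the question notes.
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Toy data: three "a" points near the origin, two "b" points near (5, 5).
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b")]
print(knn_classify(train, (0.5, 0.5), k=3))  # nearest three are all "a"
```

With k=1 this reduces to the 1-NN classifier whose error Cover and Hart bounded by twice the Bayes error rate; a k-d tree would replace the `sorted` scan to reach the O(log n) per-query time the question cites.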


Buzzes

Player | Team | Opponent | Buzz Position | Value
Sam Moore | Simple Vibes | Grzegorz Brzęczyszczykiewicz | 101 | -5
Omer Keskin | Cien Años de Quizboledad | Cambridge | 107 | -5
Michael Wu | Grzegorz Brzęczyszczykiewicz | Simple Vibes | 137 | 0
Oscar Siddle | Limp Franceskit | Defying Suavity | 137 | 0
Daoud Jackson | Defying Suavity | Limp Franceskit | 137 | 0

Summary

2024 ARGOS @ Stanford | 02/22/2025 | Y | 3 | 100% | 0% | 67% | 135.00
2024 ARGOS @ Chicago | 11/23/2024 | Y | 5 | 100% | 20% | 0% | 116.40
2024 ARGOS @ Columbia | 11/23/2024 | Y | 3 | 33% | 0% | 0% | 117.00
2024 ARGOS @ Christ's College | 12/14/2024 | Y | 3 | 0% | 0% | 67% | 0.00