Question

Indyk and Naor introduced embeddings that preserve this algorithm’s output for sets with a low doubling constant or low aspect ratio. A “condensed” version of this algorithm, which uses prototypes to reduce the size of the dataset, was developed by Peter Hart. (r1, r2, p1, p2)-sensitive families of functions were originally introduced to perform this algorithm using locality-sensitive hashing. Cover and Hart showed that the asymptotic error rate of the simplest version of this algorithm is bounded above by twice the Bayes error rate. Usage of a (*) k-d tree allows single queries in this algorithm to be computed in O(log n) time. When used for classification, this algorithm’s namesake parameter is often chosen to be odd to avoid ties. For 10 points, distance-based or simple majority voting can be used in what classification algorithm that examines close samples to an input? ■END■

ANSWER: k-nearest neighbors [or k-NN or k-nearest neighbors classification or k-nearest neighbors regression; accept approximate k-nearest neighbors; accept condensed nearest neighbors; accept 1-NN]
<Science - Other Science - Math>
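The clue structure above walks through the core mechanics of k-NN classification: find the closest training samples to a query (brute force, or in O(log n) per query with a k-d tree), then take a majority vote among the k neighbors, with k odd to avoid ties. A minimal sketch of that procedure, using only the Python standard library and made-up example data, might look like this:

```python
# Minimal k-nearest-neighbors classifier sketch (stdlib only; data is hypothetical).
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """train: list of (point, label) pairs; query: a point (tuple of floats).

    Uses a brute-force O(n) distance scan; a k-d tree would bring
    single-query cost down to O(log n), as the question notes.
    """
    by_dist = sorted(train, key=lambda pair: math.dist(pair[0], query))
    # Simple majority vote among the k nearest neighbors; k is usually
    # chosen odd for binary classification to avoid ties.
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((6, 5), "b")]
print(knn_classify(train, (0.5, 0.5), k=3))  # -> a
```

Distance-weighted voting (the "distance-based" alternative the question mentions) would replace the plain `Counter` with votes weighted by, e.g., inverse distance.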
= Average correct buzz position

Back to tossups

Buzzes

Player | Team | Opponent | Buzz Position | Value
Matt Bollinger | BHSU ReFantazio | Music to Help You Stop Smoking | 68 | 15
Dan Ni | hawk two of | WashU | 118 | 10
Henry Cafaro | The Love Song of J Alfred PrufRock and Roll All Nite (and Party Every Day) | Notre Dame | 122 | 10
Billy Busse | BHSU Rebirth | Who is the Colleen Hoover of the Zulus? | 137 | 10
Coby Tran | That Feeling When Knee Surgery Is in Five Days | Clown Squad | 137 | 10
Max Brodsky | Who is the Colleen Hoover of the Zulus? | BHSU Rebirth | 137 | 0

Summary

2024 ARGOS @ Stanford | 02/22/2025 | Y | 3 | 100% | 0% | 67% | 135.00
2024 ARGOS @ Chicago | 11/23/2024 | Y | 5 | 100% | 20% | 0% | 116.40
2024 ARGOS @ Columbia | 11/23/2024 | Y | 3 | 33% | 0% | 0% | 117.00
2024 ARGOS @ Christ's College | 12/14/2024 | Y | 3 | 0% | 0% | 67% | 0.00