“Stores” of these objects are commonly searched using the HNSW algorithm by providers such as Pinecone and Chroma. Stanford’s “global” algorithm for creating these objects competes with a Google algorithm that can use either the CBOW or skip-gram architecture to transform words into these objects. These objects name a technique whose “soft” variant uses the hinge loss; that technique can use the (*) kernel trick to solve nonlinear problems. In the transformer architecture, tokenized data such as text is turned into “embeddings” of these objects. A linear classification technique seeks to maximize the margin between classes with a hyperplane defined by “support” types of these objects. For 10 points, name these rank-one tensors that can be “multiplied” using the dot product. ■END■
ANSWER: vectors [accept vector databases or vector stores or support vector machines or vector embeddings or vector representations or global vectors; prompt on SVMs; prompt on GloVe or word2vec; prompt on tensors until read]
<Steven Yuan, Other Science>
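Editor's note: the giveaway's dot product is easy to make concrete. A minimal sketch in plain Python (the function names dot and cosine_similarity are illustrative choices, not from the question; cosine similarity is the dot-product-based score that vector stores like those in the opening clue typically rank results by):

import math

def dot(u, v):
    # Sum of elementwise products; defined for equal-length vectors.
    assert len(u) == len(v)
    return sum(a * b for a, b in zip(u, v))

def cosine_similarity(u, v):
    # Dot product normalized by the two magnitudes; ranges over [-1, 1].
    return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

u = [1.0, 2.0, 3.0]
v = [4.0, 5.0, 6.0]
print(dot(u, v))                # 32.0
print(cosine_similarity(u, v))  # about 0.9746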