1. Intro
2. TRADITIONAL SEARCH
3. OUR SEARCH STORY
4. SEARCH PROBLEM - OVERVIEW
5. ENTER DEEP LEARNING
6. EXAMPLE QUERY: SIMS GAME PC DOWNLOAD
7. LEARNING DISTRIBUTED REPRESENTATIONS OF WORDS
8. WORD2VEC DEMYSTIFIED
9. NEURAL PROBABILISTIC LANGUAGE MODELS
10. EXAMPLE: SKIP-GRAM MODEL
11. WORD VECTORS CAPTURING SEMANTIC INFORMATION
12. WORD VECTORS IN 2D
13. QUERY VECTOR FORMATION - SIMS GAME PC DOWNLOAD
14. TERM RELEVANCE
15. QUERY VECTOR INDEX
16. FINDING CLOSEST QUERIES
17. ANNOY (APPROXIMATE NEAREST NEIGHBOR MODEL)
18. ANATOMY OF ANNOY
19. STORING WORD EMBEDDINGS & QUERY-INTEGER MAPPINGS
20. RESULTS
21. CONCLUSION
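The skip-gram chapter above trains word vectors by having each word predict the words in its surrounding context window. As a toy illustration of the data-preparation step only (the window size, corpus, and function name below are made up, not taken from the talk), the (center, context) training pairs can be generated like this:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) pairs for skip-gram training:
    each word is paired with every word within `window` positions."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

tokens = "sims game pc download".split()
print(skipgram_pairs(tokens, window=1))
# → [('sims', 'game'), ('game', 'sims'), ('game', 'pc'),
#    ('pc', 'game'), ('pc', 'download'), ('download', 'pc')]
```

A real Word2Vec implementation (e.g. gensim's) then feeds such pairs into a shallow network whose learned weights become the word embeddings.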
Description:
Explore the world of query embeddings and web-scale search powered by deep learning and Python in this EuroPython Conference talk. Dive into an unsupervised deep learning system built using Python and open-source libraries like Annoy and keyvi, designed to recognize similarities between queries and their vector representations. Learn how this technology improves recall for previously unseen queries and integrates into the Cliqz browser's search stack. Discover the transition from traditional keyword-based search to deep learning and NLP techniques that represent sentences and documents as fixed-dimensional vectors in high-dimensional space. Gain insights into the architecture of query embeddings, including vector indexing, approximate nearest neighbor models, and the use of Word2Vec. Explore real-world applications, latency issues in real-time search systems, and the potential for this framework to be utilized in other low-latency systems involving vector representations.
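The pipeline described above — compose word vectors into a query vector, then retrieve the nearest known queries — can be sketched in a few lines. This is a toy illustration, not the talk's implementation: the embeddings are made-up 3-dimensional vectors, query vectors are formed by plain averaging (the talk also discusses term-relevance weighting), and the brute-force cosine search stands in for the Annoy approximate-nearest-neighbor index used at web scale.

```python
import numpy as np

# Toy word embeddings (in the real system these come from Word2Vec
# training; the dimensions and values here are invented).
word_vecs = {
    "sims":     np.array([0.9, 0.1, 0.0]),
    "game":     np.array([0.8, 0.3, 0.1]),
    "pc":       np.array([0.2, 0.9, 0.0]),
    "download": np.array([0.1, 0.8, 0.3]),
    "weather":  np.array([0.0, 0.1, 0.9]),
    "berlin":   np.array([0.1, 0.0, 0.8]),
}

def query_vector(query):
    """Form a query vector by averaging the vectors of known terms."""
    vecs = [word_vecs[w] for w in query.split() if w in word_vecs]
    return np.mean(vecs, axis=0)

def closest_queries(query, index, k=2):
    """Rank indexed queries by cosine similarity to the input query.
    Brute force here; Annoy replaces this step at web scale."""
    q = query_vector(query)
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sorted(index, key=lambda known: -cos(q, query_vector(known)))[:k]

index = ["sims game pc download", "weather berlin", "pc game download"]
print(closest_queries("sims download", index, k=2))
# → ['sims game pc download', 'pc game download']
```

Even though "sims download" was never indexed verbatim, the vector space places it next to the semantically related gaming queries — the improved recall for previously unseen queries that the talk highlights.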

Query Embeddings - Web Scale Search Powered by Deep Learning and Python

EuroPython Conference