1. Introduction
2. Information Extraction
3. Semi-supervised Learning
4. Outline
5. Supervised Machine Learning
6. Estimation
7. Classification
8. Document Classification
9. Naive Bayes
10. Maximum Likelihood Estimation
11. Sum Over Data
12. Recap
13. Conditional Log-Linear Models
14. Graphical Models
15. Maximum Entropy Models
16. Gradient-Based Optimization
17. Naive Bayes vs. Maximum Entropy
18. Conditional Random Field
19. Hidden Markov Model
20. Model Framework
21. Model Structure
22. Conditional Random Field Models
23. Dependency Parsing
24. Generalized Expectations Criteria
25. KL Divergence
26. GE Estimation
27. Label Regularization
Description:
Explore probabilistic methods for classification in this comprehensive lecture by Gideon Mann from the Center for Language & Speech Processing at Johns Hopkins University. Delve into supervised machine learning techniques, covering topics such as information extraction, semi-supervised learning, and document classification. Learn about Naive Bayes, maximum likelihood estimation, and conditional log-linear models. Examine graphical models, including Maximum Entropy Models and Conditional Random Fields. Understand gradient-based optimization, hidden Markov models, and dependency parsing. Investigate advanced concepts like the Generalized Expectations Criteria, KL Divergence, and label regularization. Gain valuable insights into the theoretical foundations and practical applications of probabilistic classification methods in natural language processing and machine learning.
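As a taste of the lecture's material, here is a minimal sketch of a multinomial Naive Bayes document classifier with maximum likelihood parameter estimation, two of the topics listed above. The corpus, labels, and function names are invented for illustration and are not from the lecture itself; counts are smoothed with add-one (Laplace) smoothing to avoid zero probabilities.

```python
from collections import Counter, defaultdict
import math

def train(docs):
    """docs: list of (list_of_tokens, label) pairs.
    Collects the counts needed for ML estimates of P(label) and P(word | label)."""
    label_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        label_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return label_counts, word_counts, vocab

def predict(tokens, label_counts, word_counts, vocab):
    """Returns argmax over labels of log P(label) + sum_w log P(w | label)."""
    total_docs = sum(label_counts.values())
    best, best_lp = None, -math.inf
    for label, n in label_counts.items():
        lp = math.log(n / total_docs)  # log prior, ML estimate
        # add-one smoothed denominator for the class-conditional word distribution
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in tokens:
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Toy corpus (hypothetical data, for illustration only).
corpus = [
    (["cheap", "pills", "buy"], "spam"),
    (["meeting", "agenda", "notes"], "ham"),
    (["buy", "now", "cheap"], "spam"),
    (["notes", "from", "meeting"], "ham"),
]
model = train(corpus)
print(predict(["cheap", "buy"], *model))  # classifies as "spam"
```

The generative model here is exactly the one contrasted with maximum entropy (discriminative) models later in the outline: Naive Bayes estimates the joint P(label, words) by counting, whereas a conditional log-linear model fits P(label | words) directly by gradient-based optimization.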

Probabilistic Methods for Classification - 2009

Center for Language & Speech Processing (CLSP), JHU