1. Introduction
2. Research interests
3. Natural language processing
4. Next generation semantic web
5. Example
6. Semantic Representation
7. Grammar Formalism
8. Grammar Formalism Architecture
9. Semantic Representations
10. Formalism
11. Semantics
12. Constraint-Based Grammar
13. Learnability Theorem
14. Inductive Logic Programming
15. Background Knowledge
16. Representation Lattice
17. Grammar Approximation
18. Different Types of Algorithms
19. Concrete Example
20. Approximation
21. Problem formulation
22. Concept identity
23. Qualitative evaluation
24. Advantages
25. Conclusion
26. Future Directions
27. Automatic Population of Knowledge
28. Machine Translation
Description:
Explore a comprehensive lecture on constraint-based grammars and their learning from representative data, presented by Smaranda Muresan at the Center for Language & Speech Processing (CLSP), Johns Hopkins University, in 2008. Delve into the integration of syntax, semantics, and learning in natural language processing systems, focusing on a new grammar formalism called Lexicalized Well-Founded Grammar (LWFG). Discover how this approach combines deep language understanding with scalability, utilizing compositional and ontology constraints to access meaning during parsing. Learn about the unique search space for grammar learning, represented as a complete grammar lattice, and its implications for solution uniqueness. Examine the practical applications of LWFG in populating terminological knowledge bases from medical texts. Gain insights into Muresan's research interests, including language learning and understanding, machine translation, and relational learning, as well as her vision for the next generation semantic web.

Learning Constraint-Based Grammars from Representative Data - 2008

Center for Language & Speech Processing (CLSP), JHU