Preface | 1
Memory-Based Learning in Natural Language Processing | 3
Natural language processing as classification | 6
Inspirations from linguistics and artificial intelligence | 15
Inspirations from linguistics | 15
Inspirations from artificial intelligence | 21
Memory-based language processing literature | 22
Information-theoretic feature weighting | 29
Alternative feature weighting methods | 31
Getting started with TiMBL | 32
Feature weighting in TiMBL | 36
Modified value difference metric | 38
Value clustering in TiMBL | 39
Distance-weighted class voting | 42
Distance-weighted class voting in TiMBL | 44
Analyzing the output of MBLP | 45
Displaying nearest neighbors in TiMBL | 45
Experimental methodology in TiMBL | 48
Additional performance measures in TiMBL | 52
Application to morpho-phonology | 57
Memory-based word phonemization | 59
Experiments: applying IGTree to word phonemization | 69
TRIBL: trading memory for speed | 71
Feature and class encoding | 74
Experiments: MBMA on Dutch wordforms | 76
Application to shallow parsing | 85
Memory-based tagger architecture | 87
Memory-based tagging with MBT and MBTG | 90
Using MBT and MBTG for chunking | 97
Relation finder architecture | 99
Abstraction and generalization | 104
Lazy versus eager learning | 106
Benchmark language learning tasks | 107
Forgetting by rule induction is harmful in language learning | 111
Why forgetting examples can be harmful | 123
Careful abstraction in memory-based learning | 128
Getting started with FAMBL | 135
Wrapped progressive sampling | 149
The wrapped progressive sampling algorithm | 150
Getting started with wrapped progressive sampling | 152
Wrapped progressive sampling results | 154
Optimizing output sequences | 156
Combining stacking and class n-grams | 162
Bibliography | 168
Index | 186