Neural structured learning methods such as Neural Graph Machines [1], Graph
Convolutional Networks [2] and their variants have successfully combined the
expressiveness of neural networks with graph structures to improve on learning
tasks. Graph Agreement Models (GAM) is a technique that can be applied to these
methods to handle the noisy nature of real-world graphs. Traditional graph-based
algorithms, such as label propagation, were designed with the underlying
assumption that the label of a node can be imputed from those of its neighboring
nodes and the edge weights. However, most real-world graphs are either noisy or
have edges that do not correspond to label agreement uniformly across the graph.
Graph Agreement Models introduce an auxiliary model that predicts the
probability of two nodes sharing the same label as a learned function of their
features. This agreement model is then used when training a node classification
model by encouraging agreement only for those pairs of nodes that it deems
likely to have the same label, thus guiding its parameters to a better local
optimum. The classification and agreement models are trained jointly in a
co-training fashion.
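
To make the mechanism concrete, below is a minimal sketch in TensorFlow of how
an agreement model can gate a pairwise consistency term in the classifier's
loss. It illustrates the idea rather than this repository's actual API; the
model architectures, the `gam_loss` helper, and constants such as
`NUM_FEATURES` and `NUM_CLASSES` are assumptions made for the example.

```python
# Illustrative sketch of the GAM idea (not this repository's API).
import tensorflow as tf

NUM_FEATURES = 128   # assumed node feature dimension
NUM_CLASSES = 7      # assumed number of label classes

# Agreement model: maps a pair of node feature vectors to P(same label).
agreement_model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Node classification model: maps node features to class probabilities.
classifier = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

def gam_loss(x_labeled, y_labeled, x_src, x_dst):
    """Supervised loss plus an agreement-weighted pairwise term.

    x_src / x_dst hold the feature vectors of the two endpoints of each
    graph edge; the agreement model decides how strongly the classifier is
    encouraged to give those endpoints similar predictions.
    """
    # Standard supervised cross-entropy on labeled nodes.
    preds = classifier(x_labeled)
    supervised = tf.keras.losses.sparse_categorical_crossentropy(y_labeled, preds)

    # Agreement probability for each edge, treated as a fixed weight here.
    pair_features = tf.concat([x_src, x_dst], axis=-1)
    agreement = tf.stop_gradient(agreement_model(pair_features))  # shape [E, 1]

    # Penalize prediction disagreement only where agreement is high.
    p_src, p_dst = classifier(x_src), classifier(x_dst)
    disagreement = tf.reduce_sum(tf.square(p_src - p_dst), axis=-1, keepdims=True)
    pairwise = agreement * disagreement

    return tf.reduce_mean(supervised) + tf.reduce_mean(pairwise)
```

In the full co-training setup, the agreement model is itself retrained in
alternation with the classifier, using node pairs whose labels (or confident
predictions) are known, rather than being held fixed as in this sketch.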