Virtually every machine learning algorithm has been applied to WSD (word sense disambiguation), along with associated techniques such as feature selection, parameter optimization, and ensemble learning.
The Impact of Diversity on On-line Ensemble Learning in the Presence of Concept Drift.
He is best known for his work on the AdaBoost algorithm, an ensemble learning method that combines many "weak" learners to create a more robust one.
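The combining scheme described above can be sketched in a few lines. This is a hypothetical toy implementation of AdaBoost with 1-D threshold stumps as the weak learners, not Freund and Schapire's exact code; the function names and the tiny dataset are illustrative assumptions.

```python
import math

def train_adaboost(X, y, n_rounds=5):
    """Toy AdaBoost sketch: 1-D inputs, labels in {+1, -1},
    decision stumps as weak learners (illustrative only)."""
    n = len(X)
    w = [1.0 / n] * n                       # start with uniform sample weights
    ensemble = []                           # list of (alpha, threshold, polarity)
    for _ in range(n_rounds):
        best = None
        # exhaustively search stumps: predict +1 iff polarity*x > polarity*t
        for t in X:
            for polarity in (1, -1):
                preds = [1 if polarity * x > polarity * t else -1 for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, t, polarity, preds)
        err, t, polarity, preds = best
        err = max(err, 1e-10)               # avoid division by zero
        alpha = 0.5 * math.log((1 - err) / err)   # weight of this weak learner
        ensemble.append((alpha, t, polarity))
        # re-weight: misclassified examples get boosted for the next round
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    # weighted vote of all stumps: the "strong" classifier
    score = sum(a * (1 if pol * x > pol * t else -1) for a, t, pol in ensemble)
    return 1 if score > 0 else -1
```

The key AdaBoost ideas are all present: each round fits the best weak learner on the *weighted* data, assigns it a vote `alpha` based on its weighted error, and up-weights the examples it got wrong.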
See 'Ensemble Learning and Evidence Maximization' (nips.ps.gz, abstract), and the reference to Neal and Hinton therein, for the concepts needed to confirm this assertion.
Widespread incorrect usage, together with the availability of alternatives such as ensemble learning, leaving all variables in the model, or using expert judgement to identify relevant variables, has led to calls to avoid stepwise model selection entirely.
Ensemble learning methods such as Random Forests help to overcome a common criticism of these methods, their vulnerability to overfitting the data, by training multiple models on varied subsets of the data or features and combining their outputs.
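The mechanism behind this is bootstrap aggregation (bagging): each model sees a different resample of the data, so their individual overfitting errors tend to average out. A minimal stdlib-only sketch of that core idea, using 1-D threshold stumps as a stand-in for the decision trees a real Random Forest would grow (the function names and data are illustrative assumptions):

```python
import random
from collections import Counter

def _fit_stump(Xb, yb):
    # pick the threshold t minimizing training error (predict 1 iff x > t)
    return min(
        (sum((1 if x > t else 0) != lab for x, lab in zip(Xb, yb)), t)
        for t in Xb
    )[1]

def bagged_stumps(X, y, n_estimators=25, seed=0):
    """Toy bagging ensemble (the aggregation idea behind Random Forests).
    Labels are 0/1; each stump is trained on a bootstrap resample."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_estimators):
        # bootstrap sample: draw len(X) points with replacement
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        stumps.append(_fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return stumps

def vote(stumps, x):
    # majority vote over the ensemble's individual predictions
    preds = [1 if x > t else 0 for t in stumps]
    return Counter(preds).most_common(1)[0][0]
```

A real Random Forest additionally randomizes the features considered at each tree split; the sketch keeps only the resample-and-vote skeleton.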
Cascading is a particular case of ensemble learning based on the concatenation of several classifiers, using all information collected from the output from a given classifier as additional information for the next classifier in the cascade.
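The defining trait of a cascade, as described above, is that each downstream classifier receives the upstream classifier's output as extra input. A minimal sketch under assumed toy rules (the two stage functions and their linear thresholds are hypothetical, not any standard library's API):

```python
def stage1(x):
    """First classifier: crude score from the first feature (toy rule)."""
    return 1.0 if x[0] > 0.5 else 0.0

def stage2(x, s1):
    """Second classifier: sees the raw features *plus* stage 1's output,
    which is what makes this a cascade rather than plain stacking of votes."""
    return 1 if 0.6 * s1 + 0.4 * x[1] > 0.5 else 0

def cascade_predict(x):
    s1 = stage1(x)          # stage 1's output becomes an extra feature
    return stage2(x, s1)    # stage 2 combines it with the original input
```

In practical cascades (e.g. Viola-Jones face detection) early stages are also used to reject easy negatives cheaply, so later, more expensive stages only run on the hard cases.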
Ensemble learning by variational free energy minimization is a tool introduced to neural networks by Hinton and van Camp in which learning is described in terms of the optimization of an ensemble of parameter vectors.
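In standard notation (a sketch of the usual variational formulation, not necessarily the authors' exact presentation), the quantity minimized over the ensemble $Q(\theta)$ of parameter vectors is the variational free energy:

$$
F(Q) \;=\; \int Q(\theta)\,\ln\frac{Q(\theta)}{P(D,\theta)}\,d\theta
\;=\; \mathrm{KL}\!\left(Q(\theta)\,\middle\|\,P(\theta \mid D)\right) \;-\; \ln P(D).
$$

Since the KL term is non-negative, $-F(Q)$ lower-bounds the log evidence $\ln P(D)$, and minimizing $F$ over $Q$ both drives the ensemble toward the true posterior and tightens that bound, which is the link to evidence maximization.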
Uplift modeling has recently been extended and incorporated into diverse machine learning approaches, including inductive logic programming, Bayesian networks, statistical relational learning, support vector machines, survival analysis, and ensemble learning.