Mixtures of Experts, Using Gaussian Mixture Models for the Gate

This code implements a mixture of experts that uses a Gaussian mixture model for the gate.
689 Downloads
Updated 11 Nov 2014


This code implements a mixture of experts that uses a Gaussian mixture model for the gate. The main advantage of this approach is that the gate can be trained with the expectation-maximization (EM) algorithm, i.e. single-loop EM, because the Gaussian gate has closed-form update equations. Other methods use the softmax function for the gate, which does not have an analytically closed-form solution and therefore requires the generalized expectation-maximization (GEM) algorithm, or double-loop EM. The problems with GEM are that it requires extra computation and that the step size must be chosen carefully to guarantee convergence of the inner loop. I used k-means clustering for initialization, but found only a small improvement from it. If you have any questions or recommendations, contact me.
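As a rough illustration of why a Gaussian-mixture gate admits a closed-form M-step, the sketch below (not the submitted code; the variable names X, mu, Sigma, alpha, g, h are my own, chosen for illustration) computes gating probabilities from a Gaussian mixture over the inputs and then re-estimates the gate parameters from posterior responsibilities using the standard GMM update formulas, with no inner loop:

% Sketch only (assumed names, not the author's code): Gaussian-mixture gate
X = [randn(100, 2); randn(100, 2) + 3];   % toy 2-D inputs
[N, d] = size(X);
K = 2;                                    % number of experts / gate components

% k-means initialization of the gate means, as mentioned in the description
[~, mu] = kmeans(X, K);                   % K-by-d means (Statistics Toolbox)
alpha = ones(1, K) / K;                   % initial mixing weights
Sigma = repmat(eye(d), [1 1 K]);          % initial covariances

% E-step of the gate: gating probabilities g_k(x) from the Gaussian mixture
g = zeros(N, K);
for k = 1:K
    g(:, k) = alpha(k) * mvnpdf(X, mu(k, :), Sigma(:, :, k));
end
g = bsxfun(@rdivide, g, sum(g, 2));

% In a full mixture of experts the responsibilities h would also weight each
% expert's likelihood; here h = g just to show the closed-form M-step.
h = g;

% M-step of the gate: closed-form GMM updates, unlike a softmax gate
for k = 1:K
    Nk = sum(h(:, k));
    alpha(k) = Nk / N;
    mu(k, :) = h(:, k)' * X / Nk;
    Xc = bsxfun(@minus, X, mu(k, :));
    Sigma(:, :, k) = (Xc' * bsxfun(@times, Xc, h(:, k))) / Nk;
end

With a softmax gate, the corresponding M-step has no such closed form, which is what forces the inner iterative loop (GEM / double-loop EM) the description refers to.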

Cite As

Joseph Santarcangelo (2024). Mixtures of Experts, Using Gaussian Mixture Models for the Gate (https://www.mathworks.com/matlabcentral/fileexchange/48367-mixtures-of-experts-using-gaussian-mixture-models-for-the-gate), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2008a
Compatible with any release
Platform Compatibility
Windows macOS Linux
Categories
Find more on Statistics and Machine Learning Toolbox in Help Center and MATLAB Answers

Version / Published / Release Notes
1.2.0.0

Didn't upload last time.

1.1.0.0

There was an error in the first version; I also improved the documentation.

1.0.0.0