LRMoE.jl is a Julia implementation of the Logit-weighted Reduced Mixture-of-Experts (LRMoE) model. The package is introduced in Tseung et al. (2021).
To install the stable version of the package, type the following in the Julia REPL:

```julia
] add LRMoE
```

To install the latest version, type the following in the Julia REPL:

```julia
] add https://github.com/sparktseung/LRMoE.jl
```

The full documentation is available here.
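Equivalently, installation can be scripted with Julia's built-in Pkg API instead of the REPL's `]` mode; this is standard Julia tooling rather than anything specific to LRMoE.jl:

```julia
using Pkg

# Stable (registered) release:
Pkg.add("LRMoE")

# Or the latest development version directly from GitHub:
Pkg.add(url="https://github.com/sparktseung/LRMoE.jl")

# Load the package to verify the installation succeeded.
using LRMoE
```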