Mixture of experts (MoE) is a machine learning technique in which multiple models, or experts, are trained to specialize in different parts of the input space, while a gating network learns to decide which expert(s) to apply to a given input and how to combine their outputs.
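
As a rough illustration, the sketch below shows a dense mixture-of-experts layer in PyTorch: a small gating network produces per-input weights over a handful of expert networks, and the layer returns the weighted combination of the experts' outputs. The names used here (`SimpleMoE`, `num_experts`, and so on) are illustrative assumptions, not taken from any particular library.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleMoE(nn.Module):
    """Minimal dense mixture-of-experts layer (illustrative sketch)."""

    def __init__(self, d_in, d_out, num_experts=4):
        super().__init__()
        # Each expert is a small model of its own; here, a single linear layer.
        self.experts = nn.ModuleList(
            [nn.Linear(d_in, d_out) for _ in range(num_experts)]
        )
        # The gating network scores how relevant each expert is to an input.
        self.gate = nn.Linear(d_in, num_experts)

    def forward(self, x):
        # Gate output: a probability distribution over experts per input.
        weights = F.softmax(self.gate(x), dim=-1)        # (batch, num_experts)
        # Run every expert and stack their outputs.
        outputs = torch.stack([e(x) for e in self.experts], dim=1)  # (batch, num_experts, d_out)
        # Combine expert outputs, weighted by the gate.
        return (weights.unsqueeze(-1) * outputs).sum(dim=1)         # (batch, d_out)

# Example usage with hypothetical dimensions
moe = SimpleMoE(d_in=16, d_out=8, num_experts=4)
y = moe(torch.randn(32, 16))  # y has shape (32, 8)
```

This sketch evaluates every expert on every input (a dense mixture); many large-scale MoE models instead route each input to only the top-scoring expert(s), so that most experts are skipped and computation stays roughly constant as the number of experts grows.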