
AMD is in a celebratory mood after AI research firm Zyphra successfully trained ZAYA1, its large-scale Mixture-of-Experts (MoE) model, entirely on AMD's accelerated computing platform: Instinct MI300X GPUs, Pensando Pollara 400 networking hardware, and the ROCm software stack.
What are MoEs, exactly? You can think of an MoE as a neural network split into many specialized sub-networks, or "experts," with a lightweight router that sends each token to only a small subset of them. The result is a model whose total parameter count can grow far beyond what any single token actually activates, boosting capacity without a matching rise in per-token compute.
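For readers who want something concrete, the sketch below shows the core idea in PyTorch: a router scores each token, the top-k experts process it, and their outputs are blended. The `NaiveMoE` class, its dimensions, and the expert count are illustrative assumptions for this article, not ZAYA1's actual architecture or hyperparameters.

```python
# A minimal top-k Mixture-of-Experts layer. Illustrative only; not ZAYA1's design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NaiveMoE(nn.Module):
    def __init__(self, d_model: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against every expert.
        self.router = nn.Linear(d_model, n_experts)
        # Experts: small independent feed-forward networks.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Flatten (batch, seq, d_model) into a list of tokens.
        tokens = x.reshape(-1, x.size(-1))
        logits = self.router(tokens)                       # (n_tokens, n_experts)
        weights, chosen = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)               # blend weights over chosen experts

        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            # Which tokens routed to expert i, and in which top-k slot.
            token_idx, slot_idx = (chosen == i).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue  # expert i received no tokens this step
            out[token_idx] += weights[token_idx, slot_idx].unsqueeze(-1) * expert(tokens[token_idx])
        return out.reshape_as(x)

layer = NaiveMoE(d_model=64)
y = layer(torch.randn(2, 16, 64))
print(y.shape)  # torch.Size([2, 16, 64])
```

Because only `top_k` of the `n_experts` feed-forward blocks run per token, adding experts grows the model's capacity while per-token compute stays roughly flat, which is the property that makes MoE designs attractive for large-scale training runs.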