One of the biggest bottlenecks in enabling AI and keeping pace with compute and storage demands in the cloud is power management, and more specifically the power density of the power converters that feed the processors and ASICs in the system. The Open Compute Project (OCP) addresses these challenges by defining new power-architecture standards, moving from the traditional 12 V intermediate bus voltage up to 48 V. Because conduction losses scale with the square of the bus current, this significantly reduces transmission losses and enables a more efficient way to deliver power to the payload (i.e., an AI ASIC, GPU, CPU, or SoC). The power levels of AI accelerator modules already exceed 750 W, with currents as high as 1000 A at a 0.75 V core voltage. With as many as eight such modules on one mainboard, the power ratings and thermal management efforts become mind-boggling.
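The benefit of the 48 V bus follows directly from Ohm's law: for a fixed payload power, quadrupling the bus voltage cuts the bus current by 4x and the I²R conduction loss by 16x. A minimal sketch of that arithmetic, using the 750 W module figure from the text and an assumed, purely illustrative distribution-path resistance of 1 mΩ:

```python
def bus_current(power_w, bus_v):
    """Current drawn from the intermediate bus for a given payload power."""
    return power_w / bus_v

def distribution_loss(power_w, bus_v, resistance_ohm):
    """I^2 * R conduction loss in the distribution path."""
    i = bus_current(power_w, bus_v)
    return i * i * resistance_ohm

P = 750.0   # W per accelerator module (from the text)
R = 0.001   # ohm, assumed path resistance (illustrative only)

loss_12v = distribution_loss(P, 12.0, R)
loss_48v = distribution_loss(P, 48.0, R)

print(f"12 V bus: {bus_current(P, 12.0):.1f} A, loss {loss_12v:.2f} W")
print(f"48 V bus: {bus_current(P, 48.0):.2f} A, loss {loss_48v:.2f} W")
print(f"loss ratio: {loss_12v / loss_48v:.0f}x")      # (48/12)^2 = 16x

# Core-rail current at the point of load, matching the 1000 A figure:
print(f"core current at 0.75 V: {P / 0.75:.0f} A")
```

The 16x ratio is independent of the assumed resistance; only the absolute loss values depend on it.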