Tokyo Institute of Technology, Tohoku University, Fujitsu Limited, and RIKEN are collaborating on the research and development of distributed training of large language models (LLMs) on the supercomputer Fugaku. The initiative aims to improve the environment for creating LLMs that can be widely used by academia and industry, to strengthen Japan's AI research capabilities, and to increase the value of using Fugaku in both academic and industrial fields. The technology developed through this initiative will allow the organizations to efficiently train large language models in Fugaku's large-scale parallel computing environment.
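The announcement does not describe the software stack used on Fugaku, but the general idea of distributed (data-parallel) training can be illustrated with a minimal sketch. The example below uses PyTorch's DistributedDataParallel with a stand-in model; all names, the backend choice, and the training setup are assumptions for illustration only and do not reflect the actual system developed by the collaborators.

```python
# Hypothetical sketch of data-parallel training with PyTorch DDP.
# Launch with a distributed launcher (e.g. torchrun --nproc_per_node=4 train.py);
# none of this reflects the actual Fugaku software stack.
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # One process per node/device; rank and world size come from the launcher.
    dist.init_process_group(backend="gloo")
    rank = dist.get_rank()

    # Stand-in for a transformer language model; any torch.nn.Module works.
    model = torch.nn.Linear(1024, 1024)
    ddp_model = DDP(model)

    optimizer = torch.optim.AdamW(ddp_model.parameters(), lr=1e-4)
    loss_fn = torch.nn.MSELoss()

    for step in range(10):
        # Each rank processes its own shard of the data; gradients are
        # averaged across all ranks automatically during backward().
        inputs = torch.randn(8, 1024)
        targets = torch.randn(8, 1024)
        optimizer.zero_grad()
        loss = loss_fn(ddp_model(inputs), targets)
        loss.backward()
        optimizer.step()
        if rank == 0:
            print(f"step {step}: loss {loss.item():.4f}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

In this scheme each process holds a full copy of the model and trains on a different slice of the data, with gradients synchronized at every step; training very large models on a machine like Fugaku typically combines this with further parallelism strategies not shown here.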