Bloomberg has released a paper detailing the technical underpinnings of BloombergGPT, its large language model that applies AI techniques to financial data. Bloomberg has acquired or developed a large collection of proprietary and curated datasets, which it used to build the model into a purpose-built tool for financial research and analysis. Training BloombergGPT required approximately 53 days of computation on 64 servers, each equipped with eight NVIDIA A100 40GB GPUs. Bloomberg partnered with NVIDIA and Amazon Web Services to train the model, at a reported cost of more than $2.7 million.
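
For a rough sense of the scale those figures imply, the short sketch below (plain Python, not taken from Bloomberg's paper) multiplies the cited server count, GPU count, and training duration into total GPU-hours and a derived lower-bound cost per GPU-hour; the per-GPU-hour figure is an estimate inferred from the reported numbers, not a quoted cloud price.

```python
# Back-of-envelope estimate of the BloombergGPT training run,
# using only the figures cited in the article above.

TRAINING_DAYS = 53             # approximate training duration
SERVERS = 64                   # number of servers in the cluster
GPUS_PER_SERVER = 8            # NVIDIA A100 40GB GPUs per server
REPORTED_COST_USD = 2_700_000  # "more than $2.7 million" -> lower bound

total_gpus = SERVERS * GPUS_PER_SERVER          # 512 GPUs
gpu_hours = total_gpus * TRAINING_DAYS * 24     # ~651,000 GPU-hours
implied_cost_per_gpu_hour = REPORTED_COST_USD / gpu_hours  # derived, not a quoted price

print(f"GPUs used:          {total_gpus}")
print(f"Total GPU-hours:    {gpu_hours:,}")
print(f"Implied $/GPU-hour: {implied_cost_per_gpu_hour:.2f} (lower bound)")
```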
