that Bloomberg released reveals the considerable technical depth of its BloombergGPT machine learning model, which applies the same class of AI techniques that power GPT to financial datasets. The Bloomberg Terminal has been the trading and finance world's go-to source of financial market data for over four decades, and as a result Bloomberg has acquired or developed a large number of proprietary and curated datasets.
Machine learning algorithms learn from source data and produce a model, a process known as ‘training.’ Training the BloombergGPT model required approximately 53 days of computation on 64 servers, each containing 8 NVIDIA A100 40GB GPUs. By contrast, when we use ChatGPT we are performing ‘inference’: we provide the model an input, known as the prompt, and the model produces an output, much like feeding an input into a formula and observing the result.
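To make the prompt-to-output analogy concrete, here is a minimal inference sketch using the Hugging Face transformers library. Since BloombergGPT itself has not been publicly released, the openly available GPT-2 model stands in for it; the model name, prompt, and generation parameters are all illustrative assumptions.

```python
# Minimal sketch of LLM inference: prompt in, generated text out.
# GPT-2 is used as a stand-in; BloombergGPT is not publicly available.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # illustrative stand-in model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "The outlook for equity markets this quarter is"
inputs = tokenizer(prompt, return_tensors="pt")

# Like evaluating a formula: the prompt is mapped through the trained
# weights to produce output; sampling settings control variability.
output_ids = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that this is the cheap part: running inference on a trained model takes seconds on commodity hardware, whereas producing the model in the first place is what consumed the 53 days of multi-GPU training described above.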