Bloomberg plans to integrate GPT-style A.I. into its terminal

  • Bloomberg LP has developed an AI model using the same underlying technology as OpenAI’s GPT, and plans to integrate it into features delivered through its terminal software.
  • The financial information giant says Bloomberg GPT, its internal AI model, can more accurately answer questions like “CEO of Citigroup Inc?”, assess whether headlines are bearish or bullish for investors, and even write headlines based on short blurbs.
  • The move shows how software developers in many industries beyond Silicon Valley see state-of-the-art AI like GPT as a technical advancement allowing them to automate tasks that used to require a human.

Bloomberg LP has developed an AI model using the same underlying technology as OpenAI’s GPT, and plans to integrate it into features delivered through its terminal software, a company official said in an interview with CNBC.

Bloomberg says that Bloomberg GPT, an internal AI model, can more accurately answer questions like “CEO of Citigroup Inc?”, assess whether headlines are bearish or bullish for investors, and even write headlines based on short blurbs.

Large language models trained on terabytes of text data are the hottest corner of the tech industry. Giants such as Microsoft and Google are racing to integrate the technology into their products, and artificial intelligence startups are regularly raising funds at valuations over $1 billion.

Bloomberg’s move shows how software developers in many industries beyond Silicon Valley see state-of-the-art AI like GPT as a technical advancement allowing them to automate tasks that used to require a human.

“Both the capabilities of GPT-3 and the way that it achieved its performance through language modeling wasn’t something that I expected,” said Gideon Mann, head of ML Product and Research at Bloomberg. “So when that came out, we were like, ‘OK, this is going to change the way that we do NLP here.'”

NLP stands for natural language processing, the part of machine learning that focuses on deriving meaning from words.

The move also shows how the AI market may not be dominated by giants with massive amounts of generalized data.

Building large language models is expensive, requiring access to supercomputers and millions of dollars to pay for them, and some have wondered if OpenAI and Big Tech companies would develop an insurmountable lead. In this scenario, they would be the winners, and simply sell access to their AIs to everybody else.

But Bloomberg’s GPT doesn’t use OpenAI. The company was able to use freely available, off-the-shelf AI methods and apply them to its massive store of proprietary — if niche — data.

So far, Bloomberg says its GPT shows promising results doing tasks like figuring out whether a headline is good or bad for a company’s financial outlook, changing company names to stock tickers, figuring out the important names in a document, and even answering basic business questions like who the CEO of a company is.
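To make one of those tasks concrete, here is a minimal sketch of how headline sentiment could be posed to a large language model with few-shot prompting. BloombergGPT is not publicly available, so the Hugging Face text-generation pipeline and the placeholder model name below are assumptions for illustration, not Bloomberg's actual setup.

# A minimal sketch of few-shot headline-sentiment classification with a causal LM.
# "some-causal-lm" is a placeholder; BloombergGPT itself is not publicly released.
from transformers import pipeline

generator = pipeline("text-generation", model="some-causal-lm")

FEW_SHOT_PROMPT = (
    "Classify each headline as Bullish, Bearish, or Neutral for the company named.\n\n"
    "Headline: Acme Corp beats earnings estimates, raises full-year guidance\n"
    "Sentiment: Bullish\n\n"
    "Headline: Regulators open probe into Acme Corp accounting practices\n"
    "Sentiment: Bearish\n\n"
    "Headline: {headline}\n"
    "Sentiment:"
)

def classify_headline(headline: str) -> str:
    prompt = FEW_SHOT_PROMPT.format(headline=headline)
    out = generator(prompt, max_new_tokens=5, do_sample=False)[0]["generated_text"]
    completion = out[len(prompt):].strip()  # text the model appended after the prompt
    return completion.split()[0] if completion else "Neutral"

print(classify_headline("Citigroup cuts dividend amid mounting loan losses"))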

It also can do some “generative AI” applications, like suggesting a new headline based on a short paragraph.

One example from the research paper describing the model:

Input: “The US housing market shrank in value by $2.3 trillion, or 4.9%, in the second half of 2022, according to Redfin. That’s the largest drop in percentage terms since the 2008 housing crisis, when values slumped 5.8% during the same period.”

Output: “Home Prices See Biggest Drop in 15 Years.”
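Sketched in code, prompting a model for that kind of suggestion might look like the following. The instruction wording and the text-generation pipeline are illustrative assumptions; Bloomberg has not published its prompts or released the model.

# A rough sketch of the headline-suggestion use case with a placeholder model.
from transformers import pipeline

generator = pipeline("text-generation", model="some-causal-lm")  # placeholder model

blurb = (
    "The US housing market shrank in value by $2.3 trillion, or 4.9%, in the second "
    "half of 2022, according to Redfin. That's the largest drop in percentage terms "
    "since the 2008 housing crisis, when values slumped 5.8% during the same period."
)
prompt = (
    "Write a short news headline for the following paragraph.\n\n"
    f"Paragraph: {blurb}\n\n"
    "Headline:"
)
out = generator(prompt, max_new_tokens=20, do_sample=False)[0]["generated_text"]
# Ideally something like "Home Prices See Biggest Drop in 15 Years"
print(out[len(prompt):].strip())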

How it could be used

OpenAI’s GPT is often called a “foundational” model because it wasn’t intended for a specific task.

Bloomberg’s approach is different. It was specifically trained on a large number of financial documents collected by the firm over the years to create a model that’s especially fluent in money and business.

In contrast, OpenAI’s GPT was trained on terabytes of text, the vast majority of which had nothing to do with finance.

About half of the data used to create Bloomberg’s model comes from nonfinancial sources scraped from the web, including GitHub, YouTube subtitles, and Wikipedia.

But Bloomberg also added more than 100 billion words from a proprietary dataset called FinPile, drawn from financial data the firm has accumulated over the past 20 years: securities filings, press releases, Bloomberg News stories, stories from other publications, and a web crawl focused on financial webpages.
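As a rough illustration of that composition, the corpus can be thought of as a weighted mixture of sources split roughly evenly between public and proprietary halves. The source names and the equal within-half weights below are assumptions for illustration, not Bloomberg's actual recipe.

# Illustrative only: the sources named above grouped into the rough
# "about half public, half proprietary" split the article describes.
PUBLIC_SOURCES = ["web_crawl", "github", "youtube_subtitles", "wikipedia"]
FINPILE_SOURCES = [
    "securities_filings",
    "press_releases",
    "bloomberg_news",
    "other_publications",
    "financial_web_crawl",
]

# Assumed: equal sampling weight within each half of the mixture.
TRAINING_MIX = {
    **{name: 0.5 / len(PUBLIC_SOURCES) for name in PUBLIC_SOURCES},
    **{name: 0.5 / len(FINPILE_SOURCES) for name in FINPILE_SOURCES},
}
assert abs(sum(TRAINING_MIX.values()) - 1.0) < 1e-9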

Adding domain-specific training material improved accuracy and performance on financial tasks enough that Bloomberg plans to integrate its GPT into features and services accessed through the company's Terminal product, although it is not planning a ChatGPT-style chatbot.

One early application would be to transform human language into the specific database language that Bloomberg’s software uses.

For example, it would transform “Tesla price” into “get(px_last) for(['TSLA US Equity'])”.
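A bare-bones sketch of that translation step, again with few-shot prompting and a placeholder model, might look like this. The target query string is the one quoted above; the prompt wording, the parsing, and the pipeline setup are illustrative assumptions.

# A minimal sketch of natural-language-to-query translation via few-shot prompting.
from transformers import pipeline

generator = pipeline("text-generation", model="some-causal-lm")  # placeholder model

QUERY_PROMPT = (
    "Translate the request into a Bloomberg-style data query.\n\n"
    "Request: Tesla price\n"
    "Query: get(px_last) for(['TSLA US Equity'])\n\n"
    "Request: {request}\n"
    "Query:"
)

def to_query(request: str) -> str:
    prompt = QUERY_PROMPT.format(request=request)
    out = generator(prompt, max_new_tokens=30, do_sample=False)[0]["generated_text"]
    completion = out[len(prompt):].strip()
    return completion.splitlines()[0] if completion else ""

print(to_query("Apple price"))  # hypothetically: get(px_last) for(['AAPL US Equity'])

Few-shot prompting is only one way to approach this; a production system could just as well fine-tune a model on pairs of requests and queries.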

Another possibility would be for the model to do behind-the-scenes work cleaning data and doing other errands on the application’s back end.

But Bloomberg is also looking at using artificial intelligence to power features that could help financial professionals save time and stay on top of the news.

“There’s a lot of work we’re doing to help clients address that data deluge of news stories, whether that’s through summarization, or monitoring, or being able to ask questions on those news stories or transcripts. There are a lot of applications there,” Mann said.

Source: Finance - cnbc.com
