What Meta learned from Galactica, the doomed model

One year ago, Meta released a research demo called Galactica, an open-source language model for science. Trained on 48 million scientific papers, it promised to summarize academic literature, solve math problems, and more. However, after just three days, Meta took down the demo because the model produced unscientific and sometimes offensive output. Two weeks later, OpenAI released ChatGPT, which suffered from the same hallucination problem. Despite that flaw, ChatGPT has become one of the fastest-growing consumer services, with an estimated 100 million weekly users. Galactica's short life nevertheless taught Meta valuable lessons that shaped its next-generation models, including Llama.

https://venturebeat.com/ai/what-meta-learned-from-galactica-the-doomed-model-launched-two-weeks-before-chatgpt/