In this article, the author argues that despite the advanced capabilities of OpenAI's GPT-4, teams building products or internal capabilities should still consider self-hosting language models. Relying on an API from a company like OpenAI limits customization and control over the model and makes the product dependent on that company, whereas self-hosting gives greater control over the model architecture, future changes, and customization. The author also notes that running smaller models can still be cost-effective for many applications, and stresses that deep understanding of, and access to, the models and code matters for effective integration. Overall, the article urges consideration of self-hosted models for the sake of long-term control and adaptability in a rapidly evolving landscape.
http://marble.onl/posts/why_host_your_own_llm.html
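
The summary above is conceptual, but as a rough illustration of the kind of setup it has in mind, here is a minimal sketch of running a smaller open-weights model locally with Hugging Face transformers. The model name, generation settings, and prompt are illustrative assumptions, not recommendations from the article.

```python
# Minimal sketch: load an open-weights model locally instead of calling a hosted API.
# MODEL_NAME is an assumed example; any open model of suitable size could be used.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "mistralai/Mistral-7B-Instruct-v0.2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, device_map="auto")

prompt = "Summarize the trade-offs of self-hosting a language model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```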