
Cloud-hosted large language models (LLMs) such as the ones behind ChatGPT have become increasingly powerful. However, it is now also possible to run an AI chatbot on your own laptop or desktop, depending on the system’s capabilities. This is useful for anyone who wants to fine-tune a tool on their own data, keep conversations private and offline, or explore AI models without the restrictions of a hosted service. Local LLM tools are well optimized for Nvidia graphics cards and Macs with Apple M-series processors, and some small models can even run on a Raspberry Pi. Platforms like Hugging Face and communities like Reddit’s LocalLLaMA have made open-source models freely available, and tools like Oobabooga’s Text Generation WebUI provide easy-to-use interfaces. While local LLMs will not be as fast as cloud-server platforms, they are still a viable option for those who are curious.
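As a minimal sketch of what "running an LLM locally" looks like in practice, the Python snippet below downloads a small open model from Hugging Face and generates text on your own machine. The model name (TinyLlama/TinyLlama-1.1B-Chat-v1.0) is only an example chosen for its small size; any open model your hardware can hold would work, and the first run needs an internet connection to fetch the weights.

```python
# pip install transformers torch
from transformers import pipeline

# TinyLlama is used here purely as an example of a small open chat model;
# swap in any Hugging Face model that fits your laptop's memory.
generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

prompt = "Explain in one sentence why someone might run an LLM locally."
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```

Graphical front ends such as Text Generation WebUI wrap this same idea in a point-and-click interface, handling model downloads and chat formatting for you.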