Most AI chatbots currently need a cloud connection for processing, and even those that can run locally have extremely high hardware requirements. So is there a lightweight chatbot that doesn't require an internet connection?
A new open-source project called MLC LLM has been launched on GitHub. It runs entirely locally without a network connection, and even older computers with graphics cards and Apple iPhones can run it.
The MLC LLM project introduction states: “MLC LLM is a general solution that allows any language model to be deployed locally on a diverse set of hardware backends and native applications, in addition to an efficient framework for everyone to further optimize model performance for their own use cases. Everything runs locally, without server support, and is accelerated by local GPUs on phones and laptops. Our mission is to empower everyone to develop, optimize, and deploy AI models locally on their devices.”
MLC LLM Project Official Demonstration
We checked the GitHub page and found that the developers of this project come from Carnegie Mellon University's Catalyst project, the SAMPL machine learning research group, the University of Washington, Shanghai Jiao Tong University, and OctoML. They also have a related project called Web LLM, which runs an AI chatbot entirely in a web browser.
MLC is short for Machine Learning Compilation.
According to tests by Tom's Hardware, the Apple iPhone 14 Pro Max and iPhone 12 Pro Max, both with 6GB of memory, successfully ran MLC LLM with an installation size of 3GB, while the Apple iPhone 11 Pro Max with 4GB of RAM could not run it.
Picture source: Tom's Hardware
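Those memory figures become plausible with some back-of-envelope arithmetic on quantized model weights. A minimal sketch, assuming a roughly 7-billion-parameter model and 3-bit weight quantization (both are illustrative guesses, not details confirmed by the article):

```python
# Rough estimate of LLM weight storage at different precisions.
# Assumption: ~7B parameters and 3-bit quantization are hypothetical
# values chosen to illustrate why a chatbot can fit in a 3GB install.

def model_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 2**30 bytes)."""
    return n_params * bits_per_weight / 8 / 2**30

fp16_gb = model_size_gb(7e9, 16)  # half-precision baseline
q3_gb = model_size_gb(7e9, 3)     # aggressive low-bit quantization

print(f"7B params @ 16-bit: {fp16_gb:.1f} GB")  # ~13.0 GB: too big for a phone
print(f"7B params @ 3-bit:  {q3_gb:.1f} GB")    # ~2.4 GB: near the 3GB install
```

Under these assumptions, an aggressively quantized model fits comfortably in 6GB of RAM alongside the OS, but leaves little headroom on a 4GB device, which is consistent with the iPhone results above.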
Don't forget to follow our Facebook group and page to stay up to date with the latest advances, news, updates, reviews, and giveaways on smartphones, tablets, gadgets, and more from the technology world of the future.