Hugging Face and Colossal-AI have been at the forefront of open-source AI development. Hugging Face recently published a blog post on integrating Transformer Reinforcement Learning (TRL) with Parameter-Efficient Fine-Tuning (PEFT), making a large language model (LLM) of roughly 20 billion parameters fine-tunable on a single 24GB consumer-grade GPU. Colossal-AI, meanwhile, has released an open-source replication of ChatGPT's training pipeline; ChatGPT itself builds on OpenAI's GPT-3 family of models, and the open-source pipeline is intended to let developers build their own conversational AI products.
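The memory savings behind PEFT come largely from low-rank adaptation (LoRA): the pretrained weights are frozen, and only a small low-rank update is trained. The sketch below is a hypothetical, dependency-free illustration of that idea, not the TRL/PEFT API; the matrix sizes and rank are illustrative assumptions.

```python
def matvec(M, v):
    """Multiply matrix M (list of rows) by vector v."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Frozen pretrained weight W (d x k) stays untouched; training only
# updates the small factors B (d x r) and A (r x k), so the effective
# weight is W + B @ A.
d, k, r = 4, 6, 2
W = [[0.1 * (i + j) for j in range(k)] for i in range(d)]
A = [[0.01] * k for _ in range(r)]
B = [[0.0] * r for _ in range(d)]   # B starts at zero: adapter is a no-op at first
x = [1.0] * k

h = matvec(A, x)                     # r-dimensional bottleneck activation
y = [wx + bx for wx, bx in zip(matvec(W, x), matvec(B, h))]
assert y == matvec(W, x)             # zero-initialised adapter changes nothing

# Parameter savings at a GPT-scale layer width (d = k = 6144, rank 8):
d, k, r = 6144, 6144, 8
print(f"full fine-tune params: {d * k:,}")       # 37,748,736
print(f"LoRA params:           {r * (d + k):,}") # 98,304 (~0.26%)
```

Because only the factors A and B receive gradients and optimizer state, the optimizer memory footprint shrinks by the same ratio as the parameter count, which is what makes a 20B-parameter model trainable on a 24GB card when combined with quantized base weights.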