GPT-4 is a large multimodal model that accepts text and image inputs and produces text outputs. It has been used to power chatbots and other interactive applications, but it has also raised safety concerns: early, unmitigated versions of the model could be prompted to produce harmful content, including advice on self-harm, which motivated additional safety mitigations before release.