A report from the Consumer Financial Protection Bureau this week finds that almost everyone who interacts with a bank chatbot dislikes the experience, and most users wind up on the phone with a person in the end. The report suggests that the bots could be causing actual harm, either by violating consumer financial laws or by leaving customers without the support they need. Most financial chatbots to date are rule-based: they are programmed with pre-determined responses, triggered by keywords detected in what a user asks. Increasingly, more sophisticated chatbots use natural-language processing or machine learning to generate more original replies that sound almost human.
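To illustrate the rule-based approach described above, here is a minimal sketch of a keyword-triggered chatbot. The keywords, canned responses, and fallback behavior are invented for illustration and do not reflect any real bank's system:

```python
# Minimal sketch of a rule-based chatbot: each rule maps a set of
# keywords to a pre-determined canned response. Everything here is
# illustrative, not taken from an actual deployed system.

RULES = [
    # (keywords to look for, canned response)
    (("balance", "how much"), "Your current balance is shown under Accounts > Summary."),
    (("lost", "stolen"), "To report a lost or stolen card, call the number on the back of your statement."),
    (("fee", "charge"), "Our fee schedule is listed on the pricing page."),
]

FALLBACK = "Sorry, I didn't understand that. Connecting you to an agent..."


def reply(message: str) -> str:
    """Return the first canned response whose keywords appear in the message."""
    text = message.lower()
    for keywords, response in RULES:
        if any(word in text for word in keywords):
            return response
    # No rule matched: this is the point at which real users
    # typically end up routed to a human.
    return FALLBACK
```

Because the bot can only match literal keywords, any question outside its rule table falls through to the fallback, which is one reason so many chatbot sessions end with a human on the phone.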