
Humber River Hospital in Toronto, Canada, has implemented a Command Centre, an AI system that tracks the flow of patients from intake to discharge, helping healthcare providers make more informed decisions, improve overall efficiency, and deliver better care. However, AI systems can also produce discriminatory outcomes: Amazon's experimental recruiting algorithm penalized resumes containing the word "women's", and the COMPAS risk-assessment tool used in criminal sentencing was found to be more likely to incorrectly flag African-American defendants as high risk. To avoid such biases, AI system development must be mindful of potential sources of bias and ensure that the data used to train the system is diverse, accurate, and representative.
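To make that last recommendation concrete, the following is a minimal sketch (not drawn from the article) of how a team might audit a training set before building a model. The column names ("gender", "hired"), the toy data, and the 80% disparate-impact threshold are illustrative assumptions, not a prescribed method.

```python
# Minimal sketch of a pre-training data audit: check group representation
# and compare positive-outcome rates across groups. Column names and the
# 0.8 threshold are illustrative assumptions.
import pandas as pd


def representation_report(df: pd.DataFrame, attribute: str) -> pd.Series:
    """Share of each group in the dataset for one sensitive attribute."""
    return df[attribute].value_counts(normalize=True)


def disparate_impact(df: pd.DataFrame, attribute: str, outcome: str,
                     privileged: str, unprivileged: str) -> float:
    """Ratio of positive-outcome rates (unprivileged / privileged).
    Ratios below roughly 0.8 are commonly treated as a warning sign."""
    rates = df.groupby(attribute)[outcome].mean()
    return rates[unprivileged] / rates[privileged]


if __name__ == "__main__":
    # Toy historical hiring data, analogous to the resume-screening example.
    data = pd.DataFrame({
        "gender": ["male"] * 70 + ["female"] * 30,
        "hired":  [1] * 40 + [0] * 30 + [1] * 10 + [0] * 20,
    })
    print(representation_report(data, "gender"))
    print("Disparate impact:",
          disparate_impact(data, "gender", "hired",
                           privileged="male", unprivileged="female"))
```

On this toy data the disparate-impact ratio comes out well below 0.8, which would prompt a closer look at how the historical labels were generated before any model is trained on them.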