Thank you, Ms. Kwan. Your question is excellent, and you are absolutely right about the Chinook system. We are calling for an immediate halt to its use in order to conduct a thorough and detailed study of its parameters and possible racist biases. I specialize in data science myself; it is my job to create risk prediction models. Since those models are supposed to be based on humans, the notions of risk they create reflect human behaviour.
Instead of giving ourselves the opportunity to start from scratch and define the risks from the outset, we built a system that encodes human reflexes. We analyzed the human risks, we put them into the machine, and we let the machine apply them, all the while disclaiming any responsibility because the machine makes the decisions. However, we know very well what we have put into it and that the software carries those inherent risks. So we need a clear, independent study; we need to be able to know what is going on inside Chinook's black box.
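[Editor's illustration, not part of the testimony: a minimal sketch of the mechanism the witness describes, namely that a risk model trained on past human decisions can inherit their biases even when the protected attribute itself is withheld from the model. The feature names, numbers, and model choice below are purely hypothetical and are not taken from Chinook or any real system.]

```python
# Hypothetical sketch: a risk model trained on biased historical decisions
# reproduces the bias through correlated proxy features, even though the
# protected attribute is never given to the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical applicant data.
group = rng.integers(0, 2, n)               # protected attribute: 0 = group A, 1 = group B
income = rng.normal(50 + 5 * group, 10, n)  # proxy feature correlated with group
docs_ok = rng.integers(0, 2, n)             # documentation completeness (legitimate signal)

# Historical human decisions: driven partly by documentation, partly by group membership.
p_refuse = 1 / (1 + np.exp(-(-1.0 + 1.5 * (1 - docs_ok) + 1.0 * group)))
refused = rng.binomial(1, p_refuse)

# Train a "risk" model on those past decisions, with the group column deliberately excluded.
X = np.column_stack([income, docs_ok])
model = LogisticRegression().fit(X, refused)

# The model still scores the two groups differently: the biased labels and the
# correlated proxy carry the group signal into the predictions.
risk = model.predict_proba(X)[:, 1]
print("mean predicted risk, group A:", round(risk[group == 0].mean(), 3))
print("mean predicted risk, group B:", round(risk[group == 1].mean(), 3))
```

In this toy setup, the gap in predicted risk between the two groups comes entirely from what was put into the training data, which is precisely why an independent review of what sits inside such a black box is being requested.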