Spain Reforms Domestic Violence System After Criticism
The Spanish government this week announced a major change to a system where police rely on an algorithm to identify potential victims of domestic violence, after officials faced questions about the system’s effectiveness.
The program, VioGén, requires the police to ask the victim a series of questions. The responses are entered into a software system that generates a score – from no risk to very high risk – aimed at flagging women who are at risk of repeat abuse. The score helps determine what police protection and other services a woman can get.
A New York Times investigation last year found that police rely heavily on the technology, almost always accepting the decisions made by the VioGén software. Some women the algorithm scored as facing negligible or low risk of further harm were later abused again, including dozens who were killed, the Times found.
Spanish officials said the changes announced this week were part of a planned review of the system, which was introduced in 2007. They say the software has helped police departments with limited resources protect vulnerable women and reduce the number of repeat assaults.
In the updated system, VioGén 2, the software can no longer label women as not at risk. Police must also include more information about the victim, which officials said will lead to more accurate predictions.
Other changes are aimed at improving cooperation between government agencies involved in cases of violence against women, including facilitating the sharing of information. In some cases, victims will receive personalized protection plans.
“Machismo is knocking on our doors and it’s doing it with violence unlike anything we’ve seen in a long time,” said Ana Redondo, the minister of equality, at a press conference on Wednesday. “It is not time to back down. It’s time to move on.”
Spain’s use of an algorithm to guide the treatment of domestic violence is a far-reaching example of how governments are turning to algorithms to make important public decisions, a trend that is expected to grow with the use of artificial intelligence. The program has been held up as a potential model for governments elsewhere that are trying to combat violence against women.
VioGén was created with the belief that an algorithm based on a mathematical model could serve as an unbiased tool to help police find and protect women who might otherwise be missed. Yes or no questions include: Was a weapon used? Were there any economic problems? Did the attacker display controlling behavior?
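As a rough illustration only (VioGén’s actual model, question weights, and thresholds are not public), a questionnaire-based risk score of the kind described above could be sketched like this, with entirely hypothetical weights and cutoffs:

```python
# Illustrative sketch only: VioGén's real model and weights are not public.
# The yes/no indicators below are loosely based on the questions reported above;
# the weights and thresholds are invented for demonstration.

# Hypothetical weight for each "yes" answer.
WEIGHTS = {
    "weapon_used": 3,
    "economic_problems": 1,
    "controlling_behavior": 2,
}

def risk_level(answers: dict) -> str:
    """Sum the weights of 'yes' answers and map the total to a risk band."""
    score = sum(WEIGHTS[question] for question, yes in answers.items() if yes)
    # Hypothetical thresholds mapping the score to a band.
    if score == 0:
        return "no risk"
    elif score <= 1:
        return "low"
    elif score <= 3:
        return "medium"
    elif score <= 5:
        return "high"
    return "very high"

answers = {"weapon_used": True, "economic_problems": False, "controlling_behavior": True}
print(risk_level(answers))  # weapon (3) + controlling behavior (2) = 5 -> "high"
```

Under the updated VioGén 2 described above, the lowest band would no longer be assignable, since the software can no longer label women as not at risk.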
Victims considered to be at high risk received more protection, including regular surveillance of their homes, access to a shelter and police monitoring of their abusers’ movements. Those with lower scores received less help.
As of November, Spain had more than 100,000 cases of women evaluated by VioGén, with nearly 85 percent of victims described as facing a low risk of being harmed again by their abuser. Police in Spain are trained to override VioGén’s recommendations if the evidence warrants doing so, but The Times found that the risk scores were accepted about 95 percent of the time.
Victoria Rosell, a judge in Spain and a former government delegate who focused on issues of gender-based violence, said a period of “self-criticism” was needed for the government to improve VioGén. She said the system would be more accurate if it pulled information from additional government sources, including health care and education data.
Natalia Morlas, the president of Somos Más, an organization that fights for the rights of victims, said that she welcomes the changes, which she hopes will lead to a better risk assessment by the police.
“Assessing the victim’s risk is very important because it can save lives,” Ms. Morlas said. She added that it was important to keep humans involved in the system because the victim “must be treated by people, not by machines.”