Recently, UN Secretary-General António Guterres called for measures to address a ‘horrifying global surge in domestic violence’ linked to lockdowns imposed by governments responding to the COVID-19 pandemic.
Refuge, the UK charity, have reported drastic increases in website visits and helpline calls.
Can tech be made resistant to such abuse?
In preparation for a paper on how tech is used for coercive control, several experts and charity representatives were brought together to discuss combating tech-facilitated abuse.
There were some stark numbers presented in the introductions. For example, in one Australian study, 98% of domestic violence support workers had clients who had experienced tech-facilitated abuse.
One in five people experiences domestic abuse in their lifetime. And whilst domestic abuse is often equated with violence, there is a growing trend of less obvious abuse - patterns of domination, where an individual intimidates, controls, humiliates and restricts another. It has been compared to being taken hostage.
Understanding technology’s role
What are some of these tech-enabled methods? Spyware, abusive messages, location tracking, mirroring and control of smart home devices. An interesting example mentioned was the connected doorbell - it is built for safety, but it also monitors when family members leave home and can thus become a vector for control.
In banking, joint cardholders can’t always both use apps for making and monitoring purchases.
Engineering a solution
IBM outlined some underlying design principles that need consideration in this area. These included:
- Diversity (in the development team and for use cases).
- Privacy and choice (through easy-to-use security settings that support active, informed choices).
- The ability to combat gaslighting (for example, by ensuring abusive parties cannot remove evidence of events and so sow doubt about patterns of behaviour - a rough sketch of this idea follows the list).
- Security and data considerations (for example, GDPR could be extended to data from car apps - some journeys may be private).
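To make the gaslighting-resistance principle concrete, here is a minimal sketch of a tamper-evident, append-only event log for a smart home device. The class names (`AppendOnlyEventLog`, `EventRecord`) and the hash-chaining approach are illustrative assumptions for this article, not part of IBM’s published principles; the point is only that entries, once recorded, cannot be silently edited or deleted.

```python
import hashlib
import json
import time
from dataclasses import dataclass
from typing import List


@dataclass(frozen=True)
class EventRecord:
    """One immutable entry in a device's activity log (hypothetical schema)."""
    timestamp: float
    device_id: str
    description: str
    prev_hash: str
    entry_hash: str


class AppendOnlyEventLog:
    """Hash-chained log: entries can be added but never edited or deleted
    without breaking the chain, so later tampering is detectable."""

    def __init__(self) -> None:
        self._entries: List[EventRecord] = []

    def append(self, device_id: str, description: str) -> EventRecord:
        # Each entry commits to the hash of the previous entry.
        prev_hash = self._entries[-1].entry_hash if self._entries else "genesis"
        timestamp = time.time()
        payload = json.dumps(
            {"t": timestamp, "d": device_id, "msg": description, "prev": prev_hash},
            sort_keys=True,
        )
        entry_hash = hashlib.sha256(payload.encode()).hexdigest()
        record = EventRecord(timestamp, device_id, description, prev_hash, entry_hash)
        self._entries.append(record)
        return record

    def verify(self) -> bool:
        """Recompute the chain; any removed or altered entry breaks it."""
        prev_hash = "genesis"
        for rec in self._entries:
            payload = json.dumps(
                {"t": rec.timestamp, "d": rec.device_id,
                 "msg": rec.description, "prev": prev_hash},
                sort_keys=True,
            )
            expected = hashlib.sha256(payload.encode()).hexdigest()
            if rec.prev_hash != prev_hash or rec.entry_hash != expected:
                return False
            prev_hash = rec.entry_hash
        return True
```

The design choice being illustrated is that even a household ‘administrator’ cannot quietly rewrite history: deleting or altering an event leaves a broken chain that `verify()` will flag, so a victim’s record of what actually happened is preserved.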
The other issue raised was that of technical ability - victims may lack the ability or the motivation to master the tech. Coercive control is about power imbalance.
Perspective 1 - IoT
Dr Leonie Tanczer from UCL’s Gender and IoT research group discussed the implications of the internet of things (IoT) for domestic abuse. She highlighted that society frequently has predefined ideas of what a perpetrator looks like and what abuse must be.
UK figures on tech abuse are sparse, but earlier this year, the UK charity Refuge identified that 72% of their service users were experiencing some form of technology-mediated coercion and control. The abuse covered is broad - ranging from excessive texting to spyware.
The current research evidence is mostly on cyberstalking, spyware, and image-based abuse (e.g. upskirting). However, emerging tech is changing the abuse landscape. There are several vectors, from smart devices in the home to connected autonomous cars. Voice control, audio and video recording, location tracking, remote control, and social media have all facilitated new ways for perpetrators to coerce and control.
With deeply personal data, such as medical information, now stored on and shared via phones, and with shared user accounts commonplace, we need to change our security design. For example, does cohabitation mean automatic approval from each participant to privacy and security settings on smart home devices?
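One possible alternative to that assumption is to require explicit consent from every registered resident before shared settings change. The sketch below is purely illustrative - the `SharedDeviceSettings` class and its API are assumptions made for this article, not any vendor’s actual design.

```python
from dataclasses import dataclass, field
from typing import Dict, Set


@dataclass
class SettingChange:
    """A proposed change to a shared device's privacy/security settings."""
    setting: str
    new_value: str
    proposed_by: str
    approvals: Set[str] = field(default_factory=set)


class SharedDeviceSettings:
    """Settings store in which every registered resident must approve a
    change before it takes effect, rather than a single administrator."""

    def __init__(self, residents: Set[str]) -> None:
        self.residents = set(residents)
        self.settings: Dict[str, str] = {}
        self.pending: Dict[str, SettingChange] = {}

    def propose(self, resident: str, setting: str, new_value: str) -> None:
        if resident not in self.residents:
            raise PermissionError(f"{resident} is not a registered resident")
        change = SettingChange(setting, new_value, proposed_by=resident)
        change.approvals.add(resident)  # the proposer implicitly approves
        self.pending[setting] = change

    def approve(self, resident: str, setting: str) -> bool:
        """Record an approval; apply the change only once everyone agrees."""
        if resident not in self.residents:
            raise PermissionError(f"{resident} is not a registered resident")
        change = self.pending[setting]
        change.approvals.add(resident)
        if change.approvals == self.residents:
            self.settings[setting] = change.new_value
            del self.pending[setting]
            return True  # change applied
        return False  # still awaiting other residents
```

In this model, cohabitation grants a seat at the table rather than silent consent: no single account holder can switch on tracking or recording for the whole household on their own.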
Moreover, machine learning now draws inferences from data gathered over long periods. However, has any tech designer thought about what victims and survivors may do if this data reflects years of abuse rather than who they really are? Can there be a ‘reset’?
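As a thought experiment on what such a ‘reset’ might mean in practice, the toy sketch below models inferred habits that can be wiped without deleting the account itself. The `UsageProfile` class and its naive frequency-based ‘inference’ are hypothetical simplifications, not a description of any real product.

```python
from collections import Counter
from typing import List


class UsageProfile:
    """Toy model of inferences a smart system might build from long-term
    usage data, with an explicit survivor-initiated reset."""

    def __init__(self) -> None:
        self._events: List[str] = []

    def record(self, event: str) -> None:
        self._events.append(event)

    def inferred_habits(self, top_n: int = 3) -> List[str]:
        """Naive 'inference': the most frequently recorded behaviours."""
        return [event for event, _ in Counter(self._events).most_common(top_n)]

    def reset(self) -> None:
        """Discard accumulated history so past patterns (which may reflect
        an abusive relationship, not the user) no longer drive inferences."""
        self._events.clear()
```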
IoT is frequently misused in the first two stages of abuse - while the victim is still living with the abuser and during the period of withdrawing from such a relationship.
The third stage is when a victim is trying to rebuild their life. Throughout these stages, victims and survivors may ask themselves: how much does my partner know about the devices I use? Was it the perpetrator who purchased and maintained these systems? If so, it is often difficult for a victim to extricate themselves from the tech abuse occurring.
However, it is essential to view IoT systems in context. A smart doorbell may very well be empowering to victims and survivors. But empowerment is about who controls the device and who is in charge. In this regard, the traditional ‘administrator’ account model is increasingly becoming a problem.
One of the core problems with IoT devices is that their capabilities are often hidden: they may have additional functionality of which a user is unaware. However, Leonie really stressed this point: both the underestimation and the overestimation of IoT’s capabilities can be dangerous, which is why we need to be very clear about what IoT systems can and cannot do and raise public awareness of these technologies.
Perspective 2 - coercion
Jane Keeper, Director of Services at Refuge, discussed what she called ‘intimate partner violence’, previously termed domestic abuse. It is easy to compartmentalise types of abuse, but most perpetrators use all of them. It’s coercive control. As one victim described it: ‘It is like being in an awful psychological thriller’.
The behaviour of perpetrators is the same as it ever was, but the tools are quite different. Refuge, as an organisation, transformed its response, focusing on training staff on the tech and embedding tech savvy in its approaches. For Refuge staff, there is no such thing as ‘I’m no good at tech’, because that sort of power imbalance creates or exacerbates these situations.
Jane Keeper’s view was that Refuge needs to stay ahead of changes, but not to educate the perpetrators! Hence, apps are not the answer - indeed, some are very poor. Refuge evaluate these approaches, but are not anti-tech. For example, they created a tech abuse chatbot - a visual, easy-to-use app.
But their warning is clear - there is some poor advice out there. For example, some people advise wiping a phone when leaving an abusive relationship. This is poor advice - that data could be evidence of a crime!
Ethical design
The workshop, run by IBM, took their design thinking approach. It was about showing, not telling; focusing on user outcomes; recognising that diverse teams generate more possibilities; and accepting that it can be good to fail fast and cheaply.
IBM are planning to publish ‘Coercive Control Resistant Design Principles’ on the 15th May - the UN International Day of Families - the current situation permitting. The BCS point of view is in sympathy with IBM’s goals too. BCS is all about making IT good for society, so any vector that can be used to abuse people needs to be controlled appropriately - and domestic abuse is a huge problem, recently exacerbated by the lockdown.
Tech allows us to do many positive things, sometimes unthinkable even a decade ago, but it also has the ability to be misused if it isn’t developed thoughtfully and ethically. BCS has a focus on professionalism, a supporting code of conduct, and the desire to be a place to come for informed debate on this, and many other, vital societal issues.
IBM have published the 'Coercive Control Resistant Design Principles' as part of an IBM Policy Lab article.