Ethical issues

The use of apps to enhance the safety of women and girls in public spaces is not value-free and raises a host of ethical issues. These are outlined below, including concerns around the potential for harm or misuse, the individualising of the problem, the devolving of responsibility, privacy, the selling of data to third parties, and inequity of access to apps.

The potential for harm or misuse

While personal safety apps may make women feel safer, it is important to note that they may not reduce their actual vulnerability to crime (Maxwell et al., 2020). Rather, they may give women a false sense of security, which could lead to increased risk-taking behaviour (Maxwell et al., 2020). Similarly, apps that constantly track women could lead to ‘anxiety and hypervigilance’ and the ‘normalisation of constant surveillance’ (Hasinoff, 2017). Some academics have identified the potential misuse of tracking apps, which can be used for stalking or to facilitate controlling behaviours by abusers (Doria et al., 2021). Other potential harms relate to the lack of follow-up care for women who are victims of sexual assault or harassment because reporting is anonymised; Ford et al. (2022) suggest that personal safety apps should therefore be trauma-informed.

A final issue with reporting apps is their potential to profile racial and socio-economic groups (Revier, 2020). While reporting apps allow users to flag hotspot areas, this, as Maxwell et al. (2020) note, is based on users’ perceptions of fear rather than actual crime. Rentschler (2018) analysed Hollaback!, an anti-street-harassment app on which users report their experiences of street harassment. Once a user comments on an area, others are free to use that information as they wish, so it can be re-interpreted in harmful ways (Wood et al., 2019).
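To make this mechanism concrete, the Kotlin sketch below shows how a crowdsourced hotspot map typically aggregates reports. It is entirely hypothetical (the types and function names are not drawn from Hollaback! or any other cited app), but it illustrates the gap the literature identifies: the resulting scores count user perceptions, not verified incidents.

```kotlin
// Hypothetical sketch: a crowdsourced hotspot aggregator. Each area's
// score is simply its report count, so it reflects how often users
// report feeling unsafe rather than recorded crime (Maxwell et al.,
// 2020), and can skew against particular areas (Revier, 2020).
data class HarassmentReport(
    val areaId: String,        // e.g. a street or grid-cell identifier
    val description: String,   // free-text account supplied by the user
    val timestampMillis: Long,
)

fun hotspotScores(reports: List<HarassmentReport>): Map<String, Int> =
    reports.groupingBy { it.areaId }.eachCount()
```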

Individualising the problem

Most apps tend to focus on secondary and tertiary prevention (Ison & Matthewson, 2023), which does nothing to change cultural norms or address the root causes of misogyny. Rather, they place the burden of responsibility on women to ensure their own safety, and on their friends and family to intervene in the event of an emergency. Violence against women and girls (VAWG) is complex and encompasses a range of individual, relational, community and societal factors (CDC, 2022). Apps tend to individualise the problem, negating the relational, community and societal responses that are required to tackle it.

Most personal safety apps focus on emergency situations (Eisenhut et al., 2020); however, this negates the fact that violence is ‘rarely experienced as a one-time individual experience’ (Eisenhut et al., 2020: 7). While apps may provide support in an emergency and may collect evidence, they do not necessarily prevent violence from happening in the first place (Ford et al., 2022). Some suggest that apps reinforce the idea that VAWG is inevitable and that women, or society more generally, can do nothing to stop it (Bivens & Hasinoff, 2018). Others suggest that if apps are to tackle the primary causes of VAWG, they need to include an educational element to inform the public about gendered norms (Eisenhut et al., 2020).

Who is responsible?

A related issue to individualising responsibility is the wider question of who is accountable and responsible for personal safety. The use of apps to report hotspot areas raises questions about whose responsibility it is to report crime (Ceccato et al., 2022). Indeed, citizen policing and technologies that crowdsource data challenge traditional hierarchies of power and raise questions about the quality of the data produced. As noted above, apps tend to individualise the problem of VAWG, devolving responsibility away from emergency services and towards friends and family. Similarly, some suggest that interventions encouraging women to report abuse to transport officials devolve responsibility for women’s safety from those officials to women themselves (Ceccato & Paz, 2017).

Privacy and use of data

Some academics have argued that women must sacrifice privacy in exchange for personal safety (Hasinoff, 2017). To ensure full functionality, most apps, once downloaded, request access to media files, user contacts and the camera and microphone (Ford et al., 2022). Some scholars have highlighted the privacy issues around location sharing, particularly where alarms are set off unintentionally and audio or video is shared with pre-determined contacts (Hasinoff, 2017). A related issue is that reporting apps allow users to share photographic evidence of crimes without the knowledge or consent of those photographed (Ceccato, 2019), which is particularly problematic where the information is inaccurate.
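As a concrete illustration of the permission requests described above, the Kotlin sketch below shows how an Android safety app might ask for location, contacts, camera and microphone access at runtime. The class name and the exact permission set are assumptions for illustration, not taken from any app in the cited studies; the API calls themselves are standard Android (Jetpack) ones.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

// Hypothetical activity for a personal safety app, requesting the kinds
// of access the literature describes such apps asking for on first
// launch (Ford et al., 2022).
class SafetyAppActivity : AppCompatActivity() {

    // Permissions typical of the apps discussed: location sharing,
    // contacts for emergency alerts, camera/audio for evidence capture.
    private val requestedPermissions = arrayOf(
        Manifest.permission.ACCESS_FINE_LOCATION,
        Manifest.permission.READ_CONTACTS,
        Manifest.permission.CAMERA,
        Manifest.permission.RECORD_AUDIO,
    )

    // With the modern runtime-permission API the user can grant some
    // permissions and deny others, so the app should degrade gracefully
    // rather than demand everything up front.
    private val permissionLauncher =
        registerForActivityResult(ActivityResultContracts.RequestMultiplePermissions()) { results ->
            val denied = results.filterValues { granted -> !granted }.keys
            // A privacy-respecting design disables only the features that
            // depend on `denied`, instead of blocking the whole app.
        }

    private fun requestOnlyMissingPermissions() {
        val missing = requestedPermissions.filter {
            ContextCompat.checkSelfPermission(this, it) != PackageManager.PERMISSION_GRANTED
        }
        if (missing.isNotEmpty()) {
            permissionLauncher.launch(missing.toTypedArray())
        }
    }
}
```

A design along these lines requests only the permissions that are still missing and tolerates partial denial, which speaks directly to Ford et al.’s (2022) call for transparency about why each phone feature is needed.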

In terms of data management, app developers operating in the UK are legally required to comply with the GDPR as implemented by the Data Protection Act 2018. Nevertheless, Ford et al. (2022) found that users raised concerns about the collection of personal data, particularly where apps required access to photos, user contacts and social media details. While data must be stored safely and securely, anonymised data can be sold to third parties if users consent to this; under the GDPR, app developers must be clear about any intention to sell data at the point of collection. However, a study of 20,000 health apps on the Google Play store found that 28.1% did not provide a privacy policy for users (Tangari et al., 2021). Ford et al. (2022) posit that app developers should be transparent about why access to phone features is required and that users should be able to opt for a minimum level of data sharing.
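The recommendation that users should be able to accept a minimum level of data sharing can be made concrete with a small sketch. The Kotlin below is entirely hypothetical (the tier names and fields are assumptions, not a real app’s schema): it models tiered consent in which the default shares nothing beyond what emergency functions strictly need, and sale of data to third parties is an explicit opt-in rather than a default, consistent with GDPR-style consent.

```kotlin
// Hypothetical model of tiered consent, illustrating the minimum-sharing
// default that Ford et al. (2022) recommend. None of these names come
// from a real app or library.
enum class SharingLevel {
    MINIMAL,                 // emergency features only; nothing else leaves the device
    LOCATION_WITH_CONTACTS,  // live location shared with chosen contacts
    ANONYMISED_RESEARCH,     // anonymised usage data shared for research
}

data class ConsentRecord(
    val level: SharingLevel,
    val allowSaleToThirdParties: Boolean, // explicit opt-in, never a default
    val grantedAtEpochMillis: Long,       // when consent was given, for audit purposes
)

// The safest default: the app works in an emergency, but nothing is
// shared or sold until the user explicitly raises the level.
fun defaultConsent(nowMillis: Long): ConsentRecord =
    ConsentRecord(
        level = SharingLevel.MINIMAL,
        allowSaleToThirdParties = false,
        grantedAtEpochMillis = nowMillis,
    )
```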

Regulation of apps

While app developers and app stores now have a Voluntary Code of Practice (UK Government, 2022), the full details of how app stores vet apps have not been made public (UK Government, 2022). While some app stores provide extensive detail on app permissions, others provide ‘nothing that most users would find meaningful’ (UK Government, 2022); indeed, app stores are free to decide what information they give users. Even with full transparency in how apps are vetted, the burden of responsibility still falls on the user to ensure that apps are appropriate. It is also important to note that, in addition to official app stores such as Google Play and the Apple App Store, there are 300 third-party app stores worldwide (Businessofapp.com, 2023). Research by RiskIQ (2020) suggests that the top three stores that saw the biggest influx of apps in 2020 were all Chinese-based third-party app stores, which are unregulated and fall outside GDPR regulation.

Inequity of access

Ethical issues around accessibility also come into play, particularly where people are digitally excluded or do not have access to a smartphone. Some suggest that technology that works by crowdsourcing data creates a divide between those who have apps, and therefore the information to make informed choices, and those who do not (Ceccato, 2019). While most apps can be downloaded for free, many require the user to have phone credit to use the call or text functions, which could exclude certain groups (Eisenhut et al., 2020). Indeed, some scholars have criticised the commercialisation of women’s personal safety (Hasinoff, 2017; White & McMillan, 2020), which profits from existing vulnerabilities in the population and could lead to inequity of access.