Being ethical is key to building trust and making people comfortable with using digital health solutions. But without universal or even localized norms of behavior to ensure technology works well for everyone, what exactly does that entail? Ollie Smith, Strategy Director & Head of Ethics at Koa Health, explores whether wellbeing apps can be ethical.
When it comes to technology, the question of ethics comes up often, and even more so when that technology is used for health or wellbeing. Despite this, there is yet to be any general consensus on what makes a product ethical, whether among researchers and developers, governing bodies, or companies themselves.
What we do know, however, is that consumers care deeply about not just the efficacy of a solution but also its accessibility, freedom from biases, and trustworthiness. We also know that very few people are interested in another difficult-to-use tool. Fewer still want to use a product that’s not inclusive of their unique life experiences. And nearly no one is willing to hand over their personal data to a service provider that doesn’t offer adequate protection.
At Koa Health, we’re deeply committed to behaving ethically. For more details on precisely what this means to us, have a look at our Ethics page. In short, our solutions must consistently aim to do the following:
- Improve your health and happiness
- Put you in control
- Be understandable and transparent
- Secure your data
- Be accountable
From actions…to audits
What does this mean in practice? It means we make a concerted effort to build a well-researched evidence base for our products and services. We put our users in control of their own experience as much as we can, using unbiased and understandable language. We carefully protect personal information and privacy. And last but not least, we hold ourselves accountable through public discussion of how we handle ethics at Koa Health, as well as through external audits.
This brings us to the main subject of today’s post: our latest audit, performed with diligence and professionalism by Eticas Research & Consulting. In contrast to our first ethics audit (conducted prior to the Koa Health spin-out), this time the focus has been on our commercially available products, not just prototype apps.
Many of you will know that there is a big difference between a product in production and a prototype. With that in mind, we’re really pleased to share our good news: Eticas believes that we’ve continued to make steady progress in our ethics work.
Highlights and areas to improve
Particular highlights include our solid systems for data privacy and security, and a content creation process that considers the risk of bias at every stage and has so far produced over 150 activities for our Foundations app alone.
As expected, there are areas for improvement as well. Fortunately, we are already tackling some of these. For instance, we’re creating our own Fairness Playbook to ensure that we consider bias more consistently, and earlier, when we build algorithms. At the moment, we put a lot of emphasis on protecting our users’ privacy without sufficiently considering when and how this can trade off against fairness. We are also running more evaluations of our products, which will enable us to more fully consider both longer-term impact and differential impact across groups.
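The post doesn’t describe the specific checks the Fairness Playbook will contain, so purely as an illustration, here is a minimal sketch of one common kind of bias check an algorithm team might standardise: comparing the rate of positive recommendations across user groups (a demographic parity check). The group labels, data, and threshold below are hypothetical.

```python
# Illustrative only: a simple demographic-parity check of the kind a fairness
# playbook might standardise. Groups, data, and the threshold are hypothetical.
from collections import defaultdict

def positive_rate_by_group(predictions: list[int], groups: list[str]) -> dict[str, float]:
    """Share of positive outcomes (e.g. 'recommend this activity') per user group."""
    counts, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        counts[group] += 1
        positives[group] += pred
    return {g: positives[g] / counts[g] for g in counts}

def demographic_parity_gap(rates: dict[str, float]) -> float:
    """Largest difference in positive-outcome rates between any two groups."""
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    preds  = [1, 1, 1, 0, 1, 0, 1, 0, 0, 1]
    groups = ["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"]
    rates = positive_rate_by_group(preds, groups)
    print(rates, f"gap={demographic_parity_gap(rates):.2f}")
    # A playbook might require the gap to stay below some agreed threshold
    # (e.g. 0.1) before an algorithm ships -- threshold purely hypothetical.
```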
In particular, we admit that we’re disappointed with Eticas’ findings on the reading age of our apps. We put a lot of effort into meeting our target reading age of 11, but Eticas applied a more robust measure and found that our copy is closer to a standardized reading level for 13- to 14-year-olds. Fortunately, the team at Eticas has provided us with tools to improve our measurements. Moving forward, we’ll be able to check our content as we go and make modifications as needed, creating content that’s accessible to even more readers.
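Neither our internal reading-age checks nor Eticas’ measure are named here, so the snippet below is only a sketch of the kind of automated check a content team could run while writing. It estimates a reading age from the Flesch-Kincaid grade level via the open-source textstat package, assuming the common rough mapping of reading age ≈ US grade level + 5; the snippets and target are illustrative.

```python
# Rough illustration: flag draft copy whose estimated reading age exceeds a target.
# Uses the open-source `textstat` package (pip install textstat); the actual tools
# used by Koa Health or Eticas are not named in the post.
import textstat

TARGET_READING_AGE = 11  # the target mentioned in the post

def estimated_reading_age(text: str) -> float:
    """Approximate reading age as Flesch-Kincaid US grade level + 5 (rough mapping)."""
    return textstat.flesch_kincaid_grade(text) + 5

def check_copy(snippets: dict[str, str]) -> None:
    """Print an estimated reading age and a pass/rewrite flag for each snippet."""
    for name, text in snippets.items():
        age = estimated_reading_age(text)
        status = "OK" if age <= TARGET_READING_AGE else "REWRITE"
        print(f"{name}: estimated reading age {age:.1f} -> {status}")

if __name__ == "__main__":
    check_copy({
        "welcome_screen": "Take a short breathing exercise to help you relax before bed.",
        "consent_intro": (
            "Prior to the commencement of your utilisation of this application, "
            "it is imperative that you acquaint yourself with the stipulations herein."
        ),
    })
```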
When it comes to the other areas for improvement highlighted in this audit, we’ll need to give further thought to how best to keep building on our ethical principles as we grow. Clearer, easier-to-understand terms and conditions are a perennial challenge for us (and for almost all digital service providers), but we’re committed to continuing this vital work. Meanwhile, we’re very excited about Eticas’ recommendation to look again at keeping more personal data on devices, particularly given its potential for personalization and better bias control.
Have thoughts on how to meet these challenges? We’d welcome your ideas. Reach us at support@koahealth.com.
And thanks again to Eticas for their hard work and for such a thorough audit. We look forward to continuing to work with Eticas to improve. For further details, access the full audit.
Discover More
Koa Health is a digital mental healthcare provider that offers integrated mental health solutions. Our ambition is to redefine care by offering a range of personalised mental health solutions, all of which are backed by science and designed to improve user wellbeing.