From banking and insurance to shopping, these human-like computer programmes are becoming increasingly common. But many people are reluctant to disclose sensitive information to them, limiting their effectiveness.
As a result, researchers at Kent, alongside UEA, Oxford Brookes and Cranfield University, have been awarded £500,000 in funding from the Engineering and Physical Sciences Research Council (EPSRC) to investigate how chatbots can be designed to be more trustworthy.
The project, dubbed PRoCEED (A Platform for Responsive Chatbot to Enhance Engagement and Disclosure), will focus on the nature of sensitive information – and how the context of the information can play a role in its perceived sensitivity.
Specifically, it will look at the use of chatbots in three key sectors – healthcare, defence and security, and technology. These sectors are significant users of chatbots, all deal with potentially sensitive and personal information, and all are areas of significant public spending. The researchers will conduct a range of experiments to better understand public perceptions of personal information, and how those perceptions relate to the classification of information.
Dr Nurse said: ‘Chatbots are becoming increasingly common on all types of websites and can offer huge benefits to businesses and their customers. However, ensuring the public trust them and are willing to share information that can help with their queries is vital. This project aims to address these issues so that more online users are able to use chatbot systems with confidence.’
Kent and UEA regularly work together, alongside the University of Essex, as part of the Eastern Arc collaboration, which offers doctoral training awards in the natural and environmental sciences and in the arts and humanities, as well as supporting a range of other bilateral research relationships.