805 Columbus Avenue
524 Interdisciplinary Science and Engineering Complex (ISEC)
Boston, MA 02120
ATTN: Timothy Bickmore, 435 ISEC
360 Huntington Avenue
Boston, MA 02115
- Human-computer interaction
- Dialogue systems
- Intelligent virtual agents
- Personal health informatics
- Human-robot interaction
- PhD in Media Arts & Sciences, Massachusetts Institute of Technology
- MS in Computer Science, Arizona State University
- BSE in Computer Systems Engineering, Arizona State University
Timothy W. Bickmore is a professor in the Khoury College of Computer Sciences at Northeastern University. Prior to joining Northeastern in 2005, he was an assistant professor of medicine at the Boston University School of Medicine. In 2003, he completed his PhD at the Massachusetts Institute of Technology Media Laboratory.
With his interdisciplinary approach to research, Bickmore concentrates on the intersection of human-computer interaction, natural language processing (dialogue systems), animation, and health/medical/behavioral informatics. His research focuses on the development and evaluation of computer agents that emulate face-to-face interactions between health providers and patients. His emphasis on the emotional and relational aspects of those interactions makes his research well suited to health education and long-term health behavior change interventions.
As the director of the Relational Agents Group, he works with his team to simulate face-to-face counseling with a focus on the relational aspects of those interactions. In addition, Bickmore has chaired or co-chaired several meetings, including the Intelligent Virtual Agents conference, as well as AAAI symposia and CHI workshops on health informatics and virtual agents. He is an associate editor of the journal Interacting with Computers and has presented his research internationally. His work has received funding from the NSF, NIH, AHRQ, HRSA, and PCORI.
Virtual Coach for Atrial Fibrillation Support
When deployed on smartphones, virtual agents have the potential to deliver life-saving advice regarding emergency medical conditions, as well as provide a convenient channel for health education to improve the safety and efficacy of pharmacotherapy.
We are developing a smartphone-based virtual agent that provides counseling to patients with atrial fibrillation, a highly prevalent heart rhythm disorder known to significantly increase the risk of stroke, heart failure, and death. In this project, the virtual agent is deployed in conjunction with a smartphone-based heart rhythm monitor that lets patients obtain real-time diagnostic information on the status of their atrial fibrillation and determine whether immediate action may be needed.
This project is a collaboration with the University of Pittsburgh Medical Center.
Relational Agent for Palliative Care
The purpose of this project is to develop a conversational agent system that counsels terminally ill patients in order to alleviate their suffering and improve their quality of life.
Although many interventions have now been developed to address palliative care for specific chronic diseases, little has been done to address the overall quality of life for older adults with serious illness, spanning not only the functional aspects of symptom and medication management, but the affective aspects of suffering. In this project, we are developing a relational agent to counsel patients at home about medication adherence, stress management, advanced care planning, and spiritual support, and to provide referrals to palliative care services when needed.
Transforming Scientific Presentations with Co-Presenter Agents
The PI’s goal in this project is to revolutionize media-assisted oral presentations in general, and STEM presentations in particular, through the use of an intelligent, autonomous, life-sized, animated co-presenter agent that collaborates with a human presenter in preparing and delivering his or her talk in front of a live audience.
Although journal and conference articles are recognized as the most formal and enduring forms of scientific communication, oral presentations are central to science: they are the means by which researchers, practitioners, the media, and the public hear about the latest findings and become engaged and inspired, and they are where scientific reputations are made. Yet despite decades of technological advances in computing and communication media, the fundamentals of oral scientific presentations have not advanced since software such as Microsoft's PowerPoint was introduced in the 1980s. The PI's pilot studies have demonstrated that audiences are receptive to this concept, and that the technology is especially effective for individuals who are non-native speakers of English (which may be up to 21% of the population of the United States). Project outcomes will initially be deployed and evaluated in higher education, both as a teaching tool for delivering STEM lectures and as a training tool for students in the sciences to learn how to give more effective oral presentations (which may inspire future generations to pursue careers in the sciences).
This research will be based on a theory of human-agent collaboration, in which the human presenter is monitored using real-time speech and gesture recognition, audience feedback is also monitored, and the agent, presentation media, and human presenter (cued via an intelligent wearable teleprompter) are all dynamically choreographed to maximize audience engagement, communication, and persuasion. The project will make fundamental, theoretical contributions to models of real-time human-agent collaboration and communication. It will explore how humans and agents can work together to communicate effectively with a heterogeneous audience using speech, gesture, and a variety of presentation media, amplifying the abilities of scientist-orators who would otherwise be “flying solo.” The work will advance both artificial intelligence and computational linguistics, by extending dialogue systems to encompass mixed-initiative, multi-party conversations among co-presenters and their audience. It will impact the state of the art in virtual agents, by advancing the dynamic generation of hand gestures, prosody, and proxemics for effective public speaking and turn-taking. And it will also contribute to the field of human-computer interaction, by developing new methods for human presenters to interact with autonomous co-presenter agents and their presentation media, including approaches to cueing human presenters effectively using wearable user interfaces.
Utami, Dina and Timothy W. Bickmore. “Collaborative User Responses in Multiparty Interaction with a Couples Counselor Robot.” 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (2019): 294-303.
Intimate relationships are integral parts of human societies, yet many relationships are in distress. Couples counseling has been shown to be effective in preventing and alleviating relationship distress, yet many couples do not seek professional help due to cost, logistics, and discomfort in disclosing private problems. In this paper, we describe our efforts toward the development of a fully automated couples counselor robot, focusing specifically on the problem of identifying and processing "collaborative responses," in which a human couple co-constructs a response to a query from the robot. We present an analysis of collaborative responses obtained from a pilot study, then develop a data-driven model to detect the end of collaborative responses for regulating turn-taking during a counseling session. Our model uses a combination of multimodal features and achieves an offline weighted F-score of 0.81. Finally, we present findings from a quasi-experimental study with a robot facilitating a counseling session to promote intimacy with romantic couples. Our findings suggest that the session improves couples' intimacy and positive affect. An online evaluation of the end-of-collaborative-response model demonstrates an F-score of 0.72.
Wang, C., Bickmore, T., Bowen, D., Norkunas, T., Campion, M., Cabral, H., Winter, M., Paasche-Orlow, M. "Acceptability and feasibility of a virtual counselor (VICKY) to collect family health histories." Genetics in Medicine, 17:822-830 (2015).
To overcome literacy-related barriers in the collection of electronic family health histories, we developed an animated Virtual Counselor for Knowing your Family History, or VICKY. This study examined the acceptability and accuracy of using VICKY to collect family histories from underserved patients as compared with My Family Health Portrait (MFHP).
Participants were recruited from a patient registry at a safety net hospital and randomized to use either VICKY or MFHP. Accuracy was determined by comparing tool-collected histories with those obtained by a genetic counselor.
A total of 70 participants completed this study. Participants rated VICKY as easy to use (91%) and easy to follow (92%), would recommend VICKY to others (83%), and were highly satisfied (77%). VICKY identified 86% of first-degree relatives and 42% of second-degree relatives; combined accuracy was 55%. As compared with MFHP, VICKY identified a greater number of health conditions overall (49% with VICKY vs. 31% with MFHP; incidence rate ratio (IRR): 1.59; 95% confidence interval (95% CI): 1.13-2.25; P = 0.008), in particular, hypertension (47 vs. 15%; IRR: 3.18; 95% CI: 1.66-6.10; P = 0.001) and type 2 diabetes (54 vs. 22%; IRR: 2.47; 95% CI: 1.33-4.60; P = 0.004).
These results demonstrate that technological support for documenting family history risks can be highly accepted, feasible, and effective.
Utami, D., Bickmore, T., Nikolopoulou, A., and Paasche-Orlow, M. (2017) "Talk About Death: End of Life Planning with a Virtual Agent" International Conference on Intelligent Virtual Agents (IVA)
For those nearing the end of life, "wellness" must encompass reduction in suffering as well as the promotion of behaviors that mitigate stress and help people prepare for death. We discuss the design of a virtual conversational palliative care coach that works with individuals during their last year of life to help them manage symptoms, reduce stress, identify and address unmet spiritual needs, and support advance care planning. We present the results of an experiment that features the reactions of older adults in discussing these topics with a virtual agent, and note the importance of discussing spiritual needs in the context of end-of-life conversations. We find that all participants are comfortable discussing these topics with an agent, and that their discussion leads to reductions in state and death anxiety, as well as a significant increase in intent to create a last will and testament.
Utami, D., Bickmore, T., Kruger, L. (2017) "A Robotic Couples Counselor for Promoting Positive Communication" IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN)
Intimate relationships are crucially important in all human societies, yet many relationships are in some degree of distress. Couple psychotherapy has been demonstrated to be effective at reducing relationship distress, yet most couples do not seek help from professionals. Automated couples counselors could provide help to many couples who avoid professional help due to cost, logistics, or discomfort disclosing personal problems. We explore reactions to and acceptance of a humanoid robot that takes the role of a couples counselor in promoting positive communication skills among asymptomatic intimate couples. Couples were comfortable with the robot in this role, displaying intimate behavior during the counseling session. They followed the directions of the robot in practicing interpersonal communication skills, were largely satisfied with the experience, and described several advantages to working with a robot compared to human counselors or self-help materials.
Asadi, R., Fell, H. J., Bickmore, T., & Trinh, H. (2016). Real-Time Presentation Tracking Using Semantic Keyword Spotting. Interspeech 2016, 3081-3085.
Given presentation slides with detailed written speaking notes, automatic tracking of oral presentations can help speakers ensure they cover their planned content, and can reduce their anxiety during the speech. Tracking is a more complex problem than speech-to-text alignment, since presenters rarely follow their exact presentation notes, and it must be performed in real time. In this paper, we propose a novel system that can track the current degree of coverage of each slide's contents. To do this, the presentation notes for each slide are segmented into sentences, and the words are filtered into keyword candidates. These candidates are then scored based on word specificity and semantic similarity measures to find the most useful keywords for the tracking task. Real-time automatic speech recognition results are matched against the keywords and their synonyms. Sentences are scored based on detected keywords, and the ones with scores higher than a threshold are tagged as covered. We manually and automatically annotated 150 slide presentation recordings to evaluate the system. A simple tracking method, matching speech recognition results against the notes, was used as the baseline. The results show that our approach led to higher accuracy measures compared to the baseline method.
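The pipeline described in the abstract — segment notes into sentences, filter words into keyword candidates, score candidates by specificity, match ASR output against keywords, and tag sentences above a threshold as covered — can be sketched roughly as follows. This is an illustrative simplification under stated assumptions, not the paper's implementation: the names and the scoring scheme are invented here, and the actual system also uses semantic-similarity measures and synonym matching, which are omitted.

```python
# Minimal sketch of slide-coverage tracking via keyword spotting.
# Hypothetical names and scoring; the published system additionally
# scores candidates with semantic similarity and matches synonyms.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "are", "for", "on", "we", "this"}

def extract_keywords(sentence, doc_freq, top_k=3):
    """Pick the rarest content words in a sentence as its keywords."""
    words = [w for w in re.findall(r"[a-z]+", sentence.lower()) if w not in STOPWORDS]
    # Rarer across the whole slide's notes => more specific => better keyword.
    words.sort(key=lambda w: doc_freq[w])
    return set(words[:top_k])

def track_coverage(slide_notes, asr_words, threshold=0.5):
    """Return the fraction of note sentences 'covered' by recognized speech."""
    sentences = [s for s in re.split(r"[.!?]", slide_notes) if s.strip()]
    doc_freq = Counter(w for s in sentences for w in re.findall(r"[a-z]+", s.lower()))
    spoken = {w.lower() for w in asr_words}
    covered = 0
    for s in sentences:
        keywords = extract_keywords(s, doc_freq)
        # A sentence counts as covered once enough of its keywords are heard.
        if keywords and len(keywords & spoken) / len(keywords) >= threshold:
            covered += 1
    return covered / len(sentences) if sentences else 0.0
```

In a live system this function would be called incrementally as ASR hypotheses arrive, so the covered set grows monotonically over the course of a slide; the thresholded keyword ratio is what lets the tracker tolerate presenters paraphrasing rather than reading their notes verbatim.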
Trinh, H., Asadi, R., Edge, D., & Bickmore, T. (2017). RoboCOP: A Robotic Coach for Oral Presentations. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.
Rehearsing in front of a live audience is invaluable when preparing for important presentations. However, not all presenters take the opportunity to engage in such rehearsal, due to time constraints, availability of listeners who can provide constructive feedback, or public speaking anxiety. We present RoboCOP, an automated anthropomorphic robot head that acts as a coach to provide spoken feedback during presentation rehearsals at both the individual slide and overall presentation level. The robot offers conversational coaching on three key aspects of presentations: speech quality, content coverage, and audience orientation. The design of the feedback strategies was informed by findings from an exploratory study with academic professionals who were experienced in mentoring students on their presentations. In a within-subjects study comparing RoboCOP to visual feedback and spoken feedback without a robot, the robotic coach was shown to lead to significant improvement in the overall experience of presenters. Results of a second within-subjects evaluation study comparing RoboCOP with existing rehearsal practices show that our system creates a natural, interactive, and motivating rehearsal environment that leads to improved presentation quality.
Bickmore, T., Trinh, H., Hoppmann, M., & Asadi, R. (2016, September). Virtual Agents in the Classroom: Experience Fielding a Co-presenter Agent in University Courses. In International Conference on Intelligent Virtual Agents (pp. 154-163). Springer International Publishing.
The design of a conversational virtual agent that assists professors and students in giving in-class oral presentations is described, along with preliminary evaluation results. The life-sized agent is integrated with PowerPoint presentation software and can deliver presentations in conjunction with a human presenter using appropriate verbal and nonverbal behavior. Results from evaluation studies in two courses—business and professional speaking, and computer science research methods—indicate that the agent is widely accepted in the classroom by students, and can serve to increase engagement in presentations given both by professors and students.
A Feasibility Study to Introduce an Embodied Conversational Agent (ECA) on a Tablet Computer into a Group Medical Visit
McCue, K., Shamekhi, A., Bickmore, T., Crooks, D., Barnett, K., Haas, N., Johnson, G., Gardiner, P. "A Feasibility Study to Introduce an Embodied Conversational Agent on a Tablet Computer into an Integrative Medicine Group Visit." American Public Health Association (APHA) Annual Meeting.
The purpose of this pilot study is to evaluate the feasibility of introducing a tablet computer with an Embodied Conversational Agent (ECA) into an integrative medical group visit (IMGV) for patients with chronic pain and depression.
This prospective observational cohort study enrolled 20 participants who were attending an integrative medicine group visit. Patients attended a 9-session integrative medical group visit and received a tablet computer with an Embodied Conversational Agent (ECA). Participants were encouraged to interact with the ECA between groups at home. Participants completed questionnaires at baseline and 9 weeks. We recorded socio-demographics and feasibility outcomes including ECA helpfulness, satisfaction with the ECA, areas of interest for the ECA to discuss (nutrition and stress) and what could be improved about the ECA.
All participants received their primary care in inner-city outpatient clinics; the average age of participants was 47; 13 participants identified as African American and 3 as Latino; 16 participants had an annual income under $30,000; and 12 participants were on disability. Of the participants who completed surveys, 100% reported they used the ECA's suggestions to reduce stress; 89% used the ECA's suggestions to eat healthier; and 67% said they were extremely confident they could continue to use the ECA's recommendations post-study. Sixty-seven percent said it was easy to talk with the ECA; 78% said they trusted the ECA very much; 44% said they would prefer the ECA over speaking with a clinician; and 89% said they would definitely recommend the ECA to a friend. Emerging themes included feeling that the ECA was a friend and someone to talk and relate to, the ability to use the ECA whenever they wanted (accessibility), and the ability to dive deeper into the curriculum at their own pace and review material with the ECA when needed.
It is feasible to introduce an ECA into a 9-week IMGV program for an underserved patient population with chronic pain and depression.
Addressing Loneliness and Isolation in Older Adults: Proactive Affective Agents Provide Better Support
Ring, L., Barry, B., Totzke, K., Bickmore, T. "Addressing Loneliness and Isolation in Older Adults: Proactive Affective Agents Provide Better Support." Affective Computing and Intelligent Interaction (ACII), 2013.
Loneliness and social isolation are significant problems in older adult populations. We describe a conversational agent-based system designed to provide longitudinal social support to isolated older adults. Results from an exploratory pilot study indicate that when the agent proactively draws elders into interactions, it is more effective at addressing loneliness than when the agent passively relies upon elders to initiate interactions. We discuss future research opportunities for affective computing to address this important societal problem.
Zhou, S., Gali, R., Paasche-Orlow, M., & Bickmore, T. W. (2014, April). Afraid to ask: proactive assistance with healthcare documents using eye tracking. In CHI'14 Extended Abstracts on Human Factors in Computing Systems (pp. 1669-1674). ACM.
We investigate gaze patterns and other nonverbal behavior that people use when providing and receiving explanations of complex healthcare documents, and use a model of this behavior as the basis of a system that provides automated, proactive assistance. We present the results of the human analog study along with results from a preliminary evaluation of the automated system. We also demonstrate the feasibility of using eye tracking to automatically assess the health literacy of people reading healthcare documents.
Zhou, S., Bickmore, T., Paasche-Orlow, M., & Jack, B. (2014, August). Agent-user concordance and satisfaction with a virtual hospital discharge nurse. In Intelligent Virtual Agents (pp. 528-541). Springer International Publishing.
User attitudes towards a hospital virtual nurse agent are described, as evaluated in a randomized clinical trial involving 764 hospital patients. Patients talked to the agent for an average of 29 minutes while in their hospital beds, receiving their customized hospital discharge instructions from the agent and a printed booklet. Patients reported very high levels of satisfaction with and trust in the nurse agent, preferred receiving their discharge instructions from the agent over their human doctors and nurses, and found the system very easy to use. Perceived similarity to the agent was a significant determinant of liking, trust, desire to continue, and working alliance, although perceived similarity was unrelated to racial concordance between patients and the agent.
Shamekhi, A., Bickmore, T. (2015) "Breathe with Me: A Virtual Meditation Coach." Intelligent Virtual Agents conference (IVA).
A virtual agent that guides users through mindfulness meditation sessions is described. The agent uses input from a respiration sensor both to respond to user breathing rate and to use deep breaths as a continuation and acknowledgment signal. A pilot evaluation study comparing the agent to a self-help video indicates that users are very receptive to the virtual meditation coach, and that it is more effective at reducing anxiety and increasing mindfulness and flow state compared to the video.
Bickmore, T.W., Fernando, R., Ring, L., Schulman, D. "Empathic Touch by Relational Agents." IEEE Transactions on Affective Computing, 1(1), 60-71.
We describe a series of experiments with an agent designed to model human conversational touch—capable of physically touching users in synchrony with speech and other nonverbal communicative behavior—and its use in expressing empathy to users in distress. The agent consists of an animated human face displayed on a monitor affixed to the top of a human mannequin, with touch conveyed by an air bladder that squeezes a user's hand. We demonstrate that when touch is used alone, hand squeeze pressure and number of squeezes are associated with user perceptions of the affect arousal conveyed by the agent, while number of squeezes and squeeze duration are associated with affect valence. We also show that when affect-relevant cues are presented simultaneously in facial display, speech prosody, and touch, facial display dominates user perceptions of affect valence, facial display and prosody are associated with affect arousal, and touch has little effect. Finally, we show that when touch is used in the context of an empathic, comforting interaction (but without the manipulation of affect cues in other modalities), it can lead to better perceptions of relationship with the agent, but only for users who are comfortable being touched by other people.
Perceived Organizational Affiliation and Its Effects on Patient Trust: Role Modeling with Embodied Conversational Agents
Zhang, Z., Bickmore, T, and Paasche-Orlow, M (2017) "Perceived Organizational Affiliation and Its Effects on Patient Trust: Role Modeling with Embodied Conversational Agents" Patient Education and Counseling (journal)
Improving Access to Online Health Information With Conversational Agents: A Randomized Controlled Experiment
Bickmore, T, Utami, D, Matsuyama, R, and Paasche-Orlow, M, "Improving Access to Online Health Information with Conversational Agents: A Randomized Controlled Experiment", Journal of Medical Internet Research (2016)
Conventional Web-based search engines may be unusable by individuals with low health literacy, precluding this population from finding health-related information online.
We describe a conversational search engine interface designed to allow individuals with low health and computer literacy to identify and learn about clinical trials on the Internet.
A randomized trial involving 89 participants compared the conversational search engine interface (n=43) to the existing conventional keyword- and facet-based search engine interface (n=46) for the National Cancer Institute Clinical Trials database. Each participant performed 2 tasks: finding a clinical trial for themselves and finding a trial that met prespecified criteria.
Results indicated that all participants were more satisfied with the conversational interface based on 7-point self-reported satisfaction ratings (task 1: mean 4.9, SD 1.8 vs mean 3.2, SD 1.8, P<.001; task 2: mean 4.8, SD 1.9 vs mean 3.2, SD 1.7, P<.001) compared to the conventional Web form-based interface. All participants also rated the trials they found as better meeting their search criteria, based on 7-point self-reported scales (task 1: mean 3.7, SD 1.6 vs mean 2.7, SD 1.8, P=.01; task 2: mean 4.8, SD 1.7 vs mean 3.4, SD 1.9, P<.01). Participants with low health literacy failed to find any trials that satisfied the prespecified criteria for task 2 using the conventional search engine interface, whereas 36% (5/14) were successful at this task using the conversational interface (P=.05).
Conversational agents can be used to improve accessibility to Web-based searches in general and clinical trials in particular, and can help decrease recruitment bias against disadvantaged populations.
Kimani, E., Bickmore, T., Trinh, H., Ring, L., Paasche-Orlow, M. K., and Magnani, J. W. 2016. A smartphone-based virtual agent for atrial fibrillation education and counseling. 16th International Conference on Intelligent Virtual Agents, 120-127. Springer.
When deployed on smartphones, virtual agents have the potential to deliver life-saving advice regarding emergency medical conditions, as well as provide a convenient channel for health education to help improve the safety and efficacy of pharmacotherapy. This paper describes the use of a smartphone-based virtual agent that provides counseling to patients with atrial fibrillation, along with the results of a pilot acceptance study among patients with the condition. Atrial fibrillation is a highly prevalent heart rhythm disorder and is known to significantly increase the risk of stroke, heart failure, and death. In this study, a virtual agent is deployed in conjunction with a smartphone-based heart rhythm monitor that lets patients obtain real-time diagnostic information on the status of their atrial fibrillation and determine whether immediate action may be needed. The results of the study indicate that participants are satisfied with receiving information about atrial fibrillation via the virtual agent.
Trinh, H., Ring, L. and Bickmore, T., 2015. DynamicDuo: Co-presenting with Virtual Agents. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15), (pp. 1739-1748). ACM.
The quality of professional oral presentations is often poor, owing to a number of factors, including public speaking anxiety. We present DynamicDuo, a system that uses an automated, life-sized, animated agent to help inexperienced speakers deliver their presentations in front of an audience. The design of the system was informed by an analysis of TED talks given by two human presenters, conducted to identify the most common dual-presentation formats and transition behaviors used. In a within-subjects study (N=12) comparing co-presenting with DynamicDuo against solo-presenting with conventional presentation software, we demonstrated that our system led to significant improvements in public speaking anxiety and speaking confidence for non-native English speakers. Judges who viewed videotapes of these presentations rated those with DynamicDuo significantly higher on speech quality and overall presentation quality for all presenters.
Bickmore, Timothy, et al. "Automated Explanation of Research Informed Consent by Virtual Agents." Intelligent Virtual Agents. Springer International Publishing, 2015.
A virtual agent that explains research informed consent documents to study volunteers is described, along with a series of development efforts and evaluation studies. A study of nurse administration of informed consent finds that human explanations follow the structure of the document, and that much of information provided verbally is not contained in the document at all. A study of pedagogical strategies used by a virtual consent agent finds that automatic tailoring of document content based on users’ knowledge receives the highest ratings of satisfaction compared to two control conditions that provided fixed amounts of information. We finally report on an approach that lets clinicians construct their own virtual agents for informed consent, along with a study that finds that nurses are able to use the system to develop and extend agents to explain their own study consent forms.
Ring, L., Shi, L., Totzke, K., Bickmore, T. (2014) "Social Support Agents for Older Adults: Longitudinal Affective Computing in the Home", Journal on Multimodal User Interfaces
Loneliness and social isolation are significant problems in older adult populations. We describe the design of a multimodal conversational agent-based system designed to provide longitudinal social support to isolated older adults. Results from a requirements analysis study and a remote “Wizard-of-Oz” study are presented that inform the design of the autonomous social support agent. An exploratory pilot study was conducted in which the agent was placed in the homes of 14 older adults for a week. Results indicate high levels of acceptance and satisfaction of the system. Results also indicate that when the agent proactively draws elders into interactions, triggered by a motion sensor, it is more effective at addressing loneliness than when the agent passively relies upon elders to initiate interactions. We discuss future research opportunities for affective computing to address this important societal problem.
Utami, Dina, et al. "A Conversational Agent-based Clinical Trial Search Engine." Proceedings of the Annual Symposium on Human-Computer Interaction and Information Retrieval (HCIR), Vancouver, BC, Canada, October 2013.
The design and evaluation of a web-based search engine for cancer clinical trials is described. The search task is framed as a conversation with an animated agent in order to make it accessible to individuals with low health and computer literacy. Preliminary evaluation comparing the agent to a conventional keyword-based search interface indicates that the agent is at least as effective as the conventional interface, and users are significantly more satisfied with it.
Ring, L., Bickmore, T.W., & Schulman, D. (2012). Longitudinal Affective Computing - Virtual Agents That Respond to User Mood. IVA.
We present two empirical studies which examine user mood in long-term interaction with virtual conversational agents. The first study finds evidence for mood as a longitudinal construct independent of momentary affect and demonstrates that mood can be reliably identified by human judges observing user-agent interactions. The second study demonstrates that mood is an important consideration for virtual agents designed to persuade users, by showing that favors are more persuasive than direct requests when users are in negative moods, while direct requests are more persuasive for users in positive moods.
Vardoulakis, L.P., Ring, L., Barry, B., Sidner, C.L., Bickmore, T. "Designing Relational Agents as Long Term Social Companions for Older Adults." Intelligent Virtual Agents, 289-302.
Older adults with strong social connections are at a reduced risk for health problems and mortality. We describe two field studies to inform the development of a virtual agent designed to provide long-term, continuous social support to isolated older adults. Findings include the topics that older adults would like to discuss with a companion agent, in addition to overall reactions to interacting with a remote-controlled companion agent installed in their home for a week. Results indicate a generally positive attitude towards companion agents and a rich research agenda for virtual companion agents.