Copyright 2005-2012 by Philip Kortum.  All rights reserved.
Research
Philip Kortum
Rice University

Voting System Usability
Voting is an essential activity in our democratic system in the United States. Before the Florida voting problem in 2000 (with the infamous butterfly ballot), very little research had been done to assess the usability of our nation's voting systems. We are now beginning to examine various aspects of voting systems, trying to understand how users interact with these systems, what kinds of errors they make, and how we might design secure voting systems that are highly usable both by the general population and by voters with physical disabilities. We are examining issues such as 1) the usability of smartphone-based voting systems, 2) the usability of end-to-end cryptographic voting systems, 3) audio voting interfaces to support voters with visual impairments, and 4) the impact of polling station physical attributes on voter behaviors, just to name a few.

We are also currently working with voting officials in the State of Texas to develop a highly secure, easy-to-use, voter-verifiable voting platform. Below are a few recent publications in this area.
  • Acemyan, C. Z., Kortum, P., Byrne, M. D., & Wallach, D. S. (2015). Users’ mental models for three end-to-end voting systems: Helios, Pret a Voter, and Scantegrity II. Human Aspects of Information Security, Privacy, and Trust - Lecture Notes in Computer Science, 9190, 463-474.
  • Acemyan, C. Z., Kortum, P., Byrne, M. D., & Wallach, D. (2014). Usability of voter verifiable end-to-end voting systems: Baseline data for Helios, Pret a Voter, and Scantegrity II. Journal of Election Technology and Systems, 2(3), 26-56.
  • Campbell, B., Tossell, C. C., Byrne, M. D., & Kortum, P. (2014). Towards more usable electronic voting: Testing the usability of a smartphone voting system. Human Factors, 56(5), 973-985.

Measuring the Usability of Products and Services
One of the biggest questions that arises in industry is determining how usable a new product or service is before it is launched to the public. We have been collecting data using Brooke's System Usability Scale (SUS) for over a decade now, reporting on what usability scores look like across a wide range of products. These data allow usability practitioners to benchmark their results against a varied set of products and services, including telephones, television set-top boxes, interactive voice services, wireless phones, PDAs, and various software applications. We have also been working to develop metrics that allow practitioners to more easily communicate these scores across product development teams, and to determine some of the factors, such as user experience and task success rates, that might impact those ratings. Some recent publications:
  • Kortum, P. and Bangor, A. (2015). Usability Ratings for Everyday Products Measured With the System Usability Scale (SUS). International Journal of Human-Computer Interaction, 31, 518-529.
  • Bangor, A., Kortum, P. and Miller, J.A. (2009) Determining what individual SUS scores mean: adding an adjective rating scale. Journal of Usability Studies, 4(3).
  • Bangor, A., Kortum, P. and Miller, J.A. (2008) The System Usability Scale (SUS): An Empirical Evaluation. International Journal of Human-Computer Interaction, 24(6).
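Brooke's standard SUS scoring procedure is simple enough to express directly in code. The sketch below follows the published scoring rules (odd-numbered, positively worded items contribute their response minus 1; even-numbered, negatively worded items contribute 5 minus their response; the 0-40 raw total is multiplied by 2.5 to land on the familiar 0-100 scale). The function name is illustrative, not from any particular toolkit.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses,
    given in questionnaire order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses, each between 1 and 5")
    # Odd-numbered items (index 0, 2, ...): positively worded, score = r - 1.
    # Even-numbered items (index 1, 3, ...): negatively worded, score = 5 - r.
    total = sum(r - 1 if i % 2 == 0 else 5 - r
                for i, r in enumerate(responses))
    # Scale the 0-40 raw total to the 0-100 reporting range.
    return total * 2.5

# A uniformly neutral questionnaire (all 3s) scores exactly 50.
print(sus_score([3] * 10))  # → 50.0
```

Note that the resulting number is not a percentage, which is one reason interpretive aids such as the adjective rating scale in the Bangor, Kortum, and Miller (2009) paper above are useful when communicating scores to product teams.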

Human Factors of Wireless Mobile Computing
Wireless mobile computing is a relatively recent phenomenon that has gained increasing penetration in the United States. One aspect of this growth that remains unexplored is a greater understanding of how people actually use this new wireless mobile computing resource. In the recent past, users who needed internet connectivity were forced to use laptop computers that, while portable, could not conveniently be carried everywhere they went. With the advent of the smartphone, users now have always-on access to the internet via traditional web browsing and a host of dedicated applications. We hope to gain a better understanding of how users have integrated this ubiquitous computing device into their lives and how they use it over extended periods of time. To this end, we recently completed a year-long longitudinal study in which we measured multiple aspects of smartphone use. Some recent publications describing this research:
  • Kortum, P., & Sorber, M. (2015). Measuring the usability of mobile applications for phones and tablets. International Journal of Human-Computer Interaction, 31, 518-529.
  • Tossell, C. C., Kortum, P., Shepard, C. W., Rahmati, A., & Zhong, L. (2015). You can lead a horse to water but you cannot make him learn: Smartphone use in higher education. British Journal of Educational Technology, 46(4), 713-724.
  • Tossell, C.C., Kortum, P., Rahmati, A., Shepard, C.W., & Zhong, L. (2012). Characterizing web use on smartphones.  Human Factors in Computing Systems: Proceedings of CHI 2012, (pp 2769-2778).  New York: Association for Computing Machinery.
  • Tossell, C.C., Kortum, P., Shepard, C.W., Rahmati, A., Barg-Walkow, L. & Zhong, L. (2012). A Longitudinal Study of Emoticon Use in Text Messaging from Smartphones. Computers in Human Behavior, 28. 659-663.

Web Use and Navigation
We are currently investigating several lines of related research in this area. The first of these is concerned with how users deal with changes to web pages: specifically, how do users cope when navigation structures change over the course of several visits? Using behavioral and eye-tracking techniques, we are trying to understand how these kinds of changes impact user performance. We are also interested in how people use the web to find specific information that is relevant to them, and how they determine the 'goodness' of that information. Toward this end, we have been examining how teen users search for and evaluate medical information on the web. Below are some recent publications in these areas:
  • Zemla, J. C., Tossell, C. C., Kortum, P., & Byrne, M. D. (2015). A Bayesian approach to predicting website revisitation on mobile phones. International Journal of Human-Computer Studies, 83, 43-50.
  • Scharff, L.V.F. & Kortum, P. (2009). When Links Change: How Additions and Deletions of Single Navigation Links Affect User Performance. Journal of Usability Studies, 5(1), 8-20. 
  • Kortum, P., Edwards, C. and Richards-Kortum, R. (2008). The Impact of Inaccurate Internet Health Information in a Secondary School Learning Environment. Journal of Medical Internet Research, 10(2): e17
  • Grier, R., Kortum, P. and Miller, J. (2006). How users view web pages: An exploration of cognitive and perceptual mechanisms. In Zaphiris, P. and Kurniawan, S. (Eds.), Human Computer Interaction Research in Web Design and Evaluation. Hershey, PA: Idea Group.

I am interested in a number of applied human factors issues, as described in the brief summaries of my ongoing work above.
If you are a potential graduate student and find one of these research areas interesting, please don't hesitate to give me a call to discuss them in more detail!