Our paper “S.O.S.: Does Your Search Engine Results Page (SERP) Need Help?”—co-authored with Dr. Andreas Both (Unister) and Prof. Martin Gaedke (TU Chemnitz)—has been awarded a Best Paper Honorable Mention Award by ACM SIGCHI, the Special Interest Group on Computer–Human Interaction of the Association for Computing Machinery. According to Wikipedia, ACM SIGCHI is “the world’s leading organization in Human–Computer Interaction (HCI), and essentially created and defined the field.” Our paper will be presented at the 2015 edition of the CHI Conference on Human Factors in Computing Systems, which is the premier conference in the field of HCI and takes place in Seoul, South Korea.
S.O.S., which is short for “SERP Optimization Suite”, is a tool for assessing the usability of a SERP in terms of quantitative scores by analyzing user feedback and interactions. If a suboptimal score is detected for a certain factor of usability (e.g., readability), adjustments are automatically proposed based on a catalog of best practices (e.g., adjusting the font size, among others). The catalog contains sets of potential causes for suboptimal scores and maps them to sets of corresponding countermeasures. The usability scores themselves are determined using WaPPU.
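To give an intuition for the cause-to-countermeasure mapping, here is a minimal sketch in Python. The factor names, causes, countermeasures, and the threshold are illustrative assumptions, not the actual contents of the S.O.S. catalog:

```python
# Hypothetical sketch of a best-practice catalog lookup; all entries and the
# threshold below are illustrative, not taken from the actual S.O.S. catalog.
CATALOG = {
    "readability": {
        "font size too small": ["increase font size"],
        "low contrast": ["increase text/background contrast"],
    },
    "informativeness": {
        "snippets too short": ["lengthen result snippets"],
    },
}

def propose_adjustments(scores, threshold=0.5):
    """Map each usability factor with a suboptimal score to countermeasures."""
    proposals = {}
    for factor, score in scores.items():
        if score < threshold and factor in CATALOG:
            # Collect the countermeasures for all potential causes of the factor.
            proposals[factor] = sorted(
                measure
                for measures in CATALOG[factor].values()
                for measure in measures
            )
    return proposals

print(propose_adjustments({"readability": 0.3, "informativeness": 0.8}))
# → {'readability': ['increase font size', 'increase text/background contrast']}
```

In this sketch, only factors scoring below the threshold trigger proposals, mirroring the idea that adjustments are suggested automatically once a suboptimal score is detected.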
Phew! It was a rather busy (that’s why I haven’t been posting in a while), but also very successful start to the new year. A total of three full papers have been accepted at various conferences and journals. So basically, I’ve been revising and resubmitting papers since Christmas.
First, our paper about Inuit has been accepted at the 4th International Conference on Design, User Experience and Usability (DUXU), which will be held as a part of HCI International 2015 in Los Angeles, California. Inuit is a new usability instrument for interfaces that has been specifically designed for our concept of Usability-based Split Testing. An instrument of this kind contains a set of observable items that are used to predict a latent (i.e., non-observable) variable—in our case, usability. For instance, a person’s intelligence is a latent variable that can only be assessed with a number of specific questions (or items). Therefore, IQ tests are instruments.
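The relationship between observable items and a latent variable can be sketched in a few lines of Python. The item names and the simple mean-based aggregation below are illustrative assumptions, not Inuit’s actual items or scoring method:

```python
# Illustrative sketch (not Inuit's actual items or scoring): an instrument
# aggregates ratings on observable items into an estimate of a latent variable.
ITEMS = ["informativeness", "understandability", "readability"]  # hypothetical items

def usability_score(ratings):
    """Estimate the latent usability variable as the mean of item ratings in [0, 1]."""
    values = [ratings[item] for item in ITEMS]
    return sum(values) / len(values)

score = usability_score({
    "informativeness": 0.8,
    "understandability": 0.6,
    "readability": 0.7,
})
print(round(score, 2))  # → 0.7
```

The point of the sketch is only that the latent variable is never measured directly: it is inferred from the answers to the concrete, observable items, just as an IQ score is inferred from the answers to test questions.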
Second, an article that is an extended version of our ICWE 2014 paper about SMR has been conditionally accepted by the Journal of Web Engineering (JWE). SMR is a streaming-based system that allows for the prediction of search result relevance from user interactions. In the extended version, we further elaborate on specifics of SMR’s data processing algorithm and complexity. Also, we describe the integration of our system into a real-world industry setting.
Finally—and probably most importantly—our paper titled “S.O.S.: Does Your Search Engine Results Page (SERP) Need Help?” has been accepted at CHI 2015, which is the premier conference on human–computer interaction and will take place in Seoul! What a great success! 🙂 S.O.S. is the abbreviation for SERP Optimization Suite, which comprises two components: (1) WaPPU, a tool for inferring usability scores from users’ interactions with an interface, and (2) a catalog of best practices. WaPPU was already presented at ICWE 2014 and has been augmented with the catalog to form S.O.S. The catalog contains potential causes and corresponding countermeasures for suboptimal usability scores. An extension to WaPPU now automatically detects such suboptimal scores and proposes optimizations based on the catalog.
I am very excited about these accepted papers and definitely looking forward to presenting them to an audience of world-renowned experts. As a side note, a revised and extended version of my post titled What is ›Usability‹? has been published as a technical report in the series “Chemnitzer Informatik-Berichte” (roughly translated: “Computer Science Reports of Chemnitz University of Technology”).
So after this very successful start of the year, let’s see what else 2015 will bring. Stay tuned! 🙂