The Search Interaction Optimization Toolkit – The Essence of my PhD Thesis

SIO Toolkit Logo
Logo of the SIO Toolkit.

My PhD thesis introduces a novel methodology named Search Interaction Optimization (SIO) for designing, evaluating, and optimizing search engine results pages (so-called SERPs). As a proof of concept of this new methodology, I've developed a corresponding SIO toolkit, which comprises a total of seven components1 (most of which have already been introduced in previous posts):

  1. Inuit, a new instrument for usability evaluation;
  2. WaPPU, a tool for Usability-based Split Testing;
  3. a catalog of best practices for creating more usable SERPs, which, together with WaPPU and a special add-on, forms
  4. S.O.S., a tool for automatically evaluating and optimizing SERPs;
  5. TellMyRelevance! (TMR), a novel pipeline that predicts the relevance of search results from client-side interactions;
  6. StreamMyRelevance! (SMR), a streaming-based version of TMR that works in real-time rather than batch-wise; and
  7. a set of requirements for current & future search interfaces, which has been derived from an empirical study with German-speaking users.
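To illustrate the idea behind TMR and SMR (item 5 and 6 above), here is a minimal sketch of estimating a search result's relevance from client-side interactions. This is purely illustrative and not the actual TMR pipeline; all event fields, weights, and thresholds are hypothetical assumptions.

```python
# Illustrative sketch (NOT the actual TMR implementation): aggregating
# client-side interaction events for one search result into a relevance
# estimate. Feature names and weights below are hypothetical.

def relevance_score(interactions):
    """Aggregate per-result interaction events into a score in [0, 1].

    `interactions` is a list of dicts, one per user session, e.g.
    {"clicked": True, "dwell_seconds": 42.0, "hovered": True}.
    """
    if not interactions:
        return 0.0
    total = 0.0
    for ev in interactions:
        s = 0.0
        if ev.get("clicked"):
            s += 0.5
        # A long dwell time after a click is a common implicit-relevance signal.
        if ev.get("dwell_seconds", 0) >= 30:
            s += 0.4
        if ev.get("hovered"):
            s += 0.1
        total += min(s, 1.0)  # cap each session's contribution at 1
    return total / len(interactions)

events = [
    {"clicked": True, "dwell_seconds": 45.0, "hovered": True},
    {"clicked": False, "dwell_seconds": 0.0, "hovered": True},
]
print(round(relevance_score(events), 2))
```

A batch version of such an aggregation corresponds to TMR's mode of operation; computing it incrementally as events arrive corresponds to SMR's streaming approach.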
SIO Methodology Logo
Logo of the SIO Methodology.

Describing the design and development of the above components and evaluating their effectiveness and feasibility makes up a major part of my thesis. Now, I've finally managed to organize all of them as GitHub repositories2, which I make available through a new website I have created specifically for my PhD project. In particular, on that site you can filter the components depending on whether you want to design, evaluate, and/or optimize a SERP. It also lists all of the related publications, including links to the corresponding full texts (via ResearchGate). In case you are actually interested in all that fancy research stuff3—have fun browsing, reading & playing around! 🙂

1 The logo of the SIO toolkit features only six tiles because S.O.S. and the catalog of best practices are treated as one component there.
2 Because my PhD project was carried out in cooperation with Unister GmbH (Leipzig), it is unfortunately not possible for me to provide the source code of all components via GitHub, as some contain company secrets.
3 Which I doubt. 😉


S.O.S. Receives Best Paper Honorable Mention Award at CHI ’15

Our paper “S.O.S.: Does Your Search Engine Results Page (SERP) Need Help?”—co-authored with Dr. Andreas Both (Unister) and Prof. Martin Gaedke (TU Chemnitz)—has been awarded a Best Paper Honorable Mention Award by ACM SIGCHI, the Special Interest Group on Computer–Human Interaction of the Association for Computing Machinery. According to Wikipedia, ACM SIGCHI is “the world’s leading organization in Human–Computer Interaction (HCI), and essentially created and defined the field.”1 Our paper will be presented at the 2015 edition of the CHI Conference on Human Factors in Computing Systems2, which is the premier conference in the field of HCI and takes place in Seoul, South Korea.

S.O.S., which is short for “SERP Optimization Suite”, is a tool for determining the usability of a SERP in terms of quantitative scores by analyzing user feedback and interactions. If suboptimal scores are detected for a certain factor of usability (e.g., readability), adjustments are automatically proposed based on a catalog of best practices (e.g., adjusting the font size). The catalog contains sets of potential causes for suboptimal scores and maps them to sets of corresponding countermeasures. The usability scores themselves are determined using WaPPU.
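The catalog mechanism described above can be sketched as a simple lookup from usability factors with suboptimal scores to candidate countermeasures. This is a minimal illustration of the idea only; the factor names, threshold, and measures below are hypothetical and not taken from the actual S.O.S. catalog.

```python
# Hypothetical sketch of a best-practices catalog: map each usability
# factor whose score falls below a threshold to candidate countermeasures.
# All entries and the threshold value are illustrative assumptions.

CATALOG = {
    "readability": ["increase font size", "raise text/background contrast"],
    "informativeness": ["show richer result snippets", "add thumbnails"],
    "layout": ["reduce clutter above the fold", "align result blocks"],
}

def propose_adjustments(scores, threshold=0.6):
    """Return countermeasures for every factor scoring below the threshold."""
    return {
        factor: CATALOG.get(factor, [])
        for factor, score in scores.items()
        if score < threshold
    }

# Only "readability" is below the (assumed) threshold here, so only its
# countermeasures are proposed.
print(propose_adjustments({"readability": 0.4, "layout": 0.9}))
```

In the real tool, the scores fed into such a lookup would come from WaPPU's usability models rather than being supplied by hand.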

S.O.S.’s GitHub repository can be found at It’s free for non-commercial use. Resources and results of the evaluation we describe in our paper are available at

(CC BY trophy icon by