The Design Philosophy Behind My New Website

TL;DR: After analyzing my old website, I decided to put more emphasis on (1) identifying and highlighting the pieces of information that are actually useful and (2) a two-dimensional approach to displaying my CV along traditional categories and skills/topics. Moreover, I set myself design constraints that forced me to keep my new website as clean and simple as possible, following the design philosophies of brutalism and Mies van der Rohe.

My new website has been up and running for a few weeks now, and I suppose it might seem unusual to some of you. Therefore, in this post, I want to explain how I ended up with what I did. What motivated me to create a new website was that Red Hat shut down OpenShift v2, and the new version of their hosting service was so unsatisfying that I decided to move to GitHub Pages instead. This, however, also meant that I could no longer build on a Node server, so I took the opportunity to start over with a blank slate and create something completely different.

Core Questions

As a first step, I analyzed my old website and noticed two things. First, it was more or less structured exactly like my CV, which is, to be honest, not the most creative way to lay out an online presence. Second, my publications were listed in the same format I would use in the reference section of an actual research paper. That format, however, is only really understandable and readable for scientists, which—as I could safely assume—excluded a fair share of my visitors. After all, the only necessary pieces of information are the title (plus a link for those who want to have a closer look), the list of authors, and which topics the paper is actually about. From a title like “Ensuring Web Interface Quality through Usability-based Split Testing” one may conclude that I’ve done work on usability, but how would you know that this publication also addresses analytics and machine learning without reading it? The acronym of the conference where it was published or the CORE rating of that conference certainly doesn’t help the average visitor.

MaxSpeicher.com v1

Therefore, the two central questions that informed the design of my new website were:

  • How can I present my work and skills in a better, more useful, and more memorable way than the standard paper CV structure—projects, work, education, etc.?
  • How can I more effectively communicate to people the topics I actually worked on and which skills I acquired rather than just telling them that I was an “Intern R&D at Unister”?

Design Constraints

Additionally, I set myself the following design constraints to ensure I had to come up with something completely different and create a novel, unusual, and more memorable experience. The main drivers for these constraints were my love for minimalism and a desire to prevent unnecessary overhead as much as possible.

  • Make it brutalist. Originally, brutalism was an architectural movement known for “its ruggedness and lack of concern to look comfortable or easy”. Accordingly, brutalist websites have a rough, simple, and unfinished look. They are almost nihilistic towards user experience.
  • Make it as simple as possible. I chose this constraint in accordance with Mies van der Rohe’s architectural philosophy of “extreme clarity and simplicity”. He made his buildings purely functional and refrained from using anything ornamental.
  • Don’t use icons or images. Standalone icons are a bad idea in most cases anyway (as this nice summary explains). This led me to use small pieces of text with a solid, sharp-edged border in the places where my old website used icons for social media sites and its different sections.
  • Use only one primary color. The rest must be kept in black, white, and gray.
  • Work with typography as much as possible. Apart from the one primary color and white space, I tried to use only different font sizes and weights for structuring and highlighting information.

Finally, I came up with and realized a concept that is largely based on hashtags to communicate my skills and the topics I’ve worked on. Every CV entry on my website—be it a university degree, a job, or a publication—is annotated with a set of such tags. The entry about “Ensuring Web Interface Quality through Usability-based Split Testing” now tells the visitor that the paper is about #analytics and #machine learning, among other things. At the top of the page, I feature a list of skills that enables visitors to filter the page and hide everything that’s not related to a specific skill or topic they’re interested in. Moreover, I chose a two-dimensional approach to presenting my CV: the visitor can display it either according to the traditional structure of a CV or grouped by skill/topic. In the latter case, all CV entries that feature a certain tag are displayed in the corresponding section, so they can be viewed at a glance.
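To give you an idea of how little is needed for such tag-based filtering, here is a minimal sketch in TypeScript. The class names and data attributes are hypothetical and not taken from the actual site; it only illustrates the general mechanism of hiding every CV entry that doesn’t carry the selected tag.

```typescript
// Minimal sketch of tag-based CV filtering. Hypothetical markup:
// <article class="cv-entry" data-tags="analytics machine-learning">…</article>

function filterEntriesByTag(tag: string | null): void {
  const entries = document.querySelectorAll<HTMLElement>('.cv-entry');
  entries.forEach((entry) => {
    const tags = (entry.dataset.tags ?? '').split(/\s+/);
    // Show everything when no tag is selected; otherwise hide non-matching entries.
    entry.hidden = tag !== null && !tags.includes(tag);
  });
}

// Wire up the skill list at the top of the page (again, hypothetical markup).
document.querySelectorAll<HTMLElement>('.skill-filter').forEach((button) => {
  button.addEventListener('click', () => filterEntriesByTag(button.dataset.tag ?? null));
});
```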

Technology-wise, my new website is based on standard web technologies, Less, and gulp.
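For those curious about the build setup, a gulpfile along the following lines would compile the Less sources into plain CSS. This is only a sketch, assuming gulp 4, the gulp-less plugin, and ts-node for a TypeScript gulpfile; the file paths are hypothetical.

```typescript
// gulpfile.ts — minimal sketch (assumes gulp 4, gulp-less, ts-node; paths are hypothetical)
import gulp from 'gulp';
import less from 'gulp-less';

// Compile all Less sources into plain CSS.
export function styles() {
  return gulp.src('src/styles/*.less')
    .pipe(less())
    .pipe(gulp.dest('dist/css'));
}

// Recompile whenever a Less file changes.
export function watch() {
  gulp.watch('src/styles/*.less', styles);
}

export default gulp.series(styles, watch);
```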

MaxSpeicher.com v2

Why I Don’t User Test My New Website

I asked some friends to have a look at what I created, and they immediately came up with the idea of a sticky menu—as they are used to from other websites—so that they wouldn’t have to go back to the top of the page (using the #top button in the bottom right corner) whenever they wanted to change the current filter or the way the CV is displayed. I started implementing this supposed improvement yesterday, but became more and more dissatisfied the further I progressed. While a sticky menu would have made the website slightly more usable (sparing users a click from time to time), it would violate the constraints that define my design and, in my opinion, make my website less unique. After all, the average user prefers being confronted with and using things they already know. Therefore, I abandoned the idea and did not deploy the changes.

In this sense, the design of my new website is clearly not 100% user-centered, but rather an experimental piece of art.

Did I manage to intrigue you? Feel free to have a look at http://www.maxspeicher.com/v2.


How to Data Analytics (in a Start-up)

3 Lessons I Learned as the “Chief Data Analyst” of a Silicon Valley–funded Start-up

From 2015 to 2017, I helped grow HoloBuilder Inc., a start-up providing virtual reality solutions for the construction industry, as their VP of Customer & Data Analytics & Optimization, which roughly translates to “Chief Data Analyst”. The company is headquartered in San Francisco, while I was part of their R&D lab in Aachen, Germany. I was responsible for the whole data analytics* pipeline — from collecting data on the web platform using Google Analytics and our own trackers to processing the data in Google BigQuery and visualizing it with tools like Power BI and Klipfolio. During my time in Aachen, I learned many valuable lessons. Here, I want to share the three most important ones that are directly concerned with data analytics (please scroll down for a TL;DR).
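As a rough illustration of the collection end of that pipeline, a client-side tracking helper could look something like the sketch below. The event names and the endpoint are hypothetical, and the ga() call assumes the classic analytics.js snippet is already loaded on the page; events sent to our own collector could later be loaded into BigQuery for processing.

```typescript
// Minimal sketch of a client-side tracking helper (hypothetical endpoint and event names).
declare const ga: (command: string, ...fields: unknown[]) => void; // provided by analytics.js

interface TrackedEvent {
  category: string;   // e.g. "project"
  action: string;     // e.g. "created"
  label?: string;     // optional free-form context
  timestamp: string;
}

function trackEvent(category: string, action: string, label?: string): void {
  const event: TrackedEvent = { category, action, label, timestamp: new Date().toISOString() };

  // 1) Forward to Google Analytics (classic analytics.js event tracking).
  ga('send', 'event', category, action, label);

  // 2) Forward to our own collector, from where the raw events can be imported into BigQuery.
  navigator.sendBeacon('/api/track', JSON.stringify(event));
}

// Example usage: record that a user created a new project.
trackEvent('project', 'created', 'from-template');
```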

How to Data Analytics

1. Data Analytics ≈ UX Design

Data analytics is a lot like UX design. You have specific target audiences that expect to experience what you provide them with in an optimal way — concerning content, presentation, and possibly interactivity. For instance, providing data for C-level management and for potential investors are two completely different stories. While management requires, among other things, low-level insights concerning the software itself, for VCs we usually prepared more high-level business metrics, including projections and forecasts. Moreover, internal data would usually be provided through dynamic dashboards that could be adjusted and customized, while data for investors would rather be delivered in the form of PowerPoint slides that matched the layout of the pitch deck. Therefore, it is crucial to have the definition of a target audience (potentially even personas) and a requirements elicitation from that audience at the beginning of every data science process. At HoloBuilder Inc., this lesson became especially clear because of the split between San Francisco and Germany and the fact that most of the (potential) VCs resided in Silicon Valley.

I am convinced that a data analyst without some proper UX skills — and, of course, adequate requirements and input — cannot be successful.

2. Ask the Right Questions — and Do So Early

This one goes hand in hand with requirements elicitation. Don’t provide analyses just for the sake of it!

This whole “let’s just analyze everything we can get” approach doesn’t work! It’s extremely important to define the questions you intend to answer beforehand. Tracking is cheap, so you can (and should!) track more than you need at the moment. But the processing and visualization of data that nobody ever looks at eats up a whole lot of resources that would be better spent on the meaningful analysis and presentation of the few nuggets buried in your giant pile of big data. Also, having concrete questions in mind greatly helps with tailoring data structures more precisely to your specific needs. Of course, your infrastructure still has to be flexible enough to quickly react to new and changing questions that need to be answered. In an optimal world, the data for answering new questions is already there and you “just” have to do the processing and visualization. In general: expect surprise on-demand questions at any time, so anticipate and be prepared!

(While the questions that need to be answered can be seen as part of the requirements elicitation, I treat them separately here, because I give requirements a more technical connotation — e.g., “possibility to toggle between line/bar charts” or “include difference to previous period in %” — compared to key questions such as “Why do we lose users?”.)

3. Data is Meaningless …

… unless you give it meaning by interpreting it. For this, it’s inherently important to not think in silos. A data analytics team has to closely cooperate with the UX team and (almost) all other teams in the company in order to find meaningful interpretations or reasons for the collected data. Yet, this is still not the norm in industry. For instance, there is still the widely believed misconception that A/B testing = usability testing.

To ensure meaningful data analytics, at HoloBuilder Inc., marketing manager Harry Handorf and I developed a boilerplate for a weekly KPI report that posed three crucial questions:

  1. Which data did we collect?
  2. What are the reasons the data looks like that?
  3. What actions must/should be taken based on the above?

That is, the first part delivered the hard facts; the second part explained those numbers (e.g., fewer sign-ups due to a change in the UI); and the third part presented concrete calls to action (e.g., undo the UI change). The report looked at those questions from the platform perspective as well as from the marketing perspective. Therefore, we had to collaborate extensively with software engineers, designers, UX people, marketing, and sales to find meaningful answers. In line with the second lesson above, the basis of the report was always a set of higher-level questions defined beforehand, such as: “Does the new tutorial work?”, “How can we gain more customers?”, and “Have we reached our target growth?”. In general, the interpretation of data is based on the processed data and the questions to be answered, rather than on technical requirements (see infographic above).
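To make the structure of that boilerplate a bit more concrete, it could be modeled roughly as follows. The type names and fields are my own hypothetical rendering of the report, not the actual template we used.

```typescript
// Hypothetical model of the weekly KPI report boilerplate described above.
interface ReportSection {
  observations: string[];     // 1. Which data did we collect?
  interpretations: string[];  // 2. What are the reasons the data looks like that?
  actions: string[];          // 3. What actions must/should be taken based on the above?
}

interface WeeklyKpiReport {
  week: string;               // e.g. "2016-W42"
  guidingQuestions: string[]; // e.g. "Does the new tutorial work?"
  platform: ReportSection;    // platform perspective
  marketing: ReportSection;   // marketing perspective
}
```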

Again, because this is really important: Your data is worth nothing without proper interpretation and input from outside the data analytics department.

To conclude this article, I don’t want to withhold Harry’s take on the topic from you:

“You might have heard of the metaphor for life feeling like a tornado. It perfectly applies to working with data of a young business — it spins you around with all of its metrics, data points and things you COULD measure. It’s noisy and wild. A good data scientist figures out how to step out of it. But that does not mean getting out of the tornado completely, letting it do its thing and becoming a passive spectator. It means getting inward, to the eye, where silence and clarity allow for a better picture of what’s going on around you, defining appropriate KPIs and asking the right, well-thought-out questions.”
—Harry Handorf (tornado tamer)

TL;DR

  1. Data analytics is a lot like UX design! As a data analyst, you have to define target audiences and elicit requirements. Tailor content & presentation of your analyses to those.
  2. Define the questions to be answered beforehand, then process and interpret the data necessary to answer those questions. Don’t analyze everything you can just for the sake of it.
  3. Data is meaningless without interpretation. Extensively collaborate with other departments — especially UX — to ensure meaningful data analytics.

(This article has also been published in Startups.co on Medium.)

Footnotes

* What we did at HoloBuilder Inc. was clearly a mix of data analytics and data science. But since it was closer to the analytics part, I refer to it as data analytics in this article. In case you are interested in the specific differences between the two (and how difficult it is to tell them apart), I recommend reading the Wikipedia articles about data science and data analytics, as well as “Data Analytics vs Data Science: Two Separate, but Interconnected Disciplines” by Jerry A. Smith.

Acknowledgments

Special thanks go to Harry for proofreading the article & his valuable input.

Schrödinger’s Website

Before you receive feedback from users, the user experience and usability of your website are both ‘good’ and ‘bad’ at the same time.* Through a good design process you can only raise the chances of user experience and usability manifesting as ‘good’** once the feedback arrives.

* That is, the factors $x$ and $y$ of a linear combination $U = x\,\mathrm{good} + y\,\mathrm{bad}$ with $x + y = 1$ are unknown.
** Subject to definition. For instance, $x > y$ or $x \geq \frac{2}{3}$.

The U Score: Redesigning Usability Testing

Usability testing is often perceived as cumbersome and time-consuming and is therefore not applied thoroughly. This was one of the key observations leading to the topic of my PhD thesis. In particular, conducting tests with actual users is often omitted, which results in the release of suboptimal products and websites. In my thesis, I tackle this problem through more automatic evaluation and optimization, albeit in the specific context of search engines. Yet, every type of website—no matter whether private or professional—should undergo at least one usability test before its release. Therefore, we need to redesign usability testing itself:

  • It must be quicker.
  • It must be cheaper.
  • It must be easier to understand.
  • Still, the result must be as precise as possible.


The U Score is a more general derivative of the findings of my PhD project that provides quick and precise usability evaluation for everyone, based on actual research. Any designer or developer who isn’t able to conduct a regular usability test can answer a minimal but exhaustive set of yes/no questions and receive a single usability score for their website or web app. The questions have been designed to be as objective as possible and are based on established research findings. Also, to save time, I try to minimize the need to involve other people, although it cannot be eliminated completely (still, you can obtain a complete U Score with the help of only three friends who have a look at your site).
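The article doesn’t spell out the scoring formula, but conceptually it boils down to turning a fixed set of yes/no answers into a single number. The sketch below assumes an unweighted percentage of “yes” answers purely for illustration; the real U Score questions and weighting may differ.

```typescript
// Illustrative only: the real U Score questions and weighting may differ.
interface Answer {
  question: string;
  yes: boolean;
}

// Map a set of yes/no answers to a single 0–100 usability score
// (here simply the share of "yes" answers, scaled to 100).
function computeUScore(answers: Answer[]): number {
  if (answers.length === 0) return 0;
  const positive = answers.filter((a) => a.yes).length;
  return Math.round((positive / answers.length) * 100);
}

// Example usage with two made-up questions.
const score = computeUScore([
  { question: 'Does every page have a descriptive title?', yes: true },
  { question: 'Can all key tasks be completed without errors?', yes: false },
]);
console.log(`U Score: ${score}`); // U Score: 50
```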

In this way, the U Score provides an approach to usability testing that is as precise as possible given the minimal effort it requires. It’s intended for situations in which designers/developers don’t have the chance to conduct a traditional usability test. Also, it addresses everyone who needs a quick assessment, has never tested the usability of a website before or is new to usability testing. However, please note that the U Score can only be an approximation and is not a complete substitute for established usability testing methods. Still, it gives you a very good baseline according to the motto: Any usability test is better than no usability test!

The current version of the U Score is still in beta development status. Therefore, I highly appreciate your feedback, which you can add to this public Trello board.

For implementation, I’ve relied on a number of well-known technologies and frameworks in combination with some that were new to me (the ones marked with an asterisk):

I hope the U Score can help reduce the number of websites that have never been tested at all and, in particular, help you conduct more and quicker usability tests. Enjoy! 🙂

Lean Support: The Case of HoloBuilder

When I started working on HoloBuilder.com over a year ago, there was no support. Of course, we would’ve immediately helped anyone who sent us a question via e-mail or Twitter, but those options weren’t communicated anywhere. Users accessing HoloBuilder ended up directly in our augmented/virtual reality creator, the only way to communicate with us being a “leave feedback” option indicated by a line chart icon. However, “leaving feedback” is definitely not the same as “getting support” or “getting help”. Thus, following my UX rule #1, support functionality was de facto nonexistent. Also, we had neither a knowledge base nor a collection of FAQs, there was no support personnel, and user feedback was still pretty rare.

Hence, given the limited resources in a start-up, we decided on a lean support approach. That is, we rolled out HoloBuilder support in small pieces, treating every stage like a minimum viable product (this is also what Nate Munger describes on Quora).

Making Support Visible

First, we added a “?” icon next to the feedback option, which was visible at all times and clearly showed users that they could get help. Since there were no FAQs available yet, clicking the new support option simply instructed the user to send their question via e-mail or Twitter. As you can see, we changed nothing about our support back-end, but we finally made the possibility to get help visible to the user, which is already a huge gain. This was confirmed by an increasing number of relevant support requests reaching us in the form of e-mails and tweets.

Growing a Knowledge Base on Demand

Second, to continually grow a knowledge base, we created a blog for development news and tutorials, which can be found at createholo.com. In that blog, we publish solutions to (potential) problems on demand, i.e., when users get back to us with questions or when new HoloBuilder features are released. CreateHolo was then linked as “Tutorials” on our new landing pages, which we created to provide users with some introductory information and help before confronting them with the AR/VR creator itself. However, through heat map analyses, we found that the tutorials gained more attention after we changed the link text to “Help & Tutorials”. This indicates that “help” is still the major keyword when it comes to support.

Adding Channels

In the next step, we integrated a tawk.to live chat into our HoloBuilder pricing page as soon as it went live in January 2016. In this way, we established a third feedback channel in our support back-end with almost all of our employees acting as support agents, thus providing a more direct and personal connection to users with urgent questions. Since our experiences with the live chat feature have been consistently positive so far, we plan to extend it to our different landing pages as well in the future.
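For context, integrating such a chat widget usually amounts to injecting the provider’s embed script on the relevant page. The snippet below is only a generic sketch with a placeholder embed URL and path check; the exact embed code is generated in the tawk.to dashboard.

```typescript
// Generic sketch: load a third-party chat widget only on the pricing page.
// The embed URL is a placeholder; the real one comes from the tawk.to dashboard.
function loadChatWidget(embedUrl: string): void {
  const script = document.createElement('script');
  script.async = true;
  script.src = embedUrl;
  document.head.appendChild(script);
}

if (window.location.pathname.startsWith('/pricing')) {
  loadChatWidget('https://embed.tawk.to/PROPERTY_ID/default'); // placeholder IDs
}
```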

Collecting & Organizing Feedback

Finally, all questions and pieces of feedback we receive through our three support channels—e-mail, Twitter, and live chat—are collected and organized in a dedicated Trello board. Based on that board, we regularly decide on feature requests and tutorials to be written, and we continuously grow the aforementioned knowledge base. The FAQs collected in that knowledge base also serve as a list of to-dos for our internal UX team. Some particularly crucial FAQs are already featured on the HoloBuilder pricing page, and interaction with those FAQs is tracked anonymously to find out what users struggle with most. Interestingly, interactions do not seem to be overly influenced by position bias, since FAQ #8 at the very bottom receives the second-highest attention in terms of clicks.
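The anonymous FAQ tracking mentioned above could be done along the following lines. The class name, data attribute, and endpoint are hypothetical; the point is that only the question’s position or id is recorded, never a user identifier.

```typescript
// Minimal sketch of anonymous FAQ click tracking (hypothetical markup and endpoint).
document.querySelectorAll<HTMLElement>('.faq-item').forEach((item, index) => {
  item.addEventListener('click', () => {
    // Only the question's position/id is recorded — no user identifiers.
    const payload = JSON.stringify({ faq: item.dataset.faqId ?? String(index + 1) });
    navigator.sendBeacon('/api/faq-click', payload);
  });
});
```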

To conclude, by following a lean support approach, we have established a well-working process and a convincing customer success rate within a year, without additional resources or personnel. The next big milestone of our process of implementing lean support for HoloBuilder will be the release of our knowledge base, so that we can provide an on-page support experience that’s just as awesome as MailChimp’s (disclaimer: I’m a fan).

This article is dedicated to Anna, our awesome support ninja.