Over the last few weeks we tested on two different levels. We selected 10 people to use the application and asked them to do two things: fill in our SUS questionnaire and give specific feedback about bugs or possible improvements. Both levels gave us useful feedback about what to improve, which we will discuss in this post.
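For readers unfamiliar with the SUS (System Usability Scale) questionnaire: it consists of 10 Likert items (answered 1–5), and the answers are combined into a single 0–100 score with a fixed formula. Below is a minimal sketch of that standard scoring; the participant answers in the example are made up for illustration.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100).

    `responses` is a list of 10 Likert answers (1-5) in questionnaire
    order: odd-numbered items are positively worded, even-numbered
    items negatively worded.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 10 answers in the range 1-5")
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5, 7, 9 (even index) contribute (answer - 1);
        # items 2, 4, 6, 8, 10 (odd index) contribute (5 - answer).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    # Scale the 0-40 raw total to the familiar 0-100 range.
    return total * 2.5

# Example with hypothetical answers from one participant:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # → 85.0
```

Per-participant scores like this are typically averaged over all testers to get a single usability figure for the application.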

Specific Issues and Requests

Some clear bugs that we already knew of but hadn’t found the time to fix were of course pointed out. Besides these minor issues, which we are working on, some other interesting suggestions were made.

First of all, someone indicated that it wasn’t clear what the goal of the application was. Once that became clear, however, he did find the application easy to use. A short explanatory introduction screen could easily solve this issue. …

What we’ll be doing the next few weeks.

The coming week we will finally release Adify publicly. For those who want a sneak peek, it is already online. Due to some technical difficulties (which we will explain in a follow-up blog post), the public release was pushed back a couple of weeks. We did, however, do a closed release with some testing to capture early issues, as you can see here.

This is the planning for next week, and the following paragraphs explain it in more detail. During the coming weeks more blog posts about our progress will follow, specifically covering the goals for each evaluation iteration.

I’ve never gotten around to setting up my thermostat to fit my schedule. Partially because setting it up is not something that can be done easily without a manual, but also because my schedule isn’t as fixed as my thermostat would want it to be. Maybe I should look into buying a smarter thermostat, such as the one Nest is currently launching in Europe.

The thermostat is praised for its very simple user interface. It has some really cool features, such as lighting up only when someone is around and, of course, its learning abilities. All you have to do is adjust the temperature to your liking, and gradually the thermostat will learn when you want the heat turned up or down. It takes a couple of days to learn a new schedule, but that’s no harder than what you would do with a regular thermostat anyhow: just turn the heat up or down. …

But are general interaction technologies lagging behind?

My eye fell on a recent experiment with the Oculus Rift that makes you believe you are someone else. The setup is the following: one test subject wears an Oculus Rift, while a performer wears a chest-mounted camera that follows the head movements of the test subject. …

and evaluating session 3.

Contrary to what most developers do, we started our design by focusing on the UI (which makes sense in a UI course). We had to use paper prototypes: the idea is to sketch the user interface and get quick feedback from test subjects who interact with the prototype. Important note: no fancy drawing, coding, or digital design. If you invest too much in a prototype, you won’t be able to reject (parts of) it as easily.

Adify v0.1 (paper prototype)

So we used some Post-its and quickly assembled our first prototype. Bear in mind that we will be developing Adify as a web application first. You can view our first prototype below. It has a large map on top because we want to focus on the user’s location. On the left are the ads you are currently viewing, and on the right is a list of categories. The menus are colored to match the buttons that activate them. …

Extra motivation to accept @ErikDuval’s application.

So he applied for us, and now we are eagerly awaiting the result. You can find this enthusiasm under the course hashtag #chikul14 (Computer Human Interaction KU Leuven 2014) combined with #ProjectTango. Let me explain why we feel we are viable candidates.

First of all, we are 38 highly motivated master’s students in Computer Science. We span different subfields within CS and are thus a complementary, heterogeneous group, which benefits creativity and means we can look at projects from multiple angles. This should already be a great cocktail for an innovative project like #ProjectTango. …

A closer look at the usability of the popular RSS reader

The first assignment for #chikul14 was to evaluate the usability of the recently booming RSS reader Feedly. We chose to evaluate the web application, though the mobile application(s) were also an option. We took a quick look in the first session and discussed our findings with the entire class. With this as a starting point, we took a second look and came up with the following results.

To evaluate the usability, we will primarily be looking at the bold terms in the following ISO definition:

The effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments. …

Dieter Castel
