Honeywell Thermostat Redesign: Crack at It #2 and User Testing Thoughts

I’ve just completed user testing for iteration 2 of my Honeywell Thermostat redesign for Matt Franks’ Rapid Iteration class. In my last post (to be read here) I discussed the objective of my redesign, making the thermostat’s interface as intuitively navigable as possible, as well as the method used for creating the design: wireframing. This week I observed that user testing is a direct way to get feedback on how a design functions. I can literally watch someone intuitively (or un-intuitively) try to reach the goal/task I set before them with my thermostat wireframe interface, or fail to reach it, and see the exact place in the path where the design falls apart.

We used a system called “Read Aloud Testing,” where the user speaks aloud each decision they make about which buttons to push and which paths to take to reach their assigned goal as they physically “push” the corresponding button on my paper wireframes. If they “pushed” the correct button for the path I had designed for the given task, I placed the next wireframe in front of them so they could push the next button toward the goal. If they pushed the wrong button, or couldn’t figure out the path, I called “breakdown!” (well, more subtly than that) and took copious notes for the redesign. This process was rewarding for me as the designer because even when the user chose something that didn’t lead to the goal, it showed me directly where the path was unclear or unintuitive. Every design flaw was a win in a strange way, because it gave me the information I needed to redesign. I took notes on breakdowns as the user tested, and then made a chart of where they got stuck so that I could go back and fix it.
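
That chart of breakdowns is easier to reason about if you think of it as a simple log keyed by task and step. Here’s a rough sketch of that idea in code; the task names, step numbers, and field names are hypothetical, just to show the structure:

```python
from collections import defaultdict

# Hypothetical breakdown log: one entry per observed breakdown,
# keyed by (task, step) so repeated failures at the same point stand out.
breakdowns = defaultdict(list)

def log_breakdown(task, step, expected, pressed, note=""):
    """Record where a user left the designed path during a test session."""
    breakdowns[(task, step)].append(
        {"expected": expected, "pressed": pressed, "note": note}
    )

# Illustrative entries, not actual session data:
log_breakdown("Adjust fan", 2, expected="Adjust Fan", pressed="Done",
              note="treated Done as a confirm button")
log_breakdown("Adjust fan", 2, expected="Adjust Fan", pressed="Done",
              note="same confusion, second user")

# Steps where more than one user got stuck are the first redesign targets.
for (task, step), entries in breakdowns.items():
    if len(entries) > 1:
        print(f"{task}, step {step}: {len(entries)} breakdowns")
```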

The most marked flaws are in both the navigation cues and the paths I’ve created. For example, a few users kept pushing the “Done” button to confirm the action of the button they’d just pressed. I had made the miscall of using “Done” as a way to get back to the home screen rather than as a way to confirm an action or decision, which is how users perceived it. In the screens above, for instance, the user would press “Adjust Fan” and then press “Done,” which makes sense if you don’t know that pressing Adjust Fan takes you to another screen while pressing Done takes you to the home screen. So, I’ll be rethinking that.
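
One way to see the mismatch is to write the navigation out as a bare-bones screen map. In the sketch below (the screen and button names are hypothetical stand-ins for my wireframes), “Done” always jumps straight back to Home, which is exactly what surprised users who expected it to confirm the choice they had just made:

```python
# Hypothetical screen map: each screen lists its buttons and the screen
# each button leads to. Names are stand-ins for the actual wireframes.
SCREENS = {
    "Home":            {"Climate Control": "Climate Control"},
    "Climate Control": {"Adjust Fan": "Adjust Fan", "Done": "Home"},
    "Adjust Fan":      {"Fan On": "Adjust Fan", "Fan Auto": "Adjust Fan", "Done": "Home"},
}

def press(screen, button):
    """Return the screen a button press leads to, or None for a breakdown."""
    return SCREENS.get(screen, {}).get(button)

# What users did: press "Adjust Fan", then press "Done" to "confirm" it.
screen = press("Climate Control", "Adjust Fan")  # lands on the Adjust Fan screen
screen = press(screen, "Done")                   # jumps all the way back to Home
print(screen)  # "Home" -- no fan setting was ever confirmed
```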

Also, in trying to be all exciting and new with my naming of the categories you can navigate into (Climate Control instead of Temperature and Fan), I completely confused users about which button to press to reach their goal. So, back to the drawing board on that one.

One big thing that I learned/am thinking about: it is frustrating and confusing for a user to enter an interface and not know where they are in the system. “Going deep,” several screens away from the home screen, feels overwhelming to the user. I personally like to know where I am in an interface in relation to the home screen. In trying to program the schedule of my own thermostat at home, I felt like the more screens I had to navigate through to reach my goal, the more unsure I felt of reaching it, and the more frustrated I felt that perhaps I had just committed to some erroneous, completely tedious, 25-minute task that was not what I wanted. So, I have a new idea for my manual schedule-editing path that I am excited about and working on for this next iteration. It will model some of the features of Photoshop in order to manipulate a day’s schedule. Stay tuned!
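
One way to give the user that sense of place would be something like a breadcrumb that tracks the path back to the home screen. A tiny sketch of the idea, with hypothetical screen names, just to show how depth could be surfaced:

```python
# Hypothetical navigation stack: the screens between Home and where the user is now.
path = ["Home", "Schedule", "Edit Day", "Edit Time Block"]

def breadcrumb(path):
    """Show the user where they are in relation to the home screen."""
    return " > ".join(path)

print(breadcrumb(path))                      # Home > Schedule > Edit Day > Edit Time Block
print(f"{len(path) - 1} screens from Home")  # 3 screens from Home
```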

See full PDF of Iteration 2 Wireframes here