As noted in another post, I got an IPEVO document camera to use in usability studies. I also used it in a GoToMeeting conference call with a client the other day.
We had been doing some quick prototyping at his office, using the whiteboard. We didn’t have time to get together for the next meeting, but I connected the document camera and used it as the webcam input to GoToMeeting so he could see what I was sketching. I put a printout of the latest Balsamiq mockup under the camera and marked it up as we were talking.
Aside from impressing the client, it was a handy way to work together remotely.
What tools and tricks do you use for remote collaboration?
I was visiting someone recently who has severe vision and hearing problems. The number and types of assistive devices that he has is amazing. Most of them came from the US Department of Veterans Affairs.
Memory takes the place of vision with some of these devices: each has its own controls and layout to remember. It’s quite amazing to watch someone use them all.
The ScripTalk Station reads prescription labels aloud. It gets information from an RFID chip in the label, so the medication has to come from a pharmacy that uses these labels (like the VA).
An occupational therapist provided some interesting tips: put a rubber band at the top of a bottle for morning doses, and at the bottom for evening. Turn the bottle over after taking the meds, then reset it the next day.
As it says on the handset, this is a LOUD telephone. I had to turn it way down to use it. The buttons are large enough for many low-vision people (and I blurred out the names on the top of the phone).
Williams Sound makes SoundPlus TV listening devices like this. The base connects to the television, and sends the sound via infrared signals to the receiver, which the listener wears.
Sound on the television can be at a regular level for people without hearing problems, and the wearer can adjust the volume on the receiver.
Two reading devices: On the left is a magnifier; you can see the corner of a yellow page under the screen. This is good for simple documents.
On the right is the Extreme Reader by Second Vision. It does text-to-speech conversion and reads documents out loud. You can see a newspaper in the device, and the simple control panel with four big buttons. It’s obviously slower than reading on your own, but it’s an amazing thing to have when you can’t.
This HealthSmart blood pressure cuff speaks instructions, measures blood pressure and then speaks the results. It provides a general diagnosis (“According to World Health Organization recommendations…”).
I saw a similar device for measuring blood glucose levels.
Not everything is electronic. Large-print calendars like this are very helpful. One source for them is LS&S, “the catalog of products for the visually impaired and hard of hearing”. Other simple accommodations include small velcro strips on washing machines and dishwashers to help get oriented on the control panels.
This one is my favorite: Press the red button to hear what color an item is, which can help vision-impaired people pick coordinating clothing (or carpets, for that matter).
Press the yellow button to get an audible signal that indicates the light level of the room. You don’t want to invite friends over for coffee if they can’t see the cake!
I bought stamps at the Post Office today and saw that they have new terminals for swiping credit cards. It was this one, or one just like it.
The Ingenico iSC 350 Quick: where would you swipe your credit card?
Perhaps it was because I was looking at the pen, but I tried to swipe the card in the slot just under it. That’s not where the card goes, though; it’s just a space between two parts of the device. The card goes in the slot just above the keypad (which wasn’t lit up when I used it).
The funny thing is that the clerk said that I wasn’t the first person who did that! Maybe a little usability testing would have helped.
Update (7 Aug 2014)
When I bought stamps today, I noticed that the slot at the top of the machine was covered with tape. The clerk said they had to do it because so many people tried swiping their cards there.
I recently ran a usability study on a mobile app. Software like Morae makes it easy to record, observe, annotate and analyze a study using a laptop or desktop computer. But there’s nothing (yet) like that for testing mobile apps, so the setup was the complicated part.
This is a summary of how I set up a study on a mobile phone app.
Learn from others
First, I learned from a colleague by assisting her during a study she ran. I ran my study based on what I learned from her and what I discovered along the way.
Learn from others: find someone who’s done a study like yours, or find a blog post like this one.
The participant and I sat in one room, while we had a note-taker and observers in an adjacent room. We also had team members in remote locations. They all had to see and hear what was going on.
Did I mention that the study would occur 1000 miles from where I live and work? That was a bit nerve-wracking, but I had great help.
I sketched what I thought the setup would be and sent it to the people onsite for our discussions:
The left side represents equipment in the testing room. The right side shows what would be in the observers’ room. But it needed a little adjustment.
Here is a photo of the actual setup I used, followed by a discussion of how it worked and a list of what did not work.
Devices under test: The software ran on iPhones and Android phones. In the photo, the iPhone is on the right side of the screen, where the participant sat.
Positioning the phone: This involved conflicting goals: The phone had to be under the camera for recording and video transmission to the observers’ room. But the participants had to be able to use it as naturally as possible. Here’s the compromise: It’s sitting on an inverted clipboard, which tilts it at a reasonable angle for viewing by the participant. The clipboard is held in place by gaffer’s tape, and there’s a tape loop holding the phone in place. The participants were more important than the observers, so I let them pick up the phone when they needed to, but then asked them to replace it. There was an outline of the phone drawn on the white paper covering the clipboard as a target for replacing the phone.
Camera & microphone: Above the phone is an IPEVO Ziggi-HD document camera. It’s very compact and has a microphone. It connects to the laptop via a USB cable.
Image capture: You can see the image of the phone on the MacBook Pro laptop. It’s displaying the phone’s screen through the IPEVO Presenter software that comes with the camera. It was easier for me to watch the study on the laptop screen so I didn’t have to literally look over the participant’s shoulder. The Post-It on the laptop has all the resolution and exposure settings that seemed to work well. I always have a checklist of things to do between sessions and to start a new session, and this became part of it.
Recording: I used TechSmith’s Camtasia to record the audio and video during each session. It’s largely invisible during the study. (See below for what did not work.)
Remote video & audio broadcast: Audio for all the observers (local and remote) came from the camera’s mic; video was whatever was on the screen. As noted, we had remote observers, so I used GoToMeeting to transmit the audio and video to them. As long as their phones were muted, it was fine. The GoToMeeting UI is on the right side of the laptop screen. It can be minimized, but I kept it open next to the Presenter window so I could communicate with the remote observers via the built-in chat feature. Because I was mirroring my display in the observer room, the local observers could see any chat messages if the UI was exposed. (Mirroring the main display on a secondary monitor on a Mac is pretty easy, in the Displays pane of System Preferences.)
Local video broadcast: I connected a mini-DVI adapter to the Mac, and connected that to a long cable that went to a monitor in the observer room, which was next door. (See below for what did not work.)
Local audio broadcast: Camtasia will feed video to an external source, but not audio. The simplest solution was to make a phone call from the desk phone in the testing room to the observer room, put it on speaker phone there and mute it. We just left the connection open all day.
Backup: What’s the worst thing that can happen when you’re recording a session? Losing the recording. So I backed up each one to a local server immediately. The files were 3GB each, so it was slow over the wireless system I had to use, but it worked.
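The between-sessions backup step could be scripted so it never gets skipped in a hurry. Here is a minimal sketch in Python; the file names and paths are hypothetical, and a small dummy file stands in for the 3GB recording so the snippet runs as-is:

```python
import filecmp
import shutil
import tempfile
from pathlib import Path

def back_up(recording: Path, dest_dir: Path) -> Path:
    """Copy a recording to dest_dir and verify the copy byte-for-byte."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    copy = Path(shutil.copy2(recording, dest_dir))
    if not filecmp.cmp(recording, copy, shallow=False):
        raise IOError(f"backup of {recording} did not verify")
    return copy

# Demo with temporary directories; in a real study, dest_dir would be
# the mounted server share and the source would be the Camtasia output.
src_dir = Path(tempfile.mkdtemp())
server = Path(tempfile.mkdtemp())
recording = src_dir / "session1.mov"
recording.write_bytes(b"video data")

backup = back_up(recording, server)
print(f"backed up to {backup}")
```

Verifying the copy before the next session starts is the point: a slow copy over wireless is recoverable, a silently truncated one is not.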
Communication with observers: I’ve tried many things over the years for communicating with observers, including an earpiece that they could talk to me through. Now I use text messaging. In the photo, my iPhone is in front of the desk phone, but during a session I kept it on the laptop so I could see messages as they came in.
Note-taking: Morae is great for taking notes in a usability session because it time-stamps them and they’re part of the video timeline. The app LogIt is good for simple time-stamped notes on a Mac, but for my PC note-takers, I created an Excel file that put a time stamp in column B as soon as they entered any text in column C.
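The idea behind that Excel file (capture a timestamp the instant each note is entered, so notes can be lined up against the recording) can be sketched in a few lines. This is a hypothetical illustration, not code from Morae, LogIt, or the actual spreadsheet:

```python
import time

def make_logger(clock=time.time):
    """Return a note log and a function that appends (timestamp, text) pairs.

    The timestamp is captured the moment the note is entered, which is
    what makes it possible to line notes up with the session recording.
    """
    log = []

    def note(text):
        log.append((clock(), text))

    return log, note

log, note = make_logger()
note("Participant taps the wrong icon")
note("Recovers without help")

for stamp, text in log:
    print(f"{stamp:.2f}  {text}")
```

In the Excel version, entering text in column C plays the role of calling `note()`, and column B holds the captured timestamp.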
So how long did it take to figure that out? Quite a while. Plan for that, and do a number of dry runs to be sure. The stopwatch app on the iPhone is good to run while you’re testing because it constantly updates, and you can tell if the video freezes.
What did not work
Recording: I didn’t use GoToMeeting for recording, because I have had some problems with that on the Mac platform. My laptop didn’t reliably record with QuickTime because it’s a mid-2009 model, so I switched to Camtasia. (My colleague used QuickTime to record her study, but her MacBook Pro is a year newer than mine.) A benefit of Camtasia is that it writes the recording to disk as it progresses; QuickTime saves it in memory, which means it’s volatile and can disappear if various interruptions occur.
Local audio broadcast: Another thing the mid-2009 MacBook Pro doesn’t do is transmit audio over an HDMI connection, so I used the desk phone for audio. My colleague’s newer computer did transmit audio over HDMI, and she used it to get both audio and video to the observers. The item in the sketch (above) labeled “sound cable” didn’t work out because Camtasia doesn’t support monitoring audio.
My computer: I was afraid it would melt, with Camtasia recording, the backup copying and GoToMeeting broadcasting all at once, but it worked fine.
That’s why we test things. And call Tech Support (and they were very helpful at Camtasia).
Things will work differently for you, but this is one place to start. And now that I’ve figured this out, I’m doing another study just like it, but with a few twists:
The next study will use an iPad and picture-in-picture recording with Camtasia, combining the document camera and the built-in webcam.
The City of Boston recently announced the Boston Meter Card, a prepaid card to use at parking meters. It’s a great idea, but it was impossible for me to figure out because the card doesn’t work the way other cards work. You have to insert the card and keep it in the meter for 10 to 15 seconds.
What would you do when you walked up to a meter with the card? I thought about which way to put the card in, inserted it, took it out, and… nothing.
I was there with someone else, and we couldn’t figure it out. Was the card broken? Was the meter broken? What else could I have done?
Good thing I had quarters.
It doesn’t work the way you’d expect
When you insert the card, you have to hold it in for 10 to 15 seconds and wait while the small display updates a number of times. But you knew that, right?
Problem #1: It doesn’t work like any other card I use. I couldn’t figure it out. Was it user error, or a system-design problem?
Videos of using the Boston Meter Card
Watch video footage of checking in and out of a meter. It’s hard to read the display, but that’s part of the real-life situation.
Now that I know how it works, I understand the transitions in the display:
00:00 – there was no time on the meter when I arrived
25.00 – I have $25.00 left on the card
In – I’m checking in
4:00 – the maximum amount of time to park
The first time I tried the card, it took the full 15 seconds to get a response. It didn’t display “In” that time, but it did display “1111” for some reason.
How long do you have to wait and watch? And how many changes will there be? Not knowing makes it hard to tell when the transaction is complete. Is it clear what each display means? There was no explanation, and it was impossible to figure out the first time. A brochure came with the card, but it didn’t mention any of this.
Problem #2: The displayed information isn’t always the same for the same operation.
Checking out of the space was even more confusing because there were more transitions in the display to figure out:
These were the transitions for checking out:
2:18 – the time left when I got back
1111 – no idea, what do you think?
1:42 – the time I had parked and would pay for now
22.85 – the money I would have left on the card
OUt – I was leaving
00:00 – the meter was reset and now had no time
Problem #3: There’s no way for a first-time user to know how many display transitions there will be, so there’s no way to know how long to wait before removing the card. (I think you have to wait, but I didn’t test that.) And it’s not clear what it all means.
It works like … nothing else
Even if you use an older ATM that holds on to your card, it reacts within a second or two. Most card-reading machines have instructions saying to “swipe” or “dip” the card; this is the only one I’ve seen that would need a word like “wait.” Here’s an example from a hotel I recently stayed at:
This hotel key card responded within a second. All I had to do was "dip" it in and remove it.
Using the card the first time
The first thing was to figure out how to insert it. This photo shows a graphic on the meter that corresponds to the chip on the back of the card. It’s hard to see and it’s not clear what it means.
The arrow points to a graphic that looks like the chip on the back of the card. Is that enough to tell you how to insert the card?
The sticker just below the slot would have been a good place to put some instructions. That would have been easier than trying to decipher that little mark under the slot.
Problem #4: The display is hard to read in bright light, and probably worse at night.
I inserted the card different ways, but it didn’t react (because I didn’t know to hold it in place). I spent a lot of time trying to make it work and a lot of time the next day on the phone finding out how it does work.
The problem: User error?
One person I talked with in the Parking Office said that it was “probably user error” because “that is the problem in 24 out of 25 cases.” I don’t generally believe in user error, so I took a deep breath and said that it’s more likely a system-design problem.
After a while, I found someone who explained that you have to hold the card in the meter for 10 to 15 seconds. I identified myself as a user experience designer, and we talked further.
More than user error, I think it was a failure to understand the users and their expectations.
Should a parking meter card need instructions?
He asked if I’d read the brochure that comes with the cards (PDF). A system like this should be so simple that instructions aren’t needed; I don’t think people read directions, save them or remember what they’ve read. I mentioned that, and said that as a typical user, my copy was already in the recycle pile.
We talked about the instructions on the back of the card, too (ALL IN UPPER CASE). That text doesn’t say anything about holding the card in, doesn’t explain the transitions on the display, and doesn’t explain when a transaction is done. The brochure did mention holding the card in, but only for signing out.
The gold seal on the left must be the chip. The instructions at right ARE ALL UPPER CASE and don't mention holding the card in.
Problem #5: This system shouldn’t require documentation, and what they do provide is incomplete.
How can they fix this now that they’re already selling cards?
If the city doesn’t change something to make the system easier to figure out, I’m afraid that it will just fail.
It’s a system with many parts: the card, the display, the insertion method, the information on the meter and the brochure. Plus user expectations. Some parts are easier to change than others, but something has to change.
When I talked with someone in City Hall, I suggested reprinting the cards with complete instructions. He said that the cards came from the vendor. And that they had 10,000 of them. My card has a number in the 400s, so that won’t work.
Next, I suggested printing stickers with better instructions to cover the old text. Again, even if it were a lot of work, at least people would have the instructions with them.
It would help if the sticker on the meter had some instructions. I assume that changing the displays or how the meters work would be too involved, but we didn’t get to those topics.
We talked a little more and I wished him well.
Lesson: Design, test, redesign, test, …
Problem #6: The underlying problem is that the product design process probably didn’t involve any actual users or testing in real situations.
This is a system designed for anyone who parks a car at a meter, day or night, possibly in a hurry. How do you think someone like that reacts to this user experience the first time?
I don’t know who the vendor is, or who designed the system. And I don’t know how they’re going to resolve this problem. I’m pretty sure the program will not succeed without a big change.
I sent what I learned to Eric Moskowitz, the Boston Globe reporter who writes the Starts & Stops column about transportation issues. Maybe he can write a column and help teach people how it works.
It seems pretty clear to me that this whole system was designed the old-fashioned way. Rather than test the system with real users in real situations, they probably talked about it in a conference room and figured it would work out OK. If someone raised the obvious problems, I can imagine someone else saying, “Yeah, but all they have to do is…”
That phrase is the kiss of death for a design. I hope the City of Boston can make this project work because it’s a great idea.
I took the refresher course for the American Heart Association’s Heartsaver CPR & AED course recently. Once again, I was impressed with the design of AEDs.
Wikipedia describes an AED as
An automated external defibrillator or AED is a portable electronic device that automatically diagnoses the potentially life threatening cardiac arrhythmias of ventricular fibrillation and ventricular tachycardia in a patient, and is able to treat them through defibrillation, the application of electrical therapy which stops the arrhythmia, allowing the heart to reestablish an effective rhythm.
While they may be used by EMTs with a lot of training, they’re also used by people who happen to come across a person in distress. You can imagine how anxious such a user is, so the devices must be really easy to use.
And they are. Once you open the device and turn it on, it tells you what to do, step by step.
Here’s a video I found on YouTube that shows a typical one. (The demo starts at 1:00 into the video.)
You might be trained on one brand of device and have to use a different brand if you come across an emergency in a store or public library. I don’t think it matters, because they walk you through the process, showing and saying what to do at each step.
I’m not sure why they’re all so well-designed. Maybe one company figured it out and the others copied, or maybe the Red Cross or Heart Association made suggestions to all of the manufacturers.
Have you taken AED training? Have you ever used one in real life? How did it work?