I’ve noticed that some blogs post a list of recent tweets on a specific topic, just to capture and showcase the current dialogue about that topic. I’d like to start doing the same for UXD-related tweets. This is the first such post.
@userexperience: An Historical Perspective of Flat versus Rich Usability Design – http://bit.ly/ZroRxu
@MMudassir: “UX Manifesto: 7 Principles for Better Software” http://buff.ly/11Kl7Wk
@MeasuringU: Common Misconceptions About Touch – http://ow.ly/khnYX
@kkmett: Customer Experience is the Only Competitive Advantage Left http://bit.ly/185MX1b
@adrianh: Some smart people (and me) talk about Responsive Web Design and Accessibility http://buff.ly/14S6Uwl
@use_this: Mobile focused UX Jobs http://bit.ly/YKUu1r
@darrenhood: Lynne’s essential UX books – The Republic of Quality Blog http://ow.ly/kueJR
@uxfactory: The 3 R’s of Measuring Design Comprehension (Measuring Usability): http://bit.ly/12AYTYi
@fransgaard: Is “Mobile First” technology or behaviour? http://fransgaard.com/?p=2858
@UXPhil: How to Map Customer Journeys http://bit.ly/Ycn0Oy
On occasion, I encounter a parking lot that is poorly designed, difficult to navigate, and thus prone to cause vehicle jams and accidents. I drove through a parking lot last week that was quite a doozy: narrow parking spots, narrow two-way driving lanes between the parking sections, narrow and cramped turns, no established drop-off zones (which encouraged cars to park and sit in the middle of a driving lane), and no signs to assist with wayfinding. While I understand that limited space is always a factor for any organization’s parking lot, some of these issues could be remedied. For example, instead of allowing two-way driving lanes, enforcing a one-way rule would help alleviate congestion within the lanes. Signs that communicate the proper directional flow would emphasize and reinforce the one-way rule, and thus keep the parking lot more organized and safe.
Performing usability tests, or even basic surveys, on a parking lot could certainly help reveal any underlying issues and flaws with the lot. Organizations would obviously benefit from safer and more user-friendly lots, as they could encourage more users to visit the building.
So, has anyone heard of usability studies conducted on parking lots? If so, please share them here. I’ve done a little research on this topic, but I wasn’t able to find anything substantial or noteworthy. This seems like an untapped avenue of usability testing. Parking lots are an essential part of an organization’s infrastructure, and certainly should not be ignored when considering how to enhance the user’s experience with the organization.
Last August, I created a post that discussed the relationship between user experience design and ROI. In it, I questioned how UXers can effectively demonstrate the ROI of usability testing and user experience design methods.
Fortunately, a company called UserZoom recently offered a webinar that discussed this same topic. UserZoom offers a software platform that assists organizations with conducting online UXD research. This webinar, titled How to Measure the ROI of User Experience and presented by Susan Weinschenk, offers some solid examples of how the ROI of UXD can be demonstrated. Weinschenk points out that implementing UXD methods can result in many benefits for an organization, including but not limited to: increased customer satisfaction, reduced training costs, reduced troubleshooting issues, and cost savings from avoiding multiple redesigns of a poorly designed product or service.
Please take some time to watch this webinar. I found it both educational and reassuring for those in the UXD field.
Edit: UserZoom has posted 13 questions and answers that were not covered during the webinar due to time restrictions. Check them out on the UserZoom website. It’s a worthwhile read.
I attended my first ACRL (Association of College and Research Libraries) conference last week. In addition to being an amazing experience, it was quite eye-opening: there were several presentation sessions related to usability, assessment, and user experience design in library settings. These topics are clearly trending and becoming more popular among libraries, which is very exciting.
In an effort to share my first ACRL experience, I’m providing a list of links and summaries to some of these UXD-related presentations. I encourage everyone to read them:
- Seating Sweeps: An Innovative Research Method to Learn About How Our Patrons Use the Library (Mott Linn) – The speaker used an innovative research method called seating sweeps to learn how the clients at this university were using the library. The study determined which areas of the library and what types of furniture were used the most and least and where various activities took place. These findings greatly influenced the library’s recent renovation/expansion, which so inspired the student body that the door count more than doubled. Learn how to use this methodology.
- “The Mother of all LibGuides”: Applying Principles of Communication and Network Theory in LibGuide Design (Carol A. Leibiger and Alan W. Aldrich) – Ease of creation and flexibility make LibGuides popular in libraries. Their flexibility includes the ability to share content and create links across multiple LibGuides. A communication-as-design perspective is introduced and specific network models are identified for organizing LibGuides to manage changes and updates efficiently, thus easing librarians’ workload. Participants will evaluate these models in the context of their own libraries; an electronic handout provides guidance in the creation of these network models.
- Hidden Patterns of LibGuides Usage: Another Facet of Usability (Gabriela Castro-Gessner, Wendy Wilcox, and Adam Chandler) – In our paper, we present the analysis and use of raw log files for LibGuides used to contextualize and understand unfiltered user behavior as a novel approach that complements traditional usability testing of the LibGuides tool. We anticipate that revealing patterns derived directly from user actions and locations will allow us to make compelling and robust recommendations for our academic library community to enhance the use and value of library guides for our patrons.
- The Unobtrusive “Usability Test”: Creating Measurable Goals to Evaluate a Website (Tabatha Farney) – Determining the success of a library’s website is an ongoing process because the site’s intended audience constantly changes as students come and go every semester. Rather than assuming that your library’s website is still functional, unobtrusively test its usability by creating website goals that can be measured using website use data. Discover fundamental web analytics metrics and how to use them to evaluate a website without disturbing website users or spending a lot of time.
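The unobtrusive approach described above relies on defining measurable website goals and checking them against use data. As a rough illustration (the data, page paths, and goal thresholds below are invented, not from the presentation), the idea might be sketched like this:

```python
# Hypothetical sketch: checking measurable website goals against
# page-view data, in the spirit of unobtrusive web-analytics evaluation.
# All data and goals below are invented for illustration.

page_views = [
    # (session_id, page)
    ("s1", "/home"), ("s1", "/catalog"), ("s1", "/databases"),
    ("s2", "/home"),
    ("s3", "/home"), ("s3", "/hours"),
    ("s4", "/catalog"), ("s4", "/catalog/item/42"),
]

# Group page views by session.
sessions = {}
for session_id, page in page_views:
    sessions.setdefault(session_id, []).append(page)

total_sessions = len(sessions)

# Bounce rate: share of sessions that viewed only one page.
bounces = sum(1 for pages in sessions.values() if len(pages) == 1)
bounce_rate = bounces / total_sessions

# Example goal: share of sessions that reached the catalog.
catalog_sessions = sum(
    1 for pages in sessions.values()
    if any(p.startswith("/catalog") for p in pages)
)
catalog_rate = catalog_sessions / total_sessions

print(f"Bounce rate: {bounce_rate:.0%}")       # 1 of 4 sessions
print(f"Reached catalog: {catalog_rate:.0%}")  # 2 of 4 sessions
```

The point is that none of this requires interrupting a single user; the metrics come entirely from data the site already collects.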
- (Dis)Abled: Transforming Disabling Library Spaces (Lorelei Rutledge and Alfred Mowdood) – Learn about a university library’s implementation of cultural competence models to better address disabled patrons’ needs. Discover new methods to develop a stronger institutional relationship with your Disability Services on campus, implement training strategies based on cultural competence models, and redefine and improve services, spaces and technology. Learn and discuss strategies and tools to accomplish these same changes on your campus.
Posted in Accessibility, Assessment, Education, Infrastructure, Libraries, Universities, Usability, Usability Testing, User Experience Design (UXD)
Tagged accessibility, ACRL 2013, assessment, conference, presentations, usability, UXD
A colleague of mine recently brought this site to my attention: http://weareinflux.com/prefab.
Prefab is a ready-made library website template. According to the Prefab site, the design of this template is based on years of library user research.
Here is some additional information about this service:
- Prefab will host the site for its clients
- Libraries can integrate their ILS services on a Prefab site
- Libraries can keep their original URL/domain name
- Prefab sites work on desktop and mobile devices, including tablets
- Prefab sites come with 6 color options, and libraries can use their own logo
- Libraries can customize colors and layouts with CSS
- Prefab sites are built in WordPress (interesting…)
- Prefab offers e-mail and phone support, back-end training, and domain set-up assistance
- The initial set-up costs $1,500 and then it’s $500 each year
Overall, Prefab sounds very intriguing: it might be a promising service for libraries seeking to design and host their website on a budget. Many of Prefab’s features are enticing, but there are some potential pitfalls. I’m mostly concerned about Prefab’s support, including ongoing training and troubleshooting; it’s difficult to gauge how robust and responsive this support will truly be for clients. Another concern is the possibility of a Prefab site looking too similar to other Prefab library sites; although Prefab claims that “there’s no disadvantage if your site shares similar design elements with other libraries,” I’m inclined to disagree. It’s very important to distinguish your site (and, ultimately, your organization) from others. To be fair, Prefab does state that a library can differentiate its site by implementing unique logos, graphics, and CSS for colors.
I was recently asked by the US Postal Service’s website to complete an online customer satisfaction survey. I agreed to take it, and while I was completing the questions, I noticed that my patience with the survey began to wane. Some of the open-ended questions required too much information and detail, and some questions were designed rather poorly. Instead of providing radio buttons to indicate a number between 1 and 10, I had to manually enter the number myself. Although that doesn’t seem like a big deal, it presents a UXD problem; customer satisfaction surveys should be brief and should reduce the amount of work that survey takers have to perform, especially considering that they are voluntarily giving their time to complete the survey.
This USPS survey got me thinking of a somewhat strange question: shouldn’t surveys undergo usability testing? If surveys are used regularly by an organization to assess customer/user satisfaction, then I think it’s safe to assume that the organization is relying on the surveys to be usable/user-friendly for those who choose to take them. If a survey is difficult and/or frustrating for users, then their feedback could be negatively affected by its poor design. Some users might even abandon the survey before finishing, thereby reducing user feedback.
Is it logistically confusing to test the usability of a survey? In other words, is surveying a survey contradictory? After all, some usability tests include surveys themselves. I might, however, be over-thinking this issue. One simple method is to ask users to complete surveys with different designs, which could help determine a preferred survey design among most users.
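Comparing different survey designs can be kept very simple: randomly assign respondents to one design or the other and compare completion rates. As a minimal sketch (the counts below are invented for illustration), the comparison might look like this:

```python
import math

# Hypothetical sketch: comparing two survey designs by completion rate.
# The counts are invented; in practice they would come from randomly
# assigning respondents to one design or the other.
design_a = {"started": 200, "completed": 150}  # e.g. radio buttons
design_b = {"started": 200, "completed": 110}  # e.g. free-text entry

def completion_rate(d):
    return d["completed"] / d["started"]

rate_a = completion_rate(design_a)  # 0.75
rate_b = completion_rate(design_b)  # 0.55

# A two-proportion z-test gives a rough sense of whether the gap
# is larger than chance alone would explain.
p_pool = (design_a["completed"] + design_b["completed"]) / (
    design_a["started"] + design_b["started"]
)
se = math.sqrt(
    p_pool * (1 - p_pool)
    * (1 / design_a["started"] + 1 / design_b["started"])
)
z = (rate_a - rate_b) / se

print(f"Design A: {rate_a:.0%}, Design B: {rate_b:.0%}, z = {z:.2f}")
```

A large z value (conventionally above about 2) would suggest the difference in completion rates is unlikely to be chance, pointing toward a preferred design; abandonment rates and time-to-complete could be compared the same way.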
Has anyone tested a survey before? If you have any experiences or thoughts to share, please do so!
Time again for a Make It More Usable! post.
This usability issue applies not just to library buildings, but to all public buildings in general, and it’s a very simple issue: doors.
More specifically, I’m referring to two problematic door designs.
(1) Doors that are confusing to open:
This picture is a good example as it illustrates a common problem. On which side of the door do you push to open?
I’ve encountered these types of doors many times, and I inevitably end up pushing on the wrong side.
How to make it more usable? Doors should communicate two important pieces of information to people: (a) whether the door should be pushed or pulled and (b) which side of the door should be maneuvered. (Revolving doors and automatic sliding doors are two obvious exceptions.) Ideally, door handles should be placed on only one side of the door. The handle designs should also make it clear whether they should be pushed or pulled; if the handle designs do not communicate this information, then the words “Push” or “Pull” should be indicated on or near the handles.
(2) Opaque doors:
This issue isn’t just about usability; it’s also about safety. Although I understand that in some cases certain doors should not have windows due to privacy reasons, users can benefit greatly from see-through doors. The problem is simple: doors that are completely opaque prevent users from knowing whether someone is on the other side; consequently, the door can suddenly swing open and potentially cause injury. Opaque doors also prevent users from knowing whether a room is occupied, which could cause them to interrupt private meetings (assuming the door is unlocked).
How to make it more usable? Whether the door is entirely made of glass or just partially, implementing see-through doors can prevent unnecessary injuries and interruptions.
Since my last post, I’ve continued to think about the correlation between usability and assessment. I recently discussed this topic with one of my librarian colleagues, and she recommended that I research LibQual+, which offers a set of assessment services specifically geared toward libraries. From what I can gather, LibQual+ can be used by any type of library.
According to the LibQual+ website, its services allow libraries to “solicit, track, understand, and act upon users’ opinions of service quality…LibQUAL+® gives your library users a chance to tell you where your services need improvement so you can respond to and better manage their expectations. You can develop services that better meet your users’ expectations by comparing your library’s data with that of peer institutions and examining the practices of those libraries that are evaluated highly by their users.” This description closely resembles how I would define/describe usability testing.
The primary service offered by LibQual+ appears to be its Web-based survey, which includes training for librarians to help them assess library services and develop best practices. The survey measures “user perceptions of Affect of Service, Information Control, and Library as Place,” and it allows for “open-ended comments from users regarding their concerns and suggestions about library services.”
The LibQual+ survey isn’t cheap, but it’s obviously very thorough and robust. I would love to see a sample survey. I am curious as to how popular/widespread LibQual+ currently is, and whether any libraries in my area have used its services before. If you know of a library that has used LibQual+, please let me know. My interest in assessment, particularly as it relates to usability, has been piqued.