Read part 1 of this article here
Karen L. Ziech - Performance Technologist
Information Products & Training, Lucent Technologies
Cheryl L. Coyle, Ph.D. - Technical Manager, Human Factors
Bell Labs Research, Lucent Technologies
Finalist Presentations
All five finalists were asked to prepare a 10-minute presentation for the judges and to participate in a 20-minute question-and-answer session. The presentations were to address the following topics:
- Specific ways in which the product is designed to fit users and tasks (i.e., how does your product match your users' needs/help them meet their goals?)
- Specific design features that make the product easy to use
- Specific user support aspects (e.g., help; documentation; on-line tutorials)
- Specific measures to ensure standards compliance and/or internationalization
- Specific user analysis steps (e.g., analyses of users and/or tasks; setting usability goals)
- Specific user-centered design process steps (e.g., inclusion of user data; iterative design)
- Specific usability evaluations (e.g., usability reviews; heuristic evaluations; user testing)
- Specific customer-focused activities (e.g., use of feedback from customers; consultation; demos etc.)
The eight judges committed to attending as many of the presentations as possible, with the goal of at least four judges at each presentation. As the finalists submitted their available dates for the half-hour meetings, we discovered that we could schedule all five meetings back-to-back in one morning. This would make for a grueling schedule, but would have the advantage of keeping each presentation in memory while hearing the others. We scheduled 2 ½ hours of back-to-back presentations with a final ½ hour scheduled for the judges to convene and vote for winners.
This schedule gave each finalist an equal opportunity to demonstrate their project’s product and usability process, and did not allow any presentation to run over. Project teams were given advance notice that their presentations would be timed and that the 10-minute maximum length would be strictly enforced, and it was. In retrospect, we found that the advantages of this schedule outweighed the arduous nature of the task. It would have been better, though, if we had built in at least one 5- to 10-minute break for the judges.
Six or more judges were in attendance at each presentation. Because both the USIG judges and the finalist teams were geographically dispersed, the presentations were conducted using an audio bridge and NetMeeting.
After each 10-minute presentation, a 20-minute question-and-answer session followed. Judges took turns asking questions in a predetermined random order. Time permitted only one question per judge.
Judges’ Meeting
Immediately following the five 30-minute sessions with the finalists, the judges convened to discuss the presentations and vote on which projects, if any, should win an award. Holding the judges’ meeting so close in time to the presentations themselves was beneficial because the presentations were fresh in our minds.
We began the meeting with a straw vote – before any discussion, we wanted to know how close we were in our thinking. We asked each judge to give a yes/no vote on whether each specific project team should win a usability award. The voting was not anonymous; we simply called each judge by name and asked for the vote. The initial vote revealed the following:
- Team A: 5 yes, 1 no
- Team B: 5 yes, 1 no
- Team C: 0 yes, 5 no, 1 maybe
- Team D: 5 yes, 1 no, 1 maybe
- Team E: 6 yes, 1 no
Since there was overwhelming agreement about Team C, we immediately decided that Team C would not win an award. The remaining four teams required some discussion. To expedite the discussion, each judge who had voted “no” for a given team was given a chance to explain why he or she believed the team was not worthy of an award. Other judges were allowed to rebut, if they chose to. After hearing other judges' thoughts on the team presentations, we took another vote. Before the second vote, we agreed that in order for a team to win an award, every judge but one must vote “yes.” In other words, if a team received more than one “no” vote, it would not win. The second vote looked like this:
- Team A: 3 yes, 3 no
- Team B: 5 yes, 1 no
- Team C: -- did not vote --
- Team D: 5 yes, 2 no
- Team E: 6 yes, 1 no
Based on our a priori decision to award teams who received no more than one “no” vote, our votes yielded 2 winners: Teams B and E.
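To make the decision rule concrete, here is a minimal sketch, in Python, of how the "no more than one no vote" threshold selects the winners from the second-vote tallies above. The code is purely illustrative, not part of our judging process; the team names and vote counts are simply those reported in the list.

    # Illustrative only: second-round vote counts as reported above.
    # The rule the judges agreed on: a team wins if it receives
    # no more than one "no" vote.
    second_vote = {
        "Team A": {"yes": 3, "no": 3},
        "Team B": {"yes": 5, "no": 1},
        "Team D": {"yes": 5, "no": 2},
        "Team E": {"yes": 6, "no": 1},
    }

    winners = [team for team, votes in second_vote.items() if votes["no"] <= 1]
    print(winners)  # prints ['Team B', 'Team E']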
After the voting was over, an informal discussion began about how much effort the finalists had put into preparing their presentations and how each of the five finalists had achieved a degree of user-centered design beyond the non-finalist entries. We easily agreed that the three finalist teams who did not win the Usability Award should receive some kind of recognition, and that the other six semi-finalists should be recognized as well; their submissions had been strong enough to make it past the first couple of rounds of review, and we wanted to acknowledge that work. Someone suggested that we identify the three finalists who did not win as Runners-Up and the other six semi-finalists as Honorable Mentions. As soon as it was suggested, we all agreed, and the decision was made.
The Awards
We’d decided on the winners. Next, we needed to decide what, exactly, the winners would “win.” We knew that, at minimum, we wanted to shower them with publicity (internal Lucent publicity). We also quickly agreed that the winners should receive certificates, signed by both the USIG chairperson and the sponsor of the award, a Lucent executive. We also wanted a brief statement on the certificate about the purpose of the award program, or of USIG in general. We ended up with: Thank you for helping USIG further its goal of promoting good usability in Lucent products.
Next, we needed to come up with a look for the certificates. We started comparing certificates we had each received, and decided to get the help of a graphic artist. Luckily, one of the judges happened to be working with a very friendly graphic artist who agreed to help us. We sent him our basic ideas--what we wanted the certificates to say, etc.--and asked him for some ideas on how the certificates should look.
We also needed to determine who would receive a certificate – only the two winning project teams or the runners-up as well? Again, since we wanted to spread around the congratulations, we made an easy decision to prepare certificates for the winning teams, the runners-up and the honorable mentions. In the notices sent to the 11 teams being honored, we asked for a list of names of the team members so we could prepare certificates. Our graphic designer came up with beautiful certificates.
Unfortunately, the certificates were developed in an application that none of the judges knew how to use. Because only the graphic designer could edit the certificates, we had to send all the names to him so he could create the individual certificates. As requested, the winning teams, runners-up, and honorable mentions sent us their lists of team members. Because 11 teams were getting certificates, and because some teams sent us the names of more than 10 team members, we had over 100 certificates to prepare. This became unwieldy. Given the short time schedule, we decided to print personalized certificates for members of the two winning teams only. All other teams would receive team certificates, one copy for each member, but without personalization.
The judges discussed what other award we’d like to bestow on the winners in addition to the certificates. We all felt strongly that we needed an actual award, like a plaque or a trophy, to give to the winning teams. We felt so strongly about this that we were willing to spend our own money on the purchase. One USIG member did some local research on trophies and plaques and found that they were reasonably priced. We began to look through various catalogues of awards when it occurred to us that our award sponsor might agree to pay for the awards out of his organization’s budget. Our sponsor agreed to pay not only for two trophies but also for the certificates and refreshments for the awards ceremony.
As with the certificates, we had to decide on the artwork and wording for the trophies. We picked out very nice-looking trophies, which we decided would be presented to the member who had represented each winning team during the final presentation session.
Finally, we contacted internal Lucent publications and asked them to write an article about the award and announce the two winners. We sent individual emails to all 11 honored teams and their managers, congratulating them on their status. These emails helped to recognize teams – in front of their managers – for the work they had done on behalf of usability. These communications were forwarded around and up the management chain, and team members were congratulated. This kind of recognition helped foster the good will of the award process and increase the visibility of the value of usability.
The Ceremony
In order to hand out the certificates and bestow the trophies on the two winning teams, we needed some sort of ceremony. New questions formed in our minds: what would we do at the ceremony, and whom should we invite? We knew that our sponsor should be there, presiding over the ceremony and distributing the awards, even if virtually. At first we thought the ceremony would be for the two winning teams and members of USIG. Then, as before, we realized it would be better to open it up to a larger community, to help people feel good about their own contributions toward usability, and to make additional progress on raising awareness.
We invited all the winning team members, runners-up, honorable mentions, and their managers to a one-hour ceremony. Our next monthly USIG meeting was already on the judges’ schedule a few weeks away, so we decided to use that regular time slot for the ceremony. After checking our sponsor’s availability, we sent out invitations. We had over 100 people in attendance across 5 major locations, with several remote people on the bridge.
We sent the certificates to 7 different locations to arrive in time for the ceremony. USIG representatives from each of 4 different locations purchased light refreshments for the ceremony in order to create a festive, party-like atmosphere.
We created an agenda, which included a few words about USIG and the award program, followed by the announcement of the awards by the sponsor. A representative from each team had an opportunity to say a few words about their project and to thank their team members. Recognition was given to all who had emphasized usability in their work.
At the end of the ceremony, amidst the good will, we held an open question-and-answer session and asked attendees for questions and feedback. After a few minutes of fluff and not much content, one of the judges suggested that all 11 teams use this as an opportunity to network with one another, as well as with USIG, to share plans or concerns about usability in their projects. This suggestion led to several other new ideas on how to continue to support this network of 11 teams – possibly all 39 teams – plus new ideas on other ways USIG can provide value internally.
All in all, the ceremony was a success. Some influential Lucent leaders had joined and expressed interest in learning more about usability and how to build the right processes into their development. That was exactly what we were hoping to achieve. Everyone left feeling good about their contributions to the usability of their projects, and we are optimistic enough to believe that this good feeling will lead to increased attention to usability from the teams recognized in the ceremony.
Lessons Learned
We learned many lessons during the year; some were so obvious that they seemed to jump out at us with baseball bats, others more subtle. The more obvious lessons tended to be discovered as we went along:
- Focus on the good. A positive approach to raising awareness and promoting usability got us farther, faster, than trying to be the “usability cops.” Trying to “fix the process,” to get usability engineers onto product teams, or to call attention to poorly designed, less-than-usable products would all have been negative approaches.
- Follow a solid “user-centered design process” when planning the award project. Specifically, make sure to define who the audience is and “what’s in it for them” (WIIFM). Answer the two basic questions of who will likely submit entries and why. That we had not done this became apparent in the first “call for nominations” that we put out. The articles were geared to “user interface designers.” We had assumed our audience was people like us—such a common mistake, one that we all complain about, yet we stumbled into it. The important WIIFMs we initially overlooked were the marketing value for an award-winning project and the benefit that award-winning team members could realize during their annual performance review discussions. This WIIFM strongly drove our efforts to finalize the awards by the end of September 2005.
- Even though we did not plan out all aspects of the USIG Award, and “made it up” as we moved into each new phase, we did ask ourselves the important questions along the way. There were times when we wished we had been able to foresee certain issues, but we resolved each issue we faced successfully. In the end, we can look back on the project with pride.
- Think about the language of usability. Related to conducting audience analysis and defining the audience is the language used when discussing usability issues and concepts. We did not take into account that the language of usability is not well understood outside the circle of HF and usability folks. Our first “call for nominations” articles assumed that our audience understood usability processes and concepts; the response to this assumption was dead silence. We received no responses to the first two articles we published. Even the rewritten call asked too generally, “What makes your product usable?” and “What did you do to make your product usable?” Several of the nominations focused on the usefulness of the product alone, and included feature lists and descriptions of how the coding was done—a clue that the teams did not understand the concepts of usability methods and metrics. It would have been better to clearly define the usability criteria and judging process we planned to use. Examples would have helped a lot.
- Emphasize the user-centered design process, not product usability, in the call for nominations and judging. Even for a small number of entries, conducting true usability tests would have been a prohibitively time-consuming, expensive task. Our judges did not share common training or experience in usability methods, so even a good heuristic evaluation or expert review would have been difficult. As it turned out, the judges’ overall ratings for the 11 semi-finalist submissions were correlated with their ratings on the usability process questions, not with their ratings on the product usability questions.
- Ceremony planning is a big task! We were working against a tight deadline and had not taken this component into account. The administrivia nearly wiped out the 4 USIG members who jumped in to get the ceremony organized.
- The final lesson learned doesn’t relate to the Award program or project, but rather to the group effort behind it. As the USIG members discussed the ceremony, we suddenly realized how cohesive the group had become. We had had long, sometimes contentious, discussions, and different members had had to pitch in to cover for others over the months. We had accomplished what we set out to do—to raise awareness about usability—and in the process, we had become a group that had bonded and was even more committed to our purpose: to promote good usability in Lucent products. The excitement of all the teams during the ceremony confirmed both that there was a need for the USIG group and that our efforts had furthered our cause.
Final Steps
Even as we basked in the glow of a job well done, we understood there were still tasks to be done around the USIG Award. We have started some of them and are working on others.
First, USIG members decided that the USIG Usability Award would be granted every two years. We have published articles in internal newsletters about the award-winning teams and projects and have posted the information on our USIG website. For future use, we will also post on our website the artifacts of our project—the judging worksheet, the analysis of the results, announcement letters and articles, etc.
Next, we have asked all teams that submitted award nominations for feedback about the program. We have begun to receive email answers to a few short questions:
1. What did you LIKE about the USIG usability award process?
(a) What went well?
(b) What made this a positive experience for you?
2. How can the USIG usability award process be IMPROVED for future years?
(a) What problems did you have?
(b) What would have made this a better experience for you?
Finally, when all the comments are in, we’ll document our process and the outline of our work. We know that the next USIG Usability Award will be an even bigger success.