
‘MASTER BLASTER’ The Power of the Evaluation Matrix
By: K. Brian Dorval, Creative Problem Solving Group - Buffalo & Samantha Stead, International Masters Publishers, Inc.

Can you imagine using a single tool for an entire three days? Or that a single tool could have a fundamental impact on the way a company makes decisions about its new product concepts? Well, let us tell you a story about the Evaluation Matrix.

Background

In the cover article of the last Communiqué (Volume X, Fall 2000) we told you how the use and improvement of deliberate process made a huge difference to the time, cost, and quality of our new product development work at IMP. This time, we want to dive deeper and explain how, as part of the process, we used the evaluation matrix to help us make the decisions that produced such significant results.

Global Product Development

You will recall from our last article that the IMP Family Education product development initiative was structured around a series of international meetings with ongoing work in between. The international meetings began with a forum where, based on trend and other data, we identified global market opportunities, explored possible customer needs in the target groups we intended to go after, and generated initial product ideas based on those needs. We left this meeting with the charge to work together across functions and cultures to develop nine of these ideas into concepts over the next four months. After that we came together again in a second international meeting, the purpose of which was to make decisions on which concepts to take forward to test.

The evaluation matrix provided the overall framework for that meeting.

Preparing to Use the Matrix

To make the most effective use of an evaluation matrix, it is critical to be clear about the options and to ensure that the criteria are well developed and clearly defined. The following gives you some indication of the energy that went into our preparation.

Developing concepts and business cases. Central to our new product development initiative was the philosophy that we should be working globally to create products that would meet the needs of customers in all four of our main markets. During the four months between meetings the Editorial Directors in those markets coordinated and led international, cross-functional teams. The mission of these teams was to conduct research with our potential customers, thoroughly develop the concepts based on the findings, and develop business cases to determine the potential success of the products in the marketplace. The business cases were extremely detailed – each around 70 pages long. They were focused primarily on our customers and how the concepts might meet their needs in each of our different marketplaces. We took a thorough look at our competition and how we might position ourselves uniquely. The business cases also aimed to predict in some detail the likely market performance, cost of development, and return on investment. Never before had we analyzed product concepts so thoroughly in the initial stages of development.

To represent the products visually we used standard-format, one-page visual concepts (or ‘viscons’). There is a challenge in creating truly good viscons: the concepts must be thoroughly worked through before they can be conveyed effectively on a single page. In past development processes, we had developed printed prototypes and glitzy presentations. That approach had been costly and had often led to the focus being on the presentations rather than the innate strength of the concepts. So this time, we leveled the playing field.

The business cases and viscons were to form the cornerstones of our international decision-making meeting. They were distributed in advance of the meeting so all participants could prepare. The product concepts, represented by their working titles, were our ‘options’ on the evaluation matrix.

Developing criteria. While development work was taking place, criteria by which to evaluate the business cases needed to be developed. This was the role of the Business Unit Leaders in the main markets – the sponsors. The criteria development was facilitated by CPS-B in a day-long sponsor meeting and several follow-up discussions between the sponsors and the Editorial Directors. Twelve criteria were eventually developed and used. The criteria were both stop/go and developmental in nature, and were aimed up against the customer experience as well as business performance measures. Despite all the preparation, some criteria still needed refining during the meeting itself, showing us just how critical it is to have complete clarity in the criteria used for decision-making.

The Decision-making Meeting

Our product development process was anchored throughout on the principles of inclusive leadership. This cross-cultural, cross-functional meeting was a prime example. Its purpose was to have everyone engage and to add value through the multitude of experiences and perspectives they could bring; to have the whole team make recommendations on which concepts to move forward; but not to replace the key decision-making responsibility of the sponsors. Rather, inclusive leadership significantly enriched the data the sponsors had to make those decisions.

Key players in the meeting were:
• The sponsors and final decision-makers: the Business Unit Leaders from each of our four main development markets and their global President

• The clients: the Editorial Directors from the same four markets, responsible for new product development

• The resource group: various other Editorial, Marketing, Operational and Customer Service staff from a range of markets, bringing a valuable diversity of perspective

• The facilitators: a mix of CPS-B and internally trained facilitators

Using the Matrix

So, to the matrix. You will all be familiar with the 11 x 7 inch Evaluation Matrix worksheets designed for individual use, and we made very good use of them. However, we needed to combine our individual evaluations into a format that could be used by all 20 people in the meeting. The result was no small Evaluation Matrix. At something like 12 x 20 feet, it covered almost an entire wall of the conference room! So powerful was it as a visual tool that, quite apart from everything else, one of the sponsors christened it the ‘Master Blaster’. We used the Evaluation Matrix for an entire three days.

The Evaluation Matrix: Day 1

The aim of the first day was to ensure we all had a shared understanding of the product concepts and their business cases. We reviewed the business cases through round-table discussions of the key information, facilitated question and answer sessions, and the use of an ALUo. The product concepts were aimed up against the different customer target groups we had identified at the first meeting – two or three in each group. Once we had reviewed all the business cases that fell within a target group, we individually completed an evaluation matrix.

The scale used for the matrix ranged from 1 to 5 according to how closely the concept met each of the twelve criteria. When the individual matrices were complete, the facilitation support team took them away to transfer the individual scores on to the Master Blaster matrix, which was kept hidden until it was complete. The participants went on to review the next set of business cases.
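
For readers who like to see the mechanics spelled out, here is a minimal sketch (in Python, purely for illustration) of how individual 1-to-5 ratings against a set of criteria might be captured and pooled into one combined view. At IMP this was done with paper worksheets and a wall chart, not software; the concept name, the sample criteria, and the pool_scores function below are all invented for the example.

```python
# Purely illustrative: pooling individual 1-5 evaluation matrices
# (concept x criterion) into one combined, "Master Blaster"-style view.
from collections import defaultdict

# Sample criteria only; the twelve actual criteria were developed by the sponsors.
CRITERIA = ["Meets customer need", "Fit with our markets", "Return on investment"]

def pool_scores(individual_matrices):
    """Each matrix is a dict {(concept, criterion): score from 1 to 5}, one per participant."""
    pooled = defaultdict(list)
    for matrix in individual_matrices:
        for cell, score in matrix.items():
            pooled[cell].append(score)
    return pooled

# Two participants rating one concept against the sample criteria.
rater_1 = {("Concept A", c): s for c, s in zip(CRITERIA, [4, 3, 5])}
rater_2 = {("Concept A", c): s for c, s in zip(CRITERIA, [2, 3, 4])}
master_blaster = pool_scores([rater_1, rater_2])
print(master_blaster[("Concept A", "Meets customer need")])  # [4, 2]
```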

The Evaluation Matrix: Day 2

The purpose of Day 2 was to come to a shared understanding of the evaluations we had made and to determine which concepts warranted further strengthening and development. When the ‘Master Blaster’ matrix was finally unveiled to the group it revealed all our individual scores. The facilitators had highlighted those areas where there was obviously very close agreement amongst the participants or where there was a wide range of difference. As you know, the evaluation matrix is a developmental tool. So the next step was not to add up the numbers – despite every attempt possible by one of the sponsors! Instead we had a facilitated dialogue amongst all the participants, concept by concept, to ensure all points of view were shared and understood, even if people were not in complete agreement.
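
As a hypothetical illustration of that highlighting step, the short sketch below flags the cells where raters clearly agreed and those where views diverged widely, using the pooled scores from the earlier sketch. The spread thresholds are arbitrary assumptions; the point, true to the tool, is to surface where dialogue is needed rather than to add the numbers up.

```python
# Purely illustrative: flag cells of near-consensus and of wide disagreement.
# 'pooled' has the shape {(concept, criterion): [individual scores 1-5]}.
def highlight(pooled, agree_spread=1, diverge_spread=3):
    agreement, divergence = [], []
    for cell, scores in pooled.items():
        spread = max(scores) - min(scores)
        if spread <= agree_spread:
            agreement.append(cell)      # close agreement: little to debate
        elif spread >= diverge_spread:
            divergence.append(cell)     # wide range: worth a facilitated dialogue
    return agreement, divergence
```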

After this, the sponsors had separate meetings to work out strategies, timeframes and budgets for the months of upcoming development and testing. Meanwhile, the rest of the participants, with the Editorial Directors as the clients and decision-makers, had further facilitated dialogue using the matrix as a basis to determine the key limitations of the concepts. From that we decided which three concepts would benefit most from some time spent developing and strengthening them using the global resources in the meeting.

The Evaluation Matrix: Day 3

The purpose of Day 3 was to develop and strengthen the promising concepts, re-evaluate them, and finally to make recommendations and decisions about which of the nine concepts should move forward to testing and launch. The facilitated workshops, encompassing work both in generating ideas and developing solutions, lasted one to two hours, and significant progress was made during that time – so much so, in fact, that there were some important changes to scores on the Matrix.

At last (it seemed! – using the Matrix took a lot of energy!), we came to making decisions. The sponsors had clarified that there were three categories into which products could be placed: (1) Go ahead to testing and possible launch; (2) Explore and develop the concept further; and (3) Hold for future consideration. We were each given green, yellow, and red dots to correspond to these categories, and, concept by concept, were asked to place the color that represented our recommendation next to that concept on the matrix. Once all the hits were placed, there was another facilitated dialogue that enabled us to share the reasons for our recommendations. As a result of that dialogue and the understanding we gained of people’s different perspectives, some participants changed the color of their hits.
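
Purely as an illustration of that colour-dot round (which was done with physical dots on the wall matrix), a simple tally might look like the sketch below; the category labels mirror the three the sponsors defined, while the data and the tally function are invented.

```python
# Purely illustrative: tally colour-dot recommendations per concept.
from collections import Counter

CATEGORIES = {"green": "Go ahead to testing", "yellow": "Explore and develop further", "red": "Hold"}

def tally(dots):
    """dots: list of (participant, concept, colour) tuples."""
    counts = {}
    for _, concept, colour in dots:
        counts.setdefault(concept, Counter())[colour] += 1
    return counts

votes = [("P1", "Concept A", "green"), ("P2", "Concept A", "yellow"), ("P3", "Concept A", "green")]
print(tally(votes)["Concept A"])  # Counter({'green': 2, 'yellow': 1})
```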

Finally, one more hit was allowed. There had been some discussion early on in the meeting about the tendency of the criteria to seem very rational and objective, and some question about where intuition and gut feeling for the likely success of a concept might come into play. So, at the end of all the discussions, everyone was given one gold star to place next to the concept he or she thought had the most ‘magic’. A decision on one of the concepts – which subsequently went on to test successfully – was certainly influenced by this exercise.

So, after three days of working with the ‘Master Blaster’ Evaluation Matrix, we made our recommendations about which concepts to move forward, as well as which should be the lead and test markets, and which key issues needed addressing. The sponsors had a further meeting, and concurred with all the recommendations except one, where they had even more confidence in moving forward than the rest of the group. Of the nine concepts we had evaluated, five were given the go-ahead to test, with a target launch-date 11 months later. One was placed in the ‘Explore’ category and three put on hold. Two of these will be revisited when the external marketplace is a little more ready for them.

The Power of the Evaluation Matrix

Of course, many factors contributed to the success of our decision-making meeting. But key to that success was our Master Blaster matrix and how well it was prepared for, facilitated, and used. The matrix enabled us to create shared understanding all the way through the meeting, which gave us a basis for sensible, positive, co-operative, energetic, and productive dialogue. Ultimately, this meant we could make decisions right at the meeting, decisions in which everyone had confidence and for which they felt a degree of ownership.

The development of the criteria ensured our decisions were aimed up against customer and business need, while the use of the ‘magic factor’ star ensured we didn’t lose the all-important emotional and intuitive appeal of a concept amongst the very rationally-based discussions. Visually, the matrix enabled us to track our discussions quickly and effectively throughout the meeting. Globally, it provided us a common framework, language, and platform for decision making. It promoted quality thinking and focused thinking.

From a leadership perspective, the matrix enabled, indeed demanded, everyone’s involvement in making recommendations. Such collaboration in the preparation and the meeting itself inspired highly cooperative efforts in the follow-up development work, which led to very high quality results. The preparation work for the decision-making meeting also put us in a position that enabled a vastly accelerated time to market once the decisions were made. Everyone, not least the facilitation team, was thoroughly exhausted by the end!

As we told you last time, our new approach to product development led to reductions of 88% in idea development costs, over 50% in development time to launch, and an improvement of over 401% in our success rate.

Our decisions were clearly good ones. And the Evaluation Matrix helped us make them.

It’s amazing what a single tool can do!

Footnote: More Than Just Process

The PricewaterhouseCoopers Innovation and Growth study (Davis, 2000) clearly demonstrates the importance of paying attention to all three key capabilities of deliberate process, creative climate, and inclusive leadership if you want to be successful in your innovation efforts. Our two Communiqué articles have emphasized the power of deliberate process in impacting time, cost, and quality. Indeed, the specific need we addressed for the business unit was helping with deliberate process.  But it is important to note that our initiative was also aimed up against developing the climate necessary for high-level creative performance and ensuring that leadership was shared right from the creation of strategy through to decision making and implementation. The use of the Master Blaster Matrix is just one example of how we operationalized the three capabilities. Ultimately, it was paying attention to the whole system of deliberate process, inclusive leadership, and creative climate combined that led to such remarkable success in Family Education’s new product development.

 

Samantha Stead is the Editorial Director for Family Education, International Masters Publishers, Inc., Stamford, CT. She is also a certified Creative Problem Solving (CPS) Facilitator and is pursuing qualification as a CPS Trainer. Contact Samantha for further information or questions about this case study at sam@cpsb.com.

Brian Dorval is Director of Programs for CPS-B etc.

With thanks to Robert Botta, Senior Vice President, Family Education, International Masters Publishers, Inc., for the title ‘Master Blaster’.

Source: CPSB’s Communiqué, Vol. 11, pp. 24–27, 2001. © 2001 CPSB. Reprinted with permission.