
Do You Flinch When You Hear 'Design Review'?

When the call goes out for design reviews, do people volunteer, or do they hide in the break room or begin scheduling vacation?

To get the most out of your design reviews, employ a diversified panel of design expertise, stay focused on the purpose of the review, eliminate the “check-the-box” behavior, and ensure your panel approaches the review with a mindset of enhancing the design.

In my career I have participated in countless design reviews. Many were very constructive, while others were disastrous. In an age where we often focus on value-added activity and trim out concepts of “quality by inspection,” the often-painful design review gets a great deal of scrutiny and criticism.

I am the first person to recommend eliminating non-value-added activity and finding ways to do away with inspecting for quality. In some organizations, that idea means the engineering design review should go.

However, I’m of a mind that design reviews are a critical component of a robust design process. How can I say one thing and believe the other? It’s simple. Design reviews, done correctly, are absolutely value added and are part of the process of improving designs, not just a process of inspecting designs.

Take a moment and reflect on your organization’s design reviews of late. When the call goes out for design reviews, do people volunteer, or do they hide in the break room, find excuses to visit the manufacturing floor for a while, or begin scheduling vacation? Are they fun, or are they painful?

If you read my posts, you’ve heard me say it before: pain is the indicator of waste. If your design reviews are painful, then chances are they are wasteful. Let’s examine some of the common sources of waste in design reviews and discuss how to eliminate them. Then, I’d like to share some of the design review practices that I have witnessed to be most effective, and sometimes even fun.

First, let’s look at some common sources of waste and pain in design reviews. I submit the following for your consideration:

  • Preparation for the review.
  • “Did you do,” check-the-box phenomenon.
  • Detailed inspections of design models or drawings.
  • Chaotic or long, unstructured meetings.
  • Non-constructive interrogations.
  • Escapes or design defects that happen anyway.

Let’s briefly discuss each one and how to eliminate them. Hopefully, if your organization experiences others, this discussion will provide some clues or insights to solve them as well.

Perhaps the greatest source of waste in design reviews is the preparation itself. Do your design teams spend days assembling presentations to walk a review team through the design? Such preparation does absolutely nothing to further the design. It is waste of the “over-processing” category, in Lean terms.

If our organization of design information, such as requirements, risks and mitigations, and design materials, is any good, then the same documents, databases, or materials used to plan and execute the design should be clean enough to facilitate the design review. If it isn’t, then perhaps your organization should invest some effort in correcting that. Chances are that if you can’t use it for a design review, it is disorganized or confusing enough to cause wasted time or mistakes in the design process itself.

If you are compelled to develop presentations for executives or customers who are not interested in the details of the design, but just want a milestone they can experience or want to feel good about the design, then run a calculation of the man-hours that go into making them feel better and show them the numbers. Chances are that executives responsible for the bottom line, or customers who pay for your time, will begin to ask for suggestions to reduce the effort involved.
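As a rough illustration of that calculation, the arithmetic fits in a few lines. Every figure below is hypothetical; substitute your own headcount, hours, and loaded rate.

```python
# Hypothetical figures -- substitute your own headcount, hours, and rate.
engineers = 5            # people preparing the review package
prep_hours_each = 16     # two days each spent building slides
loaded_rate = 120.0      # fully loaded cost per engineer-hour, in dollars
reviews_per_year = 12

cost_per_review = engineers * prep_hours_each * loaded_rate
annual_cost = cost_per_review * reviews_per_year

print(f"Cost per review: ${cost_per_review:,.0f}")  # Cost per review: $9,600
print(f"Annual cost: ${annual_cost:,.0f}")          # Annual cost: $115,200
```

Eighty engineer-hours per review adds up quickly, and that is exactly the number to put in front of whoever is asking for the slides.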

If that still doesn’t eliminate the need for presentations, work to reduce the effort to a minimum. Find out fundamentally what the executives or customers want to know, and see if you can reduce the communication from a long and involved presentation to a simple dashboard that takes only a few minutes for an administrator or project manager to update.

My biggest pet peeve is the “check-the-box” design review. I’ll bet you are all familiar with it too. It’s the design review where you spend hours or days with (usually) managers asking, “Did you do this or that,” and the design team showing documentation or data to prove that it did. This is absolute waste.

Let’s think about it for a moment. If the team knows what is expected or follows a design process, then the answer to, “Did you do,” should be “yes.” If the team didn’t follow the process then you have bigger problems than the design review answer of, “no.” And if the answer is “no,” what is the right thing to do?

If you make the team go back and do whatever was skipped, that might be nice for enforcing the rules, but it’s probably too late for the result to improve or perfect the design. Here’s an example.

Suppose the team, under pressure to design more quickly, decided to cut a corner and skipped the required Failure Modes and Effects Analysis (FMEA). The point of FMEA is to proactively identify risks and mitigate them. By the time the design review rolls around, the design might be done and ready for production. What’s the point of mitigating risk now? The risks the FMEA might have designed out are already baked into the design.
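For readers who haven’t run one, the core arithmetic of an FMEA is the Risk Priority Number: severity times occurrence times detection, each scored on a 1-to-10 scale, with the highest products mitigated first. Here is a minimal sketch; the failure modes and scores are invented for illustration.

```python
# Classic FMEA arithmetic: RPN = severity * occurrence * detection,
# each typically scored from 1 (best) to 10 (worst).
# The failure modes and scores below are invented for illustration.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Seal degrades at high temperature", 8, 4, 6),
    ("Connector mis-mates during assembly", 5, 7, 3),
    ("Watchdog fails to reset the controller", 9, 2, 8),
]

for desc, sev, occ, det in failure_modes:
    print(f"{desc}: RPN = {sev * occ * det}")
```

The value of those numbers is entirely front-loaded: they tell you where to spend design effort before the design hardens, which is exactly why running the analysis after the fact accomplishes nothing.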

To eliminate the check-the-box, non-value-added design review, simply change the question. Stop asking if the team followed the process. Following the process should be the team and the managers’ real-time responsibility, not an end-of-line check. Instead, ask, “How do you know…?”

Ask the team to prove or demonstrate how the design satisfies the requirements. Ask the team to show how risks have been identified and addressed. This changes the focus from following process and checking boxes to proving out the design, which is what the design review is supposed to be.

Detailed inspections of design models and drawings, or of programming code, should be part of the design process. Everywhere I have ever worked, benchmarked, or consulted had a practice of a second designer signing his or her name to code or drawings, indicating that he or she reviewed it and that the quality and practice are appropriate.

If such a practice has any integrity, using design reviews to review models and drawings is a waste of time. Again, it is over-processing: non-value-added work. Stop the practice by putting some teeth into the “checked-by” block of your design documentation.

If an error is found, call the owners of both names to the table and make them fix it. Let there be consequences for the checker having checked the box without putting forth the correct effort.

Chaotic and long, unstructured reviews are painful, and therefore, most certainly wasteful. In such reviews, how do we know that we are addressing the important items or that we aren’t forgetting to investigate something relevant?

Keep meetings short with two techniques. First, use your risk mitigation practices to know or identify the elements of the design that most need to be scrutinized. Perhaps the solution is controversial, or the requirements are challenged or difficult to fulfill. Start with the showstoppers. There’s no sense in reviewing the rest if a controversial element sends us back to the drawing board.
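If your risk practice already produces scores, FMEA Risk Priority Numbers or anything comparable, building that agenda can be as simple as sorting: showstoppers first, then everything else by descending risk. A small sketch, with hypothetical items and scores:

```python
# Order the review agenda: showstoppers first, then by descending risk score.
# Items and scores are hypothetical; use whatever measure your process produces.
open_items = [
    {"item": "Thermal margin on power stage", "risk": 192, "showstopper": True},
    {"item": "Enclosure draft angles", "risk": 40, "showstopper": False},
    {"item": "Supplier qualification for new alloy", "risk": 144, "showstopper": True},
]

agenda = sorted(open_items, key=lambda i: (not i["showstopper"], -i["risk"]))
for slot, entry in enumerate(agenda, start=1):
    print(f"{slot}. {entry['item']} (risk {entry['risk']})")
```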

Second, have a standard review practice. If you have multiple reviews at different stages of design, each one can have its own recipe. If everyone knows the recipe, then everyone knows how to save their questions for the correct time and we can be sure that we review all the right elements.

Just keep in mind that a recipe is no substitute for common sense. If something out-of-the-ordinary needs to be discussed, adjust the agenda appropriately.

Non-constructive interrogations are usually the result of members of the review team having misperceptions about the purpose of the design review. The design review should always be about improving the design. It is not about finding fault with the design team’s practices. It is a very important distinction.

Coach reviewers concerning the proper attitude and questions they should prepare in order to execute a meaningful design review. Remind them that their purpose is to help the team identify ways to improve the design before moving to the next step. It is not to dig, and scratch, and gouge, and torment until the team admits to some flaw.

If your design review practice generates a habit of prioritizing discussions of controversial elements first, and demonstrates an attitude of helping to improve designs, then design teams will habitually identify potential design weaknesses first and seek that help. When this happens, we are set up to truly improve designs with our design reviews.

We don’t have to interrogate teams like adversaries to get them to confess their weaknesses. In fact, that very behavior is a huge contributor to the prepared presentations and check-the-box behaviors that design teams use as a means of managing their managers. Make it stop.

Escapes and design defects that slip through design reviews have two general sources. The first is a design process that fails to prevent those mistakes. The second is a design review that is incapable of screening them out.

Ideally, we should not want our design reviews to be about catching mistakes, and our design process should be robust enough to prevent them from happening in the first place. Here’s the bottom line, and I’m afraid it repeats everything we have already discussed.

Prepared-presentation, check-the-box, hostile-interrogation design reviews do not catch mistakes. If anything, they encourage design teams to hide mistakes. Make a behavioral, cultural, habitual change in your design reviews and bolster your design process.

Design reviews effectively focused on improving the design encourage discussion of troublesome elements and bring challenges to the surface. A diversified review team can bring extra ideas and expertise to those challenges and propose improvements.

Now, before you run off and start fixing problems with your design review practices, please allow me to share some of the great practices that I have experienced and that can turn design review sessions into something truly profound. It will only take a few short paragraphs and then, perhaps, you will have some new ideas to work with too.

A diversified review team is a huge asset. Design reviews that include cross-functional representation tend to tease out questions and considerations that reviews limited to engineers and engineering managers miss.

We can make cross-functional reviews quicker and less painful by conducting discussions with appropriate audiences. Review the cost trade-offs with program management and executives. Discuss answers to the requirements and aesthetic elements with sales and marketing. Discuss manufacturability with production teams.

Inviting technical experts with absolutely no involvement in the project is a similarly excellent practice. I worked with an organization that instilled a practice of inviting engineers and technologists from other sectors or businesses within the corporation to design reviews.

Not only did the participation of these expert outsiders tease out design possibilities, but it was highly engaging, even fun, to learn technology and design lessons from peers and people whom we had never before met. Also, those same individuals would consult with us about their designs, either in similar reviews or over the phone. It became an in-house social network of design expertise. Powerful.

It required an investment of resources to provide travel and time for these experts to participate, but the ideas that were generated could be priceless. One way to minimize time away from projects and travel expense was to leverage network-meeting capabilities to brief everyone on the product and design intent before they traveled.

Doing so gave reviewers time to explore the high-level aspects of the design and prepare their questions. Many times this made for time well spent in transit or in hotels.

Minds that are already “out of the box” and are familiar with the technologies incorporated can often disclose alternatives the design team didn’t consider. The best time for such a review is at the concept decision point of the design process.

By inviting new ideas before detailed development begins, when we think we have a winning concept, we have a real opportunity to make a winning idea even better. It’s too late to capitalize on out-of-the-box ideas when you are done with detailed design and are ready to test.

In fact, the most important place in the design process for a constructive design review is at concept selection. The selection of the idea or concept that goes into development and production is the most important decision of the product development process.

Concept selection determines product cost and customer satisfaction, and it significantly influences and bounds quality and producibility. If you do only one design review, do it here.

The most efficient, most effective, and least painful design review practice that I have experienced used several design reviews with specific agendas at different stages of the design process. The largest was the one at concept selection.

Additional design reviews were conducted after concept testing (if any took place), after detailed design but before validation testing, and before product launch. Suffice it to say that each one’s purpose was to answer the following question: “How do we know we are ready to go to the next step, and that we won’t be reworking the design, or the step, as a result?”

Redoing testing because we forgot something is an expensive and time-consuming mistake. Entering production without all of our documentation and supply chain elements in place can cause disaster. Entering production with a product that is not producible also causes disaster. Rework is waste caused by defects.

My final piece of advice is the reason this post is worth reading to the end. It addresses perhaps the biggest reason that even well-conducted design reviews fail to provide value.

When you plan your design reviews into your schedule, plan for a period afterward to incorporate findings into the design, before proceeding to the next activity. What is the point of an exercise to identify ways to improve a design if you don’t enable the team to act on those findings?

If your schedules won’t allow a team to improve the design after the review, then admit your design reviews are absolutely non-value-added, and eliminate the practice altogether. You are better off investing your time praying.

If your team doesn’t need the time to incorporate improvements then your schedule just got accelerated. But, I caution you. If the team didn’t have anything to improve, either your design process is so robust that you can afford to eliminate the design review practice, or your design review wasn’t effective. My money is on the latter.

As I wrote above, I have experienced design reviews from the profound and fun to the absolutely disastrous, and everything in between. Take some time to consider the observations I shared above. Bring them to your design teams and ask for their observations, too.

Design reviews are one of those practices that are what we make them. Consider your own carefully, with an eye for waste and pain. Make some changes to get rid of the pain and make your reviews truly value-added. Your design teams will appreciate it, and your products and production will improve.

Stay wise, friends.
