One of the key tools that we used was the formal design review process. We used it throughout complete product developments – it was an integral part of our approach to ISO 9001, and we were trained in how to do it – but it was of even greater benefit when we were parachuted in to rescue a project.
I have used the same technique with clients ever since.
However clever the designers, however sophisticated the design, this approach finds bugs. Peer-reviewing new designs before you commit a lot of time and money can be hugely beneficial in preventing problems further downstream… if done properly.
Your people will naturally be capable of finding design weaknesses if given the opportunity, environment and culture that encourages them to do so, even if – especially if – they aren’t personally involved in that part of the design.
The review is done by a select group of peers (colleagues) from different disciplines: electronic engineers, mechanical engineers, system architects, manufacturing people, software experts, and so on, under the chairmanship of an experienced reviewer – not the designer(s) themselves.
The timing of the review is usually set by project management, but it is typically at a point in the project where a significant commitment of time, money or risk is about to be made, e.g. the release of design details into prototype manufacturing.
For a complex design, the requirements specification, functional specification and design documentation should be circulated in advance so that the attendees can spend time understanding and assessing them for themselves. The design review meeting then reviews and challenges their findings.
For a simple design, or an iteration, the findings can usually be derived on-the-fly during the meeting itself.
In both instances the meeting decides on the relative importance of the findings and identifies the actions that need to be taken. These are documented in a meeting note or minutes, and the actions are progressed to a conclusion through project or line management.
To help guide the review, give it structure, and avoid omitting key questions, I have always found it beneficial to use a detailed checklist. Added to over time, it can become a ‘superset’ of all possible questions. Many will not be applicable in any given circumstance, so can be omitted, but it’s a way to avoid leaving anything out. And it captures best practice for your products and industry.
There isn’t room here to reproduce a generic checklist – in any case it should be bespoke to you and your business – but, for illustration, I would expect an electronic or electro-mechanical checklist to cover:
…and so on (the full checklist asks much more detailed questions, of course).
I suggest that you draw up a checklist specific to your own products and technologies, then evolve it over time in the light of experience – and as a Corrective Action whenever you find that a design shortcoming has slipped through its safety net.
In any case, it isn’t the list itself that’s important, it’s the things that going through the list – and asking questions of each other in a constructive way – brings up.
And, as a bonus, it’s a very effective way of addressing ISO 9001 Section 8.3.4.
Instinctively, some design engineers don’t like this process. If it’s not done well they can feel like they are under unfair pressure or criticism. After one such review an engineer said to me “it was a waste of time, most questions were irrelevant, it took too long”. “Sorry to hear that”, I replied, “so you didn’t find anything that could be improved?” “Oh yes, we spotted some things we definitely needed to change…”
QED!
The fix for their reluctance? Make it constructive not critical, make it relevant, show how effective it can be as a design safety-net, and make them part of developing the process so they are passing on their experience and knowledge to others.
By finding and fixing the design shortcomings and risks at this stage you can prevent hugely expensive field failures or product recalls.