Reflections on Product Quality - Comments

SUMMARY

The forum discussion centers on the critical distinction between product quality and production quality, emphasizing that quality must be designed into products rather than tested in. Dr. Courtney, formerly the main RF test engineer at Cisco Systems, highlights the importance of rigorous testing against design specifications, noting that many design flaws only emerge during mass production and that thorough testing is needed to confirm the desired quality level has been reached. The conversation also touches on consumer perceptions of quality, particularly in the context of modern technology, where rapid obsolescence often overshadows traditional quality metrics.

PREREQUISITES
  • Understanding of product lifecycle management
  • Familiarity with quality assurance testing methodologies
  • Knowledge of design validation processes
  • Experience with production yield analysis
NEXT STEPS
  • Research "Quality Assurance in Product Development" methodologies
  • Explore "Design for Manufacturing and Assembly (DFMA)" principles
  • Learn about "Statistical Process Control (SPC)" in manufacturing
  • Investigate "Failure Mode and Effects Analysis (FMEA)" techniques
USEFUL FOR

This discussion is beneficial for product managers, quality assurance professionals, hardware engineers, and anyone involved in the design and manufacturing processes seeking to enhance product quality and reliability.

Svein submitted a new PF Insights post:

Reflections on Product Quality



Continue reading the Original PF Insights Post.
 
It is impossible to test quality into a product. Quality must be designed into the product.

Reference https://www.physicsforums.com/insights/reflections-product-quality/

Some years ago I was the main RF test guy at Cisco Systems. I developed an automated test system that provided both the factory-floor tests AND a superset of those tests used for design validation.

During the design validation phase of many products, there was a running debate with the design engineers over whether what my system was reporting represented design flaws or manufacturing flaws in exemplars of the early design. I was under constant pressure to tweak the system to PASS the product so we could get it to market more quickly. My refrain was, "Let's fix the product, not the tests." The tests, while often complex, were a relatively straightforward implementation of testing to the specifications set out in the design goals.
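As a rough illustration of what that kind of limit testing looks like (a minimal sketch, not Dr. Courtney's actual system; the parameter names and limits below are invented):

```python
# Minimal sketch of spec-limit testing: each measured parameter is compared
# against the limits stated in the design specification. The spec names and
# numbers below are invented for illustration only.

SPEC_LIMITS = {
    # parameter: (lower limit, upper limit, unit)
    "tx_power_dbm":   (14.0, 17.0, "dBm"),
    "evm_percent":    (0.0, 3.0, "%"),
    "freq_error_ppm": (-2.5, 2.5, "ppm"),
}

def check_against_spec(measurements: dict) -> list[str]:
    """Return a list of failure messages; an empty list means the unit passed."""
    failures = []
    for name, (lo, hi, unit) in SPEC_LIMITS.items():
        value = measurements.get(name)
        if value is None:
            failures.append(f"{name}: not measured (cannot be 'guaranteed by design')")
        elif not (lo <= value <= hi):
            failures.append(f"{name}: {value} {unit} outside [{lo}, {hi}] {unit}")
    return failures

if __name__ == "__main__":
    unit_under_test = {"tx_power_dbm": 16.2, "evm_percent": 3.4, "freq_error_ppm": 0.8}
    problems = check_against_spec(unit_under_test)
    print("PASS" if not problems else "FAIL:\n" + "\n".join(problems))
```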

So while I agree that you cannot test quality into a product, testing can certainly determine that the design has not yet achieved the level of desired quality.

This is an important point, because the design engineers often wanted to argue that a given specification need not be tested, claiming that it was "guaranteed by design." I often answered, "If it is guaranteed by design, then why does it keep failing the test?" Eventually, I adopted the philosophy, "If it has not been tested, then it does not work." This is not an absolute truth claim, just an expression of my conviction that a company needs to test its product against ALL of its specifications before gaining confidence that things are working as designed.
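In the same spirit, the "test everything" philosophy can be checked mechanically: before a design is declared verified, confirm that every item in the engineering spec is exercised by at least one test. Again a hypothetical sketch with placeholder names:

```python
# Sketch of a spec-coverage check: every specification item must be exercised
# by at least one test before the design is considered verified. Names are
# placeholders, not taken from any real test plan.

SPEC_ITEMS = {"tx_power_dbm", "evm_percent", "freq_error_ppm", "spurious_emissions"}

# Mapping from test name to the spec items that test actually exercises.
TEST_PLAN = {
    "test_tx_power": {"tx_power_dbm"},
    "test_modulation_quality": {"evm_percent", "freq_error_ppm"},
}

covered = set().union(*TEST_PLAN.values())
untested = SPEC_ITEMS - covered

if untested:
    # "If it has not been tested, then it does not work."
    print("Unverified spec items:", ", ".join(sorted(untested)))
else:
    print("Every spec item has at least one test.")
```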

The hardware designer is always responsible for the production yield. The purpose of production tests is to check production quality, not product quality.

Reference https://www.physicsforums.com/insights/reflections-product-quality/

The distinction between production quality and product quality is somewhat artificial, and completely artificial from the customer's perspective. What he buys either meets the specs or it doesn't. He doesn't care why, nor should he have to.

There are inevitably design flaws that do not show up until the product is mass produced at or near full volume. Other design flaws may not show up until production is moved from one factory to another, because only then is the design's sensitivity to some manufacturing detail, previously thought insignificant, revealed.

At the same time, the notion that the hardware designer is always responsible for the production yield ignores some of the nonsense and noncompliance that can occur on the factory floor, as well as some of the quality issues that can arise when components suddenly come in out of spec.

When the prototype fulfills all marketing requirements, you are only 40% done

Reference https://www.physicsforums.com/insights/reflections-product-quality/

In my experience, people in a company tend to think of the "real" marketing requirements as the subset of the engineering specs they believe customers really care about, whereas I always thought of them as the complete engineering spec. The motive was often to have an excuse to ship product that fulfilled what "customers really want" but was out of spec in areas that "customers don't care about." In cases where this dichotomy exists, one needs to distinguish between the complete engineering specs and the "marketing requirements."
 
Dr. Courtney said:
So while I agree that you cannot test quality into a product, testing can certainly determine that the design has not yet achieved the level of desired quality.
I remember - many years ago I was with Tandberg Data. One division was producing tape streamers and had put a great deal of effort into the design. They then told a Japanese customer, somewhat proudly, that their yield was now 99.95% (i.e. a 0.05% failure rate in the final tests). The customer responded: "Congratulations! You are now #3 on our list - from the bottom!".

The result: One of the leading hardware engineers was given the responsibility to fix the product quality and the production quality. It took two years of intensive work, but they ended up as #3 from the top.

A digression: In another context (where I was a consultant), I learned that the main product had a 40% failure rate in the production test. My response was: "Are you kidding? How can you live with such a bad product / production quality?" Needless to say, they were offended and told me to keep my opinions to myself...
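For a sense of scale on those two anecdotes, here is a quick back-of-the-envelope conversion of final-test yield into defective units per million, using the figures quoted above:

```python
# Back-of-the-envelope conversion of final-test yield to defective units per
# million passing through that test. Figures echo the ones mentioned above.

def defects_per_million(yield_fraction: float) -> float:
    return (1.0 - yield_fraction) * 1_000_000

for label, y in [("Tape streamers (99.95% yield)", 0.9995),
                 ("Consulting client (60% yield, 40% fallout)", 0.60)]:
    print(f"{label}: {defects_per_million(y):,.0f} defective units per million")
# 99.95% yield ->     500 per million
# 60% yield    -> 400,000 per million
```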
 
I think there is some validity to the notion that the public is not always interested in quality.

Good article, congratulations. But let me present a different view on the value of product quality.

Consider the world of phone apps. Perfection is wasted on an app that people don't want, and defects are not sufficient to defeat a popular app.

Think of AT&T phones from the 1970s. They were top quality, built to last for 40 years. Who wants to pay for that quality today? Today's consumers are eager to throw away their smartphones after a year or two so that they can get the newest versions with extra features. Reliability, ruggedness, longevity, maintainability, and repairability are all deprecated in the modern smartphone market. Bugs in the software are simply patched at the next forced update.

I think that the buying public is suspicious of claims of additional quality justifying higher prices. Audio cables are a good example. Scam artists exploit false claims of quality (as well as every other falsehood they can imagine), thus turning customers into skeptics. I'm such a skeptic. I find that often (not always) I get the best value for my money at the dollar store.

None of these things dispute what you said in the Insights article. I just wanted to point out that product quality is not always the most important feature.
 
anorlunda said:
I just wanted to point out that product quality is not always the most important feature.
Not in the consumer market, probably. But would you feel comfortable in an oceangoing boat where the final test fallout was 10% or above? Or, for that matter, in an airplane with similar production problems?
 
Svein said:
Not in the consumer market, probably. But would you feel comfortable in an oceangoing boat where the final test fallout was 10% or above? Or, for that matter, in an airplane with similar production problems?

Of course not. My background is in nuclear plant controls, which are even more stringent. But the Insights article didn't say that it was limited to critical applications, and your advice on how to achieve product quality does apply to all kinds of software.

My post was not criticism; it merely pointed out that we shouldn't look down our noses at low quality in every domain.
 