Archive for Manufacturing Training

Applying Taguchi to Load Development

Posted in Manufacturing Improvement on August 4, 2013 by manufacturingtraining

This blog entry describes the application of the Taguchi design of experiments technique to .45 ACP load development in a Smith and Wesson Model 25 revolver.


Taguchi testing is an ANOVA-based approach that evaluates the impact of several variables simultaneously while minimizing sample size.  It is a powerful technique because it identifies which factors have a statistically significant influence on an output parameter of concern and which do not.

Both categories of factors are good things to know.  If we know which factors are significant, we can control them to achieve a desired output.   If we know which factors are not significant, it means they require less control (thereby offering cost reduction opportunities).

The output parameter of concern in this experiment is accuracy.   When performing a Taguchi test, the output parameter must be quantifiable; this experiment satisfies that requirement by measuring group size.   The input factors under evaluation are propellant type, propellant charge, primer type, bullet weight, brass type, bullet seating depth, and bullet crimp.  These factors were arranged in a standard Taguchi L8 orthogonal array as shown below (along with the results):

[Table: Taguchi L8 orthogonal array showing the seven factors, their levels for each of the eight test configurations, and the measured group sizes]

As the above table shows, three sets of data were collected.  We tested each load configuration three times (Groups A, B, and C) and we measured the group size for each 3-shot group.
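To make the structure of the test concrete, here is a minimal Python sketch of how an L8 test matrix and its group-size results could be organized for analysis.  The L8 array itself is the standard one; the factor names follow the list above, but the group sizes are hypothetical placeholders rather than the measured values in the table.

```python
# A minimal sketch of organizing an L8 test matrix and group-size data.
# The group sizes below are hypothetical placeholders -- the actual
# measurements are in the table above.
import pandas as pd

# Standard Taguchi L8 (2^7) orthogonal array: 8 runs, 7 two-level factors
l8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]
factors = ["propellant", "charge", "primer", "bullet_weight",
           "brass", "seating_depth", "crimp"]
design = pd.DataFrame(l8, columns=factors)

# Hypothetical 3-shot group sizes (inches) for Groups A, B, and C
design["group_A"] = [0.9, 1.1, 0.8, 1.2, 1.0, 0.7, 1.3, 0.9]
design["group_B"] = [1.0, 0.9, 0.7, 1.1, 0.8, 0.9, 1.2, 1.0]
design["group_C"] = [0.8, 1.0, 0.9, 1.3, 0.9, 0.8, 1.1, 0.8]
design["mean_group"] = design[["group_A", "group_B", "group_C"]].mean(axis=1)

print(design)
```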

After accomplishing the above, we prepared the standard Taguchi ANOVA evaluation to assess which of the above input factors most influenced accuracy:

[Table: Taguchi ANOVA evaluation ranking the influence of each input factor on group size]

The above results suggest that crimp (or lack thereof) has the greatest effect on accuracy.   The results indicate that rounds with no crimp are more accurate than rounds with the bullet crimped.
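For readers who want to see the mechanics behind that ranking, here is a minimal sketch of a single main-effect calculation for the crimp factor, using the same hypothetical group sizes as the sketch above (the level coding is ours).  Repeating it for each of the seven factor columns and ranking the absolute effects gives the kind of evaluation shown in the table.

```python
# A sketch of one main-effect calculation.  With an orthogonal array, a
# factor's effect is the difference between the mean response at its two
# levels.  All values are hypothetical placeholders.
import pandas as pd

data = pd.DataFrame({
    # hypothetical coding: 1 = crimped, 2 = no crimp (column 7 of the L8 array)
    "crimp":      [1, 2, 2, 1, 2, 1, 1, 2],
    "mean_group": [0.9, 1.0, 0.8, 1.2, 0.9, 0.8, 1.2, 0.9],  # inches, hypothetical
})

level_means = data.groupby("crimp")["mean_group"].mean()
effect = level_means.loc[2] - level_means.loc[1]   # negative => no-crimp groups are smaller
print(level_means)
print(f"Crimp effect on mean group size: {effect:+.3f} inch")
```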

We can’t simply stop here, though.  We have to assess whether the results are statistically significant.   Doing so requires performing an ANOVA on the crimp versus no crimp results.  Using Excel’s data analysis feature (the f-test for two samples) on the crimp-vs-no-crimp results shows the following:

[Table: Excel f-test output comparing the crimp and no-crimp group sizes]

Since the calculated f-ratio (3.817) does not exceed the critical f-ratio (5.391), we cannot conclude that the findings are statistically significant at the 90% confidence level.  If we allow a lower confidence level (80%), the results are statistically significant, but we usually would like at least a 90% confidence level for such conclusions.
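For readers who prefer to run this kind of check outside Excel, here is a minimal Python sketch of a two-sample f-test (a ratio of sample variances, which is what Excel’s two-sample f-test computes).  The group sizes below are hypothetical placeholders, so the numbers will differ from the table above.

```python
# A sketch of a two-sample f-test comparing crimped and uncrimped group sizes.
# The data are hypothetical placeholders -- substitute the measured values
# to reproduce the kind of result shown above.
import numpy as np
from scipy import stats

crimped  = np.array([1.0, 0.9, 1.1, 1.0])   # hypothetical mean group sizes, inches
no_crimp = np.array([0.8, 1.0, 0.7, 0.9])   # hypothetical mean group sizes, inches

var_crimped = np.var(crimped, ddof=1)
var_no_crimp = np.var(no_crimp, ddof=1)
f_ratio = max(var_crimped, var_no_crimp) / min(var_crimped, var_no_crimp)
df = len(crimped) - 1                        # 3 degrees of freedom per sample
f_critical = stats.f.ppf(0.90, df, df)       # one-tailed critical value, 90% confidence

print(f"f-ratio = {f_ratio:.3f}, critical f at 90% confidence = {f_critical:.3f}")
if f_ratio > f_critical:
    print("The difference is statistically significant at the 90% confidence level.")
else:
    print("We cannot conclude significance at the 90% confidence level.")
```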

So what does all the above mean?   Here are our conclusions from this experiment:

  • This particular revolver shoots any of the loads tested extremely well.  Many of the groups (all fired at a range of 50 feet) were well under an inch.
  • Shooter error (i.e., inaccuracies resulting from the shooter’s unsteadiness) overpowers any of the factors evaluated in this experiment.

Although the test did not show the results to be statistically significant, this is still good information to have.  What it means is that any of the test loads can be used with good accuracy (as stated above, this revolver is accurate with any of the loads tested).  It suggests (but does not confirm to a 90% confidence level) that the absence of a bullet crimp will result in greater accuracy.

The parallels to design and process challenges are obvious.   We can use the Taguchi technique to identify which factors are critical so that we can control them to achieve desired product or process performance requirements.   As significantly, Taguchi testing also shows which factors are not critical.  Knowing this offers cost reduction opportunities because we can relax tolerances, controls, and other considerations in these areas without influencing product or process performance.

If you’d like to learn more about Taguchi testing and how it can be applied to your products or processes, please consider purchasing Quality Management for the Technology Sector, a book that includes a detailed discussion of this fascinating technique.

And if you’d like a more in-depth exposure to these techniques, please contact us for a workshop tailored to your needs.


Statistical Tolerance Analysis

Posted in Creativity, Manufacturing Improvement on June 17, 2013 by manufacturingtraining

Dimensional tolerances specify allowed variability around nominal dimensions.   We assign tolerances to assure component interchangeability while meeting performance and producibility requirements.  In general, as tolerances become smaller manufacturing costs become greater.  The challenge becomes finding ways to increase tolerances without sacrificing product performance and overall assembly conformance.  Statistical tolerance analysis provides a proven approach for relaxing tolerances and reducing cost.

Before we dive into how to use statistical tolerance analysis, let’s first consider how we usually assign tolerances.  Tolerances should be based on manufacturing process capabilities and requirements in the areas of component interchangeability, assembly dimensions, and product performance.  In many cases, though, tolerances are based on an organization’s past tolerance practices, standardized tolerance assignment tables, or misguided attempts to improve quality by specifying needlessly-stringent tolerances.  These latter approaches are not good: they often induce fit and performance issues, and organizations that use them often leave money on the table.

There are two approaches to tolerance assignment – worst case tolerance analysis and statistical tolerance analysis.

In the worst case approach, we analyze tolerances assuming components will be at their worst case conditions.  This seemingly innocuous assumption has several deleterious effects.  It requires larger than necessary assembly tolerances if we simply stack the worst case tolerances.   On the other hand, if we start with the required assembly tolerance and use it to determine component tolerances, the worst case tolerance analysis approach forces us to make the component tolerances very small. Here’s why this happens:   The rule for assembly tolerance determination using the worst case approach is:

Tassy = ΣTi

where

Tassy = assembly tolerance

Ti = individual component tolerances

The worst case tolerance analysis and assignment approach assumes that the components will be at their worst case dimensions; i.e., each component will be at the extreme edge of its tolerance limits.  The good news is that this is not a realistic assumption.  It is overly conservative.

Here’s more good news:  Component dimensions will most likely be normally distributed between the component’s upper and lower tolerance bounds, and the probability of actually being at the tolerance limits is low.   The likelihood of all of the components in an assembly being at their upper and lower limits is even lower.  The most likely case is that individual component dimensions will hover around their nominal values.  This reasonable assumption underlies the statistical tolerance analysis approach.

We can use statistical tolerance analysis to our advantage in three ways:

  • If we start with component tolerances, we can assign a tighter assembly tolerance.
  • If we start with the assembly tolerance, we can increase component tolerances.
  • We can use combinations of the above two approaches to provide tighter assembly tolerances than we would use with the worst case tolerance analysis approach and to selectively relax component tolerances.

Statistical tolerance analysis uses a root sum square approach to develop assembly tolerances based on component tolerances.   In the worst case tolerance analysis approach discussed above, we simply added all of the component tolerances to determine the assembly tolerance.  In the statistical tolerance analysis approach, we find the assembly tolerance based on the following equation:

Tassy = (Σ Ti²)^(1/2)

Using the above formula is straightforward.  We simply square each component tolerance, take the sum of these squares, and then find the square root of the summed squares to determine our assembly tolerance.
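Expressed as code, the calculation is a one-liner.  Here is a minimal Python sketch (the function name is ours, not a standard library routine):

```python
# Root sum square (RSS) assembly tolerance -- a direct translation of the
# formula above: square each component tolerance, sum the squares, take the
# square root of the sum.
import math

def rss_assembly_tolerance(component_tolerances):
    return math.sqrt(sum(t ** 2 for t in component_tolerances))

# Example: two components toleranced at +/- 0.001 inch and one at +/- 0.002 inch
print(rss_assembly_tolerance([0.001, 0.001, 0.002]))   # about 0.00245 inch
```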

Sometimes it is difficult to understand why the root sum square approach is appropriate.   We can think of this along the same lines as the Pythagorean theorem, in which the length of the hypotenuse of a right triangle is equal to the square root of the sum of the squares of the other two sides.  Or we can think of it as distance from an aim point.  If we have an inch of vertical dispersion and an inch of horizontal dispersion, the total dispersion is 1.414 inches, as we see below:

[Figure: one inch of vertical dispersion and one inch of horizontal dispersion combining to 1.414 inches of total dispersion]

To continue our discussion on statistical tolerance analysis, consider this simple assembly with three parts, each with a tolerance of ±0.002 inch:

[Figure: a three-part assembly, with each part dimension toleranced at ±0.002 inch]

The worst case assembly tolerance for the above design is the sum of all of the component tolerances, or ±0.006 inch.

Using the statistical tolerance analysis approach yields an assembly tolerance based on the root sum square of the component tolerances.  It is (0.002² + 0.002² + 0.002²)^(1/2), or 0.0035 inch.  Note that the statistically-derived tolerance is 42% smaller than the 0.006 inch worst case tolerance.   That’s a very significant decrease.

Based on the above, we can assign a tighter assembly tolerance while keeping the existing component tolerances.  Or, we can stick with the worst case assembly tolerance (assuming this is an acceptable assembly tolerance) and relax the component tolerances.   In fact, this is why we usually use the statistical tolerance analysis approach – for a given assembly tolerance, it allows us to increase the component tolerances (thereby lowering manufacturing costs).

Let’s continue with the above example to see how we can do this.   Suppose we increase the tolerance of each component by 50% so that the component tolerances go from 0.002 inch to 0.003 inch.   Calculating the statistically-derived tolerance in this case results in an assembly tolerance of 0.0052 inch, which is still below the 0.006 inch worst case assembly tolerance.   This is very significant:  We increased the component tolerances by 50% and still came in with an assembly tolerance less than the worst case assembly tolerance.  We can even double one of the component tolerances to 0.004 inch while increasing the other two by 50% and still lie within the worst case assembly tolerance.  In this case, the statistically-derived assembly tolerance would be (0.003² + 0.003² + 0.004²)^(1/2), or 0.0058 inch. It’s this ability to use statistical tolerance analysis to increase component tolerances that is the real money maker here.
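Here is a short sketch that reproduces the arithmetic above (the tolerances are those from the example; the helper function is ours):

```python
# Reproducing the worked example: worst case vs. root sum square assembly tolerances.
import math

def rss(tolerances):
    """Root sum square assembly tolerance."""
    return math.sqrt(sum(t ** 2 for t in tolerances))

original      = [0.002, 0.002, 0.002]   # three parts, each +/- 0.002 inch
relaxed_50pct = [0.003, 0.003, 0.003]   # every tolerance relaxed by 50%
relaxed_mixed = [0.003, 0.003, 0.004]   # two relaxed 50%, one doubled

print(f"Worst case (original):           {sum(original):.4f} inch")   # 0.0060
print(f"RSS (original):                  {rss(original):.4f} inch")   # 0.0035
print(f"RSS (all relaxed 50%):           {rss(relaxed_50pct):.4f} inch")  # 0.0052
print(f"RSS (two at 0.003, one at 0.004): {rss(relaxed_mixed):.4f} inch")  # 0.0058
```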

The only disadvantage to the statistical tolerance analysis approach is that there is a small chance we will violate the assembly tolerance.   An implicit assumption is that all of the components are produced using capable processes (i.e., each part’s tolerance limits correspond to at least ±3σ of its process, so essentially all parts produced lie within their tolerance limits).  This really isn’t much of an assumption (whether you are using statistical tolerance analysis or worst case tolerance analysis, your processes have to be capable).  With a statistical tolerance analysis approach, we can predict that 99.73% (not 100%) of all assemblies will meet the required assembly dimension.  This relatively small predicted rejection rate (just 0.27%) is usually acceptable.  In practice, when the assembly dimension is not met, we can usually address it by simply selecting different components to bring the assembly into conformance.
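The 99.73% prediction follows directly from the normal distribution assumption with tolerance limits sitting at exactly ±3σ for each part.  As an illustration, here is a quick Monte Carlo sketch under that assumption, using the component tolerances from the example above:

```python
# Monte Carlo check of the predicted assembly conformance rate, assuming each
# component dimension is normally distributed with its tolerance limits at
# exactly +/- 3 sigma.
import numpy as np

rng = np.random.default_rng(0)
tolerances = np.array([0.002, 0.002, 0.002])      # component tolerances from the example
sigma = tolerances / 3.0                          # +/- 3 sigma equals the tolerance limits
rss_tolerance = np.sqrt(np.sum(tolerances ** 2))  # ~0.0035 inch assembly tolerance

n = 1_000_000
# Each assembly's deviation from nominal is the sum of its component deviations.
deviations = rng.normal(0.0, sigma, size=(n, len(tolerances))).sum(axis=1)
within = np.abs(deviations) <= rss_tolerance
print(f"Fraction of assemblies within the RSS tolerance: {within.mean():.2%}")
# Expect roughly 99.73%, i.e., about a 0.27% predicted rejection rate.
```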

Leaving money on the table…

Posted in Manufacturing Improvement on April 10, 2012 by manufacturingtraining

On the subject of drawing tolerances, many organizations leave a lot of money on the table.   This is an important area from both cost reduction and quality perspectives.  Here’s a question for  you:  How does your organization assign tolerances?

Common approaches for tolerance selection include the following:

  • In some organizations, tolerances are based on the nominal dimension.  Dimensions up to 1 inch might get a tolerance of ± 0.001 inch, dimensions up to 5 inches might get a tolerance of ± 0.01 inch, and everything above 5 inches might get a tolerance of ± 0.05 inch.  This makes the designer’s work easy, but it is a poor practice.
  • In some organizations, tolerances are based on decimal places.  If the designer specifies a nominal dimension of, say, 1.000 inch (3 decimal places), the tolerance might be ± .001 inch (all 3-decimal-place dimensions are assigned a ± .001 inch tolerance).  If the designer specifies a nominal dimension of 1.00 inch (2 decimal places), the tolerance is ± .01 inch.  The tolerances are restricted to fixed steps, and it’s not likely the steps correspond to fit, function, or process capabilities.
  • In some cases, designers assign tight tolerances to parts in an effort to improve quality.  This practice is misguided and builds unnecessary cost into the product.
  • In some cases, the designers assess how the parts fit together, what the parts have to do, and how the parts will be manufactured, and base the tolerances on these factors.

That last approach is the best approach.  Based on our observations of many organizations, though, it’s not what usually happens.

Cost Reduction Opportunities

The best point for reducing cost is during the design process.   A good approach is to include the manufacturing folks in the design process, assess the production approach as designs emerge, and identify processes and process capabilities for each part.  It’s the engineering organization’s responsibility to select dimensions and assign tolerances that will assure fit and function; it’s the manufacturing organization’s responsibility to raise a red flag where tight tolerances mandate expensive processes or a high likelihood of nonconformances.

If you didn’t do the above during the design process and you have tightly-toleranced parts in production, you can still reduce cost by targeting unnecessarily-tight tolerances.  Here’s a recommended approach:

  • Talk to your QA and manufacturing people.   They’ll be able to identify parts and dimensions that cause frequent rejections.   Where this situation exists, evaluate relaxing the tolerances.
  • Look for “use as is” dispositions on nonconforming parts (trust me on this…your manufacturing people will know where this is occurring).  If a “use as is” disposition is acceptable, it’s likely the tolerance on the nonconforming dimension can be relaxed.
  • Talk to your purchasing folks.   They can reach out to the supplier community and ask the same kinds of questions.   This is a particularly important area to explore, because in most manufacturing organizations approximately 70% of the cost of goods sold flows through the purchasing organization.  You may not know without asking how many parts your suppliers are rejecting; all you’ll see are the costs buried in what you have to pay for the parts.   The best way to ask the question is the most direct:   What are we doing that’s driving your costs?  The suppliers know, and they’re usually eager to answer the question.

All of the above is associated with cost reduction, but that’s not the only place where inappropriately-toleranced parts create problems.  In many cases, dimensioning and tolerancing practices can induce system-level failures.    That’s another fascinating area, and we’ll address it in a future blog entry.

Would you like to know more about cost reduction opportunities you can act on right now?  Consider our cost reduction training programs, or take a look at our most recent book, Cost Reduction and Optimization for Manufacturing and Industrial Companies!

Posted in Manufacturing Improvement on April 1, 2012 by manufacturingtraining

If you work in manufacturing, I know you have been inundated with cute titles for quality and productivity improvement programs for decades:

  • Zero defects (that one made a few guys in Winter Haven wealthy)
  • TQM (does anyone use that term any more?)
  • 6σ (we are fascinated by Greek letters and martial arts belts)
  • 5 Whys (hey, why not?)
  • 5S (in both English and Japanese, no less!)
  • Lean (perhaps picking up on our anti-obesity predilection?)

And many, many more. You get the idea.

Over the last three or four decades I’ve watched all of the above with some detachment and great amusement.  Much of what’s included in these programs is the same; the titles are simply new wrappings around old ideas.  But the old ideas still make sense.  Process improvement.  Scrap reduction. Clean workplaces.  Reduced setup times.  Straight-line manufacturing.  The list goes on.  My challenge to you is this: Find something in any of the above programs that didn’t originate in basic manufacturing/industrial management concepts…concepts that go all the way back to the Industrial Revolution and Frederick Taylor.  I’d be interested in hearing your comments.

The above notwithstanding, I’d like to weigh in with a program of my own.  I’ve thought about this a lot. It’s got to be simple.  It needs a Greek letter to lend an air of the esoteric and perhaps make it sound needlessly scientific (although I promise you, it won’t be either).  It needs to offer a catchy way to package Mr. Taylor’s key concepts.  It needs to be marketable.  And it needs to be focused on improving manufacturing, quality, and profitability.

Here we go:  7 Pi.

Yep. I originally started out with 6P, but then I realized I was leaving out an important P, and P didn’t sound as cool as Pi, or π.  π, as you know, is the Greek letter for P.

About now, as you’re reading this, you’re probably wondering what this is all about.  The focus here is delivery performance improvement, or getting and staying on schedule as a manufacturer.  If you’ve ever run a plant that was behind schedule, you know how tough life can be.  And if your plant is on schedule, you know that quality and profitability are going to be okay (trust me on this, I’ve seen it happen in the plants I’ve run and in the ones I’ve advised).  Staying on schedule is critical.  If you can do that, everything else falls into place.  And if you do everything you need to do to be on schedule, everything is in place.

So, here we go…the 7 Pi’s for delivery performance improvement:

  • People
  • Product
  • Process
  • Procurement
  • Productivity
  • Production Control
  • caPacity

I know, I fudged it a little on that last one, but that’s the only bit of artistic license I’ll take here.  Watch the ManufacturingTraining blog, folks, because we’re going to explore each of our 7π’s in the coming weeks!

KU Online Courses Scheduled for 2012-2013

Posted in Manufacturing Improvement on March 21, 2012 by manufacturingtraining

ManufacturingTraining and the University of Kansas have finalized the course schedule for our next series of six online Manufacturing Optimization courses:

  • Delivery Performance Improvement:  21 August 2012
  • Cost Estimation:  16 October 2012
  • Industrial Statistics:  8 January 2013
  • Quality Management:  5 March 2013
  • Root Cause Failure Analysis:  30 April 2013
  • Cost Reduction and Optimization:  25 June 2013

Each course is 3 weeks long, and the University of Kansas will grant Continuing Education credit.  We’ll meet for online lectures twice each week, with interactive assignments and discussion board activities following the lectures.  We’ll be posting more information here and on the ManufacturingTraining.com website in the near future, so stay tuned for details on this exciting new professional education opportunity!  In the meantime, if you want advance information on pre-enrolling, you can do so by shooting an email to info@ManufacturingTraining.com.

Hello There!

Posted in Manufacturing Improvement on March 2, 2012 by manufacturingtraining

Hello, everyone!  Joe Berk here.  I’m the principal consultant and trainer at ManufacturingTraining.   We’re firing up this blog to keep people posted on what’s going on in our world, to tell a few process improvement war stories, and to have fun!

So why a blog?

Well, I write another blog for one of my clients, the California Scooter Company…a great motorcycle manufacturer right here in the United States.   That blog has a huge following, and it’s helped the California Scooter Company enormously.   My thought is that it might be cool to develop a similar following here.

So who are we?

Well, if you’re like me, you probably haven’t had great experiences with consultants.   There are nearly as many consultant jokes as there are lawyer jokes, and where there’s humor, it’s usually based on truth.  To be blunt, when I managed manufacturing organizations I thought the consultants I had rammed down my throat by my bosses were hucksters.

I like to think my organization is different, and our clients tell me that’s the case.  There’s nothing “touchy-feely” about us.   We don’t have any consulting buzzwords, three-letter acronyms, Greek-lettered improvement programs, or martial arts belts, and there’s a reason for that:  I’ve managed large engineering and manufacturing organizations.    I know what it means to have to deliver conforming products on time at or below budgeted cost, and that is the theme underlying all that we do.   If you want to hold hands and sing songs, we’re not your guys.  If you want the hard facts and measurable improvement, I think you’ll want to talk to us.   But don’t do so right away.  Follow the blog for a little bit, and if you have an interest, give us a call.

So this is our first post on the ManufacturingTraining.com blog.  It’s new, we are going to make it interesting, and it’s focused on manufacturing.   Like I said above, there’s no snake oil here.   It’s the real deal.  I’m a writer and a photo nut, and I’ll do my best to keep interesting stuff posted here for you.   We’ll have photos, videos, graphics, and sometimes just words.   Want to see a sample?   Take a look at what it took to make a motorcycle for Melanie Troxel, the In-N-Out Burger Corporation’s funny car driver…

That was fun.   Sometimes we get to do things like that for some of our clients.

Let me throw a question at you that’s more directly related to process improvement…do you know what this photo shows?   I’ll give you a hint…this product had a rejection rate of about 50%, each assembly cost about $50K, and the company lived with the problem for 10 years before they applied what we offer!   Let me know if you can guess what this is.  Shoot your comments in, and in another day or two I’ll share with you what it’s all about!

If you’d like to follow the ManufacturingTraining blog, just click on the subscription link up top (it will let you know when new things are posted here).   You’ll have to enter your email address, but this is the only place it will be used and I promise you’ll never be spammed by us!

Stay tuned…there’s lots more coming up!