Engineering Modeling

Tolerance Analysis using Worst Case Approach, continued (Part 3 / 13)

Eric Torkia, MASc


In my last couple of posts, I provided an introduction to the topic of Tolerance Analysis and explained its importance as upfront homework before making physical products. I also demonstrated the WCA method for calculating extreme gap values. Implicit in the underlying calculations was a transfer function (a mathematical relationship) between the system inputs and the output, that is, between the independent variables and the dependent variable. To describe the other two methods of allocating tolerances, we first need to define and understand the underlying transfer functions.

For the stacked block scenario (as in Figure 3-1), the system output of interest is the gap width between the right end of the stacked blocks and the right edge of the U-shaped cavity. The inputs are the individual widths of each block and the overall cavity width. Simple addition and subtraction of the inputs results in the calculated output. Such is the case with all one-dimensional stack equations as can be seen with this transfer function:
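In symbols (using variable names of my own, since the original equation image did not survive; take W_CAV as the cavity width and W_1 through W_n as the individual block widths), the one-dimensional stack transfer function takes the form:

```latex
\mathrm{Gap} = W_{CAV} - \sum_{i=1}^{n} W_i
```

Adding the block widths and subtracting them from the cavity width is exactly the "simple addition and subtraction" described above.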

Do all dimensional transfer functions look like this? It depends. Certainly, the one-dimensional stacks do. But let us also see how this might work with dimensional calculations of a two-dimensional nature.

Consider the case of the overrunning or freewheel one-way clutch, a mechanism that allows a shaft or rotating component to rotate freely in one direction but not the other. Fishing reels use this mechanism to allow fish line to spool out freely in one direction but also allow the fish to be reeled in when rotating in the opposite direction. A typical cross-section of the one-way clutch is depicted in Figure 3-2. Primary components of the system are the hub and outer race (cylinder), connected by a common axis. They rotate freely with respect to each other, as wheels revolve around an axle. They also have internal contact via four cylindrical bearings that roll against both the outer surface of the hub and the inner surface of the race. The bearing contacts are kept in place with a series of springs that push the bearings into both the hub and race (springs are also in contact with hub). The end result is that the race rotates freely in one direction (counter-clockwise) but not the other (clockwise); thus the name.

The two system outputs of interest are the angle at which the bearings contact the outer race and the gap where the springs reside. Why are these two outputs important? To reduce shocks and jerks when contact is made during clockwise rotation, the design engineers must know where the rotation will stop and what potential variation exists around that stopping point. For the springs to have some nominal compression and maintain bearing position, a desired gap with some allowable variation needs to be defined. Neither of the two system outputs can be defined by a simple one-dimensional stack equation, because common input variables affect both outputs simultaneously. The approach to defining the transfer functions, however, is the same. It is done by following a "closed loop" of physical locations (such as contact points) through the system and back to the original point, as a chain of "stacked" vectors. Those vectors are broken down into their Cartesian components, which are the equivalent of two one-dimensional stacks. A more notable difference between the one-way clutch transfer functions and those of the gap stack is their nonlinearity. Nonlinearities introduce their own unique influence into a transfer function. (But I digress.)
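As a sketch of the closed-loop idea (generic symbols of my own, not the actual clutch dimensions): a loop of m vectors with lengths L_i and angles θ_i that returns to its starting point must satisfy

```latex
\sum_{i=1}^{m} L_i \cos\theta_i = 0, \qquad \sum_{i=1}^{m} L_i \sin\theta_i = 0
```

These two component equations are the pair of one-dimensional stacks mentioned above; solving them for the clutch geometry is what produces the nonlinear stop-angle and spring-gap transfer functions.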

After some mathematical wrangling of the equations, the transfer functions for stop angle and spring gap are found to be:

[The two transfer-function equations appear as images in the original post.]

Let us apply the WCA approach and determine the extreme range of the outputs. (Figure 3-4 displays nominal and tolerance values for the input dimensions in our transfer functions.) After pondering the mechanical realities (or perhaps tinkering with high/low input values in the transfer functions), it can be seen that when the bearing diameters (dB1, dB2), the hub height (hHUB) and the race inner diameter (DCAGE) are at their LMC limits, the contact angle (θ) is at its extreme maximum value. Conversely, when those same component dimensions are at their MMC limits, the contact angle is at its extreme minimum value. The same exercise on the spring gap brings similar results. Based on the information we know, the WCA approach results in these predicted extreme values for the two outputs:

 

OUTPUT             WCA Minimum Value    WCA Maximum Value
Stop Angle (˚)     27.380               28.371
Spring Gap (mm)    6.631                7.312
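For transfer functions that are monotonic in each input over its tolerance range (as argued above for the clutch equations), the WCA extremes can be found by evaluating the function at every low/high corner of the inputs. Here is a minimal sketch of that idea; the gap function and all numbers are illustrative stand-ins of my own, not the values from Figure 3-4:

```python
from itertools import product

def wca_extremes(transfer_fn, nominals, tolerances):
    """Evaluate the transfer function at all 2**n low/high input
    corners and return the worst-case (min, max) of the output."""
    corners = product(*[(n - t, n + t) for n, t in zip(nominals, tolerances)])
    values = [transfer_fn(*c) for c in corners]
    return min(values), max(values)

# Hypothetical one-dimensional gap stack: cavity width minus three block widths.
gap = lambda cavity, b1, b2, b3: cavity - (b1 + b2 + b3)

lo, hi = wca_extremes(gap,
                      nominals=[30.0, 9.0, 10.0, 10.0],
                      tolerances=[0.1, 0.05, 0.05, 0.05])
print(lo, hi)  # approximately 0.75 and 1.25
```

Note that for a transfer function that is not monotonic in some input, the extreme can fall in the interior of a tolerance range rather than at a corner, so corner enumeration should be treated as a first-pass check.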

What is the allowable variation of the stop angle and spring gap? And do these minimum and maximum values fit within the customer-driven requirements for allowable variation?

For design engineers, these requirements come on holy grails that glow in the dark; Specification Limits are inscribed on their marbled surfaces (cue background thunder). They are much like the vertical metal posts of a hockey goal: any shot outside the posts does not score, and any output outside the limits does not satisfy the customer. These values will now be referred to as the Upper Specification Limit (USL) and the Lower Specification Limit (LSL). (Some outputs require only a USL or only an LSL; these two outputs require both.) This table provides the specification limit values:

OUTPUT             LSL       USL
Stop Angle (˚)     27.50     28.50
Spring Gap (mm)    6.50      7.50

Comparing the WCA outcomes against the LSL/USL definitions, it appears we are in trouble with the stop angle: its extreme minimum value falls below the LSL. What can be done? The power of the transfer functions is that they allow the design engineer to play "what-ifs" with input values and ensure the extreme WCA values fall within the LSL/USL limits. If done sufficiently early in the design phase (before the design "freezes"), the engineer has the choice of tinkering with either the nominal values or their tolerances. Perhaps purchasing decisions to use off-the-shelf parts have locked in the nominal values, but there is still leeway in changing the tolerances; in that case, the tinkering is done only on the input tolerances. The opportunities for tinkering get fewer and fewer as product launch approaches, so strike while the iron is hot.
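One way to sketch that what-if loop in code, under a simplifying assumption of my own (all tolerances are tightened uniformly until the WCA range fits inside [LSL, USL]); the gap function and numbers are illustrative, not from the post:

```python
from itertools import product

def wca_range(fn, noms, tols):
    """Worst-case (min, max) of fn over all low/high input corners."""
    corners = product(*[(n - t, n + t) for n, t in zip(noms, tols)])
    vals = [fn(*c) for c in corners]
    return min(vals), max(vals)

def tighten_until_capable(fn, noms, tols, lsl, usl, shrink=0.9, max_iter=50):
    """Uniformly shrink every tolerance by `shrink` per pass until the
    WCA range fits within [lsl, usl]; return None if it never fits."""
    tols = list(tols)
    for _ in range(max_iter):
        lo, hi = wca_range(fn, noms, tols)
        if lo >= lsl and hi <= usl:
            return tols
        tols = [t * shrink for t in tols]
    return None  # the nominals themselves violate the specs

# Hypothetical gap stack with specs LSL = 0.80, USL = 1.20.
gap = lambda cavity, b1, b2, b3: cavity - (b1 + b2 + b3)
new_tols = tighten_until_capable(gap, [30.0, 9.0, 10.0, 10.0],
                                 [0.1, 0.05, 0.05, 0.05],
                                 lsl=0.80, usl=1.20)
```

In practice the engineer would tighten only the cheapest-to-control tolerances rather than all of them uniformly; the uniform shrink is simply the smallest what-if loop that illustrates the idea.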

How does this approach compare to the Root Sum Squares (RSS)? Before we explain RSS, it would be helpful to understand the basics of probability distributions, the properties of the normal distribution, and the nature of transfer functions and response surfaces (both linear and non-linear). So forgive me if I go off on a tangent into my next two posts. I promise I will come back to RSS after some brief digressions.

Creveling, Clyde M. Tolerance Design: A Handbook for Developing Optimal Specifications. Addison Wesley Longman, 1997, pp. 111–117.

