By Dr. Barna Szabó
Engineering Software Research and Development, Inc.
St. Louis, Missouri USA
Some years ago, I attended a meeting of senior engineers at a Fortune 100 company. The topic of discussion was: What should be done about the significant discrepancies between outcomes predicted by finite element models and those observed in physical tests? The company was committed to developing a new product with key components made of fiber-reinforced composite materials. The project was already years behind schedule and billions of dollars over budget. Clearly, it was far too late in the production cycle to address questions about the reliability of the design rules. Understandably, the atmosphere of the meeting was rather gloomy.
The chief engineer, who had called the meeting, vented his frustration, declaring that he was tired of hearing that the finite element mesh was the problem. He no longer believed the predictions from finite element modeling and had lost confidence in the engineers who produced them.
In the ensuing discussion, possible sources of the discrepancies were examined. It soon became apparent that the senior engineering staff, whose academic credentials were impeccable albeit somewhat dated, had not kept up with developments in finite element analysis. For them, finite element modeling with a particular software tool was the answer to every numerical simulation problem; they did not realize that while finite element modeling can be useful for solving structural problems, it is ill-suited to the strength problems that arise in the formulation and validation of design rules.
It was not easy to find the right words to explain to the senior engineers, with their boss present, that their entire approach was ill-advised. The task before them was not to construct finite element models but to formulate, calibrate, and rank predictors of failure for composite materials. The development of design rules for new materials and material systems is essentially a model development project and should be conducted accordingly [1].
The company’s experts were visibly upset by the idea that finite element modeling, which they had learned years before and practiced ever since, was not the right approach for the problem at hand. From their questions and comments, it became clear that they did not know that model form errors and approximation errors must be treated separately, and that they did not understand the notion of an exact solution; hence, they did not know what the error of approximation means. One of the experts ventured to say that, in his view, “the exact solution was the outcome of a physical experiment.” This statement makes sense in the context of finite element modeling (FEM) but makes no sense in finite element analysis (FEA), the goal of which is to approximate the exact solution of a well-posed mathematical problem. (Well-posedness means that a solution exists, is unique, and is stable: small changes in the input result in small changes in the quantities of interest.)
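To make the distinction concrete, consider solution verification: estimating how far a finite element approximation is from the exact solution of the mathematical problem, with no reference whatsoever to physical tests. The sketch below is a minimal illustration in Python; the model problem (a one-dimensional problem with a known exact solution) and every name in it are assumptions chosen for this example, not anything discussed at the meeting.

```python
import numpy as np

# Minimal solution-verification sketch (assumed model problem):
# solve -u'' = pi^2 sin(pi x) on (0,1) with u(0) = u(1) = 0, whose exact
# solution is u(x) = sin(pi x), and watch the approximation error shrink
# under uniform mesh refinement.

def solve_fem(n):
    """Linear finite elements on n uniform elements; lumped load vector."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    # Tridiagonal stiffness matrix for the n-1 interior nodes.
    K = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h
    # Lumped load: f(x_i) * h with f(x) = pi^2 sin(pi x).
    F = np.pi ** 2 * np.sin(np.pi * x[1:-1]) * h
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(K, F)
    return x, u

exact = lambda x: np.sin(np.pi * x)  # known exact solution of the model problem

errors = []
for n in (4, 8, 16, 32, 64):
    x, u = solve_fem(n)
    err = np.max(np.abs(u - exact(x)))  # maximum nodal error
    errors.append(err)
    print(f"n = {n:3d}   max nodal error = {err:.3e}")

# The error behaves like C * h^p; the observed rate log2(err(h)/err(h/2))
# approaches p = 2 for this discretization, which confirms convergence to
# the exact solution. That is the essence of solution verification.
rates = [np.log2(errors[i] / errors[i + 1]) for i in range(len(errors) - 1)]
print("observed convergence rates:", ", ".join(f"{r:.2f}" for r in rates))
```

The point is that the error of approximation is measured against the exact solution of the mathematical problem. Comparison with physical experiments is a separate activity, namely validation, which assesses model form errors.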
At this point, the discussion began to resemble a Numerical Simulation 101 class, and the chief engineer, visibly frustrated with what he had heard, ended the meeting. A few weeks later, he announced his retirement.
What Went Wrong?
The root cause of the problem was management’s failure to exercise simulation governance. In other words, management failed to exercise proper command and control over the critically important task of developing design rules for a composite material system. What they should have done was formulate technical requirements based on the concepts of verification, validation, and uncertainty quantification (VVUQ). Formulating and documenting the technical requirements is the first and most important step in any model development project.
Instead, management decided to mandate the use of a specific finite element modeling tool across the corporation. In doing so, they dictated how problems should be approached without addressing whether the mandated modeling tool had the requisite technical capabilities. Had they produced a technical requirements document, they would have realized early on that the technical capabilities of legacy finite element modeling tools were inadequate for the task at hand.
The motivation to avoid an unnecessary proliferation of software tools in an organization is understandable. However, the primary consideration should have been meeting the technical requirements of ongoing projects. Management set the engineering team up for failure by prescribing a software tool instead of setting the goals and stating the technical requirements.
Contributing Factor: Lack of Continuing Professional Development
The notion that the exact solution is the outcome of a physical experiment reflects pre-1970s thinking in finite element modeling. The senior engineers were unaware that since then, finite element analysis has become a scientific discipline, a bona fide branch of applied mathematics.
I believe the reason senior engineers were unaware of the significant increase in the knowledge base of numerical simulation lies in this observation: In large corporations, mastering organizational politics—building alliances, managing perceptions, and wielding influence—often outweighs professional competence in driving career advancement. Employees skilled in navigating organizational politics tend to rise faster than their technically proficient peers [2].
The organization’s decision-makers did not have the expertise to formulate the technical requirements for the model development project at hand. Unfortunately, this is not the exception; it is the rule across the numerical simulation landscape.
How to Avoid Problems Like This
Decision-makers need to understand that model development is, essentially, a scientific research project. It has (a) an established conceptual framework, such as continuum mechanics, (b) a set of competing hypotheses, such as candidate predictors of failure, (c) problem-solving machinery, and (d) records of experimental data. For example, twelve different predictors were investigated in Part II of the World Wide Failure Exercise [3]. A properly constructed model development project provides an objective framework for ranking candidate models based on their predictive performance.
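To illustrate what ranking by predictive performance can mean in practice, here is a minimal sketch that scores candidate failure predictors against recorded experimental data using a root-mean-square error metric. The predictor functions and data points are invented placeholders, not results from the World Wide Failure Exercise; a real exercise would use physically motivated criteria, validation metrics, and uncertainty quantification.

```python
import numpy as np

# Hypothetical experimental records: test condition (a stress ratio) versus
# observed normalized failure load. Placeholder values, not WWFE data.
stress_ratio = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
observed = np.array([1.00, 0.91, 0.78, 0.62, 0.45])

# Candidate predictors of failure: each maps the test condition to a
# predicted failure load. These functional forms stand in for real failure
# criteria purely for illustration.
candidates = {
    "predictor_A": lambda r: 1.0 - 0.5 * r,
    "predictor_B": lambda r: 1.0 - 0.55 * r ** 1.2,
    "predictor_C": lambda r: 0.9 * np.cos(0.5 * np.pi * r) + 0.1,
}

def rms_error(predict):
    """Root-mean-square discrepancy between prediction and experiment."""
    return float(np.sqrt(np.mean((predict(stress_ratio) - observed) ** 2)))

# Rank the candidates: the smaller the error, the better the predictive
# performance on this dataset.
for err, name in sorted((rms_error(f), name) for name, f in candidates.items()):
    print(f"{name}: RMS error = {err:.4f}")
```

In an actual model development project, the scoring would account for experimental uncertainty and would be applied to data withheld from calibration, but the principle is the same: candidate models are ranked objectively by how well they predict the recorded outcomes.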
To advance the development of design rules for composite materials, stakeholders need to initiate a model development project, as outlined in reference [1]. This approach will provide a science-based framework for innovation. Without such a coordinated effort, organizations have no choice but to rely on the inefficient and costly method of make-and-break engineering, hindering overall progress and leading to more unhappy results [4].
Costs
The consequences of management’s failure to exercise simulation governance proved extremely costly and embarrassing. The project took several years longer to complete than anticipated, with cost overruns reaching billions of dollars. Other costs are harder to quantify: maintenance expenses over the product’s entire lifecycle, operating costs due to large margins of safety to account for uncertainties, and opportunity losses from launching the product much later than originally planned.
Euclid delivers unsettling news to King Ptolemy I: “There is no royal road to geometry.” Image created by Microsoft Copilot.
There Is No Royal Road
According to legend, when Ptolemy I, the king of Egypt (r. c. 305–282 BC), asked Euclid if there was an easier way to learn geometry, Euclid replied: “There is no royal road to geometry.” In other words, there are no shortcuts to learning and understanding geometry; it requires effort and dedication.
The same is true for numerical simulation projects. Despite enticing marketing claims, such as “unparalleled ability to visualize, build, edit, and interact with complex simulations and digital twins in a shared turnkey environment,” management should never consider outsourcing the responsibilities associated with the exercise of simulation governance. Underestimating the complexity of difficult questions and seeking simple solutions will likely produce unhappy results.
References
[1] Szabó, B. and Actis, R. The demarcation problem in the applied sciences. Computers and Mathematics with Applications, Vol. 162, pp. 206–214, 2024.
[2] Pfeffer, J. Power: Why Some People Have It and Others Don’t. Fletcher & Company, LLC, 2010.
[3] Kaddour, A. S. and Hinton, M. J. Maturity of 3D failure criteria for fibre-reinforced composites: Comparison between theories and experiments: Part B of WWFE-II. Journal of Composite Materials, Vol. 47, pp. 925–966, 2013.
[4] Szabó, B. and Actis, R. Planning for simulation governance and management: Ensuring simulation is an asset, not a liability. Benchmark, a NAFEMS publication, July 2021.