

The Past, Present and Future of Engineering Simulation

In this special guest feature from Scientific Computing World, Bill Clark, executive vice president of CD-adapco, considers the successes of computer-aided engineering through the “three ages of CFD.”

Bill Clark, EVP at CD-adapco

Computational Fluid Dynamics (CFD) is about solving difficult engineering problems, using expensive software, enormous computing resources, and highly trained engineers. If the problems weren’t difficult, then it is doubtful that anyone would devote so much time and money to solving them. From the perspective of a modern engineer, it would be easy to assume that this desire to apply simulation technology to complex problems is a recent concern; that only today are we able to contemplate solving tough industrial problems, armed with a complex array of multi-physics simulation tools.

This is a misconception. Twenty or so years ago, commercial CFD was born from a desire to solve problems involving turbulence, heat transfer, and combustion, based on the vision of a small group of pioneering researchers who were able to see beyond the meagre computing resources available at the time, and to develop the techniques and methods that would ultimately revolutionize engineering.

In those early days, CFD meshes took weeks, or even months, to construct, usually through a process of ‘hand-meshing’ in which an engineer (usually PhD-qualified) painstakingly built up the mesh vertex by vertex. Although ‘automatic meshing’ technology was starting to become available in the early 90s, it was far from reliable, particularly when it came to defining the layers of prismatic cells required to accurately capture boundary layers. Another issue with the so-called ‘automatic meshing’ technology of the day was that it tended to generate more cells than the meagre computing resources of the time could handle. In 1994, I can remember submitting a Star-CD simulation of 750,000 cells for the first time, fully expecting smoke to start flowing from the large Unix box that sat under my desk.

The timescales required meant that analyzing multiple design variations was impractical. This was the ‘first age’ of CFD: getting a simulation result at all was difficult, and CFD was usually deployed at the end of the design process, as a final verification or for troubleshooting when everything else had failed.

The arrival of cheap Linux computers reduced parallel licensing costs and, together with continually improving simulation technology, opened up the ‘second age of CFD’, in which CFD engineers could reliably provide simulation results within reasonable timescales. Consequently, engineering simulation began to establish itself as a core part of the design process, occurring earlier and earlier and providing a constant stream of simulation data that could be used to drive design decisions. Increasingly, simulation began to displace experimentation as a way of verifying designs. The problems that we could solve expanded beyond the core CFD disciplines of fluid mechanics and heat transfer, as we began to consider problems that involved ‘fluid-structure interaction’, multiphase flow, and chemical reaction. With a little engineering ingenuity, there were very few problems that engineering simulation couldn’t offer some insight into.

Which brings us to today, and the dawn of the ‘third age of CFD’, where the lines between CFD and structural mechanics are becoming so blurred that it makes little sense to call it ‘CFD’ at all. An uncomfortable truth about modern engineering is that there really are no easy problems left to solve. In order to meet the demands of industry, it’s no longer good enough to do ‘a bit of CFD’ or ‘some stress analysis’. Complex industrial problems require solutions that span a multitude of physical phenomena, which often can only be solved using simulation techniques that cross several engineering disciplines. What our customers are really asking for is the ability to ‘see the big picture’: simulating whole systems rather than just individual components, taking account of all of the factors that are likely to influence the performance of their product over its operational life.

In short, to simulate the performance of their design in the context that it will actually be used.

Whereas previous generations of engineers could take some comfort in the ‘safety net’ of extensive physical testing to rescue them from the occasional poor prediction, CAE is increasingly the victim of its own success as simulation continues to displace hardware testing as industry’s verification method of choice. Although this increased confidence in simulation is well-deserved (and has been hard-earned through many years of successful prediction), it brings with it a great deal of pressure to ‘get the answer right’ every time.

An important part of this third age is ‘automated design exploration’, in which the simulation results automatically drive design improvements, with minimal input from the engineer (other than defining the initial problem and design constraints). With this approach, CFD is used to compile databases of simulation results that explore the complete range of usage scenarios, or it is tied to optimization technology (such as our HEEDS software) to determine the best solution to a given problem automatically.
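The shape of such a workflow can be sketched in a few lines. This is a toy illustration, not HEEDS or any real solver: the `simulate` function, its heat-sink parameters, and the analytic objective are all stand-ins I have invented for a genuine CFD run, and a real study would launch a solver and post-process its results at that point.

```python
import itertools

def simulate(fin_count, fin_height_mm):
    """Stand-in for a CFD run: a made-up analytic model of heat-sink
    thermal resistance (lower is better). In a real design-exploration
    loop this would submit a solver job and extract the result."""
    # More/taller fins improve cooling, but eventually crowd the flow path.
    cooling = fin_count * fin_height_mm
    blockage = (fin_count * fin_height_mm) ** 2 / 4000.0
    return 100.0 / cooling + blockage

# Candidate values for each design parameter (hypothetical).
fin_counts = [5, 10, 15, 20]
fin_heights = [10.0, 20.0, 30.0]

# Exhaustive sweep compiles the database of simulation results...
results = {
    (n, h): simulate(n, h)
    for n, h in itertools.product(fin_counts, fin_heights)
}

# ...from which the best design is selected automatically,
# with no engineer in the loop beyond defining the design space.
best_design = min(results, key=results.get)
print(best_design, results[best_design])
```

A full grid sweep like this scales poorly as parameters multiply, which is why commercial tools replace the exhaustive loop with search strategies (genetic algorithms, surrogate models, and the like) that sample the design space far more economically.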

Such is the magnitude of change in the past two decades of simulation technology that it would be foolish to speculate about what might be happening 20 years from now. Whatever those changes are, I hope that SCW will still be around to report them.

This story appears here as part of a cross-publishing agreement with Scientific Computing World.
