One of the uniquely powerful aspects of Simcenter 3D is the strong geometry creation, editing, and clean-up foundation on which it is based. The approach to Finite Element Analysis (FEA) adopted by Simcenter 3D draws on the strengths of the CAD (Computer Aided Design) modeling approach and is thus rather unique. This article will provide an overview of the different files and the modeling approach used by Simcenter 3D for FEA.
The Part File (.prt)
A Simcenter 3D structural analysis typically starts with a Part (.prt) file. The Part file is simply CAD geometry that can be:
Sourced from NX
Created within Simcenter
Imported from another modeling package (Parasolid, STEP, Solid Edge etc.)
The Part file is typically used for design and manufacturing purposes. A typical Part file is illustrated in Figure 1, showing numerous small features in the form of blended faces.
Figure 1: A typical Part file
The Idealised Part File (.prt)
The Idealised Part is an optional interim CAD model based directly on the Part file. It is intended specifically for simulation purposes and enables geometry modifications to be made without affecting the Part file. Updates performed on the part will be propagated to the Idealised Part, but any modifications conducted on the Idealised Part will not change the Part file. This one-way flow of information allows the FEA model to always be based on the latest Part file while neatly separating the FEA specific geometry from that used for design and manufacturing. Figure 2 shows the result of the idealisation of the part shown in Figure 1. The blended faces have been removed and the body split up to facilitate meshing.
Note that the Idealised Part is not a necessary step in the analysis process. If the analyst prefers, all modifications can be performed directly on the Part.
Figure 2: An Idealised Part file
The FEM File (.fem)
The FEM file stores all finite element data (nodes, elements, connections, material properties, element properties) for a specific entity. By default, the finite element data is associated with the geometry of the Idealised Part or Part used to construct the mesh. A single FEM file can contain all the finite element data for an analysis, but this approach is inefficient for large models because it forgoes important Simcenter 3D functionality. It is usually beneficial to break the model down into logical, separate entities, each with its own FEM file. Figure 3 shows the mesh constructed from the idealised geometry of Figure 2.
Figure 3: A FEM file
The Assembly FEM File (.afm)
In the same way that a typical CAD assembly forms a logical grouping of appropriately positioned geometrical parts, an Assembly FEM does this using FEM parts. Several types of connections between the various FEM parts can be defined within the Assembly FEM file. An example of an Assembly FEM is provided in Figure 4.
The Assembly FEM organisation technique can greatly reduce model setup effort by ensuring that subassemblies used repeatedly in a model only need to be meshed and connected/assembled once. It also serves as a valuable method for breaking very large models down into smaller, more manageable portions. It is an optional modeling technique that can be used at the analyst’s discretion.
Figure 4: Assembly FEM example
The SIM File (.sim)
The SIM file contains information on the loads, boundary conditions, and solver/solution settings for a specific model. It is the top-level item in the Simulation Navigator and references all FEM and Assembly FEM files used to construct the complete finite element model. The SIM file can be thought of as the means of drawing the various Simcenter 3D analysis files together into a form that is ready for solving.
The Finite Element Analysis approach used by Simcenter 3D uses a strong coupling between the CAD geometry and the mesh. The modeling methodology adopted by the analyst should take this into account, making use of Idealised Parts, FEM files, and Assembly FEM files, to benefit from Simcenter 3D’s unique strengths. Some practice will be required to acquire these skills. To facilitate this process, several tutorials have been set up to guide the interested analyst.
An analyst faced with the task of performing a dynamic finite element analysis has numerous choices to make. For someone that is starting out in the field, the range of options available can be daunting. In this blog article, some of the foundational concepts related to linear dynamic analysis will be discussed, with the aim of providing a high-level overview which will make the task of performing an analysis appreciably less daunting.
Linear Dynamic Analysis Types
There are three main types of linear dynamic analysis:
Normal Modes Analysis
A normal modes or eigenvalue analysis involves the calculation of eigenvalues (natural frequencies) and corresponding eigenvectors (mode shapes) for a structure. These analyses are relatively simple to perform and provide helpful insights into the characteristics of a structure from both design and analysis perspectives. The eigenvalues and eigenvectors are solely dependent on the structure’s characteristics (stiffness and mass matrices) as no loads are considered when executing this analysis type. A modal analysis is often the first step in the process when performing a linear dynamic analysis and is the foundation for a range of further analyses. The interested reader is referred to the following blog article for a detailed discussion on modal analysis – Modal Analysis: What it is and is not.
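A normal modes analysis boils down to the generalized eigenvalue problem K·φ = ω²·M·φ. As a minimal sketch of the idea (a hypothetical two-degree-of-freedom spring-mass chain with made-up numbers, not a Simcenter model), it can be reproduced in a few lines:

```python
import numpy as np

# Hypothetical 2-DOF spring-mass chain (illustration only, not a Simcenter model).
m1, m2 = 2.0, 1.0          # masses [kg]
k1, k2 = 1000.0, 1000.0    # spring stiffnesses [N/m]

K = np.array([[k1 + k2, -k2],
              [-k2,      k2]])   # stiffness matrix
M = np.diag([m1, m2])            # mass matrix

# Generalized eigenproblem K.phi = w^2 M.phi, solved via the standard form M^-1 K.
# Note that no loads appear anywhere: the result depends only on stiffness and mass.
eigvals, eigvecs = np.linalg.eig(np.linalg.inv(M) @ K)
order = np.argsort(eigvals)
omega = np.sqrt(eigvals[order])      # natural frequencies [rad/s]
freqs_hz = omega / (2 * np.pi)       # natural frequencies [Hz]
modes = eigvecs[:, order]            # corresponding mode shapes (columns)
print(freqs_hz)
```

The same structure scales to the thousands of degrees of freedom in a real FE model; the solver simply uses far more efficient eigenvalue algorithms.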
Transient Response Analysis
Transient response analysis is used when loads are defined as arbitrary functions of time, making this approach generally applicable to linear dynamic structural analysis.
Frequency Response Analysis
Frequency response analysis is intended for structures subjected to steady sinusoidal excitation, with the excitation loads defined in the frequency domain. Rotating machinery fits into this class, so the technique is applicable in a fairly wide range of scenarios. The adjacent image depicts a sinusoidal time-domain signal at the top, with its frequency-domain representation, obtained via an FFT (Fast Fourier Transform), depicted below it.
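The time-to-frequency-domain mapping is easy to reproduce: a pure sinusoid in time collapses to a single spectral peak. A small sketch with an arbitrary 50 Hz signal:

```python
import numpy as np

# Illustration only: a 50 Hz sinusoid sampled at 1 kHz for 1 second.
fs = 1000.0                          # sampling rate [Hz]
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 50.0 * t)

# The FFT maps the time-domain sine to a single spectral peak at 50 Hz.
spectrum = 2.0 * np.abs(np.fft.rfft(signal)) / len(signal)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
peak_freq = freqs[np.argmax(spectrum)]
print(peak_freq)   # 50.0
```

This is exactly the representation in which frequency response loads are defined: an amplitude (and phase) per excitation frequency.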
There are two approaches that can be used when solving transient and frequency response simulations:
The direct method employs numerical integration procedures directly on the full set of coupled equations of motion to calculate the structural response.
The modal method uses the structure’s mode shapes to decouple the equations of motion, resulting in a set of single degree of freedom systems that can be solved much more efficiently. Additionally, it is standard practice to exclude some modes from the analysis set, thereby reducing the size of the problem.
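The decoupling itself is a linear-algebra identity: projecting the mass and stiffness matrices onto mass-normalised mode shapes diagonalises both, leaving independent single-DOF oscillators. A small numerical check on an assumed 3-DOF system (arbitrary numbers for illustration):

```python
import numpy as np

# Assumed 3-DOF system (random but reproducible values, not from any real model).
rng = np.random.default_rng(0)
A = rng.random((3, 3))
K = A @ A.T + 3.0 * np.eye(3)    # symmetric positive-definite stiffness matrix
M = np.diag([2.0, 1.0, 1.5])     # lumped (diagonal) mass matrix

# Mass-normalised mode shapes from the generalized eigenproblem K.phi = w^2 M.phi.
Minv_sqrt = np.diag(1.0 / np.sqrt(np.diag(M)))
lam, V = np.linalg.eigh(Minv_sqrt @ K @ Minv_sqrt)
Phi = Minv_sqrt @ V              # mode shape matrix (columns are modes)

# Projecting onto the modes diagonalises both matrices, i.e. the coupled
# equations of motion become independent single-DOF oscillators.
M_modal = Phi.T @ M @ Phi        # -> identity matrix
K_modal = Phi.T @ K @ Phi        # -> diag(omega_i^2)
print(np.round(K_modal, 6))
```

Truncating the mode set simply means keeping fewer columns of Phi, which is where the efficiency (and the approximation) of the modal method comes from.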
The choice of whether to use the direct or modal approach is highly dependent on the specific model in question. Below are a few guidelines that can be used to inform the decision-making process:
The direct approach has a higher level of accuracy because the entire system is analysed whereas the modal approach uses a truncated set of modes.
Transient analyses that require many time steps tend to be better suited to a solution using the modal approach.
Structures that undergo high-frequency excitation are likely better suited to the direct method because the modal method will require many modes to accurately approximate the response.
Small models are usually better solved using the direct method. When the model is large, modal methods hold distinct advantages so long as the time required to compute the eigenvalues and eigenvectors does not become excessive.
This information should make the first steps in performing dynamic analyses simpler. For practical assistance in setting up these types of analyses in Simcenter, please see the Simcenter tutorials page.
At the start of Part I of this series, the question “What do you do when faced with analysing a shell and tube heat exchanger as in the model shown in Figure 1?” was raised. The answer is simple: you use FloEFD. Part I and Part II focused on the capability of FloEFD to provide accurate engineering results for heat transfer in internal as well as external flow applications. Both of those cases were considered in isolation, however; in this discussion, the findings from those investigations are combined and finally applied to the full heat exchanger example.
Figure 1: Shell and tube heat exchanger.
Part III: Full heat exchanger
The reason for all of this was to establish just how coarse a mesh one could dare to use when analysing large or complex heat exchangers like this one. You might find yourself in the same position as many engineers in South Africa, usually required to make do with limited computer resources. It would therefore be very beneficial if you could use CFD software that doubles up as an engineering tool to solve large problems on your standard-issue laptop or desktop computer. And this is exactly where FloEFD starts to make a lot of sense.
In order to analyse the heat exchanger in question, the only limiting factor would be the computer memory (RAM), as the memory effectively limits the size of the model in terms of the number of cells that can be used. Due to the sheer length of piping, care needed to be taken to obtain a mesh that could fit into the 32GB memory limit. Therefore, based on the knowledge gained, the ‘four cells per diameter’ value was used as a gauge to generate a reasonable mesh that could still provide a high level of confidence in the ‘engineering’ answer. Setting up the mesh was as simple as inserting a few control planes in the base mesh settings to “box” the tube bank and specifying the number of cells between the sets of planes. Thereafter, based on the base mesh, local mesh refinement at level 3 was applied to the tube bank part/component to ensure all of the tubes met the characteristic ‘four cells per diameter’ requirement. To stay within the 32GB memory limit, some stretching of the cells still had to be applied to save a little on the memory requirements. Stretching the cells away from the bends resulted in a mesh of approximately 5.7 million cells in total. The mesh setup and generation only took a few minutes and FloEFD had no problems generating the mesh; almost like magic, it just happens. For this mesh size, the memory usage peaked at 29GB from time to time during solving. A portion of the resulting mesh is shown in Figure 2.
Figure 2: Mesh resolution around the tubes
Onto the question of solving the full heat exchanger. First of all, it must be stated that the solution was very stable and convergence simply happened. Quite astonishing really, considering the type of problem. Regarding the required calculation time, this particular model solved in a very respectable 15 hours on a mere quad-core CPU, with the respective outlet temperatures already converged after 1.5 travels (flow freezing enabled). The resulting outlet temperatures were Tair,out = 51.6°C and Twater,out = 24.6°C. See the tube internal and shell-side temperatures in Figure 3 and Figure 4 respectively. Based on the merits of the previous discussions, this result would already be very useful to base decisions on, especially when doing comparative studies of various baffle plate designs, for example.
Figure 3: Tube-side – Water temperature.
Figure 4: Shell-side – Air temperature.
I have long since realized the value of FloEFD when it comes to solving heat transfer problems. However, it has only now become evident that FloEFD makes it possible for engineers to solve large problems like shell and tube heat exchangers with the minimum of effort and resources compared to ‘old school’ CFD programs, thanks to the underlying SmartCells™ technology and the ever-so-fantastic thin boundary layer model. The only real demand placed on computer resources is the memory, which limits the mesh size of these models. It is simply astonishing how easy it is and how little effort is required to set up such a model, including the meshing. It goes without saying that all of this would be useless were it not for the remarkable accuracy, stability, and robustness of the solver. From an ‘Engineering in South Africa’ perspective, i.e. being as resourceful as possible, FloEFD really resonates with our way of thinking.
FloEFD, the only CFD software that can be used as an Engineering tool.
Part I of this series asked the question: “What do you do when faced with analysing a shell and tube heat exchanger as in the model shown in Figure 1”? The discussion in Part I revolved around the solution of the ‘internal pipe flow with heat transfer problem’ and how FloEFD can be used as an engineering tool in this regard thanks to the SmartCells™ technology. Let’s take the discussion around the SmartCells technology further then. FloEFD is fully CAD-embedded, and by fully CAD-embedded we don’t mean it is just an interface plug-in to some CAD software. No, we mean that FloEFD is tied directly to the CAD model, literally to the background mathematical definitions that make the CAD geometries look the way they do. So being fully CAD-embedded in this strict sense of the term has a set of serious advantages:
There is no translation to some intermediate neutral file format, i.e. NO information gets ‘lost in translation’, literally speaking.
Because of this direct link to the CAD model FloEFD will recognize any solid feature, regardless of size.
And to top it all, FloEFD will also use geometric features such as curvature during the calculation, as illustrated in Figure 2.
Couple the feature from point 3 above with the two-scale wall function employed by FloEFD to calculate the boundary layer, and much coarser meshes can be used to generate reliable and useful results, as demonstrated in Figure 3. The two-scale wall function forms part of the “enhanced turbulence modelling” approach employed by FloEFD. The technology decides automatically whether the boundary layer is “thin” or “thick” relative to the characteristic cell size and applies the relevant boundary layer calculation. The result in Figure 3 clearly shows the “thin boundary layer” of the two-scale wall function model at work. Again, as with Part I, if you’ve ever wondered exactly how well FloEFD performs in this regard, the following observations may be very beneficial.
Figure 1: Shell and Tube heat exchanger.
Figure 2. FloEFD SmartCells – Capturing curvature of geometry.
Figure 3: FloEFD SmartCells – Capturing the boundary layer.
Part II: External flow over a heated cylinder
So then, what about the flow on the outside of the pipes, i.e. the ‘shell-side’ flow of the heat exchanger in question? To represent the ‘shell-side’ flow we will consider the standard validation example of external flow over a heated cylinder. In this analysis, only the heat transfer behavior is considered, not the drag per se. Again, the mesh was set up such that the characteristic number of cells across the diameter was varied incrementally. Consider the graph in Figure 4, which shows the Nusselt number prediction for several mesh densities across a wide range of Reynolds numbers. It is evident from the graph that, regardless of the mesh density, the FloEFD prediction is very good, always within the scatter of the experimental data, even considering the extremely coarse meshes used in CFD terms, i.e. four to ten cells per diameter. See especially the close-up image showing the four- and six-cell mesh results. It is evident, then, that similar to the internal pipe flow case in Part I, FloEFD is capable of producing the same level of results for this external flow case with Reynolds numbers ranging across four orders of magnitude, all with the same mesh.
Figure 4: External flow over a heated cylinder – FloEFD prediction of the Nusselt number.
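For context, the experimental scatter in plots like Figure 4 is commonly summarised by textbook correlations such as Churchill-Bernstein, which covers exactly this kind of wide Reynolds number range. A quick sketch (a hand calculation for comparison, not the FloEFD model itself; Pr ≈ 0.7 for air is an assumption):

```python
# Churchill-Bernstein correlation: average Nusselt number for crossflow over a
# cylinder, valid across a very wide Reynolds number range (illustration only).
def churchill_bernstein_nu(re, pr):
    term = (0.62 * re**0.5 * pr**(1.0 / 3.0)) / (1.0 + (0.4 / pr)**(2.0 / 3.0))**0.25
    return 0.3 + term * (1.0 + (re / 282_000.0)**(5.0 / 8.0))**(4.0 / 5.0)

# Four orders of magnitude in Re with air (Pr ~ 0.7 assumed), as in the study.
for re in (1e2, 1e3, 1e4, 1e5, 1e6):
    print(f"Re = {re:>9.0f}  Nu = {churchill_bernstein_nu(re, 0.7):8.1f}")
```

A correlation like this is a useful sanity check on any external-flow heat transfer result, coarse mesh or not.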
The above observations, fortunately, align very well with the internal flow results from Part I, in that one should also be able to generate very useful engineering results for the heat transfer in an external flow with meshes as coarse as just four characteristic cells across the pipe diameter. It does seem that six characteristic cells per pipe diameter would be more desirable, but for the purposes of this engineering approach, the ‘four cells per diameter’ case is more than sufficient and will be used when analysing the full heat exchanger in Part III.
What do you do when faced with analysing a shell and tube heat exchanger as in the model shown in Figure 1? I can already hear you saying “you want to ‘C..F..D’ this thing!? There’s like a thousand meters worth of piping..?” Quite literally in fact: approximately 1km in total, with a 1mm wall thickness and a total of 800 bends. Thoughts that run through my mind are: “How big is this mesh going to be? How long is it going to take to solve? I only have a quad-core laptop (at least with 32GB of memory, which helps).” And if I were to use anything other than FloEFD, I’d also think “with all those bends I’m probably going to have to remodel the piping so that I can HEX-mesh it…or something”. It seems overwhelming at first, because most of the time we engineers simply don’t have time for all of that; we need answers and we needed them yesterday!
Fortunately, this is exactly where FloEFD starts to make a lot of sense, especially for the internal pipe flow, where the SmartCells™ technology within FloEFD really comes into play. SmartCells will recognize directly from the CAD geometry whether it is a pipe or a channel and, based on the number of cells across this pipe or channel, apply a textbook engineering calculation (1D) for the pressure drop and heat transfer when there are insufficient cells across the pipe to numerically resolve the flow. Alternatively, when there is a sufficient number of cells across the pipe, SmartCells will automatically switch to resolving the flow field (3D) with the numerical grid. But if you’ve ever wondered exactly how well FloEFD performs in this regard, the following observations may be very beneficial. Let us start this discussion by looking first of all at solving internal pipe flow with heat transfer in FloEFD.
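The ‘textbook’ 1D calculation mentioned above is, in spirit, the kind of correlation found in heat transfer handbooks. The exact correlations inside FloEFD are proprietary; the classic Dittus-Boelter relation below is shown purely to illustrate what such an engineering calculation looks like:

```python
# Classic textbook correlation for fully developed turbulent internal pipe flow
# (an illustration of a 1D 'engineering calculation'; not FloEFD's internal model).
def dittus_boelter_nu(re, pr, heating=True):
    """Nusselt number Nu = 0.023 Re^0.8 Pr^n, with n = 0.4 (heating) or 0.3 (cooling)."""
    n = 0.4 if heating else 0.3
    return 0.023 * re**0.8 * pr**n

# Example: air at the study's highest Reynolds number, with Pr ~ 0.7 assumed.
nu = dittus_boelter_nu(600_000, 0.7)
print(round(nu, 1))
```

From Nu, the heat transfer coefficient follows directly as h = Nu·k/d, which is all a 1D network needs per pipe segment.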
Figure 1: Heat exchanger example.
Part I: Internal pipe flow with heat transfer
See the FloEFD validation examples; here, let’s consider an example slightly more relevant to the heat exchanger at hand. Figure 2 shows the FloEFD model of a 10-pass pipe layout with internal flow. Heat transfer to the internal fluid is modeled with a Heat Transfer Coefficient applied to the outer wall boundary, allowing for the calculation of conduction through the wall along with the conjugate heat transfer at the fluid-solid interface on the internal pipe surface. Radiation is neglected for this example. The mesh was generated such that the characteristic number of cells across the diameter of the pipe was gradually increased, starting with as few as 2 cells across the pipe diameter and going up to 6 cells. Figure 3 illustrates the typical Cartesian mesh used. One other very important aspect of the SmartCells technology is the “Thin walls” technology, which allows the original Cartesian cells to be divided into multiple control volumes at the solid-fluid boundaries, such that they can contain either a fluid or solid control volume, or a series of both, and still calculate the conjugate heat transfer at the solid-fluid interfaces. As you can see in Figure 3, there is no need to generate a ‘body-fitted’ mesh that adapts to the solid boundaries.
Figure 2: FloEFD CAD model of 10-pass pipe layout.
Figure 3: FloEFD mesh resolution.
Now let us compare the results from FloEFD with those of the very reliable 1D thermal-hydraulic system solution called Flownex (developed locally here in South Africa). The Flownex model of the same pipe layout is shown in Figure 4. Consider the graph of the total heat transfer presented in Figure 5. The FloEFD results are displayed with respect to increasing mesh density and compared to the Flownex result. A band of +10% and -10% of the Flownex result is also shown to add some perspective to the comparison. It turns out that for this example the heat transfer prediction by FloEFD is always within the +/-10% band of the Flownex result, regardless of the mesh density. Quite fascinating really.
Figure 4: 1D Flownex model of 10-pass pipe layout.
Figure 5: FloEFD versus Flownex results comparison.
It should be noted that I am only showing one example here, but an extensive study comparing first 1-pass, then 2-pass and finally 10-pass pipe layouts, with flows at varying Reynolds numbers (as high as Re=600,000 with air at 45m/s), produced very similar behavior throughout. Consider Figure 6, which shows the expanded study results for the 1-pass and 10-pass pipe layouts respectively, with air as the fluid at vastly different flow rates. What is so astonishing about these results is that even at much higher velocities FloEFD predicts a total heat transfer still within 10% of the Flownex result, on what can only be considered ridiculously coarse meshes in CFD terms. I want to go right out and say that FloEFD is the only CFD solution that will allow you to use the same level of mesh resolution and produce the same level of accuracy across a wide range of Reynolds number flows – I just want to let that sink in for a moment… I will restate and substantiate this claim in Part II of this series. For the sake of everybody’s curiosity, one further interesting observation: it seems that the switchover point from the engineering calculation to the fully resolved pure CFD solution happens at around eight to ten cells. Beyond this point, one can see a sudden jump in the heat transfer prediction as the mesh resolution is increased, while all results remain within the +/-10% band of the Flownex result.
Figure 6: FloEFD vs. Flownex for 1-pass and 10-pass pipe layouts with Air as the fluid.
In conclusion, considering the observations above, it is evident that one can use FloEFD as an engineering tool for solving internal pipe flow with heat transfer by utilising the SmartCells technology and resolving the pipe cross-section with meshes as coarse as 4 to 6 characteristic cells across the diameter. This makes it possible to attempt large heat exchanger models with far fewer computer and engineering resources than one would expect with CFD, provided the same approach holds for the flow external to the tubes, or shell side, which will be investigated in Part II. The full heat exchanger will be discussed in Part III.
I recently attended ESTEQ’s FEA101 course – the practical Finite Element Analysis (FEA) course which focuses on linear static, buckling and modal analysis. If you are unfamiliar with the world of FEA, a simple explanation is that it is a numerical method used to solve physics and engineering problems, and FEA software was developed to solve such problems digitally. This ultimately results in red, green and blue pictures which aren’t only pretty but save you from sleepless nights, and probably a couple of trees in the process. Sounds good, right? Here’s the catch: like any method, the Finite Element Method has rules and procedures that will lead to the correct answer, but an untrained user can make the smallest mistake with dire consequences. It is for this reason that I attended the course, and I am tremendously glad that I did.
The course covered the theory behind FEA, and although I did not study it at university, I was provided with the fundamental knowledge required to use FEA software and avoid fundamental mistakes. The course was product independent, meaning that it could be completed using any FEA software. Another major perk was that it was presented by an experienced and passionate instructor, Paul Naudé, who went the extra mile to ensure that everyone was on track. Drawing on his previous experience, he could give not only real-life engineering examples of the uses of FEA, but also insight into the world in which the software is used, expanding on the engineering judgment needed with an emphasis on time, cost, and quality. The exercises in the course were extremely helpful and, with Paul’s help, the concepts and methods were easily understood.
Here are the 5 things I learned from this course:
Ask the Right Questions
The course emphasised asking the right questions. Asking the right questions before beginning an FEA model will save you a lot of precious time. This applies not only to modeling the components but also to solving time.
The fundamental theory of the Finite Element Method.
This made the whole course significantly more comprehensible. Activities could be completed not by just clicking the mouse, but by understanding what you are doing and why.
The implications of solving time.
The solving time can increase exponentially with more complex geometry, for example, a full 3D solid compared to a 2D surface. Solving times can range from a few seconds to many hours, based on the complexity of the FEM model. Knowing how to simplify a model and manipulate the geometry can save you valuable time.
The limitations of Finite Element Analysis.
The major limitation of any software is the user. Knowledge of the typical errors, and an emphasis on vital inputs such as the boundary conditions, minimises the common mistakes associated with FEA. These aspects were explained thoroughly in the course, providing the trainees with a method of identifying an error and recognising its root cause.
FEA Software Independence
The course was not specific to one software package. The content covered was universal, but the exercise booklets provided were specific to the software each trainee was comfortable with.
The NICD has identified the Enterprise Factory in Polokwane as the source of the current listeriosis outbreak in South Africa, the worst in recent history, with 180 people having lost their lives so far. Cold meat products have been recalled and returned around the country to prevent any further contamination. For the businesses involved, the impact of listeriosis could also be fatal: Tiger Brands’ share price dipped almost 13%, and a fine of 10% of annual turnover could be imposed. More than that, trust in the brand has been compromised and could take years to recover. This is any food company’s worst nightmare.
It will be interesting to find out how this happened. There have been many warnings regarding the increasing number of listeriosis cases occurring since December. Tiger Brands said that they have proactively amplified testing for listeria at their facilities and found low counts that are within industry guidelines, but something slipped through the cracks. Was the measurement equipment faulty? Were the tests conducted incorrectly? Or was the laboratory information just mismanaged?
What Would We Do?
Ask yourself how your LIMS (Laboratory Information Management System) could prevent this from happening. Are processes in place to ensure that your measurement equipment is functioning correctly? Or do you rely on a system that might be outdated or hasn’t been implemented correctly? A solution to manage these processes and ensure that you know exactly what is happening in your factory is critical in the food industry.
Our suggestion would be to introduce and correctly implement a next-generation LIMS such as Simatic IT Unilab. This is a system developed by Siemens that can be used by large enterprises or small labs. It enables you to efficiently manage the quality and safety of your products and to ensure that your factory is safe and adheres to industry standards, without anything being overlooked.
Unilab is a future-proof LIMS with no client-side installation required.
It harnesses HTML5 technology to run on any device with any browser.
Its intuitive user interface presents the right information to the relevant people in a simple manner.
Dynamic instrument connections are supported for data exchange.
Result verification and validation are done upon entry.
If you would like to know more about this solution, you can contact us or leave a comment below:
There are many terms thrown around when it comes to digital manufacturing and, in particular, simulation modeling. This article will discuss the differences between agent-based modeling, discrete event simulation modeling, and continuous modeling.
In short, digital manufacturing is the concept of creating a duplicate (“digital twin”) of a system within a manufacturing environment. Proposed actions or solutions can then be safely tested in the virtual world before spending large amounts of capital to roll out the solution.
Different simulation model approaches
Agent-based simulation modeling
Now that we know what digital manufacturing is, we can look at the different types of simulation models that are out there. In short, agent-based modeling (or ABS) is the simulation of individual agents (on a micro level) and how they interact with each other and their environment (on a macro level). A typical example would be a logistics model, as seen in Figure 1, where the roads, the entry and exit points (onto the roads), how the lanes work, and vehicles passing each other are created as rules forming the basic frame within which the model will run. Then the agents themselves are created – cars, buses and trucks – each with their own unique behaviors. These agents are then released into the system to see how they react to each other and to the roads on which they drive. The key thing here is that the “main” logic resides within the entities: they make the decisions. Typical application fields are biology, ecology, and social science.
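As a toy sketch of the ‘logic lives in the agents’ idea (all names and rules here are invented for illustration): each car decides its own move, and the only system-level rule is that cars cannot overtake.

```python
import random

# Toy agent-based sketch: each car (agent) carries its own speed rule; the
# 'road' only enforces a no-overtaking constraint. Purely illustrative.
random.seed(1)

class Car:
    def __init__(self, pos, speed):
        self.pos, self.speed = pos, speed

    def step(self, car_ahead):
        # Agent-level logic: slow down to keep at least one unit of gap.
        gap = car_ahead.pos - self.pos if car_ahead else float("inf")
        self.pos += min(self.speed, max(gap - 1, 0))

# Five cars spaced along the road, each with its own preferred speed.
cars = [Car(pos=10 * i, speed=random.choice([2, 3, 4])) for i in range(5)]
for _ in range(20):                       # macro behaviour emerges from micro rules
    for i, car in enumerate(cars):
        ahead = cars[i + 1] if i + 1 < len(cars) else None
        car.step(ahead)

positions = [c.pos for c in cars]
print(positions)
```

Note there is no central schedule: traffic patterns such as queues behind a slow car simply emerge from the per-agent rules.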
Discrete event simulation modeling (or DES), on the other hand, is a modeling approach where the entire system is modeled in detail and the logic is encapsulated within the framework of the system. The entities are, for all intents and purposes, dumb: they just move through the system, and the system decides what to do with them. Discrete in this case means that the model’s clock does not advance second by second; it jumps from one event to the next. If there is a 10-hour plant shutdown in your model, you as the viewer won’t witness this time gap – the model will simply jump past it to the next event. Table 1 shows a comparison of DES models and ABS models, from the “Journal of Simulation”.
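The ‘jumping from event to event’ idea can be sketched in a few lines with a priority queue (a toy event kernel for illustration, not tied to any particular DES package):

```python
import heapq

# Minimal discrete-event kernel: time jumps from event to event, so a 10-hour
# shutdown costs nothing to simulate. Event names here are illustrative.
events = []  # priority queue of (time, description)
heapq.heappush(events, (0.0, "part arrives"))
heapq.heappush(events, (2.0, "machining done"))
heapq.heappush(events, (12.0, "plant restarts after 10h shutdown"))

clock = 0.0
log = []
while events:
    clock, what = heapq.heappop(events)   # jump straight to the next event
    log.append((clock, what))

print(log)
```

The 10-hour gap between the second and third events consumes no computation at all, which is precisely why DES scales to long simulated horizons.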
These days most DES software comes standard with object-oriented building blocks. This means that pure DES models, hybrid DES/ABS models (with smart logic/rules in the entities as well as the system), and pure ABS models can all be built from these software packages. It then depends on the modeling approach used to define the problem and build the model.
Continuous simulation modeling
The last approach is continuous simulation. In contrast to DES, it is used for systems in which the variables can change continuously. As an example, a normal bank queuing problem can be modeled with DES because the number of people in the system at any point in time can only take discrete values. Good examples of continuous behavior are any type of flow, like the volume in a tank measured against time as water is flushed out of the system. Just as with the earlier comparison between ABS and DES, we find that in modern-day software, models are often hybrids of continuous and DES. A fast-moving bottle-filling factory line is an example: the entities themselves represent discrete units entering and exiting the system at discrete moments in time, yet the line pushes so many bottles through per second that the DES model begins to look like a continuous model.
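The tank example can be sketched with a fixed-step integration of dV/dt = -k·√V (a Torricelli-style outflow; the constant k and the step size are arbitrary illustration values):

```python
import math

# Continuous sketch: water volume in a draining tank, integrated with a
# fixed-step Euler scheme. The state changes every step, not at sparse events.
V, k, dt = 100.0, 0.5, 0.01   # volume [litres], outflow constant, time step [s]
t = 0.0
while V > 1.0:                 # march time forward until the tank is nearly empty
    V = max(V - k * math.sqrt(V) * dt, 0.0)
    t += dt

print(round(t, 2))
```

Contrast this with the DES clock: here every small step is computed, because the variable of interest evolves continuously between any two points in time.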
Figure 2: Hybrid 3D model 
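The tanker example above can be sketched as a continuous simulation. The snippet below is an illustrative sketch (the function name and parameter values are assumptions for the example): instead of jumping between events, it steps the clock forward in small fixed increments and integrates the flow rate, so the volume changes smoothly over time.

```python
def drain_tank(volume, outflow_rate, dt=0.01, t_max=60.0):
    """Continuous-style simulation: integrate dV/dt = -outflow_rate
    with a fixed time step (explicit Euler) until the tank is empty."""
    t = 0.0
    trace = [(t, volume)]
    while volume > 0.0 and t < t_max:
        volume = max(0.0, volume - outflow_rate * dt)  # volume cannot go negative
        t += dt
        trace.append((t, volume))
    return trace

# 100 units draining at 5 units/s: the volume falls smoothly
# and reaches zero at roughly t = 20 s.
trace = drain_tank(volume=100.0, outflow_rate=5.0)
```

Contrast this with the DES view: here every time step is computed whether or not anything interesting happens, which is exactly why hybrid tools switch between the two styles depending on the part of the system being modeled.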
Agent-based-models.com (2017). Agent-Based Modeling: An Introduction. [online] Available at: http://www.agent-based-models.com/blog/2010/03/30/agent-based-modeling/ [Accessed 30 Oct. 2010].
Siebers, P., Macal, C., Garnett, J., Buxton, D. and Pidd, M. (2017). Discrete-event simulation is dead, long live agent-based simulation! [online] SpringerLink. Available at: https://link.springer.com/article/10.1057/jos.2010.14 [Accessed 27 Nov. 2017].
Matloff, N. (2008). Introduction to Discrete-Event Simulation and the SimPy Language. [ebook] p. 3. Available at: http://heather.cs.ucdavis.edu/~matloff/156/PLN/DESimIntro.pdf [Accessed 27 Nov. 2017].
Agents.fel.cvut.cz (2018). Agent-based Computing for Intelligent Transport Systems. [online] Available at: http://agents.fel.cvut.cz/projects/agents4its# [Accessed 5 Feb. 2018].
Flickr (2018). Tecnomatix 12 Plant Simulation 3D Visualization. [online] Available at: https://www.flickr.com/photos/31274959@N08/15718572982 [Accessed 5 Feb. 2018].
Good day and thank you for taking the time to read through this article.
A quick introduction: in this series of posts we would like to introduce the reader to the Siemens PLM Simcenter 3D technologies. We will predominantly focus on the Simcenter 3D suite of technologies (what is meant by this will become clear in this article), but it is important to mention that it forms part of a visionary, and already available, approach to integrating solutions that combine system simulation (Simcenter Amesim), 3D CAE (Simcenter 3D) and test (LMS Testing Solutions) to assist in predicting performance throughout the product lifecycle. This suite of technologies forms part of the Simcenter portfolio (all in a managed environment).
Although we are always excited and happy to discuss our solutions (so please be in touch), we believe that a series covering Simcenter 3D will best serve the community, as it is the most widely used and implemented.
As part of this series, we will delve deeper into the technology's architecture in terms of positioning and licensing, but also include technical content specifically generated to transfer a specific application skillset (by using the appropriate technology).
This being said, returning to the title:
What is Simcenter 3D?
Just a note – should time be a constraint (which it normally is), please feel free to skip to the Never Enough Time (NET) section below. We won’t be offended.
Simcenter 3D is an integrated and open platform that merges various discipline-specific technologies in an environment where the user has access to CAD and design tools, all in an effort to improve efficiency and to optimise workflows for today's complex development cycles.
Combining various technologies, such as NX Nastran, NX CAE, the LMS solvers (Virtual Lab, which utilises the DADS solver, and Amesim) and the recent acquisitions CD-adapco (STAR and HEEDS) and Mentor Graphics (Electronic Design Automation; FloEFD), with the integrated physics of structural, acoustics, flow, thermal, motion and composites provides a best-in-class solution for product development and consulting teams. As always, a picture is worth a thousand words; for more clarity please see the image below (note that Engineering Desktop contains the relevant pre- and post-processing tools).
But why be satisfied with only integrated physics coupled with a world-class CAD engine? Even though these tools exist, we don't develop in silos (or rather, we shouldn't), and specialist tools will always play a role. We also have legacy technologies with heaps of valuable content.
For this reason, Simcenter 3D is also open and scalable (catering for both the designer and the analyst). Simcenter 3D can be used with various CAD formats and is solver and geometry independent. This enables the user to prepare models for industry-standard technologies and also provides specific pre- and post-processing tools for these technologies, making efficient use of historical data and offering increased capabilities for getting models simulation-ready when specialised tools are required. Insert another smart comment about the relevance of pictures here.
In summary (NET readers, please join in here.)
When one narrows this down and takes a look at the business impact Simcenter 3D can have, we find the following benefits:
Reduced time spent on modeling and model preparation through the integration of geometry modeling and analysis (CAD <-> CAE).
Faster simulation (Concept Evaluation) results due to CAD associative simulation models.
Ease of adoption – a shared user interface across all users and applications.
Streamlined workflows for simulation processes.
Multiphysics capability allows for the simulation of real-world applications.
Overall improved processes with best in class modeling techniques and solver schemes.
Flexible licensing options (to be discussed in a future post).
Since actions speak louder than words, I would like to reference one of our users who has implemented Siemens PLM technologies across the organisation. This is their feedback after moving away from different CAD and FEA technologies.
“Having an integrated CAD-to-Simulation environment greatly enhances the efficiency of design iterations.”
Hennie Roodt, Simera
Our environment is changing. The products that we design are changing, as are the individuals that we work with. If change exists all around us, we definitely require a technology partner that recognises this and strives to continuously adapt and change to meet these new demands.
For some further information, please feel free to be in touch with us here at ESTEQ. Again, we are passionate about people and technology and enjoy the realm where these intersect.
For more ‘marketing’ related content (hey, you never know when this might be valuable!) I provide some links below.
By definition, simulation modeling is the process of creating and analysing a digital prototype of a physical model to predict its performance in the real world. A model is a representation of the construction and working of some system of interest.
The topic of discussion will focus on the increasing role of 3D simulation, with the main focus on Discrete Event Simulation (DES). DES is based on the assumption that the system changes instantaneously in response to certain discrete events. Simulation modeling has opened up a whole new world of mathematical analysis of the impact that uncertain inputs and the decisions we make have on the outcomes we care about. We find ourselves in an era where technology advancements change the way we do things.
2D vs 3D simulation
More and more, we are beginning to see 3D simulation playing a big role in simulation modeling. Traditional 2D modeling has been replaced with impressive 3D data, providing visuals that are not only appealing to the audience but also represent what is physically on the factory floor. 3D simulation provides enhanced visuals and a level of accuracy that could not otherwise have been achieved in 2D modeling (as seen in Figure 1 – 3D Factory Model in Tecnomatix Plant Simulation). We are now able to pull in an object's CAD data, a point cloud of the facility and so on to develop a digital prototype.
Figure 1 – 3D Factory Model in Tecnomatix Plant Simulation 
Figure 2 – Point cloud image 
Instead of pulling in a 2D drawing of your facility, we see point cloud images (Figure 2 – Point cloud image) being used in 3D simulation models. A point cloud is a set of data points in a three-dimensional coordinate system. Capturing one is quite pricey but comes with its own benefits: the level of accuracy of these point clouds enables us to check for possible collisions with equipment, which would not have been possible in 2D models. The spatial relation between objects is very important, which is why point clouds are gaining popularity among modelers. The information gained from point cloud images allows for smooth execution of facility renovations and retrofit projects.
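At its simplest, a point cloud really is just a list of (x, y, z) coordinates, and the collision check described above reduces to a distance calculation against the equipment's geometry. The sketch below is a deliberately simplified illustration (the function name, the sphere approximation of the equipment, and the sample coordinates are all assumptions for the example); real tools work with millions of points and proper CAD geometry.

```python
import math

def min_clearance(point_cloud, equipment_center, equipment_radius):
    """Smallest gap between a scanned point cloud and a piece of
    equipment approximated as a sphere. A negative result means at
    least one scanned point lies inside the equipment's volume,
    i.e. a collision."""
    cx, cy, cz = equipment_center
    best = math.inf
    for x, y, z in point_cloud:
        dist = math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
        best = min(best, dist - equipment_radius)
    return best

# Three scanned points checked against a 0.5 m-radius machine at the origin;
# the point at (0.3, 0, 0) sits inside the machine's volume.
cloud = [(2.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.3, 0.0, 0.0)]
clearance = min_clearance(cloud, (0.0, 0.0, 0.0), 0.5)
```

A production system would use spatial indexing (octrees, k-d trees) to make this scale, but the underlying idea is the same: with accurate scanned coordinates, clashes can be detected before anything is moved on the real factory floor.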
3D simulation models provide an opportunity for non-simulation personnel to get a better understanding of the model. When presented with visuals of their facility, machines and so on, the team is able to offer more input and engage in the simulation process more effectively, making the whole exercise meaningful and producing even more accurate results.
Previously, presenting simulation models to management was a tedious and daunting task, because most people found it difficult to relate to a 2D modeling environment with objects flying around, as seen in Figure 3 – 2D Model of Production Facility. As soon as you present familiar visuals in 3D, people are able to relate and make better decisions based on what they see in front of them.
Some might argue that it takes a lot of effort and time to build a model in 3D which essentially does not add any statistical significance. My argument is that the response and support you receive from key stakeholders will determine how far your project goes: if you can get your audience to understand what you are building and aiming to achieve, your results will be better.
3D in Education
3D modeling together with Virtual Reality (VR) has redefined the way learning is taking place (Figure 4 – University of Pretoria VR Centre). The University of Pretoria has a state-of-the-art Kumba Virtual Reality Centre for Mine Design. The VR centre presents an environment for 'immersive' experiences destined to change the face of education, research and design in mining and beyond.
Figure 4– University of Pretoria VR Centre
The center is set to enhance learning, training, and research in operational risks across industries through an innovative approach to information optimization and visualization. Essentially, such facilities are not limited to just the mining industry and can be used in other fields of study. Imagine medical students performing open heart surgery simulations!
Such technological advances and developments have revolutionized the way simulation modeling has traditionally been done. We can now create solutions to otherwise complex challenges that we are faced with in industry today.
Workers are now able to identify and get a better understanding of what is going on in the simulation model presented to them. They are able to physically/visually see the effects of certain decisions they make while working. This makes the whole process interactive and a better learning experience.
Benefits in a nutshell
Precision and Control
3D Factory Model in Tecnomatix Plant Simulation taken from Tecnomatix Plant Simulation V14 example models.
The Basler PowerPack for Microscopy now addresses challenging fluorescence applications. New monochrome Basler Microscopy ace cameras offer best-in-class imaging performance thanks to Sony's latest CMOS technology. The Basler Microscopy Software 2.0 comes with a dark skin mode and a fluorescence color preset.
Ahrensburg, October 25, 2017 – Camera manufacturer Basler enhances its PowerPack for Microscopy to address the challenging requirements of fluorescence imaging. The choice of cameras has been rounded off with powerful monochrome Microscopy ace cameras featuring Sony's latest CMOS technology. The Basler Microscopy Software increases user convenience with a dark skin mode and a fluorescence color preset, as well as additional feature upgrades.
Basler offers two cameras which are particularly suitable for fluorescence imaging: the Microscopy ace 2.3 MP Mono offers a resolution of 2.3 MP combined with high sensitivity thanks to its large pixel size, while the Microscopy ace 5.1 MP Mono provides an ideal balance between high resolution (5.1 MP), large pixel size and a low noise level. An important factor in fluorescence applications is keeping light emission low to reduce the risk of photobleaching the sample; the cameras provide high quantum efficiency and sensitivity, so images can be taken even in low light. Besides suitable frame rates, both cameras deliver a high dynamic range for recording the differentiation between subject and background.
The Basler Microscopy Software included in the Basler PowerPack has been released in version 2.0: the graphical user interface can be switched to a dark skin mode to reduce the light emitted from the display towards the sample. This feature also reduces user eye fatigue and stress when working in a dark environment.
To make fluorescence imaging more convenient and to save the user's time, the software has also been enhanced with color presets for the most common fluorescence markers. For quick access, these presets can be activated with a single click and configured to individual needs. Images still remain greyscale for further processing in other applications, or can be saved as a color version.
The new 2.0 version of the Basler Microscopy Software also offers exposure compensation and a new zoom feature for stereo microscopes.
The new Basler Video Recording Software captures single images, high-speed videos for slow-motion analysis and image sequences for time-lapse microscopy. It comes with the Basler PowerPack for Microscopy and also works with all Basler USB 3.0 cameras.
Ahrensburg, 11 October 2017 – Camera manufacturer Basler is now offering a software solution to enhance the possibilities of microscopic imaging. Taking single images, recording videos, as well as image or video sequences, becomes very simple and intuitive. The recording software even offers camera control features to improve image quality, to set up different options for recording and to use hardware trigger signals.
The Basler Video Recording Software enables the capture of slow-motion videos. Such recordings are useful for motion analysis where fast-moving objects need to be investigated. This is particularly crucial in applications like material analysis, sperm analysis or for monitoring cell transportation processes.
In addition, the software offers two options for time-lapse microscopy: take uncompressed image sequences for further analysis and processing, or capture time-lapse videos for monitoring processes and changes in samples, as well as for publications. The time interval for both images and video can be set to your needs, and recording can be started and stopped automatically.
When using a Basler Microscopy ace camera, the software even takes images or videos automatically when using hardware trigger signals. This comes in handy for many use cases and can, for example, support hands-free documentation during material inspection when using a foot-operated switch connected to the camera.
Comprehensive software features at a glance:
Live view and camera control
Image adjustments and automated settings
Videos in modern MPEG-4 format
High-speed recordings for slow-motion analysis
Image and video sequences for time-lapse microscopy
Image capturing with hardware trigger signal
Easy installation and intuitive user interface
Supported operating systems: Windows 7, Windows 8.1, Windows 10 – 32 bit and 64 bit
User-friendly software design for ease of use
The Basler Video Recording Software comes with each Basler PowerPack for Microscopy and all Basler USB 3.0 cameras can be connected. The software can be downloaded from the Basler website: www.baslerweb.com/VideoRecordingSoftware