

#### Session 1: Guest Session

#### Time: 9:45 - 12:30

9:45 - 10:30

**Optimization and stochastic analysis - towards robust design**

*C. Bucher (Vienna University of Technology)*

The effect of stochastic uncertainties in structural parameters and/or environmental conditions frequently leads to imperfections which can significantly reduce the performance of a highly optimized design obtained from traditional deterministic procedures. Robustness analysis takes this into account already during the optimization process, thus leading to optimized structural designs that are more useful in practice. The presentation covers the basic concepts of combining optimization and stochastic analysis for the purpose of reducing sensitivities and increasing robustness. Several simple examples demonstrate the potentially achievable advantages as well as the inherent additional effort required.
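The idea of trading nominal performance against sensitivity to scatter can be sketched in a few lines. The response function, parameter values, and the mean-plus-standard-deviation objective below are illustrative assumptions, not taken from the talk:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
EPS = rng.normal(0.0, 0.1, size=500)   # fixed imperfection sample (common random numbers)

def performance(design, imperfection):
    # Hypothetical response: distance from a deterministic optimum at 2.0,
    # plus a scatter term whose influence grows with the design value.
    return (design - 2.0) ** 2 + design ** 2 * imperfection

def robust_objective(x, weight=1.0):
    # Mean plus weighted standard deviation of the response under scatter.
    vals = performance(x[0], EPS)
    return vals.mean() + weight * vals.std()

deterministic = minimize(lambda x: performance(x[0], 0.0), x0=[0.5], method="Nelder-Mead")
robust = minimize(robust_objective, x0=[0.5], method="Nelder-Mead")
```

Because larger design values amplify the scatter term, the robust optimum is pulled away from the deterministic one; the fixed sample keeps the objective smooth for the optimizer.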

10:30 - 11:00

**A program view on POF modeling with uncertainties**

*R. Strunz (ASTRIUM GmbH - Space Transportation)*

The program manager's preconception of probabilistic structural analyses is: "We do not need it, because it is not only time consuming but also cost prohibitive!" This contradicts the customer's request for shortened development times and reduced development budgets. Historically, design verification has been one of the drivers of cost overruns and schedule slippages, mainly caused by lengthy test-analyze-and-fix (TAAF) cycles. Breaking up these cycles, or shortening a single cycle, is paramount for future program success.

The talk discusses how physics-of-failure (POF) modeling in combination with probabilistic structural analyses (PSA) can be used in a risk-informed, satisficing decision-making methodology that trades off the three areas of concern: cost, time, and demonstrated reliability. In that context, POF modeling with uncertainties is used not only in an FMMEA but also in a Bayesian-estimation-based reliability-as-an-independent-variable (RAIV) strategy. Case studies highlight the impact of test failures and of knowledge transfer on the three areas of concern. By these means, the usefulness of POF modeling with uncertainties is substantiated.

11:00 - 11:30

**Continuum damage mechanics with ANSYS USERMAT: numerical implementation and application for life prediction of rocket combustors**

*W. Schwarz (ASTRIUM GmbH - Space Transportation)*

A viscoplastic Chaboche material law is extended to account for material damage based on the effective stress concept. The damage evolution equations are formulated to be valid for failure under tensile loads as well as for low cycle fatigue. A second-order tensorial damage state variable is employed in order to account for deformation-induced anisotropy. The unified material damage model is discretized and numerically implemented in the finite element software ANSYS as a user-defined material. It is applied to the life prediction of the hot gas wall of a cryogenic rocket combustion chamber, and the results are subsequently compared to data obtained during test campaigns. It is concluded that the continuum damage approach considerably improves the life prediction capabilities compared to classical Coffin-Manson based estimations.
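As a much-reduced illustration of the effective stress concept mentioned above, a scalar (isotropic) damage variable in one dimension already produces the characteristic softening; the tensorial, viscoplastic model of the talk is far richer. All material values below are assumed:

```python
import numpy as np

E = 70e3            # MPa, assumed elastic modulus
eps0 = 0.002        # assumed damage threshold strain
eps_f = 0.02        # assumed strain at complete failure

def damage(eps):
    # Linear damage evolution between threshold and failure strain.
    return np.clip((eps - eps0) / (eps_f - eps0), 0.0, 1.0)

def nominal_stress(eps):
    # Effective stress concept: the intact material law acts on the
    # effective stress, so the nominal stress carries the factor (1 - D).
    return (1.0 - damage(eps)) * E * eps
```

The response is linear-elastic up to `eps0`, then reaches a peak and softens toward zero stress at `eps_f`.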

11:30 - 12:00

**Sequential LHS Design Strategy for Reliability Analysis and Robust Optimization**

*Eva Myšáková (Czech Technical University in Prague), Matěj Lepš (Czech Technical University in Prague)*

Latin Hypercube Sampling (LHS) design strategies constitute an essential part of simulation-based reliability analysis. Two main objectives are usually placed on the resulting designs: prescribed correlations and space-filling properties. The last decade has witnessed the development of several methods for both objectives.

In detail, our contribution presents a space-filling technique for sequential LHS generation. In comparison to standard procedures for generating uniform designs, sequential strategies offer the possibility of including new samples in an existing design in case the initial set of designs is not sufficient. This is done by superimposing a new set of designs. What is completely new in our methodology is the ability to handle the non-constant boundaries that appear, e.g., within the Asymptotic Sampling method. The idea is to reuse as many already simulated samples as possible while concurrently minimizing the number of additional samples needed to fulfil the LHS restrictions. The same technology can be used for Robust (Design) Optimization, as will be shown in the final part of the contribution.
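A minimal sketch of the superposition idea, using a plain LHS generator (numpy only; the constrained, boundary-aware machinery of the contribution is not reproduced):

```python
import numpy as np

def lhs(n, d, rng):
    # Standard LHS: in every dimension exactly one point per stratum.
    samples = np.empty((n, d))
    for j in range(d):
        strata = rng.permutation(n)              # one stratum index per point
        samples[:, j] = (strata + rng.random(n)) / n
    return samples

rng = np.random.default_rng(1)
initial = lhs(8, 2, rng)                         # initial design, simulated first

# Superposition: when the initial set proves insufficient, a further LHS on
# the same domain is stacked on top; all earlier samples are reused, at the
# price that the union is no longer a strict LHS.
design = np.vstack([initial, lhs(8, 2, rng)])
```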

12:00 - 12:30

**Analysis of the energy absorption of aluminium tubes for crash boxes**

*F.O. Riemelmoser (FH Kärnten, speaker), M. Kotnik (SZ Oprema Ravne), H. Lammer (Wood Competence Center W3C), V.K. Bheemineni (FH Kärnten), B. Käfer (FH Kärnten)*

In the project UL4C, Fachhochschule Kärnten, SZ Oprema Ravne and the Wood Competence Centre W3C are developing a crash box for an electric car and working out solutions for its economic production. The crash box is designed as a hybrid of aluminum EN AW 6060 and carbon fiber reinforced plastics (CFRP). The main challenge in this project lies in the perfect adjustment of the aluminum and CFRP parts. Several factors influence the system behavior during the crash, including:

- the macro geometry, such as height, length and thickness of the crash box
- the relative size of the geometry of the aluminum and the CFRP part
- the heat treatment condition of the aluminum 6060
- the orientation of the fibers in a multilayer system

In order to find the crash box parameters that optimize the energy absorption per kg of mass, we conducted several tests on our crash sled. The results on aluminum tubes show that there are different failure mechanisms, ranging from Euler buckling through non-axisymmetric folding to axisymmetric folding and unspecific folding conditions. It is possible to plot diagrams with normalized geometrical parameters on both axes such that regimes with different folding mechanisms can be identified. It will be shown that these diagrams (we call them failure mechanism maps) depend on the geometry and, to a lesser extent, on the heat treatment conditions. In the presentation the failure mechanism map is explained, and finite element simulations are shown in order to verify our theory. An outlook is also given on how the CFRP hybrid will influence the energy absorption capability.

#### Session 2: Mathematical modeling, advances in FE-analysis & data handling

#### Time: 13:30 - 15:10

13:30 - 13:50

**Robust Model Building**

*M. Luger (INTALES GmbH), A. Grassl (INTALES GmbH)*

The analysis begins with the buildup of a model that covers all parameters in a manageable way. The finite element mesh is generated on a given geometry, and the properties needed to run the analysis are added in an automated build process. This allows large amounts of data to be handled and the structure to be optimized iteratively. During the buildup process, all model-relevant data is prepared and saved for the subsequent analysis steps.

13:50 - 14:10

**Refined Investigations and Data Manipulation for FE Post-Processing**

*B. Caillaud (INTALES GmbH), S. Müller (INTALES GmbH)*

The latest developments in computer technology now enable deeper investigations in structural analyses of FE models, even for industry-related projects. The size of the models and the number of load cases to be analyzed are increasing accordingly. As a good illustration, the Non-Linear Analysis (NLA) Manager enables the computation of full loading scenarios at a nonlinear level in an automated way, including buckling investigation, branching analysis, in-time checks of certain state variables, and the application of imperfections to the model. The amount of data generated is enormous and requires appropriate post-treatment. It becomes apparent that automated handling of data (inputs and outputs) is unavoidable in order to strike a good compromise between the level of detail of the model and the accuracy of the results-checking process. We will go through examples of in-house tools developed for FE post-processing (refined 3D analysis based on shell-element models, radius bending analysis, sandwich failure criterion investigation) that enhance the methodologies for handling large amounts of data and support the engineer in the decision-making process.

14:10 - 14:30

**Element and Materials Formulation - An Overview**

*R. Winkler (INTALES GmbH)*

The 20-year-old formulation of Abaqus' cash cow, the S4 element, still serves as a reference for general-purpose shell element formulations. Why, then, does a company like INTALES still devote considerable resources to the development of element and material routines? The implementation of specific failure and damage mechanisms, as well as the application of advanced methods for optimization and stability analysis, requires access to material and/or element routines. We therefore pursue a dual strategy. The first track is targeted at immediate application to industrial tasks: reconstructing well-proven routines within the framework of Abaqus user subroutines (S4-type shell elements, rate-form constitutive laws for anisotropic large-strain elasto-plasticity, damage laws, and failure criteria). The second track pursues a mid- to long-term strategy: making topics of recent research available for large-scale applications within the framework of our in-house FEM environment ICONA (advanced stability analysis, new element formulations, multi-phase approaches for composite materials, stochastic FEM). The talk gives an overview of the corresponding endeavors and particular challenges, as well as recent results.

14:30 - 14:50

**Computation of Bifurcation Points - A Challenging Task**

*F.-J. Falkner (University of Innsbruck)*

Numerical schemes for the direct computation of bifurcation points have been developed over the last years. The main idea of these procedures is to extend the equilibrium equations by appropriate equations describing the stability point. Newton's method is used to solve this set of nonlinear equations. The singularity of the Jacobian matrix close to the bifurcation point has so far largely been ignored in the literature. Because of this, convergence problems are encountered during the iteration when existing procedures are applied to shell problems. In the presentation these schemes are applied to shell structures.
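The extension idea can be illustrated on a scalar toy problem (not from the talk): the equilibrium residual g(u, lam) = u**3/3 - u - lam is augmented by the singularity condition dg/du = 0, and Newton's method is applied to the coupled system. For this fold (limit) point the extended Jacobian happens to remain regular:

```python
import numpy as np

def g(u, lam):
    # Toy "equilibrium" residual with a fold at (u, lam) = (1, -2/3).
    return u**3 / 3.0 - u - lam

def extended(x):
    u, lam = x
    return np.array([g(u, lam),      # equilibrium equation
                     u**2 - 1.0])    # singularity condition dg/du = 0

def jacobian(x):
    u, lam = x
    return np.array([[u**2 - 1.0, -1.0],
                     [2.0 * u,     0.0]])

x = np.array([0.5, 0.0])             # starting guess
for _ in range(50):
    dx = np.linalg.solve(jacobian(x), -extended(x))
    x = x + dx
    if np.linalg.norm(dx) < 1e-12:
        break
```

Newton converges here to the stability point (u, lam) = (1, -2/3); the convergence difficulties discussed in the talk arise for genuinely singular extended systems in shell problems.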

14:50 - 15:10

**A Material Model for Fibre-Reinforced Plastic**

*E. Eidelpes (University of Innsbruck)*

Fiber reinforced plastics (FRP) are anisotropic composite materials characterized by large differences in the stiffness of their individual constituents, fiber and matrix. Conventional models for the description of material nonlinearities with simultaneous consideration of anisotropy rely on highly complex formulations involving questionable assumptions, simplifications, and an unmanageable number of parameters. An alternative approach is adopted from the recent literature: the stiffness contribution of each constituent is calculated separately and contributes to the total stiffness matrix according to its volume fraction. The nonlinear behavior of the polymer matrix is modeled by isotropic von Mises plasticity. The fiber material model involves a Rankine-type failure criterion describing rupture and micro-buckling under tension and compression, respectively. The occurrence of failure is accompanied by a degradation of the fiber stiffness. For the numerical treatment of strain localization, a smeared crack model has been implemented. The accuracy and reliability of the implementation are demonstrated by a number of plane stress benchmark problems.
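The volume-fraction-weighted stiffness contribution described above reduces, in the simplest unidirectional linear-elastic case, to the rule of mixtures. The moduli and volume fraction below are typical assumed values, not from the talk:

```python
# Hypothetical unidirectional carbon/epoxy ply (all values assumed):
E_f, E_m = 230e3, 3.5e3        # fibre and matrix moduli in MPa
v_f = 0.6                      # fibre volume fraction

# Each constituent contributes to the stiffness by its volume fraction
# (rule of mixtures, fibre direction):
E_parallel = v_f * E_f + (1.0 - v_f) * E_m

# Transverse direction (inverse rule of mixtures) for comparison:
E_perp = 1.0 / (v_f / E_f + (1.0 - v_f) / E_m)
```

The large gap between `E_parallel` and `E_perp` is exactly the anisotropy the full model has to represent.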

#### Session 3: Stochastic structural analysis & computational aspects

#### Time: 15:40 - 17:20

15:40 - 16:00

**Stochastic methods in sensitivity and reliability analysis**

*M. Oberguggenberger (University of Innsbruck)*

A major concern in the design and analysis of structures is their reliability (avoidance of failure) and their sensitivity with respect to changes in the structural parameters (material properties, boundary conditions, geometry). Reliability and sensitivity can be assessed on the premise that uncertain inputs are modeled as random variables or random fields, using sampling-based Monte Carlo type analyses. As a rule, the calculation of the structural response involves a large finite element model and is thus computationally expensive. Methods are therefore needed that can credibly predict the behavior of large structures using only a small sample size. This talk gives a survey of the stochastic concepts and new methods needed in such a situation. Topics touched upon include correlation control in sample generation, bootstrap resampling, methods for selecting influential parameters, and the modeling of spatially distributed imperfections by random fields. The concepts of tolerance intervals and stochastic response surfaces pave the way from sampling-based sensitivity analysis to reliability.
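Bootstrap resampling, one of the topics listed, estimates the statistical uncertainty of a quantity computed from a small Monte Carlo sample without further finite element runs. A sketch with synthetic response values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Small Monte Carlo sample of a structural response (synthetic values):
response = rng.normal(100.0, 5.0, size=30)

# Resample with replacement and recompute the statistic of interest
# (here the 5%-quantile) to judge its scatter from the sample itself:
boot = np.array([
    np.quantile(rng.choice(response, size=response.size, replace=True), 0.05)
    for _ in range(2000)
])
ci = np.quantile(boot, [0.025, 0.975])             # 95% percentile interval
```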

16:00 - 16:20

**Quantitative assessment of random field models in FE buckling analyses of composite cylinders**

*V. De Groof (University of Innsbruck), M. Oberguggenberger (University of Innsbruck), H. Haller (INTALES GmbH), R. Degenhardt (DLR), A. Kling (DLR)*

The buckling load of thin-walled structures is known to be highly sensitive to imperfections. The presentation will show results of using random fields to include realistic geometrical imperfections in a finite element model, increasing the accuracy of the predicted structural behavior. Based on real geometry measurements of composite cylinders, covariance matrices are assembled to be used for the Karhunen-Loève expansion of the random fields. Different techniques, based on principal component analysis (PCA) and on analytical covariance functions, are evaluated and compared with the deterministic finite element buckling loads of the geometric measurements. This approach allows the effects of different correlation models of the geometrical imperfections to be isolated from other uncertainties such as boundary conditions, loading imperfections, and numerical errors. The performance of the different covariance models has been quantified. The results show that the random fields approach can be used to improve the predictions and the understanding of the structural behavior of thin-walled structures, especially when these predictions are based on available imperfection data, using PCA. By contrast, one should be very careful when using analytical covariance functions, since they may fail to capture the complete behavior of the structure.
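A reduced sketch of this workflow: assemble a covariance matrix (here from an assumed analytical exponential kernel rather than measured geometries), truncate its eigendecomposition as in PCA, and draw a realization via the Karhunen-Loève expansion:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)                      # measurement positions

# Assumed analytical exponential covariance, correlation length 0.2:
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)

# Eigendecomposition of the covariance matrix, modes sorted by variance:
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

# Keep the leading modes carrying 95% of the total variance (as in PCA):
m = int(np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.95)) + 1

# One realization of the imperfection field via the truncated KL expansion:
xi = rng.standard_normal(m)
field = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)
```

In the talk's setting, `C` comes from measured cylinder geometries, which is precisely why the PCA-based variant can outperform an analytical kernel like the one assumed here.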

16:20 - 16:40

**Probabilistic Identification of Material Model Parameters**

*A. Kučerová (Czech Technical University in Prague), J. Sýkora (Czech Technical University in Prague), B. Rosić (Czech Technical University in Prague), H.G. Matthies (Technische Universität Braunschweig)*

In trying to predict the behaviour of physical systems, one is often confronted with the fact that, although one has a mathematical model of the system that carries some confidence as to its fidelity, some quantities characterising the system may be only incompletely known - in other words, they are uncertain. Here we want to identify these parameters through observations or measurements of the response of the system. Such an identification can be approached in different ways. One way is to measure the difference between observed and predicted system output and to find parameters such that this difference is minimised; this optimisation approach leads to regularisation procedures. Here we take the view that our lack of knowledge, or uncertainty about the actual values of the parameters, can be described in a Bayesian way through a probabilistic model. The unknown parameter is then modelled as a random variable - the so-called prior model - and additional information on the system through measurement or observation changes the probabilistic description to the so-called posterior model. The second approach is thus a method to update the probabilistic description so as to take account of the additional information.
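In the simplest conjugate setting (normal prior on the parameter, normal measurement noise), the update from prior to posterior described above is available in closed form. All numbers are hypothetical:

```python
import numpy as np

mu0, sigma0 = 200.0, 20.0      # hypothetical prior mean / std of the parameter
sigma_e = 5.0                  # assumed measurement noise std

rng = np.random.default_rng(3)
data = rng.normal(210.0, sigma_e, size=10)      # simulated observations

# Normal-normal conjugate update: the posterior is again normal, with
# precision (inverse variance) adding up from prior and measurements.
prec = 1.0 / sigma0**2 + data.size / sigma_e**2
mu_post = (mu0 / sigma0**2 + data.sum() / sigma_e**2) / prec
sigma_post = np.sqrt(1.0 / prec)
```

The posterior mean is pulled from the prior toward the data, and the posterior standard deviation shrinks as measurements accumulate; non-conjugate models require the numerical updating schemes the talk addresses.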

16:40 - 17:00

**Sensitivity analysis of large scale spacecraft structures**

*B. Goller (INTALES GmbH)*

The increasing requirements of the spacecraft industry with respect to structural performance and mass minimization bring the need for advanced analysis strategies to the forefront. Sensitivity analysis offers the possibility to identify important parameters and critical regions of spacecraft structures and hence gives deeper insight into the finite element model. In addition, it offers the possibility to gain information for enhanced structural optimization strategies. The sensitivity analysis presented in this talk is a simulation-based approach in which the parameters are ranked by correlation coefficients. In this way, linear and nonlinear monotonic dependencies between input-output pairs are detected. Due to the large computational effort associated with evaluating the samples, enhanced simulation strategies (i.e. Latin Hypercube sampling) and a parallelization strategy have been adopted. A further challenge is posed by the large amount of data to be handled during the analysis, for which reason automated tools had to be implemented. In this talk, the basic steps of the sensitivity analysis and its computational implementation will be presented. As an application, a rocket nozzle is considered, for which the most influential parameters with respect to the failure mode of the structure are identified.
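The ranking step can be sketched with rank (Spearman) correlation coefficients, which capture exactly the linear and nonlinear monotonic dependencies mentioned; the response function here is a stand-in, not the rocket nozzle model:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 500
X = rng.uniform(-1.0, 1.0, size=(n, 3))            # three input parameters

# Stand-in response: dominated by x0, weakly monotonic in x1, blind to x2.
y = 5.0 * X[:, 0] + X[:, 1] ** 3 + 0.1 * rng.normal(size=n)

# Rank correlation detects linear and nonlinear monotonic dependencies:
rho = np.array([spearmanr(X[:, j], y)[0] for j in range(3)])
ranking = np.argsort(-np.abs(rho))                 # most influential first
```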

17:00 - 17:20

**Parallelization of the FE-solver ICONA**

*M. Gratt (University of Innsbruck)*

So far, the FEM environment ICONA is a by-product of a number of research projects jointly performed by INTALES and the University of Innsbruck. The original motivation for creating an in-house FEM code arose from severe problems with the restrictive features offered by commercial programs for the reliable treatment of severely nonlinear problems in structural analysis. In addition, further development aims at the efficient implementation of sophisticated stochastic analysis tools. In the course of a running project, the Distributed and Parallel Systems Group of the Department of Computer Science is dealing with the efficient implementation, parallelization, and optimization of the code. The C++ implementation relies on the PETSc (Portable, Extensible Toolkit for Scientific Computation) library, which provides an interface for the easy use of MPI-based parallel solvers and other numerical tools. The first part of the talk deals with the internal data flow structure of ICONA; the second part covers the parallelization of element operations and the assembly of the global stiffness matrix.
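The assembly task addressed in the second part of the talk follows the usual scatter-add pattern, sketched here serially for a 1-D bar; in a parallel code each process contributes its share of element blocks, e.g. through PETSc's `MatSetValues` with `ADD_VALUES`:

```python
import numpy as np

# 1-D bar of unit length and stiffness, discretized into 4 two-node elements
# (all values assumed for illustration).
n_elem = 4
EA, L = 1.0, 1.0
h = L / n_elem
k_e = (EA / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness

# Scatter-add each element block into the global matrix.  In the parallel
# setting each process owns a subset of elements and adds its blocks into
# a distributed sparse matrix instead of the dense += below.
K = np.zeros((n_elem + 1, n_elem + 1))
for e in range(n_elem):
    dofs = [e, e + 1]
    K[np.ix_(dofs, dofs)] += k_e
```

Because additions from different elements commute, the element loop is what parallelizes naturally once ownership of rows is distributed.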