**Keynote Speaker:** **Pascal Frey, Université Pierre et Marie Curie**

Pascal Frey is a professor of applied mathematics at the Université Pierre et Marie Curie, a member of the Jacques-Louis Lions Laboratory (Y. Maday, dir.), and a former member of the Gamma group (P. L. George, dir.) at the INRIA research center. His research areas are surface and volume mesh generation, mesh adaptation, scientific computing, and visualisation.

**Perspectives of Local Anisotropic Delaunay Mesh Adaptation**

In this lecture, we will focus on h-adaptation based on the anisotropic Delaunay kernel, in view of numerical simulations with finite element methods. An application of this scheme to rigid-body mesh movement will be presented. The robustness of the approach relies heavily on preserving mesh quality during each adaptation stage.
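Anisotropic adaptation of the kind the abstract describes measures edge lengths in a prescribed Riemannian metric rather than the Euclidean one, so that edges too long in the metric are refined. A minimal sketch of this metric length computation (the metric values and the `metric_length` helper are illustrative, not taken from the talk):

```python
import numpy as np

def metric_length(p, q, M):
    """Length of edge pq measured in the SPD metric M:
    l_M = sqrt((q - p)^T M (q - p))."""
    e = np.asarray(q, dtype=float) - np.asarray(p, dtype=float)
    return float(np.sqrt(e @ M @ e))

# Illustrative anisotropic metric: target edge size 0.1 along x and 1.0
# along y, encoded as M = diag(1/h_x^2, 1/h_y^2).
M = np.diag([1.0 / 0.1**2, 1.0 / 1.0**2])

# A unit edge along x has metric length 10 (too long -> refine), while
# the same edge along y has metric length 1 (just right).
print(metric_length((0, 0), (1, 0), M))  # 10.0
print(metric_length((0, 0), (0, 1), M))  # 1.0
```

An adaptation loop would split edges whose metric length exceeds some threshold (e.g. sqrt(2)) and collapse those below its reciprocal, driving all edges toward unit length in the metric.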

**Invited Speakers:**

**Kokichi Sugihara, University of Tokyo**

Kokichi Sugihara received B.Eng., M.Eng., and Dr.Eng. degrees in mathematical engineering from the University of Tokyo in 1971, 1973, and 1980, respectively. In 1973 he joined a computer vision group at the Electrotechnical Laboratory of the Japanese Ministry of International Trade and Industry. He moved to Nagoya University in 1981 and to the University of Tokyo in 1986. He is currently a professor in the Department of Mathematical Informatics of the Graduate School of Information Science and Technology at the University of Tokyo. His research interests include computational geometry (robust geometric computation in particular), computer graphics, and computer vision. He is a member of the Information Processing Society of Japan, the Operations Research Society of Japan, Japan SIAM, IEEE, and ACM.

**Use of Digital Topology for Robust Geometric Computation**

Geometric algorithms are fragile if implemented naively, because numerical errors generate inconsistencies in geometric structures. To overcome this difficulty, many approaches to robust implementation have been proposed; those that survive can be classified into two groups: the exact computation approach and the topology-based approach. The exact computation approach uses arithmetic of sufficiently high precision to always judge topological structures correctly. It is usually combined with symbolic perturbation to cope with degeneracy and with lazy evaluation to decrease computational cost. The topology-based approach, on the other hand, uses floating-point arithmetic but places higher priority on topological consistency than on numerical values, and thus avoids failures.

The topology-based approach has many merits: it never fails, because consistency is guaranteed from a topological point of view; it is fast, because floating-point arithmetic can be used; and it is simple, because degeneracy never arises. However, this approach is not as automatic as the exact computation approach, because topological invariants must be extracted from each individual geometric problem. This has been a bottleneck to its widespread use.
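The fragility the abstract refers to can be seen in the standard 2D orientation predicate: a point constructed by floating-point arithmetic can drift off a line it is meant to lie on, while exact rational arithmetic keeps the predicate consistent. A minimal sketch (the test points are illustrative, not from the talk):

```python
from fractions import Fraction

def orient2d(a, b, c):
    """Sign of (b-a) x (c-a): > 0 left turn, < 0 right turn, 0 collinear.
    Exact when given Fraction coordinates, rounded when given floats."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

# A point intended to lie on the line y = x, built by float addition:
a, b = (0.0, 0.0), (1.0, 1.0)
c_float = (0.1 + 0.2, 0.3)        # x silently becomes 0.30000000000000004
print(orient2d(a, b, c_float))    # tiny nonzero value: judged NOT collinear

# The same construction in exact rational arithmetic stays on the line:
af, bf = (Fraction(0), Fraction(0)), (Fraction(1), Fraction(1))
c_exact = (Fraction(1, 10) + Fraction(2, 10), Fraction(3, 10))
print(orient2d(af, bf, c_exact))  # 0: exactly collinear
```

The exact computation approach makes every such sign decision reliable (at a cost in speed); the topology-based approach instead accepts the rounded values and enforces a consistent combinatorial structure on top of them.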

Recently, we have improved this approach so that it can be used even by beginners. The basic idea is the use of digital topology: instead of finding topological invariants for a given problem, we first generate an approximate solution of the problem as a digital picture, then extract the topological structure from this approximation, and finally use it in the topology-based approach. We illustrate this approach with examples and discuss its applicability to mesh generation.

**Mike Holst, University of California, San Diego**

Professor Holst joined the UCSD Mathematics Department in Summer 1998. Prior to arriving at UCSD, he was an assistant professor at UC Irvine during 1997-1998, and from 1993-1997 he was a Prize Research Fellow and a von Karman Instructor of Applied Mathematics at the California Institute of Technology. Professor Holst was a UCSD Hellman Fellow in 1999, and was the recipient of an NSF CAREER Award during the period 1999-2004 for his research in computational and applied mathematics. He is currently PI, Co-PI, and/or on the steering committees for a number of interdisciplinary research projects and centers at UCSD and elsewhere, including:

- The La Jolla Interfaces in Science Program (http://ljis.ucsd.edu);
- The Center for Theoretical Biological Physics (http://ctbp.ucsd.edu);
- The National Biomedical Computation Resource (http://nbcr.ucsd.edu);
- The Bioinformatics Ph.D. Program (http://bioinformatics.ucsd.edu);
- The Southern California Applied Mathematics Symposium (SoCAMS);
- and the Computational and Applied Mathematics Research Group within the UCSD Mathematics Department (http://cam.ucsd.edu).

Professor Holst's general research background and interests are in a broad area called computational and applied mathematics; his specific research areas are partial differential equations (PDE), numerical analysis, approximation theory, applied analysis, and mathematical physics.

His research projects center around developing mathematical techniques (theoretical techniques in PDE and approximation theory) and mathematical algorithms (numerical methods) for using computers to solve certain types of mathematical problems called nonlinear PDE. These types of problems arise in nearly every area of science and engineering; this is just a reflection of the fact that physical systems that we try to manipulate (e.g., the flow of air over an airplane wing, or the chemical behavior of a drug molecule), or build (e.g., the wing itself, or a semiconductor), or simply study (such as the global climate, or the gravitational field around a black hole) are described mathematically by nonlinear PDE. In simple cases, these problems can be simplified so that purely mathematical techniques can be used to solve them, but in most cases they can only be solved using sophisticated mathematical algorithms designed for use with computers.

Computational simulation of PDE is now critical to almost all of science and engineering; the mathematicians provide the mathematical tools and understanding so that scientists in physics, chemistry, biology, engineering, and other areas can confidently use the modern techniques of computational science in the pursuit of new understanding in their fields of study. To learn more about Professor Holst's particular research program, please see his webpage: http://cam.ucsd.edu/~mholst

**Parallel Adaptive Finite Element Techniques**

We describe a low-communication approach to the use of adaptive finite element methods on parallel computers, developed jointly with R. Bank at UCSD. The algorithm handles load balancing in an a priori manner, and decouples the global elliptic problem into a set of independent subproblems. We give some numerical examples illustrating the approach, and then provide a rigorous analysis of the resulting solution quality. We give local and global error estimates for the solutions produced by the parallel algorithm by reinterpreting it as a partition of unity method, and by using local estimates from the approximation theory literature. The algorithm is applicable to general elliptic equations in 2D and 3D polyhedral domains.
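The partition-of-unity reinterpretation mentioned above can be illustrated schematically: each processor produces a solution that is accurate on its own subdomain, and the global solution is a blend with weights that sum to one and vanish where the corresponding local solution is inaccurate. A toy 1D sketch (not the Bank/Holst algorithm itself; all functions and subdomain boundaries are illustrative):

```python
import numpy as np

# Two "processors" each approximate the same function u on [0, 1]:
# accurate on their own subdomain, noisy elsewhere (illustrative noise).
x = np.linspace(0.0, 1.0, 101)
u = np.sin(np.pi * x)                    # target solution
rng = np.random.default_rng(0)
u1 = u + np.where(x <= 0.6, 0.0, 0.05 * rng.standard_normal(x.size))
u2 = u + np.where(x >= 0.4, 0.0, 0.05 * rng.standard_normal(x.size))

# Partition of unity: phi1 falls linearly from 1 to 0 across the
# overlap [0.4, 0.6]; phi2 = 1 - phi1, so phi1 + phi2 == 1 everywhere.
phi1 = np.clip((0.6 - x) / 0.2, 0.0, 1.0)
phi2 = 1.0 - phi1

u_glob = phi1 * u1 + phi2 * u2           # blended global solution
print(np.max(np.abs(u_glob - u)))        # ~0: each weight vanishes
                                         # where its summand is noisy
```

Because each weight is supported only where its local solution is accurate, the blended error inherits the local accuracy, which is the mechanism behind the local and global error estimates in the talk.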

**Ulisses T. Mello, IBM Research**

Ulisses T. Mello is a Research Scientist at IBM's T. J. Watson Research Center and Adjunct Associate Research Scientist at Columbia University's Lamont-Doherty Earth Observatory. He received his Ph.D. and M.A. in Geology from Columbia University, M.Sc. in Geology from the Federal University of Ouro Preto, and B.Sc. in Geology from the University of Sao Paulo, Brazil. From 1987 to 1994 he worked at the Petrobras Research Center on large-scale fluid and heat flow within sedimentary basins. He joined IBM Research in 1994 and currently works in the Mathematical Sciences department. He has more than 50 published papers, and in 1998 he received the American Association of Petroleum Geologists (AAPG) Wallace Pratt Award. His research interests are mainly in the numerical simulation of geological processes using unstructured meshes, and in the 3D representation of geological structures. Since 2000, Dr. Mello has also served as the IBM Research Relationship Manager for the Chemical and Petroleum Industry sector. He is a member of the American Association of Petroleum Geologists (AAPG), American Geophysical Union (AGU), Brazilian Association of Petroleum Geologists (ABGP), European Association of Geoscientists & Engineers (EAGE), Latin American Association of Organic Geochemistry (ALAGO), Society of Exploration Geophysicists (SEG), and Society of Petroleum Engineers (SPE).

**Challenges in Mesh Generation for Geological Objects**

Meshes representing geological objects are required for the solution of PDEs in reservoir simulation, basin modeling, seismic wave propagation, and geostatistics. Geological objects have complex geometry and are in general heterogeneous and anisotropic. In addition, these objects may present sharp discontinuities such as faults and fractures that have to be represented in numerical meshes. Geological features such as pinch-outs and erosion surfaces can be especially challenging because of their low angles and poor aspect ratios. Basin modeling requires modeling evolving and deforming bodies over time. Modeling the evolution of salt bodies can cause excessive deformation of meshes, and conservative remeshing techniques are required. The simulation of deposition and erosion processes during the formation of geological basins requires meshes that follow the evolution timelines of these basins, which is an additional constraint in the mesh generation process. This talk gives an overview of the approaches currently used to face some of the challenges in mesh generation for representing static and evolving geological objects.

**Bernd Hamann, University of California, Davis**

Bernd Hamann is a computer scientist specializing in visualization, geometric modeling, computer graphics, and virtual reality. He is a professor of Computer Science and an Associate Vice Chancellor for Research at the University of California, Davis. He received his Ph.D. in 1991 from Arizona State University, and has made contributions over the past decade primarily in the area of massive data representation and visualization.

**A Survey of Some State-of-the-art Visualization Technologies and Challenges for Interactive Data Exploration**

Currently, many scientific, engineering, and biomedical fields are undergoing a major "revolution": they are becoming extremely data-rich, yet techniques to interact effectively with very large data sets and to gain insight quickly are lacking. This talk discusses applications encountered in diverse areas that require novel approaches to massive data visualization and interaction.

**Banquet Speaker:** **Dipankar
Choudhury, Fluent Inc.**

Dr. Dipankar Choudhury is the Chief Technology Officer at Fluent Incorporated, a leading Computational Fluid Dynamics software and solutions provider. He is responsible for directing Fluent Inc.'s R&D and funded development activities. Prior to his appointment to his current CTO role, Dr. Choudhury held positions in software development, product management, overseas business development, consulting, and customer support. Dr. Choudhury obtained his Ph.D. in the area of Computational Fluid Dynamics and Heat Transfer from the University of Minnesota in 1987. He is a member of the ASME and the AIAA and has CFD-related publications in journals, conference proceedings, and trade magazines.

**The Central Role of Mesh Generation in Commercial Computational Fluid Dynamics Applications**

The use of commercial CFD analysis tools is now in its third decade. This talk
provides a perspective on how CFD analysis has been applied in industry,
the current status of the business, technology and software tools and their
relevance to the needs of end users. Some thoughts are provided on evolving
trends (both in technology as well as market needs and usage patterns) and
the major mesh generation challenges that have to be overcome in order for
CFD software to be employed more broadly in industrial applications.

IMR Home Page | Sandia National Laboratories