NASA Technical Memorandum 102890

December 1990

National Aeronautics and Space Administration
Ames Research Center
Moffett Field, California 94035-1000
NASA'S SUPERCOMPUTING EXPERIENCE

NASA Ames Research Center
Moffett Field, California, USA
ABSTRACT

A brief overview of NASA's recent experience in supercomputing is presented from two perspectives: early systems development and advanced supercomputing applications. NASA's role in supercomputing systems development is illustrated by discussion of activities carried out by the Numerical Aerodynamic Simulation Program. Current capabilities in advanced technology applications are illustrated with examples in turbulence physics, aerodynamics, aerothermodynamics, chemistry, and structural mechanics. Capabilities in science applications are illustrated by examples in astrophysics and atmospheric modeling. The paper concludes with a brief comment on future directions and NASA's new High Performance Computing Program.
1. Introduction

The National Aeronautics and Space Administration (NASA) has for many years been engaged in the development and application of supercomputing systems to solve challenging problems in science. Today there are supercomputer installations at every NASA research and development center, and their use has enabled scientific discoveries that could not have been made otherwise.

NASA's role in supercomputing systems development began with early parallel computers: the ILLIAC-IV at Ames Research Center and the CDC STAR-100 at Langley Research Center were among the first of a new generation of parallel computers. Architectural investigations continued into the 1980s with the Massively Parallel Processor, a massively parallel system developed for Goddard Space Flight Center, and the Cosmic Cube/Hypercube, parallel MIMD systems developed by the California Institute of Technology and NASA's Jet Propulsion Laboratory. In the 1980s NASA also established the Numerical Aerodynamic Simulation (NAS) Program to advance supercomputing technologies and to make them available to the aerospace community. The NAS Program, for example, acquired the first full-scale Cray-2 and Cray Y-MP computer systems and developed the first supercomputer UNIX environment.
NASA's involvement in supercomputing applications also began in the 1970s. Acquisition of the Illiac-IV, STAR-100, and Cray-1 computers spurred the development of innovative parallel computing algorithms. New computational disciplines, such as computational fluid dynamics and computational chemistry, soon emerged, and it became clear that supercomputers were a truly enabling technology. Today, supercomputers are being applied routinely to advance technologies in aeronautics, transatmospherics, space transportation, and space exploration, and to study the Earth, other planetary systems, and objects in outer space.
The purpose of this paper is to provide a brief overview of NASA's experience in supercomputing from two perspectives: supercomputing system development and advanced supercomputing applications. The NAS Program is used to illustrate NASA's effort to remain at the leading edge of supercomputing system technology. Current capabilities in advanced technology applications are illustrated through selected examples in turbulence physics, aerodynamics, aerothermodynamics, chemistry, and structural mechanics, and in science through examples in astrophysics and atmospheric modeling. Finally, the paper concludes with a look at future directions in high-performance computing.
2. The Numerical Aerodynamic Simulation Program

The NAS Program has two goals. The first goal is to provide a national computing capability, as a necessary element in ensuring continued leadership in computational fluid dynamics and related disciplines, to NASA, the Department of Defense (DOD), and other government agencies. The second goal is to act as a pathfinder in advanced, large-scale computing and to provide that service through systematic incorporation of state-of-the-art improvements in computer hardware and software technologies.

The NAS Program began full operation in 1986 and today serves over 1700 users at 133 locations in the United States. The NAS complex of computers, called the NAS Processing System Network (NPSN), is shown in Fig. 1. It currently includes Cray-2 supercomputers, an Amdahl 5880 mainframe computer, 2.3 terabytes of mass storage capacity, a 32,000-processor data-parallel Connection Machine prototype, an Intel Touchstone prototype parallel computer, VAX computers, Silicon Graphics workstations, and an extensive local-area network combining Ethernet, HYPERchannel, and other network technologies to provide data rates from 10 to 800 megabits/sec.
All computers operate under the UNIX operating system, and network communications are provided via well-known protocols; thus, a user can access any computer, run jobs, and transfer files using a single set of commands that is common to all computers. A more complete description of the NPSN is given in references 2 and 3.

Users access the NPSN through a wide choice of existing national communication networks, including the Internet, MILnet, the National Science Foundation's NSFnet, and NASA's own 1544 kbits/sec NASnet.
MILnet is the main communication path to DOD laboratories, while NSFnet and BARRnet serve university, Department of Energy, and industry users (BARRnet in the Bay Area). For NASA and aerospace technology users, the NAS Program has developed NASnet to provide fast response and high throughput to remote graphics workstation users. NASnet uses Ethernet bridging and currently serves 31 remote sites at bandwidths ranging from 56 to 1544 kbits/sec.
The NAS Program's involvement in advanced supercomputer technology began in 1984, when it decided to acquire Cray Research, Inc.'s Cray-2 computer, which was then in prototype testing with the Department of Energy. NAS took delivery of the first full-scale Cray-2, configured with 2.048 gigabytes (256 million 64-bit words) of central memory. The Cray-2's very large memory reduced the need for memory-disk transfers; the net effect was decreased time spent in the preparation of increasingly complex problems, plus improved job turnaround time. More importantly, researchers were able to perform fluid dynamics simulations about realistic three-dimensional vehicle configurations that were not possible just a few years before.
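As a quick check on the memory figure, the byte and word counts convert directly (assuming the Cray-2's 64-bit, i.e. 8-byte, words):

```python
# Cray-2 central memory: 2.048 gigabytes of 64-bit (8-byte) words.
bytes_total = 2.048e9
words = bytes_total / 8
print(f"{words:,.0f} 64-bit words")   # 256,000,000 words (256 million)
```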
Providing a common UNIX environment across all NPSN subsystems was a major design goal. Prior to this, UNIX had been implemented on minicomputers and workstations, while supercomputers ran proprietary operating systems. NAS implemented a Berkeley-based UNIX on the Cray-2, the first supercomputer UNIX implementation, and later added it to the other subsystems. Today, UNIX is a common environment for supercomputing. Moreover, UNIX coupled with networking provides a so-called seamless environment that enables users to move easily between systems, allows for easy file access and command initiation across machine boundaries, and enhances user-code portability.

Enhanced user productivity was another major NPSN design goal. To meet this goal, the NPSN was designed to include scientific workstations as the primary user interface, implemented by installing Silicon Graphics workstations and connecting them to the Cray-2 through a local area network. This was a unique implementation among supercomputer installations at the time and has since been replicated. With UNIX and network software, the workstation literally becomes a window into the supercomputer complex: the user interacts with the supercomputer from the workstation, where input data are prepared, jobs are submitted, and results are analyzed. The workstation's graphics display capability is especially beneficial where text and small numerical printouts cannot convey the qualitative content of the results; with graphics, researchers can digest results in a reasonable amount of time and form a picture of the physical phenomenon simulated. Numerous graphics tools have been developed to operate across workstations and Cray supercomputers in a distributed manner and to allow researchers to display any facet of their data for easy interpretation and evaluation 4-7. Today, interactive graphical post-data processing is an essential aspect of the sophisticated numerical simulation process.

Providing ready access by the national aerospace research community from their home locations was another NPSN design goal. This led to the development of NASnet, a unique high-performance nationwide communication network that links remote user locations to the NAS facility via land-links, with communication speeds up to 1544 kbits/sec. NASnet connects the communication lines from the NAS facility to local area networks at each remote site through Ethernet bridges and thereby creates an extension of the NPSN local network. The implementation of UNIX and network software on remote workstations provides an environment in which remote researchers have virtually the same interactive capability as users at the NAS facility itself. Presently, NASnet is being redesigned to utilize new internet gateway technology and thereby increase its performance to approximately 6200 kbits/sec.

Finally, the open architecture of the NPSN allows for modularity and the ready installation of new systems and capabilities. This design concept has proved beneficial and was essential in the installation of the new Cray Y-MP.
3. Computational Fluid Dynamics

Computational fluid dynamics has been one of the earliest drivers in advancing supercomputing. The introduction in the late 1960s of numerical techniques to treat flows with embedded shocks, such as transonic flow over lifting airfoils, led to greatly increased interest in computational solutions to fluid flows, and the discipline grew rapidly with improvements in both computer systems and algorithms. Fig. 2 shows that improvements in computer performance have been closely paralleled by improvements in numerical methods 9. The data shown here illustrate that, over a 15-year period, the cost of performing a given calculation decreased three orders of magnitude due to advances in computers and also three orders of magnitude due to advances in numerical methods. While the example is for the equations that govern the fluid dynamics of a gas, it is indicative of trends in other disciplines as well. Computational fluid dynamics is now an established tool within the aerospace industry, within which NASA and industry pursue the objectives of (a) advancing the fundamental understanding of basic fluid dynamic processes and (b) developing new methods for application to aerospace vehicles. Current applications and future directions are discussed below.
[Figure 2 is a log-scale plot of relative computation cost versus year of availability (1950-1990). One trend follows computers with a given algorithm (IBM 7090, IBM 7094, IBM 360-50, IBM 360-67, IBM 370-195, CDC 7600, CDC 205, ILLIAC-IV, Cray-1, Cray X-MP); the other follows algorithms with a given computer (2-D inviscid nonlinear through 3-D inviscid nonlinear methods).]

Fig. 2 Comparison of numerical simulation cost trends resulting from improvements in fluid dynamics algorithms and in computers.
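The combined effect of the two trends in Fig. 2 multiplies. A small sketch of the arithmetic (the factors are the orders of magnitude quoted in the text, not values read off the plot):

```python
# Cost reduction over the 15-year period discussed with Fig. 2:
computer_gain = 10**3    # ~3 orders of magnitude from faster computers
algorithm_gain = 10**3   # ~3 orders of magnitude from better numerical methods

combined = computer_gain * algorithm_gain
print(combined)                         # 1000000: a million-fold cost reduction
print(round(combined ** (1 / 15), 2))   # 2.51: average yearly improvement factor
```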
3.1 Turbulence Physics

Turbulence is the most common fluid motion encountered in nature. It greatly affects the motion of bodies moving through fluids, fluid mixing, and heat transfer. Because of this, turbulence has attracted the attention of some of the best minds for over a century; despite this attention, it is still not well enough understood to be controlled. If it could be controlled, the benefits would be substantial.

With the aid of supercomputers, accurate numerical solutions of the Navier-Stokes equations are possible for simple flows at low Reynolds numbers. Because these simulations are based on first principles, they can be treated as though they were experimental results, and they provide detail, such as the fluctuating pressure, that no measurement technique can provide. Numerical simulations of turbulent flow between flat plates, for example, have yielded insights into the statistical characteristics of significant flow features. This visualization capability is valuable both for identifying new relationships among the various elements of dominant flow structures and for suggesting new types of physical experiments. Fig. 3 shows the calculated vortical structures near a flat plate 13.
Fig. 3 Calculated vortical flow structures in a turbulent boundary layer.
The results were provided by direct (first principles) numerical simulation of a flat-plate boundary layer. The simulation provided velocity and pressure for each time step and for each of 9.4 million grid points in the computational domain. A simulation of this magnitude requires about 200 hours of time on a Cray-2 computer. Turbulence physicists have analyzed the results shown in Fig. 3 and have concluded that locations of significant contributions to Reynolds stresses occur adjacent to large hook-shaped vortical structures that are common in turbulent flows 13. Information of this sort aids in the development of statistical turbulence models that accurately reflect the underlying physical behavior of turbulence.
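The Reynolds stresses referred to above are correlations of velocity fluctuations. A minimal sketch of how one component, -⟨u'v'⟩, is estimated from sampled velocities; the data here are synthetic stand-ins, not the DNS fields of Fig. 3:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic velocity samples at one near-wall point: mean flow plus
# anti-correlated fluctuations (a crude stand-in for DNS time series).
n = 100_000
u_fluct = rng.normal(0.0, 1.0, n)
v_fluct = -0.3 * u_fluct + rng.normal(0.0, 0.5, n)
u = 10.0 + u_fluct            # streamwise velocity
v = 0.1 + v_fluct             # wall-normal velocity

# Reynolds decomposition: subtract the means, then correlate the fluctuations.
up, vp = u - u.mean(), v - v.mean()
reynolds_shear = -(up * vp).mean()    # -<u'v'>, the shear component

print(f"-<u'v'> ~ {reynolds_shear:.2f}")   # close to 0.3 for this synthetic data
```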
3.2 Aerodynamics

Computational aerodynamics is concerned with numerical simulations of aircraft flying at normal altitudes and at Mach numbers below five, where air behaves as a perfect gas. Due to computer limitations, computational aerodynamics is forced to employ approximate solutions to the governing equations, with models that trade levels of approximation against geometric complexity 9. As the approximations become less severe and the geometries more complex, the computer resources needed for a solution grow rapidly 15. Generally, formulations that neglect viscosity (inviscid) or treat viscosity only within the boundary layer are in widespread use for aerodynamically clean configurations at or near cruise conditions. Viscous terms become important when treating flows in which the boundary layer is separated from the surface. These flows occur, for example, in engine inlets and at high lift, and are associated with the most demanding portions of the flight envelope. The formulation receiving the most attention within NASA is the Reynolds-averaged Navier-Stokes (RANS) equations, which require modeling of the transport of the fluctuating flow quantities. Turbulence modeling continues to be a focus of experimental and computational research and is pacing progress of the RANS approach.
Increases in supercomputer performance and advances in computational aerodynamics during the last 15 years have dramatically affected the utility of the RANS approach. In 1975, a two-dimensional RANS computation of flow over a simple airfoil strained the capability of the CDC 7600, among the most powerful computers at that time. Increased computer performance made it possible to compute the flow over a wing by 1983, and by 1986 it was possible to compute 3-D flows about simple bodies on the Cray X-MP/4 (effective speed 200 MFLOPS) in about 15 hours. Today the Cray Y-MP/8 (effective speed approximately 1000 MFLOPS) permits computation of flows about very complex geometries, such as the integrated space shuttle configuration, in about 6 hours.
Results from the use of the RANS approximation to compute flow about the integrated space shuttle configuration in ascent are shown in Fig. 5, where computed surface pressures are compared with those measured in wind tunnel tests. The computed and measured results are in excellent agreement over large portions of this complex configuration. A limited amount of flight test pressure data is also available and is compared with the computed and wind tunnel results in Fig. 6. At this particular location along the side of the fuselage, and at a Mach number of 1.05, the computation is apparently in better agreement with the flight data than with the wind tunnel data; the disagreement with the wind tunnel data is localized and is considered to be due to wall interference. Although there is disagreement in localized regions, the RANS approach is considered accurate enough to predict the aerodynamic characteristics of the shuttle within the current flight envelope and to evaluate proposed emergency maneuvers.
Fig. 5 Comparison of pressure contours between computation (outermost) and wind tunnel (innermost); M∞ = 1.05, α = -3°, and Re = 4.0 x 10⁶/ft (3% model). (a) Side view, (b) top view.
4. Aerothermodynamics
Aerothermodynamics combines the discipline of computational fluid dynamics with the modeling of real gas properties. Vehicles flying at Mach numbers greater than approximately five create shock waves strong enough to cause the chemistry of the air to change significantly. Therefore, to adequately simulate such flows, additional equations governing the behavior of the real gas chemistry must be added to the Navier-Stokes equations. Clearly, the level of computational complexity increases over that of computational fluid dynamics alone.
Fig. 6 Comparison of pressure coefficient (Cp) from computation (solid line), wind tunnel (o), and flight test (∇ right side, Δ left side) along the θ = 70° line on the side of the orbiter fuselage; M∞ = 1.05, α = -3°, Re = 4.0 x 10⁶/ft, and 10/9 elevon deflection.

The goal of computational aerothermodynamics is to predict aerothermal environments both in the case of external flow around vehicles and the internal flow within
propulsion units. The physical phenomena which must be addressed include aerodynamic forces and moments, convective and radiative heating rates, gas and surface catalytic interactions, interactions with Thermal Protection System (TPS) materials, active cooling of surfaces, and plasma layers and their effect on electronic communications to and from the vehicle. Understanding these phenomena is critical to the design of the next generation of aerospace transportation systems and to ensuring that they are adequately protected from the very high thermal loads encountered in hypervelocity flight. At the current state-of-the-art, the RANS equations coupled with real-gas effects form the basis for the solutions to these phenomena. Real-gas effects include chemical and thermal nonequilibrium phenomena in which finite-rate thermochemical and energy-exchange processes occur. To account for the reactions, conservation equations for each chemical species must typically be added to the RANS equation set. For a relatively simple case of dissociating and ionizing air, one must consider nearly nine species; including the associated species conservation equations nearly triples the number of equations to be solved. In addition, to account for thermal
nonequilibrium there are additional energy equations to describe the energy exchange between the various energy modes (translational, rotational, vibrational, and electronic). Finally, when thermal radiation is important, the radiative flux must be included in the energy equation.

Supercomputers have been essential in the development of computational aerothermodynamics and have permitted the modeling of complex 3-D flow fields with finite-rate chemistry. As an example, Fig. 7 shows the thermochemical nonequilibrium flow field about the Aeroassist Flight Experiment (AFE) vehicle at one of its trajectory points 18. The left side of the figure shows computed normalized density contours. Superimposed on these are circles representing the location of the bow shock as determined from experimental shadowgraphs obtained from ballistic range tests. The experimental and computational bow shock locations are nearly identical. Shown on the right side of the figure is a comparison of the vibrational and translational stagnation line temperatures. As illustrated, there is a significant difference in the two temperatures. This difference, depending on the entry conditions, can result in a significant increase in the total radiative heating to the vehicle 19.
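The growth in system size described above can be tallied directly. The counts below are representative values taken from the text (five perfect-gas conservation equations, nine air species, two extra energy equations), not any particular solver's configuration:

```python
# Illustrative equation count per grid point for a real-gas RANS calculation.
rans = 5                 # mass, momentum (x3), total energy
species = 9              # dissociating and ionizing air
energy_modes = 2         # e.g., vibrational and electronic energy equations

total = rans + species + energy_modes
print(total)             # 16 equations per grid point
print(total / rans)      # 3.2: roughly triple the perfect-gas system
```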
Fig. 7 Thermochemical nonequilibrium flow field about the Aeroassist Flight Experiment (AFE) vehicle. (a) Computed density contours, computed and experimental bow shock locations; (b) vibrational and translational temperatures along the stagnation line.
5. Computational Chemistry

During the past decade, computational chemistry has become a dynamic and versatile discipline within NASA. Computational chemistry methods are being used to predict the fundamental nature of a variety of substances, notably high-temperature materials, and to compute radiative intensity factors, transport properties, reaction rates, and rates of energy exchange by collisions. These data are required by computational aerothermodynamics to treat both the external and internal chemistry of aerospace vehicles. In addition, computational chemistry methods are being used to study and design space-based materials resistant to the harsh environment of space and, in particular, to attack by energetic oxygen atoms.

Computational chemistry involves solving the Schrödinger equation to determine the properties of atoms, molecules, and matter. The methods used to solve this equation are well documented and will not be described here. It should be noted, however, that supercomputers have had a profound influence on computational chemistry. The performance of the Cray-2 and Cray Y-MP, for example, has given researchers the ability to perform Full Configuration Interaction (FCI) calculations with much larger basis sets than could previously be considered. The FCI calculations have been used for the evaluation of approximate CI techniques, and researchers have demonstrated that realistic approximate CI wave functions can reproduce FCI results, tremendously improving confidence in computed predictions. Previously, if a desired quantity could not be obtained from experiment, computed values were often too uncertain to be useful.
High-temperature reaction rate constants for air and hydrogen species are crucial to the accurate modeling of the engine performance of hypersonic vehicles, yet such rates are often difficult, and at high temperatures sometimes almost impossible, to measure; the available experimental data are often uncertain by several orders of magnitude. As shown in Fig. 8, the calculated rate constant for the three-body recombination of H + H + H2 is compared with a variety of experimental data, in which the scatter at high temperatures is often orders of magnitude. Calculations like these help to identify the most probable values as well as helping to clarify the interpretation of experimental data.

[Figure 8 plots the rate constant against temperature from 50 to 5000 K, showing the present calculation, experimental data, and most probable values.]

Fig. 8 Comparison of calculated and experimental rate constants for the three-body recombination of H + H + H2.
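Fitted rate constants like the one in Fig. 8 are commonly expressed in modified Arrhenius form, k(T) = A·Tⁿ·exp(-Ea/RT). The sketch below evaluates such a fit with placeholder coefficients; they are illustrative only, not the paper's values for H + H + H2:

```python
import math

A, n, Ea = 1.0e18, -1.0, 0.0   # hypothetical fit: recombination-like, no barrier
R = 1.987                      # gas constant, cal/(mol K)

def k(T):
    """Modified Arrhenius rate constant."""
    return A * T**n * math.exp(-Ea / (R * T))

# Recombination rates typically fall with temperature (negative T exponent):
print(f"k(300 K)  = {k(300.0):.2e}")
print(f"k(5000 K) = {k(5000.0):.2e}")
```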
6. Computational Structural Mechanics

Computational structural mechanics is another major computational thrust within NASA that has been greatly influenced by supercomputers 22. The finite element method, developed by the aerospace industry, is the predominant technique used for the analysis of solid and structural mechanics problems. In the early 1970s, NASA became heavily involved in finite element modeling through the development of the widely used NASTRAN finite element program. Today, supercomputers are widely used to solve finite element problems for both the static and dynamic analysis of structures, employing linear and nonlinear formulations; application areas include advanced vehicles, propulsion systems, and composite structures. The increased capability of supercomputers has in turn increased the emphasis placed on computational methods in structural analysis.

A notable application was the structural analysis of the space shuttle solid rocket booster (SRB) case joint following the accident that destroyed the Challenger.
The accident was believed to have been caused by the failure of a case joint in the solid rocket motor 23.
An analysis of the joint was performed at three levels of detail 24, as shown in Fig. 9. The first level consisted of a 2-D stress analysis for a shell model of the entire SRB that included an equivalent stiffness model of the joint. At the second level, the displacements calculated at level 1 were used as boundary conditions for a more refined 2-D shell model of an SRB segment including the joint. Finally, the displacement condition calculated at level 2 was used as a boundary condition at the third level for a detailed solid finite element model of a 1-degree circumferential slice of the joint, for both the original and the redesigned joint. The results showed how gap motion at the joint affected the O-ring sealing surface, and the researchers used them in preparation for the redesign. The structural analyses were performed by researchers at NASA Langley Research Center, with model preparation and 2-D shell analysis performed locally and the detailed solutions performed on the Cray computers at the NAS facility. This effort, performed in 1986, was a graphic illustration of how effectively supercomputers can be used from far remote locations. The solution time for the full SRB model was 14 hours on a VAX 11/780 computer and was reduced to 14 minutes
Fig. 9 Three levels of modeling detail for solid rocket booster (SRB) case joint structural analysis.
when fully vectorized on a Cray supercomputer. More recent algorithm improvements have reduced the time to 13 seconds on the Cray-2 using 4 processors and microtasking, and further still on the Cray Y-MP using 8 processors and microtasking. In the latter case the floating-point calculations for matrix factorization took 6 seconds, a rate of 1.517 billion operations per second.
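The quoted timings imply the following speedups and operation count (simple arithmetic on the numbers in the text):

```python
# Solution times for the full SRB model, from the text.
vax_seconds = 14 * 3600    # 14 hours, VAX 11/780
cray_vec = 14 * 60         # 14 minutes, vectorized Cray run
cray2_micro = 13           # 13 seconds, Cray-2, 4 processors + microtasking

print(vax_seconds / cray_vec)          # 60.0: vectorized supercomputer vs. VAX
print(round(cray_vec / cray2_micro))   # 65: further gain from algorithms + microtasking

# Operations implied by the matrix-factorization figures:
ops = 1.517e9 * 6.0                    # 1.517 GFLOPS sustained for 6 seconds
print(f"{ops:.2e}")                    # 9.10e+09 floating-point operations
```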
7. Astrophysics

Supercomputers are also heavily used within NASA to investigate astrophysical phenomena. Computation has proved to be an essential scientific method, alongside analytic theory and observation, for investigating the formation of objects (stars, galaxies, radio sources, black holes, and X-ray sources) and phenomena that cannot be reproduced in the laboratory, and it has affected nearly every area of astrophysics.

The application of supercomputers to astronomy is too broad a topic to cover here; thus, attention is focused on the computer modeling of galactic formation and evolution 25. Three features drive present research efforts toward the computational power of supercomputers: (a) the bulk of raw observational data, requiring reduction, enhancement, and analysis; (b) greatly improved observational detail at all wavelengths, requiring source modeling; and (c) complex nonlinear models of the interaction of many physical processes, requiring high-speed global simulation.

The dynamics of the stellar population give rise to the gravitational n-body problem, which plays a central role when galaxies collide. Stars make up the bulk of the mass of the galaxies of interest, and they move freely through the system: the collision mean free path is large compared with the dimensions of the system. The study of the stellar dynamics is then the study of the gravitational n-body problem. The n-bodies are the stars, and they act like mass-points, or particles, that interact with the 1/r² forces of Newtonian gravitation. The models used for numerical experiments describe the time development of the self-gravitational particle system.
Early experiments dealt with 2-D systems, but with the advent of supercomputers in the late 1970s, fully 3-D experiments were made possible. The number of particles followed in an experiment is typically 100,000. The forces on the particles are calculated from potentials that are obtained by solving the Poisson equation of the star densities on a grid; these potentials are calculated at each step, and the equations of motion are integrated by the time-centered leapfrog method. The approach described above has been used to study galaxy formation and internal dynamics and the interaction of galaxies 26. One example is the study of star formation stimulated by the collision of galaxies.
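The time-centered leapfrog integration described above can be sketched in a few lines. This toy version uses a direct particle-particle force sum and a handful of bodies; the production codes instead obtain the potential from a Poisson solve on a grid with ~100,000 particles. Units are nondimensional with G = 1, and the softening length eps is a standard device for taming close encounters, not a detail from the text:

```python
import numpy as np

def leapfrog(pos, vel, masses, dt, steps, eps=0.05):
    """Time-centered leapfrog for a self-gravitating particle system (G = 1)."""
    def accel(p):
        d = p[np.newaxis, :, :] - p[:, np.newaxis, :]   # d[i, j] = p[j] - p[i]
        r2 = (d ** 2).sum(axis=-1) + eps ** 2           # softened squared distance
        np.fill_diagonal(r2, np.inf)                    # exclude self-interaction
        return (d * (masses[np.newaxis, :, np.newaxis]
                     / r2[..., np.newaxis] ** 1.5)).sum(axis=1)

    vel = vel + 0.5 * dt * accel(pos)                   # half-step kick
    for _ in range(steps):
        pos = pos + dt * vel                            # drift
        vel = vel + dt * accel(pos)                     # kick
    return pos, vel - 0.5 * dt * accel(pos)             # re-center velocities

# Two equal masses on a near-circular orbit; the separation should stay ~1.
masses = np.array([1.0, 1.0])
pos = np.array([[-0.5, 0.0], [0.5, 0.0]])
vel = np.array([[0.0, -0.707], [0.0, 0.707]])           # ~circular orbital speed
pos, vel = leapfrog(pos, vel, masses, dt=0.01, steps=2000)
print(round(float(np.linalg.norm(pos[0] - pos[1])), 2))  # stays near 1.0
```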
Observations by the Infrared Astronomy Satellite have revealed a class of very luminous interacting galaxies which appear to be starburst galaxies. These results have stimulated research on the nature of the interactions and the ways in which the star-formation process can be stimulated through the interaction. An example of an optical photograph of an interacting pair of galaxies is shown in Fig. 10. The photograph was made in such a way as to emphasize the regions of high star formation. The observations show vividly the violent disruption of the two galaxies and the regions of high current star formation, particularly at the centers of the two galaxies. One of the computational approaches to understanding the starburst phenomenon lies in understanding the dynamical interaction of two spiral galaxies embedded in massive halos by numerical experiments. For example, one can follow the dynamical interaction of two disk galaxies to understand the nature of the contraction and subsequent expansion of the two galaxies. Fig. 11 shows four snapshots of a collision sequence in a numerical
Fig. 10 Interacting galaxies (UGC), courtesy of Dr. H. Bushouse, NASA Ames Research Center.
experiment. The galaxies were oriented in the plane of the orbit, and both the disk particles and the halo particles were included in the calculation (100,000 particles); those shown are a sampling (2048). Times are shown in the bottom counter in nondimensional units, where one unit corresponds to roughly 150 million years. Understanding the nature of the interaction and the time variations of the densities and momenta in the two galaxies is the first step in a longer-term project toward understanding the nature of the star-formation process in these systems.
8. Atmospheric Modeling

Atmospheric modeling on supercomputers has become a principal tool in predicting weather, studying climate change, and assessing the influence of man's activity on the environment. The importance of this activity is clearly evident in light of the need to understand global change. NASA has a number of modeling efforts, including studies of changes in the Earth's atmospheric composition and of the atmospheres of the other planets.
Mars provides an example. Spacecraft observations have revealed a small amount of water in its atmosphere that, over the course of a year, takes part in an active hydrological cycle in which water is exchanged between the atmosphere and various other reservoirs. The nature and distribution of these other reservoirs is not entirely clear, but it is known that one of them is the "residual" north polar cap, the one that survives the summer and remains all year long, not the seasonal cap that waxes and wanes with the seasons and finally sublimes in late spring. Surprisingly, the residual cap is made of water ice, not carbon dioxide frost; in other words, when the seasonal CO2 cap leaves, it leaves behind a water ice cap that serves as a supply for atmospheric water vapor. For reasons not understood, this behavior is asymmetric between the poles; there have been suggestions as to the cause, each of which depends on atmospheric transport, and modeling the atmospheric transport may therefore carry the key to the problem.
The rea-
are important
physical
processes
that occur,
the behavior
as transport.
to the problem
by an order researchers
of magnitude. at Ames problems. The of polar speed Research They and mixing Center have memory processes, have been a of these cloud
the Cray
at NAS the
to attack examination
these models.
constructed
as various enabled
physical
supercomputers
19
formation, and boundary layer exchange at a level of detail not previously possible 27. For example, the ability of the atmosphere to mix water vapor into the polar regions during winter is a key issue concerning the current Mars climate system. In Fig. 12, contours of a nearly conserved dynamical quantity (isentropic potential vorticity, IPV) are shown to illustrate this mixing process. The IPV field is taken from a simulation of the Martian winter circulation using a 3-D climate model being developed by researchers at Ames Research Center and based on the work in reference 27. The center of the figure is the north pole, while the edges are at about 30 degrees north latitude. The large gradients in
the IPV field form a ribbon that represents a barrier to large-scale mixing, which waves must break down if mixing is to occur. At the time shown in the experiment there is a wave disturbance, but its amplitude is weak, suggesting that the Martian winter polar circulation, like the Earth's Antarctic circulation, is stable and sluggish, and that small disturbances do not readily mix material into the polar region. Similarly, water transported down to the ground does not readily return; by running a coupled boundary layer/soil model at high vertical and temporal resolution, a physical process has been discovered that probably limits this return. Findings like these will significantly affect the field and open up new areas of study before definitive statements can be made about recent transport and cloud studies.
9. A Look to the Future

To meet future challenges in many areas of computational science, NASA has initiated the High Performance Computing (HPC) Program. The goal of the HPC Program is to maintain NASA's leadership in high-performance computing and its application to science. In computational fluid dynamics, for example, it has been known for years that significant increases in computing power are required to develop tools that capture the physics of aerodynamic flows about realistic vehicles by solving the governing equations. Recently, multidisciplinary methods have been developed and used to
perform computer simulations of coupled phenomena such as those needed to study wing flutter. Flutter is the motion due to the complex interaction of the instantaneous changes in wing shape due to aerodynamic forces and the instantaneous changes in aerodynamic forces due to wing shape. The effect of flutter can be catastrophic if dynamic forces exceed structural limits. However, flutter can, in some cases, be suppressed by coupled activation of control surfaces. Simulation of flutter control requires combining fluid dynamics, structural mechanics, and control laws into a time-dependent formulation. One example of the simulation of wing flutter control is presented in Fig. 13. In this work 29, the inviscid form of the fluid dynamic equations (the nonlinear full potential equation), the modal structural equations of motion, and a simple control law were simulated in a fully coupled manner.
[Fig. 13 panels: aeroelastically oscillating wing; control surface activated; active control suppresses oscillation.]

Fig. 13. Computed pressures on a wing with the control surface activated to suppress flutter, simulated in a fully coupled manner. Wing flutter is established with the control surface fixed (upper figure). Next, the control surface is activated (middle) so that its instantaneous deflections reduce the oscillation and maintain the wing steady (bottom). The various shades on the wing surface represent the levels of instantaneous pressure on the wing.
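The fully coupled formulation can be illustrated with a deliberately tiny model. The sketch below is not the method of reference [29]; it is a hypothetical one-degree-of-freedom pitch oscillator in which a quasi-steady aerodynamic moment feeds energy into the structure (the flutter mechanism) and a rate-feedback control-surface law removes it. All coefficients are invented for illustration.

```python
def simulate(control_on, steps=20000, dt=1e-3):
    """Peak pitch amplitude of a toy aeroelastic oscillator.

    Structural model: I*theta'' + c*theta' + k*theta = M_aero + M_ctrl,
    with an aerodynamic moment proportional to pitch rate (destabilizing)
    and, optionally, a control-surface moment that opposes the pitch rate.
    """
    I, c, k = 1.0, 0.02, 25.0   # inertia, structural damping, stiffness (assumed)
    ca = 0.08                   # aerodynamic "negative damping" coefficient (assumed)
    gain = 0.3                  # control-surface feedback gain (assumed)
    theta, omega = 0.01, 0.0    # small initial disturbance
    peak = 0.0
    for _ in range(steps):
        m_aero = ca * omega                            # destabilizing aerodynamic moment
        delta = -gain * omega if control_on else 0.0   # instantaneous control deflection
        m_ctrl = delta                                 # moment produced by the deflection
        alpha = (m_aero + m_ctrl - c * omega - k * theta) / I
        omega += alpha * dt                            # semi-implicit Euler step
        theta += omega * dt
        peak = max(peak, abs(theta))
    return peak

# Without control the oscillation grows; with control it is suppressed.
print(simulate(control_on=False) > simulate(control_on=True))  # True
```

Each time step advances the structural state using the current aerodynamic and control moments, mirroring the time-dependent coupling of fluid, structure, and control law described in the text.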
The example above was a highly approximate treatment for a simple shape. Today's aeroscience challenge is to provide integrated, multidisciplinary simulations of aerospace vehicle systems throughout their complete mission profiles. This requires the simulation of aerodynamics, propulsion, structural response, controls, and chemistry in an integrated manner. It is estimated that these calculations will take tens of hours on computer systems capable of sustaining a trillion floating-point operations per second (teraFLOPS).

In the space and Earth sciences, NASA has made significant strides in weather/climate modeling, solid Earth sciences, and space sciences. Today's challenges are the multidisciplinary modeling of complex nonlinear phenomena: understanding the impact of man on global changes, modeling formation and evolution, assessing their potential effects, and better analysis of multi-spectral/multi-sensor data. Some of the computational facilities will have to be located in a flight environment, as in multi-sensor satellite systems and space exploration missions, e.g., Mars Rover sample return, which will require ultra-reliable spaceborne computers. These specific grand challenges and others facing the Agency provide a computational imperative for a NASA initiative in High Performance Computing.
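For scale, the teraFLOPS estimate above can be checked with simple arithmetic. The single-processor sustained rate below is an assumed, illustrative figure for a fast processor of the era, not a number from this report:

```python
# Total work implied by "tens of hours" at a sustained teraFLOPS.
teraflops = 1.0e12            # 10^12 floating-point operations per second
hours = 10.0                  # take "tens of hours" as 10 for a round figure
total_ops = teraflops * hours * 3600.0
print(f"total work: {total_ops:.1e} operations")   # total work: 3.6e+16 operations

# Illustrative comparison: the same job on a single fast conventional
# processor, assuming ~250 MFLOPS sustained (an assumption, not a measurement).
single_proc = 250.0e6
years = total_ops / single_proc / (3600.0 * 24.0 * 365.0)
print(f"roughly {years:.0f} years on one such processor")
```

Under these assumptions the same calculation would occupy a single conventional processor for several years, which is why sustained teraFLOPS systems are the stated target.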
The basic approach to be taken by the HPC Program is to fully exploit highly parallel computing technology by focusing research on computer architecture, parallel algorithms, and systems technology. The conventional approach to supercomputer performance depends on a small number of very fast processors sharing a very fast common memory. This approach will be severely limited in the future by electronic technology limits (speed of light, circuit density, packaging) and by costs. Highly parallel architectures greatly increase parallelism, relax processor-to-memory bandwidth demands, and share development costs with the larger microcomputer market. Parallel processing research has been ongoing within the NAS program and within other NASA programs for the past few years. The primary thrust has been to evaluate parallel processing architectures and to develop efficient parallel applications. Results have been encouraging. For example, results have been obtained that show a 16,000-processor CM-2 will out-perform a Cray-2 single processor for certain large computational fluid dynamics applications [30].
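The CM-2 versus Cray-2 comparison comes down to aggregate throughput. The rates below are rough assumptions chosen only to illustrate the arithmetic, not measurements from reference [30]:

```python
# Why 16,000 modest processors can beat one very fast processor on a
# sufficiently parallel problem (all rates are illustrative assumptions).
cray2_single = 250e6      # assumed sustained FLOPS of one Cray-2 processor
cm2_per_pe = 50e3         # assumed sustained FLOPS per CM-2 processing element
pes = 16_000              # processing elements in the CM-2 cited in the text
efficiency = 0.5          # fraction of ideal speedup retained on a real problem

cm2_aggregate = pes * cm2_per_pe * efficiency
print(cm2_aggregate > cray2_single)  # True under these assumptions
print(f"aggregate: {cm2_aggregate/1e6:.0f} MFLOPS vs {cray2_single/1e6:.0f} MFLOPS")
```

The design trade is explicit here: even with half the ideal speedup lost to communication and load imbalance, the aggregate of many slow processing elements exceeds the single fast processor.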
Developing parallel algorithms and application concepts capable of fully utilizing highly parallel architectures is another objective of the program. High-performance, parallel processor testbed facilities will be implemented for developing software. They will be scalable, so that the initial feasibility of new algorithms can be demonstrated at reduced scales and at reduced cost. The applicability of the new algorithms will be demonstrated on selected applications. Three areas have been chosen: (a) multidisciplinary simulation of aerospace vehicles; (b) monitoring and modeling of the Earth, its environment, and its global impact on the future; and (c) remote missions for extended-duration human exploration.

Finally, basic research and experimentation is crucial to achieving the innovative approaches needed to obtain these new capabilities. The computationally oriented research infrastructure within NASA and at several research institutes will be enhanced.

In summary, NASA's needs in aeronautics and in the Earth and space sciences require a teraFLOPS increase in speed over today's state-of-the-art computers. The HPC Program is aimed at meeting these needs; it will enable achievements not possible before, increase safety, accelerate the growth of the technology, and maximize the ability of the Agency to benefit from this powerful capability.
10. References

1. "... Status of the Numerical Aerodynamic Simulation Program," Proceedings ... (..., Mass., ...).
2. P. Kutler, "... Year," Proceedings of the Third ... Sciences (Jerusalem, ..., 15-20, ...).
3. D. Choi and C. Levit, "... Interactive Graphics System ...," International Journal of Supercomputing Applications.
4. S. E. Rogers, ..., "Distributed Interactive Graphics Applications in Computational Fluid Dynamics," International Journal of Supercomputing Applications (1987).
5. T. Lasinski, ..., "... Visualization of Flow ..." (1987).
6. V. Watson, ..., "Use of Computer Graphics ... Workstations ...," AIAA Paper 87-1180-CP (1987).
7. G. Bancroft, ..., S. Rogers, "...," Aerospace Engineering ...
8. R. Magnus and H. Yoshihara, "Inviscid ... Over Airfoils," AIAA J., 8 (1970) pp. 2157-2162.
9. "Impact of Computers on Aerodynamic Research and Development," Proc. ..., 72 (1984) pp. 68-79.
10. "Status of Computational Fluid Dynamics in ...," AIAA Paper 87-1135-CP (1987).
11. P. Kutler, "... Computational Fluid Dynamics: Progress and Future ...," NASA TM-100091 (1988).
12. ..., Proc. ..., eds. E. M. ...
13. S. J. Kline, ..., "... Structures in a Numerically Simulated ... Boundary Layer," ... First Congress ...
14. P. R. Spalart, "... Simulation of a Turbulent Boundary Layer up to Rθ = 1410," NASA TM-89407 (1986).
15. V. L. Peterson, A. B. Watson, ..., and F. R. Bailey, "Supercomputer ... for Selected Disciplines Important to Aerospace," ...
16. T. Holst, "... Navier-Stokes ... in Aircraft ...," AIAA Paper 90-1800 (1990).
17. P. G. Buning, I. T. Chiu, F. W. Martin, Jr., R. L. Meakin, S. Obayashi, J. S. Steger, and M. Yarrow, "Flow Field Simulation ... Ascent," Proc. Fourth International Conference on Supercomputing, Vol. II (International Supercomputing Institute, Inc., 1989) pp. 20-28.
18. G. Palmer, "... Enhanced Thermochemical ... Around ...," ...
19. J. N. Moss, "... an Aeroassist Flight Experiment," ... Paper ...
20. C. W. Bauschlicher, S. R. Langhoff, and P. R. Taylor, "Accurate Quantum Mechanical Calculations," Advances in Chemical Physics, 77 (1990) p. 103.
21. D. W. Schwenke, "Calculations of Rate Constants for the Three-Body Recombination of H2 in the Presence of H2," J. Chem. Phys., 89 (1988) p. 2076.
22. "... Needs for Structural ...," ... International Conference on Supercomputing, ... Vol. II ...
23. Report on the Space Shuttle Challenger Accident (Washington, ...).
24. "... Behavior of the ...," ...
25. R. H. Miller, "... Formation and ...," Proc. Fourth International Conference on Supercomputing, Vol. II (International Supercomputing Institute, Inc., 1989).
26. B. F. Smith and R. H. Miller, "... to Galaxy Evolution," Proc. Second International Conference on Supercomputing (International Supercomputing Institute, Inc., ...).
27. R. M. Haberle, J. Schaeffer, H. Lee, ..., "... of the Martian Atmosphere ... Polar Processes," J. Geophys. Res., ..., pp. 1447-1473.
28. ... and B. M. Jakosky, "Sublimation and Transport of Water from the Polar Cap on Mars," J. Geophys. Res., 95 (1990).
29. "... Transonic Aeroelasticity ...," AIAA Paper 87-0709-CP (1987).
30. C. Levit and D. Jesperson, "... Implicit ... Computer," Computational Structural Mechanics ... and Trends, Special Issue of Computers and Structures (1988).
Report Documentation Page (NASA Form 1626, Oct 86)

1. Report No.: NASA TM-102890
2. Government Accession No.:
3. Recipient's Catalog No.:
4. Title: NASA's Supercomputing Experience
5. Report Date: December 1990
7. Author(s): F. Ron Bailey
8. Performing Organization Report No.: A-91024
9. Performing Organization Name and Address: Ames Research Center, Moffett Field, CA 94035-1000
10. Work Unit No.: 505-59
11. Contract or Grant No.:
12. Sponsoring Agency Name and Address: National Aeronautics and Space Administration, Washington, DC 20546-0001
13. Type of Report and Period Covered: Technical Memorandum
15. Supplementary Notes: Point of Contact: Ames Research Center, MS 200-4, Moffett Field, CA 94035-1000. Presented at Supercomputing ..., Singapore.
16. Abstract: (see report abstract)
17. Key Words (Suggested by Author(s)): ...
18. Distribution Statement: Unclassified-Unlimited. Subject Category 59
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 30
22. Price: A03

For sale by the National Technical Information Service, Springfield, Virginia 22161