University Politehnica of Bucharest
Faculty of Automatic Control and Computers
Computer Science and Engineering Department
Diploma Thesis
Optimization of Simulation
Models in Hydrology
WRF-Hydro
by
Sorina-Nicoleta Stoian
Supervisors: Prof. Dr. Eng. Emil Slușanschi,
As. Drd. Eng. Mihai Carabaș
Bucharest, July 2015
Contents
Contents i
List of Figures iii
List of Tables iv
1 Introduction 2
1.1 General perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2 State of the Art and Related Work 4
2.1 The Weather Research and Forecasting Model (WRF) . . . . . . . 4
2.2 The WRF-Hydro Simulation Model . . . . . . . . . . . . . . . . . . 5
2.3 Variable Infiltration Capacity (VIC) . . . . . . . . . . . . . . . . . 6
2.4 Global Forecast System (GFS) . . . . . . . . . . . . . . . . . . . . 6
3 Hardware and Software Configuration 8
3.1 Weather Research and Forecasting Model (WRF) . . . . . . . . . . 8
3.2 The WRF-Hydro Modeling System . . . . . . . . . . . . . . . . . . 9
3.3 WRF Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.3.1 Initialization for idealized cases . . . . . . . . . . . . . . . . 11
3.3.2 Initialization for real data cases . . . . . . . . . . . . . . . . 12
3.4 WRF Preprocessing System . . . . . . . . . . . . . . . . . . . . . . 12
3.4.1 The geogrid program . . . . . . . . . . . . . . . . . . . . . . 13
3.4.2 The ungrib program . . . . . . . . . . . . . . . . . . . . . . 13
3.4.3 The metgrid program . . . . . . . . . . . . . . . . . . . . . 14
3.5 Nesting in WRF . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.6 WRF-Hydro on NCIT cluster . . . . . . . . . . . . . . . . . . . . . 15
3.6.1 The NCIT Cluster . . . . . . . . . . . . . . . . . . . . . . . 15
3.6.2 NetCDF . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.7 WRF/WRF-Hydro . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.7.1 Compiling WPS . . . . . . . . . . . . . . . . . . . . . . . . 19
3.7.2 Running WRF-Hydro idealized . . . . . . . . . . . . . . . . 19
3.7.3 Running WRF-Hydro real data case . . . . . . . . . . . . . 19
3.7.4 Running examples . . . . . . . . . . . . . . . . . . . . . . . 20
4 Case studies 21
4.1 Results 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.2 Results 2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.3 Comparing obtained results . . . . . . . . . . . . . . . . . . . . . . 21
5 Conclusions and future work 24
Bibliography 25
Appendices 27
.1 Appendix 1 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Listings 29
List of Figures
2.1 Schematic of the modularized WRF-Hydro modeling framework . . . . 6
2.2 Global Forecast System Output . . . . . . . . . . . . . . . . . . . . . . 7
3.1 Schematic of parallel domain decomposition scheme in WRF-Hydro.
Boundary arrays in which memory is shared between processors (P1
and P2) are shaded in purple. . . . . . . . . . . . . . . . . . . . . . . . 9
3.2 Schematic of parallel domain decomposition scheme in WRF-Hydro
applied to channel routing. The black and red stars indicate channel
elements that overlap. . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.3 Schematic illustrating the coupling between WRF and WRF-Hydro. . 11
3.4 WPS data flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
3.5 Comparison between basic and nested simulations for a 6h forecast. . 15
4.1 Run time results on nehalem queue . . . . . . . . . . . . . . . . . . . . 22
4.2 Run time results on the opteron queue . . . . . . . . . . . . . . . . . . 23
4.3 Comparison between opteron and nehalem run time results . . . . . . 23
List of Tables
3.1 Opteron and Nehalem configuration . . . . . . . . . . . . . . . . . . 20
.1 Run time results on the nehalem queue . . . . . . . . . . . . . . . . . 28
.2 Run time results on the opteron queue . . . . . . . . . . . . . . . . . . 28
Abstract
One of the main concerns in the technological environment today is the simulation
of complex systems. This is often achieved through model coupling, which consists
of taking two or more working models and linking them together in order to achieve
better results. Weather Research and Forecasting (WRF) is one domain which
currently makes use of the coupling method between systems. It also represents a
domain where it becomes necessary to use High Performance Computing systems to
support the large scale, computationally intensive simulations.
Weather models can be divided into two main categories: meso-scale and micro-scale.
To obtain more meaningful results it is often necessary to establish a connection
between them and transfer data. The WRF Model is a next-generation meso-scale
numerical weather prediction system designed for both atmospheric research and
operational forecasting needs. It offers operational forecasting a flexible and
computationally efficient platform, while providing recent advances in physics,
numerics and data assimilation.
The main objective of this research is to analyse the performance and optimize the
structural and behavioral characteristics of the WRF model and its hydrological
component, WRF-Hydro. In order to accomplish this goal it is important to analyse
the pre-processing and processing stages of these models.
Chapter 1
Introduction
1.1 General perspective
The weather is a deciding factor in many domains, and accurate weather predictions
are important for planning our day-to-day activities. For example, airlines need
to know about the local weather conditions in order to schedule flights. Weather
forecasting helps us to make more informed daily decisions and may even keep us
out of danger. With so many implications, the weather has become a subject of
great importance and considerable efforts are made in order to predict its evolution.
The hydrological component of the Earth system has an important influence on our
lives as well. From ground water and soil moisture to storms and floods, they all
represent atmospheric phenomena of great importance.
Because the atmosphere is chaotic, its exact initial state can never be fully
known. The weather and the atmospheric phenomena can be described by physics laws
translated into numerical models. These explain the evolution in time of different
indicators like temperature, precipitation or pressure. For instance, these models
can provide a better understanding of a lake reservoir or predict flood risks.
In order to obtain a weather forecast we need to acquire information about the
parameters mentioned before and use it in the complex equations describing the
numerical models. However, we have to simplify our models of many processes which
occur at very small scales. This means that all forecasts will have a dose of
uncertainty associated with them.
On the other hand, we can make use of the aid of computers in order to solve
these complex equations and transform the real data provided by those indicators
into information that can be stored and interpreted by computers. Typically this
process consists of representing the parameters as numerical values. Usually these
numerical models offer a probabilistic forecast. In order to obtain a finer
prediction, we have to use a larger amount of data. Thus the processes involved
are computationally expensive and they require the use of clusters or High
Performance Computing systems.
1.2 Objectives
This project will present in the subsequent sections more details about two such
models: the Weather Research and Forecasting model and the WRF-Hydro Simulation
Model. The main goal is to make use of the resources offered by the National
Center for Information Technology (NCIT) Cluster [NCI13] in order to run those
models. The next chapter contains examples of other models similar to WRF and
WRF-Hydro. The third chapter describes details of the implementation of the
Weather Research and Forecasting Model. It also includes information about the
steps necessary to compile and run the model, as well as the hardware configuration
of the NCIT cluster, which was the environment where the WRF model was run. The
last two chapters contain case studies and results, as well as examples of further
improvements.
Chapter 2
State of the Art and Related
Work
This chapter presents other models similar to the Weather Research and Forecasting
model. There are different approaches and numerical methods which describe weather
forecasting, and each of them provides a set of advantages and new perspectives.
2.1 The Weather Research and Forecasting Model (WRF)
The Weather Research and Forecasting (WRF) model is a multi-purpose numerical
weather prediction system. It can fulfil both atmospheric research and operational
forecasting needs. The model has many features, such as two dynamical cores, a
data assimilation system and a software architecture facilitating parallel
computation and system extensibility. It can be used for a wide range of
meteorological applications across scales from tens of meters to thousands of
kilometers. The development of WRF started in the latter part of the 1990s and was
a joint effort of the National Center for Atmospheric Research (NCAR), the National
Oceanic and Atmospheric Administration (represented by the National Centers for
Environmental Prediction (NCEP) and the Forecast Systems Laboratory (FSL)), the
Air Force Weather Agency (AFWA), the Naval Research Laboratory, the University of
Oklahoma, and the Federal Aviation Administration (FAA). [WRFa]
The WRF model is suitable for performing simulations using real data or idealized
configurations. It offers a flexible and computationally efficient platform, while
providing recent advances in different domains like physics, numerics or data
assimilation. WRF is currently in operational use at NCEP, AFWA and other
centers. [WRFa]
2.2 The WRF-Hydro Simulation Model
At the beginning, the WRF-Hydro system was designed as a model coupling framework
to offer an easier coupling between the Weather Research and Forecasting model
(WRF) and components of terrestrial hydrological models [RAL]. WRF-Hydro can be
built and run both as a stand-alone model and as a coupling model, linking the
hydrological models with atmospheric models. WRF-Hydro is fully parallelized,
being suited to run on clusters and High Performance Computing systems alike.
The system is intended to be extensible to new hydrological parametrizations and
it is built upon a modular FORTRAN 90 architecture. According to the official
WRF-Hydro website [RAL], this system was designed to be used within the WRF model,
but it has evolved over time and now has many features:
- It is suitable for simulating the atmospheric, land surface and hydrological
  processes on different spatial grids.
- It can be used for many terrestrial hydrological processes such as surface
  runoff, channel flow, lake flow and sub-surface flow.
- It offers hydrological prediction using stand-alone capabilities.
- It supports a wide range of standard data formats for efficient job construction
  and evaluation.
As described in The WRF-Hydro Modeling System: System Overview and Status Update
[D.G15], the WRF-Hydro system was designed to provide:
- an extensible model capable of coupled and uncoupled assimilation and prediction
  of important water drivers such as precipitation, soil moisture, snowpack,
  groundwater, streamflow or inundation;
- accurate and reliable streamflow prediction across scales.
WRF-Hydro is a community-based software package which operates in two major modes:
coupled or uncoupled to an atmospheric model. The uncoupled mode is critical for
spin-up, data assimilation and model calibration, while the coupled version is
used for land-atmosphere coupling research and long-term predictions. Among other
attributes, WRF-Hydro also has the advantage of being portable and scalable across
multiple computing platforms. WRF-Hydro offers a wide range of physics components
such as subsurface routing, catchment-aggregated channel inflow or a detailed
representation of wetlands. It also improves the representation of landscape
dynamics essential to flood risks. [D.G15]
To support its goal, WRF-Hydro was built as an extensible, multi-scale coupling
architecture which links weather and climate models with the hydrological component
models shown below in figure 2.1.
Figure 2.1: Schematic of the modularized WRF-Hydro modeling framework
2.3 Variable Infiltration Capacity (VIC)
VIC was originally developed by Xu Liang at the University of Washington and,
unlike WRF-Hydro, is a large-scale, semi-distributed hydrologic model that solves
full water and energy balances [VIC]. Nonetheless, it shares several basic features
with the other land surface models that are commonly coupled to global circulation
models (GCMs), such as:
- The land surface is represented as a grid of large, uniform cells;
- The inputs consist of meteorological indicators like precipitation or
  temperature, organized chronologically in daily or sub-daily series;
- Water can only enter a grid cell via the atmosphere and, once it reaches the
  channel network, it is assumed to stay there.
The VIC model is now an open source program. It was implemented in such a manner
that the grid cells are simulated independently of each other, and it uses a
separate model to simulate streamflow routing, according to the official VIC
website [VIC].
2.4 Global Forecast System (GFS)
The Global Forecast System [GFS] is a numerical weather prediction system. It is
run four times a day and produces forecasts for up to 16 days in advance, with
decreased spatial resolution after 10 days. The GFS model [NOA] is a coupled model
as well, comprising four other models: an atmosphere model, an ocean model, a land
model and a sea ice model. These models work together to provide an accurate
simulation of weather conditions. GFS covers the entire globe at a base horizontal
resolution of 28 kilometers between grid points. Vertically it is divided into
64 layers and, temporally, it provides output forecasts every hour for the first
12 hours, every 3 hours out to 10 days, and every 12 hours after that. [GFS]
Figure 2.2: Global Forecast System Output
Chapter 3
Hardware and Software
Configuration
This chapter includes detailed information about the implementation of the WRF
and WRF-Hydro models as well as the steps that must be followed in order to
compile and run these models on the NCIT cluster.
3.1 Weather Research and Forecasting Model (WRF)
The WRF software architecture comprises a number of separable layers and supporting
components: the driver layer, the mediation layer, the model layer, a utility
program called Registry and APIs (application program interfaces) that facilitate
interprocessor communication and support different data formats and I/O, as
described in The Weather Research and Forecast Model: Software Architecture and
Performance [SA]. The driver layer performs runtime allocation and parallel
decomposition of model domain data structures. It also handles the interface to
other components when WRF is part of a coupled model and controls the high-level
interfaces to input/output operations on the model domain. The model layer
represents the actual computational model. It is comprised of routines like
advection, diffusion or physical parametrizations, which are called through a
model layer interface [SA]. The mediation layer contains the solve routine, which
calls the set of sub-routines from the model layer and handles the invocation of
interprocessor communication and multi-threading, isolating the parallelism
problems from the user. As specified in [SA], the current version of WRF makes use
of the RSL communication library [Joh], which, in turn, uses the Message Passing
Interface communication package [MPI]. The solve routine also includes
shared-memory parallelism over tiles, a second level of domain decomposition
within distributed memory patches [SA].
3.2 The WRF-Hydro Modeling System
WRF-Hydro is a modularized project, with a FOTRAN90 coding structure whose
routing physics modules are switch activated through a model namelist le (hy-
dro.namelist), according to WRF User's Guide [Use13]. The code has been
parallelized in order to be executed on High Performance Computing systems or
clusters. The process of parallelization makes use of code structures similar to
those used in WRF, such as domain decomposition or boundary array. (3.1).
Figure 3.1: Schematic of parallel domain decomposition scheme in WRF-Hydro.
Boundary arrays in which memory is shared between processors (P1 and P2) are
shaded in purple.
The WRF-Hydro model uses MPICH protocols to pass messages between processors
(figure 3.2). It requires separate compilations for the single-processor,
sequential version and for the parallel configurations.
WRF-Hydro is comprised of modules and functions which manage the communication
between atmospheric components such as WRF and sets of land surface components.
This system is called from a driving architecture such as the WRF model, the
HRLDAS, and so forth. Each time WRF-Hydro is coupled with another model,
modifications are made to a general set of functions in order to handle the
coupling. For instance, when WRF-Hydro is coupled with the WRF model, its
components are compiled as a single library that is called by the WRF system.
Thus, there is only one executable. WRF-Hydro is called from the WRF driver model
mentioned before, from the file module_surface_driver.F, stored in the phys
directory.
Figure 3.2: Schematic of parallel domain decomposition scheme in WRF-Hydro
applied to channel routing. The black and red stars indicate channel elements
that overlap.
The WRF_drv_Hydro.F interface module handles the communication. It is responsible
for passing the data, grid and time information between WRF and WRF-Hydro [Use13].
The next step is accomplished by components within WRF-Hydro which are responsible
for the dynamic regridding and sub-component routing functions, such as surface or
channel routing. Once the user-specified routing functions are completed,
WRF-Hydro remaps the data back to the WRF model grid [Use13] and passes all the
necessary variables back to the WRF model. This final step is accomplished by the
WRF_drv_Hydro.F interface, which becomes a key component of the WRF-Hydro system
[Use13]. Figure 3.3 illustrates the coupling and calling structure of the
WRF-Hydro system from the WRF model.
3.3 WRF Initialization
In order to generate predictions, the WRF model uses as input a grid containing
characteristics for each point. The grid itself has different layers, e.g.
horizontal and vertical layers, and includes values for indicators that can be
measured from the atmosphere.
The WRF model is suitable for performing simulations using idealized configurations
or real data. The idealized cases can be run for 1-D, 2-D or even 3-D soundings
and assume a simplified analytic orography [CAS]. The real data simulations
typically require a pre-processing stage accomplished by the WPS system. The WRF
executable, wrf.exe, is not altered by choosing one initialization option over
the other (real or idealized) [CAS], but each one of these options generates a
pre-processor program: real.exe or ideal.exe. The selection is made by the user
when issuing the ./compile command.
The ideal.exe and real.exe programs share a large amount of source code,
responsible for the following duties: reading data from the namelist file,
allocating space for the requested domain and generating the initial condition
file, as specified in [CAS]. In addition, the real data simulations do other
processing such as reading input data from the WPS system, checking that the
parameters are consistent with each other and so forth.
Figure 3.3: Schematic illustrating the coupling between WRF and WRF-Hydro.
3.3.1 Initialization for idealized cases
The program ideal.exe is generated when the user issues the command ./compile
followed by an ideal case as argument. This program allows the user to run a
controlled scenario. The only files that it requires are namelist.input and
input_sounding, located in the test/test_case/ directory. The output of the
ideal.exe program is the wrfinput_d01 file, which is used by the main executable,
wrf.exe. The idealized simulations represent an easy way to ensure that the model
is working properly [CAS].
The idealized cases are not set by default to run complex physics; on the other
hand, they can make use of any of the boundary conditions. In order to specify the
parameters of the simulation we have to edit the namelist file. It contains
information about the size of the domain, the number of vertical layers, the grid
size, the time scale, boundary conditions and physics options, as described in
[CAS]. Another file, named input_sounding, can set parameters like the surface
pressure, potential temperature or moisture ratio. The ideal.exe program
interpolates the data given by the input_sounding file.
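These parameters live in Fortran namelist records inside namelist.input. As a
minimal sketch only, with purely illustrative values rather than the settings used
for the runs in this thesis (the em_les test case ships with its own namelist.input,
which should be edited rather than replaced), a &domains record could look like:

# Sketch only: illustrative &domains entries for a small idealized domain.
cat <<'EOF'
&domains
 time_step = 1,      ! integration time step [s]
 e_we      = 100,    ! grid points, west-east
 e_sn      = 100,    ! grid points, south-north
 e_vert    = 41,     ! vertical levels
 dx        = 100.,   ! horizontal grid spacing [m]
 dy        = 100.,
/
EOF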
The next chapter contains the results of a 3-D idealized case, LES (large eddy
simulation). It uses a domain with the following characteristics: 100-m grid size,
2-km model top, surface layer physics with fluxes, and doubly periodic boundary
conditions.
3.3.2 Initialization for real data cases
The real data simulations obtain their input from the real.exe program. The first
step consists of running the WRF Preprocessing System (WPS) with input data
originating from one of the national weather agencies. WPS generates files with
the prefix "met", which are used as input for the real.exe program. For instance,
for a simulation with a 6-hour input interval, the WPS system creates files such
as met_em.d01.2015-06-24_00:00:00.nc, met_em.d01.2015-06-24_06:00:00.nc and
met_em.d01.2015-06-24_12:00:00.nc. The "d01" portion represents the domain the
data refers to, which allows nesting. The next characters represent the time of
the processed data and the suffix "nc" specifies the data format of the output,
which must be netCDF for real.exe.
The next step is running the real.exe program. A successful run generates the
files wrfinput_d01 and wrfbdy_d01, which are used by the WRF executable, wrf.exe.
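As a short sketch of this step (the process count is arbitrary and the file names
follow the pattern above):

# Sketch: run real.exe once the met_em files are in the same directory.
ls met_em.d01.*               # WPS output, one file per analysis time
mpirun -np 4 ./real.exe       # vertical interpolation and consistency checks
ls wrfinput_d01 wrfbdy_d01    # produced on success, later consumed by wrf.exe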
3.4 WRF Preprocessing System
The WRF Preprocessing System is a collection of three programs whose main objective
is to generate input for the real.exe program in order to simulate real data
configurations. Each one of the programs has a different role, as explained in
[WPS]:
- geogrid defines the model domains and interpolates static geographical data to
  the grids;
- ungrib extracts meteorological fields from GRIB files;
- metgrid horizontally interpolates the meteorological fields extracted by the
  ungrib program onto the model grids generated by the geogrid program. The
  vertical interpolation of meteorological fields is accomplished by the real.exe
  program.
The data flow between the programs of which WPS consists is shown below in
figure 3.4. All three programs share a common file called namelist.wps, from which
they read parameters. This file has a different section for each of the programs
and a common record.
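A minimal sketch of the common record (the &share section) follows; the dates and
values are illustrative, not those used for the thesis runs:

# Sketch only: illustrative &share record of namelist.wps.
cat <<'EOF'
&share
 wrf_core         = 'ARW',
 max_dom          = 1,
 start_date       = '2015-06-24_00:00:00',
 end_date         = '2015-06-25_00:00:00',
 interval_seconds = 21600,    ! 6-hourly input data
 io_form_geogrid  = 2,        ! NetCDF output
/
EOF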
The WPS system is built in a similar manner to the WRF model, providing different
options for compiling on multiple platforms. Two of the WPS programs, metgrid and
geogrid, are suitable to be compiled for a distributed memory run, allowing larger
domains to be prepared in less time. The ungrib program, instead, may only be
executed using a single processor.
Figure 3.4: WPS data flow
3.4.1 The geogrid program
The geogrid program is responsible for creating the simulation domains and
interpolating terrestrial data sets to the model grids [WPS]. In order to generate
the simulation domain, the user must provide the necessary information in the
"geogrid" section of the namelist.wps file. Besides computing the latitude and
longitude of every grid point, geogrid also interpolates soil categories, land use
category, terrain height, annual mean deep soil temperature, maximum snow albedo
and so forth. Global data sets for each of these indicators are available on the
MMM website [MMM]. If additional data sets are required, they may be interpolated
to the simulation domain through the use of the table file GEOGRID.TBL. This file
specifies all the fields that will be produced by geogrid, as well as the location
where the data sets reside.
The output of the geogrid program is written in the netCDF I/O format.
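The domain definition itself is given in the &geogrid record of namelist.wps; a
hedged sketch with illustrative values (the projection and coordinates are
placeholders, not the configuration used in this thesis):

# Sketch only: illustrative &geogrid record of namelist.wps.
cat <<'EOF'
&geogrid
 e_we           = 100,
 e_sn           = 100,
 dx             = 10000,          ! grid spacing [m]
 dy             = 10000,
 map_proj       = 'lambert',
 ref_lat        = 45.0,           ! placeholder domain centre
 ref_lon        = 26.0,
 truelat1       = 45.0,
 truelat2       = 45.0,
 geog_data_res  = '10m',
 geog_data_path = '/path/to/WPS_GEOG/'
/
EOF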
3.4.2 The ungrib program
The ungrib program's task is to read GRIB files, degrib the data and rewrite it in
a simple format. This program is able to read GRIB Edition 1 and GRIB Edition 2
files. A GRIB file usually includes time-varying meteorological fields and
typically comes from another regional or global model, like the GFS. The ungrib
program makes use of tables of codes named Vtables to determine which fields to
extract from the GRIB file. Ungrib is also capable of writing intermediate data
files in three formats, chosen by the user, according to [WPS] (a command sketch
follows the list below):
- WPS – a new format which includes additional information that can be used by
  the downstream programs;
- SI – the previous intermediate file format of the WRF system;
- MM5 – this format allows the ungrib program to provide GRIB2 input to the MM5
  modeling system.
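The command sequence usually looks like the sketch below (paths are illustrative;
link_grib.csh and the Variable_Tables directory ship with WPS):

# Sketch, run from the WPS directory after geogrid has completed.
./link_grib.csh /path/to/gfs/gfs.*                 # link GRIB files as GRIBFILE.AAA, AAB, ...
ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable    # choose the Vtable matching the source model
./ungrib.exe                                       # write intermediate files (default prefix FILE)
./metgrid.exe                                      # horizontal interpolation onto the geogrid domains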
3.4.3 The metgrid program
The metgrid program horizontally interpolates the data provided by the ungrib
program onto the simulation domains described by the geogrid program. The output
generated can be passed down to the real.exe program. The date range to be
interpolated is specified in the "share" section of the namelist.wps file, and the
date intervals must be defined for each simulation domain. Metgrid must be executed
every time a new simulation is computed.
The metgrid program includes a METGRID.TBL file which controls each meteorological
field and offers the possibility to specify options such as the interpolation
methods to be used [WPS]. The output of metgrid is also written in NetCDF format
for easy visualization.
3.5 Nesting in WRF
Nesting enables running a small domain at finer resolution without mismatching
time and spatial lateral boundary conditions [Dav]. The nest covers a portion of
the parent domain and is driven along its lateral boundaries by the parent.
Depending on the data exchanged between the domains, there are several types of
nesting:
- Static: the nest location is fixed.
- One-way: the data exchange between the parent and the nest is down-scale. The
  nest solution does not impact the parent solution.
- Two-way: the data exchange is bi-directional.
The nesting option offers a finer resolution of the domain, hence the simulation
will be more accurate. The type of nesting is chosen by the user at configuration
time, before compiling.
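For a single nest, the placement and refinement ratios are declared in the &domains
record of namelist.input; the sketch below uses illustrative values only:

# Sketch only: illustrative nesting entries in the &domains record.
cat <<'EOF'
&domains
 max_dom                = 2,
 parent_id              = 1, 1,
 parent_grid_ratio      = 1, 3,     ! nest is three times finer than the parent
 parent_time_step_ratio = 1, 3,
 i_parent_start         = 1, 30,    ! lower-left corner of the nest, in parent grid points
 j_parent_start         = 1, 30,
 e_we                   = 100, 121,
 e_sn                   = 100, 121,
/
EOF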
Figure 3.5: Comparison between basic and nested simulations for a 6h forecast.
3.6 WRF-Hydro on NCIT cluster
Typically, weather forecasts are described by physics laws translated into
numerical models that need a large volume of data as input. Processing this data
requires an important amount of computing power, which can be offered by clusters
or High Performance Computing systems. Being a student at the Faculty of Automatic
Control and Computers gave me the privilege of having access to such a system, and
therefore I was able to conduct experiments using the WRF and WRF-Hydro programs.
3.6.1 The NCIT Cluster
The National Center for Information Technology (NCIT) of the University Politehnica
of Bucharest started back in 2001 with the creation of the CoLaborator laboratory,
a Research Base with Multiple Users (R.B.M.U.) for High Performance Computing
(HPC), and it continues to evolve. The structure of this computing center is a
heterogeneous one regarding both hardware and software environments. Currently
there are six different computing architectures available on NCIT, as follows:
Intel Xeon Quad 64b, Intel Xeon Nehalem 64b, AMD Opteron 64b, IBM Cell BE EDp
32/64b, IBM Power7 64b and NVidia Tesla M2070 32/64.
All the platforms can be accessed through the frontend machine, fep.grid.pub.ro
(Front End Processors). The machines behind it are able to run only non-interactive
jobs sent through the frontend [NCI13].
From the software point of view, the NCIT cluster is a suitable environment to
conduct experiments using the WRF and WRF-Hydro programs because it offers a wide
variety of tools such as Sun Studio, OpenMP and OpenMPI. In order to use all the
necessary tools, the cluster permits module loading. For this project, I used the
GNU and PGI compilers as well as the OpenMPI tool.
Listing 3.1: Loaded modules
module load compilers/pgi-11.7
module load compilers/gnu-4.7.0
module load libraries/openmpi-1.6.3-pgi-11.7
module load libraries/netcdf-4.1.3-pgi-11.7
module list
Currently Loaded Modulefiles:
  1) compilers/pgi-11.7     3) libraries/openmpi-1.6.3-pgi-11.7
  2) compilers/gnu-4.7.0    4) libraries/netcdf-4.1.3-pgi-11.7
3.6.2 NetCDF
As mentioned before, in order to compile and run WRF and WRF-Hydro there are other
dependencies that must be loaded, for instance the NetCDF library. NetCDF is a set
of software libraries capable of creating, accessing and sharing array-oriented
scientific data, according to the official website [NC]. It is platform independent
and offers a wide range of data access libraries in languages like C, Fortran and
so forth. It is used by the WPS and WRF programs for the input and output files.
In this project, I have used NetCDF compiled with PGI.
Listing 3.2: NetCDF library
module load libraries/netcdf-4.1.3-pgi-11.7
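Since all WPS and WRF input/output files are NetCDF, their headers can be inspected
directly with the ncdump utility distributed with the library, for example:

# Quick sanity check of a NetCDF file produced by WPS or WRF.
ncdump -h geo_em.d01.nc | head -n 40    # print dimensions, variables and global attributes only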
3.7 WRF/WRF-Hydro
WRF-Hydro is a coupling architecture designed to facilitate the coupling of
terrestrial hydrological models with the WRF model. The WRF-Hydro system is
compiled as an independent library that is linked with the WRF model and called
by the WRF model as a function. The calling of WRF-Hydro within the WRF model is
controlled by a macro definition that is specified as an environment setting
during the compilation process. When WRF-Hydro is not activated within the
environment setting before the WRF configuration process, the entire system
defaults to the standard WRF model. In order to compile the WRF-Hydro system, the
user only needs to set an environment variable ("export WRF_HYDRO=1") and then
follow the standard WRF model configure and compile process.
WRF is an open source project that can be found at [WRFb]. In order to download
the latest version of WRF, you have to register as a new or returning user. After
downloading it, there are several steps that must be completed for a successful
run. First of all, we have to load the necessary modules and set a number of
environment variables as follows:
Listing 3.3: Preparing the environment for compiling WRF
module load libraries/netcdf-4.1.3-pgi-11.7
module load libraries/openmpi-1.6.3-pgi-11.7
module load compilers/pgi-11.7
module load compilers/gnu-4.7.0
# Environment variables necessary for WRF-Hydro coupling
export WRF_HYDRO=1
export HYDRO_D=1
export NETCDF=/opt/tools/libraries/netcdf/4.1.3-pgi-11.7
# Environment variable that specifies where the WRF directory can be found
export WRF_SRC_ROOT_DIR=/export/home/acs/stud/s/sorina.stoian/.../WRFV3
The next step consists of running the ./configure script. It allows the user to
specify which version of WRF to run: serial, shared memory, distributed memory or
a hybrid version. WRF has been extensively tested using the distributed memory
version, which is the reason why we have chosen the same option.
Listing 3.4: Configure WRF
./configure
Please select from among the following supported platforms.

   1.  Linux x86_64 i486 i586 i686, PGI compiler with gcc         (serial)
   2.  Linux x86_64 i486 i586 i686, PGI compiler with gcc         (smpar)
   3.  Linux x86_64 i486 i586 i686, PGI compiler with gcc         (dmpar)
   4.  Linux x86_64 i486 i586 i686, PGI compiler with gcc         (dm+sm)
   ...
  58.  Linux x86_64 i486 i586 i686, PGI compiler with pgcc -f90=  (dmpar)
  59.  Linux x86_64 i486 i586 i686, PGI compiler with pgcc -f90=  (dm+sm)

Compile for nesting? (1=basic, 2=preset moves, 3=vortex following):
Configuration successful. To build the model type compile.

# Settings for Linux x86_64 i486 i586 i686, PGI compiler with gcc (dmpar)
#
DMPARALLEL      =       1
OMPCPP          =       # -D_OPENMP
OMP             =       # -mp -Minfo=mp -Mrecursive
OMPCC           =       # -mp
SFC             =       pgf90
SCC             =       gcc
CCOMP           =       pgcc
...
CFLAGS          =       $(CFLAGS_LOCAL) -DDM_PARALLEL \
                        -DMAX_HISTORY=$(MAX_HISTORY) -DNMM_CORE=$(WRF_NMM_CORE)
...
LIB             =       $(LIB_BUNDLED) $(LIB_EXTERNAL) $(LIB_LOCAL) $(LIB_WRF_HYDRO)
LDFLAGS         =       $(OMP) $(FCFLAGS) $(LDFLAGS_LOCAL)
ENVCOMPDEFS     =       -DWRF_HYDRO
WRF_CHEM        =       0
CPPFLAGS        =       $(ARCHFLAGS) $(ENVCOMPDEFS) -I$(LIBINCLUDE) $(TRADFLAG)
NETCDFPATH      =       /opt/tools/libraries/netcdf/4.1.3-pgi-11.7/
...
The configure script generates a file named configure.wrf, where it stores
information about the platform on which WRF will be compiled. When this step is
completed, we can run the compile command.
Listing 3.5: Example of compiling WRF
Usage:
    compile [-d] [-j n] wrf     compile wrf in run dir
            (NOTE: no real.exe, ndown.exe, or ideal.exe generated)
    or choose a test case (see README_test_cases for details):
    compile em_b_wave
    compile em_fire
    compile em_les
    compile em_real
    compile em_tropical_cyclone
    compile exp_real
    compile nmm_real

    compile -d      compile without optimization and with debugging
    compile -j n    parallel make using n tasks if supported (default 2)
    compile -h      help message
As we can observe above, there are many cases, both real and idealized, from which
we can choose. For this project, we have compiled two versions: the idealized case
em_les and a real data case em_real.
Listing 3.6: Compiling WRF
./compile em_les >& compile.log
3.7.1 Compiling WPS
In order to run a real data case we need to download and compile the WRF
Preprocessing System as well. This process is very similar to compiling WRF itself.
Listing 3.7: Configure WPS
  7.  Linux x86_64, PGI compiler    (dmpar)
After compiling, the main folder of WPS should contain three executables:
geogrid.exe, ungrib.exe and metgrid.exe.
Listing 3.8: Compiling WPS
./compile >& compile.log
3.7.2 Running WRF-Hydro idealized
Compiling generates two executables, ideal.exe and wrf.exe, stored in the run/
directory. These executables can be found in the specific test directory as well.
Depending on the case we run, we have to check the test case folder for scripts
called "run_me_first.csh", which we have to run before ideal.exe. Before we run
the ideal.exe program, which generates the wrfinput_d01 file, we have to copy
hydro.namelist and HYDRO.TBL into the same directory as ideal.exe and wrf.exe.
Now we can run the main executable, wrf.exe.
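Putting these steps together, the idealized run can be sketched as follows (the
process count and paths are illustrative; the run_me_first.csh step exists only
for some test cases):

cd WRFV3/test/em_les                               # directory holding ideal.exe and wrf.exe for this case
./run_me_first.csh                                 # only if the test case provides such a script
cp /path/to/hydro.namelist /path/to/HYDRO.TBL .    # WRF-Hydro namelist and parameter table
./ideal.exe                                        # generates wrfinput_d01 from input_sounding
mpirun -np 8 ./wrf.exe                             # run the coupled WRF/WRF-Hydro simulation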
3.7.3 Running WRF-Hydro real data case
If the compilation was successful, in the run directory we can find four
executables: real.exe, tc.exe, ndown.exe and wrf.exe.
We have to provide data for the geogrid program in order to transform it into
NetCDF format. To do that, we must change the namelist.wps file and add the path
where the static geographical data can be found.
Listing 3.9: Editing the namelist.wps file
geog_data_path = '/export/home/acs/stud/s/sorina.stoian/../WPS_GEOG/'
After running geogrid.exe, we have to run the ungrib.exe program, which will create
the input necessary for metgrid.exe. At the end we should have files like
met_em.d01.2015-06-24_00:00:00.nc in the WPS folder. These files must be copied
into the same folder as real.exe. We also have to copy hydro.namelist and
HYDRO.TBL into this folder.
The hydro.namelist file contains options that can be edited. Here the user
specifies the type of coupling (2 represents coupling with WRF), where the NetCDF
format files are stored, the date of the simulation and so forth.
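The corresponding run sequence can be sketched as follows (paths and process
counts are illustrative):

cd WRFV3/run
cp ../../WPS/met_em.d01.* .                        # WPS output, one file per analysis time
cp /path/to/hydro.namelist /path/to/HYDRO.TBL .    # WRF-Hydro configuration files
mpirun -np 4 ./real.exe                            # produces wrfinput_d01 and wrfbdy_d01
mpirun -np 8 ./wrf.exe                             # coupled WRF/WRF-Hydro forecast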
3.7.4 Running examples
In order to perform tests, I have used two queues of the NCIT cluster: nehalem
and opteron (table 3.1). The jobs for these queues were submitted using scripts
like the following:
Listing 3.10: Running job script
#!/bin/bash
module load libraries/netcdf-4.1.3-pgi-11.7
module load libraries/openmpi-1.6.3-pgi-11.7
module load compilers/pgi-11.7

export NETCDF=/opt/tools/libraries/netcdf/4.1.3-pgi-11.7
export WRF_SRC_ROOT_DIR=/export/home/acs/stud/s/sorina.stoian/../Build_WRF/n_3.6/WRFV3
export WRF_HYDRO=1
export HYDRO_D=1

time (mpirun -np 8 ./wrf.exe)
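The queue names suggest a Sun Grid Engine scheduler; as a hedged illustration only,
such a script could be submitted with a command along the lines below (the queue
name, parallel environment and script name are assumptions, not taken from the
thesis):

# Hypothetical SGE submission; adjust queue and parallel environment to the cluster setup.
qsub -q ibm-nehalem.q -pe openmpi 8 -cwd run_wrf_hydro.sh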
Model                  IBM LS22 (14 nodes)               IBM HS22 (4 nodes)
Processor              AMD Opteron 2435                  Intel Xeon E5630
Processor frequency    2.6 GHz                           2.53 GHz
Number of cores        12                                16
L2 cache               6 x 512 KB                        12 MB
Memory                 16 GB                             32 GB
Operating System       Scientific Linux 6.6 (Carbon)     Scientific Linux 6.6 (Carbon)
Hostname               opteron-wn                        nehalem-wn
Table 3.1: Opteron and Nehalem configuration
Chapter 4
Case studies
This chapter comprises the results obtained by running jobs with WRF-Hydro on two
queues of the NCIT cluster, nehalem and opteron.
4.1 Results 1
The first case consists of simulating idealized conditions on the nehalem queue.
The ideal.exe and wrf.exe executables were obtained after compiling the em_les
case. It comprises the following configuration: 100-m grid size, 2-km model top,
and surface layer physics with fluxes.
The simulation was run using a varying number of processes, from 1 to 256, and the
run time results can be consulted in the appendices, table .1.
Figure 4.1 presents the run times obtained on this queue.
4.2 Results 2
The second set of results consists of simulating the same conditions of the em_les
case on the opteron queue.
The simulation was run using a varying number of processes, from 1 to 64, and the
run time results can be consulted in the appendices, table .2.
Figure 4.2 presents the run times obtained on this queue.
4.3 Comparing obtained results
Figure 4.3 shows a comparison between the results obtained on the two queues.
First of all, we can observe that under the same conditions, the run time results
are better on the nehalem queue. Another important fact is that beyond a certain
number of processes the results stop improving; on the contrary, they begin to
deteriorate. This happens when the communication between the processes becomes
more expensive than the computation itself.
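This behaviour can be quantified with the usual speedup measure S(p) = T(1)/T(p),
using the run times from the appendix tables. On the nehalem queue,
S(8) = 1010.680/224.067 ≈ 4.5, a parallel efficiency of about 56%, while
S(256) = 1010.680/136.139 ≈ 7.4, an efficiency below 3%. On the opteron queue the
best speedup is reached at 32 processes, S(32) = 1673.64/238.997 ≈ 7.0, after
which the run time grows again (S(64) ≈ 3.3), confirming that communication costs
dominate beyond that point.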
Figure 4.2: Run time results on the opteron queue
Figure 4.3: Comparison between opteron and nehalem run time results
Chapter 5
Conclusions and future work
The WRF-Hydro modeling system was designed to be coupled with the WRF model in
order to offer a better prediction of hydrological phenomena. The Weather Research
and Forecasting Model represents one of the most accurate mesoscale numerical
weather prediction and data assimilation systems. It is a model capable of managing
nested domains, providing high resolution simulations, and it uses data formats
based on international standards such as NetCDF.
Another advantage of this model is that it can generate predictions based on
idealized conditions as well as real data configurations. In my opinion, the
coupling between these two models represents a powerful piece of software which
can prove its usefulness by forecasting dangerous phenomena like floods and heavy
rain, or by predicting the evolution in time of reservoirs and river courses.
Both models are parallelized, enabling the possibility of using them on clusters
and High Performance Computing systems alike. In this project we have successfully
installed WRF-Hydro integrated with the WRF model and tested their performance on
the NCIT cluster. The experiments were conducted on two of the NCIT cluster's
queues, ibm-opteron.q and ibm-nehalem.q, and the best run time results were
obtained on the nehalem queue.
Further studies should be made in order to improve the performance of the WRF and
WRF-Hydro models; for instance, distributed communication is an area suitable for
enhancements.
Since the Weather Research and Forecasting Model needs a large amount of
computational power, I find the National Center for Information Technology (NCIT)
of the University Politehnica of Bucharest a convenient environment to run the WRF
and WRF-Hydro models on a daily basis. In my opinion, by using meaningful data,
relevant simulations can be obtained and thus Romania, and the University
Politehnica of Bucharest in particular, can join the other international research
centers which make use of these models.
Bibliography
[CAS] User Guide V3 – chapter 4. http://www2.mmm.ucar.edu/wrf/users/docs/
user_guide_V3/users_guide_chap4.htm . [Online accessed 26-Jun-2015].
[cited at p. 10, 11]
[Dav] Dave Gill, Matthew Pyle. Nesting in WRF. http://www2.mmm.ucar.edu/
wrf/users/tutorial/201107/WRFNesting.ppt.pdf . [Online accessed 27-Jun-
2015]. [cited at p. 14]
[D.G15] D.Gochis, J. McCreight, W. Yu, A. Dugger, K. Sampson, D. Yates, A.
Wood, M. Clark, R. Rasmussen. The WRF-Hydro Modeling System:
System Overview and Status Update. http://www2.mmm.ucar.edu/wrf/
users/tutorial/201501/HYDRO.pdf , 2015. [Online; accessed 24-June-2015].
[cited at p. 5]
[GFS] Global Forecast System. http://www.emc.ncep.noaa.gov/GFS/exp.php . [On-
line; accessed 24-Jun-2015]. [cited at p. 6, 7]
[Joh] John Michalakes. A Runtime System Library for Parallel Finite Difference
Models with Nesting. http://citeseerx.ist.psu.edu/viewdoc/download?
doi=10.1.1.29.9525&rep=rep1&type=pdf . [Online; accessed 25-Jun-2015].
[cited at p. 8]
[MMM] WRF Source Codes and Graphics Software Download Page. http://www2.
mmm.ucar.edu/wrf/users/download/get_sources_wps_geog.html . [Online
accessed 27-Jun-2015]. [cited at p. 13]
[MPI] The Message Passing Interface (MPI) standard. http://www.mcs.anl.
gov/research/projects/mpi/index.htm . [Online, accessed 25-Jun-2015].
[cited at p. 8]
[NC] The NETCDF library website. http://www.unidata.ucar.edu/software/
netcdf/ . [Online accessed 27-Jun-2015]. [cited at p. 16]
[NCI13] The NCIT Cluster Resources User's Guide - v4.0, pages 5-6, 2013. [cited at p. 3, 15]
[NOA] National Centers for Environmental Information – GFS. https:
//www.ncdc.noaa.gov/data-access/model-data/model-datasets/
global-forcast-system-gfs . [Online; accessed 24-June-2015]. [cited at p. 6]
[RAL] WRF-Hydro Modeling System. https://www.ral.ucar.edu/solutions/
products/wrf-hydro-modeling-system . [Online; accessed 24-June-2015].
[cited at p. 5]
[SA] The Weather Research and Forecast Model: Software Architecture and Perfor-
mance. http://www.wrf-model.org/wrfadmin/docs/ecmwf_2004.pdf . [On-
line; accessed 25-Jun-2015]. [cited at p. 8, 9]
[Use13] The NCAR WRF-Hydro Technical Description and User's Guide. http:
//www.ral.ucar.edu/projects/wrf_hydro/images/WRF_Hydro_Technical_
Description_and%20User_Guide_v1.0.pdf , 2013. [Online accessed 25-Jun-
2015]. [cited at p. 9, 10]
[VIC] Variable Infiltration Capacity Macroscale Hydrological Model, University
of Washington. http://www.hydro.washington.edu/Lettenmaier/Models/
VIC/Overview/ModelOverview.shtml . [Online; accessed 24-June-2015].
[cited at p. 6]
[WPS] Users Guide for Advanced Research WRF (ARW) Modeling System Ver-
sion 2; Chapter 3: The WRF Preprocessing System (WPS) Preparing In-
put Data. http://www2.mmm.ucar.edu/wrf/users/docs/user_guide/users_
guide_chap3.html . [Online accessed 25-Jun-2015]. [cited at p. 12, 13, 14]
[WRFa] The Weather Research and Forecasting Model website. [cited at p. 4]
[WRFb] WRF source code. http://www2.mmm.ucar.edu/wrf/users/download/get_
source.html . [Online accessed 28-Jun-2015]. [cited at p. 17]
Appendices
.1 Appendix 1
The following runtime results were obtained during experiments on the nehalem
and opteron queues of the NCIT cluster:
Number of processes    Runtime (seconds)
1 1010.680
2 554.968
4 336.863
8 224.067
16 247.805
24 253.706
32 250.452
48 205.623
64 156.663
128 143.935
256 136.139
Table .1: Run time results on the nehalem queue
Number of processes    Runtime (seconds)
1 1673.64
2 858.333
4 603.045
8 497.588
16 367.142
24 239.839
32 238.997
48 358.169
64 511.726
Table .2: Run time results on the opteron queue
Listings
3.1 Loaded modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.2 NetCDF library . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3 Preparing the environment for compiling WRF . . . . . . . . . . . 17
3.4 Configure WRF . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.5 Example of compiling WRF . . . . . . . . . . . . . . . . . . . . . . 18
3.6 Compiling WRF . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.7 Configure WPS . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.8 Compiling WPS . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.9 Editing the namelist.wps file . . . . . . . . . . . . . . . . . . . . 20
3.10 Running job script . . . . . . . . . . . . . . . . . . . . . . . . . . . 20