
Author: Ben Foster (NCAR/HAO) foster@ucar.edu
Dates:  Oct-Nov-Dec, 2012.

This is a stand-alone electro-dynamo code, extracted from the timegcm_pdyn 
branch of TIME-GCM ($SVN/timegcm/branches/timegcm_pdyn).  The goal is 
to install the edynamo module in the CAM/WACCM-X ionosphere, and possibly 
in other models.

This code is in the hao repository: file:///home/tgcm/svn/edynamo
To view the commit log, execute the following command on the hao network:
"svn log file:///home/tgcm/svn/edynamo/trunk"

As of Nov 27, 2012, this code reads neutral atmosphere inputs from either
TIME-GCM or WACCM-X history files, and calls the dynamo driver subroutine 
to calculate the 3d ionospheric electric potential and electric field (phim3d, 
ed13d, ed23d, emz3d), and optionally ion drift velocities on the geographic
grid. It also writes a netcdf output file containing fields and arrays on the
geographic or magnetic grids, as requested by user calls to subroutine addfld 
throughout the code.

The code is parallelized with pure MPI using a simple 2d decomposition, except 
for setting up and executing the multi-grid MUDPACK PDE solver, which is 
executed by the root task only.  A gather to the root task is done before 
the solve, and the solution is scattered to the non-root task subdomains 
afterward. The solver source contains some OpenMP directives, but they are 
not currently enabled. Regridding (geo2mag or mag2geo) is performed on 
the MPI-distributed grids using the ESMF library. The apex module (still
mostly legacy code) is called once per run, and calculates the needed global 
magnetic coordinates (a version of the apex module is already in WACCM).
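The 2d decomposition described above can be sketched as follows: choose a
near-square ntaski x ntaskj factoring of the task count (what mkntask does in
the code), then give each rank a contiguous lon x lat block. This is a minimal
Python illustration of the arithmetic, not the Fortran implementation.

```python
# Minimal sketch of a simple 2d block decomposition of an nlon x nlat grid
# among ntask MPI ranks. Function names and bounds conventions are
# illustrative, not taken from the edynamo source.
def choose_factors(ntask):
    """Pick ntaski x ntaskj with ntaski*ntaskj == ntask, as square as possible."""
    best = (1, ntask)
    for i in range(1, int(ntask**0.5) + 1):
        if ntask % i == 0:
            best = (i, ntask // i)
    return best

def subdomain(rank, ntask, nlon, nlat):
    """Return 0-based inclusive bounds (lon0, lon1, lat0, lat1) for this rank."""
    ntaski, ntaskj = choose_factors(ntask)
    ti, tj = rank % ntaski, rank // ntaski  # rank's coordinates in the task grid

    def block(n, nparts, p):
        # Split n points into nparts near-equal contiguous pieces.
        base, rem = divmod(n, nparts)
        lo = p * base + min(p, rem)
        hi = lo + base + (1 if p < rem else 0) - 1
        return lo, hi

    lon0, lon1 = block(nlon, ntaski, ti)
    lat0, lat1 = block(nlat, ntaskj, tj)
    return lon0, lon1, lat0, lat1
```

The same scheme applies independently to the geographic (lat x lon) and
magnetic (mlat x mlon) grids, which is why mkntask is called twice in mp_decomp.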

This code is written to (mostly) conform with CGD "CAM/WACCM-style" coding 
practices. The intent is to eventually import the edynamo module into 
WACCM-X itself to support the ionosphere module.

The current code is built with the Intel ifort compiler on Linux systems,
or with the xlf90 compiler on the IBM/AIX NCAR system "bluefire". The MPI 
libraries that come with the compilers are used. ESMF libraries are built 
separately on each platform. The code has also been run on the NWSC 
yellowstone machine with Intel ifort.

Thoughts on porting this code to CESM as a sub-component of ATM/CAM/WACCM:

  1. Main calling interface will be subroutine dynamo in module edynamo (edynamo.F90).
     Most use-associations at the top of the edynamo module will be removed, or replaced 
     by equivalent CAM/WACCM structures.
  2. This code uses a simple 2d horizontal decomposition in lat x lon and mlat x mlon.
     3d neutral atmosphere input fields are passed to sub dynamo on the geographic pe 
     subdomains. These fields are regridded to the magnetic coordinate system, and 
     subsequent code loops over the 2d or 3d mlat x mlon magnetic subdomain. I'm not 
     sure how or where this scheme will fit into the CAM/WACCM physics data structures.
  3. Currently in this code, the geomagnetic grid is parameter-based, so memory can be 
     allocated at compile time. However, the geographic grid is read from the input history 
     file (either WACCM or TIMEGCM), so that memory is dynamically allocated at run time.
     WACCM already uses the same magnetic grid and apex coordinates as this code; when 
     running in WACCM, the geographic subdomains would be passed to sub dynamo and the 
     input fields would be automatic arrays. The same holds for output arrays, except that 
     some are on the magnetic grid (electric potential and field) and others on the 
     geographic grid (ion drifts).
  4. Use of ESMF for calculating interpolation weights at init, and doing sparse matrix 
     multiply for geographic<->magnetic regridding could be converted to MCT if this 
     would be convenient for CESM.
  5. edynamo uses the MUDPACK elliptical PDE solver (source code is included in edynamo).
     This solver is currently serial, although there are some commented-out OpenMP directives.
     Options should be considered for parallelizing the solver, or replacing it with a
     third-party parallel solver.
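The regridding approach in item 4 amounts to computing interpolation weights
once at initialization and then applying them as a sparse matrix-vector
multiply on every regrid. A hedged Python sketch of that pattern, with made-up
weights (not real apex or ESMF weights):

```python
# Precomputed-weights regridding: at init, store (dest_index, src_index, weight)
# triplets; each subsequent regrid is just a sparse matrix-vector multiply.
# The weight values below are invented for illustration only.
def apply_weights(weights, src, ndest):
    """Sparse matrix multiply: dest[i] = sum over j of w[i,j] * src[j]."""
    dest = [0.0] * ndest
    for idest, isrc, w in weights:
        dest[idest] += w * src[isrc]
    return dest

# Example: destination point 0 is a 50/50 blend of source points 0 and 1,
# and destination point 1 copies source point 2.
weights = [(0, 0, 0.5), (0, 1, 0.5), (1, 2, 1.0)]
result = apply_weights(weights, [2.0, 4.0, 7.0], 2)  # -> [3.0, 7.0]
```

Because the multiply itself is library-agnostic, the same triplet table could
in principle be handed to ESMF, MCT, or hand-rolled code, which is why the
ESMF-to-MCT conversion mentioned in item 4 is plausible.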

-----------------------------------------------------------------------------------------
High-level calling tree:

main               ! (main.F90)      Main program
  set_cons         ! (params.F90)    Set runtime constants
  get_geogrid      ! (geogrid.F90)   Read global geographic grid from input file
  set_maggrid      ! (maggrid.F90)   Set parameter-based global magnetic grid
  mp_init          ! (my_mpi.F90)    Initialize MPI library and set up domain decomposition
  output_init      ! (output.F90)    Initialize output

  do istep=1,ntime ! (main.F90)      Main time loop (number of times on input file) 
    read_tgcm      ! (read_ncfile.F90) Read timegcm history file at current time
    read_waccm     ! (read_ncfile.F90) Read waccm history file at current time
    get_apex       ! (apex.F90)      Get apex coordinates (first step only)
    heelis_model   ! (heelis.F90)    Call empirical model for high-latitude potential
    dynamo         ! (edynamo.F90)   Main driver for electro-dynamo
      esmf_init    ! (my_esmf.F90)   Set up ESMF for regridding (first call only)
      dynamo_input        ! Read neutral atmosphere inputs, and regrid to magnetic
      fieldline_integrals ! Integrate along magnetic field lines
      rhspde              ! Prepare right-hand-side
      gather_edyn         ! Gather 2d arrays to root task
      solve_edyn          ! Run the serial PDE solver (root task only)
      scatter_phim        ! Scatter 2d solution to non-root subdomains
      highlat_poten       ! Call Heelis model for high-latitude potential
      pthreed             ! Expand potential to 3d and calculate electric field
      pefield             ! Regrid electric field to geographic
      ionvel              ! Calculate ion drifts on geographic grid 
    write_output   ! (output.F90)    Write to netcdf output file 
  enddo            ! End main time loop

  report_timing    ! (timing.F90)    Report wallclock timing to stdout
  mp_close         ! (my_mpi.F90)    Close and finalize MPI
end ! end main program
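The gather_edyn / solve_edyn / scatter_phim sequence in the tree above is the
classic gather-solve-scatter pattern around a serial solver. A minimal Python
sketch of the data flow, with plain lists standing in for MPI buffers and a
placeholder identity "solve" (the real code runs MUDPACK on the root task):

```python
# Sketch of the gather -> serial solve -> scatter bottleneck around the
# MUDPACK solver. Lists stand in for MPI collectives; the solver body is a
# placeholder, not the actual PDE solve.
def gather(subdomains):
    """Root task assembles the global field from per-task pieces (mpi_gather)."""
    return [x for sub in subdomains for x in sub]

def solve(global_field):
    """Placeholder for the serial PDE solve performed by the root task only."""
    return global_field

def scatter(global_field, sizes):
    """Redistribute the root task's solution back to per-task subdomains."""
    out, i = [], 0
    for n in sizes:
        out.append(global_field[i:i + n])
        i += n
    return out
```

This structure makes the solver a serial bottleneck and a synchronization
point, which is exactly why item 5 above suggests parallelizing or replacing
MUDPACK when porting to CESM.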

-----------------------------------------------------------------------------------------
Detailed calling tree:

main               ! (main.F90)      Main program
  set_cons         ! (params.F90)      Set runtime constants
  mp_init          ! (my_mpi.F90)      Initialize mpi library, allocate task table
  get_namelist     ! (namelist.F90)    Get namelist input parameters from user
  get_geogrid      ! (read_ncfile.F90) Read and set geographic grid
  set_maggrid      ! (maggrid.F90)     Set parameter-based magnetic grid

  mp_decomp        ! (my_mpi.F90)      Set up domain decomposition
    mkntask            ! Choose decomposition for geographic domain
    mkntask            ! Choose decomposition for magnetic domain
    mp_distribute_geo  ! Set up geographic subdomains
    mp_distribute_mag  ! Set up magnetic subdomains
    mp_exchange_tasks  ! Broadcast grid information to all mpi tasks

  output_init      ! (output.F90)
    field_init     ! (fields.F90) Initialize field type structures

  do istep=1,ntime ! (main.F90)        Main time-step loop (number of times on input file) 
    read_tgcm      ! (read_ncfile.F90) Read tgcm netcdf history file for current time
      alloc_inputs ! (read_ncfile.F90) Allocate timegcm input arrays
    read_waccm     ! (read_ncfile.F90) Read waccm netcdf history file for current time
      alloc_inputs ! (read_ncfile.F90) Allocate waccm input arrays
      reverse_vec  ! (util.F90)        Reverse order of waccm vertical dimension 
      shift_lons   ! (util.F90)        Shift waccm longitudes from 0->360 to -180->180

    get_apex       ! (apex.F90)    Get apex coordinates and related parameters (first time only)
      apxmka       ! (apex_subs.F) Calculate base vectors and quantities for apex coordinates
      apxmall      ! (apex_subs.F) Get modified apex and quasi-dipole coords
      apxq2g       ! (apex_subs.F) Convert from quasi-dipole to geographic coordinates.

    heelis_model   ! (heelis.F90) Call empirical model for high-latitude potential
      heelis_init  ! Initialize auroral parameters for Heelis model
      sunloc       ! Determine sun's longitude coordinates
      colath       ! Calculate fraction of potential (pfrac)
      potm         ! Calculate potential (phihm) on global magnetic domain

    dynamo         ! (edynamo.F90)   Main driver for electro-dynamo
      alloc_edyn   ! allocate arrays on magnetic subdomains (first call only)

      esmf_init    ! (my_esmf.F90) Initialize ESMF regridding mechanism (first call only)
        ESMF_Initialize       ! Initialize the ESMF library
        create_geo_grid       ! Create geographic source grid      for geo2mag 
        create_mag_grid       ! Create magnetic destination grid   for geo2mag
        create_mag_grid       ! Create magnetic source grid        for mag2geo
        create_geo_grid       ! Create geographic destination grid for mag2geo
        esmf_create_geofield  ! Several calls to create ESMF fields on geographic grid
        esmf_create_magfield  ! Several calls to create ESMF fields on magnetic grid
        ESMF_FieldRegridStore ! Save ESMF route handle for geo2mag
        ESMF_FieldSMMStore    ! Init sparse matrix multiply for geo2mag (apex or ESMF weights)
        ESMF_FieldRegridStore ! Save ESMF route handle for mag2geo
        ESMF_FieldSMMStore    ! Init sparse matrix multiply for mag2geo (apex or ESMF weights)

      dynamo_input ! (edynamo.F90) Receive neutral atmosphere inputs and convert to magnetic grid
        calc_adotv ! Calculate additional quantities from inputs
        esmf_set3d_geo   ! (my_esmf.F90) Give values to 3d ESMF fields on geographic grid
        esmf_set2d_geo   ! (my_esmf.F90) Give values to 2d ESMF fields on geographic grid
        esmf_regrid      ! (my_esmf.F90) Regrid 2d and 3d fields from geographic to magnetic
        esmf_get_3dfield ! (my_esmf.F90) Get values of 3d regridded fields on magnetic grid
        esmf_get_2dfield ! (my_esmf.F90) Get values of 2d regridded fields on magnetic grid
        mp_mageq ! (my_mpi.F90) Send longitude subdomain values at magnetic equator to all tasks

      fieldline_integrals ! (edynamo.F90) Perform integrations along magnetic field lines
      complete_integrals  ! (edynamo.F90) Calculate values at magnetic equator and poles
        mp_mageq_jpm1     ! (my_mpi.F90) Get global values at mag equator and lats +eq and -eq
        mp_magpole_2d     ! (my_mpi.F90) Get global values at mag poles and 2 latitudes below each pole
        mp_mag_foldhem    ! (my_mpi.F90) Get values from southern hemisphere, and sum to north
        mp_mag_periodic_f2d ! Set magnetic periodic point

      rhspde       ! (edynamo.F90) Calculate right-hand side for PDE (use global poles and equator)
      gather_edyn  ! (edynamo.F90) Gather 2d magnetic fields to global arrays at root task
        mp_gather_edyn ! (my_mpi.F90) Use collective mpi_gather to root task

      solve_edyn   ! (solve.F90) Driver for serial stenciling and PDE solver (called by root task only)
        stencils   ! Set up stencils for solver      
          clearcee ! Set up and initialize stencil coefficients
          stencmd  ! Calculate contribution to stencils from each PDE coefficient
          edges    ! Set boundary condition at poles
          divide   ! Divide stencils by cos(lam_0)
          stenmd   ! Modify stencil to set potential to heelis value within auroral circle
        solver     ! Call mudpack elliptical PDE solver
          mudmod   ! (mudmod.F) Driver for mudpack solver

      mp_scatter_phim ! (my_mpi.F90) Scatter solution to task magnetic subdomains (mpi_bcast)
      highlat_poten   ! (edynamo.F90) Add high-latitude empirical potential to solution
      pthreed         ! (edynamo.F90) Expand potential in vertical (phim3d) and calculate 3d electric field
        mp_mageq_jpm3 ! (my_mpi.F90) Get mag equator +/- 3 latitudes
        mp_magpoles   ! (my_mpi.F90) Get mag poles
        mp_mag_periodic_f3d ! (my_mpi.F90) Set magnetic periodic point of 3d arrays

      pefield         ! Regrid mag electric field to geographic in ex,ey,ez
        mp_magpole_3d ! Get global mag longitudes surrounding 2 latitudes below the poles
        mp_magpoles   ! Get global mag longitudes surrounding jspole,jnpole
        mag2geo_3d    ! Use ESMF to regrid electric field from mag to geo

      ionvel          ! Calculate ion drifts ui,vi,wi on geographic grid from ex,ey,ez

    write_output       ! (output.F90) Write to netcdf output file (fields added w/ addfld calls)
      define_ncfile    ! (output.F90) Define dimensions and vars on output file (first call only)
      gather_fields    ! (output.F90) Gather 2d and 3d fields to root task for i/o
        mp_gather2root ! (my_mpi.F90) Uses call to collective mpi_gather 
      write_fields     ! (output.F90) Write fields to output file
        reverse_vec    ! (util.F90)   Optionally return vertical dimension to waccm-format
        shift_lons     ! (util.F90)   Optionally shift waccm longitudes back to 0->360

  enddo          ! end main time loop

  report_timing  ! (timing.F90) Report wallclock timing to stdout
  mp_close       ! (my_mpi.F90) Close and finalize MPI lib

end ! end main program
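The two WACCM conversion utilities noted in the tree (reverse_vec and
shift_lons in util.F90) can be sketched briefly. The names match the tree;
the bodies and the assumed grid orientations are illustrative only.

```python
# Illustrative sketches of the two WACCM grid-convention utilities.
def reverse_vec(v):
    """Reverse the vertical dimension (assumed WACCM/edynamo level-order mismatch)."""
    return v[::-1]

def shift_lons(lons, values):
    """Shift longitudes from 0->360 to -180->180, reordering the data to match."""
    shifted = [(lon - 360.0 if lon >= 180.0 else lon) for lon in lons]
    order = sorted(range(len(shifted)), key=lambda i: shifted[i])
    return [shifted[i] for i in order], [values[i] for i in order]
```

write_fields applies the inverse operations on output, so a WACCM-format input
file produces a WACCM-format output file.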

-----------------------------------------------------------------------------------------
Module dependencies, listed in order of compilation:

OBJS = shr_kind_mod.o params.o timing.o util.o geogrid.o maggrid.o namelist.o my_mpi.o 
       read_ncfile.o fields.o apex_subs.o apex.o my_esmf.o output.o addfld_mod.o solve.o 
       mud.o mudcom.o mudmod.o muh2cr.o heelis.o edynamo.o main.o

shr_kind_mod
  [no dependencies]

params
  use shr_kind_mod

timing
  use shr_kind_mod
  params

util
  use shr_kind_mod
  use netcdf
  use esmf
  use params

geogrid
  use shr_kind_mod

maggrid
  use shr_kind_mod
  use params
  use geogrid

namelist
  use shr_kind_mod
  use params

my_mpi
  use shr_kind_mod
  use params
  use geogrid
  use maggrid
  use namelist
  use timing

read_ncfile
  use netcdf
  use shr_kind_mod
  use params
  use my_mpi
  use namelist
  use geogrid
  use maggrid

fields
  use shr_kind_mod
  use params

apex_subs (not a module)
  [none]

apex
  use shr_kind_mod
  use params
  use geogrid
  use maggrid

my_esmf
  use esmf
  use shr_kind_mod
  use params
  use my_mpi
  use apex
  use geogrid
  use maggrid
  use namelist
  use timing

output
  use netcdf
  use shr_kind_mod
  use params
  use my_mpi
  use fields
  use read_ncfile
  use geogrid
  use maggrid

addfld_mod
  use shr_kind_mod
  use params
  use geogrid
  use maggrid
  use fields
  use my_mpi

solve
  use shr_kind_mod
  use params
  use maggrid
  use addfld_mod

mud, mudcom, mudmod, muh2cr (not modules)
  use solve

heelis
  use shr_kind_mod
  use maggrid
  use geogrid
  use params
  use read_ncfile
  use addfld_mod
  use solve
  use apex

edynamo
  use shr_kind_mod
  use my_esmf
  use params
  use maggrid
  use my_mpi
  use solve
  use addfld_mod
  use apex

main
  use shr_kind_mod
  use params
  use geogrid
  use maggrid
  use apex
  use my_mpi
  use namelist
  use read_ncfile
  use heelis
  use edynamo
  use output
  use timing
