PYPEC Source documentation¶
pypec – Main Package¶
Python modules for data visualization and postprocessing, as well as python wrappers for GPEC and PENT.
- Author: N.C. Logan
- Location: Princeton Plasma Physics Laboratory
Python at PPPL¶
To set up python on portal at PPPL, insert the following lines into your .bashrc file:
export PYTHONPATH=$PYTHONPATH:/path/to/gpec
module load anaconda/2.3.0
Note
Anaconda is the most complete and current python distribution available at PPPL. Users can also use the locally built python/2.7.2, but will lose 3D plotting capabilities and may experience problems reading large data files.
Obviously, “/path/to/gpec” needs to be replaced with the path to your installation of the GPEC package (containing the pypec directory).
For tcsh shell users, replace export with setenv syntax and insert in .cshrc.
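For example, the equivalent .cshrc lines would be:

setenv PYTHONPATH ${PYTHONPATH}:/path/to/gpec
module load anaconda/2.3.0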
To put the changes into effect:
$ source ~/.bashrc
Users are further encouraged to create a matplotlib rc file in user ~/.config/matplotlib similar to /u/nlogan/.config/matplotlib/matplotlibrc. This file sets a number of plotting defaults.
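For example, a minimal matplotlibrc might contain entries like the following (illustrative values only, not the contents of the referenced file):

figure.figsize : 6.0, 4.0
axes.grid : True
font.size : 12
legend.fontsize : medium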
Python Tutorials¶
The 3 workhorse modules for scientific programming in python are numpy, scipy, and matplotlib (although the third is really a personal preference). Tutorials for each can be found online.
Setting Your Personal Defaults¶
To set personal defaults for runs, edit the _defaults_.py file in this directory. Some of these (like email) should be obvious, others you may not understand until becoming more familiar with the package. Do not worry if these are confusing, and feel free to skip them for now.
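For example, the email default mentioned above might look like the following (a hypothetical entry; see the comments in _defaults_.py itself for the full list):

email = 'user@pppl.gov'  # hypothetical address used for job notifications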
Using the Best Environment¶
The ipython (interactive) environment is recommended for commandline data analysis. To start, it is easiest to auto-import a number of basic mathematics and visualization modules and to use enhanced interactivity for displaying figures. To do this, enter:
$ ipython
In [1]: import mayavi
In [2]: %pylab
Note
Mayavi’s API settings will still conflict with matplotlib if it is not imported in the above order! For this reason, we cannot use the --pylab option when launching ipython.
Look into the online tutorials on numpy and matplotlib. Advanced users are recommended to edit the ~/.matplotlib/matplotlibrc and ~/.ipython/profile_default/startup/autoimports.ipy files.
Now in the ipython environment, type
>>> from pypec import gpec,data
There may be a few warnings/hints… take them or leave them.
pypec.data – Pythonic Treatment of Scientific Data¶
The data module is a very generalized tool for visualizing and manipulating scientific data from netcdf and ascii files.
GPEC data is migrating to netcdf, and this module uses the xarray module for its python netcdf interface. The xarray module is very powerful and well documented online.
Previous GPEC results and some current outputs are still written to ascii. This module contains custom I/O for converting ascii data to python objects. A custom Data object was created for this purpose, and conversion to the xarray Dataset is automated in the open_dataset function to facilitate migration to netcdf. The only requirement for creating a data object from an ascii file is that the file have one or more tables of data, with appropriate labels in a header line directly above the data.
Here, we show some examples using common outputs from both GPEC and PENT.
Beginners¶
First, add your release of GPEC to your PYTHONPATH environment variable.
In an open ipython session, import the data module
>>> from pypec import data
Explore the data module by typing ‘data.’ and hitting tab. For information on any particular object in the module, use the ‘?’ syntax or the ‘help’ function. For example, enter ‘data.read?’. You will see a lot of metadata about this function, but the important things to look at for now are the arguments and keyword arguments. These are passed to the function as shown in the Definition line… you can either pass a single string, or a string and a specification for squeeze. If this confuses you, consult any of the many great python tutorials online.
Let’s use it! To get a typical GPEC output into python, use the open_dataset function.
>>> con = data.open_dataset('examples/DIIID_ideal_example/gpec_control_output_n1.nc')
Printing to the terminal is intelligent, giving a conveniently human-readable summary of the data.
>>> con
<xarray.Dataset>
Dimensions: (i: 2, m: 129, mode: 34, mode_PX: 34, mode_RX: 34, mode_SC: 4, mode_WX: 34, mode_XT: 34, theta: 513)
Coordinates:
* i (i) int32 0 1
* m (m) int32 -64 -63 -62 -61 -60 -59 -58 -57 -56 -55 -54 -53 ...
* mode (mode) int32 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 ...
* mode_SC (mode_SC) int32 1 2 3 4
* theta (theta) float64 0.0 0.001953 0.003906 0.005859 0.007812 ...
* mode_XT (mode_XT) int32 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
* mode_WX (mode_WX) int32 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
* mode_RX (mode_RX) int32 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
* mode_PX (mode_PX) int32 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 ...
Data variables:
L (mode, m) complex128 (-0.478146242223-0.0251176845733j) ...
Lambda (mode, m) complex128 (1.4032106893e-07-3.56355823737e-08j) ...
P (mode, m) complex128 (0.000212307471204-9.52839135614e-05j) ...
rho (mode, m) complex128 (9.96920996839e+36+9.96920996839e+36j) ...
X_EDT (mode_XT, m) complex128 (3.88116613338e-05+1.90243722796e-05j) ...
X_EVT (mode_XT) float64 2.517e+04 1.334e+05 5.081e+05 1.119e+06 ...
W_EDX (mode_WX, m) complex128 (2.0673520088e-06-4.52521668335e-07j) ...
W_EVX (mode_WX) float64 8.543 2.882 2.516 1.137 0.8107 0.6662 ...
W_EVX_A (mode_WX) float64 3.922e+05 7.805e+04 2.466e+05 3.042e+05 ...
W_EVX_energyv (mode_WX) float64 6.701e+06 4.499e+05 1.241e+06 6.917e+05 ...
W_EVX_energys (mode_WX) float64 6.383e+06 8.481e+05 1.724e+06 6.867e+05 ...
W_EVX_energyp (mode_WX) float64 6.817e+06 5.582e+05 8.381e+05 5.807e+05 ...
R_EDX (mode_RX, m) complex128 (-9.04352944467e-07-5.23229281269e-07j) ...
R_EVX (mode_RX) float64 1.881e+06 8.165e+05 2.477e+05 1.895e+05 ...
R_EVX_RL (mode_RX) float64 5.689 0.7149 0.3795 0.8092 0.2417 ...
R_EVX_energyv (mode_RX) float64 4.298e+05 1.373e+06 1.01e+06 7.832e+05 ...
R_EVX_energys (mode_RX) float64 8.994e+05 1.745e+06 1.096e+06 8.803e+05 ...
R_EVX_energyp (mode_RX) float64 3.785e+05 8.975e+05 8.31e+05 6.814e+05 ...
P_EDX (mode_PX, m) complex128 (-5.0711563448e-06-2.51239391938e-06j) ...
P_EVX (mode_PX) float64 493.5 161.8 146.4 128.7 127.8 110.2 ...
O_WRX (mode_RX, mode_WX) float64 0.082 0.99 0.08839 0.01132 ...
O_WPX (mode_PX, mode_WX) float64 0.01414 0.9607 0.08296 0.01058 ...
O_RPX (mode_PX, mode_RX) float64 0.9571 0.01583 0.02807 0.1558 ...
O_XT (mode_XT) complex128 (0.00296215168999+0.00489536349583j) ...
O_WX (mode_XT) complex128 (0.0101206436883-0.00397946238035j) ...
O_RX (mode_RX) complex128 (-0.00967090555989-0.000112969521773j) ...
O_PX (mode_PX) complex128 (-0.00598278876631+0.0105074013581j) ...
Phi_EX (m) complex128 (-1.42853908779e-10-4.20811059626e-11j) ...
Phi_ET (m) complex128 (-6.89310992668e-10-1.01978717443e-09j) ...
W_EDX_FUN (mode_WX, theta) complex128 (-1.45421344314-0.0472501327701j) ...
R_EDX_FUN (mode_RX, theta) complex128 (1.45665015126-0.521253285202j) ...
b_nm (m) complex128 (-9.46126704696e-10+1.48737377032e-09j) ...
b_xnm (m) complex128 (-5.48365827795e-10+5.89720875704e-10j) ...
xi_nm (m) complex128 (-3.3871234611e-09-1.86371281018e-09j) ...
xi_xnm (m) complex128 (-6.60388933097e-10+2.7635104984e-09j) ...
xi_n (theta) complex128 (0.000402066428626+0.00260098774146j) ...
xi_xn (theta) complex128 (-0.000601114809848-0.00197978522356j) ...
b_n (theta) complex128 (-0.00157772177652+0.00052698051739j) ...
b_xn (theta) complex128 (-0.000920387597747+0.000315158824532j) ...
R (theta) float64 2.289 2.289 2.288 2.287 2.286 2.284 2.282 ...
z (theta) float64 0.02442 0.03972 0.05506 0.07046 0.08596 ...
R_n (theta) float64 1.0 0.9995 0.9981 0.9959 0.9928 0.9889 ...
z_n (theta) float64 0.003568 0.03241 0.06134 0.09036 0.1195 ...
dphi (theta) float64 0.0 -0.039 -0.078 -0.117 -0.1559 -0.1948 ...
Attributes:
title: GPEC outputs in Fourier or alternate modal bases
Jacobian: pest
q_lim: 5.2
psi_n_lim: 0.991536136179
shot: 147131
time: 2300
machine: DIII-D
n: 1
version: GPEC version 0.3.1
long_name: Surface Inductance
energy_vacuum: 3.28697635252
energy_surface: 7.8714975121
energy_plasma: 3.7597335618
We see the data contains dimensions, variables, and global attributes. Some individual data arrays may have attributes as well.
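Individual variables, coordinates, and attributes are then accessed with standard xarray syntax. For example, using the attributes and variables listed above,

>>> con.attrs['machine']
'DIII-D'
>>> con['W_EVX'].dims
('mode_WX',)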
For quick visualization, we can use xarray’s built-in plotting routines,
>>> import xarray.plot as xplot
>>> f1,a1 = plt.subplots()
>>> l1 = xplot.plot(con['W_EVX']) # 1D data, returns a list of lines
>>> f1.savefig('examples/figures/example_eigenvalues.png')
>>> f2,a2 = plt.subplots()
>>> l2 = xplot.plot(con['W_EDX']) # This is 2D, returns Quadmesh
>>> f2.savefig('examples/figures/example_eigenvectors.png')
Note this uses matplotlib, and the plots are fully interactive. Navigate and zoom using the toolbar below plots.
This is not the recommended way to display detailed studies using GPEC data, however. It is recommended that the user familiarise themselves with matplotlib (there are many great online tutorials) and write their own scripts for plotting the data of interest, as details will inevitably need tweaking.
For a common example of more customized plots, let’s look at some 2D data,
>>> cyl = data.open_dataset('examples/DIIID_ideal_example/gpec_cylindrical_output_n1.nc')
We could use the built-in plotting methods like cyl[‘b_z’].plot(), but these (R,z) figures should be done with a little care to look right. Let’s look at b_z, for example,
>>> plt = data.plt
>>> f,ax = plt.subplots(figsize=(8,8))
>>> v = cyl['b_z']
>>> im = ax.imshow(np.real(v),aspect='equal',origin='lower',cmap='RdBu',vmin=-2e-3,vmax=2e-3,
... extent=[v['R'].min(),v['R'].max(),v['z'].min(),v['z'].max()],
... interpolation='gaussian')
>>> cb = f.colorbar(im)
>>> cb.set_label(r'$\delta B_z$')
>>> xtks = ax.set_xticks([1,1.7,2.4])
>>> xtxt = ax.set_xlabel('R (m)')
>>> ytxt = ax.set_ylabel('z (m)')
>>> f.savefig('examples/figures/example_pbrzphi.png')
There we go. That looks publication ready!
Another common plot of GPEC outputs is the (psi,m) spectrogram plot. Let’s quickly make a nice one,
>>> prof = data.open_dataset('examples/DIIID_ideal_example/gpec_profile_output_n1.nc')
>>> f,ax = plt.subplots()
>>> extent = [prof['m'].min(),prof['m'].max(),prof['psi_n'].min(),prof['psi_n'].max()]
>>> im = ax.imshow(np.abs(prof['xi_m_contrapsi']).T,origin='lower',cmap='viridis',
... interpolation='nearest',aspect='auto',extent=extent)
>>> cb = f.colorbar(im)
>>> cb.set_label(r'$\xi^\psi_m$')
>>> lim = ax.set_xlim(-20,20)
>>> xtxt = ax.set_xlabel('m')
>>> ytxt = ax.set_ylabel(r'$\psi$')
>>> f.savefig('examples/figures/example_xispectrogram.png')
Note that the choice of interpolation resulted in discrete bands in m, which is true to the discrete Fourier decomposition used in the codes and provides a more intuitive connection between what the user sees and what the code is calculating.
One of the many cool additions to matplotlib in the pypec package is the ability to write line plots to tables. Let’s look at a 1D mode spectrum plot, for example,
>>> f,ax = plt.subplots()
>>> for m in range(1,9):
... lines = ax.plot(prof['psi_n'],np.abs(prof['xi_m_contrapsi'].sel(m=m)),label=r'm={:}'.format(m))
>>> mappable = plt.set_linearray(ax.lines,cmap='viridis')
>>> ytxt = ax.set_ylabel(r'$\xi^\psi_m$')
>>> xtxt = ax.set_xlabel(r'$\psi$')
>>> leg = ax.legend(ncol=2)
>>> f.savefig('examples/figures/example_xim.png')
>>> f.printlines('examples/figures/example_xim.txt',squeeze=True)
Wrote lines to examples/figures/example_xim.txt
True
That’s it! Read the table with your favorite editor. It will probably need a little cleaning at the top, since it tries to use the lengthy legend labels as column headers.
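For example, after trimming the header down to a single row, the table can be read back with numpy (a sketch, assuming one header row remains):

>>> table = np.genfromtxt('examples/figures/example_xim.txt', skip_header=1)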
At this point we have outlined the basics. Using this module and similar commands to those used here the user should be able to fully interact with and visually explore the GPEC output.
Advanced Users¶
In this section, we will assume you have a pretty good hold on both python and the basic plotting functions of these modules.
Now that you use this module regularly, save yourself some time by placing the line:
from pypec import *
into your autoimport file located at:
~/.ipython/profile_default/startup/autoimports.ipy
Deprecated ASCII Interface¶
As mentioned above, GPEC has moved the vast majority of its I/O to netcdf, and open_dataset should be used to create xarray Dataset objects whenever possible. This section describes the deprecated ascii interface for context, and motivates the move to netcdf.
Data in ascii tables can be read into a custom Data object using the read function,
>>> mydata = read('examples/DIIID_ideal_example/g147131.02300_DIIID_KEFIT.kin')
Casting table 1 into Data object.
>>> type(mydata),len(mydata)
(<type 'list'>, 1)
The read function creates a list of data objects corresponding to the tables in the text file. In this case, there is only one table. If we knew this ahead of time, we could have gotten the data object directly by splitting the list.
>>> mydata = read('examples/DIIID_ideal_example/g147131.02300_DIIID_KEFIT.kin')[0]
Casting table 1 into Data object.
At its heart, the data object consists of independent data and dependent data. These are stored differently: the independent data as a list in the pts attribute, and the dependent data as a dictionary in y. The x attribute is a dictionary if the data is 1 dimensional or a regular grid is found in the first n columns of the data (n=2,3), and is left empty if the n>1 dimensional data is irregular.
Explore the data using the corresponding python tools.
>>> mydata.xnames
['psi']
>>> mydata.y.keys()
['nim^3', 'tieV', 'teeV', 'wexbrad/s', 'nem^3']
You would access the ion density data using mydata.y[‘nim^3’], for example.
The data object, however, is more than just a placeholder for parsed text files. It contains visualization and data manipulation tools. Plotting functions return interactive matplotlib Figure objects.
>>> fig = mydata.plot1d()
>>> fig.savefig('examples/figures/example_profiles.png')
Interpolation returns a new Data object of the desired data, which can be one, multiple, or all of the dependent variables found in y.
>>> myinterp = mydata.interp(np.arange(0.1,1.0,0.1),['nim^3','tieV'])
Forming interpolator for nim^3.
Interpolating values for nim^3.
Forming interpolator for tieV.
Interpolating values for tieV.
>>> f = mydata.plot1d('nim^3')
>>> f = myinterp.plot1d('nim^3',marker='s',lw=0,figure=f)
>>> f.savefig('examples/figures/example_profiles2.png')
Note that the interpolator initializes the interpolation function on the first call; this is saved internally for subsequent calls.
One common need is to look at spectra. For this we want to utilize the full functionality of the data instances’ plot1d function.
>>> xc, = read('examples/DIIID_ideal_example/gpec_xclebsch_n1.out')
Casting table 1 into Data object.
>>> f = xc.plot1d('xi^psi','psi',x2rng=(1,3))
>>> f.savefig('examples/figures/example_spectrum.png')
We knew it was one table, so we used the “,” to automatically unpack the returned list. We were then so familiar with the args and kwargs of the plotting function that we did not bother typing the names. If you do this, be sure you get the order consistent with the order given in the documentation “Definition” line (watch out for inconsistent “Docstring” listings).
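If in doubt, the same call can always be written with explicit keyword names,

>>> f = xc.plot1d(ynames='xi^psi', xname='psi', x2rng=(1,3))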
Note
These examples can be tested by developers using ipython in this directory as follows:
In [1]: import data,doctest
In [2]: doctest.testmod(data,verbose=True)
- class pypec.data.DataBase(fromtxt, preamble, forcex=[], forcedim=[], maxlength=1000000.0, auto_complex=True, quiet=False)[source]¶
Bases:
object
An empty class object with the following operations overloaded:
+,-,*,/,**
such that the operation is only performed on the attributes starting with .real_ and .imag_ and ending with r, z, or phi. If the two factors in the operation have different r or z attributes, the data is interpolated to the wider-spaced axis.
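As a sketch of this overloaded arithmetic (the file paths here are hypothetical),

>>> a = read('runA/pent_n1.out')[0]  # hypothetical path
>>> b = read('runB/pent_n1.out')[0]  # hypothetical path
>>> diff = a - b  # interpolates to the wider-spaced axis if the grids differ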
- interp(x, ynames=None, quiet=False, **kwargs)[source]¶
Interpolate data to point(s).
- Parameters
x – ndarray shape (npts,ndim). Point(s) on the independent axis (axes).
ynames – list. Any number of keys from y attribute
quiet – bool. Suppress status messages
- Returns
dict. Each name contains array of interpolated data.
- plot1d(ynames=None, xname=None, x1rng=None, x2rng=None, x3rng=None, squeeze=False, **kwargs)[source]¶
Line plots from 1D or 2D data.
- Parameters
ynames – list. Strings specifying dependent data displayed.
xname – string. The independent axis if 2D data.
x1rng – optional. Valid formats are: x (plot the single nearest point), (xmin,xmax) (plot all data in the range), or (xmin,xmax,n) (plot n evenly spaced points in the range).
x2rng – Same as x1rng. Applicable to 2D or 3D data.
x3rng – Same as x2rng for 3D data only. Order of x is xnames with xname placed in front if given.
squeeze – bool. Plots all ynames in same axis.
kwargs – dict. Valid matplotlib pyplot.plot keyword arguments.
- Returns
figure.
- plot2d(ynames=None, aspect='auto', plot_type='imshow', cbar=True, center=None, grid_size=(256, 256), swap=False, **kwargs)[source]¶
Matplotlib 2D plots of the data.
- Parameters
ynames – list. Strings specifying dependent data displayed.
aspect – str/float. Choose from ‘auto’ or ‘equal’ (see matplotlib axes.set_aspect). A float stretches a circle to height=aspect*width.
plot_type – str. Choose one of: “imshow”, “pcolormesh”, “contour”, or “contourf” to plot on a grid; “tripcolor”, “tripcontour”, or “tripcontourf” to plot raw points.
cbar – bool. Show colorbar.
grid_size – tuple. Size of grid used if no regular axes x (in order of xnames).
swap – bool. Swap x/y axes.
kwargs – dict. Valid pcolormesh/contour/contourf key arguments.
- Returns
figure.
- plot3d(ynames=None, filter={'psi': 1}, cbar=False, size=(600, 600), plot_type='', phi=None, center=None, **kwargs)[source]¶
Three dimensional plots. Data with xnames r,z will be plotted with third dimension phi where Y = Re[y]cos(n*phi)+Im[y]sin(n*phi).
1D data -> lines in a plane in 3-space.
2D data -> z axis becomes y value.
3D data -> chosen display type.
Must have mayavi.mlab module available in pythonpath.
- Parameters
ynames – list. Strings specifying dependent data displayed.
filter – dict. Filter points as only those closest to this y value.
cbar – bool. Display a vertical colorbar in the Mayavi Scene.
size – tuple. Size of figure in pixels.
plot_type – str. Valid mayavi.mlab function name. Overrides default function choices (plot3d,mesh,quiver3d).
phi – ndarray. Toroidal angle grid for complex 2D data. Default is 360 degrees, 180 pts.
center – float. Center colormap on this value.
- Returns
figure.
- scatter(ynames=None, xname=None, x1=None, x2=None, x3=None, **kwargs)[source]¶
Scatter plot display for non-regular data. Display mimics plot1d in that it displays a single independent axis. If another independent axis exists, and is not sliced using one of the optional keyword arguments, it is represented as the color of the plotted symbols. If the data is 3D and unsliced, preference is given in order of xnames.
- Parameters
ynames – list. Dependent data to be displayed. Default is all y data.
xname – str. Independent axis used in scatter plots. Default is first in xnames.
x1 – float. Use only data with first axis closest to value.
x2 – float. Use only data with second axis closest to value.
x3 – float. Use only data with third axis closest to value.
- Returns
figure.
- slice(x, xname=None, ynames=[])[source]¶
UNDER CONSTRUCTION
Return 1D data object from 2D. Method grids new data, so beware of losing accuracy near rational surfaces.
- Parameters
x – float. Lower bound of the axis.
xname – str. Axis along which slice is performed.
- Returns
obj. New data nd-1 dimensional data object.
- pypec.data.getshot(path='.', full_name=False)[source]¶
Find 6 digit shot number in path to directory or file.
- Parameters
path – str. Path to directory or file.
- Returns
str. Shot number if found, ‘’ if not.
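For example, using the kinetic file path from the examples above (illustrative),

>>> data.getshot('examples/DIIID_ideal_example/g147131.02300_DIIID_KEFIT.kin')
'147131'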
- pypec.data.open_dataset(filename_or_obj, complex_dim='i', **kwargs)[source]¶
Wrapper for xarray.open_dataset that allows automated reduction of a dimension distinguishing real and imaginary components.
- Parameters
filename_or_obj – str, file or xarray.backends.DataStore Strings are interpreted as a path to a netCDF file or an OpenDAP URL and opened with python-netCDF4, unless the filename ends with .gz, in which case the file is gunzipped and opened with scipy.io.netcdf (only netCDF3 supported). File-like objects are opened with scipy.io.netcdf (only netCDF3 supported).
complex_dim – str. Dimension designating real/imaginary (0,1)
kwargs – dict. All other key word arguments are passed to xarray.open_dataset.
Original Documentation:
Open and decode a dataset from a file or file-like object.
Parameters
- filename_or_obj : str, Path, file-like or DataStore
Strings and Path objects are interpreted as a path to a netCDF file or an OpenDAP URL and opened with python-netCDF4, unless the filename ends with .gz, in which case the file is gunzipped and opened with scipy.io.netcdf (only netCDF3 supported). Byte-strings or file-like objects are opened by scipy.io.netcdf (netCDF3) or h5py (netCDF4/HDF).
- engine : {“netcdf4”, “scipy”, “pydap”, “h5netcdf”, “pynio”, “cfgrib”, “pseudonetcdf”, “zarr”} or subclass of xarray.backends.BackendEntrypoint, optional
Engine to use when reading files. If not provided, the default engine is chosen based on available dependencies, with a preference for “netcdf4”. A custom backend class (a subclass of BackendEntrypoint) can also be used.
- chunks : int or dict, optional
If chunks is provided, it is used to load the new dataset into dask arrays. chunks=-1 loads the dataset with dask using a single chunk for all arrays. chunks={} loads the dataset with dask using engine preferred chunks if exposed by the backend, otherwise with a single chunk for all arrays. chunks='auto' will use dask auto chunking taking into account the engine preferred chunks. See dask chunking for more details.
- cache : bool, optional
If True, cache data loaded from the underlying datastore in memory as NumPy arrays when accessed to avoid reading from the underlying datastore multiple times. Defaults to True unless you specify the chunks argument to use dask, in which case it defaults to False. Does not change the behavior of coordinates corresponding to dimensions, which always load their data from disk into a pandas.Index.
- decode_cf : bool, optional
- mask_and_scalebool, optional
If True, replace array values equal to _FillValue with NA and scale values according to the formula original_values * scale_factor + add_offset, where _FillValue, scale_factor and add_offset are taken from variable attributes (if they exist). If the _FillValue or missing_value attribute contains multiple values a warning will be issued and all array values matching one of the multiple values will be replaced by NA. mask_and_scale defaults to True except for the pseudonetcdf backend. This keyword may not be supported by all the backends.
- decode_times : bool, optional
If True, decode times encoded in the standard NetCDF datetime format into datetime objects. Otherwise, leave them encoded as numbers. This keyword may not be supported by all the backends.
- decode_timedelta : bool, optional
If True, decode variables and coordinates with time units in {“days”, “hours”, “minutes”, “seconds”, “milliseconds”, “microseconds”} into timedelta objects. If False, leave them encoded as numbers. If None (default), assume the same value of decode_time. This keyword may not be supported by all the backends.
- use_cftime : bool, optional
Only relevant if encoded dates come from a standard calendar (e.g. “gregorian”, “proleptic_gregorian”, “standard”, or not specified). If None (default), attempt to decode times to np.datetime64[ns] objects; if this is not possible, decode times to cftime.datetime objects. If True, always decode times to cftime.datetime objects, regardless of whether or not they can be represented using np.datetime64[ns] objects. If False, always decode times to np.datetime64[ns] objects; if this is not possible raise an error. This keyword may not be supported by all the backends.
- concat_characters : bool, optional
If True, concatenate along the last dimension of character arrays to form string arrays. Dimensions will only be concatenated over (and removed) if they have no corresponding variable and if they are only used as the last dimension of character arrays. This keyword may not be supported by all the backends.
- decode_coords : bool or {“coordinates”, “all”}, optional
Controls which variables are set as coordinate variables:
“coordinates” or True: Set variables referred to in the 'coordinates' attribute of the datasets or individual variables as coordinate variables.
“all”: Set variables referred to in 'grid_mapping', 'bounds' and other attributes as coordinate variables.
- drop_variables: str or iterable, optional
A variable or list of variables to exclude from being parsed from the dataset. This may be useful to drop variables with problems or inconsistent values.
- backend_kwargs: dict
Additional keyword arguments passed on to the engine open function, equivalent to **kwargs.
- **kwargs: dict
Additional keyword arguments passed on to the engine open function. For example:
‘group’: path to the netCDF4 group in the given file to open, given as a str. Supported by “netcdf4”, “h5netcdf”, “zarr”.
‘lock’: resource lock to use when reading data from disk. Only relevant when using dask or another form of parallelism. By default, appropriate locks are chosen to safely read and write files with the currently active dask scheduler. Supported by “netcdf4”, “h5netcdf”, “scipy”, “pynio”, “pseudonetcdf”, “cfgrib”.
See engine open function for kwargs accepted by each specific engine.
Returns
- dataset : Dataset
The newly created dataset.
Notes
open_dataset opens the file with read-only access. When you modify values of a Dataset, even one linked to files on disk, only the in-memory copy you are manipulating in xarray is modified: the original file on disk is never touched.
See Also
open_mfdataset
- pypec.data.plotall(results, xfun, yfun, label='', axes=None, **kwargs)[source]¶
Line plot of data gathered recursively from a dictionary containing multiple namelists or data objects.
- Parameters
results – dict. A readall result.
xfun – func. A function that takes args (key,val) for each dictionary item and whose result is appended to the xaxis.
yfun – func. A function that takes args (key,val) for each dictionary item and whose result is appended to the yaxis.
axes – obj. matplotlib axes object.
kwargs – dict. Passed to matplotlib plot function
- Returns
figure.
- pypec.data.read(fname, squeeze=False, forcex=[], forcedim=[], maxnumber=999, maxlength=1000000.0, auto_complex=True, quiet=False)[source]¶
Get the data from any gpec output as a list of python class-type objects using numpy.genfromtxt.
- Parameters
fname – str. Path of gpec output file.
squeeze – bool. Sets all attributes to single data object.
forcex – list. Set independent data labels.
forcedim – list. Set dimensionality of each x.
maxnumber – int. Reads only first maxnumber of tables.
maxlength – int. Tables with more rows than this are downsampled for speed.
auto_complex – bool. Include amplitude and phase of complex data as variables.
quiet – bool. Prevent non-warning messages printed to terminal.
- Returns
list. Class objects for each block of data in the file.
- pypec.data.readall(base='.', filetype='pent_n1.out', **kwargs)[source]¶
Recursively searches base directory for files with name filetype, reads them into python objects, and stores objects in a dictionary with directory names as keys.
- Parameters
base – str. Top level directory in which to begin search.
filetype – str. Files to read.
kwargs – dict. Passed to appropriate reader function.
- Returns
dict. Key-Value pairs are directories and their contained data.
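A sketch of typical usage (the scan directory here is hypothetical),

>>> results = data.readall(base='/path/to/scans', filetype='pent_n1.out')
>>> sorted(results.keys())  # one entry per directory containing a pent_n1.out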
- pypec.data.write(dataobj, fname='', ynames=[], **kwargs)[source]¶
Write data object to file. The aim is to replicate the original file structure… the header text will have been lost in the read, however, as will some column label symbols due to the use of genfromtxt.
- Parameters
dataobj – obj. DataBase type object.
fname – str. File to write to.
kwargs – dict. Passed to numpy.savetxt
- Returns
bool. True.
pypec.post – Post-Processing Fundamental GPEC Outputs¶
The post module is a repository of basic post-processing functions for expanding, manipulating, and visualizing fundamental gpec output. This is not an exhaustive list of post-processing procedures. The focus is on a) transforming between coordinate systems and b) matrix optimizations.
Here, we show some examples using common outputs.
Examples¶
First, add your release of GPEC to your PYTHONPATH environment variable.
In an open ipython session, import the data module
>>> from pypec import data, post
To get a typical GPEC output into python, use the open_dataset function.
>>> con = data.open_dataset('examples/DIIID_ideal_example/gpec_control_output_n1.nc')
>>> prof = data.open_dataset('examples/DIIID_ideal_example/gpec_profile_output_n1.nc')
First, 3D equilibrium calculations lend themselves naturally to 3D figures. This module has a helpful function for forming the necessary x,y,z meshes from the 2D boundary. After adding the requisite geometry to the control surface output,
>>> con = post.add_3dsurface(con, phi_lim=np.pi * 1.5)
it is easy to make 3D surface plots using mayavi,
>>> x,y,z = con['x_surf'].values,con['y_surf'].values,con['z_surf'].values
>>> b_n_3d = np.real(con['b_n_fun']*con['phase_surf'])
>>> fig = mmlab.figure(bgcolor=(1,1,1),fgcolor=(0,0,0),size=(600,600))
>>> mesh = mmlab.mesh(x,y,z,scalars=b_n_3d,colormap='RdBu',vmin=-2e-3,vmax=2e-3,figure=fig)
Note that mayavi explicitly requires numpy.ndarray objects so we had to use the values of each xarray DataArray.
Now, let’s cap the ends with our full profile information. This is a little more involved. For the toroidal angle of each end, we need to rotate the 2D arrays for the mesh and the field values.
>>> for phi in [0,np.pi*3/2]:
... b = np.real(prof['b_n_fun']*np.exp(-1j*1*phi)) # poloidal cross section at phi
... xy = prof['R']*np.exp(1j*phi) # rotate the plane
... x,y,z = np.real(xy),np.imag(xy),np.real(prof['z'])
... x,y,z,b = np.array([x,y,z,b])[:,:,::10] # downsample psi by 10 to speed up mayavi
... cap = mmlab.mesh(x,y,z,scalars=b,colormap='RdBu',vmin=-2e-3,vmax=2e-3,figure=fig)
>>> mmlab.savefig(filename='examples/figures/example_bn_3d_capped.png',figure=fig)
That looks awesome!
There are, of course, plenty more outputs and important ways of visualizing them intuitively. See, for example, the 2D plots in the add_3dsurface function’s documentation. At this point we have outlined the basics. Using this module and similar commands to those used here, the user should be able to do all sorts of post-processing tasks.
- pypec.post.add_3dsurface(control_output, phi_lim=6.283185307179586, nphi=180, overwrite=False, inplace=False)[source]¶
Add 3D geometric dimensions to dataset from gpec_control_output_n#.nc.
Note, this surface uses the machine toroidal angle. It should only be used with functions that do the same (tmag_out=0).
- Parameters
control_output – Dataset. xarray Dataset opened from gpec_control_output_n#.nc
phi_lim – float. Toroidal angle extended from 0 to phi_lim radians.
nphi – int. Length of toroidal angle dimension array.
overwrite – bool. Overwrite geometric quantities if they already exist.
inplace – bool. Modify dataset inplace. Otherwise, return a new dataset.
- Returns
Dataset. New dataset with additional dimensions.
- Examples
After opening, adding geometry and confirming the functional forms are using the machine angle,
>>> con = data.open_dataset('examples/DIIID_ideal_example/gpec_control_output_n1.nc')
>>> con = add_3dsurface(con)
>>> con = add_fun(con,tmag=False)
it is easy to make 3D surface plots using mayavi,
>>> fig = mmlab.figure(bgcolor=(1, 1, 1), fgcolor=(0,0,0), size=(600, 600))
>>> mesh = mmlab.mesh(con['x_surf'].values,con['y_surf'].values,con['z_surf'].values,
...                   scalars=np.real(con['b_n_fun'] * con['phase_surf']), colormap='RdBu', figure=fig)
>>> mmlab.savefig(filename='examples/figures/example_bn_3d.png', figure=fig)
Make 2D perturbed last-closed-flux-surface plots quickly using,
>>> fac = 1e-1 # good for DIII-D unit eigenmodes
>>> f,ax = plt.subplots()
>>> ax.set_aspect('equal')
>>> l, = ax.plot(con['R'], con['z'], color='k')
>>> for i in range(4): # most and least stable modes
...     v = np.real(con['R_xe_eigenvector_fun'][i] * fac)
...     l, = ax.plot(con['R'] + v * con['R_n'], con['z'] + v * con['z_n'])
>>> sm = plt.set_linearray(ax.lines[-4:], cmap='viridis')
>>> cb = f.colorbar(sm, ticks=range(4))
>>> cb.set_label('Mode Index')
>>> xtl = ax.set_xticks([1, 1.7, 2.4])
>>> txt = ax.set_xlabel('R (m)')
>>> txt = ax.set_ylabel('z (m)')
>>> f.savefig('examples/figures/example_boundary_wiggle.png')
Combining these two concepts, we can even plot a 3D perturbed surface distorted by the normal displacements and colored to show the normal field.
>>> fig = mmlab.figure(bgcolor=(1, 1, 1), fgcolor=(0,0,0), size=(600, 600))
>>> xinorm = 10 * con['phase_surf'] # good for ~cm displacements
>>> x = np.real(con['x_surf'].values + con['xi_n_fun'] * xinorm * con['x_surf_n'])
>>> y = np.real(con['y_surf'].values + con['xi_n_fun'] * xinorm * con['y_surf_n'])
>>> z = np.real(con['z_surf'].values + con['xi_n_fun'] * xinorm * con['z_surf_n'])
>>> s = np.real(con['b_n_fun'] * con['phase_surf'])
>>> mesh = mmlab.mesh(x,y,z,scalars=s, colormap='RdBu', figure=fig)
>>> mmlab.savefig(filename='examples/figures/example_bn_xin_3d.png', figure=fig)
Fancy!
- pypec.post.add_fun(control_output, keys=None, tmag=False, inplace=True)[source]¶
Add real space functions of theta calculated from the spectral outputs. This is helpful when analyzing a GPEC run for which fun_flag was off for speed/efficiency.
- Parameters
control_output – Dataset. xarray Dataset opened from gpec_control_output_n#.nc
keys – iterable. List of keys for which <key>_fun will be added. Default is all spectral variables.
tmag – bool. Keep the magnetic coordinate angle (default is machine angle).
inplace – bool. Add to dataset. Otherwise, make a new dataset object.
- Returns
Dataset. New dataset with real space functions of theta.
- Example
You can check this against fun_flag outputs,
>>> con = data.open_dataset('examples/DIIID_kinetic_example/gpec_control_output_n1.nc')
>>> con_mag = post.add_fun(con, tmag=True, inplace=False)
>>> con_mac = post.add_fun(con, tmag=False, inplace=False)
>>> f,ax = plt.subplots(2)
>>> l, = ax[0].plot(con['theta'], np.abs(con['xi_n_fun']), label='fun_flag Output')
>>> l, = ax[0].plot(con['theta'], np.abs(con_mag['xi_n_fun']), label='Magnetic Angle')
>>> l, = ax[0].plot(con['theta'], np.abs(con_mac['xi_n_fun']), ls='--', label='Machine Angle')
>>> leg = ax[0].legend()
>>> txt = ax[0].set_title('Normal Displacement')
>>> txt = ax[0].set_ylabel('Amplitude')
>>> l, = ax[1].plot(con['theta'], np.angle(con['xi_n_fun']), label='fun_flag Output')
>>> l, = ax[1].plot(con['theta'], np.angle(con_mag['xi_n_fun']), label='Magnetic Angle')
>>> l, = ax[1].plot(con['theta'], np.angle(con_mac['xi_n_fun']), ls='--', label='Machine Angle')
>>> txt = ax[1].set_ylabel('Phase')
>>> txt = ax[1].set_xlabel('theta')
>>> f.savefig('examples/figures/example_add_fun.png')
This shows that the function is properly recovered. Note that the fun_flag outputs are on the machine coordinate.
- pypec.post.optimize_torque(matrixprofile, psilow=0, psihigh=1, normalize=False, energy=False, minimize=False)[source]¶
Calculate the eigenvalue and eigenvector corresponding to the maximum torque within the specified range of normalized poloidal flux.
- Parameters
matrixprofile – DataArray. T_x or T_coil from gpec_profile_output_n#.nc.
psilow – float. Lower limit of normalized poloidal flux.
psihigh – float. Upper limit of normalized poloidal flux.
normalize – bool. Optimize value within psi range normalized to the total.
energy – bool. Optimize 2ndW (default is torque).
minimize – bool. Find the minimum (default is maximum).
- Returns
w,v. Eigenvalue, eigenvector of optimum.
- Note
The matrix order will be forced to end in the primed dimension, i.e. (‘psi_n’, ‘m’, ‘m_prime’) or (‘psi_n’, ‘coil_index’, ‘coil_index_prime’).
- Examples
With the output of a kinetic GPEC run in hand,
>>> con = data.open_dataset('examples/DIIID_kinetic_example/gpec_control_output_n1.nc')
>>> prof = data.open_dataset('examples/DIIID_kinetic_example/gpec_profile_output_n1.nc')
This function provides the spectrum that optimizes the integral torque and the corresponding profile is easily obtained.
>>> w,v = optimize_torque(prof['T_xe']) # handles any order of dims
>>> T_mat = prof['T_xe'].transpose('psi_n', 'm', 'm_prime') # order matters in dot products
>>> T_opt = np.real(np.dot(np.dot(v.conj(), T_mat), v))/2 # 0.5 for quadratic fourier analysis
We can then compare the spectrum and torque from our GPEC perturbed equilibrium to the theoretical optima,
>>> f,ax = plt.subplots(2)
>>> input_power = np.dot(con['Phi_xe'].conj(), con['Phi_xe'])
>>> l = np.abs(con['Phi_xe']).plot(ax=ax[0], label='Input')
>>> l, = ax[0].plot(con['m'], np.abs(v) * np.sqrt(input_power), label='Optimum Torque')
>>> leg = ax[0].legend()
>>> l = prof['T'].real.plot(ax=ax[1], label='Input')
>>> l, = ax[1].plot(prof['psi_n'], T_opt*input_power, label='Optimized Torque')
>>> leg = ax[1].legend()
>>> f.savefig('examples/figures/example_optimize_torque.png')
pypec.gpec – Python Wrappers for FORTRAN Codes¶
This is a collection of wrapper functions for running DCON, GPEC, and PENT.
If run from the command line, this module will call an associated GUI.
This module is for more advanced operators who want to control/run the fortran package for DCON, GPEC, and PENT. To get started, let’s explore the package
>>> gpec.
gpec.InputSet gpec.join gpec.optimize gpec.run
gpec.bashjob gpec.namelist gpec.optntv gpec.subprocess
gpec.data gpec.np gpec.os gpec.sys
gpec.default gpec.ntv_dominantm gpec.packagedir gpec.time
Many of these are simply standard python modules imported by gpec. Let’s submit a full run in just 4 lines. First, steal the settings from a previous run (leaving the run directory as the default):
>>> new_inputs = gpec.InputSet(indir='/p/gpec/users/nlogan/data/d3d/ipecout/145117/ico3_n3/')
Change what we want to:
>>> new_inputs['dcon']['DCON_CONTROL']['nn']=2
>>> new_dir = new_inputs.indir[:-2]+'2'
Double check that all inputs are what you expect (not shown here) and submit the job to the cluster.
>>> gpec.run(loc=new_dir,**new_inputs.infiles)
Running the package using this module will
Create the new location (if not already existing), and cd to it
rm all .out, .dat, and .bin files already in the directory (not dcon ones if rundcon=False, not pent ones for runpent=False, etc).
cp all .dat and .in files from the run directory to the new location
Write all namelists supplied in kwargs to file (overwriting)
Write a unique bash script for the run, and submit it OR run each fortran routine from the commandline.
Remove all .dat and xdraw files from the directory
You will note that there is significant opportunity for overwriting files, and the user must be responsible for double checking that all inputs are correct and will not result in life crushing disaster for themselves or a friend when that super special file gets overwritten.
- class pypec.gpec.InputSet(rundir='/p/gpec/users/nlogan/gpec/bin', indir='/p/gpec/users/nlogan/gpec/input')[source]¶
Bases:
object
Class designed to hold the standard inputs for a run.
- Attributes
rundir: str. Location of executables, .dat files, and extra .in files.
indir: str. Location of .in files read in and stored here
infiles: dict. Tree like storage of namelist objects.
- pypec.gpec.nustarscan(base='.', scale=(0.1, 1.0, 10.0), pentfile='pent.in', scalen=True, scalet=True, **kwargs)[source]¶
Run pent for series of approximately scaled collisionality values. Default scaling holds P=NT constant (NTV~P), but this is broken using scalen or scalet (scalet=False, for example, scales only density).
User may want to include a non-zero ximag if going to low collisionality.
User must specify full path to all files named in the pent input.
Kinetic file must have a header with ‘n’ and ‘i’ in the ion density label but in no other label, both ‘t’ and ‘e’ in only the electron temperature label, ‘omega’ in the omega_EXB label, etc.
Note
Approximate scaling ignores log-Lambda dependence, and uses nu_star ~ nu/v_th ~ (NT^-3/2)/(T^1/2) ~ N/T^2.
- Parameters
base – str. Top level directory containing dcon and gpec runs. Must contain euler.bin and vacuum.bin file.
scale – ndarray. The scale factors iterated over.
pentfile – str. Original input file.
scalen – bool. Use density scaling to change nu_star.
scalet – bool. Use temperature scaling to change nu_star.
- Returns
bool. True.
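As an illustration of the scaling relation in the note above (a sketch of the arithmetic only, not the nustarscan source): with nu_star ~ N/T^2 and P = NT held fixed, a target nu_star multiplier s corresponds to N and T multipliers of s^(1/3) and s^(-1/3).

def nustar_factors(s, scalen=True, scalet=True):
    """Illustrative (N, T) multipliers for a nu_star multiplier s,
    using nu_star ~ N/T**2 and ignoring log-Lambda."""
    if scalen and scalet:
        # hold P = N*T constant: s**(1/3) / (s**(-1/3))**2 = s
        return s ** (1. / 3.), s ** (-1. / 3.)
    elif scalen:
        # density only: nu_star scales directly with N
        return s, 1.0
    elif scalet:
        # temperature only: nu_star ~ T**-2
        return 1.0, s ** (-0.5)
    return 1.0, 1.0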
- pypec.gpec.omegascan(omega='wp', base='.', scale=(- 2, - 1, - 0.5, 0.5, 1, 2), pentrcfile='pentrc.in', **kwargs)[source]¶
Run pent for series of scaled nu/omega_phi/omega_E/omega_D/gamma_damp values. User must specify full path to all files named in pent.in input file located in the base directory. Note that the rotation profile is scaled by manipulating omega_E on each surface to obtain the scaled rotation.
- Parameters
omega – str. Choose from nu, wp, we, wd, or ga
base – str. Top level directory containing dcon and gpec runs. Must contain euler.bin and vacuum.bin file.
scale – ndarray. The scale factors iterated over.
pentrcfile – str. Original input file.
rundcon – bool. Set true if using hybrid kinetic MHD DCON. Will also attempt GPEC + PENT.
- Returns
bool. True.
- pypec.gpec.optntv(mlist, maxiter=50, loc='.', rundir='/p/gpec/users/nlogan/gpec/bin/', **kwargs)[source]¶
Python wrapper for running gpec package multiple times in a search for the normalized spectrum that maximizes NTV.
- Parameters
mlist – list. Integer poloidal modes included on plasma surface.
maxiter – int. Maximum number of iterations.
loc – str. Directory location for run.
rundir – str. GPEC package directory with executables and .dat files.
kwargs –
dict. namelist instance(s) written to <kwarg>.in file(s).
Note
Namelists taken from (in order of priority): 1) Key word arguments 2) .in files from loc 3) .in files from rundir
- Returns
bool. True.
- pypec.gpec.optpentrc(ms=range(- 5, 25), ttype='tgar', tfac=- 1, perp1=False, norm=0.001, qsub=True, method='Powell', maxiter=1000, loc='.', rundir='/p/gpec/users/nlogan/gpec/bin/', **kwargs)[source]¶
Python wrapper for running gpec packages multiple times in a search for the normalized spectrum that maximizes NTV.
Approach is as follows:
1) Run DCON and GPEC in base directory with singcoup_flag True
2) Run GPEC twice for each m, creating new subdirectories cosmn* and sinmn*
3) Optimize a functional wrapper for PENTRC using optimize.fmin_slsqp
- Spectra constrained to have 1 Gm^2 norm
- Wrapper uses data package to linearly combine gpec_xclebsch outputs
- Initial guess is the first singcoup_svd mode from GPEC
- Parameters
ms – list. Integer poloidal modes included on plasma surface.
ttype – str. PENTRC_OUTPUT flag (fgar, tgar, rlar, etc).
tfac – int. Optimization minimizes tfac*|T_phi|. Negative values maximize applied NTV.
perp1 – bool. Optimizes in space perpendicular to 1st GPEC SVD mode.
norm – float. Amplitude of applied area normalized spectrum Phi_x (Tesla meter^2).
qsub – bool. Submits individual jobs to cluster. Use gpec.run for qsub overall optimizer.
method – str. Method used in scipy.optimize.minimize.
maxiter – int. Maximum number of iterations.
loc – str. Directory location for run.
rundir – str. GPEC package directory with executables and .dat files.
kwargs –
dict. namelist instance(s) written to <kwarg>.in file(s).
Note
Namelists taken from (in order of priority): 1) Key word arguments 2) .in files from loc 3) .in files from rundir
- Returns
bool. True.
- pypec.gpec.run(loc='.', rundir='/p/gpec/users/nlogan/gpec/bin/', submit=True, return_on_complete=False, rerun=False, rundcon=True, rungpec=True, runpentrc=True, runrdcon=False, runstride=False, cleandcon=False, fill_inputs=False, version='1.3', mailon='NONE', email='', mem=3000.0, hours=36, partition='general', runipec=False, qsub=None, **kwargs)[source]¶
Python wrapper for running gpec package.
- Parameters
loc – str. Directory location for run.
rundir – str. GPEC package directory with executables and .dat files.
submit – bool. Submit job to cluster.
return_on_complete – bool. Return only after job is finished on cluster (irrelevant if qsub=False).
rerun – bool. Does not delete existing .out files.
rundcon – bool. Run dcon.
runipec – bool. Run ipec.
rungpec – bool. Run gpec.
runpentrc – bool. Run pentrc.
runrdcon – bool. Run rdcon and rmatch.
runstride – bool. Run stride.
cleandcon – bool. Remove euler.bin file after run is complete.
fill_inputs – bool. Use inputs from rundir (see kwargs).
version – str. Version of GPEC loaded (may set compilers if 1.2 or later)
mailon – str. Choose from NONE, BEGIN, END, FAIL, REQUEUE, or ALL.
email – str. Email address (default is submitting user).
mem – float. Memory request of q-submission in megabytes (converted to integer).
hours – int. Number of hours requested from job manager.
partition – str. Specify a specific computing queue.
kwargs – dict. namelist instance(s) written to <kwarg>.in file(s).
Note
Namelists taken from (in order of priority): 1) Key word arguments 2) .in files from loc 3) .in files from rundir
- Returns
bool. True.
pypec.namelist – IO for FORTRAN Namelists¶
This module provides pythonic tools for reading and writing fortran namelists.
Examples¶
This module can be used to convert FORTRAN namelist files into python ordered dictionary type objects. The individual namelists are capitalized and their parameters are left as written. The actual parameter is thus stored two levels deep in the dictionary.
>>> nl=read('examples/example_namelist.in')
>>> nl.keys()
['GPEC_INPUT', 'GPEC_CONTROL', 'GPEC_OUTPUT', 'GPEC_DIAGNOSE']
>>> nl['GPEC_CONTROL'].keys()
['resp_index', 'sing_spot', 'reg_flag', 'reg_spot', 'chebyshev_flag', 'nche']
>>> nl['GPEC_CONTROL']['reg_spot']
0.05
The parameters can be changed, added, or deleted as in any python dictionary.
>>> nl['GPEC_CONTROL']['reg_spot'] = 0.01
>>> nl['GPEC_CONTROL']['reg_spot']
0.01
>>> del nl['GPEC_CONTROL']['nche']
>>> nl['GPEC_CONTROL'].keys()
['resp_index', 'sing_spot', 'reg_flag', 'reg_spot', 'chebyshev_flag']
>>> nl['GPEC_CONTROL']['nche'] = 30
The resulting namelists can then be re-written in ASCII format for use with the FORTRAN codes.
>>> write(nl,'examples/example_namelist_edit.in')
True
Note
These examples can be tested by developers using ipython as follows:
In [1]: import pypec.namelist,doctest
In [2]: doctest.testmod(pypec.namelist,verbose=True)
- class pypec.namelist.Objectify(d)[source]¶
Bases:
object
Class base to convert iterable to instance.
- pypec.namelist.read(filename, out_type='dict', comment='!', old=False)[source]¶
Read in a file. Must have namelists in series, not embedded. Assumes the name of each list is preceded by ‘&’ and each list ends with a single forward slash (/).
- Parameters
filename – str. Path to input namelist.
out_type – str. Can be either ‘dict’ or ‘obj’.
comment – str. Lines starting with any of these characters are considered annotations.
- Returns
dict (object). Top keys (attributes) are namelists with sub-key(-attribute) parameter values.
Note
The returned dict is actually “OrderedDict” type from collections module. If returning an object, the object is un-ordered.
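For example, continuing the module examples above,

>>> from collections import OrderedDict
>>> isinstance(nl, OrderedDict)
True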
pypec.modplot – matplotlib Customizations¶
Collection of modified matplotlib functions and objects.
- Examples
Plot complex arguments.
>>> f,ax = subplots()
>>> lines = ax.plot(np.arange(10)*(1+0.5j),label='test')
>>> pyplot.show()
Automatically resize plots to fit labels.
>>> xlbl = ax.set_xlabel('X AXIS')
Plot huge data sets quickly.
>>> x = np.linspace(0,9,int(1e5))
>>> data = np.arange(int(1e5))/1.5e4+(np.random.rand(int(1e5))-0.5)
>>> newline = ax.plot(x,data,label='large data set')
This plots a line capped at 1000 points by default. The maximum number of points is maintained as you manipulate the axis, so zooming in will provide new points and increased detail until the window samples fewer than that many points in the original data. The first two lines, for instance, contain only their original 10 points (not 1000 interpolated points).
- class pypec.modplot.MidPointNorm(midpoint=0, vmin=None, vmax=None, clip=False)[source]¶
Bases:
matplotlib.colors.Normalize
Taken from http://stackoverflow.com/questions/7404116/defining-the-midpoint-of-a-colormap-in-matplotlib
- pypec.modplot.add_inset_axes(ax, rect, axisbg='w', polar=False, **kwargs)[source]¶
Add an axes inset in another axes. Unlike the native matplotlib inset_axes, here the inset axes can be a polar axes.
Note that the parent axes should be drawn first if using autolayout (it changes position). The inset axes will not track the original if it is moved.
- Parameters
ax – Axes. Parent axes.
rect – list. Inset axes location and dimensions [left, bottom, width, height]
axisbg – str. Background color.
polar – bool. Polar plot.
kwargs – dict. Passed to add_subplot.
- Returns
Axes. A new inset axes.
- pypec.modplot.colorbar(mappable=None, cax=None, ax=None, use_gridspec=True, **kw)[source]¶
Modified pyplot colorbar for default use_gridspec=True.
ORIGINAL DOCUMENTATION
Add a colorbar to a plot.
Function signatures for the pyplot interface; all but the first are also method signatures for the ~.Figure.colorbar method:
colorbar(**kwargs)
colorbar(mappable, **kwargs)
colorbar(mappable, cax=cax, **kwargs)
colorbar(mappable, ax=ax, **kwargs)
- mappable
The matplotlib.cm.ScalarMappable (i.e., ~matplotlib.image.AxesImage, ~matplotlib.contour.ContourSet, etc.) described by this colorbar. This argument is mandatory for the .Figure.colorbar method but optional for the .pyplot.colorbar function, which sets the default to the current image.
Note that one can create a .ScalarMappable “on-the-fly” to generate colorbars not attached to a previously drawn artist, e.g.
fig.colorbar(cm.ScalarMappable(norm=norm, cmap=cmap), ax=ax)
- cax : ~matplotlib.axes.Axes, optional
Axes into which the colorbar will be drawn.
- ax : ~matplotlib.axes.Axes, list of Axes, optional
Parent axes from which space for a new colorbar axes will be stolen. If a list of axes is given they will all be resized to make room for the colorbar axes.
- use_gridspec : bool, optional
If cax is None, a new cax is created as an instance of Axes. If ax is an instance of Subplot and use_gridspec is True, cax is created as an instance of Subplot using the gridspec module.
Returns
- colorbar : ~matplotlib.colorbar.Colorbar
See also its base class, ~matplotlib.colorbar.ColorbarBase.
Additional keyword arguments are of two kinds:
axes properties:
- fraction : float, default: 0.15
Fraction of original axes to use for colorbar.
- shrink : float, default: 1.0
Fraction by which to multiply the size of the colorbar.
- aspect : float, default: 20
Ratio of long to short dimensions.
- pad : float, default: 0.05 if vertical, 0.15 if horizontal
Fraction of original axes between colorbar and new image axes.
- anchor : (float, float), optional
The anchor point of the colorbar axes. Defaults to (0.0, 0.5) if vertical; (0.5, 1.0) if horizontal.
- panchor : (float, float) or False, optional
The anchor point of the colorbar parent axes. If False, the parent axes’ anchor will be unchanged. Defaults to (1.0, 0.5) if vertical; (0.5, 0.0) if horizontal.
colorbar properties:
- extend : {‘neither’, ‘both’, ‘min’, ‘max’}
If not ‘neither’, make pointed end(s) for out-of-range values. These are set for a given colormap using the colormap set_under and set_over methods.
- extendfrac : {None, ‘auto’, length, lengths}
If set to None, both the minimum and maximum triangular colorbar extensions will have a length of 5% of the interior colorbar length (this is the default setting). If set to ‘auto’, makes the triangular colorbar extensions the same lengths as the interior boxes (when spacing is set to ‘uniform’) or the same lengths as the respective adjacent interior boxes (when spacing is set to ‘proportional’). If a scalar, indicates the length of both the minimum and maximum triangular colorbar extensions as a fraction of the interior colorbar length. A two-element sequence of fractions may also be given, indicating the lengths of the minimum and maximum colorbar extensions respectively as a fraction of the interior colorbar length.
- extendrect : bool
If False the minimum and maximum colorbar extensions will be triangular (the default). If True the extensions will be rectangular.
- spacing : {‘uniform’, ‘proportional’}
Uniform spacing gives each discrete color the same space; proportional makes the space proportional to the data interval.
- ticks : None or list of ticks or Locator
If None, ticks are determined automatically from the input.
- format : None or str or Formatter
If None, ~.ticker.ScalarFormatter is used. If a format string is given, e.g., ‘%.3f’, that is used. An alternative ~.ticker.Formatter may be given instead.
- drawedges : bool
Whether to draw lines at color boundaries.
- label : str
The label on the colorbar’s long axis.
The following will probably be useful only in the context of indexed colors (that is, when the mappable has norm=NoNorm()), or other unusual circumstances.
- boundaries : None or a sequence
- values : None or a sequence which must be of length 1 less than the sequence of boundaries. For each region delimited by adjacent entries in boundaries, the color mapped to the corresponding value in values will be used.
If mappable is a ~.contour.ContourSet, its extend kwarg is included automatically.
The shrink kwarg provides a simple way to scale the colorbar with respect to the axes. Note that if cax is specified, it determines the size of the colorbar and shrink and aspect kwargs are ignored.
For more precise control, you can manually specify the positions of the axes objects in which the mappable and the colorbar are drawn. In this case, do not use any of the axes properties kwargs.
It is known that some vector graphics viewers (svg and pdf) render white gaps between segments of the colorbar. This is due to bugs in the viewers, not Matplotlib. As a workaround, the colorbar can be rendered with overlapping segments:
cbar = colorbar()
cbar.solids.set_edgecolor("face")
draw()
However this has negative consequences in other circumstances, e.g. with semi-transparent images (alpha < 1) and colorbar extensions; therefore, this workaround is not used by default (see issue #1188).
- pypec.modplot.figure(num=None, figsize=None, dpi=None, facecolor=None, edgecolor=None, frameon=True, FigureClass=<class 'matplotlib.figure.Figure'>, **kwargs)[source]¶
- pypec.modplot.line_downsample(line, npts=1000)[source]¶
Downsample line data for increased plotting speed.
- Key Word Arguments:
- npts : int.
Number of data points within axes xlims.
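A minimal usage sketch, assuming a line has already been plotted in axes ax,

>>> line_downsample(ax.lines[-1], npts=500)  # cap the drawn points at 500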
- pypec.modplot.onkey(event)[source]¶
Function to connect key_press_event events from matplotlib to custom functions.
Matplotlib defaults (may be changed in matplotlibrc):
keymap.fullscreen : f # toggling
keymap.home : h, r, home # home or reset mnemonic
keymap.back : left, c, backspace # forward / backward keys to enable
keymap.forward : right, v # left handed quick navigation
keymap.pan : p # pan mnemonic
keymap.zoom : o # zoom mnemonic
keymap.save : s # saving current figure
keymap.quit : ctrl+w # close the current figure
keymap.grid : g # switching on/off a grid in current axes
keymap.yscale : l # toggle scaling of y-axes (‘log’/’linear’)
keymap.xscale : L, k # toggle scaling of x-axes (‘log’/’linear’)
keymap.all_axes : a # enable all axes
My custom function mapping:
popaxes : n # Creates new figure with current axes
tighten : t # Call tight_layout bound method for figure
- pypec.modplot.plot_axes(ax, fig=None, geometry=(1, 1, 1))[source]¶
Re-create a given axis in a new figure. This allows, for instance, a subplot to be moved to its own figure where it can be manipulated and/or saved independent of the original.
- Arguments:
- ax : obj.
An initialized Axes object
- Key Word Arguments:
- fig: obj.
A figure in which to re-create the axis
- geometry : tuple.
Axes geometry of re-created axis
- returns
Figure.
- pypec.modplot.png_to_gif(files, gif_file, delay=20, loop=0, clean=False)[source]¶
gif_file should end in .gif
- pypec.modplot.printlines(self, filename, labeled_only=False, squeeze=False)[source]¶
Print all data in line plot(s) to a text file. The x values will be taken from the line with the greatest number of points in the (first) axis; other lines are interpolated if their x values do not match. Column labels are the line labels and xlabel.
- Arguments:
- filename : str.
Path to print to.
- Key Word Arguments:
- labeled_only : bool.
Only print labeled lines to file.
- squeeze : bool.
Print lines from all axes in a single table.
- Returns:
bool. Always True.
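A minimal sketch, assuming the first argument (self in the signature) is the figure whose lines are printed, with a hypothetical output file name:

from matplotlib import pyplot
from pypec import modplot

fig, ax = pyplot.subplots()
ax.plot([0, 1, 2], [0, 1, 4], label='quadratic')
# write the labeled line data to an ascii table
modplot.printlines(fig, 'lines.txt', labeled_only=True)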
- pypec.modplot.set_linearray(lines, values=None, cmap=None, vmin=None, vmax=None)[source]¶
Set colors of lines to a colormapping of values.
Other good sequential colormaps are YlOrBr and autumn. A good diverging colormap is bwr.
- Arguments:
lines : list. Lines whose colors will be set.
- Key Word Arguments:
values : array like. Values corresponding to each line. Default is indexing.
cmap : str. Valid matplotlib colormap name.
vmax : float. Upper bound of colormapping.
vmin : float. Lower bound of colormapping.
- Returns:
ScalarMappable. A mapping object used for colorbars.
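For example, a sketch coloring a family of lines by a scan parameter (values hypothetical), assuming the returned mappable is ready for pyplot.colorbar as the docstring indicates:

import numpy as np
from matplotlib import pyplot
from pypec import modplot

x = np.linspace(0, 1, 100)
scan = [1.0, 2.0, 3.0]  # hypothetical scan values
lines = [pyplot.plot(x, s * x)[0] for s in scan]
sm = modplot.set_linearray(lines, values=scan, cmap='viridis')
pyplot.colorbar(sm, ax=pyplot.gca())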
- pypec.modplot.set_style(style=None, rc={})[source]¶
Seaborn set_style with an additional 'thick' (or 'thin') style.
The thick (thin) style multiplies the rcParams axes.linewidth, lines.linewidth, xtick.major.width, xtick.minor.width, ytick.major.width, and ytick.minor.width by 2 (0.5).
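A minimal sketch of switching styles before plotting:

from pypec import modplot

# double all line and tick widths listed above
modplot.set_style('thick')
# rc overrides can be layered on top (the font size here is hypothetical)
modplot.set_style('thick', rc={'font.size': 14})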
- pypec.modplot.shiftedColorMap(cmap, start=0, midpoint=0.5, stop=1.0, name='shiftedcmap')[source]¶
Function to offset the "center" of a colormap. Useful for data with a negative min and positive max when you want the middle of the colormap's dynamic range to be at zero.
Taken from http://stackoverflow.com/questions/7404116/defining-the-midpoint-of-a-colormap-in-matplotlib
- Args:
cmap : The matplotlib colormap to be altered.
start : Offset from lowest point in the colormap's range. Should be between 0 and midpoint.
midpoint : The new center of the colormap. Should be between 0.0 and 1.0. Recommended is 1 - vmax/(vmax + abs(vmin)).
stop : Offset from highest point in the colormap's range. Should be between midpoint and 1.0.
name : str. Added to the name of cmap when creating the new colormap.
Example:
The following creates a shifted colormap that can be used in matplotlib arguments,
>>> orig_cmap = matplotlib.cm.coolwarm
>>> shifted_cmap = shiftedColorMap(orig_cmap, midpoint=0.75, name='shifted')
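As a follow-up sketch, the recommended midpoint formula for hypothetical data spanning vmin=-1 to vmax=3 places zero a quarter of the way up the map:

import numpy as np
import matplotlib
from matplotlib import pyplot
from pypec.modplot import shiftedColorMap

data = np.random.uniform(-1, 3, (20, 20))  # hypothetical data
vmin, vmax = -1.0, 3.0
cmap = shiftedColorMap(matplotlib.cm.coolwarm,
                       midpoint=1 - vmax / (vmax + abs(vmin)))  # = 0.25
pyplot.pcolormesh(data, cmap=cmap, vmin=vmin, vmax=vmax)
pyplot.colorbar()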
- pypec.modplot.subplots(nrows=1, ncols=1, sharex=False, sharey=False, squeeze=True, autosize=True, subplot_kw=None, powerlim=(-3, 3), useOffset=False, nybins=None, nxbins=None, **fig_kw)[source]¶
Matplotlib subplots with a default figsize of rcp_size*[ncols, nrows], applied via fig_kw.setdefault('figsize', figsize).
- Additional Key Word Arguments:
- powerlim : tuple.
Axis labels use scientific notation above 10^power.
- useOffset : bool.
Axis labels use offset if range<<average.
Accepts the standard args and kwargs of pyplot.subplots, as sketched below.
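For example, a minimal sketch of the wrapper's extra keywords (data values hypothetical):

from pypec import modplot

# scientific notation kicks in beyond 10^±2 instead of the default 10^±3
fig, ax = modplot.subplots(powerlim=(-2, 2), useOffset=False)
ax.plot([0, 1, 2], [0, 1e-4, 4e-4])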
ORIGINAL DOCUMENTATION
Create a figure and a set of subplots.
This utility wrapper makes it convenient to create common layouts of subplots, including the enclosing figure object, in a single call.
- nrows, ncols : int, default: 1
Number of rows/columns of the subplot grid.
- sharex, sharey : bool or {'none', 'all', 'row', 'col'}, default: False
Controls sharing of properties among x (sharex) or y (sharey) axes:
True or ‘all’: x- or y-axis will be shared among all subplots.
False or ‘none’: each subplot x- or y-axis will be independent.
‘row’: each subplot row will share an x- or y-axis.
‘col’: each subplot column will share an x- or y-axis.
When subplots have a shared x-axis along a column, only the x tick labels of the bottom subplot are created. Similarly, when subplots have a shared y-axis along a row, only the y tick labels of the first column subplot are created. To later turn other subplots’ ticklabels on, use ~matplotlib.axes.Axes.tick_params.
- squeeze : bool, default: True
If True, extra dimensions are squeezed out from the returned array of ~matplotlib.axes.Axes:
if only one subplot is constructed (nrows=ncols=1), the resulting single Axes object is returned as a scalar.
for Nx1 or 1xM subplots, the returned object is a 1D numpy object array of Axes objects.
for NxM, subplots with N>1 and M>1 are returned as a 2D array.
If False, no squeezing at all is done: the returned Axes object is always a 2D array containing Axes instances, even if it ends up being 1x1.
- subplot_kw : dict, optional
Dict with keywords passed to the ~matplotlib.figure.Figure.add_subplot call used to create each subplot.
- gridspec_kw : dict, optional
Dict with keywords passed to the ~matplotlib.gridspec.GridSpec constructor used to create the grid the subplots are placed on.
- **fig_kw
All additional keyword arguments are passed to the .pyplot.figure call.
fig : ~.figure.Figure
- ax : ~.axes.Axes or array of Axes
ax can be either a single ~matplotlib.axes.Axes object or an array of Axes objects if more than one subplot was created. The dimensions of the resulting array can be controlled with the squeeze keyword, see above.
Typical idioms for handling the return value are:
# using the variable ax for a single Axes
fig, ax = plt.subplots()

# using the variable axs for multiple Axes
fig, axs = plt.subplots(2, 2)

# using tuple unpacking for multiple Axes
fig, (ax1, ax2) = plt.subplots(1, 2)
fig, ((ax1, ax2), (ax3, ax4)) = plt.subplots(2, 2)
The names ax and pluralized axs are preferred over axes because for the latter it's not clear if it refers to a single ~.axes.Axes instance or a collection of these.
See also: .pyplot.figure, .pyplot.subplot, .pyplot.axes, .Figure.subplots, .Figure.add_subplot
# First create some toy data:
x = np.linspace(0, 2*np.pi, 400)
y = np.sin(x**2)

# Create just a figure and only one subplot
fig, ax = plt.subplots()
ax.plot(x, y)
ax.set_title('Simple plot')

# Create two subplots and unpack the output array immediately
f, (ax1, ax2) = plt.subplots(1, 2, sharey=True)
ax1.plot(x, y)
ax1.set_title('Sharing Y axis')
ax2.scatter(x, y)

# Create four polar axes and access them through the returned array
fig, axs = plt.subplots(2, 2, subplot_kw=dict(polar=True))
axs[0, 0].plot(x, y)
axs[1, 1].scatter(x, y)

# Share a X axis with each column of subplots
plt.subplots(2, 2, sharex='col')

# Share a Y axis with each row of subplots
plt.subplots(2, 2, sharey='row')

# Share both X and Y axes with all subplots
plt.subplots(2, 2, sharex='all', sharey='all')

# Note that this is the same as
plt.subplots(2, 2, sharex=True, sharey=True)

# Create figure number 10 with a single subplot
# and clears it if it already exists.
fig, ax = plt.subplots(num=10, clear=True)
regression.compare
– For Comparing Results¶
Python modules for regression testing and benchmarking GPEC code developments.
- Author
N.C. Logan
- Location
Princeton Plasma Physics Laboratory
Interface¶
The compare submodules can be run as executables from a linux terminal, taking one or more directory paths as arguments. Each check function will be called for the directories and the plots displayed. Be warned: this may create a large number of plots.
For example,
$ ./compare_dcons.py /p/gpec/GPEC-0.4/docs/examples/DIIID_example /p/gpec/GPEC-1.0/docs/examples/DIIID_ideal_example
Individual checks can be called from an interactive python environment (see pypec).
For example,
$ ipython --pylab
>>> from regression.compare import compare_dcons
>>> data = compare_dcons.check_energies('/p/gpec/GPEC-0.4/docs/examples/DIIID_example','/p/gpec/GPEC-1.0/docs/examples/DIIID_ideal_example')
/p/gpec/GPEC-0.4/d… /p/gpec/GPEC-1.0/d…
vacuum +2.4850e+00 +2.5020e+00
>>> pyplot.gcf().savefig('examples/figures/example_compare_dcons.png')