ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Welcome to the ASCL

The Astrophysics Source Code Library (ASCL) is a free online registry for source codes of interest to astronomers and astrophysicists and lists codes that have been used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and is citable by using the unique ascl ID assigned to each code. The ascl ID can be used to link to the code entry by prefacing the number with ascl.net (e.g., ascl.net/1201.001).
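
As a minimal illustration of this linking convention (a helper of this kind is not part of the ASCL itself), an ascl ID can be turned into its ascl.net link as follows:

    # Build the ascl.net link for a given ascl ID, e.g. "1201.001".
    def ascl_link(ascl_id: str) -> str:
        return "https://ascl.net/" + ascl_id

    print(ascl_link("1201.001"))  # -> https://ascl.net/1201.001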


Most Recently Added Codes

2017 Jun 23

[submitted] SASRST: Semi-Analytic Solutions for 1-D Radiative Shock Tubes

This small collection of Python scripts attempts to reproduce the semi-analytical one-dimensional equilibrium and non-equilibrium radiative shock tube solutions of Lowrie & Rauenzahn (2007, Shock Waves, 16, 445-453) and Lowrie & Edwards (2008, Shock Waves, 18, 129-143), respectively. The included code not only calculates the solution for a given set of input parameters, but also plots the results (using Matplotlib). This software was written to provide validation for numerical radiative shock tube solutions produced by a radiation hydrodynamics code, as exemplified in Ramsey & Dullemond (2015, A&A, 574, A81).

2017 Jun 22

[submitted] KERN

KERN is a bi-annually released set of radio astronomical software packages. It should contain most of the standard tools that a radio astronomer needs to work with radio telescope data. The goal of KERN is to save time and frustration in setting up scientific pipelines, and to assist in achieving scientific reproducibility.

[submitted] Kliko - The Scientific Compute Container Format

We present Kliko, a Docker-based container specification for running one or multiple related compute jobs. The key concepts of Kliko are the encapsulation of data processing software into a container and the formalisation of the input, output and task parameters. Formalisation is realised by bundling a container with a Kliko file, which describes the IO and task parameters. This Kliko container can then be opened and run by a Kliko runner. The Kliko runner parses the Kliko definition and gathers the values for these parameters, for example by requesting user input or from predefined values in a script. Parameters can be of various primitive types, for example a float, an int or the path to a file. This paper also discusses the implementation of a support library, also named Kliko, which can be used to create Kliko containers, parse Kliko definitions, and chain Kliko containers in workflows using, for example, the Luigi workflow manager. The Kliko library can be used inside the container to interact with the Kliko runner. Finally, this paper discusses two reference implementations based on Kliko: RODRIGUES, a web-based Kliko container scheduler and output visualiser specifically for astronomical data, and VerMeerKAT, a multi-container workflow data reduction pipeline which is being used as a prototype pipeline for the commissioning of the MeerKAT radio telescope.
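
The parameter formalisation described above can be sketched roughly as below; the file layout, field names and casting logic are assumptions made for illustration and are not the actual Kliko schema or library API.

    # Illustrative sketch only: a runner-like loop that reads a hypothetical
    # Kliko-style definition and collects parameter values from the user.
    import yaml  # PyYAML

    definition = yaml.safe_load("""
    parameters:
      - name: threshold
        type: float
      - name: image
        type: file
    """)

    casts = {"float": float, "int": int, "file": str}

    values = {}
    for par in definition["parameters"]:
        raw = input("Value for {name} ({type}): ".format(**par))
        values[par["name"]] = casts[par["type"]](raw)

    print(values)  # these values would be handed to the containerised task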

2017 Jun 21

[submitted] EXOSIMS: Exoplanet Open-Source Imaging Mission Simulator

EXOSIMS is an extensible, modular, open source software framework written in Python for generating and analyzing end-to-end simulations of space-based exoplanet imaging missions. The software is built from interconnecting modules describing different aspects of the mission, including the observatory, the optical system, and the scheduler (encoding mission rules), as well as the physical universe, including the assumed distribution of exoplanets and their physical and orbital properties. Each module has a prototype implementation that is inherited by specific implementations for different mission concepts, allowing for the simulation of widely variable missions. EXOSIMS development is supported by NASA Grant Nos. NNX14AD99G (GSFC), NNX15AJ67G (WPS) and NNG16PJ24C (SIT).
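
The prototype/implementation inheritance described above can be sketched as follows; the class and method names are invented for this illustration and are not EXOSIMS's actual API.

    # Illustration of the prototype-inheritance pattern described above.
    class ObservatoryPrototype:
        """Generic observatory module with default behaviour."""
        def orbit(self, time):
            return (0.0, 0.0, 0.0)  # placeholder spacecraft position

    class MyMissionObservatory(ObservatoryPrototype):
        """Mission-specific implementation overriding only what differs."""
        def orbit(self, time):
            # a particular mission concept might swap in, say, an
            # Earth-trailing orbit model here
            return (1.0 * time, 0.0, 0.0)

    # A simulation assembles whichever implementations a mission needs:
    modules = {"Observatory": MyMissionObservatory()}
    print(modules["Observatory"].orbit(1.5))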

2017 Jun 20

[submitted] Light Curves Classifier

In this era of Big Data, enormous amounts of data are collected every day. Light curves are among the most common products of astronomical observations, and the most fundamental task is to classify them, that is, to identify what kind of objects have been observed. Despite efforts to categorize particular light curves, there has been no tool that unifies the procedures related to classification into one powerful instrument.

We present the Light Curves Classifier, a "self-learning" program which uses modern data mining and machine learning methods to obtain and classify desired objects. Classification can be based on attributes of the light curves (or any time series), such as shapes, histograms and variograms, or on other available information about the inspected objects, such as color indices, temperatures and abundances. After the features that describe the searched objects are specified, the program can be trained on a given training sample; unsupervised clustering can also be used to visualize the natural separation of the sample. The package can also automatically tune the parameters of the methods used (for example, the number of hidden neurons or the binning ratio).

Trained classifiers can be used to filter the output of astronomical databases or data stored locally; the tool can also be used simply to download light curves and all available information about queried stars. Several connectors are available (OgleII, OgleIII, ASAS, CoRoT, Kepler, Catalina and MACHO), and there is no limit on adding new connectors or descriptors; using the interfaces for TAP and VizieR databases, new connectors can be implemented in just a few lines of code (the MACHO connector, for example, takes only seven lines). All these databases share a common interface that can be used for unified queries with standardized output.

Besides direct use of the package and its command-line UI, the program can be used through a web interface. Users can create jobs for training methods on given objects, querying databases and filtering outputs with trained filters. Pre-implemented descriptors, classifiers and connectors can be selected with simple clicks, and their parameters can be tuned by giving ranges of values; all combinations are then calculated and the best one is used to create the filter. The natural separation of the data can be visualized by unsupervised clustering: one can click on points representing objects in the feature space and view their light curves and other information. For visualization, higher-dimensional feature spaces can be reduced by PCA.
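
A minimal sketch of the train-then-filter workflow described above is given below; the feature choices and function names are invented for illustration and are not the Light Curves Classifier's actual API.

    # Generic train-then-filter sketch on light-curve features.
    import numpy as np

    def features(lc):
        """Describe a light curve by a few simple attributes (assumed choice)."""
        mag = np.asarray(lc)
        return np.array([mag.std(), mag.max() - mag.min(), np.median(mag)])

    def train(positives, negatives):
        """A minimal 'classifier': centroids of the two training samples."""
        return (np.mean([features(lc) for lc in positives], axis=0),
                np.mean([features(lc) for lc in negatives], axis=0))

    def is_searched_object(lc, centroids):
        pos, neg = centroids
        f = features(lc)
        return np.linalg.norm(f - pos) < np.linalg.norm(f - neg)

    # Filter a queried sample with the trained filter:
    rng = np.random.default_rng(0)
    variables = [np.sin(np.linspace(0, 20, 200)) + rng.normal(0, 0.1, 200)
                 for _ in range(5)]
    constants = [rng.normal(0, 0.1, 200) for _ in range(5)]
    model = train(variables, constants)
    passed = [lc for lc in variables + constants if is_searched_object(lc, model)]
    print(len(passed), "objects pass the filter")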

2017 Jun 16

[submitted] PyMOC: Multi-Order Coverage map module for Python

PyMOC manipulates Multi-Order Coverage (MOC) maps. It supports reading and writing the three encodings mentioned in the IVOA MOC recommendation: FITS, JSON and ASCII.
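
A short usage sketch follows; the method names reflect our understanding of PyMOC's interface and should be checked against the package documentation.

    # Reading and writing the three IVOA MOC encodings with PyMOC.
    from pymoc import MOC

    moc = MOC()
    moc.add(10, (1234, 1235, 1236))   # add HEALPix cells at order 10

    # Write the same coverage map in the three encodings:
    moc.write('coverage.fits', filetype='fits')
    moc.write('coverage.json', filetype='json')
    moc.write('coverage.txt',  filetype='ascii')

    # Read one back into a new map:
    copy = MOC()
    copy.read('coverage.fits')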

2017 Jun 13

[ascl:1706.009] sick: Spectroscopic inference crank

sick infers astrophysical parameters from noisy observed spectra. Phenomena that can alter the data (e.g., redshift, continuum, instrumental broadening, outlier pixels) are modeled and simultaneously inferred with the astrophysical parameters of interest. This package relies on emcee (ascl:1303.002); it is best suited for situations where a grid of model spectra already exists, and one would like to infer model parameters given some data.
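
The kind of ensemble-sampler inference sick performs can be sketched with emcee directly, as below; this is not sick's own interface, and the toy absorption-line model is invented for illustration.

    # Generic emcee sketch: infer model parameters from a noisy spectrum.
    import numpy as np
    import emcee

    wave = np.linspace(4000, 5000, 200)
    data = 1.0 - 0.5 * np.exp(-0.5 * ((wave - 4500) / 2.0) ** 2)
    data += np.random.normal(0, 0.01, wave.size)

    def model(theta):
        depth, center = theta
        return 1.0 - depth * np.exp(-0.5 * ((wave - center) / 2.0) ** 2)

    def log_prob(theta):
        if not (0 < theta[0] < 1 and 4000 < theta[1] < 5000):
            return -np.inf
        return -0.5 * np.sum(((data - model(theta)) / 0.01) ** 2)

    ndim, nwalkers = 2, 16
    p0 = [0.5, 4500] + 1e-3 * np.random.randn(nwalkers, ndim)
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
    sampler.run_mcmc(p0, 500)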

[ascl:1706.008] the-wizz: Clustering redshift estimation code

the-wizz produces clustering redshift estimates for any unknown photometric sample in a survey. The software is composed of two main parts: a pair finder and a pdf maker. The pair finder finds spatial pairs and stores the indices of all close pairs around target reference objects in an output HDF5 data file. Users then query this data file with the indices of their unknown sample to produce an output clustering-z.
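
The two-stage idea (store pair indices in HDF5, then query them with the indices of an unknown sample) can be sketched as below; the dataset names are invented and are not the-wizz's actual file layout.

    # Illustrative h5py sketch of querying a pair file with unknown-sample indices.
    import h5py
    import numpy as np

    unknown_ids = np.array([10, 42, 77])       # indices of the unknown sample

    with h5py.File('pairs.hdf5', 'r') as f:    # hypothetical pair-finder output
        pair_ids = f['reference_0/pair_indices'][:]

    # Keep only pairs whose unknown-object index is in our sample:
    mask = np.isin(pair_ids, unknown_ids)
    print(mask.sum(), "pairs contribute to this reference object's clustering-z")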

2017 Jun 11

[ascl:1706.007] encube: Large-scale comparative visualization and analysis of sets of multidimensional data

Encube is a qualitative, quantitative and comparative visualization and analysis framework, with application to high-resolution, immersive three-dimensional environments and desktop displays, providing a capable visual analytics experience across the display ecology. Encube includes mechanisms for the support of: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. The framework is modular, allowing additional functionalities to be included as required.

[ascl:1706.006] GenPK: Power spectrum generator

GenPK generates the 3D matter power spectra for each particle species from a Gadget snapshot. Written in C++, it requires both FFTW3 and GadgetReader.