ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

Results 801-900 of 3475 (3391 ASCL, 84 submitted)

[ascl:2110.009] Quokka: Two-moment AMR radiation hydrodynamics on GPUs for astrophysics

Quokka is a two-moment radiation hydrodynamics code that uses the piecewise-parabolic method, with AMR and subcycling in time. It runs on CPUs (MPI+vectorized) or NVIDIA GPUs (MPI+CUDA) with a single-source codebase. The hydrodynamics solver is an unsplit method, using the piecewise parabolic method for reconstruction in the primitive variables, the HLLC Riemann solver for flux computations, and a method-of-lines formulation for the time integration. The order of reconstruction is reduced in zones where shocks are detected in order to suppress spurious oscillations in strong shocks. Quokka's radiation hydrodynamics formulation is based on the mixed-frame moment equations. The radiation subsystem is coupled to the hydrodynamic subsystem via operator splitting, with the hydrodynamic update computed first, followed by the radiation update, with the latter update including the source terms corresponding to the radiation four-force applied to both the radiation and hydrodynamic variables. A method-of-lines formulation is also used for the time integration, with the time integration done by the same integrator chosen for the hydrodynamic subsystem.

[ascl:2110.008] ParSNIP: Parametrization of SuperNova Intrinsic Properties

ParSNIP learns generative models of transient light curves from large datasets of such light curves. It is designed to work with light curves in sncosmo format using the lcdata package to handle large datasets. This code can be used for classification of transients, cosmological distance estimation, and identifying novel transients.

[ascl:2110.007] PISCOLA: Python for Intelligent Supernova-COsmology Light-curve Analysis

PISCOLA (Python for Intelligent Supernova-COsmology Light-curve Analysis) fits supernova light curves and corrects them in a few lines of code. It uses Gaussian Processes to estimate rest-frame light curves of transients without needing an underlying light-curve template. The user can add filters, calculate light-curve parameters, and obtain transmission functions for the observed filters and the Bessell filters. The correction process can be applied with default settings to obtain rest-frame light curves and light-curve parameters. PISCOLA can plot the SN light curves, filter transmission functions, light-curve fit results, and the mangling function for a given phase, and includes several utilities that can, for example, convert fluxes to magnitudes and magnitudes to fluxes, and trim leading and trailing zeros from a 1-D array or sequence.

[ascl:2110.006] ArtPop: Artificial Stellar Populations generator

ArtPop (Artificial Stellar Populations) synthesizes stellar populations and simulates realistic images of stellar systems. The code is modular, making it possible to use each of its functionalities independently or together. ArtPop can build stellar populations independently from generating mock images, as one might want to do when interested only in calculating integrated photometric properties of the population. The code can also generate stellar magnitudes and artificial galaxies, which can be injected into real imaging data.

[ascl:2110.005] TauRunner: Code to propagate tau neutrinos at very high energies

TauRunner propagates ultra-high-energy neutrinos, with a focus on tau neutrinos. Although it was developed for extremely high energy (EeV+) applications, it is able to propagate neutrinos from 1 to 10^16 GeV. Oscillations are not taken into account at the lowest energies, but they become negligible above 1 TeV.

[ascl:2110.004] TULIPS: Tool for Understanding the Lives, Interiors, and Physics of Stars

TULIPS (Tool for Understanding the Lives, Interiors, and Physics of Stars) creates diagrams of the structure and evolution of stars. It creates plots and movies based on output from the MESA stellar evolution code (ascl:1010.083). TULIPS represents stars as circles of varying size and color. The code can also visualize the size and perceived color of stars, their interior mixing and nuclear burning processes, their chemical composition, and can compare different MESA models.

[ascl:2110.003] PSRDADA: Distributed Acquisition and Data Analysis for Radio Astronomy

PSRDADA supports the development of distributed data acquisition and analysis systems; it provides a flexible and well-managed ring buffer in shared memory with a variety of applications for piping data from device to ring buffer and from ring buffer to device. PSRDADA allows more than one data set to be queued in the ring buffer at one time, and data may be recorded in selected bursts using data validity flags. A variety of clients have been implemented that can write data to the ring buffer and read data from it. The primary write clients can be controlled via a simple, text-based socket interface, and read client software exists for writing data to an array of disks, sending data to an array of nodes, or processing the data directly from RAM. At the highest level of control and configuration, scripts launch the PSRDADA configuration across all nodes in the cluster, monitor all relevant processes, configure and control through a web-based interface, interface with observatory scheduling tools, and manage the ownership and archival of project data. It has been used in the implementation of baseband recording and processing instrumentation for radio pulsar astronomy.

[ascl:2110.002] exodetbox: Finding planet-star projected separation extrema and difference in magnitude extrema

Exodetbox provides mathematical methods for calculating the planet-star separation and difference in magnitude extrema as well as when planets have particular planet-star separations or differences in magnitude. The code also projects the 3D Keplerian Orbit into a reparameterized 2D ellipse in the plane of the sky. Exodetbox is implemented in the EXOSIMS modeling software (ascl:1706.010).

[ascl:2110.001] JWSTSim: Geometric-Focused JWST Deep Field Image Simulation

JWST_Simulation generates a novel geometric-focused deep field simulation of the expected JWST future deep field image. Galaxies are represented by ellipses with randomly-generated positions and orientations. Three scripts are included: a deterministic simulation, an ensemble simulation, and a more-realistic monochrome image simulation. The following initial conditions can be perturbed in these codes: H0, Ωm, ΩΛ, the dark energy equation of state parameter, the number of unseen galaxies in the Hubble Ultra Deep Field Image (HUDF), the increase in effective radius due to the JWST’s higher sensitivity, the anisotropy of dark energy, and the maximum redshift reached by the JWST. Galaxy number densities are estimated using integration over comoving volume with an integration constant calibrated with the Hubble Ultra Deep Field. A galaxy coverage percentage is calculated for each image to determine the percentage of the background occupied by galaxies.

[ascl:2109.030] Snowball: Generalizable atmospheric mass loss calculator

Snowball models atmospheric loss in order to constrain the cumulative impact of historic X-ray and extreme ultraviolet radiation-driven mass loss on an atmosphere. The escape model interpolates the BaSTI luminosity evolution grid to the observed mass and luminosity of the host star.

[ascl:2109.029] BiPoS1: Dynamical processing of the initial binary star population

BiPoS1 (Binary Population Synthesizer) efficiently calculates binary distribution functions after the dynamical processing of a realistic population of binary stars during the first few Myr in the hosting embedded star cluster. It is particularly useful for generating a realistic birth binary population as an input for N-body simulations of globular clusters. Instead of time-consuming N-body simulations, BiPoS1 uses the stellar dynamical operator, which determines the fraction of surviving binaries depending on the binding energy of the binaries. The stellar dynamical operator depends on the initial star cluster density, as well as the time until the residual gas of the star cluster is expelled. At the time of gas expulsion, the dynamical processing of the binary population is assumed to effectively end due to the expansion of the star cluster related to that event. BiPoS1 has also a galactic-field mode, in order to synthesize the stellar population of a whole galaxy.

[ascl:2109.028] Healpix.jl: Julia-only port of the HEALPix library

Healpix.jl is a Julia-only port of the C/C++/Fortran/Python HEALPix library (ascl:1107.018), which implements a hierarchical pixelization of the sphere in equal-area pixels. Much like the original library, Healpix.jl supports two enumeration schemes for the pixels (RING and NESTED) and implements an optimized computation of the generalized Fourier transform using spherical harmonics, binding libsharp2 (ascl:1402.033). In addition, Healpix.jl provides four additional features: 1.) it fully supports Windows systems, alongside the usual Linux and Mac OS X machines; 2.) it uses Julia's strong type system to prevent several bugs related to mismatches in map ordering (e.g., combining a RING map with a NESTED map); 3.) it uses a versatile memory layout so that map bytes can be stored in shared memory objects or on GPUs; and 4.) it implements an elegant and general way to signal missing values in maps.

[ascl:2109.027] OSPREI: Sun-to-Earth (or satellite) CME simulator

OSPREI simulates the Sun-to-Earth (or satellite) behavior of CMEs. It is comprised of three separate models: ForeCAT, ANTEATR, and FIDO. ForeCAT uses the PFSS background to determine the external magnetic forces on a CME; ANTEATR takes the ForeCAT CME and propagates it to the final satellite distance, and outputs the final CME speed (both propagation and expansion), size, and shape (and their profiles with distance) as well as the arrival time and internal thermal and magnetic properties of the CME. FIDO takes the evolved CME from ANTEATR with the position and orientation from ForeCAT and passes the CME over a synthetic spacecraft. The relative location of the spacecraft within the CME determines the in situ magnetic field vector and velocity. It also calculates the Kp index from these values. OSPREI includes tools for creating figures from the results, including histograms, contour plots, and ensemble correlation plots, and new figures can be created using the results object that contains all the simulation data in an easily accessible format.

[ascl:2109.026] Varstar Detect: Variable star detection in TESS data

Varstar Detect uses several numerical and statistical methods to filter and interpret the data obtained from TESS. It performs an amplitude test to determine whether a star is variable and if so, provides the characteristics of each star through phenomenological analysis of the lightcurve.

[ascl:2109.025] Menura: Multi-GPU numerical model for space plasma simulation

Menura simulates the interaction between a fully turbulent solar wind and various bodies of the solar system using a novel two-step approach. It is an advanced numerical tool for self-consistent modeling that bridges planetary science and plasma physics. Menura is built around a hybrid Particle-In-Cell solver, treating electrons as a charge-neutralising fluid, and ions as massive particles. It solves iteratively the particles’ dynamics, gathers particle moments at the nodes of a grid, at which the magnetic field is also computed, and then solves the Maxwell equations. This solver uses the popular Current Advance Method (CAM).

[ascl:2109.024] BHJet: Semi-analytical black hole jet model

BHJet models steady-state SEDs of jets launched from accreting black holes. This semi-analytical, multi-zone jet model is applicable across the entire black hole mass scale, from black hole X-ray binaries (both low and high mass) to active galactic nuclei of any class (from low-luminosity AGN to flat spectrum radio quasars). It is designed to be more comparable than other codes to GRMHD simulations and/or RMHD semi-analytical solutions.

[ascl:2109.023] gphist: Cosmological expansion history inference using Gaussian processes

gphist performs Bayesian inference on the cosmological expansion history using Gaussian process priors. It is written in Python and includes driver programs to run inference calculations and plot the results. The code infers the cosmological expansion history using a Gaussian process prior, reads the resulting outputs, and performs checks to ensure they are indeed compatible. gphist then generates a single combined output file to plot expansion history inferences.

[ascl:2109.022] ShapeMeasurementFisherFormalism: Fisher Formalism for Weak Lensing

ShapeMeasurementFisherFormalism is used to study Fisher Formalism predictions on galaxy weak lensing for the LSST Dark Energy Science Collaboration. It can create predictions with user-defined parameters for one or two galaxies simulated from GalSim (ascl:1402.009).

[ascl:2109.021] WeakLensingDeblending: Weak lensing fast simulations and analysis of blended objects

WeakLensingDeblending provides weak lensing fast simulations and analysis for the LSST Dark Energy Science Collaboration. It is used to study the effects of overlapping sources on shear estimation, photometric redshift algorithms, and deblending algorithms. Users can run their own simulations (of LSST and other surveys) or download the galaxy catalog and simulation outputs to use with their own code.

[ascl:2109.020] SNEWPY: Supernova Neutrino Early Warning Models for Python

SNEWPY uses simulated supernova data to generate a time series of neutrino spectral fluences at Earth or the total time-integrated spectral fluence. The code can also process generated data through SNOwGLoBES (ascl:2109.019) and collate its output into the observable channels of each detector. Data from core-collapse, thermonuclear, and pair-instability supernova simulations are included in the package.

[ascl:2109.019] SNOwGLoBES: SuperNova Observatories with GLoBES

SNOwGLoBES (SuperNova Observatories with GLoBES) computes interaction rates and distributions of observed quantities for supernova burst neutrinos in common detector materials. It provides a simple and fast code and data package for tests of the observability of physics signatures in current and future detectors, and for evaluation of the relative sensitivities of different detector configurations. The event estimates are made using available cross-sections and parameterized detector responses. Water, argon, scintillator, and lead-based configurations are included. The package makes use of GLoBES (ascl:2109.018). SNOwGLoBES is not intended to replace full detector simulations; however, its output should be useful for many types of studies, and simulation results can be incorporated.

[ascl:2109.018] GLoBES: General Long Baseline Experiment Simulator

GLoBES simulates long baseline neutrino oscillation experiments. The package features full incorporation of correlations and degeneracies in the oscillation parameter space, advanced routines for the treatment of arbitrary systematical errors, and user-defined priors, which allows for the inclusion of arbitrary external physical information. Its use of AEDL, the Abstract Experiment Definition Language, provides an easy way to define experimental setups. GLoBES also provides an interface for the simulation of non-standard physics, and offers predefined setups for many experiments, including Superbeams, Beta Beams, Neutrino factories, Reactors, and various detector technologies.

[ascl:2109.017] HTOF: Astrometric solutions for Hipparcos and Gaia intermediate data

HTOF parses the intermediate data from Hipparcos and Gaia and fits astrometric solutions to those data. It computes likelihoods and parameter errors in line with the catalog and can reproduce five, seven, and nine (or higher) parameter fits to their astrometry.

[ascl:2109.016] SkyPy: Simulating the astrophysical sky

SkyPy simulates the astrophysical sky. It provides functions that sample realizations of sources and their associated properties from probability distributions. Simulation pipelines are constructed from these models, while task scheduling and data dependencies are handled internally. The package's modular design, containing a library of physical and empirical models across a range of observables and a command line script to run end-to-end simulations, allows users to interface with external software.

[ascl:2109.015] unpopular: Using CPM detrending to obtain TESS light curves

unpopular is an implementation of the Causal Pixel Model (CPM) de-trending method to obtain TESS Full-Frame Image (FFI) light curves. The code, written in Python, models the systematics in the light curves of individual pixels as a linear combination of light curves from many other distant pixels and removes shared flux variations. unpopular is able to preserve sector-length astrophysical signals, allowing for the extraction of multi-sector light curves from the FFI data.

[ascl:2109.014] HSS: The Hough Stream Spotter

The Hough Stream Spotter (HSS) is a stream finding code which transforms individual positions of stars to search for linear structure in discrete data sets. The code requires only the two-dimensional plane of galactic longitude and latitude as input.

[ascl:2109.013] WimPyDD: WIMP direct-detection rates predictor

WimPyDD calculates accurate predictions for the expected rates in WIMP direct-detection experiments within the framework of Galilean-invariant non-relativistic effective theory. The object-oriented, customizable Python code handles different scenarios, including inelastic scattering, WIMPs of arbitrary spin, and a generic WIMP velocity distribution in the Galactic halo.

[ascl:2109.012] STAR-MELT: STellar AccRetion Mapping with Emission Line Tomography

STAR-MELT extracts and identifies emission lines from FITS files by matching to a compiled reference database of lines. Line profiles are fitted and quantified, allowing for calculations of physical properties across each individual observation. Temporal variations in lines can readily be displayed and quantified. STAR-MELT is also useful for different applications of spectral analysis where emission line identification is required. Standard data formats for spectra are automatically compatible, with user-defined custom formats also available. Any reference database (atomic or molecular) can also be used for line identification.

[ascl:2109.011] Rubble: Simulating dust size distributions in protoplanetary disks

Rubble implicitly models the local evolution of dust distributions in size, mass, and surface density by solving the Smoluchowski equation (also known as the coagulation-fragmentation equation) under given disk conditions. The Python package's robustness has been validated by a suite of numerical benchmarks against known analytical and empirical results. Rubble can model prescribed physical processes such as bouncing, modulated mass transfer, regulated dust loss/supply, probabilistic collisional outcomes based on velocity distributions, and more. The package also includes a toolkit for analyzing and visualizing results produced by Rubble.

[ascl:2109.010] Frankenstein: Flux reconstructor

Frankenstein (frank) fits the 1D radial brightness profile of an interferometric source given a set of visibilities. It uses a Gaussian process that performs the fit in <1 minute for a typical protoplanetary disc continuum dataset. Frankenstein can perform a fit in 2 ways, by running the code directly from the terminal or using the code as a Python module.

[ascl:2109.009] pyFFTW: Python wrapper around FFTW

pyFFTW is a pythonic wrapper around FFTW (ascl:1201.015), the speedy FFT library. Both the complex DFT and the real DFT are supported, on arbitrary axes of arbitrarily shaped and strided arrays, which makes it almost feature-equivalent to the standard and real FFT functions of numpy.fft. Additionally, it supports the clongdouble dtype, which numpy.fft does not, and operating FFTW in multithreaded mode.
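
As a rough illustration of this interface, the sketch below (assuming pyFFTW's documented empty_aligned, interfaces.numpy_fft, and FFTW entry points behave as in recent releases) shows both the numpy.fft-like wrapper and a pre-planned, multithreaded FFTW object:

    import numpy as np
    import pyfftw
    from pyfftw.interfaces import numpy_fft

    # numpy.fft-like interface: a near drop-in replacement
    a = pyfftw.empty_aligned(1024, dtype="complex128")
    a[:] = np.random.standard_normal(1024) + 1j * np.random.standard_normal(1024)
    spectrum = numpy_fft.fft(a)

    # Lower-level FFTW object: plan once (planning may overwrite the arrays),
    # then fill the input and execute the transform, optionally multithreaded.
    x = pyfftw.empty_aligned(1024, dtype="complex128")
    y = pyfftw.empty_aligned(1024, dtype="complex128")
    plan = pyfftw.FFTW(x, y, threads=4)
    x[:] = a
    plan()  # result is written into y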

[ascl:2109.008] pyia: Python package for working with Gaia data

pyia provides tools for working with Gaia data. It accesses Gaia data columns as Quantity objects, i.e., with units (e.g., data.parallax will have units ‘milliarcsecond’), constructs covariance matrices for Gaia data, and generates random samples from the Gaia error distribution per source. pyia can also create SkyCoord objects from Gaia data and execute simple (small) remote queries via the Gaia science archive and automatically fetch the results.
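
A minimal sketch of the kind of workflow described above; GaiaData is pyia's main entry point, but the file name is a placeholder and the helper names used here (get_cov, get_error_samples, skycoord) are assumptions that should be checked against the installed version:

    from pyia import GaiaData

    # GaiaData wraps a table of Gaia columns; attribute access returns Quantity objects
    g = GaiaData("gaia_sources.fits")          # placeholder file name

    parallax = g.parallax                      # Quantity with milliarcsecond units
    cov = g.get_cov()                          # per-source astrometric covariance matrices (assumed name)
    samples = g.get_error_samples(size=256)    # draws from the Gaia error distribution (assumed name)
    coords = g.skycoord                        # astropy SkyCoord built from the Gaia columns (assumed name)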

[ascl:2109.007] SkyCalc_ipy: SkyCalc wrapper for interactive Python

SkyCalc-iPy (SkyCalc for interactive Python) accesses atmospheric emission and transmission data generated by ESO’s SkyCalc tool interactively with Python. This package is based on the command line tool by ESO for accessing spectra on the ESO SkyCalc server.

[ascl:2109.006] eMCP: e-MERLIN CASA pipeline

The e-MERLIN CASA Pipeline calibrates and processes data from the e-MERLIN radio interferometer. It works on top of CASA (ascl:1107.013) and can convert, concatenate, prepare, flag, and calibrate raw data to produce advanced calibrated products for both continuum and spectral line data. The main outputs are calibration tables, calibrated data, assessment plots, preliminary images of target and calibrator sources, and a summary weblog. The pipeline provides an easy, ready-to-use toolkit that delivers calibrated data in a consistent, clear, and repeatable way. A parameters file is used to control the pipeline execution, so optimization of the algorithms is straightforward and reproducible. Good quality images are usually obtained with minimum human intervention.

[ascl:2109.005] SoFiA 2: An automated, parallel HI source finding pipeline

SoFiA 2 is a fully automated spectral-line source finding pipeline originally intended for the detection of galaxies in large HI data cubes. It is a reimplementation of parts of the original SoFiA pipeline (ascl:1412.001) in the C programming language and uses OpenMP for multithreading, making it substantially faster and more memory-efficient than its predecessor. At its core, SoFiA 2 uses the Smooth + Clip algorithm for source finding which operates by spatially and spectrally smoothing the data on multiple scales and applying a user-defined flux threshold relative to the noise level in each iteration. A wide range of useful preconditioning and post-processing filters is available, including noise normalization, flagging of artifacts and reliability filtering. In addition to global data products and source catalogs in different formats, SoFiA 2 can also generate cutout images and spectra for each individual detection.

[ascl:2109.004] DviSukta: Spherically Averaged Bispectrum calculator

DviSukta calculates the Spherically Averaged Bispectrum (SABS). The code is based on an optimized direct estimation method, is written in C, and is parallelized. DviSukta starts by reading the real space gridded data and performing a 3D Fourier transform of it. Alternatively, it starts by reading the data already in Fourier space. The grid spacing, number of k1 bins, number of n bins, and number of cos(theta) bins need to be specified in the input file.

[ascl:2109.003] VOLKS2: VLBI Observation for transient Localization Keen Searcher

The VOLKS2 (VLBI Observation for transient Localization Keen Searcher) pipeline conducts single pulse searches and localization in regular VLBI observations as well as single pulse detections from known sources in dedicated observations. In VOLKS2, the search and localization are two independent steps. The search step takes the idea of geodetic VLBI post processing, which fully utilizes the cross spectrum fringe phase information to maximize the signal power. Compared with auto-spectrum-based methods, it is able to extract single pulses from highly RFI contaminated data. The localization uses the geodetic VLBI solving methods, which derive the single pulse location by solving a set of linear equations given the relation between the residual delay and the offset to the a priori position.

[ascl:2109.002] alpconv: Calculating alp-photon conversion

alpconv calculates the ALP-photon conversion by quantifying the degree of irregularity of the spectrum, in contrast to some other methods that fit the source's spectrum with both null and ALP models and then compare the goodness of fit between the two.

[ascl:2109.001] gammaALPs: Conversion probability between photons and axions/axionlike particles

gammaALPs calculates the conversion probability between photons and axions/axion-like particles in various astrophysical magnetic fields. Though focused on environments relevant to mixing between gamma rays and ALPs, this suite, written in Python, can also be used for broader applications. The code also implements various models of astrophysical magnetic fields, which can be useful for applications beyond ALP searches.

[submitted] Pyckles

A super lightweight interface in Python to load spectra from the Pickles 1998 (stellar) and Brown 2014 (galactic) spectral catalogues

[submitted] AnisoCADO

A python package created around Eric Gendron’s code for analytically (and quickly) generating field-varying SCAO PSFs for the ELT.

[submitted] ScopeSim Instrument Reference Database

A reference database for astronomical instrument and telescope characteristics for all types of visual and infrared systems. Instrument packages are used in conjunction with the ScopeSim instrument data simulator.

[submitted] ScopeSim Templates

Templates and helper functions for creating on-sky Source description objects for the ScopeSim instrument data simulation engine.

[submitted] ScopeSim

An attempt at creating a common pythonic framework for visual and infrared telescope instrument data simulators.

[ascl:2108.025] SORA: Stellar Occultation Reduction Analysis

SORA optimally analyzes stellar occultation data. The library includes processes starting on the prediction of such events to the resulting size, shape and position of the Solar System object and can be used to build pipelines to analyze stellar occultation data. A stellar occultation is defined by the occulting body (Body), the occulted star (Star), and the time of the occultation. On the other hand, each observational station (Observer) will be associated with their light curve (LightCurve). SORA has tasks that allow the user to determine the immersion and emersion times and project them to the tangent sky plane, using the information within the Observer, Body and Star Objects. That projection will lead to chords that will be used to obtain the object’s apparent size, shape and position at the moment of the occultation. Automatic processes optimize the reduction of typical events. However, users have full control over the parameters and methods and can make changes in every step of the process.

[ascl:2108.024] iminuit: Jupyter-friendly Python interface for C++ MINUIT2

iminuit is a Jupyter-friendly Python interface for the Minuit2 C++ library maintained by CERN's ROOT team. It can be used as a general robust function minimization method, but is most commonly used for likelihood fits of models to data, and to get model parameter error estimates from likelihood profile analysis.
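
The sketch below illustrates the likelihood-fit workflow described above, assuming the iminuit 2.x API (Minuit, cost.LeastSquares, and the migrad/hesse/minos calls); it is an illustrative example rather than code taken from the package documentation:

    import numpy as np
    from iminuit import Minuit
    from iminuit.cost import LeastSquares

    # Toy straight-line fit; LeastSquares builds a chi-square cost function
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 30)
    y = 1.0 + 2.0 * x + rng.normal(0.0, 0.1, x.size)

    def line(x, a, b):
        return a + b * x

    cost = LeastSquares(x, y, 0.1, line)
    m = Minuit(cost, a=0.0, b=0.0)
    m.migrad()   # run the minimizer
    m.hesse()    # parabolic (Hessian) error estimates
    m.minos()    # likelihood-profile (asymmetric) errors
    print(m.values["a"], m.errors["a"], m.merrors["a"])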

[ascl:2108.023] CMC-COSMIC: Cluster Monte Carlo code

CMC-COSMIC models dense star clusters using Hénon's method of orbit-averaged collisional stellar dynamics. It includes all the relevant physics for modeling dense spherical star clusters, such as strong dynamical encounters, single and binary stellar evolution, central massive black holes, three-body binary formation, and relativistic dynamics, among others. CMC is parallelized using the Message Passing Interface (MPI), and is pinned to the COSMIC (ascl:2108.022) package for binary population synthesis, which itself was originally based on a version of BSE (ascl:1303.014). COSMIC is currently a submodule within CMC, ensuring that any cluster simulations or binary populations are integrated with the same physics.

[ascl:2108.022] COSMIC: Compact Object Synthesis and Monte Carlo Investigation Code

COSMIC (Compact Object Synthesis and Monte Carlo Investigation Code) generates synthetic populations with an adaptive size based on how the shape of binary parameter distributions changes as the number of simulated binaries increases. It implements stellar evolution using SSE (ascl:1303.015) and binary interactions using BSE (ascl:1303.014). COSMIC can also be used to simulate a single binary at a time, a list of multiple binaries, a grid of binaries, or a fixed population size, as well as to restart binaries at a midpoint in their evolution. The code is included in CMC-COSMIC (ascl:2108.023).

[ascl:2108.021] ExoPlaSim: Exoplanet climate simulator

ExoPlaSim extends the PlaSim (ascl:2107.019) 3D general climate model to terrestrial exoplanets. It includes the PlaSim general circulation model and modifications that allow this code to run tidally-locked planets, planets with substantially different surface pressures than Earth, planets orbiting stars with different effective temperatures, super-Earths, and more. ExoPlaSim includes the ability to compute carbon-silicate weathering, dynamic orography through the glacier module (though only accumulation and ablation/evaporation/melting are included; glacial flow and spreading are not), and storm climatology.

[ascl:2108.020] DBSP_DRP: DBSP Data Reduction Pipeline

DBSP_DRP reduces data from the Palomar spectrograph DBSP. Built on top of PypeIt (ascl:1911.004), it automates the reduction, fluxing, telluric correction, and combining of the red and blue sides of one night's data. The pipeline also provides several GUIs for easier control of the reduction: one for selecting which data to reduce and verifying the correctness of FITS headers in an editable table, and another for manually placing traces for a sort of "forced" spectroscopy with the -m option; after the traces are placed, sky regions can be selected manually and the FWHM of the manual traces tweaked.

[ascl:2108.019] PIPS: Period detection and Identification Pipeline Suite

PIPS analyzes the lightcurves of astronomical objects whose brightness changes periodically. Originally developed to determine the periods of RR Lyrae variable stars, the code offers many features designed for variable star analysis and can obtain period values for almost any type of lightcurve with both speed and accuracy. PIPS determines periods through several different methods, analyzes the morphology of lightcurves via Fourier analysis, estimates the statistical significance of the detected signal, and determines stellar properties based on pre-existing stellar models.

[ascl:2108.018] Cosmic-CoNN: Cosmic ray detection toolkit

Cosmic-CoNN detects cosmic rays (CR) in CCD-captured astronomical images. It offers a PyTorch deep-learning framework to train generic, robust CR detection models for ground- and space-based imaging data as well as spectroscopic observations. Cosmic-CoNN also includes a suite of tools, including console commands, a web app, and Python APIs, to make deep-learning models easily accessible.

[ascl:2108.017] AutoProf: Automatic Isophotal solutions for galaxy images

AutoProf performs basic and advanced non-parametric galaxy image analysis. The pipeline's design allows for fast startup and easy implementation; the package offers a suite of robust default and optional tools for surface brightness profile extractions and related methods. AutoProf is highly extensible and can be adapted for a variety of applications, providing flexibility for exploring new ideas and supporting advanced users.

[ascl:2108.016] Chemulator: Thermochemical emulator for hydrodynamical modeling

The neural network-based emulator Chemulator advances the gas temperature and chemical abundances of a single position in an astrophysical gas. It is accurate over a single timestep and remains stable, with decreased accuracy, over many iterations, though it performs less well at low visual extinctions. The code is useful for applications such as large scale ISM modeling; by retraining the emulator for a given parameter space, Chemulator could also perform more specialized applications such as planetary atmosphere modeling.

[ascl:2108.015] ELISa: Eclipsing binaries Learning Interactive System

ELISa models light curves of close eclipsing binaries. It models surfaces of detached, semi-detached, and over-contact binaries, generates light curves, and generates stellar spots with given longitude, latitude, radius, and temperature. It can also fit radial velocity curves and light curves using non-linear least squares and Markov Chain Monte Carlo methods.

[ascl:2108.014] StelNet: Stellar mass and age predictor

StelNet predicts mass and age from absolute luminosity and effective temperature for stars with close to solar metallicity. It uses a Deep Neural Network trained on stellar evolutionary tracks. The underlying model makes no assumption on the evolutionary stage and includes the pre-main sequence phase. A mix of models is trained and bootstrapped to quantify the uncertainty of the model, and data are passed through all trained models to provide a predictive distribution from which an expectation value and uncertainty level can be estimated.

[ascl:2108.013] AMOEBA: Automated Gaussian decomposition

AMOEBA (Automated Molecular Excitation Bayesian line-fitting Algorithm) employs a Bayesian approach to Gaussian decomposition, resulting in an objective and statistically robust identification of individual clouds along the line-of-sight. It uses the Python implementation of Goodman & Weare's Affine Invariant Markov chain Monte Carlo (MCMC) Ensemble sampler emcee (ascl:1303.002) to sample the posterior probability distribution and numerically evaluate the integrals required to compute the Bayes Factor. AMOEBA takes as input a set of OH optical depth spectra and a set of expected brightness temperature spectra that are obtained by measuring the brightness temperature towards the bright background continuum source (the "on-source" observations) and in a pattern surrounding the continuum source (the "off-source" observations). AMOEBA can also take as input a set of OH optical depth spectra only, and allows an arbitrary number of spectra to be fit simultaneously.
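
To make the sampling step concrete, here is a generic emcee sketch that fits a single Gaussian component to a synthetic spectrum; it illustrates the MCMC machinery AMOEBA builds on, not AMOEBA's actual interface or its Bayes-factor calculation:

    import numpy as np
    import emcee

    def gaussian(x, amp, mean, sigma):
        return amp * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

    def log_posterior(theta, x, y, yerr):
        amp, mean, sigma = theta
        # flat priors with simple physical bounds
        if amp <= 0 or sigma <= 0 or not (x.min() < mean < x.max()):
            return -np.inf
        model = gaussian(x, amp, mean, sigma)
        return -0.5 * np.sum(((y - model) / yerr) ** 2)

    # synthetic "spectrum" with one Gaussian component
    rng = np.random.default_rng(42)
    x = np.linspace(-10.0, 10.0, 200)
    yerr = 0.05 * np.ones_like(x)
    y = gaussian(x, 1.0, 1.5, 2.0) + rng.normal(0.0, 0.05, x.size)

    ndim, nwalkers = 3, 32
    p0 = np.array([1.0, 1.0, 1.5]) + 1e-3 * rng.standard_normal((nwalkers, ndim))
    sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior, args=(x, y, yerr))
    sampler.run_mcmc(p0, 2000)
    samples = sampler.get_chain(discard=500, flat=True)
    print(np.median(samples, axis=0))  # posterior medians of (amp, mean, sigma)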

[ascl:2108.012] NRDD_constraints: Dark Matter interaction with the Standard Model exclusion plot calculator

The NRDD_constraints tool provides simple interpolating functions written in Python that return the most constraining limit on the dark matter-nucleon scattering cross section for a list of non-relativistic effective operators. The package contains four files: the main code, NRDD_constraints.py; a simple driver, NRDD_constraints-example.py; and two data files, NRDD_data1.npy and NRDD_data2.npy.

[ascl:2108.011] Spectra-Without-Windows: Window-free analysis of the BOSS DR12 power spectrum and bispectrum

Spectra-Without-Windows (formerly called BOSS-Without-Windows) analyzes Baryon Oscillation Spectroscopic Survey (BOSS) DR12 data using quadratic and cubic estimators. It contains analysis codes to estimate unwindowed power spectra and unwindowed bispectra. It also supplies the raw power spectrum and bispectrum measurements of BOSS and 999 Patchy simulations, and contains a utility function to generate the background number density n(r) from the survey mask and n(z) distribution. This code has been replaced by the newer and more powerful 3D polyspectrum code PolyBin3D (ascl:2404.006).

[ascl:2108.010] FIREFLY: Chi-squared minimization full spectral fitting code

FIREFLY (Fitting IteRativEly For Likelihood analYsis) derives stellar population properties of stellar systems, whether observed galaxy or star cluster spectra or model spectra from simulations. The code fits combinations of single-burst stellar population models to spectroscopic data following an iterative best-fitting process controlled by the Bayesian Information Criterion without applying priors. Solutions within a statistical cut are retained with their weight, which is arbitrary. No additive or multiplicative polynomials are used to adjust the spectral shape and no regularization is imposed. This fitting freedom allows mapping of the effect of intrinsic spectral energy distribution (SED) degeneracies, such as age, metallicity, and dust reddening, on stellar population properties, and quantifying the effect of varying input model components on such properties.

[ascl:2108.009] caesar-rest: Web service for the caesar source extractor

caesar-rest is a REST-ful web service for astronomical source extraction and classification with the caesar source extractor [ascl:1807.015]. The software is developed in python and consists of containerized microservices, deployable on standalone servers or on a distributed cloud infrastructure. The core component is the REST web application, based on the Flask framework and providing APIs for managing the input data (e.g. data upload/download/removal) and source finding jobs (e.g. submit, get status, get outputs) with different job management systems (Kubernetes, Slurm, Celery). Additional services (AAI, user DB, log storage, job monitor, accounting) enable the user authentication, the storage and retrieval of user data and job information, the monitoring of submitted jobs, and the aggregation of service logs and user data/job stats.

[ascl:2108.008] CatBoost: High performance gradient boosting on decision trees library

CatBoost is a machine learning method based on gradient boosting over decision trees; it can be used for ranking, classification, regression, and other machine learning tasks, with interfaces for Python, R, Java, and C++. It supports both numerical and categorical features and computation on CPU and GPU, and is fast and scalable. Visualization tools are also included in CatBoost.
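
A minimal Python sketch of the gradient-boosting workflow, using the standard CatBoostClassifier fit/predict calls on a small synthetic table with one categorical column (the data here are invented purely for illustration):

    import numpy as np
    from catboost import CatBoostClassifier

    # Tiny synthetic problem with one numerical and one categorical feature
    rng = np.random.default_rng(0)
    X = [[rng.normal(), rng.choice(["spiral", "elliptical"])] for _ in range(200)]
    y = [int(row[1] == "spiral") ^ int(row[0] > 0) for row in X]

    model = CatBoostClassifier(iterations=200, depth=4, learning_rate=0.1, verbose=False)
    model.fit(X, y, cat_features=[1])   # column 1 holds the categorical feature
    print(model.predict(X[:5]))
    print(model.predict_proba(X[:5])[:, 1])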

[ascl:2108.007] catwoman: Transit modeling Python package for asymmetric light curves

catwoman models asymmetric transit lightcurves. Written in Python, it calculates light curves for any radially symmetric stellar limb-darkening law, modeling planets as two semi-circles of different radii. Catwoman is built on the batman library (ascl:1510.002) and uses its integration algorithm.

[ascl:2108.006] viper: Velocity and IP EstimatoR

viper (Velocity and IP EstimatoR) measures differential radial velocities from stellar spectra taken through iodine or other gas cells. It convolves the product of a stellar template and a gas cell spectrum with an instrumental profile. Via least-squares fitting, it optimizes the parameters of the instrumental profile, the wavelength solution, flux normalization, and the stellar Doppler shift. viper offers various functions to describe the instrumental profile, such as Gaussian, super-Gaussian, skewed Gaussian, or mixtures of Gaussians. The code is developed for echelle spectra; it can handle data from CES, CRIRES+, KECK, OES, TCES, and UVES, and additional instruments can easily be added. A graphical interface facilitates the work with numerous flexible options.

[ascl:2108.005] millennium-tap-query: Python tool to query the Millennium Simulation UWS/TAP client

millennium-tap-query is a simple wrapper for the Python package requests to deal with connections to the Millennium TAP Web Client. With this tool you can perform basic or advanced queries to the Millennium Simulation database and download the data products. millennium-tap-query is similar to the TAP query tool in the German Astrophysical Virtual Observatory (GAVO) VOtables package.

[ascl:2108.004] WaldoInSky: Anomaly detection algorithms for time-domain astronomy

WaldoInSky finds anomalous astronomical light curves and their analogs. The package contains four methods: an adaptation of the Unsupervised Random Forest for anomaly detection in light curves that operates on the light curve points and their power spectra; two manifold-learning methods (t-SNE and UMAP) that operate on the DMDT maps (image representations of the light curves) and can be used to find analog light curves in the low-dimensional representation; and an Isolation Forest method for evaluating approaches to light curve pre-processing before they are passed to the anomaly detectors. WaldoInSky also contains code for random sparsification of light curves.

[ascl:2108.003] MAPS: Multi-frequency Angular Power Spectrum estimator

MAPS (Multi-frequency Angular Power Spectrum) extracts two-point statistical information from Epoch of Reionization (EoR) signals observed in three dimensions, with two directions on the sky and the wavelength (or frequency) constituting the third dimension. Rather than assume that the signal has the same statistical properties in all three directions, as the spherically averaged power spectrum (SAPS) does, MAPS does not make these assumptions, making it more natural for radio interferometric observations than SAPS.

[ascl:2108.002] AUM: A Unified Modeling scheme for galaxy abundance, galaxy clustering and galaxy-galaxy lensing

AUM predicts galaxy abundances, their clustering, and the galaxy-galaxy lensing signal, given the halo occupation distribution of galaxies and the underlying cosmological model. In combination with the measurements of the clustering, abundance, and lensing of galaxies, these routines can be used to perform cosmological parameter inference.

[submitted] spectrogrism

This module implements an ad-hoc grism-based spectrograph optical model. It provides a flexible chromatic mapping between the input focal plane and the output detector plane, based on an effective simplified ray-tracing model of the key optical elements defining the spectrograph (collimator, prism, grating, camera), described by a restricted number of physically-motivated distortion parameters.

[ascl:2108.001] HRK: HII Region Kinematics

HRK (HII Region Kinematics) generates simulated radio recombination line observations of HII regions with various internal kinematic structures. It fits a single Gaussian to each pixel of the simulated observations and generates images of the fitted Gaussian center and full-width at half-maximum (FWHM) linewidth.
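
The per-pixel fitting step can be pictured with a generic scipy.optimize.curve_fit example on a synthetic spectrum; this is an illustrative sketch with invented numbers, not HRK's own code:

    import numpy as np
    from scipy.optimize import curve_fit

    def gauss(v, amp, v0, fwhm):
        # Gaussian parameterized by its FWHM rather than sigma
        sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
        return amp * np.exp(-0.5 * ((v - v0) / sigma) ** 2)

    # synthetic spectrum for a single pixel (velocity axis in km/s)
    rng = np.random.default_rng(0)
    v = np.linspace(-50.0, 50.0, 256)
    spec = gauss(v, 1.0, 5.0, 25.0) + rng.normal(0.0, 0.05, v.size)

    popt, pcov = curve_fit(gauss, v, spec, p0=[spec.max(), v[np.argmax(spec)], 20.0])
    amp_fit, center_fit, fwhm_fit = popt   # values that would populate the output images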

[submitted] MALU IFS visualisation tool

MALU visualizes integral field spectroscopy (IFS) data such as CALIFA, MANGA, SAMI or MUSE data producing fully interactive plots. The tool is not specific to any instrument. It is available in Python and no installation is required.

[ascl:2107.030] HERMES: High-Energy Radiative MESsengers

The HERMES (High-Energy Radiative MESsengers) computational framework for line-of-sight integration creates sky maps in the HEALPix-compatible format of various galactic radiative processes, including Faraday rotation, synchrotron and free-free radio emission, and gamma-ray emission from pion decay, bremsstrahlung, and inverse Compton scattering. The code is written in C++ and provides numerous integrators, including dispersion measure, rotation measure, and gamma-ray emission from Dark Matter annihilation, among others.

[ascl:2107.029] PHL: Persistent_Homology_LSS

Persistent_Homology_LSS analyzes halo catalogs using persistent homology to constrain cosmological parameters. It implements persistent homology on a point cloud composed of halo positions in a cubic box from N-body simulations of the universe at large scales. The outputs of the code are persistence diagrams and images that are used to constrain cosmological parameters from the halo catalog.

[ascl:2107.028] TRINITY: Dark matter halos, galaxies and supermassive black holes empirical model

TRINITY statistically connects dark matter halos, galaxies, and supermassive black holes (SMBHs) from z=0-10. Constrained by multiple galaxy (0 < z < 10) and SMBH datasets (0 < z < 6.5), the empirical model finds the posterior probability distributions of the halo-galaxy-SMBH connection and SMBH properties, all of which are allowed to evolve with redshift. TRINITY can predict many observables, such as galaxy stellar mass functions and quasar luminosity functions, and underlying galaxy and SMBH properties, including SMBH average Eddington ratios. These predictions are made by different code files. There are basically two types of prediction codes: the first type generates observable data given an input redshift or redshift interval; the second type generates galaxy or SMBH properties as a function of host halo mass and redshift.

[ascl:2107.027] KeplerPORTS: Kepler Planet Occurrence Rate Tools

KeplerPORTS calculates the detection efficiency of the DR25 Kepler Pipeline. It uses a detection contour model to quantify the recoverability of transiting planet signals due to the Kepler pipeline, and accurately portrays the ability of the Kepler pipeline to generate a Threshold Crossing Event (TCE) for a given hypothetical planet.

[ascl:2107.026] K2mosaic: Mosaic Kepler pixel data

K2mosaic stitches the postage stamp-sized pixel masks obtained by NASA's Kepler and K2 missions together into CCD-sized mosaics and movies. The command-line tool's principal use is to take a set of Target Pixel Files (TPF) and turn them into more traditional FITS image files -- one per CCD channel and per cadence. K2mosaic can also be used to create animations from these mosaics. The mosaics produced by K2mosaic also make the analysis of certain Kepler/K2 targets, such as clusters and asteroids, easier. Moreover, such mosaics are useful for revealing the context of single-star observations, e.g., they enable users to check for the presence of instrumental noise or nearby bright objects.

[ascl:2107.025] MCPM: Modified CPM method

MCPM extracts K2 photometry in dense stellar regions; the code is a modification and extension of the K2-CPM package (ascl:2107.024), which was developed for less-crowded fields. MCPM uses the pixel response function together with accurate astrometric grids, combining signals from a few pixels, and simultaneously fits an astrophysical model to produce more precise extracted K2 photometry.

[ascl:2107.024] K2-CPM: Causal Pixel Model for K2 data

K2-CPM captures variability while preserving transit signals in Kepler data. Working at the pixel level, the model captures very fine-grained information about the variation of the spacecraft. The CPM models the systematic effects in the time series of a pixel using the pixels of many other stars and the assumption that any shared signal in these causally disconnected light curves is caused by instrumental effects. The target star's future and past are used, and the data points are separated into training and test sets to ensure that information about any transit is perfectly isolated from the model. The method has four tuning parameters: the number of predictor stars or pixels, the autoregressive window size, and two L2-regularization amplitudes for the model components. It consistently produces low-noise light curves.
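
A toy sketch of the underlying idea: predict a target pixel's systematics as an L2-regularized linear combination of distant predictor pixels, fit only on training cadences. This simplified ridge regression stands in for the package's full implementation (which also uses autoregressive terms and per-cadence train/test splits) and runs on invented synthetic data:

    import numpy as np

    # flux_target: (n_cadences,) target-pixel light curve
    # flux_predictors: (n_cadences, n_pixels) light curves of distant predictor pixels
    rng = np.random.default_rng(1)
    n_cad, n_pred = 500, 100
    flux_predictors = rng.normal(1.0, 0.01, (n_cad, n_pred))
    systematics = flux_predictors[:, :5].mean(axis=1) - 1.0
    flux_target = 1.0 + systematics + rng.normal(0.0, 0.002, n_cad)

    # train/test split so the cadences being predicted are excluded from the fit
    train = np.arange(n_cad) < 400
    lam = 1e3                                   # L2-regularization amplitude (tuning parameter)
    A, y = flux_predictors[train], flux_target[train]
    coeffs = np.linalg.solve(A.T @ A + lam * np.eye(n_pred), A.T @ y)

    prediction = flux_predictors @ coeffs       # systematics model for all cadences
    detrended = flux_target - prediction        # residual keeps the astrophysical signal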

[ascl:2107.023] cosmic_variance: Cosmic variance calculator

cosmic_variance calculates the cosmic variance during the Epoch of Reionization (EoR) for the UV Luminosity Function (UV LF), Stellar Mass Function (SMF), and Halo Mass Function (HMF). The three functions in the package provide the output as the cosmic variance expressed in percentage. The code is written in Python, and simple examples that show how to use the functions are provided.

[ascl:2107.022] Kd-match: Correspondences of objects between two catalogs through pattern matching

Kd-match matches stellar catalogs for which the transformation between the coordinate systems of the two catalogs is unknown and might include shearing. The code uses the ratio of sides as the invariant under a coordinate transformation and searches for several triangles with similar transformations by building quadrilaterals from sets of four objects in each catalog and calculating the ratio of areas of the triangles that comprise the quadrilaterals. The k-d tree accelerates this quadrilateral search dramatically and is significantly faster than the customary direct search over triangles.
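
The invariant at the heart of the method can be demonstrated with a short sketch: ratios of the areas of triangles within a quadrilateral are unchanged by an affine (shear-including) transformation, and a k-d tree indexes them for fast lookup. This illustrates the principle on invented points and is not the Kd-match implementation:

    import numpy as np
    from itertools import combinations
    from scipy.spatial import cKDTree

    def tri_area(p, q, r):
        return 0.5 * abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1]))

    def quad_invariants(points, quads):
        # For each set of four objects, form two ratios of triangle areas;
        # area ratios are unchanged by any affine transformation (shear included).
        feats = []
        for a, b, c, d in quads:
            A1 = tri_area(points[a], points[b], points[c])
            A2 = tri_area(points[a], points[b], points[d])
            A3 = tri_area(points[a], points[c], points[d])
            feats.append((A1 / A2, A1 / A3))
        return np.array(feats)

    rng = np.random.default_rng(3)
    cat1 = rng.uniform(0.0, 1.0, (12, 2))
    shear = np.array([[1.2, 0.3], [0.1, 0.9]])   # transformation unknown to the matcher
    cat2 = cat1 @ shear.T + 0.5

    quads = list(combinations(range(len(cat1)), 4))
    tree = cKDTree(quad_invariants(cat2, quads))          # index catalog-2 invariants
    dist, idx = tree.query(quad_invariants(cat1, quads), k=1)
    print("largest invariant mismatch:", dist.max())      # ~0 for a pure affine transform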

[ascl:2107.021] RePrimAnd: Recovery of Primitives And EOS framework

The RePrimAnd library supports numerical simulations of general relativistic magnetohydrodynamics. It provides methods for recovering primitive variables such as pressure and velocity from the variables evolved in quasi-conservative formulations. Further, it provides a general framework for handling matter equations of state (EOS). Python bindings are automatically built together with the library, provided a Python3 installation containing the pybind11 package is detected. RePrimAnd also provides an (experimental) thorn that builds the library within an Einstein Toolkit (ascl:1102.014) environment using the ExternalLibraries mechanism.

[ascl:2107.020] Chem-I-Calc: Chemical Information Calculator

Chem-I-Calc evaluates the chemical information content of resolved star spectroscopy. It takes advantage of the Fisher information matrix and the Cramér-Rao inequality to quickly calculate the Cramér-Rao lower bounds (CRLBs), which give the best theoretically achievable precision from a set of observations.
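
A worked toy example of the Cramér-Rao machinery described above: for Gaussian noise the Fisher matrix is F = J^T C^-1 J, and the CRLBs are the square roots of the diagonal of F^-1. The two-line spectrum model and its "labels" below are invented for illustration and are not Chem-I-Calc's own setup:

    import numpy as np

    wave = np.linspace(5000.0, 5100.0, 400)

    def model(theta):
        # toy spectrum whose two absorption lines depend on two stellar "labels"
        t1, t2 = theta
        return 1.0 - t1 * np.exp(-0.5 * ((wave - 5030.0) / 0.5) ** 2) \
                   - t2 * np.exp(-0.5 * ((wave - 5070.0) / 0.7) ** 2)

    theta0 = np.array([0.3, 0.2])
    sigma = 0.01 * np.ones_like(wave)            # per-pixel flux uncertainty

    # numerical Jacobian of the model with respect to the labels
    eps = 1e-5
    J = np.column_stack([
        (model(theta0 + eps * np.eye(2)[i]) - model(theta0 - eps * np.eye(2)[i])) / (2 * eps)
        for i in range(2)
    ])

    F = J.T @ (J / sigma[:, None] ** 2)          # Fisher information matrix
    crlb = np.sqrt(np.diag(np.linalg.inv(F)))    # best achievable 1-sigma precision per label
    print(crlb)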

[ascl:2107.019] PlaSim: Planet Simulator

PlaSim is a climate model of intermediate complexity for Earth, Mars and other planets. It is written for a university environment, to be used to train the next GCM (general circulation model) developers, to support scientists in understanding climate processes, and to do fundamental research. In addition to an atmospheric GCM of medium complexity, PlaSim includes other compartments of the climate system such as, for example, an ocean with sea ice and a land surface with a biosphere. These other compartments are reduced to linear systems. In other words, PlaSim consists of a GCM with a linear ocean/sea-ice module formulated in terms of a mixed layer energy balance. The soil/biosphere module is introduced analogously. Thus, working with PlaSim is like testing the performance of an atmospheric or oceanic GCM interacting with various linear processes, which parameterize the variability of the subsystems in terms of their energy (and mass) balances.

[ascl:2107.018] ART: A Reconstruction Tool

ART reconstructs log-probability distributions using Gaussian processes. It requires an existing MCMC chain or similar set of samples from a probability distribution, including the log-probabilities. Gaussian process regression is used to interpolate the log-probability for the reconstruction, allowing for easy resampling, importance sampling, marginalization, testing different samplers, investigating chain convergence, and other operations.

[ascl:2107.017] PyCactus: Post-processing tools for Cactus computational toolkit simulation data

PyCactus contains tools for postprocessing data from numerical simulations performed with the Einstein Toolkit, based on the Cactus computational toolkit. The main package is PostCactus, which provides a high-level Python interface to the various data formats in a simulation folder. Further, the package SimRep allows the automatic creation of html reports for a simulation, and the SimVideo package allows the creation of movies visualizing simulation data.

[ascl:2107.016] shear-stacking: Stacked shear profiles and tests based upon them

shear-stacking calculates stacked shear profiles and tests based upon them, e.g. consistency for different slices of lensed background galaxies. The basic concept is that the lensing signal in terms of surface mass density (instead of shear) should be entirely determined by the properties of the lens sample and have no dependence on source galaxy properties.

[ascl:2107.015] shapelens: Astronomical image analysis and shape estimation framework

The shapelens C++ library provides ways to load galaxy and star images from FITS files and catalogs and to analyze their morphology. The main purpose of this library is to make several weak-lensing shape estimators publicly available. All of them are based on the moments of the brightness distribution. The estimators include DEIMOS, for analytic deconvolution in moment space; DEIMOSElliptical, a practical implementation of DEIMOS with an automatically matched elliptical weight function; DEIMOSCircular, which is identical to DEIMOSElliptical but with a circular weight function; and others.
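
As a pointer to what "moments of the brightness distribution" means in practice, the sketch below computes second moments and an epsilon-style ellipticity for a toy image; it illustrates the moment-based idea only (in Python rather than C++) and does not reproduce the DEIMOS deconvolution or weighting:

    import numpy as np

    def second_moments(image):
        # Q_ij = sum I(x) (x_i - xc)(x_j - xc) / sum I(x)
        y, x = np.indices(image.shape)
        flux = image.sum()
        xc, yc = (x * image).sum() / flux, (y * image).sum() / flux
        q11 = ((x - xc) ** 2 * image).sum() / flux
        q22 = ((y - yc) ** 2 * image).sum() / flux
        q12 = ((x - xc) * (y - yc) * image).sum() / flux
        return q11, q22, q12

    def ellipticity(q11, q22, q12):
        # "epsilon" ellipticity built from the second moments
        denom = q11 + q22 + 2.0 * np.sqrt(max(q11 * q22 - q12 ** 2, 0.0))
        return complex(q11 - q22, 2.0 * q12) / denom

    # elliptical Gaussian test image with axis ratio 2:1
    y, x = np.indices((64, 64))
    img = np.exp(-0.5 * (((x - 32) / 6.0) ** 2 + ((y - 32) / 3.0) ** 2))
    print(ellipticity(*second_moments(img)))   # |epsilon| = (a-b)/(a+b) = 1/3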

[ascl:2107.014] Skylens++: Simulation package for optical astronomical observations

Skylens++ implements a Layer-based raytracing framework particularly well-suited for realistic simulations of weak and strong gravitational lensing. Source galaxies can be drawn from analytic models or deep space-based imaging. Lens planes can be populated with arbitrary deflectors, typically either from N-body simulations or analytic lens models. Both sources and lenses can be placed at freely configurable positions into the light cone, in effect allowing for multiple source and lens planes.

[ascl:2107.013] GUBAS: General Use Binary Asteroid Simulator

GUBAS (General Use Binary Asteroid Simulator) predicts binary asteroid system behaviors by implementing the Hou 2016 realization of the full two-body problem (F2BP). The F2BP models binary asteroid systems as two arbitrary mass distributions whose mass elements interact gravitationally and result in both gravity forces and torques. To account for these mass distributions and model the mutual gravity of the F2BP, GUBAS computes the inertia integrals of each body up to a user defined expansion order. This approach provides a recursive expression of the mutual gravity potential and represents a significant decrease in the computational burden of the F2BP when compared to other methods of representing the mutual potential.

[ascl:2107.012] PyROA: Modeling quasar light curves

PyROA models quasar light curves where the variability is described using a running optimal average (ROA), and parameters are sampled using Markov Chain Monte Carlo (MCMC) techniques with emcee (ascl:1303.002). Using a Bayesian approach, priors can be used on the sampled parameters. Currently it has three main uses: 1.) determining the time delay between light curves at different wavelengths; 2.) intercalibrating light curves from multiple telescopes, merging them into a single light curve; and 3.) determining the time delay between images of lensed quasars, where the microlensing effects are also modeled. PyROA also includes a noise model, with a parameter for each light curve that adds extra variance to the flux measurements to account for underestimated errors; this can be turned off if required. Example Jupyter notebooks that demonstrate each of the three main uses of the code are provided.

[ascl:2107.011] AlignBandColors: Inter-color-band image alignment tool

AlignBandColors (ABC) aligns inter-color-band astronomical images to a 100th of a pixel accuracy using surrounding stars as guiding points. It has currently been tested with Sloan Digital Sky Survey (SDSS) Data Release 12 images, but is designed to be survey-independent. The code is part of the SpArcFiRe (ascl:2107.010) method.

[ascl:2107.010] SpArcFiRe: SPiral ARC FInder and REporter

SpArcFiRe takes as input an image of a galaxy in FITS, JPG, or PNG format, identifies spiral arms, and extracts structural information about the spiral arms. Pixels in each arm segment are listed, enabling image analysis on each segment. The automated method also performs a least-squares fit of a logarithmic spiral arc to the pixels in that segment, giving per-arc parameters, such as the pitch angle, arm segment length, and location, and outputs images showing the steps SpArcFiRe took to detect arm segments.

[ascl:2107.009] Balrog: Astronomical image simulation

The Balrog package of Python simulation code is for use with real astronomical imaging data. Objects are simulated into a survey's images and measurement software is run over the simulated objects' images. Balrog allows the user to derive the mapping between what is actually measured and the input truth. The package uses GalSim (ascl:1402.009) for all object simulations; source extraction and measurement is performed by SExtractor (ascl:1010.064). Balrog facilitates the ease of running these codes en masse over many images, automating useful GalSim and SExtractor functionality, as well as filling in many bookkeeping steps along the way.

[ascl:2107.008] nimbus: A Bayesian inference framework to constrain kilonova models

nimbus is a hierarchical Bayesian framework to infer the intrinsic luminosity parameters of kilonovae (KNe) associated with gravitational-wave (GW) events, based purely on non-detections. This framework makes use of GW 3-D distance information and electromagnetic upper limits from a given survey for multiple events, and self-consistently accounts for finite sky-coverage and probability of astrophysical origin.

[ascl:2107.007] Skymapper: Mapping astronomical survey data on the sky

Skymapper maps astronomical survey data from the celestial sphere onto 2D using a collection of matplotlib instructions. It facilitates interactive work as well as the creation of publication-quality plots with a python-based workflow many astronomers are accustomed to. The primary motivation is a truthful representation of samples and fields from the curved sky in planar figures, which becomes relevant when sizable portions of the sky are observed.

[ascl:2107.006] snmachine: Photometric supernova classification

snmachine reads in photometric supernova light curves, extracts useful features from them, and subsequently performs supervised machine learning to classify supernovae based on their light curves. This python library is also flexible enough to easily extend to general transient classification.

[ascl:2107.005] ReionYuga: Epoch of Reionization neutral Hydrogen field generator

The C code ReionYuga generates the Epoch of Reionization (EoR) neutral Hydrogen (HI) field (successively the redshifted 21-cm signal) within a cosmological simulation box using semi-numerical techniques. The code is based on excursion set formalism and uses a three parameter model. It is designed to work with PMN-body (ascl:2107.003) and FoF-Halo-finder (ascl:2107.004).

[ascl:2107.004] FoF-Halo-finder: Halo location and size

FoF-Halo-finder identifies the location and size of collapsed objects (halos) within a cosmological simulation box. These halos are the hosts of the luminous objects in the Universe. Written in C, it is based on the friends-of-friends (FoF) algorithm, and is designed to work with PMN-body (ascl:2107.003).

[ascl:2107.003] PMN-body: Particle Mesh N-body code

PMN-body computes the non-linear evolution of the cosmological matter density contrast. It is based on the Particle Mesh (PM) technique. Written in C, the code is parallelized for shared-memory machines using Open Multi-Processing (OpenMP).

[ascl:2107.002] ROA: Running Optimal Average

ROA (Running Optimal Average) describes time series data. This model uses a Gaussian window function that moves through the data, giving stronger weights to points close to the center of the Gaussian. The width of the window function, delta, therefore controls the flexibility of the model, with a small delta providing a very flexible model. The function also calculates the effective number of parameters, as a very flexible model will correspond to a large number of parameters while a rigid model (large delta) has a low effective number of parameters. Knowing the effective number of parameters can be used to optimize the window width, e.g., using the Bayesian information criterion (BIC). An error envelope, which expands appropriately where there are gaps in the data, is also calculated for the model.
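
A compact numpy sketch of the idea described above, assuming the standard inverse-variance Gaussian weighting (the package's own error envelope may be computed differently): the running optimal average, a simple error estimate, and the effective number of parameters as the trace of the smoothing matrix:

    import numpy as np

    def running_optimal_average(t_data, f_data, f_err, t_model, delta):
        # inverse-variance weighted running mean with a Gaussian window of width delta
        w = np.exp(-0.5 * ((t_model[:, None] - t_data[None, :]) / delta) ** 2) / f_err ** 2
        model = (w * f_data).sum(axis=1) / w.sum(axis=1)
        model_err = 1.0 / np.sqrt(w.sum(axis=1))   # grows where the data have gaps
        return model, model_err

    def effective_parameters(t_data, f_err, delta):
        # trace of the smoothing ("hat") matrix: larger for smaller delta
        w = np.exp(-0.5 * ((t_data[:, None] - t_data[None, :]) / delta) ** 2) / f_err ** 2
        return (np.diag(w) / w.sum(axis=1)).sum()

    # synthetic light curve with irregular sampling
    rng = np.random.default_rng(7)
    t = np.sort(rng.uniform(0.0, 100.0, 150))
    err = 0.1 * np.ones_like(t)
    f = np.sin(t / 10.0) + rng.normal(0.0, 0.1, t.size)

    model, model_err = running_optimal_average(t, f, err, np.linspace(0.0, 100.0, 500), delta=2.0)
    print("effective number of parameters:", effective_parameters(t, err, delta=2.0))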
