ASCL.net

Astrophysics Source Code Library

Making codes discoverable since 1999

Browsing Codes

[ascl:2103.029] SparseBLS: Box-Fitting Least Squares implementation for sparse data

SparseBLS uses the Box-fitting Least Squares (BLS) algorithm to detect transiting exoplanets in photometric data. SparseBLS does not bin data into phase bins and does not use a phase grid. Because its detection efficiency does not depend on the transit phase, it is significantly faster than BLS for sparse data and is well-suited for large photometric surveys producing unevenly-sampled sparse light curves, such as Gaia.
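
The core idea — evaluating a box model directly on the unbinned, phase-folded points rather than on a phase grid — can be sketched in a few lines of NumPy. This is an illustrative toy, not the SparseBLS implementation; the period grid, fixed duration, and variable names are assumptions.

```python
import numpy as np

def box_depth_chi2(time, flux, err, period, t0, duration):
    """Least-squares depth and chi^2 of a box transit at a trial (period, t0, duration),
    evaluated on the unbinned points themselves (no phase bins)."""
    phase = ((time - t0) / period) % 1.0
    half = 0.5 * duration / period
    in_transit = (phase < half) | (phase > 1.0 - half)
    if in_transit.sum() < 3:
        return 0.0, np.inf
    w = 1.0 / err**2
    out = ~in_transit
    base = np.sum(w[out] * flux[out]) / np.sum(w[out])                   # out-of-transit level
    depth = base - np.sum(w[in_transit] * flux[in_transit]) / np.sum(w[in_transit])
    model = np.where(in_transit, base - depth, base)
    return depth, np.sum(w * (flux - model) ** 2)

# toy brute-force search over a coarse period grid (sparse, uneven sampling)
rng = np.random.default_rng(0)
time = np.sort(rng.uniform(0.0, 100.0, 300))
flux = 1.0 + 0.001 * rng.normal(size=time.size)
err = np.full_like(time, 0.001)
best = min(((p, *box_depth_chi2(time, flux, err, p, 0.0, 0.1))
            for p in np.linspace(1.0, 10.0, 500)), key=lambda r: r[-1])
print("best trial period:", best[0])
```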

[ascl:2103.028] Astro-Fix: Correcting astronomical bad pixels in Python

astrofix is an astronomical image correction algorithm based on Gaussian Process Regression. It trains itself to apply the optimal interpolation kernel for each image, performing several times better than median replacement and interpolation with a fixed kernel.
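
As a rough illustration of the approach (not astrofix's self-training procedure or API), a Gaussian Process regression fill of flagged pixels could look like the following sketch; the window size and kernel length scale are arbitrary assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def gp_fill_bad_pixels(image, bad_mask, window=5, length_scale=1.5):
    """Replace each masked pixel with a GP-regression prediction from its good neighbors."""
    fixed = image.copy()
    half = window // 2
    for y, x in zip(*np.nonzero(bad_mask)):
        y0, y1 = max(0, y - half), min(image.shape[0], y + half + 1)
        x0, x1 = max(0, x - half), min(image.shape[1], x + half + 1)
        yy, xx = np.mgrid[y0:y1, x0:x1]
        good = ~bad_mask[y0:y1, x0:x1]
        gp = GaussianProcessRegressor(kernel=RBF(length_scale), normalize_y=True)
        gp.fit(np.column_stack([yy[good], xx[good]]), image[y0:y1, x0:x1][good])
        fixed[y, x] = gp.predict(np.array([[y, x]]))[0]
    return fixed
```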

[ascl:2103.027] GalLenspy: Reconstruction of mass profile in disc-like galaxies from the gravitational lensing effect

Gallenspy uses the gravitational lensing effect (GLE) to reconstruct mass profiles in disc-like galaxies. The algorithm inverts the lens equation for gravitational potentials with spherical symmetry and also estimates the position of the source, given the positions of the images produced by the lens. Gallenspy also computes critical and caustic curves and the Einstein ring.

[ascl:2103.026] PyPion: Post-processing code for PION simulation data

PyPion reads in Silo (ascl:2103.025) data files from PION (ascl:2103.024) simulations and plots the data. This library works for 1D, 2D, and 3D data files and for any number of nested-grid levels. The scripts contained in PyPion save the options entered on the command line when the Python script is run, open the Silo file and save all of the important header variables, open the directory in the Silo (or VTK, or FITS) file and save the requested variable data (e.g., density, temperature), and set up the plotting function and the figure.

[ascl:2103.025] Silo: Saving scientific data to binary disk files

Silo reads and writes a wide variety of scientific data to binary disk files. The files Silo produces and the data within them can be easily shared and exchanged between wholly independently developed applications running on disparate computing platforms. Consequently, Silo facilitates the development of general purpose tools for processing scientific data. One of the more popular tools that process Silo data files is the VisIt visualization tool (ascl:1103.007).

Silo supports gridless (point) meshes, structured meshes, unstructured-zoo and unstructured-arbitrary-polyhedral meshes, block-structured AMR meshes, and constructive solid geometry (CSG) meshes. It also supports piecewise-constant (e.g., zone-centered) and piecewise-linear (e.g., node-centered) variables defined on the node, edge, face, or volume elements of meshes, as well as the decomposition of meshes into arbitrary subset hierarchies, including materials and mixing materials. In addition, Silo supports a wide variety of other useful objects to address various scientific computing application needs. Although the Silo library is a serial library, it has features that enable it to be applied effectively and scalably in parallel.

[ascl:2103.024] PION: Computational fluid-dynamics package for astrophysics

PION (PhotoIonization of Nebulae) is a grid-based fluid dynamics code for hydrodynamics and magnetohydrodynamics, including a ray-tracing module for calculating the attenuation of radiation from point sources of ionizing photons. It also has a module for coupling fluid dynamics and the radiation field to microphysical processes such as heating/cooling and ionization/recombination. PION models the evolution of HII regions, photoionized bubbles that form around hot stars, and has been extended to include stellar wind sources so that both wind bubbles and photoionized bubbles can be simulated at the same time. It is versatile enough to be extended to other applications.

[ascl:2103.023] DRAKE: Relic density prediction in concrete models

DRAKE (Dark matter Relic Abundance beyond Kinetic Equilibrium) predicts the dark matter relic abundance in situations where the standard assumption of kinetic equilibrium during the freeze-out process may not be satisfied. The code comes with a set of three dedicated Boltzmann equation solvers that implement, respectively, the traditionally adopted equation for the dark matter number density, fluid-like equations that couple the evolution of number density and velocity dispersion, and a full numerical evolution of the phase-space distribution.
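
For reference, the first of those solvers corresponds to the textbook Boltzmann equation for the comoving number density, dY/dx = -(λ/x²)(Y² − Y_eq²). A toy SciPy integration of that equation (with placeholder numbers, not DRAKE's implementation) might look like this:

```python
import numpy as np
from scipy.integrate import solve_ivp

def relic_yield(lam=1e8, x_span=(1.0, 1000.0)):
    """Toy freeze-out: dY/dx = -(lam/x^2) * (Y^2 - Yeq^2), with x = m/T."""
    def yeq(x):
        return 0.145 * x**1.5 * np.exp(-x)   # illustrative non-relativistic equilibrium yield
    def rhs(x, y):
        return [-(lam / x**2) * (y[0]**2 - yeq(x)**2)]
    sol = solve_ivp(rhs, x_span, [yeq(x_span[0])], method="LSODA", rtol=1e-8, atol=1e-14)
    return sol.t, sol.y[0]

x, Y = relic_yield()
print("asymptotic yield Y(x -> infinity) ~", Y[-1])
```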

[ascl:2103.022] nestle: Nested sampling algorithms for evaluating Bayesian evidence

nestle is a pure Python implementation of nested sampling algorithms for evaluating Bayesian evidence. Nested sampling integrates posterior probability in order to compare models in Bayesian statistics. It is similar to Markov Chain Monte Carlo (MCMC) in that it generates samples that can be used to estimate the posterior probability distribution; unlike MCMC, the nature of the sampling also allows one to calculate the integral of the distribution. Nested sampling is also an effective method for robustly finding global maxima.
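
A minimal usage sketch for a toy two-dimensional Gaussian likelihood, assuming nestle's sample(loglikelihood, prior_transform, ndim) interface; the problem itself is made up.

```python
import numpy as np
import nestle

ndim = 2
data_mean = np.array([0.2, -0.4])

def loglike(theta):
    # toy likelihood: independent unit-variance Gaussians centered on data_mean
    return -0.5 * np.sum((theta - data_mean) ** 2)

def prior_transform(u):
    # map the unit cube onto a flat prior over [-5, 5] in each dimension
    return 10.0 * u - 5.0

result = nestle.sample(loglike, prior_transform, ndim, method="multi", npoints=200)
print("log-evidence:", result.logz, "+/-", result.logzerr)
print("posterior mean:", np.average(result.samples, weights=result.weights, axis=0))
```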

[ascl:2103.021] Carsus: Atomic database for astronomy

Carsus manages atomic datasets. It requires Chianti (ascl:9911.004), and can read data from a variety of sources and output them to file formats readable by radiative transfer codes such as TARDIS (ascl:1402.018).

[ascl:2103.020] ARTIS: 3D Monte Carlo radiative transfer code for supernovae

ARTIS is a 3D radiative transfer code for Type Ia supernovae using the Monte Carlo method with indivisible energy packets. It incorporates polarization and virtual packets and non-LTE physics appropriate for the nebular phase of Type Ia supernovae.

[ascl:2103.019] SUPERNU: Radiative transfer code for explosive outflows using Monte Carlo methods

SuperNu simulates time-dependent radiation transport in local thermodynamic equilibrium with matter. It applies the methods of Implicit Monte Carlo (IMC) and Discrete Diffusion Monte Carlo (DDMC) for static or homologously expanding spatial grids. The radiation field affects material temperature but does not affect the motion of the fluid. SuperNu may be applied to simulate radiation transport for supernovae with ejecta velocities that are not affected by radiation momentum. The physical opacity calculation includes elements from Hydrogen up to Cobalt. SuperNu is motivated by the ongoing research into the effect of variation in the structure of progenitor star explosions on observables: the brightness and shape of light curves and the temporal evolution of the spectra. Consequently, the code may be used to post-process data from hydrodynamic simulations. SuperNu does not include any capabilities or methods that allow for non-trivial hydrodynamics.

[ascl:2103.018] GalacticDNSMass: Bayesian inference determination of mass distribution of Galactic double neutron stars

GalacticDNSMass performs Bayesian inference on Galactic double neutron stars (DNS) to investigate their mass distribution. Each DNS comprises two neutron stars (NS), a recycled NS and a non-recycled (slow) NS. The code compares two hypotheses: A - recycled and non-recycled NS follow an identical mass distribution, and B - they are drawn from two distinct populations. Within each hypothesis it also explores three possible functional models: Gaussian, two-Gaussian (mixture model), and uniform mass distributions.

[ascl:2103.017] CRIME: Cosmological Realizations for Intensity Mapping Experiments

CRIME (Cosmological Realizations for Intensity Mapping Experiments) generates mock realizations of intensity mapping observations of the neutral hydrogen distribution. It contains three separate tools: GetHI, ForGet, and JoinT. GetHI generates realizations of the temperature fluctuations due to the 21cm emission of neutral hydrogen. Optionally, it can also generate a realization of the point-source continuum emission (for a given population) by sampling the same density distribution, though using this feature greatly affects performance. ForGet generates realizations of the different galactic and extra-galactic foregrounds relevant for intensity mapping experiments using external datasets (e.g., the Haslam 408 MHz map) stored in the "data" folder. JoinT is provided for convenience; it joins the temperature maps generated by GetHI and ForGet and includes several instrument-dependent effects (in an overly simplistic way).

[ascl:2103.016] RAiSERed: Analytic AGN model based code for radio-frequency redshifts

The RAiSERed (Radio AGN in Semi-analytic Environments: Redshifts) code implements the RAiSE analytic model for Fanaroff-Riley type II sources, using a Bayesian prior for their host cosmological environments, to measure the redshift of active galactic nuclei lobes based on radio-frequency observations. The Python code provides a class in which the user stores measured attributes for each radio source and from which model-derived redshift probability density functions are returned. Systematic uncertainties in the analytic model can be calibrated by specifying a subset of radio sources with spectroscopic redshifts. Functions are additionally provided to plot the redshift probability density functions and assess the success of the model calibration.

[ascl:2103.015] LPF: Real-time detection of transient sources in radio data streams

LPF (Live Pulse Finder) provides real-time automated analysis of the radio image data stream at multiple frequencies. The fully automated GPU-based machine-learning backed pipeline performs source detection, association, flux measurement and physical parameter inference. At the end of the pipeline, an alert of a significant detection of a transient event can be sent out and the data saved for further investigation.

[ascl:2103.014] QuickCBC: Rapid and reliable inference for binary mergers

QuickCBC is a robust end-to-end low-latency Bayesian parameter estimation algorithm for binary mergers. It reads in calibrated strain data, performs robust on-source spectral estimation, executes a rapid search for compact binary coalescence (CBC) signals, uses wavelet de-noising to subtract any glitches from the search residuals, produces low-latency sky maps and initial parameter estimates, and then performs full Bayesian parameter estimation.

[ascl:2103.013] schNell: Fast calculation of N_ell for GW anisotropies

schNell computes basic map-level noise properties for generic networks of gravitational wave interferometers, primarily the noise power spectrum "N_ell"; this lightweight Python module can also be used to compute, for example, antenna patterns, overlap functions, and inverse variance maps, among other quantities. The code has three main classes: Detectors contain information about each individual detector of the network, such as its position, noise properties, and orientation; NoiseCorrelations describes the noise-level correlation between pairs of detectors; and the MapCalculators class combines a list of Detectors into a network (potentially together with a NoiseCorrelation object) and computes the corresponding map-level noise properties arising from their correlations.

[ascl:2103.012] AstroNet-Triage: Neural network for TESS light curve triage

AstroNet-Triage contains TensorFlow models and data processing code for identifying exoplanets in astrophysical light curves; this is the triage version of two TESS neural networks. For the vetting version, see AstroNet-Vetting (ascl:2103.011). The TensorFlow code downloads and pre-processes TESS data, builds different types of neural network classification models, trains and evaluates new models, and generates new predictions using a trained model. Utilities that operate on light curves are provided; these read TESS data from .h5 files and perform phase folding, splitting, binning, and other tasks. C++ implementations of some light curve utilities are also included.
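
The phase-folding and binning steps those utilities perform can be illustrated with plain NumPy (a generic sketch, not AstroNet-Triage's own code; the bin count and phase convention are assumptions):

```python
import numpy as np

def phase_fold(time, flux, period, t0):
    """Fold a light curve on (period, t0) and sort by phase in [-0.5, 0.5)."""
    phase = ((time - t0 + 0.5 * period) % period) / period - 0.5
    order = np.argsort(phase)
    return phase[order], flux[order]

def median_bin(phase, flux, nbins=201):
    """Median-combine a folded light curve into fixed-width phase bins."""
    edges = np.linspace(-0.5, 0.5, nbins + 1)
    idx = np.digitize(phase, edges) - 1
    return np.array([np.median(flux[idx == i]) if np.any(idx == i) else np.nan
                     for i in range(nbins)])
```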

[ascl:2103.011] AstroNet-Vetting: Neural network for TESS light curve vetting

AstroNet-Vetting identifies exoplanets in astrophysical light curves. This is the vetting version of two TESS neural networks; for the triage version, see AstroNet-Triage (ascl:2103.012). The package contains TensorFlow code that downloads and pre-processes TESS data, builds different types of neural network classification models, trains and evaluates a new model, and uses a trained model to generate new predictions. It includes utilities for operating on light curves, such as for reading TESS data from .h5 files, phase folding, splitting, and binning. C++ implementations of some light curve utilities are also provided.

[ascl:2103.010] TransitFit: Exoplanet transit fitting package for multi-telescope datasets

TransitFit fits exoplanetary transit light-curves for transmission spectroscopy studies. The code uses nested sampling for efficient and robust multi-epoch, multi-wavelength fitting of transit data obtained from one or more telescopes. TransitFit allows per-telescope detrending to be performed simultaneously with parameter fitting, including the use of user-supplied detrending algorithms. Host limb darkening can be fitted either independently ("uncoupled") for each filter or combined ("coupled") using prior conditioning from the PHOENIX stellar atmosphere models. For this, TransitFit uses the Limb Darkening Toolkit (ascl:1510.003) together with filter profiles, including user-supplied filter profiles.

[ascl:2103.009] DarkEmulator: Cosmological emulation code for halo clustering statistics

The cosmology code DarkEmulator calculates summary statistics of large-scale structure constructed as part of the Dark Quest Project. The dark_emulator Python package enables fast and accurate computations of halo clustering quantities. The code supports the halo mass function, halo-matter cross-correlation, and halo auto-correlation as a function of halo mass, redshift, separation, and cosmological model.

[submitted] MRS: The MOS Reduction Software

The MRS (The MOS Reduction Software) suite reduces spectra taken with the multi-object spectrograph used as the focal plane instrument of the RTT150 telescope at the TÜBİTAK National Observatory.

[submitted] ObsPlanner

ObsPlanner is a simple program for planning and managing astronomical observations as an observational diary or log.

[ascl:2103.008] Pyedra: Python implementation for asteroid phase curve fitting

Pyedra performs asteroid phase curve fitting. From a simple table containing the asteroid MPC number, phase angle, and reduced magnitude, Pyedra estimates the parameters of the phase function using the least squares method. The user can choose from three different models for the phase curve fit: the H-G model, the H-G1-G2 model, and the Shevchenko model. The output in all cases is a table containing the adjusted parameters and their corresponding errors. The package can be used for phase-function analysis of a few asteroids as well as for processing large volumes of data such as those released by current large surveys.
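
The H-G fit it performs is the standard two-parameter phase function of Bowell et al. (1989). A generic least-squares sketch of that fit (not the Pyedra API; the phase angles and magnitudes below are placeholders) could read:

```python
import numpy as np
from scipy.optimize import curve_fit

def hg_model(alpha_deg, H, G):
    """Reduced magnitude from the H-G phase function (standard approximation)."""
    a = np.radians(alpha_deg)
    phi1 = np.exp(-3.33 * np.tan(a / 2.0) ** 0.63)
    phi2 = np.exp(-1.87 * np.tan(a / 2.0) ** 1.22)
    return H - 2.5 * np.log10((1.0 - G) * phi1 + G * phi2)

# placeholder phase angles (deg) and reduced magnitudes; real values come from the input table
alpha = np.array([2.0, 5.0, 8.0, 12.0, 17.0, 22.0])
vmag = np.array([7.31, 7.44, 7.55, 7.67, 7.82, 7.95])
popt, pcov = curve_fit(hg_model, alpha, vmag, p0=[7.0, 0.15])
perr = np.sqrt(np.diag(pcov))
print(f"H = {popt[0]:.3f} +/- {perr[0]:.3f},  G = {popt[1]:.3f} +/- {perr[1]:.3f}")
```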

[ascl:2103.007] TFF: Template Fourier Fitting

TFF derives the Fourier decomposition of period-folded RR Lyrae light curves with gaps. The method can be used for the same purpose on other types of variables, provided the template database is changed to the proper type of variable.

[ascl:2103.006] ggm: Gaussian gradient magnitude filtering of astronomical images

Ggm contains utilities for Gaussian gradient filtering of astronomical FITS images. It applies the Gaussian gradient magnitude filter to an input FITS image, using a particular scale, sigma, in pixels. ggm cosmetically hides point sources in FITS images by filling point sources with random values from the surrounding pixel region. It also provides an interactive tool to combine FITS images filtered on different scales.
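
The core filtering step can be reproduced with SciPy's gaussian_gradient_magnitude; the snippet below is a minimal sketch (the file names and sigma value are placeholders, and it is not the ggm tool itself):

```python
import numpy as np
from astropy.io import fits
from scipy.ndimage import gaussian_gradient_magnitude

# "image.fits" is a placeholder input file
with fits.open("image.fits") as hdul:
    data = hdul[0].data.astype(float)

sigma = 4.0                                           # filter scale in pixels
filtered = gaussian_gradient_magnitude(data, sigma=sigma)
fits.writeto("image_ggm_s4.fits", filtered, overwrite=True)
```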

[ascl:2103.005] satcand: Orbital stability and tidal migration constraints for KOI exomoon candidates

satcand applies theoretical constraints of orbital stability and tidal migration to KOI exomoon candidates. The package can evaluate the tidal migration within a Sun-Earth-Moon system, plot angular velocity over time, and calculate the migration time scale (T1) and the total migration time scale, among other things. In addition to the theoretical constraints, observational constraints can be applied.

[submitted] Deep Embedded Clustering for Open Cluster Characterization with Gaia DR2 Data

Characterizing and understanding Open Clusters (OCs) allows us to better understand the properties and mechanisms of the Universe, such as stellar formation and the regions in which these events occur. They also provide information about stellar processes and the evolution of the galactic disk.

In this paper, we present a novel method to characterize OCs. Our method employs a model built on Artificial Neural Networks (ANNs). More specifically, we adapted a state-of-the-art model, the Deep Embedded Clustering (DEC) model, for our purpose. The developed method aims to improve on classical state-of-the-art techniques, not only in terms of computational efficiency (with lower computational requirements) but also in usability (reducing the number of hyperparameters needed to obtain a good characterization of the analyzed clusters). For our experiments, we used the Gaia DR2 database as the data source and compared our model with the K-Means clustering technique. Our method achieves good results, in some cases better than current techniques.

[ascl:2103.004] redshifts: Spectroscopic redshifts search tool

redshifts collects all unique spectroscopic redshifts from online databases such as VizieR and NED. It can perform a flexible search within a radius of a given set of (RA, DEC) coordinates and uses column names and descriptions (including UCD keywords) to identify columns containing spectroscopic redshifts or velocities. It weeds out photometric redshifts and duplicates and returns a unique list of best spectroscopic redshift measurements. redshifts can be used standalone from the terminal, and can take a number of optional command line arguments, or from Python.
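
The kind of cone search and column screening involved can be illustrated with astroquery (a rough sketch only; the redshifts package's own interface and selection logic may differ, and the coordinates are placeholders):

```python
import astropy.units as u
from astropy.coordinates import SkyCoord
from astroquery.vizier import Vizier

coord = SkyCoord(ra=150.0 * u.deg, dec=2.2 * u.deg)   # placeholder target
Vizier.ROW_LIMIT = -1
tables = Vizier.query_region(coord, radius=2.0 * u.arcmin)

# flag columns whose descriptions look like (non-photometric) redshifts
z_cols = set()
for table in tables:
    for name in table.colnames:
        desc = (table[name].description or "").lower()
        if "redshift" in desc and "phot" not in desc:
            z_cols.add(name)
print("candidate spectroscopic-redshift columns:", sorted(z_cols))
```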

[ascl:2103.003] spalipy: Detection-based astronomical image registration

spalipy performs detection-based astronomical image registration in Python. A source image is transformed to the pixel-coordinate system of a template image using their respective detections as tie-points by finding matching quads of detections. spalipy also includes an optional additional warping of the initial affine transformation via splines to achieve accurate registration in the case of non-homogeneous coordinate transforms. This is particularly useful in the case of optically distorted or wide field-of-view images.

[ascl:2103.002] hfs_fit: Atomic emission spectral line hyperfine structure fitting

hfs_fit performs parameter optimization in the analysis of emission line hyperfine structure (HFS). The code uses a simulated annealing algorithm to optimize the magnetic dipole interaction constants, electric quadrupole interaction constants, Voigt profile widths and the center of gravity wavenumber for a given emission line profile. The fit can be changed visually with sliders for parameters, which is useful when HFS constants are unknown.
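
As a generic illustration of simulated-annealing line-profile fitting (a single Voigt component with made-up data, rather than hfs_fit's full hyperfine-structure model), one could use SciPy's dual_annealing:

```python
import numpy as np
from scipy.optimize import dual_annealing
from scipy.special import voigt_profile

def model(wn, centre, sigma, gamma, amp):
    return amp * voigt_profile(wn - centre, sigma, gamma)

def chi2(params, wn, obs):
    return np.sum((obs - model(wn, *params)) ** 2)

# toy spectrum around a placeholder centre-of-gravity wavenumber
wn = np.linspace(-1.0, 1.0, 400)
obs = model(wn, 0.05, 0.08, 0.03, 1.0) + 0.01 * np.random.default_rng(1).normal(size=wn.size)

bounds = [(-0.5, 0.5), (0.01, 0.3), (0.0, 0.3), (0.1, 5.0)]
result = dual_annealing(chi2, bounds, args=(wn, obs), seed=42)
print("fitted centre, sigma, gamma, amplitude:", result.x)
```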

[ascl:2103.001] 21cmDeepLearning: Matter density map extractor

21cmDeepLearning extracts the underlying matter density map from a 21 cm intensity field by making use of a convolutional neural network (CNN) with the U-Net architecture; the software is implemented in PyTorch. The astrophysical parameters of the simulations can be predicted with a secondary CNN. The simulations of matter density and 21 cm maps are performed with the code 21cmFAST (ascl:1102.023).

[submitted] synchrofit: Python-based synchrotron spectral fitting

The synchrofit (synchrotron fitter) package implements a reduced dimensionality parameterisation of standard synchrotron spectrum models, and provides fitting routines applicable for active galactic nuclei and supernova remnants. The Python code includes the Jaffe-Parola model (JP), Kardashev-Pacholczyk model (KP), and continuous injection models (CI/KGJP) for both constant or Maxwell-Boltzmann magnetic field distributions. An adaptive maximum likelihood algorithm is invoked to fit these models to multi-frequency radio observations; the adaptive mesh is customisable for either optimal precision or computational efficiency. Functions are additionally provided to plot the fitted spectral model with its confidence interval, and to derive the spectral age of the synchrotron emitting particles.

[submitted] U.S. Naval Observatory Ephemerides of the Largest Asteroids (USNO/AE98)

USNO/AE98 contains ephemerides for fifteen of the largest asteroids that The Astronomical Almanac has used since its 2000 edition. These ephemerides are based on the Jet Propulsion Laboratory (JPL) planetary ephemeris DE405 and are thus aligned to the International Celestial Reference System (ICRS). The data cover the period from 1799 November 16 (JD 2378450.5) through 2100 February 1 (JD 2488100.5). The internal uncertainty in the mean longitude at epoch, 1997 December 18, ranges from 0.05 arcseconds for 7 Iris through 0.22 arcseconds for 65 Cybele, and the uncertainty in the mean motion varies from 0.02 arcseconds per century for 4 Vesta to 0.14 arcseconds per century for 511 Davida.

The Astronomical Almanac has published ephemerides for 1 Ceres, 2 Pallas, 3 Juno, and 4 Vesta since its 1953 edition. Historically, these four asteroids have been observed more than any of the others. Ceres, Pallas, and Vesta deserve such attention because they are the three most massive asteroids, the source of significant perturbations of the planets, the largest in linear size, and among the brightest main belt asteroids. Studying asteroids may provide clues to the origin and primordial composition of the solar system, data for modeling the chaotic dynamics of small solar system bodies, and assessments of potential collisions. Therefore, USNO/AE98 includes more than the traditional four asteroids.

The following criteria were used to select main belt asteroids for USNO/AE98:

Diameter greater than 300 km, presumably among the most massive asteroids
Excellent observing history and discovered before 1850
Largest in their taxonomic class
The massive asteroids included may be studied for their perturbing effects on the planets while those with detailed observing histories may be used to evaluate the accuracy limits of asteroid ephemerides. The fifteen asteroids that met at least one of these criteria are

1 Ceres (new mass determination)
2 Pallas (new mass determination)
3 Juno
4 Vesta (new mass determination)
6 Hebe
7 Iris
8 Flora
9 Metis
10 Hygiea
15 Eunomia
16 Psyche
52 Europa
65 Cybele
511 Davida
704 Interamnia
The refereed paper by Hilton (1999, Astron. J. 117, 1077) describes the USNO/AE98 asteroid ephemerides in detail. The associated USNO/AA Tech Note 1998-12 includes residual plots for all fifteen asteroids and a comparison between these ephemerides and those used in The Astronomical Almanac through 1999.

Software to compact, read, and interpolate the USNO/AE98 asteroid ephemerides is also available. It is written in C and designed to work with the C edition of the Naval Observatory Vector Astrometry Software (NOVAS). The programs could be used with tabular ephemerides of other asteroids as well. The associated README file provides the details of this system.

[submitted] FLARE: Synthetic Fast Radio Burst catalog generator

FLARE, a parallel code written in Python, uses a Monte Carlo method to generate a synthetic catalog of 100,000 realistic Fast Radio Bursts (FRBs). The FRB population is diverse and includes sporadic FRBs, repeaters, and periodic repeaters, yet fewer than 200 FRBs have been detected to date, which makes understanding the population difficult; a large synthetic catalog can be analyzed later for further research. FLARE can simulate FRB distances (based on the observed FRB distance range), energies (based on the "flaring magnetar model" of FRBs), fluences, multi-wavelength counterparts (based on the X-ray to radio fluence ratio of FRB 200428), and other properties. It analyzes the resulting synthetic FRB catalog and displays the distribution of their properties. It is fast (owing to the parallel code) and requires minimal human interaction, and it is therefore able to give a broad picture of the FRB population.
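
The Monte Carlo idea — drawing each burst's properties from assumed distributions and assembling a catalog — can be sketched as follows; the distributions and fractions below are placeholders, not FLARE's calibrated choices.

```python
import numpy as np

rng = np.random.default_rng(7)
n_frb = 100_000

# placeholder property distributions
distance_gpc = rng.uniform(0.1, 4.0, n_frb)          # source distances
log_energy_erg = rng.uniform(38.0, 42.0, n_frb)      # burst energies
is_repeater = rng.random(n_frb) < 0.2                # fraction of repeating sources

catalog = np.rec.fromarrays([distance_gpc, log_energy_erg, is_repeater],
                            names=["distance_gpc", "log_energy_erg", "repeater"])
print(catalog[:3])
```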

[ascl:2102.030] GLEAM: Galaxy Line Emission and Absorption Modeling

GLEAM (Galaxy Line Emission and Absorption Modeling) fits Gaussian models to emission and absorption lines in large samples of 1D galaxy spectra. The code is tailored to work well without much human interaction on optical and infrared spectra in a wide range of instrument setups and signal-to-noise regimes. gleam creates a FITS table with Gaussian line measurements, including central wavelength, width, height, and amplitude, as well as estimates for the continuum under the line and the line flux, luminosity, equivalent width, and velocity width. gleam can also, optionally, make plots of the spectrum with fitted lines overlaid.
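
For a single emission line, the underlying measurement (a Gaussian on a local continuum, with derived flux and widths) can be sketched with SciPy; this is a generic illustration, not GLEAM's interface.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_plus_continuum(wave, amp, centre, sigma, cont):
    return cont + amp * np.exp(-0.5 * ((wave - centre) / sigma) ** 2)

def measure_line(wave, flux, centre_guess):
    """Fit one line and derive integrated flux, equivalent width, and velocity width."""
    p0 = [flux.max() - np.median(flux), centre_guess, 3.0, np.median(flux)]
    popt, _ = curve_fit(gauss_plus_continuum, wave, flux, p0=p0)
    amp, centre, sigma, cont = popt
    line_flux = amp * sigma * np.sqrt(2.0 * np.pi)    # integrated Gaussian flux
    ew = line_flux / cont                             # equivalent width
    fwhm_kms = 2.3548 * sigma / centre * 2.998e5      # velocity width (km/s)
    return {"centre": centre, "flux": line_flux, "ew": ew, "fwhm_kms": fwhm_kms}
```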

[ascl:2102.029] BALRoGO: Bayesian Astrometric Likelihood Recovery of Galactic Objects

BALRoGO (Bayesian Astrometric Likelihood Recovery of Galactic Objects) handles data from the Gaia space mission. It extracts galactic objects such as globular clusters and dwarf galaxies from data contaminated by interlopers using a combination of Bayesian and non-Bayesian approaches. It fits proper motion space, surface density, and the object center. It also provides confidence regions for the color-magnitude diagram and parallaxes.

[ascl:2102.028] PyAutoFit: Classy probabilistic programming

PyAutoFit supports advanced statistical methods such as massively parallel non-linear search grid-searches, chaining together model-fits and sensitivity mapping. It is a Python-based probabilistic programming language which composes and fits models using a range of Bayesian inference libraries, such as emcee (ascl:1303.002) and dynesty (ascl:1809.013). It performs model composition and customization, outputting results, model-specific visualization and posterior analysis. Built for big-data analysis, results are output as a database which can be loaded after model-fitting is complete.

[ascl:2102.027] PyFstat: Continuous gravitational-wave data analysis

PyFstat performs F-statistic-based continuous gravitational wave (CW) searches and other CW data analysis tasks. It is built on top of the LALSuite library (ascl:2012.021), making that library's functionality more accessible through a Python interface; it also provides MCMC-based followup of promising candidates from wide-parameter-space searches.

[ascl:2102.026] extinction: Dust extinction laws

extinction is an implementation of fast interstellar dust extinction laws in Python. It contains Cython-optimized implementations of empirical dust extinction laws found in the literature. Flux values can be reddened or dereddened using included functions, and all extinction laws accept a unit keyword to change the interpretation of the wavelength array from Angstroms to inverse microns. Part of this code originated in the specutils package (ascl:1902.012).
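
A short usage sketch based on the package's documented interface (the wavelengths and flux values are arbitrary):

```python
import numpy as np
import extinction

wave = np.array([4000.0, 5000.0, 6000.0])        # wavelengths in Angstroms
flux = np.array([1.0, 1.2, 1.1])

a_lambda = extinction.ccm89(wave, 1.0, 3.1)      # A(lambda) for A_V = 1.0, R_V = 3.1
reddened = extinction.apply(a_lambda, flux)      # redden the flux values
dereddened = extinction.remove(a_lambda, reddened)

# the unit keyword switches the wavelength interpretation to inverse microns
a_invum = extinction.ccm89(1e4 / wave, 1.0, 3.1, unit="invum")
```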

[ascl:2102.025] binaryoffset: Detecting and correcting the binary offset effect in CCDs

binaryoffset identifies the binary offset effect in images from any detector. The easiest input to work with is a dark or bias image that is spatially flat. The code can also be run on images that are not spatially flat, assuming that there is some model of the signal on the CCD that can be used to produce a residual image.

[ascl:2102.024] Piff: PSFs In the Full FOV

Piff models the point-spread function (PSF) across multiple detectors in the full field of view (FOV). Models can be built in chip coordinates or in sky coordinates if needed to account for the effects of astrometric distortion. The software can fit in either real or Fourier space, and can identify and excise outlier stars that are poor exemplars of the PSF according to some metric.

[ascl:2102.023] Multi_CLASS: Cross-tracer angular power spectra of number counts using CLASS

Multi_CLASS modifies the Boltzmann code CLASS (ascl:1106.020) to compute the cross-tracer angular power spectra of the number count fluctuations for two different tracers of the underlying dark matter density field. In other words, it generalizes the standard nCl output option of CLASS to the case of two different tracers, for example, two different galaxy populations, each with its own redshift distribution and galaxy and magnification bias parameters, among others.

Multi_CLASS also includes an implementation of the effect of primordial non-Gaussianities of the local type, parametrized by the parameter f_NL (following the large-scale structure convention), on the effective bias of the tracers. There is also the possibility of having a tilted non-Gaussian correction, parametrized by n_NG, with a pivot scale determined by k_pivot_NG. The package includes galaxy redshift distributions for forthcoming galaxy surveys, with the ease of choosing between them (or an input file) from the parameters input file (e.g., multi_explanatory.ini). In addition, Multi_CLASS includes the possibility of using resolved gravitational wave events as a tracer.

[ascl:2102.022] RASSINE: Normalizing 1D stellar spectra

RASSINE normalizes merged 1D spectra using the concept of convex hulls. The code uses six parameters that can be fine-tuned, and provides an interactive interface, including graphical feedback, for easily choosing the parameters. RASSINE can also provide a first guess for the parameters that are derived directly from the merged 1D spectrum based on previously performed calibrations.
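
The convex-hull idea behind the continuum placement can be illustrated in a few lines (a generic sketch of an upper-hull continuum, not RASSINE's parameterized implementation):

```python
import numpy as np

def _cross(o, a, b):
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull_normalize(wave, flux):
    """Estimate the continuum as the upper convex hull of (wave, flux) and normalize."""
    pts = sorted(zip(wave, flux))
    upper = []
    for p in pts:
        while len(upper) >= 2 and _cross(upper[-2], upper[-1], p) >= 0:
            upper.pop()
        upper.append(p)
    anchor_w, anchor_f = map(np.asarray, zip(*upper))
    continuum = np.interp(wave, anchor_w, anchor_f)   # interpolate between hull anchor points
    return flux / continuum
```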

[ascl:2102.021] lensingGW: Lensing of gravitational waves

lensingGW simulates lensed gravitational waves in ground-based interferometers from arbitrary compact binaries and lens models. Its algorithm resolves strongly lensed images and microimages simultaneously, such as the images resulting from hundreds of microlenses embedded in galaxies and galaxy clusters. It is based on Lenstronomy (ascl:1804.012).

[ascl:2102.020] MOSAIC: Multipole operator generator for Fast Multipole Method operators

MOSAIC (Multipole Operators in Symbols, Automatically Improved and Condensed) automatically produces, verifies, and optimizes computer code for Fast Multipole Method (FMM) operators. It is based on a symbolic algebra library, and can produce code for any expansion order and be extended to use any basis or kernel function. The code applies algebraic modifications to reduce the number of floating-point operations and can symbolically verify correctness.
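
The symbolic-algebra-to-code idea can be illustrated with SymPy on a toy kernel derivative (this is not MOSAIC's operator set or output format):

```python
import sympy as sp

x, y, z = sp.symbols("x y z", real=True)
r = sp.sqrt(x**2 + y**2 + z**2)

# a low-order multipole-like term: second derivative of the 1/r kernel
expr = sp.simplify(sp.diff(1 / r, x, 2))

# common-subexpression elimination reduces the floating-point operation count,
# then the optimized expression is emitted as C code
temps, reduced = sp.cse(expr)
for name, sub in temps:
    print(f"double {name} = {sp.ccode(sub)};")
print(f"result = {sp.ccode(reduced[0])};")
```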

[ascl:2102.019] HUAYNO: Hierarchically split-Up AstrophYsical N-body sOlver N-body code

HUAYNO implements integrators derived from second order Hamiltonian splitting for N-body dynamics. This integration scheme conserves energy and momentum with little or no systematic drift. The code uses an explicit but approximate formula for the time symmetrization that is compatible with the use of individual time steps, making an iterative scheme unnecessary. HUAYNO is available as part of the AMUSE package (ascl:1107.007).

[ascl:2102.018] DaMaSCUS-SUN: Dark Matter Simulation Code for Underground Scatterings - Sun Edition

DaMaSCUS-SUN is a Monte Carlo tool simulating the process of solar reflection of dark matter (DM) particles. It provides precise estimates of the DM particle flux reflected by the Sun and passing through a direct detection experiment on Earth. One application is to compute exclusion limits for low DM masses based on nuclear and electron recoil experiments.

[ascl:2102.017] mirkwood: SED modeling using machine learning

mirkwood uses supervised machine learning to model the non-linear mapping of galaxy fluxes to their physical properties. Multiple models are stacked to mitigate poor performance by any individual model in a given region of the parameter space. The code accounts for uncertainties arising both from intrinsic noise in observations and from finite training data and incorrect modeling assumptions, and provides highly accurate physical properties from observations of galaxies as compared to traditional SED fitting.
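
Model stacking of this kind can be sketched with scikit-learn (a generic illustration with random placeholder data, not mirkwood's estimators or training set):

```python
import numpy as np
from sklearn.ensemble import StackingRegressor, GradientBoostingRegressor, RandomForestRegressor
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import train_test_split

# X would hold galaxy fluxes and y a physical property; here both are random placeholders
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=500)

stack = StackingRegressor(
    estimators=[("gbr", GradientBoostingRegressor()),
                ("rf", RandomForestRegressor(n_estimators=200))],
    final_estimator=RidgeCV(),
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
stack.fit(X_train, y_train)
print("held-out R^2:", stack.score(X_test, y_test))
```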

[ascl:2102.016] OPUS: Interoperable access to analysis and simulation codes

OPUS (Observatoire de Paris UWS System) provides interoperable access to analysis and simulation codes on local machines or work clusters. This job control system was developed using the micro-framework bottle.py, and executes jobs asynchronously to better manage jobs with a long execution duration. The software follows the proposed IVOA Provenance Data Model to capture and expose the provenance information of jobs and results.
