Astrophysics Source Code Library

Making codes discoverable since 1999

Searching for codes credited to 'Bryan, Greg'

[ascl:1010.072] Enzo: AMR Cosmology Application

Enzo is an adaptive mesh refinement (AMR), grid-based hybrid code (hydrodynamics + N-body) designed to simulate cosmological structure formation. It uses the algorithms of Berger & Colella to improve spatial and temporal resolution in regions of large gradients, such as gravitationally collapsing objects. Enzo is highly flexible and, with its available physics packages, can be used to simulate a wide range of cosmological scenarios.

Enzo has been parallelized using the MPI message-passing library and can run on any shared- or distributed-memory parallel supercomputer or PC cluster. Simulations using as many as 1024 processors have been carried out successfully on Blue Horizon, an IBM SP at the San Diego Supercomputer Center.
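
Enzo's AMR outputs are commonly examined with the yt analysis toolkit, which reads the Enzo format natively. Below is a minimal sketch, assuming a hypothetical data dump named DD0046 in Enzo's usual output convention:

```python
import yt

# Load an Enzo output; "DD0046/DD0046" is a hypothetical dataset name
# following Enzo's data-dump directory convention.
ds = yt.load("DD0046/DD0046")

# Summarize the AMR hierarchy: grid and cell counts per refinement level.
ds.print_stats()

# Project gas density along the z-axis and save the image.
p = yt.ProjectionPlot(ds, "z", ("gas", "density"))
p.save("density_projection.png")
```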

[ascl:1612.020] Grackle: Chemistry and radiative cooling library for astrophysical simulations

The chemistry and radiative cooling library Grackle provides options for primordial chemistry and cooling, photo-heating and photo-ionization from UV backgrounds, and support for user-provided arrays of volumetric and specific heating rates in astrophysical simulations and models. The library provides functions to update chemistry species; solve radiative cooling and update internal energy; and calculate the cooling time, temperature, pressure, and ratio of specific heats (gamma). It has interfaces for C, C++, Fortran, and Python codes.
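
As a sketch of the Python interface, the example below configures a solver, evolves a single-cell fluid container, and computes its temperature. Parameter and field names follow the pygrackle bindings described in the Grackle documentation, but the data file path, unit values, and time step are placeholder assumptions and have not been checked against a specific Grackle version:

```python
from pygrackle import chemistry_data, FluidContainer

# Configure the solver; attribute names mirror Grackle's run-time parameters.
my_chem = chemistry_data()
my_chem.use_grackle = 1
my_chem.with_radiative_cooling = 1
my_chem.primordial_chemistry = 1                        # atomic H/He network
my_chem.UVbackground = 1
my_chem.grackle_data_file = "CloudyData_UVB=HM2012.h5"  # placeholder path

# Unit system (placeholder CGS-based values; real units come from the host code).
my_chem.comoving_coordinates = 0
my_chem.a_units = 1.0
my_chem.a_value = 1.0
my_chem.density_units = 1.67e-24   # ~ hydrogen mass density, g/cm^3
my_chem.length_units = 3.086e21    # ~ 1 kpc in cm
my_chem.time_units = 3.156e13      # ~ 1 Myr in s
my_chem.velocity_units = my_chem.length_units / my_chem.time_units
my_chem.initialize()

# One-cell fluid container: set density and internal energy, then evolve.
fc = FluidContainer(my_chem, 1)
fc["density"][:] = 1.0             # code units (assumed value)
fc["energy"][:] = 1000.0           # specific internal energy (assumed value)
fc.solve_chemistry(0.01)           # update species and internal energy over dt
fc.calculate_temperature()         # fills fc["temperature"]
print(fc["temperature"])
```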

[ascl:2403.011] LtU-ILI: Robust machine learning in astro

LtU-ILI (Learning the Universe Implicit Likelihood Inference) performs machine learning parameter inference. Given labeled training data or a stochastic simulator, the LtU-ILI pipeline automatically trains state-of-the-art neural networks to learn the data-parameter relationship and produces robust, well-calibrated posterior inference. The package supports a wide range of customizable methods: posterior-, likelihood-, and ratio-estimation approaches to ILI along with their sequential-learning analogs, and various neural density estimators, including mixture density networks, conditional normalizing flows, and ResNet-like ratio classifiers. It offers fully customizable embedding networks, including CNNs and graph neural networks, and a unified interface to multiple ILI backends such as sbi, pydelfi, and lampe. LtU-ILI also provides multiple marginal and multivariate posterior coverage metrics, Jupyter and command-line interfaces, and a parallelizable configuration framework for efficient hyperparameter tuning and production runs.
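
LtU-ILI's own interface is configuration-driven, so rather than guess at its runner API, the sketch below illustrates the underlying workflow using sbi, one of the backends LtU-ILI wraps, to train a neural posterior estimator on a toy simulator. The simulator, prior bounds, and sample counts are illustrative assumptions:

```python
import torch
from sbi.inference import SNPE
from sbi.utils import BoxUniform

# Toy problem: parameters theta in [-2, 2]^2, observations x = theta + noise.
prior = BoxUniform(low=-2.0 * torch.ones(2), high=2.0 * torch.ones(2))
theta = prior.sample((2000,))
x = theta + 0.1 * torch.randn_like(theta)

# Train a neural posterior estimator on the simulated (theta, x) pairs.
inference = SNPE(prior=prior)
inference.append_simulations(theta, x).train()
posterior = inference.build_posterior()

# Draw posterior samples for a made-up observation.
x_obs = torch.tensor([0.5, -0.3])
samples = posterior.sample((5000,), x=x_obs)
print(samples.mean(dim=0))  # should land near x_obs for this toy model
```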