Merge pull request #81 from aimalz/issue/80/descope
Issue/80/descope
aimalz committed Jul 22, 2020
2 parents 2940d3a + 0702d4e commit e69960c
Showing 53 changed files with 706 additions and 578 deletions.
16 changes: 8 additions & 8 deletions .travis.yml
@@ -4,28 +4,28 @@ python: 2.7
install:
- sudo apt-get update
# Install the dependencies and the package:
- pip install -r requirements.txt
- python setup.py install
# - pip install -r requirements.txt
# - python setup.py install
# Now get set up to run jupyter notebooks:
- sudo apt-get install texlive-latex-recommended
- sudo apt-get install texlive-latex-extra
- sudo apt-get install texlive-fonts-recommended
- sudo apt-get install texlive-publishers
- sudo apt-get install chktex
- sudo apt-get install dvipng
- pip install --upgrade jupyter
- pip install nbconvert
# - pip install --upgrade jupyter
# - pip install nbconvert
# Finally get set up to build the docs
- pip install sphinx
- pip install sphinx_rtd_theme
# - pip install sphinx
# - pip install sphinx_rtd_theme

script:
# Run the unit tests:
# - nosetests
# Run the demo notebook:
- jupyter nbconvert --ExecutePreprocessor.kernel_name=python --ExecutePreprocessor.timeout=600 --to notebook --execute docs/notebooks/demo2.ipynb
# - jupyter nbconvert --ExecutePreprocessor.kernel_name=python --ExecutePreprocessor.timeout=600 --to notebook --execute docs/notebooks/demo2.ipynb
# Build the docs, the same way readthedocs does it:
- cd docs ; sphinx-build -b html . _build/html ; cd -
# - cd docs ; sphinx-build -b html . _build/html ; cd -
# Compile the paper
- cd research/paper; make

22 changes: 16 additions & 6 deletions README.md
@@ -1,25 +1,35 @@
# chippr

Cosmological Hierarchical Inference with Probabilistic Photometric Redshifts.
Cosmological Hierarchical Inference with Probabilistic Photometric Redshifts

## Motivation

This repository is the home of `chippr`, a Python package for estimating quantities of cosmological interest from surveys of photometric redshift posterior probability distributions. It is a refactoring of my previous [work](https://github.com/aimalz/prob-z) on probabilistic photometric redshifts.
This repository is the home of `chippr`, a Python package for estimating quantities of cosmological interest from surveys of photometric redshift posterior probability distributions.
It is a refactoring of previous [work](https://github.com/aimalz/prob-z) on using probabilistic photometric redshifts to infer the redshift distribution.

## Examples

You can browse the demo notebook here in this repo:
You can browse the demo notebook here:

* [Basic Demo for Python 2.7](http://htmlpreview.github.io/?https://github.com/aimalz/chippr/blob/master/docs/notebooks/demo2.html)

## Documentation

Documentation can be found on [ReadTheDocs](http://chippr.readthedocs.io/en/master/). The draft of the paper documenting the details of the method can be found [here](https://github.com/aimalz/chippr/blob/master/research/paper/draft.pdf).
Documentation can be found on [ReadTheDocs](http://chippr.readthedocs.io/en/master/).
The draft of the paper documenting the details of the method can be found [here](https://github.com/aimalz/chippr/blob/master/research/paper/draft.pdf).

## Disclaimer

As can be seen from the git history and Python version, this code is stale and should be understood to be a prototype, originally scoped out for applicability to SDSS DR10-era data of low dimensionality.
As a disclaimer, it will need a major upgrade for flexibility and computational scaling before it can run on data sets like those of modern and future galaxy surveys.

## People

* [Alex Malz](https://github.com/aimalz/qp/issues/new?body=@aimalz) (NYU)
* [Alex I. Malz](https://github.com/aimalz) (German Centre for Cosmological Lensing)

## License, Contributing etc

The code in this repo is available for re-use under the MIT license, which means that you can do whatever you like with it, just don't blame me. If you end up using any of the code or ideas you find here in your academic research, please cite me as `Malz et al, in preparation\footnote{\texttt{https://github.com/aimalz/chippr}}`. If you are interested in this project, please do drop me a line via the hyperlinked contact name above, or by [writing me an issue](https://github.com/aimalz/chippr/issues/new). To get started contributing to the `chippr` project, just fork the repo - pull requests are always welcome!
The code in this repo is available for re-use under the MIT license, which means that you can do whatever you like with it, just don't blame me.
If you end up using any of the code or ideas you find here in your academic research, please cite me as `Malz et al, in preparation\footnote{\texttt{https://github.com/aimalz/chippr}}`.
If you are interested in this project, please do drop me a line via the hyperlinked contact name above, or by [writing me an issue](https://github.com/aimalz/chippr/issues/new).
To get started contributing to the `chippr` project, just fork the repo -- pull requests are always welcome!
1 change: 1 addition & 0 deletions chippr/catalog.py
@@ -47,6 +47,7 @@ def __init__(self, params={}, vb=True, loc='.', prepend=''):
if vb:
print self.params

np.random.seed(d.seed)
self.cat = {}

self.dir = loc
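The one-line addition above seeds NumPy's global random number generator when a `catalog` is constructed, so repeated runs produce identical mock draws. A minimal sketch of the effect, using a stand-in integer for `d.seed` (assumed here to come from chippr's defaults module):

```python
import numpy as np

seed = 42  # stand-in for d.seed; the actual value lives in chippr's defaults

# Seeding the global RNG once, as catalog.__init__ now does, makes every
# subsequent draw reproducible across runs.
np.random.seed(seed)
first_draw = np.random.uniform(0., 1., 5)

np.random.seed(seed)
second_draw = np.random.uniform(0., 1., 5)

assert np.allclose(first_draw, second_draw)  # identical draws from the same seed
```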
8 changes: 4 additions & 4 deletions chippr/catalog_plots.py
@@ -139,10 +139,10 @@ def plot_mega_scatter(zs, pfs, z_grid, grid_ends, truth=None, plot_loc='', prepe
limval = (max(grid_ends) - min(grid_ends)) / (len(grid_ends) - 1.)
scatplot.set_xlim([min(grid_ends)-limval, max(grid_ends)+limval])
scatplot.set_ylim([min(grid_ends)-limval, max(grid_ends)+limval])
scatplot.set_xticks(np.linspace(min(grid_ends), np.ceil(max(grid_ends)), 5))
scatplot.set_yticks(np.linspace(min(grid_ends), np.ceil(max(grid_ends)), 5))
scatplot.set_xlabel(r'$z_{spec}$')
scatplot.set_ylabel(r'$z_{phot}$')
scatplot.set_xticks(np.linspace(np.floor(min(grid_ends)), np.ceil(max(grid_ends)), 5))
scatplot.set_yticks(np.linspace(np.floor(min(grid_ends)), np.ceil(max(grid_ends)), 5))
scatplot.set_xlabel(r'$z_{true}$')
scatplot.set_ylabel(r'$z_{est}$')
scatplot.text(0.25, 3., r'mock $p(z\mid \mathrm{``data"})$', rotation=0, size=20)

# scatplot.set_aspect(1.)
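The switch from `min(grid_ends)` to `np.floor(min(grid_ends))` mirrors the existing `np.ceil` on the upper edge: the five tick marks land on whole numbers that bracket the grid even when the lowest bin edge is not a round value (the axis labels are also renamed from spectroscopic/photometric to true/estimated redshift). A small sketch with hypothetical bin edges:

```python
import numpy as np

grid_ends = np.linspace(0.01, 3.51, 36)  # hypothetical redshift bin edges

# Old behaviour: ticks start at the raw lower edge, cluttering the axis;
# flooring/ceiling both ends gives round-number ticks spanning the grid.
old_ticks = np.linspace(min(grid_ends), np.ceil(max(grid_ends)), 5)
new_ticks = np.linspace(np.floor(min(grid_ends)), np.ceil(max(grid_ends)), 5)

print(old_ticks)  # [0.01   1.0075 2.005  3.0025 4.    ]
print(new_ticks)  # [0. 1. 2. 3. 4.]
```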
11 changes: 8 additions & 3 deletions chippr/log_z_dens_plots.py
@@ -317,7 +317,7 @@ def plot_estimators(info, plot_dir, log=True, prepend='', metrics=True, mini=Tru
f = plt.figure(figsize=(7.5, 7.5))
sps_log = f.add_subplot(1, 1, 1)
sps_log.set_xlabel(r'$z$')
sps_log.set_xticks(np.linspace(min(info['bin_ends']), np.ceil(max(info['bin_ends'])), 5))
sps_log.set_xticks(np.linspace(np.floor(min(info['bin_ends'])), np.ceil(max(info['bin_ends'])), 5))
# mini_sps.ticklabel_format(style='sci',axis='y')

# tru, =
@@ -344,7 +344,8 @@ def plot_estimators(info, plot_dir, log=True, prepend='', metrics=True, mini=Tru
sps_log.set_ylim(-4., 1.)
sps_log.set_ylabel(r'$\ln[n(z)]$')
else:
sps_log.set_ylim(0., 3.)
sps_log.set_ylim(0., 4.)
sps_log.set_yticks([0,1,2,3,4])
sps_log.set_ylabel(r'$n(z)$')
sps_log.set_xlim(info['bin_ends'][0], info['bin_ends'][-1])

@@ -501,7 +502,11 @@ def plot_estimators(info, plot_dir, log=True, prepend='', metrics=True, mini=Tru

# sps_log.legend(handles=color_plots[:-1], fontsize='x-small', loc='lower center', frameon=False)
sps_log.legend(fontsize='large', loc='upper right', frameon=False)
sps_log.text(0.25, -3.75, r'inferred $n(z)$', rotation=0, size=20)
if log:
sps_log.text(0.25, -3.75, r'inferred $n(z)$', rotation=0, size=20)
else:
sps_log.text(2., 0.75, r'inferred $n(z)$', rotation=0, size=20)

f.subplots_adjust(hspace=0, wspace=0)
f.savefig(os.path.join(plot_dir, prepend+'estimators.png'), bbox_inches='tight', pad_inches = 0, dpi=d.dpi)
print(info['stats'])
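The label placement becomes conditional because the old data coordinates (0.25, -3.75) sit below the widened linear-scale y-range of (0, 4); only the log-scale panel keeps them. A standalone sketch of the two branches, assuming nothing beyond matplotlib (this is not the full `plot_estimators` machinery):

```python
import matplotlib.pyplot as plt

fig, sps_log = plt.subplots(figsize=(7.5, 7.5))
log = False  # toggle between the ln[n(z)] and n(z) panels

if log:
    sps_log.set_ylim(-4., 1.)
    sps_log.set_ylabel(r'$\ln[n(z)]$')
    sps_log.text(0.25, -3.75, r'inferred $n(z)$', size=20)  # fits the log-scale range
else:
    sps_log.set_ylim(0., 4.)
    sps_log.set_yticks([0, 1, 2, 3, 4])
    sps_log.set_ylabel(r'$n(z)$')
    sps_log.text(2., 0.75, r'inferred $n(z)$', size=20)  # moved inside the linear range

fig.savefig('estimators_sketch.png', bbox_inches='tight')
```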
4 changes: 2 additions & 2 deletions research/paper/cleaner.sh
@@ -1,7 +1,7 @@
#infile='thesis.tex'
#outfile='draft.tex'

sed -e "s/{\\textbackslash}'\\\\{i\\\\}/\\'{i}/g" thesis.bib | grep -E -v '^\W*(doi|url|urldate|abstract|file|keywords|annote|note)' > thesis.bib.tmp
mv thesis.bib.tmp thesis.bib
sed -e "s/{\\textbackslash}'\\\\{i\\\\}/\\'{i}/g" draft.bib | grep -E -v '^\W*(doi|url|urldate|abstract|file|keywords|annote|note)' > draft.bib.tmp
mv draft.bib.tmp draft.bib

#grep -E -v '^(%|[[:blank:]]*%|\\COMMENT|\includegraphics*)' "$infile" | fold -w80 -s > "$outfile"