Submitting author: @ruthangus (Ruth Angus)
Repository: https://github.com/RuthAngus/stardate
Version: v0.1.0
Editor: @arfon
Reviewers: @mattpitkin, @nespinoza
Archive: 10.5281/zenodo.3444905
Status badge code:
HTML: <a href="http://joss.theoj.org/papers/ee2bbcd6b8fd88492d60f2fe77f4fcdd"><img src="http://joss.theoj.org/papers/ee2bbcd6b8fd88492d60f2fe77f4fcdd/status.svg"></a>
Markdown: [![status](http://joss.theoj.org/papers/ee2bbcd6b8fd88492d60f2fe77f4fcdd/status.svg)](http://joss.theoj.org/papers/ee2bbcd6b8fd88492d60f2fe77f4fcdd)
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
@mattpitkin & @nespinoza, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @arfon know.
✨ Please try and complete your review in the next two weeks ✨
Does the paper.md file include a list of authors with their affiliations?

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @mattpitkin, it looks like you're currently assigned as the reviewer for this paper :tada:.
:star: Important :star:
If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews.
To fix this do the following two things:
For a list of things I can do to help you, just type:
@whedon commands
Attempting PDF compilation. Reticulating splines etc...
@mattpitkin, @nespinoza - please carry out your review in this issue by updating the checklist above and giving feedback in this issue. The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html
Any questions/concerns please let me know.
Hi @RuthAngus, one of the things we're asked to check is:
Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
Would you be able to add a few sentences to the README file and main documentation page telling people how they should contribute to the project? Just something along the lines of: if they want to contribute, they can submit issues or pull requests via the GitHub repo.
Absolutely! I have added a sentence to the bottom of the README.
@RuthAngus thanks for adding this. I noticed that in the raw version of the README.rst file there's some info on installing and an example of running the code, but these are commented out. Even with the link to the documentation that exists I think it would still be useful to have that info visible. Could you uncomment them?
@mattpitkin good suggestion -- I have now done so!
@RuthAngus I'm working through the very nice example here https://stardate.readthedocs.io/en/latest/tutorials/Tutorial.html#. When I get to the code block after the sentence "If you know the extinction you can pass that to the Star object too:"
it throws an error because `Av` and `Av_err` are not defined. I'm assuming this `Av` is the extinction value (and `Av_err` its associated error) that the previous sentence mentions, but this should be made explicit in the sentence, or the values should be removed from the example, so that it will run without an error if people have been following step-by-step.
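For instance, the tutorial block would run standalone if the values were defined first. A minimal sketch, where the numbers are placeholders and the commented `Star` call is an illustrative assumption rather than verified stardate API usage:

```python
# Hypothetical V-band extinction (Av) and its uncertainty (Av_err);
# the numbers here are placeholders, not values from the tutorial.
Av, Av_err = 0.2, 0.05

# These could then be passed to the Star object, e.g. (sketch only,
# not checked against the actual stardate signature):
# star = stardate.Star(iso_params, prot, prot_err, Av=Av, Av_err=Av_err)
```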
@RuthAngus Could you add `emcee` to the dependencies listed in https://stardate.readthedocs.io/en/latest/user/install.html#dependencies?
@RuthAngus In the paper it's probably worth mentioning that the fitting uses the MCMC software `emcee` (with a reference) and therefore returns samples from the parameters' posterior probability distributions.
Related to this, is there any way to set the priors used for any of the fitted parameters?
@RuthAngus are there any plans to add a test suite and continuous integration to the package? I don't think this is a requirement for software that is published in JOSS (@arfon?), but it would be nice and might be useful for future development?
@RuthAngus @arfon Once the above questions have been answered I'm happy to sign this off as reviewed.
The JOSS guidelines on testing require at _least_ the following:
Documented manual steps that can be followed to objectively check the expected functionality of the software (e.g. a sample input file to assert behaviour)
We definitely prefer automated tests (ideally hooked up to Travis/Circle CI etc too) but don't require them. It is required though that there are documented procedures that the reviewer can follow to _objectively_ check that the software is doing what is expected.
@mattpitkin - does that help?
@arfon there is a very nice example and tutorial of using the code, but I think from what you say this may not quite be enough. As the code is doing stochastic sampling, the example given won't necessarily produce identical results each time.
@RuthAngus here's what I'd suggest by way of a test, which may be useful. First, make the `star` class, or its `run_mcmc()` method, accept a random state variable of the form required by emcee. With this you'll be able to seed the sampler to reproduce exactly the same results each time. Then run the code for a particular star that you've analysed before, using a random state initialised with a particular seed number, and probably starting the sampler very close to the already known maximum posterior values, generating a few hundred samples. Save these samples and add them into the repo in a `test` directory. Also, add the script that you've just used to produce the samples to the `test` directory, but change it so that the samples get output to a different file. Finally, have the script read in the previously generated samples file and check that the new samples it produces are identical to the already existing samples (you could compare each sample, or just compare statistics like the mean, median and credible intervals to make sure they are the same). Does that sound doable?
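The statistics-based comparison could be sketched like this; the function name, tolerance, and choice of statistics are assumptions, and file handling is omitted:

```python
import statistics

def samples_consistent(reference, new, tol=1e-8):
    """Check that summary statistics of freshly generated MCMC samples
    match those of a previously saved reference set.

    This is only a sketch of the suggested regression test; the
    tolerance and the statistics compared are illustrative choices.
    """
    for stat in (statistics.mean, statistics.median, statistics.stdev):
        if abs(stat(reference) - stat(new)) > tol:
            return False
    return True

# Usage sketch: load the saved samples from the test directory,
# regenerate with the same seed, then
# assert samples_consistent(saved_samples, regenerated_samples)
```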
Hi @mattpitkin, thanks for these comments!
I have now fixed the Av bug in the tutorial and added emcee to the list of dependencies. Thanks for running through the tutorial! There is not currently any support for manually adjusting the priors, but that's a feature I'd love to add in future. Would you mind if I tabled this for the next version? The priors that I'm using are described in the appendix of the submitted AAS paper that accompanies this software. I have also added a mention and reference to emcee in the JOSS paper, thanks for suggesting that!
I do currently have some tests in the /stardate/tests directory but I don't provide instructions on how to run these. Should I add some instructions to the docs on how to run the tests with pytest or something? The test in integration_tests.py does something similar to what you suggest. It does use a fixed random number seed, but I had previously hard-coded this into the function that runs emcee, which doesn't seem like the most sensible choice :-). I have now added an argument to set the seed in the fit() function. This test doesn't save samples like you suggest, which is a great idea, but that would fail each time I changed something in the likelihood function or prior, so I'd need to regenerate the sample file each time I did that. Do you think this is worthwhile doing regardless? I also agree that adding continuous integration would be really nice. I've never set this up before and I'm not that familiar with how, e.g. Travis works, but would it be a problem that the big file of MIST model isochrones needs to be downloaded in order to run the tests?
@whedon generate pdf
Attempting PDF compilation. Reticulating splines etc...
@RuthAngus here are some replies inline.
I have now fixed the Av bug in the tutorial and added emcee to the list of dependencies.
Great, thanks. I think emcee is still missing from the installation requirements described in the README file, but otherwise it's looking good.
I do currently have some tests in the /stardate/tests directory but I don't provide instructions on how to run these. Should I add some instructions to the docs on how to run the tests with pytest or something? The test in integration_tests.py does something similar to what you suggest.
Excellent. I'd missed that this test was there, but it looks like the sort of thing that is required. Adding instructions on how to run it with pytest would be very useful.
It does use a fixed random number seed, but I had previously hard-coded this into the function that runs emcee, which doesn't seem like the most sensible choice :-). I have now added an argument to set the seed in the fit() function.
Great :smiley:
This test doesn't save samples like you suggest, which is a great idea, but that would fail each time I changed something in the likelihood function or prior, so I'd need to regenerate the sample file each time I did that.
My suggestion of having a saved set of samples in the test suite and doing a direct comparison between those and a new set of samples may be overkill. Having a broader consistency test like the one in your current `integration_tests.py` should be enough (and is enough for this review, I think). However, in general it would be good to have some fixed exact comparison: as you say, you may change the likelihood/prior in the future, but when doing this it would be best to fully maintain backwards compatibility (unless you find a bug!), and a fixed test for a specific set of data, likelihood function and prior would help ensure that this backwards compatibility is maintained. I don't think this is a requirement for this review though.
I also agree that adding continuous integration would be really nice. I've never set this up before and I'm not that familiar with how, e.g. Travis works, but would it be a problem that the big file of MIST model isochrones needs to be downloaded in order to run the tests?
It's relatively easy to set up Travis CI integration -- an example of a `.travis.yml` file that I've used in one of my packages for this is here. Once you have that in your repo, you just have to register the project on the Travis website. I think it should be able to handle downloading the MIST model isochrones file each time -- often it has to download a large Docker image or install some large package each time anyway (in fact, you could create your own Docker image for it to use that already includes the file...)
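For illustration, a minimal `.travis.yml` along these lines might look like the following; the Python version, test command, and cache path are all assumptions, not the actual stardate configuration:

```yaml
language: python
python:
  - "3.7"
install:
  - pip install .
  # A large data file (e.g. the MIST isochrone grid) could be downloaded
  # here, or cached between builds via the cache settings below.
script:
  - pytest
cache:
  directories:
    - $HOME/.isochrones   # hypothetical cache location for the grid
```

Caching the data directory avoids re-downloading the isochrone file on every build.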
@mattpitkin I've added emcee to the README installation guide now, thanks for spotting that. I have also added some basic test instructions to the docs. Does this look alright to you? Thanks for providing an example of a .travis.yml! I have CI working now and I'm really pleased about this! Let me know if you have any other suggestions for improvements.
@RuthAngus Great, that all looks good to me. I've ticked off all my check boxes and am happy to sign off the review.
:wave: @nespinoza - how are you getting on with your review?
Hi @RuthAngus,
Thanks first of all for all the work you've already done with @mattpitkin's part of the review --- I've been keeping an eye on his comments in order to not bring up the same ideas/comments again in my review.
There are a couple of things that I think should be stated in the `stardate` documentation, with which I actually had trouble; this is what prevents me right now from ticking the Installation instructions part, but I think they are fairly easy to address:

Because `isochrones` now only works on `python3`, `stardate` also only works on `python3`. As a hard-core `python2.7` user (and yes, I know --- we have to evolve! On it), I think it is important that you state at the beginning of the docs that `stardate` will only work with `python3` for this reason.

On a similar note, it appears that when you do a `pip install emcee`, on several systems I have tried the default is to install emcee 2.2.1, which _does not include the `emcee.backends` module_ (only introduced in emcee version 3), which is needed for `stardate` to work (line 176 in `stardate.py`). I think this really should be a shout-out to @dfm, but in the meantime perhaps it should also be stated in the docs that `stardate` only works on `emcee` versions 3 and up.
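One way to enforce this automatically would be to pin the requirement in the package metadata. A hypothetical sketch (the actual stardate setup.py contents are not shown here, and the package list is an assumption):

```python
# Hypothetical excerpt from a setup.py: pinning emcee to version 3 or
# later so that pip will not install the 2.x series, which lacks the
# backends module.
install_requires = [
    "emcee>=3.0",
    "isochrones",
]
```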
Other than the above, I believe this is a fantastic package, kudos to @RuthAngus et al. for it --- looking forward to using it in my own research!
Hi @nespinoza, thanks for these suggestions -- these are both things I hadn't thought of! I have now added a python3 warning to the readme and docs and I have made the emcee version requirement explicit in the installation instructions. Hopefully that should work now!
Fantastic @RuthAngus, thank you for including that. I have now ticked the Installation instructions part, and thus I'm happy to recommend this for publication in JOSS!
Thanks so much @nespinoza and @mattpitkin for helping me improve the code! @arfon, is there anything else I need to do now?
@whedon check references
Attempting to check references...
```
Reference check summary:

OK DOIs

MISSING DOIs

INVALID DOIs
```
@whedon generate pdf
Attempting PDF compilation. Reticulating splines etc...
At this point I think we need to wait for your AAS paper to be accepted so we can publish them at the same time. Do you know the status of that? Once that paper is accepted, we can make a final update to the JOSS paper so that your `Angus et al (submitted)` reference can be updated to cite the accepted article (including the DOI etc.).
/ cc @crawfordsm
@arfon okay, sounds good! I'm currently waiting to receive a first-round referee report for the AAS paper. I submitted it less than two weeks ago, so it could still be quite a while before getting an acceptance for that one. I'm in no rush though and happy to wait :-)
The corresponding AAS DOI is 10.3847/1538-3881/ab3c53
Thanks @crawfordsm. @RuthAngus - can you merge this PR which adds the ApJ link to the paper? https://github.com/RuthAngus/stardate/pull/8.
Also, I forgot to check, is it ApJ you are publishing the accompanying paper in? If not, just modify the name of the journal on this line.
Hi @arfon and @crawfordsm, it's an AJ paper, so I added the DOI and journal lines to the paper file without merging the PR, hope that's ok!
@whedon generate pdf
Attempting PDF compilation. Reticulating splines etc...
@RuthAngus - do you think you could update the reference to 'Angus et al (submitted)' to point to the AJ paper (https://doi.org/10.3847/1538-3881/ab3c53)?
After that, could you make a new release of this software that includes the changes resulting from this review? Then, please make an archive of the software on Zenodo/figshare/another service and update this thread with the DOI of the archive. For the Zenodo/figshare archive, please make sure that:
I can then move forward with accepting the submission.
:+1: Marked @arfon as OOO from Friday, September 13th 2019 to Monday, September 16th 2019. :calendar:
@whedon generate pdf
Attempting PDF compilation. Reticulating splines etc...
Hi @arfon, I have updated the reference in the paper and created a new release! The Zenodo DOI is: 10.5281/zenodo.3444905
@whedon set 10.5281/zenodo.3444905 as archive
OK. 10.5281/zenodo.3444905 is the archive.
@whedon accept
Attempting dry run of processing paper acceptance...
Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/964
If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/964, then you can now move forward with accepting the submission by compiling again with the flag deposit=true
e.g.
@whedon accept deposit=true
@whedon accept deposit=true
Doing it live! Attempting automated processing of paper acceptance...
🐦🐦🐦 👉 Tweet for this paper 👈 🐦🐦🐦
🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨
Here's what you must now do:
Party like you just published a paper! 🎉
Any issues? Notify your editorial technical team...
Woohoo! Thank you!
@mattpitkin, @nespinoza - many thanks for your reviews here ✨
@RuthAngus - your paper is now accepted into JOSS :zap::rocket::boom:
:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:
If you would like to include a link to your paper from your README use the following code snippets:
Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.01469/status.svg)](https://doi.org/10.21105/joss.01469)
HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.01469">
<img src="https://joss.theoj.org/papers/10.21105/joss.01469/status.svg" alt="DOI badge" >
</a>
reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.01469/status.svg
:target: https://doi.org/10.21105/joss.01469
This is how it will look in your documentation:
We need your help!
The Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following: