Submitting author: @sambit-giri (Sambit Kumar Giri)
Repository: https://github.com/sambit-giri/tools21cm
Version: v2.1
Editor: @pibion
Reviewers: @nicholebarry, @aureliocarnero, @ziotom78
Archive: 10.5281/zenodo.3973542
:warning: JOSS reduced service mode :warning:
Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.
Status badge code:
HTML: <a href="https://joss.theoj.org/papers/77f0e49b819b80fb05ee90fb706ec037"><img src="https://joss.theoj.org/papers/77f0e49b819b80fb05ee90fb706ec037/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/77f0e49b819b80fb05ee90fb706ec037/status.svg)](https://joss.theoj.org/papers/77f0e49b819b80fb05ee90fb706ec037)
Reviewers and authors:
Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)
@nicholebarry & @aureliocarnero & @ziotom78 , please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:
The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @pibion know.
✨ Please try and complete your review in the next six weeks ✨
Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @nicholebarry, @aureliocarnero it looks like you're currently assigned to review this paper :tada:.
:warning: JOSS reduced service mode :warning:
Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.
:star: Important :star:
If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means with GitHub's default behaviour you will receive notifications (emails) for all reviews.
To fix this do the following two things:
For a list of things I can do to help you, just type:
@whedon commands
For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:
@whedon generate pdf
Reference check summary:
OK DOIs
- 10.1016/j.physrep.2006.08.002 is OK
- 10.1088/0034-4885/75/8/086901 is OK
- 10.1111/j.1365-2966.2012.21293.x is OK
- 10.1093/mnras/stx2539 is OK
- 10.1093/mnras/stt1341 is OK
- 10.1093/mnras/stv2679 is OK
- 10.1093/mnras/stx649 is OK
- 10.1093/mnras/stz1220 is OK
- 10.1093/mnras/stz2224 is OK
- 10.1088/1475-7516/2019/02/058 is OK
- 10.1093/mnras/sty1786 is OK
- 10.1093/mnras/stv571 is OK
- 10.1007/s10686-013-9334-5 is OK
- 10.1051/0004-6361/201220873 is OK
- 10.1109/JPROC.2009.2017564 is OK
MISSING DOIs
- None
INVALID DOIs
- None
:wave: @sambit-giri @nicholebarry @aureliocarnero this is the review thread for the paper. All of our communications will happen here from now on.
Both reviewers have checklists at the top of this thread with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.
The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention openjournals/joss-reviews#2363
so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.
We are currently operating in reduced service mode and aim for reviews to be completed within about 6 weeks. Please let me know if any of you require some more time. We can also use Whedon (our bot) to set automatic reminders if you know you'll be away for a known period of time.
Please feel free to ping me (@pibion) if you have any questions/concerns.
@whedon add @ziotom78 as reviewer
OK, @ziotom78 is now a reviewer
@aureliocarnero apologies, I needed to edit the checklist to add an additional reviewer and managed to delete the checks you've put so far.
Could you take a look and re-check?
I won't need to edit this issue again so they will stay this time!
@pibion sure no problem
@aureliocarnero thank you, and apologies again!
Dear @sambit-giri, while reviewing this paper, one thing we have to check is whether the author list seems correct. Judging from the people who have contributed to the git repository and the authors on the paper, it seems reasonable, but then, when you go to:
https://tools21cm.readthedocs.io/authors.html you can see Hannes Jensen listed as an author as well. Should he not also be an author of the paper? Or is he perhaps no longer an author? In any case, if he contributed in the past, you should consider adding his name as an author of the manuscript.
Cheers
Dear @aureliocarnero,
Hannes Jensen had a package named c2raytools (https://github.com/hjens/c2raytools). It was not maintained after he left academia. We modified that package and merged it into tools21cm. We had contacted Hannes about being a co-author, but he does not have a proper academic affiliation/address to put on the paper. We were not sure how JOSS handles this. If this is a problem, Hannes is also fine with being acknowledged in the paper. What do you suggest?
@pibion Can you add the second author (https://github.com/garrelt) to this review process?
@sambit-giri I believe @garrelt can join this review process by subscribing to this thread.
@sambit-giri I'll look into this and get back to you shortly!
> Hannes Jensen had a package named c2raytools (https://github.com/hjens/c2raytools). It was not maintained after he left academia. We modified that package and merged it into tools21cm. We had contacted Hannes about being a co-author, but he does not have a proper academic affiliation/address to put on the paper. We were not sure how JOSS handles this. If this is a problem, Hannes is also fine with being acknowledged in the paper. What do you suggest?
@sambit-giri, @aureliocarnero on the authorship issue: JOSS does need an affiliation for every author, but this is only required to make the latex template happy. So for someone without an academic affiliation, they can put whatever seems most appropriate. For example,
```yaml
authors:
  - name: Jane Doe
    affiliation: 1
affiliations:
  - name: None
    index: 1
```
Would show "None" as the affiliation.
The author can put whatever seems most appropriate here, though. In addition to the "None" option, another suggestion is "Self."
Dear @sambit-giri please see the message above. I would recommend including Hannes as an author of the paper if he agrees.
I have some comments on the references in the paper draft.
Since you cite several of your own papers from 2019, could you please label them 2019a, 2019b, etc., the same as you do with your 2018 papers?
Also, there is one reference, Ross et al. 2017, that at some point you cite as Ross, Dixon, Iliev, & Mellema, 2017.
Also, I'm not sure when you should use et al. and when the full list of names, depending on the number of authors. I think the cutoff is three: if there are only three authors, you can list all three names, but if there are more than three, you should use et al. Please homogenize this throughout the text. @pibion, could you please confirm? When should et al. be used, and when the full list of authors?
@sambit-giri also, one of the checks I need to do related to the paper is:
State of the field: Do the authors describe how this software compares to other commonly-used packages?
You actually say nothing about the state of the field: are there any other software packages that compare? In these terms, I think it is important that you mention in the draft what you commented above: "We modified c2raytools (https://github.com/hjens/c2raytools) package and merged it into tools21cm.", referencing the code appropriately.
Cheers
Aurelio
@aureliocarnero I'm taking the day off today but I'll get back to you on the reference style Monday!
@sambit-giri, does the following provide sufficient information for you to add your additional collaborator as an author?
> @sambit-giri, @aureliocarnero on the authorship issue: JOSS does need an affiliation for every author, but this is only required to make the latex template happy. So for someone without an academic affiliation, they can put whatever seems most appropriate. For example,
> `authors: - name: Jane Doe affiliation: 1 affiliations: - name: None index: 1`
> Would show "None" as the affiliation.
> The author can put whatever seems most appropriate here, though. In addition to the "None" option, another suggestion is "Self."
@nicholebarry, @ziotom78 this is a reminder that we're asking our reviewers to aim to complete their checklist in about six weeks. If you have an idea of what your time frame is for getting started, you can let me know and I can set a reminder with whedon.
@whedon generate pdf
PDF failed to compile for issue #2363 with the following error:
```
/app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:72:in `block in build_affiliation_string': Problem with affiliations for Hannes Jensen, perhaps the affiliations index need quoting? (RuntimeError)
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:71:in `each'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:71:in `build_affiliation_string'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/author.rb:17:in `initialize'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:201:in `new'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:201:in `block in parse_authors'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:198:in `each'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:198:in `parse_authors'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon.rb:91:in `initialize'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/processor.rb:36:in `new'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/lib/whedon/processor.rb:36:in `set_paper'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/bin/whedon:55:in `prepare'
	from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/command.rb:27:in `run'
	from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/invocation.rb:126:in `invoke_command'
	from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor.rb:387:in `dispatch'
	from /app/vendor/bundle/ruby/2.4.0/gems/thor-0.20.3/lib/thor/base.rb:466:in `start'
	from /app/vendor/bundle/ruby/2.4.0/bundler/gems/whedon-446a0298a33b/bin/whedon:119:in `<top (required)>'
	from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `load'
	from /app/vendor/bundle/ruby/2.4.0/bin/whedon:23:in `
```
@whedon generate pdf
Dear @sambit-giri , I'm starting to test the code. One important aspect is that you need to provide automated tests with your code, but I cannot find them or instructions on how to run them. According to the checklist, I need to verify that: "Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?"
I cannot verify this at the moment.
Dear @sambit-giri, I have a suggestion about the installation instructions. Going to the GitHub page here: https://github.com/sambit-giri/tools21cm
In the installation section, I think you should first state that the user should clone the repository. I know this is obvious, but it is helpful. Therefore, I suggest that you split the installation options in two: first, cloning the repository and then running python setup.py install, or alternatively installing directly with pip. They are two separate ways of doing it and it is not very clear.
@sambit-giri another question about installation. It seems to install properly with the instructions you give, but I have a question. When I install it with python setup.py install, a warning message appears as follows:
UserWarning: Unknown distribution option: 'install_requires'.
I'm using Python 3; do you know if this can cause a problem for other users? Please verify the meaning of this warning.
Installing it with pip works perfectly, though
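For context, a common cause of this warning (an assumption here, not verified against the actual tools21cm setup.py) is importing `setup` from distutils instead of setuptools, since only setuptools understands the `install_requires` keyword. A minimal, hypothetical sketch:

```python
# Hypothetical setup.py sketch: `install_requires` is a setuptools-only keyword,
# so importing setup from distutils.core triggers the
# "Unknown distribution option: 'install_requires'" warning.
from setuptools import setup, find_packages   # not: from distutils.core import setup

setup(
    name="example_package",                    # placeholder metadata, not tools21cm's
    version="0.1",
    packages=find_packages(),
    install_requires=["numpy", "scipy"],       # recognised by setuptools only
)
```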
Dear @aureliocarnero
Although individual researchers have undoubtedly developed their own tools to perform some of the calculations
included in tools21cm, we are not aware of any open source package similar to tools21cm. We did search both
with google and on github, but did not find anything.
c2raytools was a set of python routines developed by Hannes Jensen and me to analyse reionization
simulations performed with the C2-Ray radiative transfer code. This set of routines is no longer
maintained and never had a user base beyond our nearest collaborators. It formed the basis
of the tools21cm package, which is however more general, hence the name change.
I hope this answers your query.
Best wishes,
@garrelt
> @sambit-giri also, one of the checks I need to do related to the paper is:
> State of the field: Do the authors describe how this software compares to other commonly-used packages? You actually say nothing about the state of the field: are there any other software packages that compare? In these terms, I think it is important that you mention in the draft what you commented above: "We modified c2raytools (https://github.com/hjens/c2raytools) package and merged it into tools21cm.", referencing the code appropriately.
> Cheers
> Aurelio
Dear @garrelt thank you for the clarification. No need to put anything in the text, then. Cheers
Aurelio
Dear @aureliocarnero
In order to test the code, some simulation output is required. We have now provided some data at https://ttt.astro.su.se/~gmell/244Mpc/TestData_tools21cm/
The link contains data along with a README explaining it.
You can test the code on this data by running the tutorial given at https://tools21cm.readthedocs.io/examples/tutorials.html
> Dear @sambit-giri , I'm starting to test the code. One important aspect is that you need to provide automated tests with your code, but I cannot find them or instructions on how to run them. According to the checklist, I need to verify that: "Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?"
> I cannot verify this at the moment.
@aureliocarnero We have updated the README to mention cloning the repository.
> Dear @sambit-giri, I have a suggestion about the installation instructions. Going to the GitHub page here: https://github.com/sambit-giri/tools21cm
> In the installation section, I think you should first state that the user should clone the repository. I know this is obvious, but it is helpful. Therefore, I suggest that you split the installation options in two: first, cloning the repository and then running python setup.py install, or alternatively installing directly with pip. They are two separate ways of doing it and it is not very clear.
@sambit-giri,
The instrument papers for the MWA are Tingay et al. 2013 and Wayth et al. 2018 (Phase I and Phase II respectively). I suggest you cite those instead of Lonsdale et al. 2009.
@whedon generate pdf
> @sambit-giri,
> The instrument papers for the MWA are Tingay et al. 2013 and Wayth et al. 2018 (Phase I and Phase II respectively). I suggest you cite those instead of Lonsdale et al. 2009.
@nicholebarry We have updated it.
Dear @sambit-giri , I cannot find any reference in the GitHub repository (or in the documentation) to:
Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
So far I cannot evaluate this. I recommend you put these guidelines in the GitHub repo README and possibly in the readthedocs.
To give you an example, you can see this other software I reviewed in the past, the section "Contributing":
https://github.com/noraeisner/LATTE#contributing
> Dear @sambit-giri , I cannot find any reference in the GitHub repository (or in the documentation) to:
> Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support
> So far I cannot evaluate this. I recommend you put these guidelines in the GitHub repo README and possibly in the readthedocs.
> To give you an example, you can see this other software I reviewed in the past, the section "Contributing":
> https://github.com/noraeisner/LATTE#contributing
@aureliocarnero Thank you for the recommendation.
We have added a brief text about "contributing" in README.md and directed users to the detailed description in the documentation: https://tools21cm.readthedocs.io/contributing.html
> @nicholebarry, @ziotom78 this is a reminder that we're asking our reviewers to aim to complete their checklist in about six weeks. If you have an idea of what your time frame is for getting started, you can let me know and I can set a reminder with whedon.
Hi @pibion, thanks for the message! Sorry for the slow reply, I was away on vacation and just came back. I have already started the review and expect to finish it in a few days. It's at the top of my to-do list, so I do not need a reminder. Thank you!
> In order to test the code, some simulation output is required. We have now provided some data at https://ttt.astro.su.se/~gmell/244Mpc/TestData_tools21cm/
> The link contains data along with a README explaining it.
Hi @sambit-giri , I have downloaded the data and checked that the plots produced by the tutorial match mine. They effectively look the same, but I interpret JOSS' requirement for "automated tests" in a stricter sense: it's the computer that should take care of checking the consistency of the results with their expected outcome. This ensures that patches from potential contributors do not introduce systematic effects or bugs in your codebase.
I realize that it's not feasible to include all the test data in the GitHub repository, as they are really too big; for me your solution of providing them at a separate URL is perfect. Perhaps you could produce smaller data files by sampling a tinier space? It's not required that the results of a test are cosmologically interesting; they just need to check that modifications to the code do not introduce systematic effects. You could keep the tutorial as it is, relying on the bigger data files (the tutorial must be cosmologically interesting, after all!), and include the smaller files in the GitHub repository only for the automated tests.
Also, you could add unit tests to several functions, like `t2c.statistics.skewness`, without the need to load data from files. Example:
```python
# Put this in file test/test_statistics.py
from tools21cm.statistics import skewness
import numpy as np

# Test the functionality of "skewness"
assert np.allclose(
    [skewness([1.0, 2.0, 0.0]), skewness([1.0, 2.0, 4.0])],
    [0.0, 0.20782656212951636],
)
```
This kind of tests is very useful for documentation as well.
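If it helps, the same checks can also be wrapped in a pytest-style function so that a test runner discovers and reports them automatically; a sketch, assuming the reference values quoted above are the intended outputs:

```python
# test/test_statistics.py -- same checks as above, wrapped so that a runner
# such as pytest can collect and report them individually.
import numpy as np
from tools21cm.statistics import skewness

def test_skewness():
    # A symmetric sample has zero skewness; the second value is the one quoted above.
    assert np.isclose(skewness([1.0, 2.0, 0.0]), 0.0)
    assert np.isclose(skewness([1.0, 2.0, 4.0]), 0.20782656212951636)
```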
Dear @sambit-giri I had an unexpected error running the tutorials in https://tools21cm.readthedocs.io/examples/tutorials.html#Bubble-size-distribution
When I execute cell [22]: r_spa, dn_spa = t2c.spa(xHII, boxsize=boxsize, nscales=20)
It tells me the following error:
```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/acarnero/codes/anaconda3/lib/python3.7/site-packages/tools21cm/bubble_stats.py", line 181, in spa
    rs, ni = spa_np.spa_np(data, xth=xth, binning=binning, nscales=nscales)
  File "/home/acarnero/codes/anaconda3/lib/python3.7/site-packages/tools21cm/spa_np.py", line 15, in spa_np
    for i in xrange(nscales):
NameError: name 'xrange' is not defined
```
Basically, xrange is not defined...
I'm running Python 3.
Cheers
Aurelio
> Dear @sambit-giri I had an unexpected error running the tutorials in https://tools21cm.readthedocs.io/examples/tutorials.html#Bubble-size-distribution
> When I execute cell [22]: r_spa, dn_spa = t2c.spa(xHII, boxsize=boxsize, nscales=20)
> it fails in tools21cm/spa_np.py, line 15, with `NameError: name 'xrange' is not defined`. Basically, xrange is not defined...
> I'm running Python 3.
> Cheers
> Aurelio
Dear @aureliocarnero
I fixed this issue last week. You might have installed the package before that. Please try to re-install it and let me know if it still doesn't work.
Best,
Sambit
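For anyone hitting the same error: this is purely a Python 2 vs Python 3 issue, since `xrange` was removed in Python 3 and the loop has to use `range` instead. A minimal illustration (not the actual spa_np source):

```python
# `xrange` exists only in Python 2; under Python 3 it raises NameError.
# `range` works on both (and is already lazy on Python 3).
nscales = 20
radii = []
for i in range(nscales):      # previously: for i in xrange(nscales):
    radii.append(i + 1)
print(len(radii))             # 20
```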
> Hi @sambit-giri , I have downloaded the data and checked that the plots produced by the tutorial match mine. They effectively look the same, but I interpret JOSS' requirement for "automated tests" in a stricter sense: it's the computer that should take care of checking the consistency of the results with their expected outcome. [...]
> Also, you could add unit tests to several functions, like `t2c.statistics.skewness`, without the need to load data from files. [...]
> This kind of tests is very useful for documentation as well.
Thanks @ziotom78
I will get back to you on this soon.
Dear @sambit-giri you are right, now it works perfectly. So, for me, the only thing missing is the unit test issue commented on above. Once those are available, please let me know so I can test them and then validate all the points of the review. Cheers
Aurelio
@sambit-giri I agree with @aureliocarnero, providing a separate URL for a data download is an excellent solution for larger data sets.
I would recommend putting this data on a service like Zenodo or Figshare rather than hosting it from your own website.
Dear @ziotom78 and @aureliocarnero
I have put scripts for unit testing at https://github.com/sambit-giri/tools21cm/tree/master/tests
I have not included the functions that need data such as the read/write functions and redshift space distortion implementation.
Please let me know if anything else is needed.
Best,
Sambit
Dear @sambit-giri , this is great, but how can I execute the unit tests? You should explain this in the documentation... Cheers
> Dear @sambit-giri , this is great, but how can I execute the unit tests? You should explain this in the documentation... Cheers
@aureliocarnero I have put a README inside the same folder.
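For reference, assuming the scripts in that folder follow the usual pytest conventions (test_*.py files), they can be run from the shell with pytest or, equivalently, programmatically; a hypothetical sketch:

```python
# run_tests.py (hypothetical helper) -- equivalent to running `pytest -v tests`
# from the repository root. Assumes pytest is installed and the test scripts
# in tests/ follow the test_*.py naming convention.
import pytest

raise SystemExit(pytest.main(["-v", "tests"]))
```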
> @sambit-giri I agree with @aureliocarnero, providing a separate URL for a data download is an excellent solution for larger data sets.
> I would recommend putting this data on a service like Zenodo or Figshare rather than hosting it from your own website.
Dear @pibion
We have put the data up on Zenodo (https://doi.org/10.5281/zenodo.3953639).
The link to the data has also been updated in the tutorial (https://tools21cm.readthedocs.io/examples/tutorials.html).
Best,
Sambit
@sambit-giri
Here are some purely aesthetic suggestions as I go through the tutorial.
@sambit-giri
Here are some more detailed comments/questions as I go through the tutorial.
```
t2c.power_spectrum.dimensionless_ps(dT_subtracted, kbins=15, box_dims=box_dims)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/nabarry/opt/anaconda3/lib/python3.7/site-packages/tools21cm/power_spectrum.py", line 464, in dimensionless_ps
    ks_new = np.array([ks[factor*(i+0.5)] for i in range(kbins)])
  File "/Users/nabarry/opt/anaconda3/lib/python3.7/site-packages/tools21cm/power_spectrum.py", line 464, in <listcomp>
    ks_new = np.array([ks[factor*(i+0.5)] for i in range(kbins)])
IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices
```

Dear @nicholebarry
> - subtract_mean_channelwise is in t2c.read_files. Is there a more logical place for this function?

I discovered that subtract_mean_signal and subtract_mean_channelwise are doing the same thing. Therefore I have removed subtract_mean_channelwise. I think subtract_mean_signal is in the right place (see the sketch after this list).
> - There's a couple of places where the auto-generated descriptions on the readthedocs are hard to parse because there should be a newline and/or period. Some examples are in Other Modules under "Some useful attributes" and on the Main modules page under t2c.power_spectrum.power_spectrum_mu.
Thanks for pointing it out. I have fixed them.
> - In Temperature, please add units
Done.
> - In Tutorials under 21cm Brightness Temperature, please add units to the tutorial plot
Done.
> - In Tutorials, please set x and y axis labels for all plots where applicable
Done.
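To illustrate what the mean subtraction does conceptually, here is a numpy sketch under the assumption that the lightcone axes are ordered (x, y, frequency); the actual tools21cm implementation may use different conventions:

```python
# Sketch of per-channel (per-frequency) mean subtraction on a mock lightcone.
# The axis convention here is an assumption for illustration only.
import numpy as np

dT = np.random.normal(loc=20.0, scale=5.0, size=(64, 64, 100))  # (x, y, nu) in mK
dT_subtracted = dT - dT.mean(axis=(0, 1), keepdims=True)        # remove each channel's mean

print(dT[..., 0].mean())             # roughly 20 mK before subtraction
print(dT_subtracted[..., 0].mean())  # roughly 0 mK after subtraction
```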
@nicholebarry
> @sambit-giri
> Here are some more detailed comments/questions as I go through the tutorial.
> - In Tutorials under 21cm Brightness Temperature, it seems as though the sample data set already has its per-freq mean subtracted, and thus the subtract_mean_channelwise has very little effect. This may be confusing to some.
I don't think so. The tutorial now prints the mean of the first channel before and after applying subtract_mean_signal.
> - In Tutorials under 21cm power spectrum, should the box_dims be the C2Ray box size (244) or the array size (250)? Looking at the code, it seems as though it should be 244, but this is confusing from a new-user tutorial standpoint. Add a comment?
I have commented on this in the tutorial.
> - I tried to expand upon the tutorial a little bit in 21cm power spectrum, and I got the following error.
> `t2c.power_spectrum.dimensionless_ps(dT_subtracted, kbins=15, box_dims=box_dims)` raised the IndexError shown in the traceback above (power_spectrum.py, line 464).
Thanks! I have fixed this error.
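For the record, the IndexError above is the generic numpy restriction that array indices must be integers. A minimal illustration of the failure mode and one cast-based workaround (not necessarily the fix adopted in dimensionless_ps, which was rewritten):

```python
import numpy as np

ks = np.linspace(0.05, 1.0, 30)
kbins, factor = 15, 2

# ks[factor*(i+0.5)] fails because factor*(i+0.5) is a float:
try:
    bad = np.array([ks[factor * (i + 0.5)] for i in range(kbins)])
except IndexError as err:
    print(err)   # only integers, slices, ellipsis, newaxis and integer/boolean arrays are valid

# Casting the index to an integer is one way to avoid the error:
ks_new = np.array([ks[int(factor * (i + 0.5))] for i in range(kbins)])
print(ks_new.shape)   # (15,)
```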
Dear @sambit-giri could you please put something explicit in the documentation about running the automated tests? I know there is a README in that directory, but how would a user know that the directory exists? Cheers
Aurelio
@sambit-giri
Thanks for implementing those comments.
Final comments on the tutorial:
- Units for R and V in plots under bubble size distributions. Also titles to distinguish the plots using the various methods.
- I don't want to delay the publication, so this is more of a suggestion for the future. There's a lot of fantastic modules for observational aspects (foregrounds, noise, beam kernels, etc) that you've implemented. I'd love to see a tutorial to highlight this functionality. I also think this would advertise to a wider potential user-base.
Misc. comments from browsing the modules:
- t2c.telescope_functions.kelvin_2_jansky says it returns muJy, but I think it may return Jy. Is this correct?
I look forward to using tools21cm in conjunction with 21cmfast in the future!
Thanks @nicholebarry for the recommendations. The current plan is to publish the base version of tools21cm. A few modules for observational aspects, like the foreground models and the beam kernels, are still under development. Therefore we have not mentioned them in the paper. As the thermal noise falls under the same umbrella of observational aspects, we decided to release them together in the next release version.
@nicholebarry
> @sambit-giri
> Thanks for implementing those comments.
> Final comments on the tutorial:
> - Units for R and V in plots under bubble size distributions. Also titles to distinguish the plots using the various methods.
Done.
> Misc. comments from browsing the modules:
> - t2c.telescope_functions.kelvin_2_jansky says it returns muJy, but I think it may return Jy. Is this correct?
It is Jy. I have fixed it in the docstring.
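For readers wondering about the units, here is a hedged sketch of the standard Rayleigh-Jeans brightness-temperature-to-flux-density conversion that a function like kelvin_2_jansky presumably implements; the actual tools21cm routine may use different beam or pixel conventions:

```python
# Rayleigh-Jeans conversion: S_nu = 2 * k_B * T_b * nu^2 * Omega / c^2, in Jy.
# The beam solid angle convention is an assumption for illustration only.
from scipy.constants import k as k_B, c

def brightness_temp_to_jansky(T_b_K, nu_Hz, omega_sr):
    """Flux density in Jy for brightness temperature T_b_K [K] at nu_Hz [Hz]
    integrated over a solid angle omega_sr [sr]."""
    S_si = 2.0 * k_B * T_b_K * nu_Hz**2 * omega_sr / c**2   # W m^-2 Hz^-1
    return S_si / 1e-26                                      # 1 Jy = 1e-26 W m^-2 Hz^-1

print(brightness_temp_to_jansky(10.0, 142e6, 1.0e-5))        # flux density in Jy
```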
@sambit-giri
> Thanks @nicholebarry for the recommendations. The current plan is to publish the base version of tools21cm. A few modules for observational aspects, like the foreground models and the beam kernels, are still under development. Therefore we have not mentioned them in the paper. As the thermal noise falls under the same umbrella of observational aspects, we decided to release them together in the next release version.
Makes total sense.
I would suggest some more description/motivation on the front page of readthedocs (and subsequently the Github main readme) that would help a first-time user know whether or not this package satisfies their needs. Feel free to use these suggestions (or better ones you come up with). Just a few sentences for each would do. For reference, I'm using pyuvdata as a guide.
I also ran into a few failures with unit testing (which look to be related to the dimensionless_ps bug seen above)
```
============================================================ FAILURES =============================================================
____________________________________________________ test_bubbles_from_kmeans _____________________________________________________

    def test_bubbles_from_kmeans():
>       out = t2c.bubbles_from_kmeans(data_ball, n_clusters=2, upper_lim=False)

test_IdentifyRegions.py:19:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../opt/anaconda3/lib/python3.7/site-packages/tools21cm/identify_regions.py:63: in bubbles_from_kmeans
    if n_clusters==2: array, t_th = threshold_kmeans(data, upper_lim=upper_lim, n_jobs=n_jobs)
../../../opt/anaconda3/lib/python3.7/site-packages/tools21cm/identify_regions.py:104: in threshold_kmeans
    else: t_th = X[y==1].max()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
a = array([], shape=(0, 1), dtype=uint8), axis = None, out = None, keepdims = False, initial = <no value>, where = True

    def _amax(a, axis=None, out=None, keepdims=False,
              initial=_NoValue, where=True):
>       return umr_maximum(a, axis, None, out, keepdims, initial, where)
E       ValueError: zero-size array to reduction operation maximum which has no identity

../../../opt/anaconda3/lib/python3.7/site-packages/numpy/core/_methods.py:30: ValueError
______________________________________________________ test_dimensionless_ps ______________________________________________________

    def test_dimensionless_ps():
        '''
        With this test, power_spectrum_nd and radial_average are also test.
        '''
>       dd, kk = t2c.dimensionless_ps(gauss, kbins=kbins, box_dims=box_dims)

test_PowerSpectrum.py:38:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../opt/anaconda3/lib/python3.7/site-packages/tools21cm/power_spectrum.py:464: in dimensionless_ps
    ks_new = np.array([ks[factor*(i+0.5)] for i in range(kbins)])
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
.0 = <range_iterator object at 0x11bc5dba0>

>   ks_new = np.array([ks[factor*(i+0.5)] for i in range(kbins)])
E   IndexError: only integers, slices (`:`), ellipsis (`...`), numpy.newaxis (`None`) and integer or boolean arrays are valid indices

../../../opt/anaconda3/lib/python3.7/site-packages/tools21cm/power_spectrum.py:464: IndexError
```
@nicholebarry
> I also ran into a few failures with unit testing (which look to be related to the dimensionless_ps bug seen above)
Did you git pull and install the package again?
I had completely rewritten the dimensionless_ps function after your previous comment about it. It doesn't have the above lines anymore (https://tools21cm.readthedocs.io/_modules/t2c/power_spectrum.html#dimensionless_ps).
I have fixed the other error you got with your unit test.
Dear @nicholebarry
> @sambit-giri
> > Thanks @nicholebarry for the recommendations. The current plan is to publish the base version of tools21cm. [...]
> Makes total sense.
> I would suggest some more description/motivation on the front page of readthedocs (and subsequently the GitHub main readme) that would help a first-time user know whether or not this package satisfies their needs. Feel free to use these suggestions (or better ones you come up with). Just a few sentences for each would do. For reference, I'm using pyuvdata as a guide.
> - Inputs: the simulation codes that you can currently input into tools21cm
> - Outputs: bubble sizes, power spectrum, redshift distortions, etc
> - Under-development: instrumental effects, noise, other specific simulation code inputs
> - Citation: point to JOSS paper when published and a snippet about how people should cite the code.
> - Tests: description on how to run the unit tests (also great for people to know if they've installed correctly).
We have updated the documentation as per your recommendations. We will add the "Citation" section after a DOI is produced.
Best,
Sambit
@sambit-giri,
Yes, sorry, you're right. I didn't reinstall. Everything passed now. And I think that's a great level of detail with the documentation.
I'll officially accept. My review is done (@pibion need anything else from me?). Best of luck in the future!
@nicholebarry
Thanks!
Dear @sambit-giri I saw you put the info about the automated test.
I'm also done with my review.
Congratulations!
@pibion
@nicholebarry @aureliocarnero thanks so much for your great work on this! There's nothing else you need to do for the publication process.
@ziotom78, thanks for getting started with your review! Do you expect to be able to work on this in the next few weeks? If not, I can set a reminder with whedon to ping you.
> Dear @sambit-giri I saw you put the info about the automated tests.
> I'm also done with my review.
> Congratulations!
> @pibion
Thanks @aureliocarnero!
Hi @sambit-giri , I have almost finished my review. Everything looks ok, apart from one issue with the automatic tests. I have opened an issue (see above), once this is fixed my review is complete.
Ok, issue #7 is closed, my review is complete. Thank you all!
> Ok, issue #7 is closed, my review is complete. Thank you all!
Thanks @ziotom78 !
Dear @pibion
Is there anything else to be done? What is the next step in the process?
Best,
Sambit
@sambit-giri we have just a few more steps - I need to do a read-through of the final pdf and then you'll need to archive your software on Zenodo or Figshare. I'll be able to review the pdf tomorrow and will ping you on the action you need to take tomorrow.
@whedon generate pdf
@whedon check references
Reference check summary:
OK DOIs
- 10.1016/j.physrep.2006.08.002 is OK
- 10.1088/0034-4885/75/8/086901 is OK
- 10.1111/j.1365-2966.2012.21293.x is OK
- 10.1093/mnras/stx2539 is OK
- 10.1093/mnras/stt1341 is OK
- 10.1093/mnras/stv2679 is OK
- 10.1093/mnras/stx649 is OK
- 10.1093/mnras/stz1220 is OK
- 10.1093/mnras/stz2224 is OK
- 10.1088/1475-7516/2019/02/058 is OK
- 10.1093/mnras/sty1786 is OK
- 10.1093/mnras/stv571 is OK
- 10.1007/s10686-013-9334-5 is OK
- 10.1051/0004-6361/201220873 is OK
- 10.1017/pasa.2018.37 is OK
- 10.1017/pasa.2012.007 is OK
- 10.1109/JPROC.2009.2017564 is OK
MISSING DOIs
- None
INVALID DOIs
- None
@sambit-giri the pdf looks great. I have one requested change: the last sentence in the second paragraph isn't immediately clear.
I would recommend splitting this sentence into two to make it clearer that Tools21cm takes the simulation outputs as its inputs. Maybe something like, " Much of the functionality of Tools21cm is focused
on the construction and analysis of mock 21 cm observations in the context of current and
upcoming radio telescopes, such as the Low Frequency Array (LOFAR; Haarlem et al., 2013),
the Murchison Widefield Array (MWA; Tingay et al., 2013; Wayth et al., 2018), Hydrogen
Epoch of Reionization Array (HERA; DeBoer et al., 2017) and the Square Kilometre Array
(SKA; Mellema et al., 2013). Tools21cm takes the simulation data of these telescopes as its inputs."
@whedon generate pdf
@pibion Thanks for the recommendation. I have changed the sentence to the following:
"Much of the functionality of Tools21cm is focused on the construction and analysis of mock 21 cm observations in the context of current and upcoming radio telescopes, such as the Low Frequency Array (LOFAR; Haarlem et al., 2013), the Murchison Widefield Array (MWA; Tingay et al., 2013; Wayth et al., 2018), Hydrogen Epoch of Reionization Array (HERA; DeBoer et al., 2017) and the Square Kilometre Array (SKA; Mellema et al., 2013). Tools21cm post-processes cosmological simulation data to create mock 21 cm observations."
@whedon check references
Reference check summary:
OK DOIs
- 10.1016/j.physrep.2006.08.002 is OK
- 10.1088/0034-4885/75/8/086901 is OK
- 10.1111/j.1365-2966.2012.21293.x is OK
- 10.1093/mnras/stx2539 is OK
- 10.1093/mnras/stt1341 is OK
- 10.1093/mnras/stv2679 is OK
- 10.1093/mnras/stx649 is OK
- 10.1093/mnras/stz1220 is OK
- 10.1093/mnras/stz2224 is OK
- 10.1088/1475-7516/2019/02/058 is OK
- 10.1093/mnras/sty1786 is OK
- 10.1093/mnras/stv571 is OK
- 10.1007/s10686-013-9334-5 is OK
- 10.1051/0004-6361/201220873 is OK
- 10.1017/pasa.2018.37 is OK
- 10.1017/pasa.2012.007 is OK
- 10.1109/JPROC.2009.2017564 is OK
MISSING DOIs
- None
INVALID DOIs
- None
@sambit-giri this looks excellent, thank you!
I'll need you to make a tagged release and archive, and report the version number and archive DOI in this review thread.
Zenodo, Figshare, and OSF are good options. I like to use Zenodo because you can set it up to automatically create a new DOI every time you add a new tag to your repository.
@pibion I have put it up on Zenodo. The link is https://zenodo.org/badge/latestdoi/88035801
@sambit-giri could you add Hannes Jensen to the author list on your Zenodo archive?
> @sambit-giri could you add Hannes Jensen to the author list on your Zenodo archive?
@pibion Done!
@sambit-giri thanks! I'll start the final part of the process now!
@whedon set 10.5281/zenodo.3973542 as archive
OK. 10.5281/zenodo.3973542 is the archive.
@whedon set v2.1 as version
OK. v2.1 is the version.
@whedon accept
Attempting dry run of processing paper acceptance...
Reference check summary:
OK DOIs
- 10.1016/j.physrep.2006.08.002 is OK
- 10.1088/0034-4885/75/8/086901 is OK
- 10.1111/j.1365-2966.2012.21293.x is OK
- 10.1093/mnras/stx2539 is OK
- 10.1093/mnras/stt1341 is OK
- 10.1093/mnras/stv2679 is OK
- 10.1093/mnras/stx649 is OK
- 10.1093/mnras/stz1220 is OK
- 10.1093/mnras/stz2224 is OK
- 10.1088/1475-7516/2019/02/058 is OK
- 10.1093/mnras/sty1786 is OK
- 10.1093/mnras/stv571 is OK
- 10.1007/s10686-013-9334-5 is OK
- 10.1051/0004-6361/201220873 is OK
- 10.1017/pasa.2018.37 is OK
- 10.1017/pasa.2012.007 is OK
- 10.1109/JPROC.2009.2017564 is OK
MISSING DOIs
- None
INVALID DOIs
- None
:wave: @openjournals/joss-eics, this paper is ready to be accepted and published.
Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/1632
If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/1632, then you can now move forward with accepting the submission by compiling again with the flag deposit=true
e.g.
@whedon accept deposit=true
:wave: @aureliocarnero - I notice you did not check off the performance box as part of your review. Is there a problem with this item?
@sambit-giri - when "21 cm" is used as an adjective, as in "21 cm signal", it should be hyphenated to "21-cm signal". Can you change this in your paper and in the title? You might also want to change the title of the zenodo archive to match. Actually, I now see that you need to change the title of the Zenodo archive in any event, as it's the default title from the repo.
@whedon generate pdf
Dear @danielskatz
I have edited "21 cm" in the paper wherever needed. I have also changed the title of the zenodo archive.
Best,
Sambit
Now we just need @aureliocarnero to check off the performance box as part of his review.
@danielskatz done!
@whedon accept
Attempting dry run of processing paper acceptance...
Reference check summary:
OK DOIs
- 10.1016/j.physrep.2006.08.002 is OK
- 10.1088/0034-4885/75/8/086901 is OK
- 10.1111/j.1365-2966.2012.21293.x is OK
- 10.1093/mnras/stx2539 is OK
- 10.1093/mnras/stt1341 is OK
- 10.1093/mnras/stv2679 is OK
- 10.1093/mnras/stx649 is OK
- 10.1093/mnras/stz1220 is OK
- 10.1093/mnras/stz2224 is OK
- 10.1088/1475-7516/2019/02/058 is OK
- 10.1093/mnras/sty1786 is OK
- 10.1093/mnras/stv571 is OK
- 10.1007/s10686-013-9334-5 is OK
- 10.1051/0004-6361/201220873 is OK
- 10.1017/pasa.2018.37 is OK
- 10.1017/pasa.2012.007 is OK
- 10.1109/JPROC.2009.2017564 is OK
MISSING DOIs
- None
INVALID DOIs
- None
:wave: @openjournals/joss-eics, this paper is ready to be accepted and published.
Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/1633
If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/1633, then you can now move forward with accepting the submission by compiling again with the flag deposit=true
e.g.
@whedon accept deposit=true
@whedon accept deposit=true
Doing it live! Attempting automated processing of paper acceptance...
:bird::bird::bird: :point_right: Tweet for this paper :point_left: :bird::bird::bird:
:rotating_light::rotating_light::rotating_light: THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! :rotating_light::rotating_light::rotating_light:
Here's what you must now do:
Party like you just published a paper! :tada:
Any issues? Notify your editorial technical team...
Thanks to @nicholebarry & @aureliocarnero & @ziotom78 for reviewing, and @pibion for editing!
Congratulations to @sambit-giri (Sambit Kumar Giri) and co-authors!!
:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:
If you would like to include a link to your paper from your README use the following code snippets:
Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02363/status.svg)](https://doi.org/10.21105/joss.02363)
HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02363">
<img src="https://joss.theoj.org/papers/10.21105/joss.02363/status.svg" alt="DOI badge" >
</a>
reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02363/status.svg
:target: https://doi.org/10.21105/joss.02363
This is how it will look in your documentation:
We need your help!
Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following: