Joss-reviews: [REVIEW]: PySwarms: a research toolkit for Particle Swarm Optimization in Python

Created on 18 Oct 2017  ·  37 Comments  ·  Source: openjournals/joss-reviews

Submitting author: @ljvmiranda921 (Lester James Miranda)
Repository: https://github.com/ljvmiranda921/pyswarms
Version: v0.1.7
Editor: @kyleniemeyer
Reviewer: @stsievert
Archive: 10.5281/zenodo.1145432

Status

[status badge image]

Status badge code:

HTML: <a href="http://joss.theoj.org/papers/235299884212b9223bce909631e3938b"><img src="http://joss.theoj.org/papers/235299884212b9223bce909631e3938b/status.svg"></a>
Markdown: [![status](http://joss.theoj.org/papers/235299884212b9223bce909631e3938b/status.svg)](http://joss.theoj.org/papers/235299884212b9223bce909631e3938b)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer questions

@stsievert, please carry out your review in this issue by updating the checklist below (please make sure you're logged in to GitHub). The reviewer guidelines are available here: http://joss.theoj.org/about#reviewer_guidelines. Any questions/concerns please let @kyleniemeyer know.

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license? [Yes, MIT]
  • [x] Version: Does the release version given match the GitHub release (v0.1.7)? [Yes, a prerelease]
  • [x] Authorship: Has the submitting author (@ljvmiranda921) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation? [Yes, worked for me]
  • [x] Functionality: Have the functional claims of the software been confirmed? [Yes, docs make it clear that classes have parameters]
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.) [I see no performance claims]

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution. [setup.py, requirements_dev.txt]
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems). [Yes, in README and examples/]
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified? [Yes, TravisCI]
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support [Yes, and linked on docs]

Software paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
Labels: accepted, published, recommend-accept, review

All 37 comments

Hello human, I'm @whedon. I'm here to help you with some common editorial tasks for JOSS. @stsievert it looks like you're currently assigned as the reviewer for this paper :tada:.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that, due to GitHub's default behaviour, you will receive notifications (emails) for all JOSS reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

[screenshot: repository 'Not watching' setting]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

👋 @ljvmiranda921 @stsievert @oesteban

I've looked this over briefly today, and here are some comments:

  • I didn't see a statement of need that clearly addresses the audience. Is the audience swarm intelligence researchers? Other users of neural nets who would like some other optimization algorithm?
  • Particle Swarm Optimization is mentioned many times, but I've never heard of it. Can it be used in place of sklearn's GridSearch (i.e., is it useful for hyperparameter tuning in machine learning)? I'd like to see a brief description of what (if any) guarantees PSOs have.
  • What's the difference between the packages for the global min and the local min? If minimizing a convex function, these two will produce the same result. Are you most concerned about non-convex functions?

Hi @stsievert ,

Thank you so much for your comments.

As an aside, PSO is a class of search algorithms that can be used to optimize functions whether or not they are differentiable. This library provides implementations of these algorithms that can be applied to a wide range of optimization problems. Of course, the performance will depend on the PSO parameters (you can control how independent the particles are from one another, how much they follow the herd, etc.) and on the type of PSO used for a given problem. A small sketch of the kind of objective this admits is shown below.
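
As an illustration (not taken from the package itself), here is a minimal, hedged sketch of a non-differentiable objective in the vectorized form the pyswarms documentation describes: the whole swarm goes in, one cost per particle comes out. The function name is hypothetical, and the exact input convention in v0.1.7 may differ slightly.

```python
import numpy as np

def absolute_cost(swarm):
    """Sum of |x_i| for each particle.

    Non-differentiable at zero, which is fine for PSO since it only needs
    cost values, not gradients. `swarm` is assumed to be an
    (n_particles, dimensions) array; the result has shape (n_particles,).
    """
    return np.sum(np.abs(swarm), axis=1)
```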

As for your comments:

  • The package is intended for swarm intelligence researchers and students who would like a high-level interface for implementing PSO in their problems. It doesn't have to be a neural-net problem; that use-case simply demonstrates how the parameters of a neural net can be optimized through this package. It also doesn't need to be an ML problem: as long as an objective function is defined, it can be used as input to pyswarms. This can range from performing inverse kinematics in robotics to job-shop scheduling (it just so happens that most of my use-cases are related to ML).
  • Although you can use swarm intelligence algorithms for finding the hyperparameters of a machine learning classifier [1] [2], similar to GridSearch/RandomSearch in sklearn, the GridSearch and RandomSearch in this package are only for finding the "hyperparameters" of the PSO algorithm itself (i.e., the parameters that control its behaviour). In a vanilla PSO, you have parameters that control how dependent the particles are on one another, or how closely they follow their neighbors.
  • I believe you're referring to GlobalBestPSO and LocalBestPSO? These don't necessarily refer to global or local minima; rather, they define the "topology" of the swarm. In GlobalBestPSO, all particles follow a single leader, the particle with the best cost in the current iteration [3]. In LocalBestPSO, the particles are divided into neighborhoods/clusters and only look at the sub-leader within their respective neighborhoods [4]. I'm using the terms GlobalBest and LocalBest because they are the ones used in the literature [3] [4]. A minimal usage sketch follows after this list.
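
As a rough usage sketch (based on the documented examples; the constructor and optimize() signatures in v0.1.7 may differ slightly, and the option values here are illustrative only):

```python
import numpy as np
import pyswarms as ps

def sphere(swarm):
    # Cost per particle; input has shape (n_particles, dimensions).
    return np.sum(swarm ** 2, axis=1)

# GlobalBestPSO: every particle follows the single best particle in the swarm.
# c1/c2 weight the cognitive/social terms, w is the inertia weight.
options = {"c1": 0.5, "c2": 0.3, "w": 0.9}
gbest = ps.single.GlobalBestPSO(n_particles=30, dimensions=2, options=options)
best_cost, best_pos = gbest.optimize(sphere, iters=100)

# LocalBestPSO: particles follow only the best particle in their neighborhood,
# so the options also need k (number of neighbors) and p (the Minkowski p-norm,
# 1 or 2, used to pick those neighbors).
local_options = {"c1": 0.5, "c2": 0.3, "w": 0.9, "k": 3, "p": 2}
lbest = ps.single.LocalBestPSO(n_particles=30, dimensions=2, options=local_options)
best_cost, best_pos = lbest.optimize(sphere, iters=100)
```

On a convex function like the sphere, both topologies should reach the same minimum; the local-best topology mainly helps avoid premature convergence on multimodal problems.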

Thank you so much, I hope it clears things up! 😄


[1] D. Floreano, P. Dürr, and C. Mattiussi, “Neuroevolution: from architectures to learning,” Evolutionary Intelligence, 2008.
[2] K. Stanley, D. B. D’Ambrosio, and J. Gauci, “A hypercube-based encoding for evolving large-scale neural networks,” Artificial Life, 2009.
[3] J. Kennedy and R. C. Eberhart, “Particle Swarm Optimization,” Proceedings of the IEEE International Joint Conference on Neural Networks, 1995, pp. 1942-1948.
[4] Y. Shi and R. C. Eberhart, “A modified particle swarm optimizer,” Proceedings of the IEEE International Conference on Evolutionary Computation, 1998, pp. 69-73.

The package is intended for swarm intelligence researchers and students who would like a high-level interface for implementing PSO in their problems.

Thanks – that clears this up. I don't think the above is expressed clearly in your docs. I would like to see the same sentence, maybe with "high-level interface" expanded.

Thank you so much, I hope it clears things up!

It does! Thank you.

Thank you @stsievert. Unfortunately, I'm on a trip, so I might only be able to handle this on Thursday (UTC+09:00).

Unfortunately, I'm on a trip, so I might only be able to handle this on Thursday

No worries – take your time!

Hi @stsievert , I've made the changes you requested in both paper.md and README.rst. I hope it's much better now. Thanks a lot!

Thanks – this looks better! I've commented in https://github.com/ljvmiranda921/pyswarms/issues/56 for some more small edits.

@stsievert thanks for your comments so far! I noticed that a few of the checkboxes at the top are still empty, is your review still in progress?

(Also, please do check off that first one about the conflict of interest)

Update: The latest commit resolves ljvmiranda921/pyswarms#56

Hi @stsievert , the latest commit applies the changes needed. 👍

@kyleniemeyer my review is still in progress. I still need to download and verify the claims in the documentation. Plus I'd like @oesteban's comment too.

Update: The latest commit resolves ljvmiranda921/pyswarms#61

I've updated the review, and all boxes are checked.

Hi @oesteban, have you had a chance to look at the software package and paper, if you are still able to?

Hi @kyleniemeyer , sorry I wasn't able to check this recently. Thank you for the review @stsievert !

Hi @kyleniemeyer ! :) I am just wondering what the status of this would be. Thanks a lot!

Hi @ljvmiranda921, sorry for the delay on this.

It looks like your software is good to go, but I do have some feedback on the article before accepting.

Right now, the article is a bit short; we don't want a full-length paper, but per the author guidelines we do expect between 250 and 1000 words. Perhaps you could describe the implementation a bit more, or, better yet, explain some example use cases. Examples of the software being used in research (whether already published or in progress) are also helpful. In addition, it may be helpful to explain PSO in a sentence or two, with an appropriate reference, at the beginning.

Hi @kyleniemeyer , thanks a lot! I have updated the paper and the changes can be seen in ljvmiranda921/pyswarms#71 . If everything looks good, I can already merge the branch to master. :smile:

@ljvmiranda921 looks good! Please merge that, and I'll try generating the article PDF.

@kyleniemeyer , merged! :+1:

@whedon generate pdf

Attempting PDF compilation. Reticulating splines etc...
https://github.com/openjournals/joss-papers/blob/joss.00433/joss.00433/10.21105.joss.00433.pdf

@ljvmiranda921 alright, looks good. Can you now archive the entire software repository (e.g., using Zenodo) and report the DOI back here? That'll be the last thing needed.

Thanks! Hmmm, I currently have a Zenodo DOI for the latest version, v0.1.7 (link, DOI: 10.5281/zenodo.996029). Do I still need to make a new one?

It should reflect the latest version including the changes you made to address the reviewer comments here—if that version was archived in September, then I don't think it would have the newer changes you made.

Got it, will archive via Zenodo

Hi @kyleniemeyer , sorry for my confusion; I just want to be careful. If I archive via Zenodo, that means I will create a new release, thus bumping the version number of my submission (just following the instructions from this link). This will then be v0.1.8, which is "different" from v0.1.7; is it okay to proceed? Thanks a lot! :+1:

@ljvmiranda921 yeah that's no problem, we typically expect the version number to change after the code goes through review.

Here is the DOI: 10.5281/zenodo.1145432 (link)

@whedon set 10.5281/zenodo.1145432 as archive

OK. 10.5281/zenodo.1145432 is the archive.

@arfon this is now accepted and ready to publish

Thank you @kyleniemeyer for helping me improve my paper, and to @stsievert for reviewing my submission! :smiley:

@stsievert - many thanks for your review and to @kyleniemeyer for editing this submission ✨. @ljvmiranda921 - your submission is now accepted into JOSS and your DOI is https://doi.org/10.21105/joss.00433 ⚡️ 🚀 💥

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippet:

[![DOI](http://joss.theoj.org/papers/10.21105/joss.00433/status.svg)](https://doi.org/10.21105/joss.00433)

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider volunteering to review for us sometime in the future. You can add your name to the reviewer list here: http://joss.theoj.org/reviewer-signup.html
