I was investigating today what it would take to make MathematicalProgram usable from Python/Matlab. So far, I haven't been able to wrap decision_variable.h because of its use of alias-using directives, like:
template <int rows>
using VectorDecisionVariable = MatrixDecisionVariable<rows, 1>;
Mainline SWIG added support for these directives in 3.0.11, which came out in December 2016, but the SWIG + matlab fork we're using doesn't have those changes.
It might be possible to pull in the changes from mainline SWIG and get this working.
Of course, this is all moot if the SWIG code is going to be thrown away soon. What's the status of that?
Does swig work with typedef instead of using? I can change it to
template<int rows>
typedef MatrixDecisionVariable<rows, 1> VectorDecisionVariable;
How did swig work with eigen_autodiff_types.h? It uses the same using directives.
Making that change produces 35,841 lines of error messages, the first of which is:
/usr/local/bin/c++ -DEIGEN_MPL2_ONLY -DHAVE_SPDLOG -DdrakeOptimization_EXPORTS -I/home/rdeits/locomotion/drake-distro/drake/.. -I/home/rdeits/locomotion/drake-distro/build/install/lib/cmake/nlopt/../../../include/nlopt -isystem /home/rdeits/locomotion/drake-distro/build/install/include/eigen3 -isystem /home/rdeits/locomotion/drake-distro/build/install/include -isystem /home/rdeits/locomotion/drake-distro/build/install/include/ipopt/coin -isystem /home/rdeits/locomotion/drake-distro/build/install/include/ipopt/coin/ThirdParty -Werror=all -Werror=ignored-qualifiers -Werror=overloaded-virtual -Werror=extra -Werror=return-local-addr -Wno-unused-parameter -Wno-missing-field-initializers -O3 -DNDEBUG -fPIC -std=gnu++14 -MMD -MT solvers/CMakeFiles/drakeOptimization.dir/decision_variable.cc.o -MF solvers/CMakeFiles/drakeOptimization.dir/decision_variable.cc.o.d -o solvers/CMakeFiles/drakeOptimization.dir/decision_variable.cc.o -c /home/rdeits/locomotion/drake-distro/drake/solvers/decision_variable.cc
In file included from /home/rdeits/locomotion/drake-distro/drake/solvers/decision_variable.cc:1:0:
/home/rdeits/locomotion/drake-distro/drake/../drake/solvers/decision_variable.h:18:1: error: template declaration of ‘typedef’
typedef Eigen::Matrix<drake::symbolic::Variable, rows, cols> MatrixDecisionVariable;
^
/home/rdeits/locomotion/drake-distro/drake/../drake/solvers/decision_variable.h:20:1: error: template declaration of ‘typedef’
typedef MatrixDecisionVariable<rows, 1> VectorDecisionVariable;
^
/home/rdeits/locomotion/drake-distro/drake/../drake/solvers/decision_variable.h:20:9: error: ‘MatrixDecisionVariable’ does not name a type
typedef MatrixDecisionVariable<rows, 1> VectorDecisionVariable;
^
I think you can't typedef inside a template, at least not in that way :wink:
We haven't seen this with eigen_autodiff_types.h only because we've never swig-wrapped that file.
I've pulled in the 3.0.11 changes from upstream here: https://github.com/rdeits/swig/tree/rd-merge-upstream and nothing exploded. That might actually be a good solution, and doesn't require any changes to your code.
The more fundamental issue remains, however: should I even bother? This will be a substantial effort, which might just get tossed out when we move away from SWIG.
@mwoehlke-kitware should start working on the Boost::Python port next week.
I guess I won't bother, then.
Have been thinking a bit more, and it's becoming clear to me just how enabling it would be if we could get python bindings working for MathematicalProgram. In addition to @rdeits , I could see @patmarion and others using these for working with point cloud perception right in director. MathematicalProgram is at a point where having more (expert) users could make it get a lot better fast. @hongkai-dai and @soonho-tri are adding features fast, but they've got a long list!
@jwnimmer-tri , @david-german-tri , @jamiesnape , et al -- what's the best way to make this happen?
FWIW - there would be a ton of value for getting the bindings to matlab, too. @mposa and others might start using it. That might be a strong argument for swig.
Maybe the Boost::Python spike should be redirected to creating new python bindings for MathematicalProgram, instead of backend-swapping the current pydrake APIs? We could choose to wrap only the APIs that we feel are more stabilized, for now.
Summary of conversation elsewhere:
The swig MATLAB bindings have no users and @RussTedrake has decided we don't need to continue support for them. @rdeits will spike-test Python bindings for MathematicalProgram using SWIG 3.0.11, and will also nuke the MATLAB swig outputs. If the spike test works well, we have the option to replace swig-matlab with modern upstream SWIG (most likely as a binary dependency), or to use Boost::Python.
@mwoehlke-kitware: As a user of MathematicalProgram, @rdeits knows exactly the API he wants to expose, and plans to have the SWIG bindings done ASAP. Once they're available, we should focus Boost::Python effort on replicating them. In the meantime, the existing bindings also matter, and will doubtless teach us something, so you can proceed with them as planned.
Sounds good? Consistent with everyone's understanding?
> we have the option to replace swig-matlab with modern upstream SWIG (most likely as a binary dependency)
SWIG 3.0.2 is in trusty-backports, SWIG 3.0.8 is in xenial, which makes life easier.
Annoyingly, only SWIG 3.0.11 supports the using alias syntax, which is used in MathematicalProgram.
That is very unfortunate. I retract my "makes life easier" comment.
Brief update on this:
Basically, I'm stuck on how to handle the Eigen Vectors of decision variables (which, ironically, I was a strong proponent of in the first place). Nothing I've tried has gotten SWIG to even try to generate proper bindings for VectorXDecisionVariable. The best I can get is a bare pointer that just leaks memory.
I decided to also give pybind11 a try, just to see how it was. Handling Eigen matrices with numeric scalar types is amazingly easy in pybind11 (as in, took basically no effort instead of the days that were required to get SWIG working). Unfortunately, their support for Eigen types with non-numeric arguments seems to be weaker. So far, I haven't gotten it working.
I'm going to keep playing with both approaches.
For reference, the particular API I'm trying to wrap is basically what's used in solvers/test/mixed_integer_optimization_test.cc. That is, I'd like to be able to do something like:
import numpy as np
import pydrake.mathematicalprogram as mp
prog = mp.MathematicalProgram()
x = prog.NewBinaryVariables(3, "x")
c = np.array([-1.0, -1.0, -2.0])
prog.AddLinearCost(c)
a1 = np.array([[1.0, 2.0, 3.0]])
prog.AddLinearConstraint(a1, -np.inf, 4)
etc.
Anyway, not giving up quite yet, but wrapping Eigen with user-defined scalar types has turned out to be difficult.
Follow-up: from discussion with one of the pybind11 devs, automagic wrapping of something like Eigen::Matrix<symbolic::Variable, -1, -1> is unlikely to work on its own. We can still return and use instances of that type in python, but they'll act as opaque objects rather than magically transforming into numpy arrays. Fortunately, we can implement our own methods for that opaque type and essentially re-create the subset of the matrix API that we need (realistically, not much more than basic math and indexing).
The conclusion is probably similar in SWIG: magic wrapping of Matrix<Variable> won't happen on its own, but we could do something like what the swigmake autodiff helpers already do for Eigen autodiff types: https://github.com/rdeits/swigmake/blob/master/swig/common/autodiff.i
Does anyone have input/suggestions on wrapping Eigen types like this? Going down the route that autodiff.i uses is possible, but is a pain to maintain and adds a lot of indirection.
> Unfortunately, their support for Eigen types with non-numeric arguments seems to be weaker.
That is unfortunate, we are using other types as the scalar in Eigen::Matrix, such as Eigen::Matrix<symbolic::Expression, 2, 3>.
Yeah, and I still think that's the right thing to do in C++. It just makes this step more challenging 😉
Just to be clear, in a perfect world I would be able to take a C++ method that returns a Matrix<Variable...> and call it from python to get, ideally, a Numpy array. And then I would be able to multiply that Numpy array by a constant to get a Numpy array of Expressions, which I could pass to a C++ function expecting Matrix<Expression>. I'm not sure how close to that we can realistically get, but it's nice to have dreams.
@patmarion had some useful advice. I had been trying to get the conversion from Matrix<Variable> to numpy to happen automatically for all Eigen::Matrix<Variable> types. His suggestion was instead to:
- make Eigen::Matrix<Variable> an opaque type in the wrapper layer (that is, don't automatically try to convert it to a python type)
- write a wrapper function that takes that opaque Eigen::Matrix<Variable> and uses its length and indexing methods to populate a numpy array with the same underlying data, then returns that array

That leaves the option open for a future clever person to come along and figure out all the magic type conversion under the hood without changing the API.
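To make the second step concrete, here is a minimal Python sketch of the idea. The class `OpaqueMatrix` and the helper `to_numpy` are made-up names standing in for the opaque wrapped type and its conversion helper; the real opaque type would be produced by the wrapper layer:

```python
import numpy as np

class OpaqueMatrix:
    """Stand-in for an opaque wrapped Eigen::Matrix<Variable>: it only
    exposes length and element indexing, with no automatic numpy conversion."""
    def __init__(self, elements):
        self._elements = list(elements)

    def __len__(self):
        return len(self._elements)

    def __getitem__(self, i):
        return self._elements[i]

def to_numpy(opaque):
    """Populate a numpy object array using only the opaque matrix's
    length and indexing methods, sharing the same underlying elements."""
    out = np.empty(len(opaque), dtype=object)
    for i in range(len(opaque)):
        out[i] = opaque[i]
    return out

m = OpaqueMatrix(["x", "y", "z"])
arr = to_numpy(m)
```

The point of the indirection is that callers only ever see the numpy array, so the conversion machinery behind `to_numpy` can later be replaced without changing the API.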
I'm going to try to make a quick prototype of this idea in SWIG and pybind, outside of Drake.
Prototype is here: https://github.com/rdeits/mathprog-wrapper-exploration
The summary is that implementing @patmarion's suggestion was really easy in pybind11. A surprising number of things just work. For example, if I overload + for Variables in C++, then I can do (in python):
x = Variable("x")
y = Variable("y")
xy = numpy.array([x, y])
return np.sum(xy)
and get an Expression out, because numpy will internally call the appropriate c++ addition function.
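The same dispatch mechanism can be demonstrated with a plain Python mock (this `Variable` class is not the pydrake type; its `__add__` just mimics the C++ operator+ overload that the bindings would expose):

```python
import numpy as np

class Variable:
    """Mock of a wrapped symbolic variable; adding two of them produces
    a combined 'expression', mirroring the overloaded C++ operator+."""
    def __init__(self, name):
        self.name = name

    def __add__(self, other):
        other_name = other.name if isinstance(other, Variable) else str(other)
        return Variable("({} + {})".format(self.name, other_name))

    def __radd__(self, other):
        # Support 0 + Variable so Python's built-in sum() also works.
        return self if other == 0 else self.__add__(other)

x = Variable("x")
y = Variable("y")
xy = np.array([x, y])  # numpy stores these as a dtype=object array
expr = np.sum(xy)      # numpy reduces with our __add__ under the hood
```

Because the array has object dtype, numpy falls back to the elements' own arithmetic operators, which is exactly why the wrapped C++ overloads get called for free.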
Integrating it with cmake was also amazingly easy: https://github.com/rdeits/mathprog-wrapper-exploration/blob/master/pybind/CMakeLists.txt#L24
Implementing it with SWIG was...well...I haven't actually gotten it to work yet. The problem is that in order to generate a wrapper for Matrix<Variable> in SWIG, I have to convince SWIG that Eigen::Matrix is really a thing. But SWIG can't actually parse the Eigen headers. So there's probably some manual type conversion that I'll have to write. Suffice it to say, this is much harder. The cmake configuration is also much more complicated: https://github.com/rdeits/mathprog-wrapper-exploration/blob/master/swig/CMakeLists.txt#L25
In SWIG's defense, the parts of the API that it could wrap, it did automatically, while in pybind11 I had to declare each method that I wanted to be wrapped.
By the way, I haven't been trying Boost::Python simply because I have no experience with it at all. I'm sure it's also a good option, but right now I'm more interested in the general comparison of SWIG vs not-SWIG, with pybind11 being my not-SWIG of choice.
hoorah!
This is super exciting, kudos @rdeits! @mwoehlke-kitware, you should read recent developments on this thread if you haven't already.
> I decided to also give pybind11 a try, just to see how it was.
Are you planning to continue exploring the SWIG avenue, or is not-SWIG the approach of choice at this point?
> Integrating it with cmake was also amazingly easy: https://github.com/rdeits/mathprog-wrapper-exploration/blob/master/pybind/CMakeLists.txt#L24
It looks like others have already worked out how to do it in Bazel as well: https://github.com/bazelbuild/bazel/issues/1475
> By the way, I haven't been trying Boost::Python simply because I have no experience with it at all. I'm sure it's also a good option, but right now I'm more interested in the general comparison of SWIG vs not-SWIG, with pybind11 being my not-SWIG of choice.
@mwoehlke-kitware From my notes, you thought pybind11 might have some advantages over Boost::Python. It certainly looks like a clean and simple external, and it avoids even thinking about the Boost battling-versions issues. On the other hand, TRI has internal commitments to Boost::Python, so if we're going to use something else in Drake, it's important to figure out how to make it interoperate.
_EDIT: Thinking about it for a moment, though, why would interop be hard? There are Python functions, produced by whatever underlying wrapper technology, and Python code calls them. Right?_
Personally, I like pybind11 best. I would be surprised if there were issues interoperating.
> By the way, I haven't been trying Boost::Python simply because I have no experience with it at all. I'm sure it's also a good option, but right now I'm more interested in the general comparison of SWIG vs not-SWIG, with pybind11 being my not-SWIG of choice.
From what I can tell, pybind11 is a fork of Boost::Python that has been extracted from the rest of Boost and is updated to use modern C++... with one important difference: it has some Eigen wrapping bits built in. We'd apparently need an extra external to get that with Boost::Python.
> @mwoehlke-kitware From my notes, you thought pybind11 might have some advantages over Boost::Python.
Due to requiring a C++11 compiler (and C++14 awareness), it does some things much more "cleanly" than Boost::Python (less use of macros, particularly). I'm not sure if Boost::Python supports exporting lambdas as Python functions; pybind11 definitely does. CMake integration with pybind11 is really easy.
As discussed in today's meeting, I would guess that Boost::Python code can be converted fairly easily to pybind11. The APIs are very similar, so it "ought" to be a matter of switching includes, and changing the namespace prefix on Boost::Python calls. (Possibly even just changing a namespace alias.)
FTR all of the above sounds great with me. (I had raised the concern with David about making sure we inter-operate with Boost::Python; I think the above plan is solid and I'm not worried anymore.)
I am currently rewriting the IRIS python bindings in pybind11, both in order to fix the build issues like #4914 and to make sure I'm happy with the tool. So far, it's great.
https://github.com/rdeits/drake/tree/pybind-rbtree now has all python tests passing using pybind11 bindings instead of SWIG. There's some more work to be done to expose some IK constraints that aren't being tested but that Director probably needs, and to clean up the build, but overall this is really nice.
https://github.com/rdeits/drake/tree/pybind-mathprog can now run a very basic mathematicalprogram example from python: https://github.com/rdeits/drake/blob/7bffb37628e62e59ea4181963e70a6bd2bdc5a2b/drake/bindings/python/pydrake/test/testMathematicalProgram.py#L15
prog = mathematicalprogram.MathematicalProgram()
x = prog.NewBinaryVariables(3, "x")
c = np.array([-1.0, -1.0, -2.0])
prog.AddLinearCost(c, x)
a = np.array([[1.0, 2.0, 3.0]])
prog.AddLinearConstraint(a, -np.inf, 4, x)
a2 = np.array([1.0, 1.0])
prog.AddLinearConstraint(a2, 1, np.inf, x[:2])
result = prog.Solve()
self.assertEqual(result, mathematicalprogram.SolutionResult.kSolutionFound)
# Test that we got the right solution for all x
x_expected = np.array([1.0, 0.0, 1.0])
self.assertTrue(np.all(np.isclose(prog.GetSolution(x), x_expected)))
hoorah!