Hi guys, I'm trying to run the Parikh example, and when running the test using
py.test keras_parikh_entailment/keras_decomposable_attention.py
I get an error:
ValueError: You called set_weights(weights) on layer "embed" with a weight list of length 1, but the layer was expecting 0 weights. Provided weights: [array([[ 0.00000000e+00, 0.00000000e+00, 1.4...
I've tried a couple of different environments (Python 2.7 and 3.5, CPU & GPU).
I updated to the most recent versions of Keras and spaCy, and tried on both Mac and Ubuntu.
Thanks
Hm, looking at the code here, I made some edits that should have been pushed to a branch. I'm not sure the test is still correct.
Thanks for the report — looking into this. In the meantime, if you look through the history of the file, you should be able to find a working version.
Edit:
$ py.test examples/keras_parikh_entailment/keras_decomposable_attention.py
examples/keras_parikh_entailment/keras_decomposable_attention.py ..
====================================
2 passed in 349.79 seconds
Can't get over how long Theano can take to compile RNNs... But it does work for me?
Hmm, I tried all the versions. I also checked that my spaCy install was working correctly (ran the tests). I'm thinking it's maybe a Keras version mismatch. Do you know which version works for you?
Thanks
Strange. One error I get sometimes is that py.test can be referencing the wrong spaCy version, e.g. if it's not installed in my virtualenv.
Definitely double-checked that. This seems to be more on the Keras side. It looks like the embedding layer is expecting a weight list of length zero (not sure about that one), but it's getting a list of length 1. I've had issues with Keras in the past where I had to reshape the numpy array for my weights, but that was before they froze the APIs. Haven't seen this one before, hence the shout-out.
pip list is showing Keras (1.2.0), if that helps.
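As a general illustration (not necessarily the root cause here), the "expecting 0 weights" part of the message is what Keras reports when set_weights() is called on a layer that hasn't been built yet, since an unbuilt layer has an empty weight list. A minimal, hypothetical snippet:
import numpy as np
from keras.layers import Embedding

vectors = np.zeros((1000, 300), dtype='float32')  # stand-in vocab x dim, not the example's real vectors

embed = Embedding(1000, 300, name='embed')
print(embed.get_weights())        # [] -- no weights exist until the layer is built
# embed.set_weights([vectors])    # would raise the same "expecting 0 weights" ValueError

embed.build((None, 10))           # building creates the (1000, 300) weight matrix
embed.set_weights([vectors])      # a weight list of length 1 is now accepted
# Alternatively, Embedding(1000, 300, weights=[vectors]) applies the weights once the model is built.
That doesn't pinpoint the bug here, but it's what the message means.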
Try fiddling with the embedding layer, e.g. disabling the tuning part. I'm playing a little trick where I represent the learned vectors as the sum of the originals and the fine-tuning. The advantage is that this lets us tune only a fixed number of vectors, e.g. the 1000 most common. But it means we have more complexity here --- the Lambda layer to sum, for instance. I'm never confident I'm writing my Keras code correctly when I have sequence inputs.
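For reference, here's a rough, simplified sketch of that trick (the names, sizes, and initialisation below are illustrative; the real example also clips the ids so that only the most common words map to tunable rows):
import numpy as np
from keras.layers import Input, Embedding, Lambda

vectors = np.zeros((10000, 300), dtype='float32')   # stand-in for the pretrained vectors from spaCy's vocab
max_length = 50

word_ids = Input(shape=(max_length,), dtype='int32', name='words')

# Frozen layer holding the original vectors.
static = Embedding(vectors.shape[0], vectors.shape[1],
                   weights=[vectors], trainable=False, name='embed')(word_ids)

# Trainable layer that learns a per-word adjustment.
delta = Embedding(vectors.shape[0], vectors.shape[1], name='tune')(word_ids)

# The vector used downstream is original + fine-tuning, summed with a Lambda.
embedded = Lambda(lambda pair: pair[0] + pair[1],
                  output_shape=lambda shapes: shapes[0])([static, delta])
So if the test is dying in set_weights() on the "embed" layer, disabling the tuning path should narrow down whether the problem is in this extra machinery or in Keras itself.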
I'll dig in and see what I find. My Keras knowledge is less than optimal, but I should be able to figure it out. I'll leave this open for now so I can post back with anything I find.
Just wanted to add that I've run into the same issues as @pchowdhry with this example.
Could you try this commit: 738f38e8d?
Tried that commit, same issue.
I have the same problem as well.
This "The regularizers property of layers/models is deprecated. Regularization losses are now managed via the losses layer/model property." leads me to believe that Keras may be out of sync with TF.
str(weights)[:50] + '...')
E ValueError: You called set_weights(weights) on layer "embed" with a weight list of length 1, but the layer was expecting 0 weights. Provided weights: [array([[ -1.06741039e+11, 4.55912455e-41, -1.0...
/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py:966: ValueError
----------------------------------------- Captured stderr call -----------------------------------------
/usr/local/lib/python2.7/dist-packages/keras/engine/topology.py:368: UserWarning: The regularizers property of layers/models is deprecated. Regularization losses are now managed via the losses layer/model property.
warnings.warn('The regularizers property of '
mmacy@DuhBuntu:~/devel/spaCy/examples$ pytest keras_parikh_entailment/keras_decomposable_attention.py
========================================= test session starts ==========================================
platform linux2 -- Python 2.7.12, pytest-3.0.5, py-1.4.32, pluggy-0.4.0
rootdir: /home/mmacy/devel/spaCy, inifile:
collected 2 items
keras_parikh_entailment/keras_decomposable_attention.py ..
====================================== 2 passed in 21.40 seconds =======================================
@pchowdhry @jfoster17 try
% pip uninstall Keras
% git clone https://github.com/fchollet/keras.git; cd keras; python setup.py build; python setup.py install
Although the code in git is still called 1.2.0, fchollet is getting ready to cut a 2.0 and has, by all appearances, fixed this in his master branch.
Hope that helps.
Thanks Matt, this worked for me. Appreciate the help.
-pankaj
I found a number of issues in the spaCy example keras_parikh_entailment.
Are we 100% sure that example runs, regardless of Keras?
Have you upgraded to the version in git as dictated by the documentation?
I always install Keras via git. That said, in spacy_hook.py
you define
class KerasSimilarityShim(object):
    ...
and then
def create_similarity_pipeline(nlp):
    return [SimilarityModel.load(
        nlp.path / 'similarity',
        nlp,
        feature_extracter=get_features)]
Is this correct?
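(If the intent is to load the shim class that's actually defined there, I'd have expected something along these lines; just my reading of it, not a confirmed fix:)
def create_similarity_pipeline(nlp):
    return [KerasSimilarityShim.load(
        nlp.path / 'similarity',
        nlp,
        feature_extracter=get_features)]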
The code definitely runs for me. However, I usually use the Theano backend, and I'm finding now that it doesn't work on TensorFlow. I'm not sure what's wrong.
@pchowdhry, @mattmacy, have you got this example working? I'm trying to use it, but even a simple evaluate or demo run fails for me.
Running into some issues when trying to run both tests:
TypeError: call() got an unexpected keyword argument 'mask'
../../../anaconda/lib/python3.6/site-packages/Keras-2.0.2-py3.6.egg/keras/engine/topology.py:554: TypeError
The TypeError is coming from these two lines (181 & 182) of code in keras_decomposable_attention.py:
avged = GlobalAveragePooling1D()(result, mask=self.words)
maxed = GlobalMaxPooling1D()(result, mask=self.words)
changing it to:
avged = GlobalAveragePooling1D()(result) #mask=self.words
maxed = GlobalMaxPooling1D()(result) #mask=self.words
fixes the issue and gets the tests to pass.
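One caveat: without the mask, padded timesteps are included in the pooling. If that matters for accuracy, here's a rough sketch of doing the masking explicitly with a Lambda instead of the mask keyword (assuming words is the integer word-id input and 0 is the padding id; that's my assumption, not necessarily the example's convention):
from keras.layers import Lambda
import keras.backend as K

def masked_mean(inputs):
    seq, word_ids = inputs                                  # seq: (batch, time, dim); word_ids: (batch, time)
    mask = K.cast(K.not_equal(word_ids, 0), 'float32')      # 1.0 for real tokens, 0.0 for padding
    mask = K.expand_dims(mask, -1)                          # (batch, time, 1)
    return K.sum(seq * mask, axis=1) / (K.sum(mask, axis=1) + K.epsilon())

avged = Lambda(masked_mean,
               output_shape=lambda shapes: (shapes[0][0], shapes[0][2]))([result, words])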

Hi, one way to fix the issue is to revert to this commit:
https://github.com/RaffEdwardBAH/keras/commit/5e17b41938c6f80911d7cacbec6923e7111c5567
That's a branch by someone who tried to make a pull request fixing the mask keyword issue, but for some reason it was never merged. Clone that repo, check out that commit, run pip install ., change the backend from the default TensorFlow to Theano, and the error should be gone. There are still some issues I've had with the source code, so I'll be making a pull request addressing them soon. Hope that's helpful :)
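As a sanity check after reinstalling, you can confirm which Keras version and backend are actually active (KERAS_BACKEND has to be set before keras is imported):
import os
os.environ['KERAS_BACKEND'] = 'theano'   # override the configured backend for this process

import keras
print(keras.__version__)                 # e.g. 1.2.2
print(keras.backend.backend())           # should print 'theano'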
To build on @mattmacy's answer:
You should check out the right Keras commit; I used 1.2.2 (use a virtualenv beforehand if necessary).
Make the following modification in keras_decomposable_attention.py / _Comparison / __call__:
(not necessary for the py.test but necessary for the training)
avged = GlobalAveragePooling1D()(result) # mask=self.words
maxed = GlobalMaxPooling1D()(result) # mask=self.words
pip uninstall Keras
# move to the right directory for you
git clone https://github.com/fchollet/keras.git
cd keras
git checkout 1.2.2
python setup.py build
python setup.py install
# move to the right directory for you
py.test keras_parikh_entailment/keras_decomposable_attention.py
py.test keras_parikh_entailment/keras_decomposable_attention.py
========================================================================================================= test session starts =========================================================================================================
platform darwin -- Python 2.7.10, pytest-3.2.2, py-1.4.34, pluggy-0.4.0
rootdir: /.../spaCy, inifile:
collected 2 items
keras_parikh_entailment/keras_decomposable_attention.py ..
========================================================================================================== warnings summary ===========================================================================================================
examples/keras_parikh_entailment/keras_decomposable_attention.py::test_fit_model
/.../lib/python2.7/site-packages/tensorflow/python/ops/gradients_impl.py:95: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory.
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. "
-- Docs: http://doc.pytest.org/en/latest/warnings.html
================================================================================================ 2 passed, 1 warnings in 25.32 seconds ================================================================================================
Worked for me.
Configuration:
pip list
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
backports.weakref (1.0.post1)
bleach (1.5.0)
certifi (2017.7.27.1)
chardet (3.0.4)
cymem (1.31.2)
Cython (0.23.5)
cytoolz (0.8.2)
dill (0.2.7.1)
en-core-web-sm (1.2.0)
ftfy (4.4.3)
funcsigs (1.0.2)
html5lib (1.0b8)
idna (2.6)
Keras (1.2.2)
Markdown (2.6.9)
mock (2.0.0)
murmurhash (0.26.4)
numpy (1.13.2)
pathlib (1.0.1)
pbr (3.1.1)
pip (9.0.1)
plac (0.9.6)
preshed (1.0.0)
protobuf (3.4.0)
py (1.4.34)
pytest (3.2.2)
PyYAML (3.12)
regex (2017.9.23)
requests (2.18.4)
scipy (0.19.1)
setuptools (36.5.0)
six (1.11.0)
spacy (1.9.0)
tensorflow (1.3.0)
tensorflow-tensorboard (0.1.7)
termcolor (1.1.0)
Theano (0.10.0b3)
thinc (6.5.2)
toolz (0.8.2)
tqdm (4.17.1)
ujson (1.35)
urllib3 (1.22)
wcwidth (0.1.7)
webencodings (0.5.1)
Werkzeug (0.12.2)
wheel (0.30.0)
wrapt (1.10.11)
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs.