Rasa: Unable to detect entities from exported data of api.ai

Created on 17 Dec 2016 · 11 comments · Source: RasaHQ/rasa

I imported one agent's data from api.ai into rasa and saw the following issues in the responses:
1) It is not able to detect entities.
2) It works fine for exact training statements, but for even slightly different ones it detects random intents.
3) Even for gibberish input it detects one intent or another. Given that we don't get a confidence score for the detection, it's impossible to know whether the detected intent is correct.

To further debug the issue if you want to have exported data let me know I will mail you.


All 11 comments

Hi @jayeshathila

I will look into the issue today; in the meantime, do you mind sending me some data by email at ........?
Thank you for your help.

@plauto I tried mailing the data to .... but delivery is failing. Can you please check whether the email ID given is active?

@jayeshathila sorry, my bad. I was texting over my phone. Try again at .......
EDIT: Just received. Thank you :)

Ok, I found out the problem. Basically we should generate some examples from the placeholders that you have in your entities... I am writing some code that generates those samples, so that the training will actually happen :)
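In case it helps to picture the idea, here is a rough sketch of that kind of expansion (the function name and sample data below are purely illustrative, not the actual fix): each placeholder in an api.ai-style template gets substituted with sample values to produce concrete, labelled training examples.

```python
from itertools import product

def expand_template(template, entity_values):
    """Expand an api.ai-style template such as
    "show me @cuisine restaurants in @location" into concrete
    training examples, one per combination of sample values."""
    # Process placeholders in the order they appear in the template so
    # that the recorded character offsets stay valid after substitution.
    placeholders = sorted(
        (name for name in entity_values if "@" + name in template),
        key=lambda name: template.index("@" + name))
    examples = []
    for combo in product(*(entity_values[name] for name in placeholders)):
        text, entities = template, []
        for name, value in zip(placeholders, combo):
            start = text.index("@" + name)
            text = text.replace("@" + name, value, 1)
            entities.append({"start": start, "end": start + len(value),
                             "value": value, "entity": name})
        examples.append({"text": text, "entities": entities})
    return examples

# e.g. four generated examples: indian/chinese x london/berlin
print(expand_template("show me @cuisine restaurants in @location",
                      {"cuisine": ["indian", "chinese"],
                       "location": ["london", "berlin"]}))
```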

@jayeshathila I am working on a fix. It's taking a bit longer than expected since we need to implement something like this:

https://api.ai/blog/2014/12/10/System_Entities/

https://docs.api.ai/docs/concept-entities#section-system-entities

So far I have some code that expands your entities for the title and vendor and generates the corresponding training examples. While this works great, I also need to implement something extra to reproduce the system entities!
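To give a sense of that "something extra" (the names and sample values here are hypothetical, not from the actual patch): system entities such as @sys.date have no user-supplied value list in the export, so the expansion needs its own table of representative samples to fall back on.

```python
# Hypothetical sample values for a few common api.ai system entities,
# so that templates containing them can still be expanded into training text.
SYSTEM_ENTITY_SAMPLES = {
    "sys.date": ["tomorrow", "next friday", "december 24th"],
    "sys.number": ["3", "42", "twelve"],
    "sys.geo-city": ["london", "berlin", "tokyo"],
}

def sample_values(entity_name, user_entities):
    """User-defined entities come from the exported agent; system
    entities fall back to the hard-coded samples above."""
    if entity_name in user_entities:
        return user_entities[entity_name]
    return SYSTEM_ENTITY_SAMPLES.get(entity_name, [])
```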

@plauto I'm having similar issues with intent and entity classification using the training data given in the rasa NLU tutorial (restaurant_search), though this only occurs when I'm using spaCy as the backend. With MITIE it's fine.

Will the fix solve this too? I'd much appreciate your help.

@yangsterr could you please elaborate on your problem, e.g. give the input, expected output, and actual output? That would help us understand whether this is the same thing.

@amn41 I'm trying to replicate the results in the restaurant search bot tutorial. Let's focus on the entity issue.

What I've done: I changed the backend value in the config file to spacy_sklearn, used demo-rasa.json to train the model (this is surprisingly fast, under 30 seconds), added "server_model_dir": "./model_YYYYMMDD-HHMMSS" to the config.json file, started the server, and ran curl commands to parse a few sentences:

curl "I am looking for Chinese food" returns me with no entity.

curl "I want to eat Indian food" returns me with "I" as cuisine entity and "Indian" as a location entity.

curl "Show me Indian restaurant" returns me the right entity which is cuisine - Indian.

curl labelled training example such as "central indian restaurant" but entity cuisine - Indian was not detected.

I'm wondering how to fix this inconsistency, which I see with spaCy but not with MITIE.
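For anyone wanting to reproduce the comparison, a small script along these lines would send the same four sentences and print what comes back (this assumes the server runs on its default local port and that this rasa NLU version exposes a /parse?q= endpoint returning intent and entities fields; those details are assumptions, so adjust to your setup):

```python
import requests

SENTENCES = [
    "I am looking for Chinese food",
    "I want to eat Indian food",
    "Show me Indian restaurant",
    "central indian restaurant",
]

for text in SENTENCES:
    # Same request the curl commands above make, just in a loop.
    response = requests.get("http://localhost:5000/parse", params={"q": text})
    result = response.json()
    print(text, "->", result.get("intent"), result.get("entities"))
```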

I'm seeing exactly the same results as @yangsterr reported with spaCy.
It indeed works perfectly with MITIE, as in the tutorial.

I can also confirm a similar result to @naoko and @yangsterr: I see the surprisingly quick initial training with spacy_sklearn, and there is no output in the terminal (I'm guessing there should be something akin to what MITIE shows), although I didn't think that odd until I'd seen the training go through with MITIE.

However, it did create the model folder, so it was only when I couldn't get it to recognise any entities that I realised something was up...

The spacy_sklearn issue shouldn't alarm you too much. It's my fault for not documenting it correctly: the tutorial example has far too little training data to really work, and it's somewhat lucky that MITIE can manage with it. I've added a comment to that effect in the documentation.
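For instance (a hedged sketch, assuming the tutorial's demo-rasa.json uses the usual rasa_nlu_data / common_examples layout; the extra sentences below are made up), appending more labelled examples to the training file is the kind of change that gives the spacy_sklearn pipeline enough to learn from:

```python
import json

# Hypothetical extra cuisine examples; more variation like this is what
# the spacy_sklearn pipeline needs to pick up the entity reliably.
extra = [
    {"text": "I am looking for Chinese food",
     "intent": "restaurant_search",
     "entities": [{"start": 17, "end": 24, "value": "Chinese", "entity": "cuisine"}]},
    {"text": "anywhere that does good Mexican",
     "intent": "restaurant_search",
     "entities": [{"start": 24, "end": 31, "value": "Mexican", "entity": "cuisine"}]},
]

with open("data/demo-rasa.json") as f:
    data = json.load(f)

data["rasa_nlu_data"]["common_examples"].extend(extra)

with open("data/demo-rasa.json", "w") as f:
    json.dump(data, f, indent=2)
```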

Regarding the idea of supporting something like api.ai's shared entities, that's something for the roadmap, and contributions are very welcome :)

Closing this for now.
