Robomongo: Import large json not working - no error

Created on 31 Mar 2014 · 3 Comments · Source: Studio3T/robomongo

I am attempting to import around 1350 documents at once from a very large JSON paste. Each document has about 103 attributes, so needless to say, the JSON dump is very large. When I execute the insert with the JSON pasted, it chugs for about 10 seconds and then reports success; however, no documents are actually inserted. If I break the large JSON paste into smaller chunks (inserting maybe 100-250 at a time), it works fine. It seems like the number of documents or the size of the text is limiting what happens. I am receiving no error messages. Is there a hard limit?

wontfix

Most helpful comment

If you provide a single document, the above command is right for sure; otherwise, for multiple documents, this command works fine:
mongoimport --jsonArray --db Research --collection fields --file jsonSample.json

All 3 comments

@kphamilton If you need to import JSON in bulk, you would be much better off using the mongoimport command line tool that ships with MongoDB. It should display errors for any documents or JSON that cannot be imported.

Sample usage:

 mongoimport --db mydb --collection contacts --file contacts.json

Importing multiple documents via Robomongo is currently a quick aid for inserting a few documents rather than a full bulk import tool ;-)
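For reference, mongoimport by default expects the input file to contain one JSON document per line, so a contacts.json for the sample command above might look something like this (the field values here are just placeholders):

 { "name": "Alice", "email": "alice@example.com" }
 { "name": "Bob", "email": "bob@example.com" }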

If you provide a single document, the above command is right for sure; otherwise, for multiple documents, this command works fine:
mongoimport --jsonArray --db Research --collection fields --file jsonSample.json

I also had to resort to using mongoimport. My data consisted of a JSON array, and my problem was a missing --jsonArray command line argument.
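For anyone hitting the same thing: --jsonArray tells mongoimport that the file holds a single JSON array of documents rather than one document per line. A jsonSample.json in that format might look something like this (the documents here are just placeholders):

 [
   { "name": "fieldA", "type": "string" },
   { "name": "fieldB", "type": "number" }
 ]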

Thanks @MohammadHeydari!
