Related to #5974
The goal is to create a UI that makes it easy to upload CSV files to Elasticsearch and start playing with them in Kibana without having to use any external tools like Logstash.
Backslash (`\`) escaping will require slightly different code, and I'm not sure what other escaping schemes are common in CSVs. Perhaps there's a pre-existing library I can use that will cover all the common cases.

The CSV wizard will appear as an additional option on the Add Data landing page.
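To make the escaping question concrete, here's a side-by-side sketch of the two schemes (Python is used only for illustration; the wizard itself would live in Kibana's JavaScript codebase). RFC 4180 doubles embedded quotes, while backslash escaping needs different parser settings:

```python
import csv
import io

# Two common schemes for embedding a quote inside a quoted CSV field
# (intended field value: she said "hi"):
rfc4180 = 'name,quote\nalice,"she said ""hi"""\n'      # RFC 4180: doubled quotes
backslash = 'name,quote\nalice,"she said \\"hi\\""\n'  # backslash escaping

# RFC 4180 style is the csv module's default behavior (doublequote=True).
rows = list(csv.reader(io.StringIO(rfc4180)))
print(rows[1])  # ['alice', 'she said "hi"']

# Backslash escaping requires explicitly different parser settings.
rows = list(csv.reader(io.StringIO(backslash), doublequote=False, escapechar="\\"))
print(rows[1])  # ['alice', 'she said "hi"']
```

Both inputs decode to the same field value, but only because the parser was told which scheme to expect, which is why supporting both means slightly different code paths.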

The first screen will allow the user to select a CSV file for upload and show them a preview in tabular format. The preview will initially show a limited number of rows from the file, potentially with the option to page through. The user will have an option to pick a delimiter and a quote/escape character. We might try to auto-detect these.
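Auto-detection seems feasible. As one data point, Python's standard-library `csv.Sniffer` guesses a dialect from a sample of the file, which is the kind of heuristic the wizard could use to pre-fill the delimiter and quote controls (again, Python only as an illustration):

```python
import csv

# A small sample of a semicolon-delimited file.
sample = "name;age;city\nalice;34;london\nbob;29;paris\n"

# Sniffer inspects the sample and guesses the dialect; restricting the
# candidate delimiters makes the guess more reliable.
dialect = csv.Sniffer().sniff(sample, delimiters=",;\t|")
print(dialect.delimiter)   # ';'
print(dialect.quotechar)   # '"'
```

The detected values would only pre-fill the form; the user could still override them before previewing.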

This step allows the user to build an ingest pipeline to manipulate their data before it gets indexed into Elasticsearch. It works just like the pipeline step in the tail-a-file wizard (#5974).
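For illustration, here's a hypothetical pipeline body the wizard might generate using standard Elasticsearch ingest processors (the field names are made up; the real pipeline depends entirely on the user's data). The wizard would PUT this to `_ingest/pipeline/<name>`; the sketch just builds the request body:

```python
import json

# A hypothetical ingest pipeline: rename a column, convert a string
# column to an integer, and parse a date column. Field names here are
# invented for illustration only.
pipeline = {
    "description": "Pipeline generated by the CSV upload wizard",
    "processors": [
        {"rename": {"field": "temp_f", "target_field": "temperature_fahrenheit"}},
        {"convert": {"field": "temperature_fahrenheit", "type": "integer"}},
        {"date": {"field": "event_date", "formats": ["yyyy-MM-dd HH:mm:ss"]}},
    ],
}

# Serialize the body as it would be sent to Elasticsearch.
print(json.dumps(pipeline, indent=2))
```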

This step creates a Kibana index pattern based on the sample output from the ingest pipeline. It allows the user to specify an index pattern name, a timestamp field, and the mapping type of each of their fields. It works just like the matching step in the tail-a-file wizard (#5974).
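One way to pre-fill the per-field mapping types is to infer them from the sample values. A rough Python sketch of such heuristics (these are assumptions for illustration, not Kibana's actual implementation):

```python
from datetime import datetime

def guess_type(values):
    """Guess an Elasticsearch mapping type from a column's sample values."""
    def all_parse(fn):
        try:
            for v in values:
                fn(v)
            return True
        except ValueError:
            return False
    if all_parse(int):
        return "long"
    if all_parse(float):
        return "double"
    # Only one date format is tried here; a real implementation would try many.
    if all_parse(lambda v: datetime.strptime(v, "%Y-%m-%d %H:%M:%S")):
        return "date"
    return "keyword"

# Hypothetical sample columns taken from the preview step.
sample = {
    "age": ["34", "29"],
    "score": ["1.5", "2.25"],
    "event_date": ["2016-03-14 09:30:00", "2016-03-14 10:00:00"],
    "city": ["london", "paris"],
}
mapping = {field: guess_type(vals) for field, vals in sample.items()}
print(mapping)  # {'age': 'long', 'score': 'double', 'event_date': 'date', 'city': 'keyword'}
```

As with the delimiter, the guesses would only be defaults; the UI lets the user override each field's type before saving.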

Once the user clicks save in the previous step, Kibana will do a number of things in the background: create the ingest pipeline, index the contents of the file, and create the index pattern.
This page will probably need some sort of progress indicator in case it's working with a large file. Once all the operations are complete, we'll enable a button that takes the user to the Discover page for their new index pattern.
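The indexing step could be chunked so a progress indicator has something to report for large files. A minimal sketch, with the bulk request stubbed out and hypothetical helper names (`send_bulk`, `on_progress` are not real APIs):

```python
import csv
import io

def bulk_index_with_progress(csv_text, chunk_size, send_bulk, on_progress):
    """Index CSV rows in chunks, reporting fractional progress after each."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    total = len(rows)
    for start in range(0, total, chunk_size):
        chunk = rows[start:start + chunk_size]
        send_bulk(chunk)                      # e.g. a POST to the _bulk API
        done = min(start + chunk_size, total)
        on_progress(done / total)             # 0.0 .. 1.0 for the UI

# Ten fake rows, indexed 4 at a time; collect the progress callbacks.
csv_text = "name,age\n" + "".join(f"user{i},{20 + i}\n" for i in range(10))
progress = []
bulk_index_with_progress(csv_text, 4, send_bulk=lambda chunk: None,
                         on_progress=progress.append)
print(progress)  # [0.4, 0.8, 1.0]
```

A real implementation would stream the file rather than load it into memory, but the chunk/progress bookkeeping is the same.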

My 2c
This might be a good dataset to play with for the advanced use case... multiple files per logical index, date/time stamps, geo coordinates:
http://www1.ncdc.noaa.gov/pub/data/swdi/stormevents/csvfiles/

@tbragin how crucial do you think the customizable quote character is? The library I'm using right now only supports double quotes, and RFC 4180 specifies double quotes should be used, so I'm wondering if it's worth waiting to see if there's any demand for customization before adding it.
@Bargs I don't think it's necessary for phase 1 - I'd be fine taking the wait-and-see approach on that one.
There is a demand to upload large CSV files into Kibana without using Logstash.