I'm using a Python script to copy/insert data from Google Cloud Storage into a BigQuery table:
uri = 'gs:// .... .csv.gz'
load_job = client.load_table_from_uri(
    uri,
    dataset_ref.table('testtable'))  # API request
I'm getting:
google.api_core.exceptions.BadRequest: 400 Error while reading data, error message: CSV table encountered too many errors, giving up. Rows: 1; errors: 1. Please look into the error stream for more details.
But it doesn't say where the error stream is. I checked Job History in the UI, and the job isn't listed there. How can I get access to this error stream?
When I load the same file from the UI with auto-detect enabled, everything works and the data loads correctly.
The load_job instance has an errors property, which returns the list of error mappings produced by the API call. See also the BigQuery documentation on troubleshooting errors.
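As a sketch of how to surface that error stream: call load_job.result() to wait for the job, and on failure inspect load_job.errors, which is a list of dicts with keys such as "reason" and "message". The helper below only formats that list, so the client/URI names in the commented usage are placeholders, not values from your project.

```python
def format_load_errors(errors):
    """Render the list of error mappings returned by the BigQuery API.

    Each entry is a dict; "reason" and "message" are the useful keys.
    """
    if not errors:
        return 'no errors reported'
    return '\n'.join(
        '{}: {}'.format(e.get('reason', 'unknown'), e.get('message', ''))
        for e in errors)

# Typical usage (requires google-cloud-bigquery and credentials;
# client, dataset_ref, and uri are placeholders from the question):
#
# load_job = client.load_table_from_uri(uri, dataset_ref.table('testtable'))
# try:
#     load_job.result()  # blocks until the job finishes, raises on failure
# except Exception:
#     print(format_load_errors(load_job.errors))
```

Calling result() (rather than only submitting the job) is what makes the failure visible in your script instead of only in the service-side job history.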