
Issue while uploading data to an existing bigquery table #2591

@VikramTiwari

Description

  1. OS type and version: macOS 10.12
  2. Python version and virtual environment information (python --version): Python 2.7.12
  3. google-cloud-python version (pip show google-cloud, pip show google-<service>, or pip freeze):
gax-google-logging-v2==0.8.3
gax-google-pubsub-v1==0.8.3
gcloud==0.18.3
google-api-python-client==1.5.4
google-apitools==0.5.4
google-cloud-dataflow==0.4.1
google-gax==0.12.5
googleapis-common-protos==1.3.5
grpc-google-logging-v2==0.8.1
grpc-google-pubsub-v1==0.8.1
grpcio==1.0.1rc1
  4. Stacktrace (if available):
Traceback (most recent call last):
  File "readAndUploadFacebookReport.py", line 135, in <module>
    main()
  File "readAndUploadFacebookReport.py", line 128, in main
    load_transformed_data_to_bq('fbschema.json', transform_filename)
  File "readAndUploadFacebookReport.py", line 83, in load_transformed_data_to_bq
    job = table.upload_from_file(readable, source_format='text/csv', skip_leading_rows=1)
  File "/usr/local/lib/python2.7/site-packages/gcloud/bigquery/table.py", line 913, in upload_from_file
    return client.job_from_resource(json.loads(response_content))
  File "/usr/local/lib/python2.7/site-packages/gcloud/bigquery/client.py", line 119, in job_from_resource
    config = resource['configuration']
KeyError: 'configuration'
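The KeyError itself looks like a symptom rather than the root cause: job_from_resource in this version indexes the response with resource['configuration'] unconditionally, so an API response that is an error payload (Google APIs return an 'error' key in that case) surfaces as a bare KeyError instead of the underlying error message. A minimal sketch of a more defensive parse (the helper name and messages are illustrative, not part of the library):

```python
import json


def safe_job_from_resource(resource):
    """Sketch of a defensive variant of gcloud's job_from_resource:
    surface the API's own error message instead of a bare KeyError
    when the upload request itself failed."""
    if 'error' in resource:
        # Standard Google API error envelope: {'error': {'code': ..., 'message': ...}}
        raise RuntimeError('BigQuery upload failed: %s'
                           % resource['error'].get('message', resource['error']))
    if 'configuration' not in resource:
        raise RuntimeError('Unexpected API response: %s' % json.dumps(resource))
    return resource['configuration']
```

With a guard like this, the traceback above would instead report whatever the BigQuery API rejected about the load request.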
  5. Steps to reproduce: Tried pushing data into an existing BigQuery table with table.upload_from_file on a CSV file with the proper schema and other configuration; the code breaks on the upload_from_file line shown in the stack trace above.
  6. Code example:
with open(filepath, 'rb') as readable:
    job = table.upload_from_file(readable, source_format='text/csv',
                                 skip_leading_rows=1, write_disposition="WRITE_APPEND")
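One thing worth checking in the snippet above: in this version of gcloud, upload_from_file's source_format is documented as a BigQuery sourceFormat value such as 'CSV', not a MIME type like 'text/csv' (an assumption to verify against the installed docstring). A sketch of the load-job configuration body the call ultimately posts, with a hypothetical helper to make the expected values concrete:

```python
def build_load_config(source_format='CSV', skip_leading_rows=1,
                      write_disposition='WRITE_APPEND'):
    """Hypothetical helper: the keys mirror the configuration.load
    section of a BigQuery load job resource."""
    return {
        'sourceFormat': source_format,        # 'CSV', not the MIME type 'text/csv'
        'skipLeadingRows': skip_leading_rows,
        'writeDisposition': write_disposition,
    }
```

If the backend rejects 'text/csv' as an invalid sourceFormat, the response would be an error payload with no 'configuration' key, which matches the KeyError above.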

Labels

api: bigquery (Issues related to the BigQuery API.)
