| bq_test_project {bigrquery} | R Documentation |
You'll need to set the BIGQUERY_TEST_PROJECT (name of a project) and
BIGQUERY_TEST_BUCKET (name of a bucket) environment variables in order to run
the bigrquery tests locally. I recommend creating a new project, because the
tests involve both reading and writing in BigQuery and Cloud Storage.
You will also need to have billing enabled for the project, and to
run bq_test_init() once.
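A minimal sketch of that setup, assuming you keep environment variables in ~/.Renviron (the project and bucket names below are placeholders; substitute your own):

```
# ~/.Renviron -- placeholder values
BIGQUERY_TEST_PROJECT=my-bigrquery-test-project
BIGQUERY_TEST_BUCKET=my-bigrquery-test-bucket
```

After restarting R so the variables are picked up, run bq_test_init() once to create the base test data.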
bq_test_project()

bq_test_init(name = "basedata")

bq_test_dataset(name = random_name(), location = "US")

bq_testable()

bq_authable()

gs_test_bucket()

gs_test_object(name = random_name())
| name | Dataset name - used only for testing. |
bq_test_project() returns the name of a project suitable for
use in testing. bq_test_dataset() creates a temporary dataset
whose lifetime is tied to the lifetime of the object that it returns.
In tests, bq_test_project() (and hence bq_test_dataset()) will
automatically skip if auth and a test project are not available.
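For example, a package test that touches BigQuery can rely on that skip behaviour (a sketch, assuming testthat; the table name and expectations are illustrative):

```
test_that("can round-trip a data frame", {
  # Skips automatically when auth or the test project is unavailable
  ds <- bq_test_dataset()

  # Upload, then read back and compare row counts
  tb <- bq_table_upload(bq_table(ds, "mtcars"), mtcars)
  df <- bq_table_download(tb)
  expect_equal(nrow(df), nrow(mtcars))
})
```

Because the dataset is tied to ds, it is cleaned up when the test finishes and ds is garbage collected.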
if (bq_testable()) {
ds <- bq_test_dataset()
bq_mtcars <- bq_table_upload(bq_table(ds, "mtcars"), mtcars)
# dataset and table will be automatically deleted when ds is GC'd
}