Hi, I am trying to delete a table from BigQuery using the Java client library. On Dataproc, I start spark-shell as follows:
spark-shell --packages com.google.cloud:google-cloud-bigquery:1.59.0
and import the following dependencies:
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FieldValueList;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.QueryResponse;
import com.google.cloud.bigquery.TableResult;
import java.util.UUID;
val bigquery = BigQueryOptions.getDefaultInstance().getService()
bigquery.delete("test","temp")
Here test and temp are my dataset name and table name respectively, but running the statement above fails with the following error:
java.lang.NoSuchMethodError: com.google.common.util.concurrent.MoreExecutors.directExecutor()Ljava/util/concurrent/Executor;
at com.google.api.gax.retrying.BasicRetryingFuture.<init>(BasicRetryingFuture.java:82)
at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:88)
at com.google.api.gax.retrying.DirectRetryingExecutor.createFuture(DirectRetryingExecutor.java:74)
at com.google.cloud.RetryHelper.run(RetryHelper.java:75)
at com.google.cloud.RetryHelper.runWithRetries(RetryHelper.java:50)
at com.google.cloud.bigquery.BigQueryImpl.delete(BigQueryImpl.java:386)
at com.google.cloud.bigquery.BigQueryImpl.delete(BigQueryImpl.java:375)
... 48 elided
Best answer
This happens because there is an older Guava library on the classpath (pulled in as a Hadoop/Spark dependency) whose version does not have the MoreExecutors.directExecutor method.
To fix this, you need to include the google-cloud-bigquery library and its dependencies (including Guava) in your application's uber JAR, and shade/relocate them to avoid conflicts with other libraries on the classpath.
Here is an example of how to do this with the Maven Shade Plugin.
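A minimal pom.xml sketch of such a shading setup (the relocation prefix com.example.repackaged is an assumption; replace it with a package name of your own, and adjust the plugin version as needed):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
          <configuration>
            <relocations>
              <!-- Relocate Guava so the shaded copy bundled with your
                   application does not clash with the older Guava
                   provided by Hadoop/Spark -->
              <relocation>
                <pattern>com.google.common</pattern>
                <shadedPattern>com.example.repackaged.com.google.common</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

After building the uber JAR with `mvn package`, pass it to spark-shell (or spark-submit) with `--jars` instead of pulling google-cloud-bigquery in via `--packages`, so that the relocated Guava classes are the ones your code loads.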
A similar question about apache-spark - Google Cloud Dataproc delete BigQuery table not working can be found on Stack Overflow: https://stackoverflow.com/questions/54286624/