Request had insufficient authentication scopes when running a Spark job on Dataproc

apache-spark google-cloud-platform google-cloud-dataproc

3277 views

1 answer

I am trying to run a Spark job on a Google Dataproc cluster as follows:

    gcloud dataproc jobs submit hadoop --cluster <cluster-name> \
        --jar file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar \
        --class org.apache.hadoop.examples.WordCount \
        --arg1 \
        --arg2

But the job fails with the following error:

 (gcloud.dataproc.jobs.submit.spark) PERMISSION_DENIED: Request had insufficient authentication scopes.

How do I add the auth scopes needed to run the job?

Author: Freeman | Source | Posted: 28.08.2019 06:02

Answers (1)


9 upvotes

Solution

Usually this error comes up because you are running gcloud from inside a GCE VM that uses VM-metadata-controlled scopes; gcloud installed on a local machine will typically already be using broad scopes that include all GCP operations.
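To confirm this is the cause, you can inspect which scopes the VM's default service account currently has. A minimal sketch, assuming a placeholder instance name and zone:

    # From inside the VM: ask the metadata server which scopes were granted
    # to the instance's default service account.
    curl -H "Metadata-Flavor: Google" \
      "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"

    # Or, from any machine with gcloud: describe the instance and print its scopes.
    gcloud compute instances describe my-gcloud-vm --zone=us-central1-a \
      --format="value(serviceAccounts[].scopes)"

If https://www.googleapis.com/auth/cloud-platform is not in that list, the Dataproc API call will be rejected with the PERMISSION_DENIED error above.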

For Dataproc access, when creating the VM from which you will run gcloud, specify --scopes cloud-platform on the command line; if creating the VM from the Cloud Console UI, select "Allow full access to all Cloud APIs":

[Screenshot: Cloud Console Create VM UI, "Identity and API access" section]
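For the CLI route, a minimal sketch of creating such a VM (the instance name and zone are placeholders):

    # Create a GCE VM whose default service account gets the broad
    # cloud-platform scope, so gcloud run inside it can call Dataproc.
    gcloud compute instances create my-gcloud-vm \
      --zone=us-central1-a \
      --scopes=cloud-platform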

As another commenter mentioned, nowadays you can also update the scopes on an existing GCE instance to add the cloud-platform scope.
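A sketch of how that might look with gcloud, assuming the same placeholder instance name and zone (the instance must be stopped before its scopes can be changed):

    # Stop the instance; scope changes require it to be in TERMINATED state.
    gcloud compute instances stop my-gcloud-vm --zone=us-central1-a

    # Re-attach the default service account with the broader cloud-platform scope.
    gcloud compute instances set-service-account my-gcloud-vm \
      --zone=us-central1-a \
      --scopes=cloud-platform

    # Start the instance again; gcloud inside it can now submit Dataproc jobs.
    gcloud compute instances start my-gcloud-vm --zone=us-central1-a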

Author: Dennis Huo | Posted: 05.09.2017 07:01