Each analytics team in your organization is running BigQuery jobs in their own projects. You want to enable
each team to monitor slot usage within their projects. What should you do?
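For context, per-project slot consumption can be inspected with BigQuery's INFORMATION_SCHEMA jobs views. A sketch, assuming the `region-us` region and a one-day window (both are arbitrary choices, not part of the question):

```sql
-- Approximate average slot usage per job over the last day in this project.
-- `region-us` is an assumption; substitute your dataset's region.
SELECT
  job_id,
  user_email,
  SAFE_DIVIDE(total_slot_ms,
              TIMESTAMP_DIFF(end_time, start_time, MILLISECOND)) AS avg_slots
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY avg_slots DESC;
```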
A data pipeline uses Cloud Pub/Sub for ingesting data. The data is stored in topics, and a Dataflow pipeline reads from a subscription to a topic, processes the data, and writes the output to BigQuery. What is the recommended way to authenticate when reading data from Cloud Pub/Sub?
Messages are unexpectedly accumulating in a service that uses Cloud Pub/Sub. A developer unfamiliar with Cloud Pub/Sub has asked for your help in diagnosing the problem. What would you point out about how messages are removed from Cloud Pub/Sub topics?
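The behavior this question probes can be modeled with a small in-memory sketch (a toy illustration only, not the Pub/Sub API): messages are not removed when they are delivered or read; they are removed only when the subscriber acknowledges them, and unacknowledged messages are redelivered after the ack deadline.

```python
class ToySubscription:
    """Toy in-memory model of Cloud Pub/Sub ack semantics (illustration only).

    A subscriber that pulls messages but never calls ack() will see the
    backlog grow: delivery alone does not remove a message.
    """

    def __init__(self, ack_deadline_s=10):
        self.ack_deadline_s = ack_deadline_s
        self._backlog = {}   # ack_id -> [message, outstanding_until]
        self._next_id = 0

    def publish(self, message):
        self._backlog[self._next_id] = [message, 0.0]
        self._next_id += 1

    def pull(self, now):
        """Deliver messages that are not currently outstanding.

        Previously delivered but unacked messages are redelivered once
        their ack deadline has lapsed.
        """
        delivered = []
        for ack_id, entry in self._backlog.items():
            if now >= entry[1]:
                entry[1] = now + self.ack_deadline_s
                delivered.append((ack_id, entry[0]))
        return delivered

    def ack(self, ack_id):
        """Only acknowledgment removes a message from the backlog."""
        self._backlog.pop(ack_id, None)

    def backlog_size(self):
        return len(self._backlog)
```

Pulling two messages leaves the backlog at two; acking one shrinks it to one, and the unacked message is redelivered on a later pull.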
A manufacturer of delivery drones is implementing a new data analysis pipeline to detect part failures before they occur. The drones have multiple sensors that send performance and environment data to an analytics pipeline. Currently, data is sent to a REST API endpoint. The REST API endpoint that receives the data cannot always keep up with the rate at which data arrives; when that happens, data is lost. Machine learning engineers have asked you to change the ingestion process to reduce this data loss. What would you do?
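The core trade-off in this scenario can be sketched with a toy model (all names are invented for illustration): a backend that directly absorbs bursts drops whatever exceeds its capacity, while a buffer such as a message queue in front of it lets the slow backend drain the backlog at its own pace.

```python
import queue

def direct_ingest(events, capacity_per_tick):
    """Model of the current design: the REST endpoint can absorb only
    `capacity_per_tick` events per tick; the overflow is lost."""
    stored, lost = [], 0
    for tick in events:  # each tick is a burst of events
        stored.extend(tick[:capacity_per_tick])
        lost += max(0, len(tick) - capacity_per_tick)
    return stored, lost

def buffered_ingest(events, capacity_per_tick):
    """Model of a queue-fronted design: bursts land in a buffer first,
    and the slow backend drains it between and after bursts."""
    buf = queue.SimpleQueue()
    stored = []
    for tick in events:
        for e in tick:
            buf.put(e)  # publishing to the buffer never drops data
        for _ in range(capacity_per_tick):
            if buf.empty():
                break
            stored.append(buf.get())
    # after the bursts, the backend keeps draining the backlog
    while not buf.empty():
        stored.append(buf.get())
    return stored, 0
```

With a burst of four events against a capacity of two per tick, the direct design loses data while the buffered design eventually stores everything.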
As your organization expands its usage of GCP, many teams have started to create their own projects. Projects
have multiplied further to accommodate different deployment stages and target audiences. Each project
requires unique access control configurations. The central IT team needs to have access to all projects.
Furthermore, data from Cloud Storage buckets and BigQuery datasets must be shared for use in other projects
in an ad hoc way. You want to simplify access control management by minimizing the number of policies.
Which two steps should you take? Choose 2 answers.