Download Associate Cloud Engineer.Associate-Cloud-Engineer.VCEplus.2024-12-04.162q.vcex

Vendor: Google
Exam Code: Associate-Cloud-Engineer
Exam Name: Associate Cloud Engineer
Date: Dec 04, 2024
File Size: 314 KB

How to open VCEX files?

Files with VCEX extension can be opened by ProfExam Simulator.

Purchase
Coupon: EXAM_HUB

Discount: 20%

Demo Questions

Question 1
Your management has asked an external auditor to review all the resources in a specific project. The security team has enabled the Organization Policy called Domain Restricted Sharing on the organization node by specifying only your Cloud Identity domain. You want the auditor to only be able to view, but not modify, the resources in that project. What should you do?
  1. Ask the auditor for their Google account, and give them the Viewer role on the project.
  2. Ask the auditor for their Google account, and give them the Security Reviewer role on the project.
  3. Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.
  4. Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.
Correct answer: C
Explanation:
Because the Domain Restricted Sharing organization policy limits IAM members to identities in your Cloud Identity domain, the auditor's own Google account cannot be granted access; you must create a temporary account for the auditor in Cloud Identity. Granting that account the Viewer primitive role provides read-only access to all resources in the project. Avoid using primitive roles except when absolutely necessary; they are very powerful and include a large number of permissions across all Google Cloud services. IAM predefined roles are much more granular and allow you to carefully manage the set of permissions that your users have access to, and creating custom roles can further increase the control you have over user permissions.
https://cloud.google.com/resource-manager/docs/access-control-proj#using_primitive_roles 
https://cloud.google.com/iam/docs/understanding-custom-roles 
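As a rough sketch of the second step, assuming the temporary Cloud Identity account is auditor@your-domain.com and the project ID is my-project (both placeholders), the Viewer role can be granted with gcloud once the account has been created in the Cloud Identity admin console:

  # Grant read-only access to the temporary in-domain account
  gcloud projects add-iam-policy-binding my-project \
      --member="user:auditor@your-domain.com" \
      --role="roles/viewer"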
Question 2
You have a workload running on Compute Engine that is critical to your business. You want to ensure that the data on the boot disk of this workload is backed up regularly. You need to be able to restore a backup as quickly as possible in case of disaster. You also want older backups to be cleaned automatically to save on cost. You want to follow Google-recommended practices. What should you do?
  1. Create a Cloud Function to create an instance template.
  2. Create a snapshot schedule for the disk using the desired interval.
  3. Create a cron job to create a new disk from the disk using gcloud.
  4. Create a Cloud Task to create an image and export it to Cloud Storage.
Correct answer: B
Explanation:
Best practices for persistent disk snapshots
You can create persistent disk snapshots at any time, but you can create snapshots more quickly and with greater reliability if you use the following best practices.
Creating frequent snapshots efficiently
Use snapshots to manage your data efficiently.
Create a snapshot of your data on a regular schedule to minimize data loss due to unexpected failure.
Improve performance by eliminating excessive snapshot downloads and by creating an image and reusing it.
Set your snapshot schedule to off-peak hours to reduce snapshot time.
Snapshot frequency limits
Creating snapshots from persistent disks
You can snapshot your disks at most once every 10 minutes. If you want to issue a burst of requests to snapshot your disks, you can issue at most 6 requests in 60 minutes.
If the limit is exceeded, the operation fails and returns an error.
https://cloud.google.com/compute/docs/disks/snapshot-best-practices
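A minimal sketch of option B with gcloud, assuming a boot disk named workload-boot-disk in us-central1-a and illustrative schedule values (daily at 04:00 UTC, 14-day retention); the retention setting is what cleans up older snapshots automatically:

  # Create a daily snapshot schedule with automatic cleanup of older snapshots
  gcloud compute resource-policies create snapshot-schedule daily-boot-backup \
      --region=us-central1 \
      --daily-schedule \
      --start-time=04:00 \
      --max-retention-days=14

  # Attach the schedule to the workload's boot disk
  gcloud compute disks add-resource-policies workload-boot-disk \
      --zone=us-central1-a \
      --resource-policies=daily-boot-backup

Restoring is then a matter of creating a new disk from the most recent snapshot, which is much faster than rebuilding the workload from exported images.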
Question 3
You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs. What should you do?
  1. Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.
  2. Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.
  3. Assign the auditor's IAM user to a custom role that has logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.
  4. Assign the auditor's IAM user to a custom role that has logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.
Correct answer: B
Explanation:
Google Cloud provides Cloud Audit Logs, an integral part of Cloud Logging. It consists of two log streams for each project, Admin Activity and Data Access, which are generated by Google Cloud services to help you answer the question of "who did what, where, and when?" within your Google Cloud projects. The roles/logging.privateLogViewer role includes the standard Logs Viewer permissions plus read access to private logs such as Data Access audit logs, so no custom role or log export is needed; the auditor can review the logs, including changes to Cloud IAM policy, directly.
Ref: https://cloud.google.com/iam/docs/job-functions/auditing#scenario_external_auditors
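A sketch of the grant, assuming the auditor signs in as auditor@example.com and the project ID is my-project (placeholders):

  # privateLogViewer = Logs Viewer plus read access to private (Data Access) logs
  gcloud projects add-iam-policy-binding my-project \
      --member="user:auditor@example.com" \
      --role="roles/logging.privateLogViewer"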
Question 4
You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?
  1. Navigate to Stackdriver Logging and select resource.labels.project_id='*'
  2. Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.
  3. Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days. 
  4. Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.
Correct answer: B
Explanation:
Ref: https://cloud.google.com/logging/docs/export/aggregated_sinks
The Google-recommended way to combine logs from several projects is an aggregated sink, and exporting to a BigQuery dataset puts the data where it can be explored interactively. Querying a BigQuery dataset is easier and quicker than analyzing the contents of a Cloud Storage bucket, and since the requirement is to quickly analyze the log contents, BigQuery is preferable to Cloud Storage.
You can also control storage costs and optimize storage usage by setting the default table expiration for newly created tables in a dataset. If you set the property when the dataset is created, any table created in the dataset is deleted after the expiration period. If you set the property after the dataset is created, only new tables are deleted after the expiration period. For example, if you set the default table expiration to 7 days, older data is automatically deleted after 1 week.
Ref: https://cloud.google.com/bigquery/docs/best-practices-storage
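As an illustrative sketch, an aggregated sink at the organization level routes logs from all projects into a single BigQuery dataset, and bq then sets the 60-day default table expiration; the sink name, organization ID, project, and dataset below are placeholders:

  # Aggregated sink: route logs from every project under the organization
  gcloud logging sinks create all-projects-sink \
      bigquery.googleapis.com/projects/my-logging-project/datasets/all_logs \
      --organization=123456789012 \
      --include-children

  # Grant the sink's writer identity access to the dataset, then set a
  # 60-day default table expiration (value is in seconds: 60 * 86400)
  bq update --default_table_expiration 5184000 my-logging-project:all_logs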
Question 5
You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing GCP project. What should you do?
  1. 1. Verify that you are assigned the Project Owners IAM role for this project. 2. Locate the project in the GCP console, click Shut down and then enter the project ID.
  2. 1. Verify that you are assigned the Project Owners IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.
  3. 1. Verify that you are assigned the Organizational Administrator IAM role for this project. 2. Locate the project in the GCP console, enter the project ID and then click Shut down.
  4. 1. Verify that you are assigned the Organizational Administrators IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.
Correct answer: A
Explanation:
https://cloud.google.com/run/docs/tutorials/gcloud
https://cloud.google.com/resource-manager/docs/creating-managing-projects
https://cloud.google.com/iam/docs/understanding-roles#primitive_roles
You can shut down projects using the Cloud Console. When you shut down a project, the following happens immediately: all billing and traffic serving stops, you lose access to the project, the owners of the project are notified and can stop the deletion within 30 days, and the project is scheduled to be deleted after 30 days. However, some resources may be deleted much earlier.
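The same shutdown can also be done from the command line; a sketch assuming the project ID is division-project (placeholder):

  # Requires the Project Owner role (resourcemanager.projects.delete permission)
  gcloud projects delete division-project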
Question 6
You are configuring service accounts for an application that spans multiple projects. Virtual machines (VMs) running in the web-applications project need access to BigQuery datasets in crm-databases-proj. You want to follow Google-recommended practices to give access to the service account in the web-applications project. What should you do?
  1. Give ''project owner'' for web-applications appropriate roles to crm-databases-proj
  2. Give ''project owner'' role to crm-databases-proj and the web-applications project.
  3. Give ''project owner'' role to crm-databases-proj and bigquery.dataViewer role to web-applications.
  4. Give bigquery.dataViewer role to crm-databases-proj and appropriate roles to web-applications.
Correct answer: D
Explanation:
The bigquery.dataViewer role provides permissions to read a dataset's metadata and list tables in the dataset, as well as to read data and metadata from the dataset's tables. Granting it on crm-databases-proj to the service account used by the VMs in web-applications is exactly what this requirement needs, and it follows the principle of least privilege.
Ref:https://cloud.google.com/iam/docs/understanding-roles#bigquery-roles
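A sketch of the cross-project grant, assuming the VMs in web-applications run as the service account sa-name@web-applications.iam.gserviceaccount.com (placeholder):

  # Give the web-applications service account read access to BigQuery data
  # in crm-databases-proj
  gcloud projects add-iam-policy-binding crm-databases-proj \
      --member="serviceAccount:sa-name@web-applications.iam.gserviceaccount.com" \
      --role="roles/bigquery.dataViewer"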
Question 7
An employee was terminated, but their access to Google Cloud Platform (GCP) was not removed until 2 weeks later. You need to find out whether this employee accessed any sensitive customer information after their termination.
What should you do?
 
  1. View System Event Logs in Stackdriver. Search for the user's email as the principal.
  2. View System Event Logs in Stackdriver. Search for the service account associated with the user.
  3. View Data Access audit logs in Stackdriver. Search for the user's email as the principal.
  4. View the Admin Activity log in Stackdriver. Search for the service account associated with the user.
Correct answer: C
Explanation:
https://cloud.google.com/logging/docs/audit
Data Access audit logs contain API calls that read the configuration or metadata of resources, as well as user-driven API calls that create, modify, or read user-provided resource data.
https://cloud.google.com/logging/docs/audit#data-access
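A sketch of the search with gcloud, assuming the terminated employee used former.employee@example.com and the project ID is my-project (placeholders); the filter selects Data Access audit log entries whose principal is that user:

  # List Data Access audit log entries made by the terminated user
  gcloud logging read \
      'logName:"cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.authenticationInfo.principalEmail="former.employee@example.com"' \
      --project=my-project \
      --freshness=30d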
Question 8
You need to create a custom IAM role for use with a GCP service. All permissions in the role must be suitable for production use. You also want to clearly share with your organization the status of the custom role. This will be the first version of the custom role. What should you do?
  1. Use permissions in your role that use the 'supported' support level for role permissions. Set the role stage to ALPHA while testing the role permissions.
  2. Use permissions in your role that use the 'supported' support level for role permissions. Set the role stage to BETA while testing the role permissions.
  3. Use permissions in your role that use the 'testing' support level for role permissions. Set the role stage to ALPHA while testing the role permissions.
  4. Use permissions in your role that use the 'testing' support level for role permissions. Set the role stage to BETA while testing the role permissions.
Correct answer: A
Explanation:
Each permission used in a custom role has a support level of SUPPORTED, TESTING, or NOT_SUPPORTED; only permissions at the 'supported' level are suitable for production use. Setting the role's launch stage to ALPHA signals to your organization that this first version of the role is still being tested.
Ref:https://cloud.google.com/iam/docs/custom-roles-permissions-support
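A sketch of creating the first version of the role with an ALPHA launch stage; the role ID, title, project, and permission list are illustrative placeholders:

  # Create version 1 of the custom role; ALPHA signals it is still being tested
  gcloud iam roles create customAuditRole \
      --project=my-project \
      --title="Custom Audit Role" \
      --permissions=compute.instances.get,compute.instances.list \
      --stage=ALPHA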
Question 9
Your existing application running in Google Kubernetes Engine (GKE) consists of multiple pods running on four GKE n1-standard-2 nodes. You need to deploy additional pods requiring n2-highmem-16 nodes without any downtime. What should you do?
  1. Use gcloud container clusters upgrade. Deploy the new services.
  2. Create a new Node Pool and specify machine type n2-highmem-16. Deploy the new pods.
  3. Create a new cluster with n2-highmem-16 nodes. Redeploy the pods and delete the old cluster.
  4. Create a new cluster with both n1-standard-2 and n2-highmem-16 nodes. Redeploy the pods and delete the old cluster.
Correct answer: B
Explanation:
Adding a new node pool with the n2-highmem-16 machine type lets the new pods run on the new nodes while the existing n1-standard-2 nodes keep serving the current workload, so there is no downtime and no cluster migration.
https://cloud.google.com/kubernetes-engine/docs/concepts/deployment
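A sketch of option B, assuming the cluster is named prod-cluster in us-central1-a and two nodes are enough for the new pods (placeholders):

  # Add a second node pool with the larger machine type; existing nodes keep serving
  gcloud container node-pools create highmem-pool \
      --cluster=prod-cluster \
      --zone=us-central1-a \
      --machine-type=n2-highmem-16 \
      --num-nodes=2

The new pods can then be steered onto the new pool, for example with a nodeSelector on the cloud.google.com/gke-nodepool label.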
Question 10
You have an application that uses Cloud Spanner as a database backend to keep current state information about users. Cloud Bigtable logs all events triggered by users. You export Cloud Spanner data to Cloud Storage during daily backups. One of your analysts asks you to join data from Cloud Spanner and Cloud Bigtable for specific users. You want to complete this ad hoc request as efficiently as possible. What should you do?
  1. Create a dataflow job that copies data from Cloud Bigtable and Cloud Storage for specific users.
  2. Create a dataflow job that copies data from Cloud Bigtable and Cloud Spanner for specific users.
  3. Create a Cloud Dataproc cluster that runs a Spark job to extract data from Cloud Bigtable and Cloud Storage for specific users. 
  4. Create two separate BigQuery external tables on Cloud Storage and Cloud Bigtable. Use the BigQuery console to join these tables through user fields, and apply appropriate filters.
Correct answer: D
Explanation:
BigQuery can query data stored in Cloud Storage and in Cloud Bigtable directly through external tables, so defining two external tables and joining them with SQL in the BigQuery console is the most efficient way to serve this ad hoc request; no pipeline needs to be built.
'The Cloud Spanner to Cloud Storage Text template is a batch pipeline that reads in data from a Cloud Spanner table, optionally transforms the data via a JavaScript User Defined Function (UDF) that you provide, and writes it to Cloud Storage as CSV text files.'
https://cloud.google.com/dataflow/docs/guides/templates/provided-batch#cloudspannertogcstext
'The Dataflow connector for Cloud Spanner lets you read data from and write data to Cloud Spanner in a Dataflow pipeline.'
https://cloud.google.com/spanner/docs/dataflow-connector
https://cloud.google.com/bigquery/external-data-sources
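A sketch of the join itself, assuming external tables named spanner_export (over the Cloud Storage export) and bigtable_events (over the Bigtable table) have already been defined in a dataset called analysis; the table, dataset, and column names are illustrative, and the real join keys depend on how the export and the Bigtable table definition expose user identifiers:

  # Ad hoc join of the two external tables for specific users
  bq query --use_legacy_sql=false '
  SELECT s.*, b.*
  FROM analysis.spanner_export AS s
  JOIN analysis.bigtable_events AS b
    ON s.user_id = b.user_id
  WHERE s.user_id IN ("user123", "user456")'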