Download Associate Cloud Engineer.Associate-Cloud-Engineer.PracticeTest.2025-04-13.141q.vcex

Vendor: Google
Exam Code: Associate-Cloud-Engineer
Exam Name: Associate Cloud Engineer
Date: Apr 13, 2025
File Size: 710 KB

How to open VCEX files?

Files with VCEX extension can be opened by ProfExam Simulator.

Demo Questions

Question 1
A colleague handed over a Google Cloud Platform project for you to maintain. As part of a security checkup, you want to review who has been granted the Project Owner role. What should you do?
  1. In the console, validate which SSH keys have been stored as project-wide keys.
  2. Navigate to Identity-Aware Proxy and check the permissions for these resources.
  3. Enable Audit Logs on the IAM & admin page for all resources, and validate the results.
  4. Use the command gcloud projects get-iam-policy to view the current role assignments.
Correct answer: D
Explanation:
A simple approach is to use the filtering flags available when listing the IAM policy for a project. For instance, the command `gcloud projects get-iam-policy $PROJECT_ID --flatten="bindings[].members" --format="table(bindings.members)" --filter="bindings.role:roles/owner"` outputs all the users and service accounts that hold the role roles/owner in the project in question. https://groups.google.com/g/google-cloud-dev/c/Z6sZs7TvygQ?pli=1
Question 2
A company wants to build an application that stores images in a Cloud Storage bucket, generates thumbnails, and resizes the images. They want to use a Google-managed service that can scale up and scale down to zero automatically with minimal effort. You have been asked to recommend a service. Which GCP service would you suggest?
  1. Google Compute Engine
  2. Google App Engine
  3. Cloud Functions
  4. Google Kubernetes Engine
Correct answer: C
Explanation:
Cloud Functions is Google Cloud's event-driven serverless compute platform. It automatically scales based on the load and requires no additional configuration. You pay only for the resources used.
Ref: https://cloud.google.com/functions
While the other options (Google Compute Engine, Google Kubernetes Engine, and Google App Engine) also support autoscaling, it must be configured explicitly based on the load and is not as effortless as the scale-up and scale-to-zero behavior offered by Cloud Functions.
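For completeness, deploying such a function wired to the bucket could look roughly like this (the function name, bucket, runtime, region, and entry point are all illustrative, not from the question; the entry point would read the uploaded image and write resized copies back to a bucket):

```shell
# Deploy a Cloud Function triggered by object uploads to the image bucket.
gcloud functions deploy generate-thumbnails \
  --runtime=python311 \
  --trigger-bucket=IMAGE_BUCKET \
  --entry-point=generate_thumbnail \
  --region=us-central1
```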
Question 3
After a recent security incident, your startup company wants better insight into what is happening in the Google Cloud environment. You need to monitor unexpected firewall changes and instance creation. Your company prefers simple solutions. What should you do?
  1. Install Kibana on a Compute Engine instance. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Pub/Sub. Target the Pub/Sub topic to push messages to the Kibana instance. Analyze the logs on Kibana in real time.
  2. Create a log sink to forward Cloud Audit Logs filtered for firewalls and compute instances to Cloud Storage. Use BigQuery to periodically analyze log events in the storage bucket.
  3. Turn on Google Cloud firewall rules logging, and set up alerts for any insert, update, or delete events.
  4. Use Cloud Logging filters to create log-based metrics for firewall and instance actions. Monitor the changes and set up reasonable alerts.
Correct answer: D
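Option 4 can be implemented with log-based metrics; a sketch (the metric names and audit-log filters below are illustrative, not from the question):

```shell
# Count firewall rule changes recorded in the Admin Activity audit logs.
gcloud logging metrics create firewall-changes \
  --description="Firewall insert/patch/delete events" \
  --log-filter='resource.type="gce_firewall_rule" AND protoPayload.methodName:"firewalls"'

# Count Compute Engine instance creations.
gcloud logging metrics create instance-creations \
  --description="Instance insert events" \
  --log-filter='resource.type="gce_instance" AND protoPayload.methodName="v1.compute.instances.insert"'
```

Alerting policies in Cloud Monitoring can then reference these metrics to notify on unexpected changes.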
Question 4
All development (dev) teams in your organization are located in the United States. Each dev team has its own Google Cloud project. You want to restrict access so that each dev team can only create cloud resources in the United States (US). What should you do?
  1. Create an organization to contain all the dev projects. Create an Identity and Access Management (IAM) policy to limit the resources in US regions.
  2. Create an Identity and Access Management (IAM) policy to restrict the resource locations to the US. Apply the policy to all dev projects.
  3. Create a folder to contain all the dev projects. Create an organization policy to limit resources to US locations.
  4. Create an Identity and Access Management (IAM) policy to restrict the resource locations in all dev projects. Apply the policy to all dev roles.
Correct answer: C
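A sketch of option 3 with the gcloud CLI (the folder ID is a placeholder; `in:us-locations` is the predefined value group covering US regions):

```shell
# Constrain resource locations for every project under the dev folder.
gcloud resource-manager org-policies allow \
  constraints/gcp.resourceLocations in:us-locations \
  --folder=FOLDER_ID
```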
Question 5
An application generates daily reports in a Compute Engine virtual machine (VM). The VM is in the project corp-iot-insights. Your team operates only in the project corp-aggregate-reports and needs a copy of the daily exports in the bucket corp-aggregate-reports-storage. You want to configure access so that the daily reports from the VM are available in the bucket corp-aggregate-reports-storage and use as few steps as possible while following Google-recommended practices. What should you do?
  1. Move both projects under the same folder.
  2. Grant the VM Service Account the role Storage Object Creator on corp-aggregate-reports-storage.
  3. Create a Shared VPC network between both projects. Grant the VM Service Account the role Storage Object Creator on corp-iot-insights.
  4. Make corp-aggregate-reports-storage public and create a folder with a pseudo-randomized suffix name. Share the folder with the IoT team.
Correct answer: B
Explanation:
Predefined roles
The following table describes Identity and Access Management (IAM) roles that are associated with Cloud Storage and lists the permissions that are contained in each role. Unless otherwise noted, these roles can be applied either to entire projects or specific buckets.
Storage Object Creator (roles/storage.objectCreator) Allows users to create objects. Does not give permission to view, delete, or overwrite objects.
https://cloud.google.com/storage/docs/access-control/iam-roles#standard-roles
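Granting the role from option 2 is a single command (the service-account address is a placeholder; `gcloud storage buckets add-iam-policy-binding` requires a recent Google Cloud SDK, while `gsutil iam ch` works on older ones):

```shell
# Allow the VM's service account to create objects in the reports bucket.
gcloud storage buckets add-iam-policy-binding \
  gs://corp-aggregate-reports-storage \
  --member="serviceAccount:VM_SA@corp-iot-insights.iam.gserviceaccount.com" \
  --role="roles/storage.objectCreator"
```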
Question 6
An external member of your team needs list access to compute images and disks in one of your projects. You want to follow Google-recommended practices when you grant the required permissions to this user. What should you do?
  1. Create a custom role based on the Compute Image User role. Add compute.disks.list to the includedPermissions field. Grant the custom role to the user at the project level.
  2. Create a custom role based on the Compute Storage Admin role. Exclude unnecessary permissions from the custom role. Grant the custom role to the user at the project level.
  3. Create a custom role, and add all the required compute.disks.list and compute.images.list permissions as includedPermissions. Grant the custom role to the user at the project level.
  4. Grant the Compute Storage Admin role at the project level.
Correct answer: A
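A custom role like the one in option 1 could be created and granted as follows (the role ID, title, project ID, and user address are illustrative):

```shell
# Create a project-level custom role with only the list permissions needed.
gcloud iam roles create imageAndDiskLister \
  --project=PROJECT_ID \
  --title="Image and Disk Lister" \
  --permissions=compute.images.list,compute.disks.list \
  --stage=GA

# Grant the custom role to the external user at the project level.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:external.user@example.com" \
  --role="projects/PROJECT_ID/roles/imageAndDiskLister"
```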
Question 7
During a recent audit of your existing Google Cloud resources, you discovered several users with email addresses outside of your Google Workspace domain.
You want to ensure that your resources are only shared with users whose email addresses match your domain.
You need to remove any mismatched users, and you want to avoid having to audit your resources to identify mismatched users. What should you do?
  1. Create a Cloud Scheduler task to regularly scan your projects and delete mismatched users.
  2. Create a Cloud Scheduler task to regularly scan your resources and delete mismatched users.
  3. Set an organizational policy constraint to limit identities by domain to automatically remove mismatched users.
  4. Set an organizational policy constraint to limit identities by domain, and then retroactively remove the existing mismatched users.
Correct answer: D
Explanation:
https://cloud.google.com/resource-manager/docs/organization-policy/org-policy-constraints
The relevant constraint here is domain restricted sharing (constraints/iam.allowedPolicyMemberDomains), which limits the identities that can be added to IAM policies to members of the allowed domains. The constraint only blocks new bindings, however; it has no effect on members that already exist in IAM policies, so the mismatched users that are already present must still be removed retroactively. That is why setting the constraint alone is not sufficient.
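Setting the domain restriction could look like this (note that constraints/iam.allowedPolicyMemberDomains takes a Google Workspace customer ID such as C0abc12de rather than a bare domain name; the IDs below are placeholders):

```shell
# Restrict IAM policy members to identities from the allowed customer's domain.
gcloud resource-manager org-policies allow \
  constraints/iam.allowedPolicyMemberDomains CUSTOMER_ID \
  --organization=ORG_ID
```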
Question 8
Every employee of your company has a Google account. Your operational team needs to manage a large number of instances on Compute Engine. Each member of this team needs only administrative access to the servers. Your security team wants to ensure that the deployment of credentials is operationally efficient and must be able to determine who accessed a given instance. What should you do?
  1. Generate a new SSH key pair. Give the private key to each member of your team. Configure the public key in the metadata of each instance.
  2. Ask each member of the team to generate a new SSH key pair and to send you their public key. Use a configuration management tool to deploy those keys on each instance.
  3. Ask each member of the team to generate a new SSH key pair and to add the public key to their Google account. Grant the "compute.osAdminLogin" role to the Google group corresponding to this team.
  4. Generate a new SSH key pair. Give the private key to each member of your team. Configure the public key as a project-wide public SSH key in your Cloud Platform project and allow project-wide public SSH keys on each instance.
Correct answer: C
Explanation:
https://cloud.google.com/compute/docs/instances/managing-instance-access
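The OS Login setup behind option 3 takes two commands (the group address and project ID are placeholders):

```shell
# Require OS Login on all instances in the project.
gcloud compute project-info add-metadata \
  --metadata enable-oslogin=TRUE

# Give the team's Google group admin-level (sudo) SSH access via IAM,
# so every login is tied to an individual Google identity and auditable.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="group:ops-team@example.com" \
  --role="roles/compute.osAdminLogin"
```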
Question 9
The core business of your company is to rent out construction equipment at a large scale. All the equipment that is being rented out has been equipped with multiple sensors that send event information every few seconds. These signals can vary from engine status, distance traveled, fuel level, and more. Customers are billed based on the consumption monitored by these sensors. You expect high throughput - up to thousands of events per hour per device - and need to retrieve consistent data based on the time of the event. Storing and retrieving individual signals should be atomic. What should you do?
  1. Create a file in Cloud Storage per device and append new data to that file.
  2. Create a file in Cloud Filestore per device and append new data to that file.
  3. Ingest the data into Datastore. Store data in an entity group based on the device.
  4. Ingest the data into Cloud Bigtable. Create a row key based on the event timestamp.
Correct answer: D
Explanation:
Keywords to look for:
- "High throughput"
- "Consistent"
- Property-based data to insert and fetch (engine status, distance traveled, fuel level, and more), which maps naturally to columns
- A large customer base in which each customer has multiple sensors sending events every few seconds, which adds up to petabyte-scale data
- Retrieval based on the time of the event
- Atomic reads and writes of individual signals

Bigtable fits all of these requirements. Datastore is not fully atomic for this pattern, Cloud Storage offers no way to retrieve data based on the time of an event without building another solution on top, and Firestore is aimed at mobile SDK use cases.
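The row-key design can be sketched as a tiny helper (the delimiter and zero-padding are design choices, not from the question; putting the device ID ahead of the timestamp spreads writes across tablets and avoids the hotspotting that a purely timestamp-first key causes, while zero-padding keeps lexicographic order equal to chronological order within a device):

```shell
# Build a Bigtable row key of the form '<device-id>#<13-digit ms timestamp>'.
row_key() {
  printf '%s#%013d' "$1" "$2"
}

row_key excavator-17 1700000000000
```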
Question 10
The DevOps group in your organization needs full control of Compute Engine resources in your development project. However, they should not have permission to create or update any other resources in the project. You want to follow Google's recommendations for setting permissions for the DevOps group. What should you do?
  1. Create a custom role at the folder level and grant all compute.instanceAdmin.* permissions to the role. Grant the custom role to the DevOps group.
  2. Grant the basic role roles/editor to the DevOps group.
  3. Grant the basic role roles/viewer and the predefined role roles/compute.admin to the DevOps group.
  4. Create an IAM policy and grant all compute.instanceAdmin.* permissions to the policy. Attach the policy to the DevOps group.
Correct answer: C
Question 11
The sales team has a project named Sales Data Digest that has the ID acme-data-digest. You need to set up similar Google Cloud resources for the marketing team, but their resources must be organized independently of the sales team. What should you do?
  1. Create a new project named Marketing Data Digest and use the ID acme-data-digest. Grant the Project Editor role to the Marketing team.
  2. Create another project with the ID acme-marketing-data-digest for the Marketing team and deploy the resources there.
  3. Grant the Project Editor role to the Marketing team for acme-data-digest.
  4. Create a Project Lien on acme-data-digest and then grant the Project Editor role to the Marketing team.
Correct answer: B
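Option 2 amounts to creating a separate project (the project ID mirrors the sales project's naming convention; the display name is illustrative):

```shell
# Create an independent project for the marketing team's resources.
gcloud projects create acme-marketing-data-digest \
  --name="Marketing Data Digest"
```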