A) Stop the source instances before stopping their read replicas
B) Delete each read replica before stopping its corresponding source instance
C) Stop the read replicas before stopping their source instances
D) Use the AWS CLI to stop each read replica and source instance at the same time
Correct Answer: verified
Multiple Choice
A) Ensure the DynamoDB table is configured to be always consistent.
B) Ensure the BatchGetItem operation is called with the ConsistentRead parameter set to false.
C) Enable a stream on the DynamoDB table and subscribe each device to the stream to ensure all devices receive up-to-date status information.
D) Ensure the BatchGetItem operation is called with the ConsistentRead parameter set to true.
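Options B and D turn on the ConsistentRead flag in a BatchGetItem call. A minimal sketch of the request shape, with a hypothetical table (DeviceStatus) and key attribute (DeviceId):

```python
# Sketch of a DynamoDB BatchGetItem request with strongly consistent
# reads. Table and key names are hypothetical. With boto3, this dict
# would be sent as:
#   boto3.client("dynamodb").batch_get_item(**request)
request = {
    "RequestItems": {
        "DeviceStatus": {
            "Keys": [
                {"DeviceId": {"S": "device-001"}},
                {"DeviceId": {"S": "device-002"}},
            ],
            # True forces strongly consistent reads; the default
            # (False) allows eventually consistent reads.
            "ConsistentRead": True,
        }
    }
}
```

The flag is set per table within the request, not globally on the table itself.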
Correct Answer: verified
Multiple Choice
A) Log in to the host and run the rm $PGDATA/pg_logs/* command
B) Modify the rds.log_retention_period parameter to 1440 and wait up to 24 hours for database logs to be deleted
C) Create a ticket with AWS Support to have the logs deleted
D) Run the SELECT rds_rotate_error_log() stored procedure to rotate the logs
Correct Answer: verified
Multiple Choice
A) Set the TCP keepalive parameters low
B) Call the AWS CLI failover-db-cluster command
C) Enable Enhanced Monitoring on the DB cluster
D) Start a database activity stream on the DB cluster
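Option B names the failover-db-cluster operation. A minimal sketch of its parameters, with hypothetical cluster and instance identifiers:

```python
# Parameters for a manual Aurora failover, equivalent to the CLI call:
#   aws rds failover-db-cluster --db-cluster-identifier my-aurora-cluster
# With boto3: boto3.client("rds").failover_db_cluster(**params)
params = {
    "DBClusterIdentifier": "my-aurora-cluster",           # hypothetical
    # Optional: promote a specific reader to be the new writer.
    "TargetDBInstanceIdentifier": "my-aurora-replica-1",  # hypothetical
}
```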
Correct Answer: verified
Multiple Choice
A) Re-create global secondary indexes in the new table
B) Define IAM policies for access to the new table
C) Define the TTL settings
D) Encrypt the table from the AWS Management Console or use the update-table command
E) Set the provisioned read and write capacity
Correct Answer: verified
Multiple Choice
A) Add a route to an internet gateway in the subnet's route table.
B) Add a route to a NAT gateway in the subnet's route table.
C) Assign a new security group to the EC2 instances with an outbound rule to ports 80 and 443.
D) Create a VPC endpoint for DynamoDB and add a route to the endpoint in the subnet's route table.
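Option D describes a gateway VPC endpoint for DynamoDB. A minimal sketch of the request, with hypothetical VPC and route table IDs:

```python
# Sketch of a gateway VPC endpoint for DynamoDB. IDs and the Region in
# the service name are hypothetical. With boto3:
#   boto3.client("ec2").create_vpc_endpoint(**params)
params = {
    "VpcEndpointType": "Gateway",
    "VpcId": "vpc-0123456789abcdef0",                  # hypothetical
    "ServiceName": "com.amazonaws.us-east-1.dynamodb",
    # A gateway endpoint adds a route to the listed route tables, so
    # instances in private subnets reach DynamoDB without a NAT or
    # internet gateway.
    "RouteTableIds": ["rtb-0123456789abcdef0"],        # hypothetical
}
```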
Correct Answer: verified
Multiple Choice
A) Amazon RDS for MySQL with multi-Region read replicas
B) Amazon Aurora global database
C) Amazon RDS for Oracle with GoldenGate
D) Amazon DynamoDB global tables
Correct Answer: verified
Multiple Choice
A) Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. Configure DynamoDB to provision throughput capacity using the stack's mappings.
B) Add values for two Number parameters, rcuCount and wcuCount, to the template. Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.
C) Add values for the rcuCount and wcuCount parameters as outputs of the template. Configure DynamoDB to provision throughput capacity using the stack outputs.
D) Add values for the rcuCount and wcuCount parameters to the Mappings section of the template. Replace the hard-coded values with calls to the Ref intrinsic function, referencing the new parameters.
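Option B's change amounts to declaring two Number parameters and referencing them with the Ref intrinsic function. A minimal sketch of that template fragment, built as a Python dict (the table's logical ID and defaults are hypothetical, and a real table would also need KeySchema and AttributeDefinitions):

```python
import json

# Fragment of a CloudFormation template where rcuCount and wcuCount
# are Number parameters referenced via Ref, replacing hard-coded
# throughput values. Logical ID "MyTable" and defaults are hypothetical.
template = {
    "Parameters": {
        "rcuCount": {"Type": "Number", "Default": 5},
        "wcuCount": {"Type": "Number", "Default": 5},
    },
    "Resources": {
        "MyTable": {
            "Type": "AWS::DynamoDB::Table",
            "Properties": {
                "ProvisionedThroughput": {
                    "ReadCapacityUnits": {"Ref": "rcuCount"},
                    "WriteCapacityUnits": {"Ref": "wcuCount"},
                },
            },
        }
    },
}
print(json.dumps(template, indent=2))
```

Parameters (unlike Mappings or Outputs) are the section that lets callers pass values in at stack creation or update time.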
Correct Answer: verified
Multiple Choice
A) Create an Aurora Replica with encryption enabled using AWS Key Management Service (AWS KMS). Then promote the replica to master.
B) Use SSL/TLS to secure the in-transit connection between the financial application and the Aurora DB cluster.
C) Modify the existing Aurora DB cluster and enable encryption using an AWS Key Management Service (AWS KMS) encryption key. Apply the changes immediately.
D) Take a snapshot of the Aurora DB cluster and encrypt the snapshot using an AWS Key Management Service (AWS KMS) encryption key. Restore the snapshot to a new DB cluster and update the financial application database endpoints.
E) Use AWS Key Management Service (AWS KMS) to secure the in-transit connection between the financial application and the Aurora DB cluster.
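Option D's snapshot-copy path can be sketched as two RDS calls: copy the unencrypted cluster snapshot while specifying a KMS key (which encrypts the copy), then restore a new cluster from it. All identifiers, the key alias, and the engine value are hypothetical:

```python
# Sketch of encrypting an existing Aurora cluster via snapshot copy.
# With boto3, each dict would be passed to the matching RDS call:
#   rds = boto3.client("rds")
#   rds.copy_db_cluster_snapshot(**copy_params)
#   rds.restore_db_cluster_from_snapshot(**restore_params)
copy_params = {
    "SourceDBClusterSnapshotIdentifier": "aurora-unencrypted-snap",  # hypothetical
    "TargetDBClusterSnapshotIdentifier": "aurora-encrypted-snap",    # hypothetical
    # Supplying a KMS key on the copy produces an encrypted snapshot.
    "KmsKeyId": "alias/aurora-key",                                  # hypothetical
}
restore_params = {
    "DBClusterIdentifier": "aurora-encrypted-cluster",  # hypothetical
    "SnapshotIdentifier": "aurora-encrypted-snap",
    "Engine": "aurora-mysql",
}
```

After the restore completes, the application's endpoints are pointed at the new cluster.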
Correct Answer: verified
Multiple Choice
A) Use an Amazon RDS DB instance. Shut down the instance once the data has been read.
B) Use Amazon Aurora Serverless. Allow the service to spin resources up and down, as needed.
C) Use Amazon DynamoDB in on-demand capacity mode.
D) Use Amazon S3 and load the data from flat files.
Correct Answer: verified
Multiple Choice
A) Create an Amazon CloudWatch dashboard to show the number of connections, CPU usage, and disk space consumption. Watch these dashboards during the next slow period.
B) Launch an Amazon EC2 instance, and install and configure an open-source PostgreSQL monitoring tool that will run reports based on the output error logs.
C) Modify the logging database parameter to log all the queries related to locking in the database and then check the logs after the next slow period for this information.
D) Enable Amazon RDS Performance Insights on the PostgreSQL database. Use the metrics to identify any queries that are related to spikes in the graph during the next slow period.
Correct Answer: verified
Multiple Choice
A) The restored DB instance does not have Enhanced Monitoring enabled
B) The production DB instance is using a custom parameter group
C) The restored DB instance is using the default security group
D) The production DB instance is using a custom option group
Correct Answer: verified
Multiple Choice
A) Dump all the tables from the Oracle database into an Amazon S3 bucket using datapump (expdp). Run data transformations in AWS Glue. Load the data from the S3 bucket to the Aurora DB cluster.
B) Order an AWS Snowball appliance and copy the Oracle backup to the Snowball appliance. Once the Snowball data is delivered to Amazon S3, create a new Aurora DB cluster. Enable the S3 integration to migrate the data directly from Amazon S3 to Amazon RDS.
C) Use the AWS Schema Conversion Tool (AWS SCT) to help rewrite database objects to MySQL during the schema migration. Use AWS DMS to perform the full load and change data capture (CDC) tasks.
D) Use AWS Server Migration Service (AWS SMS) to import the Oracle virtual machine image as an Amazon EC2 instance. Use the Oracle Logical Dump utility to migrate the Oracle data from Amazon EC2 to an Aurora DB cluster.
Correct Answer: verified
Multiple Choice
A) Use AWS IAM database authentication and restrict access to the tables using an IAM policy.
B) Configure the rules in a NACL to restrict outbound traffic from the Aurora DB cluster.
C) Execute GRANT and REVOKE commands that restrict access to the tables containing sensitive data.
D) Define access privileges to the tables containing sensitive data in the pg_hba.conf file.
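Option C relies on SQL-level privileges rather than network or file-based controls. A minimal sketch of the statements (table and role names are hypothetical):

```python
# Sketch of PostgreSQL GRANT/REVOKE statements that restrict table
# access at the database level. Table and role names are hypothetical;
# the strings would be executed through any PostgreSQL client.
grant_sql = "GRANT SELECT ON orders TO analyst_role;"
# Revoke all privileges on the sensitive table from the broad role.
revoke_sql = "REVOKE ALL PRIVILEGES ON customer_pii FROM analyst_role;"
```

By contrast, pg_hba.conf (option D) controls host-based authentication, not per-table privileges.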
Correct Answer: verified
Multiple Choice
A) Set up a VPN tunnel for encrypting data over the network from the data center to AWS. Leverage AWS SCT and apply the converted schema to Amazon Redshift. Once complete, start an AWS DMS task to move the data from the source to Amazon S3. Use AWS Glue to load the data from Amazon S3 to Amazon Redshift.
B) Leverage AWS SCT and apply the converted schema to Amazon Redshift. Start an AWS DMS task with two AWS Snowball Edge devices to copy data from on-premises to Amazon S3 with AWS KMS encryption. Use AWS DMS to finish copying data to Amazon Redshift.
C) Leverage AWS SCT and apply the converted schema to Amazon Redshift. Once complete, use a fleet of 10 TB dedicated encrypted drives using the AWS Import/Export feature to copy data from on-premises to Amazon S3 with AWS KMS encryption. Use AWS Glue to load the data to Amazon Redshift.
D) Set up a VPN tunnel for encrypting data over the network from the data center to AWS. Leverage a native database export feature to export the data and compress the files. Use the aws s3 cp command with multipart upload to upload these files to Amazon S3 with AWS KMS encryption. Once complete, load the data to Amazon Redshift using AWS Glue.
Correct Answer: verified
Multiple Choice
A) Create a snapshot of the old databases and restore the snapshot with the required storage
B) Create a new RDS DB instance with the required storage and move the databases from the old instances to the new instance using AWS DMS
C) Create a new database using native backup and restore
D) Create a new read replica and make it the primary by terminating the existing primary
Correct Answer: verified
Multiple Choice
A) DynamoDB Streams
B) DynamoDB with DynamoDB Accelerator
C) DynamoDB with on-demand capacity mode
D) DynamoDB with provisioned capacity mode with Auto Scaling
Correct Answer: verified
Multiple Choice
A) Create a new IAM role with the same user name as the Amazon Redshift developer user ID. Provide the IAM role with read-only access to Amazon S3 with the assume role action.
B) Create a new IAM role with read-only access to the Amazon S3 bucket and include the assume role action. Modify the Amazon Redshift cluster to add the IAM role.
C) Create a new IAM role with read-only access to the Amazon S3 bucket with the assume role action. Add this role to the developer IAM user ID used for the copy job that ended with an error message.
D) Create a new IAM user with access keys and a new role with read-only access to the Amazon S3 bucket. Add this role to the Amazon Redshift cluster. Change the copy job to use the access keys created.
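Options B through D all hinge on how a Redshift COPY authenticates to S3. A minimal sketch of a COPY that uses a cluster-attached IAM role instead of user access keys (bucket, table, and role ARN are hypothetical):

```python
# Sketch of a Redshift COPY statement authorized by an IAM role
# attached to the cluster. The table, bucket path, and role ARN are
# hypothetical; the SQL string would be run through any Redshift client.
role_arn = "arn:aws:iam::123456789012:role/redshift-s3-readonly"  # hypothetical
copy_sql = (
    "COPY sales FROM 's3://example-bucket/sales/' "
    f"IAM_ROLE '{role_arn}' "
    "FORMAT AS CSV;"
)
print(copy_sql)
```

The role must be associated with the cluster before COPY can assume it, which is why the options mention modifying the cluster.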
Correct Answer: verified
Multiple Choice
A) Use the plant identifier as the partition key and the measurement time as the sort key. Create a global secondary index (GSI) with the plant identifier as the partition key and the fault attribute as the sort key.
B) Create a composite of the plant identifier and sensor identifier as the partition key. Use the measurement time as the sort key. Create a local secondary index (LSI) on the fault attribute.
C) Create a composite of the plant identifier and sensor identifier as the partition key. Use the measurement time as the sort key. Create a global secondary index (GSI) with the plant identifier as the partition key and the fault attribute as the sort key.
D) Use the plant identifier as the partition key and the sensor identifier as the sort key. Create a local secondary index (LSI) on the fault attribute.
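Option C's key design can be sketched as a table definition: a composite plant+sensor partition key with measurement time as sort key, plus a GSI keyed on plant and fault. Attribute and table names are hypothetical:

```python
# Sketch of the table layout in option C. Names are hypothetical.
# With boto3: boto3.client("dynamodb").create_table(**table)
table = {
    "TableName": "SensorReadings",  # hypothetical
    "AttributeDefinitions": [
        {"AttributeName": "PlantSensorId", "AttributeType": "S"},
        {"AttributeName": "MeasurementTime", "AttributeType": "S"},
        {"AttributeName": "PlantId", "AttributeType": "S"},
        {"AttributeName": "Fault", "AttributeType": "S"},
    ],
    # Base table: one item per sensor reading.
    "KeySchema": [
        {"AttributeName": "PlantSensorId", "KeyType": "HASH"},
        {"AttributeName": "MeasurementTime", "KeyType": "RANGE"},
    ],
    # GSI: query faults across a whole plant, regardless of sensor.
    "GlobalSecondaryIndexes": [
        {
            "IndexName": "PlantFaultIndex",
            "KeySchema": [
                {"AttributeName": "PlantId", "KeyType": "HASH"},
                {"AttributeName": "Fault", "KeyType": "RANGE"},
            ],
            "Projection": {"ProjectionType": "ALL"},
        }
    ],
    "BillingMode": "PAY_PER_REQUEST",
}
```

A GSI can use entirely different keys from the base table, which an LSI (options B and D) cannot: an LSI must share the base table's partition key.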
Correct Answer: verified
Multiple Choice
A) Use reserved capacity. Set it to the capacity levels required for peak daytime throughput.
B) Use provisioned capacity. Set it to the capacity levels required for peak daytime throughput.
C) Use provisioned capacity. Create an AWS Application Auto Scaling policy to update capacity based on consumption.
D) Use on-demand capacity.
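Option C pairs provisioned capacity with an Application Auto Scaling target-tracking policy. A minimal sketch of the two calls involved (table name, limits, and policy name are hypothetical):

```python
# Sketch of target-tracking auto scaling for a DynamoDB table's read
# capacity. Table name, limits, and policy name are hypothetical.
# With boto3:
#   aas = boto3.client("application-autoscaling")
#   aas.register_scalable_target(**target)
#   aas.put_scaling_policy(**policy)
target = {
    "ServiceNamespace": "dynamodb",
    "ResourceId": "table/Orders",                        # hypothetical
    "ScalableDimension": "dynamodb:table:ReadCapacityUnits",
    "MinCapacity": 5,
    "MaxCapacity": 500,
}
policy = {
    "PolicyName": "orders-read-scaling",                 # hypothetical
    "ServiceNamespace": "dynamodb",
    "ResourceId": "table/Orders",
    "ScalableDimension": "dynamodb:table:ReadCapacityUnits",
    "PolicyType": "TargetTrackingScaling",
    "TargetTrackingScalingPolicyConfiguration": {
        "TargetValue": 70.0,  # scale to keep ~70% capacity utilization
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBReadCapacityUtilization"
        },
    },
}
```

A matching pair of calls with the `dynamodb:table:WriteCapacityUnits` dimension would cover writes.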
Correct Answer: verified
Showing 21 - 40 of 156