
Amazon DBS-C01 Exam Dumps

AWS Certified Database - Specialty

Total Questions: 270
Update Date: October 10, 2024

PDF + Test Engine: $65 (regular price $95)
Test Engine: $55 (regular price $85)
PDF Only: $45 (regular price $75)



Last Week DBS-C01 Exam Results

225 customers passed the Amazon DBS-C01 exam
93% average score in the real DBS-C01 exam
95% of exam questions came from our DBS-C01 dumps



Choosing the Right Path for Your DBS-C01 Exam Preparation

Welcome to PassExamHub's comprehensive study guide for the AWS Certified Database - Specialty exam. Our DBS-C01 dumps are designed to equip you with the knowledge and resources you need to confidently prepare for and succeed in the DBS-C01 certification exam.

What Our Amazon DBS-C01 Study Material Offers

PassExamHub's DBS-C01 dumps PDF is carefully crafted to provide you with a comprehensive and effective learning experience. Our study material includes:

In-depth Content: Our study guide covers all the key concepts, topics, and skills you need to master for the DBS-C01 exam. Each topic is explained in a clear and concise manner, making it easy to understand even the most complex concepts.
Online Test Engine: Test your knowledge and build your confidence with a wide range of practice questions that simulate the actual exam format. Our test engine covers every exam objective and provides detailed explanations for both correct and incorrect answers.
Exam Strategies: Get valuable insights into exam-taking strategies, time management, and how to approach different types of questions.
Real-world Scenarios: Gain practical insights into applying your knowledge in real-world scenarios, ensuring you're well-prepared to tackle challenges in your professional career.

Why Choose PassExamHub?

Expertise: Our DBS-C01 exam questions and answers are developed by experienced Amazon certified professionals who have a deep understanding of the exam objectives and industry best practices.
Comprehensive Coverage: We leave no stone unturned in covering every topic and skill that could appear on the DBS-C01 exam, ensuring you're fully prepared.
Engaging Learning: Our content is presented in a user-friendly and engaging format, making your study sessions enjoyable and effective.
Proven Success: Countless students have used our study materials to achieve their DBS-C01 certifications and advance their careers.
Start Your Journey Today!

Embark on your journey to AWS Certified Database - Specialty success with PassExamHub. Our study material is your trusted companion in preparing for the DBS-C01 exam and unlocking exciting career opportunities.




Amazon DBS-C01 Sample Question Answers

Question # 1

A company recently migrated its line-of-business (LOB) application to AWS. The application uses an Amazon RDS for SQL Server DB instance as its database engine. The company must set up cross-Region disaster recovery for the application. The company needs a solution with the lowest possible RPO and RTO. Which solution will meet these requirements?

A. Create a cross-Region read replica of the DB instance. Promote the read replica at the time of failover. 
B. Set up SQL replication from the DB instance to an Amazon EC2 instance in the disaster recovery Region. Promote the EC2 instance as the primary server. 
C. Use AWS Database Migration Service (AWS DMS) for ongoing replication of the DB instance to the disaster recovery Region.
D. Take manual snapshots of the DB instance in the primary Region. Copy the snapshots to the disaster recovery Region. 
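
For background on the cross-Region read replica mechanics in option A, here is a minimal boto3 sketch; the instance identifiers, account ID, and Regions are hypothetical placeholders.

```python
import boto3

# Client in the disaster recovery Region (hypothetical Regions and names).
rds = boto3.client("rds", region_name="us-west-2")

# Cross-Region replicas must reference the source instance by its ARN.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="lob-sqlserver-replica",
    SourceDBInstanceIdentifier="arn:aws:rds:us-east-1:111122223333:db:lob-sqlserver-primary",
    SourceRegion="us-east-1",  # lets boto3 build the required presigned URL
)

# At failover time, promote the replica to a standalone, writable instance.
rds.promote_read_replica(DBInstanceIdentifier="lob-sqlserver-replica")
```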



Question # 2

A financial services company runs an on-premises MySQL database for a critical application. The company is dissatisfied with its current database disaster recovery (DR) solution. The application experiences a significant amount of downtime whenever the database fails over to its DR facility. The application also experiences slower response times when reports are processed on the same database. To minimize the downtime in DR situations, the company has decided to migrate the database to AWS. The company requires a solution that is highly available and the most cost-effective. Which solution meets these requirements?

A. Create an Amazon RDS for MySQL Multi-AZ DB instance and configure a read replica in a different Availability Zone. Configure the application to reference the replica instance endpoint and report queries to reference the primary DB instance endpoint. 
B. Create an Amazon RDS for MySQL Multi-AZ DB instance and configure a read replica in a different Availability Zone. Configure the application to reference the primary DB instance endpoint and report queries to reference the replica instance endpoint. 
C. Create an Amazon Aurora DB cluster and configure an Aurora Replica in a different Availability Zone. Configure the application to reference the cluster endpoint and report queries to reference the reader endpoint. 
D. Create an Amazon Aurora DB cluster and configure an Aurora Replica in a different Availability Zone. Configure the application to reference the primary DB instance endpoint and report queries to reference the replica instance endpoint. 
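
As background for options C and D, this hedged boto3 sketch shows how an Aurora cluster exposes a writer (cluster) endpoint and a load-balanced reader endpoint; the cluster identifier is hypothetical.

```python
import boto3

rds = boto3.client("rds")

# Fetch both endpoints of a hypothetical Aurora MySQL cluster.
cluster = rds.describe_db_clusters(DBClusterIdentifier="crm-aurora")["DBClusters"][0]

writer_endpoint = cluster["Endpoint"]        # application reads/writes
reader_endpoint = cluster["ReaderEndpoint"]  # report queries go to replicas
print(writer_endpoint, reader_endpoint)
```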



Question # 3

A company has branch offices in the United States and Singapore. The company has a three-tier web application that uses a shared database. The database runs on an Amazon RDS for MySQL DB instance that is hosted in the us-west-2 Region. The application has a distributed front end that is deployed in us-west-2 and in the ap-southeast-1 Region. The company uses this front end as a dashboard that provides statistics to sales managers in each branch office. The dashboard loads more slowly in the Singapore branch office than in the United States branch office. The company needs a solution so that the dashboard loads consistently for users in each location. Which solution will meet these requirements in the MOST operationally efficient way?

A. Take a snapshot of the DB instance in us-west-2. Create a new DB instance in ap-southeast-2 from the snapshot. Reconfigure the ap-southeast-1 front-end dashboard to access the new DB instance. 
B. Create an RDS read replica in ap-southeast-1 from the primary DB instance in us-west-2. Reconfigure the ap-southeast-1 front-end dashboard to access the read replica. 
C. Create a new DB instance in ap-southeast-1. Use AWS Database Migration Service (AWS DMS) and change data capture (CDC) to update the new DB instance in ap-southeast-1. Reconfigure the ap-southeast-1 front-end dashboard to access the new DB instance. 
D. Create an RDS read replica in us-west-2, where the primary DB instance resides. Create a read replica in ap-southeast-1 from the read replica in us-west-2. Reconfigure the ap-southeast-1 front-end dashboard to access the read replica in ap-southeast-1. 



Question # 4

A software-as-a-service (SaaS) company is using an Amazon Aurora Serverless DB cluster for its production MySQL database. The DB cluster has general logs and slow query logs enabled. A database engineer must use the most operationally efficient solution with minimal resource utilization to retain the logs and facilitate interactive search and analysis. Which solution meets these requirements?

A. Use an AWS Lambda function to ship database logs to an Amazon S3 bucket. Use Amazon Athena and Amazon QuickSight to search and analyze the logs. 
B. Download the logs from the DB cluster and store them in Amazon S3 by using manual scripts. Use Amazon Athena and Amazon QuickSight to search and analyze the logs. 
C. Use an AWS Lambda function to ship database logs to an Amazon S3 bucket. Use Amazon Elasticsearch Service (Amazon ES) and Kibana to search and analyze the logs. 
D. Use Amazon CloudWatch Logs Insights to search and analyze the logs when the logs are automatically uploaded by the DB cluster. 
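
For the workflow in option D, here is a hedged sketch of querying database logs with CloudWatch Logs Insights via boto3; the log group name is a hypothetical example of where Aurora publishes slow query logs.

```python
import time
import boto3

logs = boto3.client("logs")

# Search the last hour of slow query logs (log group name is a placeholder).
query_id = logs.start_query(
    logGroupName="/aws/rds/cluster/prod-aurora/slowquery",
    startTime=int(time.time()) - 3600,
    endTime=int(time.time()),
    queryString="fields @timestamp, @message | sort @timestamp desc | limit 20",
)["queryId"]

# Poll until the query completes, then print matching log lines.
while (result := logs.get_query_results(queryId=query_id))["status"] in ("Scheduled", "Running"):
    time.sleep(1)
print(result["results"])
```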



Question # 5

A gaming company uses Amazon Aurora Serverless for one of its internal applications. The company's developers use Amazon RDS Data API to work with the Aurora Serverless DB cluster. After a recent security review, the company is mandating security enhancements. A database specialist must ensure that access to RDS Data API is private and never passes through the public internet. What should the database specialist do to meet this requirement?

A. Modify the Aurora Serverless cluster by selecting a VPC with private subnets. 
B. Modify the Aurora Serverless cluster by unchecking the publicly accessible option. 
C. Create an interface VPC endpoint that uses AWS PrivateLink for RDS Data API. 
D. Create a gateway VPC endpoint for RDS Data API. 
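
To illustrate the mechanics behind option C, a minimal boto3 sketch of creating an interface VPC endpoint (AWS PrivateLink) for RDS Data API follows; the VPC, subnet, and security group IDs are placeholders, and the service name assumes us-east-1.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Interface endpoint that keeps RDS Data API calls on the AWS network.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0abc1234",
    ServiceName="com.amazonaws.us-east-1.rds-data",
    SubnetIds=["subnet-0abc1234"],
    SecurityGroupIds=["sg-0abc1234"],
    PrivateDnsEnabled=True,  # default Data API hostname resolves to the endpoint
)
```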



Question # 6

A company runs a customer relationship management (CRM) system that is hosted on premises with a MySQL database as the backend. A custom stored procedure is used to send email notifications to another system when data is inserted into a table. The company has noticed that the performance of the CRM system has decreased due to database reporting applications used by various teams. The company requires an AWS solution that would reduce maintenance, improve performance, and accommodate the email notification feature. Which AWS solution meets these requirements?

A. Use MySQL running on an Amazon EC2 instance with Auto Scaling to accommodate the reporting applications. Configure a stored procedure and an AWS Lambda function that uses Amazon SES to send email notifications to the other system. 
B. Use Amazon Aurora MySQL in a multi-master cluster to accommodate the reporting applications. Configure Amazon RDS event subscriptions to publish a message to an Amazon SNS topic and subscribe the other system's email address to the topic. 
C. Use MySQL running on an Amazon EC2 instance with a read replica to accommodate the reporting applications. Configure Amazon SES integration to send email notifications to the other system. 
D. Use Amazon Aurora MySQL with a read replica for the reporting applications. Configure a stored procedure and an AWS Lambda function to publish a message to an Amazon SNS topic. Subscribe the other system's email address to the topic. 
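
To show the notification plumbing that options B and D describe, here is a hedged sketch of a Lambda handler publishing to an SNS topic; the topic ARN is a placeholder, and the trigger (for example, Aurora MySQL's native Lambda integration called from a stored procedure) is an assumption noted in the comments.

```python
import json
import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:111122223333:crm-inserts"  # placeholder

def handler(event, context):
    # Assumed to be invoked on insert (e.g., via Aurora MySQL's lambda_async
    # native function); email subscribers of the topic receive the message.
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="New CRM record",
        Message=json.dumps(event),
    )
```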



Question # 7

A security team is conducting an audit for a financial company. The security team discovers that the database credentials of an Amazon RDS for MySQL DB instance are hardcoded in the source code. The source code is stored in a shared location for automatic deployment and is exposed to all users who can access the location. A database specialist must use encryption to ensure that the credentials are not visible in the source code. Which solution will meet these requirements?

A. Use an AWS Key Management Service (AWS KMS) key to encrypt the most recent database backup. Restore the backup as a new database to activate encryption. 
B. Store the source code to access the credentials in an AWS Systems Manager Parameter Store secure string parameter that is encrypted by AWS Key Management Service (AWS KMS). Access the code with calls to Systems Manager. 
C. Store the credentials in an AWS Systems Manager Parameter Store secure string parameter that is encrypted by AWS Key Management Service (AWS KMS). Access the credentials with calls to Systems Manager. 
D. Use an AWS Key Management Service (AWS KMS) key to encrypt the DB instance at rest. Activate RDS encryption in transit by using SSL certificates. 
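
To illustrate the Parameter Store approach in option C, a minimal boto3 sketch follows; the parameter name and credential values are placeholders.

```python
import boto3

ssm = boto3.client("ssm")

# Store the credentials once as an encrypted SecureString parameter.
ssm.put_parameter(
    Name="/prod/mysql/credentials",
    Value='{"username": "app", "password": "example-only"}',
    Type="SecureString",
    Overwrite=True,
)

# At runtime the application fetches and decrypts the value instead of
# keeping it hardcoded in source control.
creds = ssm.get_parameter(
    Name="/prod/mysql/credentials", WithDecryption=True
)["Parameter"]["Value"]
```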



Question # 8

Developers have requested a new Amazon Redshift cluster so they can load new third-party marketing data. The new cluster is ready, and the user credentials are given to the developers. The developers indicate that their copy jobs fail with the following error message: “Amazon Invalid operation: S3ServiceException:Access Denied,Status 403,Error AccessDenied.” The developers need to load this data soon, so a database specialist must act quickly to solve this issue. What is the MOST secure solution?

A. Create a new IAM role with the same user name as the Amazon Redshift developer user ID. Provide the IAM role with read-only access to Amazon S3 with the assume role action. 
B. Create a new IAM role with read-only access to the Amazon S3 bucket and include the assume role action. Modify the Amazon Redshift cluster to add the IAM role. 
C. Create a new IAM role with read-only access to the Amazon S3 bucket with the assume role action. Add this role to the developer IAM user ID used for the copy job that ended with an error message. 
D. Create a new IAM user with access keys and a new role with read-only access to the Amazon S3 bucket. Add this role to the Amazon Redshift cluster. Change the copy job to use the access keys created. 
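
For the mechanics behind option B, this hedged boto3 sketch attaches an IAM role to a Redshift cluster; the cluster identifier and role ARN are placeholders, and the role's trust policy must allow redshift.amazonaws.com to assume it.

```python
import boto3

redshift = boto3.client("redshift")

# Attach a read-only S3 role so COPY can reference it instead of access keys.
redshift.modify_cluster_iam_roles(
    ClusterIdentifier="marketing-cluster",
    AddIamRoles=["arn:aws:iam::111122223333:role/redshift-s3-readonly"],
)
```

The COPY job would then name the role, for example COPY ... IAM_ROLE 'arn:aws:iam::111122223333:role/redshift-s3-readonly', rather than embedding credentials.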



Question # 9

A company's database specialist implements an AWS Database Migration Service (AWS DMS) task for change data capture (CDC) to replicate data from an on-premises Oracle database to Amazon S3. When usage of the company's application increases, the database specialist notices multiple hours of latency with the CDC. Which solutions will reduce this latency? (Choose two.)

A. Configure the DMS task to run in full large binary object (LOB) mode. 
B. Configure the DMS task to run in limited large binary object (LOB) mode. 
C. Create a Multi-AZ replication instance. 
D. Load tables in parallel by creating multiple replication instances for sets of tables that participate in common transactions. 
E. Replicate tables in parallel by creating multiple DMS tasks for sets of tables that do not participate in common transactions. 
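
To illustrate limited LOB mode from option B, here is a hedged sketch of a DMS CDC task definition; all ARNs are placeholders and the settings values are illustrative.

```python
import json
import boto3

dms = boto3.client("dms")

# Select every table; transformation rules could be added alongside.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "all-tables",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

# Limited-size LOB mode truncates LOBs above LobMaxSize (KB) but is much
# faster than full LOB mode.
task_settings = {
    "TargetMetadata": {
        "FullLobMode": False,
        "LimitedSizeLobMode": True,
        "LobMaxSize": 32,
    }
}

dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-s3-orders",
    SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SRC",
    TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TGT",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INSTANCE",
    MigrationType="cdc",
    TableMappings=json.dumps(table_mappings),
    ReplicationTaskSettings=json.dumps(task_settings),
)
```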



Question # 10

A company plans to migrate a MySQL-based application from an on-premises environment to AWS. The application performs database joins across several tables and uses indexes for faster query response times. The company needs the database to be highly available with automatic failover. Which solution on AWS will meet these requirements with the LEAST operational overhead?

A. Deploy an Amazon RDS DB instance with a read replica. 
B. Deploy an Amazon RDS Multi-AZ DB instance. 
C. Deploy Amazon DynamoDB global tables. 
D. Deploy multiple Amazon RDS DB instances. Use Amazon Route 53 DNS with failover health checks configured. 
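
To show what option B amounts to in practice, a minimal boto3 sketch of a Multi-AZ RDS for MySQL instance follows; identifiers and sizing are illustrative.

```python
import boto3

rds = boto3.client("rds")

# Multi-AZ keeps a synchronous standby in another AZ and fails over
# automatically, with no application-side replica management.
rds.create_db_instance(
    DBInstanceIdentifier="app-mysql",
    Engine="mysql",
    DBInstanceClass="db.r6g.large",
    AllocatedStorage=100,
    MultiAZ=True,
    MasterUsername="admin",
    ManageMasterUserPassword=True,  # RDS stores the password in Secrets Manager
)
```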



Question # 11

A company's development team needs to have production data restored in a staging AWS account. The production database is running on an Amazon RDS for PostgreSQL Multi-AZ DB instance, which has AWS KMS encryption enabled using the default KMS key. A database specialist planned to share the most recent automated snapshot with the staging account, but discovered that the option to share snapshots is disabled in the AWS Management Console. What should the database specialist do to resolve this? 

A. Disable automated backups in the DB instance. Share both the automated snapshot and the default KMS key with the staging account. Restore the snapshot in the staging account and enable automated backups. 
B. Copy the automated snapshot specifying a custom KMS encryption key. Share both the copied snapshot and the custom KMS encryption key with the staging account. Restore the snapshot to the staging account within the same Region. 
C. Modify the DB instance to use a custom KMS encryption key. Share both the automated snapshot and the custom KMS encryption key with the staging account. Restore the snapshot in the staging account. 
D. Copy the automated snapshot while keeping the default KMS key. Share both the snapshot and the default KMS key with the staging account. Restore the snapshot in the staging account. 
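
For the copy-and-share flow in option B, this hedged boto3 sketch copies the automated snapshot with a customer managed KMS key and then shares the copy; identifiers, the key ARN, and the account IDs are placeholders, and the key itself must also be shared with the staging account through its key policy.

```python
import boto3

rds = boto3.client("rds")

# Automated snapshots encrypted with the default KMS key cannot be shared
# directly, so copy with a shareable customer managed key first.
rds.copy_db_snapshot(
    SourceDBSnapshotIdentifier="rds:prod-pg-2024-10-10-00-00",
    TargetDBSnapshotIdentifier="prod-pg-share",
    KmsKeyId="arn:aws:kms:us-east-1:111122223333:key/1111aaaa-22bb-33cc-44dd-5555eeee6666",
)

# Grant the staging account permission to restore the copied snapshot.
rds.modify_db_snapshot_attribute(
    DBSnapshotIdentifier="prod-pg-share",
    AttributeName="restore",
    ValuesToAdd=["444455556666"],  # staging account ID
)
```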



Question # 12

An online retail company is planning a multi-day flash sale that must support processing of up to 5,000 orders per second. The number of orders and exact schedule for the sale will vary each day. During the sale, approximately 10,000 concurrent users will look at the deals before buying items. Outside of the sale, the traffic volume is very low. The acceptable performance for read/write queries should be under 25 ms. Order items are about 2 KB in size and have a unique identifier. The company requires the most cost-effective solution that will automatically scale and is highly available. Which solution meets these requirements?

A. Amazon DynamoDB with on-demand capacity mode 
B. Amazon Aurora with one writer node and an Aurora Replica with the parallel query feature enabled 
C. Amazon DynamoDB with provisioned capacity mode with 5,000 write capacity units (WCUs) and 10,000 read capacity units (RCUs) 
D. Amazon Aurora with one writer node and two cross-Region Aurora Replicas 
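
To illustrate on-demand capacity from option A, a minimal boto3 sketch follows; the table and key names are hypothetical.

```python
import boto3

dynamodb = boto3.client("dynamodb")

# PAY_PER_REQUEST scales to flash-sale spikes automatically and charges
# nothing for idle provisioned capacity between sales.
dynamodb.create_table(
    TableName="orders",
    AttributeDefinitions=[{"AttributeName": "order_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "order_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)
```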



Question # 13

A company wants to build a new invoicing service for its cloud-native application on AWS. The company has a small development team and wants to focus on service feature development and minimize operations and maintenance as much as possible. The company expects the service to handle billions of requests and millions of new records every day. The service feature requirements, including data access patterns, are well-defined. The service has an availability target of 99.99% with a milliseconds latency requirement. The database for the service will be the system of record for invoicing data. Which database solution meets these requirements at the LOWEST cost?

A. Amazon Neptune 
B. Amazon Aurora PostgreSQL Serverless 
C. Amazon RDS for PostgreSQL 
D. Amazon DynamoDB 



Question # 14

A gaming firm recently purchased an iOS game that is especially popular during the Christmas season. The business has opted to include a leaderboard in the game, which will be powered by Amazon DynamoDB. The application's load is likely to increase significantly throughout the Christmas season. Which solution satisfies these criteria at the lowest possible cost?

A. DynamoDB Streams 
B. DynamoDB with DynamoDB Accelerator 
C. DynamoDB with on-demand capacity mode 
D. DynamoDB with provisioned capacity mode with Auto Scaling 
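
For contrast, provisioned capacity with auto scaling (option D) is wired up through Application Auto Scaling, as in this hedged sketch; the table name and capacity limits are illustrative.

```python
import boto3

autoscaling = boto3.client("application-autoscaling")

# Register the table's write capacity as a scalable target.
autoscaling.register_scalable_target(
    ServiceNamespace="dynamodb",
    ResourceId="table/leaderboard",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    MinCapacity=25,
    MaxCapacity=1000,
)

# Track ~70% write utilization, scaling WCUs up and down automatically.
autoscaling.put_scaling_policy(
    PolicyName="leaderboard-wcu-target",
    ServiceNamespace="dynamodb",
    ResourceId="table/leaderboard",
    ScalableDimension="dynamodb:table:WriteCapacityUnits",
    PolicyType="TargetTrackingScaling",
    TargetTrackingScalingPolicyConfiguration={
        "TargetValue": 70.0,
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "DynamoDBWriteCapacityUtilization"
        },
    },
)
```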



Question # 15

An ecommerce company uses a backend application that stores data in an Amazon DynamoDB table. The backend application runs in a private subnet in a VPC and must connect to this table. The company must minimize any network latency that results from network connectivity issues, even during periods of heavy application usage. A database administrator also needs the ability to use a private connection to connect to the DynamoDB table from the application. Which solution will meet these requirements? 

A. Use network ACLs to ensure that any outgoing or incoming connections to any port except DynamoDB are deactivated. Encrypt API calls by using TLS.
B. Create a VPC endpoint for DynamoDB in the application's VPC. Use the VPC endpoint to access the table. 
C. Create an AWS Lambda function that has access to DynamoDB. Restrict outgoing access only to this Lambda function from the application. 
D. Use a VPN to route all communication to DynamoDB through the company's own corporate network infrastructure. 
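
To illustrate option B, here is a minimal boto3 sketch of a gateway VPC endpoint for DynamoDB; the VPC and route table IDs are placeholders and the service name assumes us-east-1.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Gateway endpoints attach to route tables, keep traffic on the AWS network,
# and carry no hourly charge.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0abc1234",
    ServiceName="com.amazonaws.us-east-1.dynamodb",
    RouteTableIds=["rtb-0abc1234"],
)
```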



Question # 16

A finance company migrated its on-premises PostgreSQL database to an Amazon Aurora PostgreSQL DB cluster. During a review after the migration, a database specialist discovers that the database is not encrypted at rest. The database must be encrypted at rest as soon as possible to meet security requirements. The database specialist must enable encryption for the DB cluster with minimal downtime. Which solution will meet these requirements?

A. Modify the unencrypted DB cluster using the AWS Management Console. Enable encryption and choose to apply the change immediately. 
B. Take a snapshot of the unencrypted DB cluster and restore it to a new DB cluster with encryption enabled. Update any database connection strings to reference the new DB cluster endpoint, and then delete the unencrypted DB cluster. 
C. Create an encrypted Aurora Replica of the unencrypted DB cluster. Promote the Aurora Replica as the new master. 
D. Create a new DB cluster with encryption enabled and use the pg_dump and pg_restore utilities to load data to the new DB cluster. Update any database connection strings to reference the new DB cluster endpoint, and then delete the unencrypted DB cluster. 
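
As background for option B, this hedged boto3 sketch snapshots the unencrypted cluster and restores it with a KMS key so the restored cluster is encrypted; identifiers are placeholders, the snapshot must be available before the restore runs, and DB instances for the new cluster are created separately.

```python
import boto3

rds = boto3.client("rds")

# Snapshot the unencrypted cluster.
rds.create_db_cluster_snapshot(
    DBClusterIdentifier="finance-aurora",
    DBClusterSnapshotIdentifier="finance-aurora-snap",
)

# Restoring with a KMS key yields an encrypted cluster; add DB instances
# afterwards with create_db_instance, then repoint connection strings.
rds.restore_db_cluster_from_snapshot(
    DBClusterIdentifier="finance-aurora-encrypted",
    SnapshotIdentifier="finance-aurora-snap",
    Engine="aurora-postgresql",
    KmsKeyId="alias/aws/rds",
)
```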



Question # 17

An internet advertising firm stores its data in an Amazon DynamoDB table. DynamoDB Streams is enabled on the table, and one of the keys has a global secondary index. The table is encrypted using a customer managed AWS Key Management Service (AWS KMS) key. The firm has chosen to expand worldwide and wants to replicate the database by using DynamoDB global tables in a new AWS Region. Upon review, an administrator observes the following: No role with the dynamodb:CreateGlobalTable permission exists in the account. An empty table with the same name exists in the new Region where replication is desired. A global secondary index with the same partition key but a different sort key exists in the new Region where replication is desired. Which settings will prevent the creation of a global table or replica in the new Region? (Select two.)

A. A global secondary index with the same partition key but a different sort key exists in the new Region where replication is desired. 
B. An empty table with the same name exists in the Region where replication is desired. 
C. No role with the dynamodb:CreateGlobalTable permission exists in the account. 
D. DynamoDB Streams is enabled for the table. 
E. The table is encrypted using a KMS customer managed key. 



Question # 18

A company is planning to use Amazon RDS for SQL Server for one of its critical applications. The company's security team requires that the users of the RDS for SQL Server DB instance are authenticated with on-premises Microsoft Active Directory credentials. Which combination of steps should a database specialist take to meet this requirement? (Choose three.) 

A. Extend the on-premises Active Directory to AWS by using AD Connector. 
B. Create an IAM user that uses the AmazonRDSDirectoryServiceAccess managed IAM policy. 
C. Create a directory by using AWS Directory Service for Microsoft Active Directory. 
D. Create an Active Directory domain controller on Amazon EC2. 
E. Create an IAM role that uses the AmazonRDSDirectoryServiceAccess managed IAM policy. 
F. Create a one-way forest trust from the AWS Directory Service for Microsoft Active Directory directory to the on-premises Active Directory. 
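
To sketch options C and E plus the domain join itself, here is a hedged boto3 example; the directory name, passwords, network IDs, and role name are placeholders, and the one-way forest trust to the on-premises directory (option F) is configured separately.

```python
import boto3

ds = boto3.client("ds")
rds = boto3.client("rds")

# AWS Managed Microsoft AD directory for the RDS instance to join.
directory = ds.create_microsoft_ad(
    Name="corp.example.com",
    Password="Example-Only-1!",
    VpcSettings={
        "VpcId": "vpc-0abc1234",
        "SubnetIds": ["subnet-0abc1234", "subnet-0def5678"],
    },
)

# Join the DB instance to the directory; the named role carries the
# AmazonRDSDirectoryServiceAccess managed policy.
rds.create_db_instance(
    DBInstanceIdentifier="critical-sqlserver",
    Engine="sqlserver-se",
    DBInstanceClass="db.m6i.large",
    AllocatedStorage=100,
    MasterUsername="admin",
    MasterUserPassword="Example-Only-1!",
    Domain=directory["DirectoryId"],
    DomainIAMRoleName="rds-directoryservice-role",
)
```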



Question # 19

A company hosts a 2 TB Oracle database in its on-premises data center. A database specialist is migrating the database from on premises to an Amazon Aurora PostgreSQL database on AWS. The database specialist identifies a problem that relates to compatibility: Oracle stores metadata in its data dictionary in uppercase, but PostgreSQL stores the metadata in lowercase. The database specialist must resolve this problem to complete the migration. What is the MOST operationally efficient solution that meets these requirements?

A. Override the default uppercase format of Oracle schema by encasing object names in quotation marks during creation. 
B. Use AWS Database Migration Service (AWS DMS) mapping rules with rule-action as convert-lowercase. 
C. Use the AWS Schema Conversion Tool conversion agent to convert the metadata from uppercase to lowercase. 
D. Use an AWS Glue job that is attached to an AWS Database Migration Service (AWS DMS) replication task to convert the metadata from uppercase to lowercase. 
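
For option B, DMS transformation rules are plain JSON passed as the task's TableMappings; this hedged sketch shows a convert-lowercase rule applied to all table names.

```python
import json

# Transformation rule renaming every table to lowercase during migration
# (rule IDs and names are arbitrary).
table_mappings = {
    "rules": [{
        "rule-type": "transformation",
        "rule-id": "1",
        "rule-name": "lowercase-tables",
        "rule-target": "table",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "convert-lowercase",
    }]
}

print(json.dumps(table_mappings, indent=2))  # pass as TableMappings to the task
```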



Question # 20

A company is developing a multi-tier web application hosted on AWS using Amazon Aurora as the database. The application needs to be deployed to production and other nonproduction environments. A Database Specialist needs to specify different MasterUsername and MasterUserPassword properties in the AWS CloudFormation templates used for automated deployment. The CloudFormation templates are version controlled in the company's code repository. The company also needs to meet a compliance requirement by routinely rotating its database master password for production. What is the most secure solution to store the master password?

A. Store the master password in a parameter file in each environment. Reference the environment-specific parameter file in the CloudFormation template. 
B. Encrypt the master password using an AWS KMS key. Store the encrypted master password in the CloudFormation template. 
C. Use the secretsmanager dynamic reference to retrieve the master password stored in AWS Secrets Manager and enable automatic rotation. 
D. Use the ssm dynamic reference to retrieve the master password stored in the AWS Systems Manager Parameter Store and enable automatic rotation. 
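
To illustrate option C, here is a CloudFormation fragment, held as a Python string to keep a single code language throughout; the secret name is hypothetical, and rotation is enabled on the secret itself in Secrets Manager.

```python
# The secretsmanager dynamic reference resolves at deploy time, so no
# password ever appears in the version-controlled template.
TEMPLATE_FRAGMENT = """
Resources:
  Database:
    Type: AWS::RDS::DBCluster
    Properties:
      Engine: aurora-mysql
      MasterUsername: '{{resolve:secretsmanager:prod/aurora/master:SecretString:username}}'
      MasterUserPassword: '{{resolve:secretsmanager:prod/aurora/master:SecretString:password}}'
"""
print(TEMPLATE_FRAGMENT)
```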



Question # 21

A Database Specialist is creating Amazon DynamoDB tables, Amazon CloudWatch alarms, and associated infrastructure for an Application team using a development AWS account. The team wants a deployment method that will standardize the core solution components while managing environment-specific settings separately, and wants to minimize rework due to configuration errors. Which process should the Database Specialist recommend to meet these requirements?

A. Organize common and environment-specific parameters hierarchically in the AWS Systems Manager Parameter Store, then reference the parameters dynamically from an AWS CloudFormation template. Deploy the CloudFormation stack using the environment name as a parameter. 
B. Create a parameterized AWS CloudFormation template that builds the required objects. Keep separate environment parameter files in separate Amazon S3 buckets. Provide an AWS CLI command that deploys the CloudFormation stack directly referencing the appropriate parameter bucket.
C. Create a parameterized AWS CloudFormation template that builds the required objects. Import the template into the CloudFormation interface in the AWS Management Console. Make the required changes to the parameters and deploy the CloudFormation stack. 
D. Create an AWS Lambda function that builds the required objects using an AWS SDK. Set the required parameter values in a test event in the Lambda console for each environment that the Application team can modify, as needed. Deploy the infrastructure by triggering the test event in the console. 
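
For the hierarchy pattern in option A, a hedged boto3 sketch of reading one environment's parameter branch follows; the path layout is hypothetical.

```python
import boto3

ssm = boto3.client("ssm")

# Parameters organized as /dynamo-app/<environment>/<setting>; the
# environment name selects the branch at deploy time.
env = "dev"
params = ssm.get_parameters_by_path(
    Path=f"/dynamo-app/{env}", Recursive=True, WithDecryption=True
)["Parameters"]

for p in params:
    print(p["Name"], "=", p["Value"])
```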



Question # 22

A retail company uses Amazon Redshift Spectrum to run complex analytical queries on objects that are stored in an Amazon S3 bucket. The objects are joined with multiple dimension tables that are stored in an Amazon Redshift database. The company uses the database to create monthly and quarterly aggregated reports. Users who attempt to run queries are reporting the following error message: "error: Spectrum Scan Error: Access throttled". Which solution will resolve this error?

A. Check file sizes of fact tables in Amazon S3, and look for large files. Break up large files into smaller files of equal size between 100 MB and 1 GB 
B. Reduce the number of queries that users can run in parallel. 
C. Check file sizes of fact tables in Amazon S3, and look for small files. Merge the small files into larger files of at least 64 MB in size. 
D. Review and optimize queries that submit a large aggregation step to Redshift Spectrum.



Question # 23

A manufacturing company has an inventory system that stores information in an Amazon Aurora MySQL DB cluster. The database tables are partitioned. The database size has grown to 3 TB. Users run one-time queries by using a SQL client. Queries that use an equijoin to join large tables are taking a long time to run. Which action will improve query performance with the LEAST operational effort?

A. Migrate the database to a new Amazon Redshift data warehouse. 
B. Enable hash joins on the database by setting the variable optimizer_switch to hash_join=on. 
C. Take a snapshot of the DB cluster. Create a new DB instance by using the snapshot, and enable parallel query mode. 
D. Add an Aurora read replica. 



Question # 24

A business is launching a new Amazon RDS for SQL Server DB instance. The organization wishes to enable auditing of the SQL Server database. Which measures should a database professional perform in combination to achieve this requirement? (Select two.) 

A. Create a service-linked role for Amazon RDS that grants permissions for Amazon RDS to store audit logs on Amazon S3. 
B. Set up a parameter group to configure an IAM role and an Amazon S3 bucket for audit log storage. Associate the parameter group with the DB instance.
C. Disable Multi-AZ on the DB instance, and then enable auditing. Enable Multi-AZ after auditing is enabled. 
D. Disable automated backup on the DB instance, and then enable auditing. Enable automated backup after auditing is enabled. 
E. Set up an options group to configure an IAM role and an Amazon S3 bucket for audit log storage. Associate the options group with the DB instance. 
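
To illustrate the option group mechanics in option E, this hedged boto3 sketch creates an option group and adds SQLSERVER_AUDIT; the engine version, setting names, and ARNs follow the documented option but should be verified, and all identifiers are placeholders.

```python
import boto3

rds = boto3.client("rds")

rds.create_option_group(
    OptionGroupName="sqlserver-audit",
    EngineName="sqlserver-se",
    MajorEngineVersion="15.00",
    OptionGroupDescription="SQL Server audit logs to S3",
)

# SQLSERVER_AUDIT ships audit logs to S3 through an IAM role.
rds.modify_option_group(
    OptionGroupName="sqlserver-audit",
    OptionsToInclude=[{
        "OptionName": "SQLSERVER_AUDIT",
        "OptionSettings": [
            {"Name": "IAM_ROLE_ARN",
             "Value": "arn:aws:iam::111122223333:role/rds-sqlserver-audit"},
            {"Name": "S3_BUCKET_ARN",
             "Value": "arn:aws:s3:::audit-bucket/sqlserver"},
        ],
    }],
    ApplyImmediately=True,
)
```

Associating the option group with the DB instance is a separate modify_db_instance call.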



Question # 25

A company hosts an on-premises Microsoft SQL Server Enterprise edition database with Transparent Data Encryption (TDE) enabled. The database is 20 TB in size and includes sparse tables. The company needs to migrate the database to Amazon RDS for SQL Server during a maintenance window that is scheduled for an upcoming weekend. Data-at-rest encryption must be enabled for the target DB instance. Which combination of steps should the company take to migrate the database to AWS in the MOST operationally efficient manner? (Choose two.)

A. Use AWS Database Migration Service (AWS DMS) to migrate from the on-premises source database to the RDS for SQL Server target database. 
B. Disable TDE. Create a database backup without encryption. Copy the backup to Amazon S3. 
C. Restore the backup to the RDS for SQL Server DB instance. Enable TDE for the RDS for SQL Server DB instance. 
D. Set up an AWS Snowball Edge device. Copy the database backup to the device. Send the device to AWS. Restore the database from Amazon S3. 
E. Encrypt the data with client-side encryption before transferring the data to Amazon RDS. 
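
As background for the native backup/restore flow in options B and C, RDS for SQL Server restores .bak files from S3 through a stored procedure in msdb; this hedged pyodbc sketch assumes the SQLSERVER_BACKUP_RESTORE option is attached to the instance, and all names and ARNs are placeholders.

```python
import pyodbc

# Connect to the RDS for SQL Server endpoint (placeholders throughout).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=erp.abc123.us-east-1.rds.amazonaws.com;UID=admin;PWD=example-only",
    autocommit=True,
)

# Kick off the native restore of the backup file uploaded to S3.
conn.execute(
    "exec msdb.dbo.rds_restore_database "
    "@restore_db_name='erp', "
    "@s3_arn_to_restore_from='arn:aws:s3:::migration-bucket/erp.bak'"
)
```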



Question # 26

A company uses an on-premises Microsoft SQL Server database to host relational and JSON data and to run daily ETL and advanced analytics. The company wants to migrate the database to the AWS Cloud. A database specialist must choose one or more AWS services to run the company's workloads. Which solution will meet these requirements in the MOST operationally efficient manner?

A. Use Amazon Redshift for relational data. Use Amazon DynamoDB for JSON data 
B. Use Amazon Redshift for relational data and JSON data. 
C. Use Amazon RDS for relational data. Use Amazon Neptune for JSON data 
D. Use Amazon Redshift for relational data. Use Amazon S3 for JSON data. 



Question # 27

A pharmaceutical company uses Amazon Quantum Ledger Database (Amazon QLDB) to store its clinical trial data records. The company has an application that runs as AWS Lambda functions. The application is hosted in the private subnet in a VPC. The application does not have internet access and needs to read some of the clinical data records. The company is concerned that traffic between the QLDB ledger and the VPC could leave the AWS network. The company needs to secure access to the QLDB ledger and allow the VPC traffic to have read-only access. Which security strategy should a database specialist implement to meet these requirements?

A. Move the QLDB ledger into a private database subnet inside the VPC. Run the Lambda functions inside the same VPC in an application private subnet. Ensure that the VPC route table allows read-only flow from the application subnet to the database subnet. 
B. Create an AWS PrivateLink VPC endpoint for the QLDB ledger. Attach a VPC policy to the VPC endpoint to allow read-only traffic for the Lambda functions that run inside the VPC.
C. Add a security group to the QLDB ledger to allow access from the private subnets inside the VPC where the Lambda functions that access the QLDB ledger are running. 
D. Create a VPN connection to ensure pairing of the private subnet where the Lambda functions are running with the private subnet where the QLDB ledger is deployed.
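
To sketch option B, here is a hedged boto3 example of an interface endpoint for the QLDB session API with an endpoint policy scoped to one ledger; IDs and ARNs are placeholders. Note that qldb:SendCommand carries both reads and writes, so strict read-only enforcement would additionally rely on the ledger's fine-grained (STANDARD) permissions mode.

```python
import json
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Endpoint policy limiting VPC traffic to session commands on one ledger.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "qldb:SendCommand",
        "Resource": "arn:aws:qldb:us-east-1:111122223333:ledger/clinical-trials",
    }],
}

ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0abc1234",
    ServiceName="com.amazonaws.us-east-1.qldb.session",  # QLDB session API
    SubnetIds=["subnet-0abc1234"],
    SecurityGroupIds=["sg-0abc1234"],
    PolicyDocument=json.dumps(policy),
    PrivateDnsEnabled=True,
)
```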