Tuesday, 12 June 2018

AWS service limits asked in "AWS Certified Solutions Architect - Associate" and "AWS Certified Developer - Associate" certifications

Background

I just cleared my "AWS Certified Developer - Associate" certification exam yesterday with 90%. I had already cleared the "AWS Certified Solutions Architect - Associate" exam six months earlier with 89%. You can see my badges below.
While preparing I realized that some questions are based on AWS service limits. These can be straightforward or slightly twisted; in either case, knowing the service limits helps a lot. So I am going to summarize the ones I feel are important from a certification perspective.




NOTE: AWS service limits can change at any time, so it is best to refer to the FAQ sections of the corresponding services to confirm. The following limits are as of June 2018.

AWS service limits & constraints

Following are AWS services and their corresponding limits. Each service has more limits and constraints than listed here; I am simply summarising based on my exam preparation, practice quizzes, and actual exam experience. Please let me know in the comments if any of these limits have changed and I will update accordingly. Thanks.

Consolidated billing


AWS S3

  • By default, customers can provision up to 100 buckets per AWS account. However, you can increase your Amazon S3 bucket limit by visiting AWS Service Limits.
  • The bucket name can be between 3 and 63 characters long and can contain only lower-case characters, numbers, periods, and dashes.
  • Bucket names must not be formatted as an IP address (for example, 192.168.5.4).
  • For more details refer - https://docs.aws.amazon.com/AmazonS3/latest/dev/BucketRestrictions.html
  • AWS S3 offers unlimited total storage.
  • Individual objects, however, can range in size from 0 bytes to 5 TB.
  • The largest object that can be uploaded in a single PUT is 5 GB.
  • For objects larger than 100 MB, customers should consider using the Multipart Upload capability.
  • For further details refer - https://aws.amazon.com/s3/faqs/
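The bucket-naming rules above can be sanity-checked locally with a short sketch (illustrative only - `is_valid_bucket_name` is a hypothetical helper, and S3 enforces a few additional rules not shown here, such as each dot-separated label having to start and end with a letter or number):

```python
import re

def is_valid_bucket_name(name):
    # 3-63 characters, only lowercase letters, numbers, periods, and dashes
    if not 3 <= len(name) <= 63:
        return False
    if not re.fullmatch(r"[a-z0-9.-]+", name):
        return False
    # Must not be formatted as an IP address, e.g. 192.168.5.4
    if re.fullmatch(r"(\d{1,3}\.){3}\d{1,3}", name):
        return False
    return True

print(is_valid_bucket_name("my-bucket"))    # True
print(is_valid_bucket_name("192.168.5.4"))  # False - looks like an IP address
```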

Glacier

  • There is no maximum limit to the total amount of data that can be stored in Amazon Glacier. 
  • Individual archives are limited to a maximum size of 40 terabytes.
  • For more details refer - https://aws.amazon.com/glacier/faqs/

Redshift


AWS EC2

VPC

Route 53



CloudWatch

CloudFormation

Lambda

DynamoDB

  • There is an initial limit of 256 tables per region. You can raise a request to increase this limit.
  • You can define a maximum of 5 local secondary indexes and 5 global secondary indexes per table (hard limit) - 10 secondary indexes in total.
  • The maximum size of an item collection is 10 GB.
  • The minimum amount of reserved capacity that can be purchased is 100 capacity units.
  • The maximum item size in DynamoDB is 400 KB, which includes both the attribute names (UTF-8 binary length) and the attribute values (also binary length); attribute names count towards the size limit. There is no limit on the number of items in a table.
  • For more details refer - https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Limits.html
  • A BatchGetItem single operation can retrieve up to 16 MB of data, which can contain as many as 100 items
  • For more details refer - https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchGetItem.html
  • A single Scan operation reads up to the maximum number of items set (if using the Limit parameter) or a maximum of 1 MB of data, and then applies any filtering to the results using FilterExpression.
  • For more details refer - https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_Scan.html
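Since BatchGetItem accepts at most 100 items per call, a client fetching more keys has to split them into batches. A minimal sketch of that split (`batch_key_chunks` is a hypothetical helper name; in practice the 16 MB per-request limit would also need to be respected):

```python
def batch_key_chunks(keys, max_per_batch=100):
    # Split keys into chunks of at most 100, the BatchGetItem item limit
    return [keys[i:i + max_per_batch] for i in range(0, len(keys), max_per_batch)]

chunks = batch_key_chunks(list(range(250)))
print([len(c) for c in chunks])  # [100, 100, 50] -> three BatchGetItem calls
```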

SQS

  • You can create any number of message queues.
  • Maximum configuration: 14 days message retention and 12 hours visibility timeout.
  • Default configuration: 4 days message retention and 30 seconds visibility timeout.
  • A single request can contain 1 to 10 messages, up to a maximum payload of 256 KB.
  • Each 64 KB chunk of the payload is billed as one request, so a single API call with a 256 KB payload is billed as 4 requests.
  • To configure the maximum message size, use the console or the SetQueueAttributes method to set the MaximumMessageSize attribute. This attribute specifies the limit on bytes that an Amazon SQS message can contain. Set this limit to a value between 1,024 bytes (1 KB) and 262,144 bytes (256 KB).
  • For more details refer - https://aws.amazon.com/sqs/faqs/
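The 64 KB billing rule above is easy to verify with a little arithmetic (a sketch; `billed_requests` is a hypothetical helper name, not an SQS API):

```python
import math

def billed_requests(payload_bytes, chunk_size=64 * 1024):
    # Every started 64 KB chunk of the payload is billed as one request
    return max(1, math.ceil(payload_bytes / chunk_size))

print(billed_requests(256 * 1024))  # 4 - a full 256 KB payload
print(billed_requests(1024))        # 1 - a tiny 1 KB message
```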

SNS

  • By default, SNS offers 10 million subscriptions per topic and 100,000 topics per account. To request a higher limit, contact AWS Support.
  • Topic names are limited to 256 characters.
  • The SNS subscription confirmation period is 3 days.

SWF



Again, as mentioned before, this is not an exhaustive list but merely a summary of what I thought would be best to revise before taking the associate exams. Let me know if you think something else should be added here for the benefit of everyone.


Since you have taken the time to go through the limits, here is a bonus question for you :)

Question: You receive a call from a potential client who explains that one of the many services they offer is a website running on a t2.micro EC2 instance where users can submit requests for customized e-cards to be sent to their friends and family. The e-card website administrator was on a cruise and was shocked when he returned to the office in mid-January to find hundreds of angry emails complaining that customers' loved ones had not received their Christmas cards. He also had several emails from CloudWatch alerting him that the SQS queue for the e-card application had grown to over 500 messages on December 25th. You investigate and find that the problem was caused by a crashed EC2 instance which serves as an application server. What do you advise your client to do first? Choose the correct answer from the options below

Options:
  1. Use an autoscaling group to create as many application servers as needed to access all of the Christmas card SQS messages.
  2. Reboot the application server immediately so that it begins processing the Christmas cards SQS messages.
  3. Redeploy the application server as a larger instance type so that it processes the Christmas card SQS messages faster.
  4. Send an apology to the customers notifying them that their cards will not be delivered.

Answer:
4. Send an apology to the customers notifying them that their cards will not be delivered.

Explanation:
The 500-message count was as of December 25th, and the e-card website administrator returned in mid-January, so more than 14 days, the maximum retention period for SQS messages, had passed. The messages were deleted automatically, and the cards can no longer be sent.
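The date arithmetic can be checked quickly (taking January 15th as an assumed stand-in for "mid-January"):

```python
from datetime import date

queue_peak = date(2017, 12, 25)   # CloudWatch alert: queue at 500+ messages
admin_back = date(2018, 1, 15)    # assumed "mid-January" return date

elapsed = (admin_back - queue_peak).days
print(elapsed)       # 21 - well past the 14-day maximum SQS retention
print(elapsed > 14)  # True - the messages were already gone
```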

To be honest, I selected option 1 on my first attempt :)

