Quiz 12: Amazon Machine Learning and SageMaker
Question 1:
What limit, if any, is there to the size of your training dataset in Amazon Machine Learning by default?
- 1TB
- 100GB
- 50GB
- No limit
By default, Amazon ML is limited to 100GB of training data. You can file a support ticket to get this increased, but Amazon ML cannot handle terabyte-scale data.
Question 2:
Is there a limit to the size of the dataset that you can use for training models with Amazon SageMaker? If so, what is the limit?
- 100GB
- No fixed limit
- 1TB
- 50GB
There is no fixed limit on the size of the dataset you can use to train models with Amazon SageMaker.
Question 3:
The audit team of an organization needs a history of Amazon SageMaker API calls made on their account for security analysis and operational troubleshooting purposes. Which of the following services helps in this regard?
- CloudTrail
- CloudWatch
- CloudFormation
- SageMaker Logs
SageMaker activity is recorded in both CloudTrail and CloudWatch, but CloudTrail is the service specifically designed to capture API calls for auditing purposes.
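To make the audit workflow concrete, here is a minimal sketch of pulling recent SageMaker API calls from CloudTrail with boto3; the region and result limit are illustrative placeholders, not part of the question.

```python
# Hedged sketch: list recent SageMaker API calls recorded by CloudTrail.
# Region and MaxResults are illustrative choices, not part of the quiz.
import boto3

cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")

# Filter management events by the SageMaker service endpoint
response = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "EventSource", "AttributeValue": "sagemaker.amazonaws.com"}
    ],
    MaxResults=50,
)

for event in response["Events"]:
    print(event["EventTime"], event["EventName"], event.get("Username", "-"))
```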
Question 4:
Which of the following is a new Amazon SageMaker capability that enables machine learning models to train once and run anywhere in the cloud and at the edge?
- SageMaker Neo
- SageMaker Search
- Batch Transform
- Jupyter Notebooks
SageMaker Neo lets you train a model once and then compile it to run optimally on a range of hardware platforms, both in the cloud and at the edge.
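As a rough illustration of the "train once, run anywhere" idea, the sketch below starts a Neo compilation job with boto3; the job name, role ARN, S3 paths, framework, input shape, and target device are all hypothetical placeholders.

```python
# Hedged sketch: compile an already-trained model with SageMaker Neo.
# All names, ARNs, S3 paths and the target device are placeholders.
import boto3

sm = boto3.client("sagemaker")

sm.create_compilation_job(
    CompilationJobName="demo-neo-compilation",
    RoleArn="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
    InputConfig={
        "S3Uri": "s3://example-bucket/models/model.tar.gz",
        "DataInputConfig": '{"data": [1, 13]}',   # expected input shape
        "Framework": "XGBOOST",
    },
    OutputConfig={
        "S3OutputLocation": "s3://example-bucket/compiled/",
        "TargetDevice": "ml_c5",                  # could also be an edge device target
    },
    StoppingCondition={"MaxRuntimeInSeconds": 900},
)
```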
Question 5:
A Python developer is planning to develop a machine learning model to predict real estate prices using a Jupyter notebook, and to train and deploy this model in a highly available and scalable manner. The developer wants to avoid worrying about provisioning sufficient capacity for this model. Which of the following services is best suited for this?
- Apache Spark
- Amazon Machine Learning
- Amazon EMR
- Amazon SageMaker
SageMaker is the only option listed that is both fully managed and scalable while being built around Jupyter notebooks.
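To make the SageMaker workflow concrete, here is a minimal sketch of training and deploying a regression model from a notebook, assuming the built-in XGBoost container; the bucket, S3 paths, instance types, and hyperparameters are illustrative placeholders rather than a reference solution.

```python
# Hedged sketch: train a house-price regressor and deploy it to a managed endpoint.
# Bucket names, paths, instance types and hyperparameters are placeholders.
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.inputs import TrainingInput

session = sagemaker.Session()
role = sagemaker.get_execution_role()   # IAM role attached to the notebook

# Built-in XGBoost container image for regression
image_uri = sagemaker.image_uris.retrieve(
    framework="xgboost", region=session.boto_region_name, version="1.5-1"
)

estimator = Estimator(
    image_uri=image_uri,
    role=role,
    instance_count=1,
    instance_type="ml.m5.xlarge",                 # SageMaker provisions this for you
    output_path="s3://example-bucket/models/",
    sagemaker_session=session,
)
estimator.set_hyperparameters(objective="reg:squarederror", num_round=100)

# SageMaker spins up the training cluster, runs the job, then tears it down
estimator.fit({"train": TrainingInput("s3://example-bucket/housing/train.csv",
                                      content_type="text/csv")})

# Deploy to a managed, scalable HTTPS endpoint
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```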