The correct answer is B because Amazon SageMaker JumpStart provides pre-built solutions, including training workflows for popular open-source LLMs such as Falcon, Llama, and others. It allows practitioners to launch fine-tuning jobs quickly using predefined templates, minimizing operational setup and code complexity.
From AWS documentation:
"Amazon SageMaker JumpStart enables you to fine-tune and deploy foundation models with minimal setup. It provides easy-to-use interfaces and pre-built configurations for training, which significantly reduces the operational overhead required to train models."
Explanation of other options:
A. PartyRock (an Amazon Bedrock playground) is designed for prototyping generative AI apps but does not support model training or fine-tuning.
C. Writing a custom script for SageMaker training is flexible but involves more operational effort, including authoring and maintaining the training script and configuring the framework, instances, and other infrastructure yourself (see the sketch after this list).
D. Training on EC2 via a Jupyter notebook is fully manual and operationally intensive, including dependency setup, data handling, and resource scaling.
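For contrast, a minimal sketch of what option C typically involves using SageMaker script mode; the script name, role ARN, instance type, and hyperparameters are placeholder assumptions:

```python
# Illustrative sketch of option C (script mode): you write and maintain
# train.py and configure the framework, instances, and hyperparameters yourself.
from sagemaker.pytorch import PyTorch

estimator = PyTorch(
    entry_point="train.py",     # custom training script you author and maintain
    source_dir="src",           # your code and dependency files
    role="arn:aws:iam::111122223333:role/SageMakerExecutionRole",  # placeholder
    framework_version="2.1",
    py_version="py310",
    instance_type="ml.g5.12xlarge",
    instance_count=1,
    hyperparameters={"epochs": 1, "base_model": "tiiuae/falcon-7b"},
)
estimator.fit({"training": "s3://example-bucket/fine-tuning-data/"})
```

Even this managed approach still requires you to supply the training code and dependencies; fully self-managed training on EC2 (option D) adds instance provisioning, environment setup, and scaling on top of that.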
Referenced AWS AI/ML Documents and Study Guides:
Amazon SageMaker JumpStart Developer Guide – Fine-tuning Foundation Models
AWS Certified Machine Learning Specialty Guide – Model Customization and JumpStart