From 014562a92dc207f9b2bcd47b46ecb6759ab8e4fe Mon Sep 17 00:00:00 2001
From: tianwei
Date: Tue, 5 Dec 2023 19:29:26 +0800
Subject: [PATCH] update starwhale intro and quickstart docs (#51)

---
 docs/getting-started/index.md             | 18 +++-
 docs/getting-started/server.md            |  8 ++-
 docs/getting-started/standalone.md        |  8 +++
 docs/swcli/installation.md                |  6 ++
 docs/what-is-starwhale.md                 | 66 +++++++++++--------
 .../current/getting-started/index.md      | 16 +++-
 .../current/getting-started/server.md     |  8 ++-
 .../current/getting-started/standalone.md | 10 ++-
 .../current/swcli/installation.md         |  6 ++
 .../current/what-is-starwhale.md          | 66 +++++++++++--------
 10 files changed, 140 insertions(+), 72 deletions(-)

diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md
index a73287714..eb9699233 100644
--- a/docs/getting-started/index.md
+++ b/docs/getting-started/index.md
@@ -2,13 +2,21 @@
 title: Getting started
 ---
 
-First, you need to install the [Starwhale Client (swcli)](../swcli), which can be done by running the following command:
+Each deployment of Starwhale is called an instance. All instances can be managed by the Starwhale Client (swcli).
 
-```bash
-python3 -m pip install starwhale
-```
+You can start using Starwhale with one of the following instance types:
 
-For more information, see the [swcli installation guide](../swcli/installation).
+* **Starwhale Standalone** - Rather than a running service, Starwhale Standalone is actually a repository that resides in your local file system. It is created and managed by the Starwhale Client (swcli). You only need to install swcli to use it. Currently, each user on a single machine can have only ONE Starwhale Standalone instance. We recommend you use Starwhale Standalone to build and test your datasets, runtimes, and models before pushing them to Starwhale Server/Cloud instances.
+* **Starwhale Server** - Starwhale Server is a service deployed on your local server. Besides the text-only results from the Starwhale Client (swcli), Starwhale Server provides a Web UI for you to manage your datasets and models, evaluate your models in your local Kubernetes cluster, and review the evaluation results.
+* **Starwhale Cloud** - Starwhale Cloud is a managed service hosted on public clouds. By registering an account on , you are ready to use Starwhale without needing to install, operate, and maintain your own instances. Starwhale Cloud also provides public resources for you to download, like datasets, runtimes, and models. Check the "starwhale/public" project on Starwhale Cloud for more details.
+
+When choosing which instance type to use, consider the following:
+
+| Instance Type | Deployment location | Maintained by | User Interface | Scalability |
+| ------------- | ------------- | ------------- | ------------- | ------------- |
+| Starwhale Standalone | Your laptop or any server in your data center | Not required | Command line | Not scalable |
+| Starwhale Server | Your data center | Yourself | Web UI and command line | Scalable, depends on your Kubernetes cluster |
+| Starwhale Cloud | Public cloud, like AWS or Aliyun | the Starwhale Team | Web UI and command line | Scalable, but currently limited by the freely available resources on the cloud |
 
 Depending on your instance type, there are three getting-started guides available for you:
 
diff --git a/docs/getting-started/server.md b/docs/getting-started/server.md
index 389658279..f759daa6e 100644
--- a/docs/getting-started/server.md
+++ b/docs/getting-started/server.md
@@ -2,9 +2,13 @@
 title: Getting started with Starwhale Server
 ---
 
-## Install Starwhale Server
+## Start Starwhale Server
 
-To install Starwhale Server, see the [installation guide](../server/installation/index.md).
+```bash
+swcli server start
+```
+
+For detailed information, see the [installation guide](../server/installation/index.md).
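Taken together, the install and start commands shown in these pages form a minimal first session. The sketch below assumes a POSIX shell; the virtual-environment path is illustrative, and the isolated environment follows the installation guide's advice against installing Starwhale into the global Python environment:

```bash
# Create and activate an isolated Python environment
# (illustrative path; any venv or conda environment works)
python3 -m venv ~/starwhale-venv
. ~/starwhale-venv/bin/activate

# Install the Starwhale Client
python3 -m pip install starwhale

# Start a local Starwhale Server
swcli server start
```

The virtual environment keeps Starwhale's dependencies away from the system Python; leave it later with `deactivate`.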
 ## Create your first project
 
diff --git a/docs/getting-started/standalone.md b/docs/getting-started/standalone.md
index 5c788868f..b90d6898f 100644
--- a/docs/getting-started/standalone.md
+++ b/docs/getting-started/standalone.md
@@ -6,6 +6,14 @@ When the [Starwhale Client (swcli)](../swcli/) is installed, you are ready to us
 We also provide a Jupyter Notebook example, you can try it in [Google Colab](https://colab.research.google.com/github/star-whale/starwhale/blob/main/example/notebooks/quickstart-standalone.ipynb) or in your local [vscode/jupyterlab](https://github.com/star-whale/starwhale/blob/main/example/notebooks/quickstart-standalone.ipynb).
 
+## Installing Starwhale Client
+
+```bash
+python3 -m pip install starwhale
+```
+
+For detailed information, see the [Starwhale Client Installation Guide](../swcli/installation).
+
 ## Downloading Examples
 
 Download Starwhale examples by cloning the Starwhale project via:
 
diff --git a/docs/swcli/installation.md b/docs/swcli/installation.md
index 4446530f8..9773dbcd2 100644
--- a/docs/swcli/installation.md
+++ b/docs/swcli/installation.md
@@ -8,6 +8,12 @@ We can use `swcli` to complete all tasks for Starwhale Instances. `swcli` is wri
 DO NOT install Starwhale in your system's global Python environment. It will cause a python dependency conflict problem.
 :::
 
+## Quick install
+
+```bash
+python3 -m pip install starwhale
+```
+
 ## Prerequisites
 
 * Python 3.7 ~ 3.11
 
diff --git a/docs/what-is-starwhale.md b/docs/what-is-starwhale.md
index 0a0cf1b33..e96b69687 100644
--- a/docs/what-is-starwhale.md
+++ b/docs/what-is-starwhale.md
@@ -5,31 +5,41 @@ title: What is Starwhale
 ## Overview
 
-Starwhale is an MLOps/LLMOps platform that make your model creation, evaluation and publication much easier. It aims to create a handy tool for data scientists and machine learning engineers.
-
-Starwhale helps you:
-
-* Keep track of your training/testing dataset history including data items and their labels, so that you can easily access them.
-* Manage your model packages that you can share across your team.
-* Run your models in different environments, either on a Nvidia GPU server or on an embedded device like Cherry Pi.
-* Create a online service with interactive Web UI for your models.
-
-Starwhale is designed to be an open platform. You can create your own plugins to meet your requirements.
-
-## Deployment options
-
-Each deployment of Starwhale is called an instance. All instances can be managed by the Starwhale Client (swcli).
-
-You can start using Starwhale with one of the following instance types:
-
-* **Starwhale Standalone** - Rather than a running service, Starwhale Standalone is actually a repository that resides in your local file system. It is created and managed by the Starwhale Client (swcli). You only need to install swcli to use it. Currently, each user on a single machine can have only ONE Starwhale Standalone instance. We recommend you use the Starwhale Standalone to build and test your datasets, runtime, and models before pushing them to Starwhale Server/Cloud instances.
-* **Starwhale Server** - Starwhale Server is a service deployed on your local server. Besides text-only results from the Starwhale Client (swcli), Starwhale Server provides Web UI for you to manage your datasets and models, evaluate your models in your local Kubernetes cluster, and review the evaluation results.
-* **Starwhale Cloud** - Starwhale Cloud is a managed service hosted on public clouds. By registering an account on , you are ready to use Starwhale without needing to install, operate, and maintain your own instances. Starwhale Cloud also provides public resources for you to download, like datasets, runtimes, and models. Check the "starwhale/public" project on Starwhale Cloud for more details.
-
-When choosing which instance type to use, consider the following:
-
-| Instance Type | Deployment location | Maintained by | User Interface | Scalability |
-| ------------- | ------------- | ------------- | ------------- | ------------- |
-| Starwhale Standalone | Your laptop or any server in your data center | Not required | Command line | Not scalable |
-| Starwhale Server | Your data center | Yourself | Web UI and command line | Scalable, depends on your Kubernetes cluster |
-| Starwhale Cloud | Public cloud, like AWS or Aliyun | the Starwhale Team |Web UI and command line | Scalable, but currently limited by the freely available resource on the cloud |
+Starwhale is an MLOps/LLMOps platform that provides R&D operation management capabilities for machine learning projects. It establishes standardized processes for model development, testing, deployment, and operation, connecting business, AI, and operations teams, and it addresses common problems in the machine learning process, such as long model iteration cycles, difficult team collaboration, and wasted human resources. Starwhale offers three instance types (Standalone, Server, and Cloud) to cover single-machine development, private cluster deployment, and a multi-cloud service hosted by the Starwhale team.
+
+Starwhale is also an [open source platform](https://github.com/star-whale/starwhale), released under the [Apache-2.0 license](https://github.com/star-whale/starwhale/blob/main/LICENSE).
+
+![products](https://starwhale-examples.oss-cn-beijing.aliyuncs.com/docs/products.png)
+
+* Fundamentals:
+  * [Starwhale Model](model/index): Starwhale Model is a standard package format for machine learning models, which can be used for various purposes, such as model fine-tuning, model evaluation, and online services. A Starwhale Model package includes the model files, inference code, configuration files, etc.
+  * [Starwhale Dataset](dataset/index): Starwhale Dataset enables efficient data storage, data loading, and data visualization, making it a data management tool for the ML/DL field.
+  * [Starwhale Runtime](runtime/index): Starwhale Runtime provides a reproducible and shareable runtime environment for running Python programs. With Starwhale Runtime, you can easily share it with others and use it on Starwhale Server and Starwhale Cloud instances.
+* Model Evaluation:
+  * [Model Evaluation](evaluation/index): Starwhale Model Evaluation allows users to implement complex, production-level, distributed model evaluation tasks with minimal Python code using the SDK.
+  * Live Demo: Evaluate models online through a Web UI.
+  * Reports: Create shareable evaluation reports that automatically integrate evaluation data.
+  * Tables: Provide multi-dimensional comparisons and displays of model evaluation results, with support for multimedia data such as images, audio, and video. The tables can present all the data and artifacts recorded during the evaluation process with the Starwhale Python SDK.
+* LLM Fine-tuning: Provide a full toolchain for LLM fine-tuning, including model fine-tuning, batch evaluation comparison, online evaluation comparison, and model publishing.
+* Deployment Instances:
+  * Starwhale Standalone: Deployed in a local development environment and managed by the `swcli` command-line tool, meeting development and debugging needs.
+  * Starwhale Server: Deployed in a private data center, relying on a Kubernetes cluster, providing centralized, web-based, and secure services.
+  * Starwhale Cloud: Hosted on a public cloud, with the access address . The Starwhale team is responsible for maintenance, and no installation is required. You can start using it after registering an account.
+
+## Typical Use Cases
+
+* **Dataset Management**: With the Starwhale Dataset Python SDK, you can easily import, create, distribute, and load datasets while achieving fine-grained version control and visualization.
+* **Model Management**: With a simple packaging mechanism, you can generate Starwhale Model packages that include models, configuration files, and code. Starwhale provides efficient distribution, version management, a Model Registry, and visualization for these packages, making their daily management more straightforward.
+* **Machine Learning Runtime Sharing**: By exporting the development environment or writing a simple YAML file, you can reproduce the environment in other instances, achieving a stable and consistent runtime. Starwhale Runtime abstracts away some underlying dependencies, so users don't need to master Dockerfile writing or CUDA installation, making it easy to define an environment that meets the requirements of machine learning programs.
+* **Model Evaluation**: With the Starwhale Evaluation Python SDK, you can implement efficient, large-scale, multi-dataset, and multi-stage model evaluations in a distributed cluster environment with minimal code, record the data and artifacts generated during the evaluation process in Starwhale Tables, and visualize them in various ways.
+* **Online Evaluation**: Quickly create interactive Web UI online services for Starwhale models to perform rapid testing.
+* **Model Fine-tuning**: Provide a complete toolchain for fine-tuning large language models (LLMs), making the model fine-tuning process faster and more quantifiable.
+
+Starwhale is an open platform: its features can be used individually or in combination. The core goal is to provide a convenient tool that improves the work efficiency of data scientists and machine learning engineers.
+
+## Start Your Starwhale Journey
+
+* Complete the [installation of Starwhale Client](swcli/installation) and the [launch of Starwhale Server](server/installation/server-start) within 5-10 minutes.
+* Follow the [Starwhale Standalone Getting Started Guide](getting-started/standalone) to build, evaluate, and visualize the helloworld example's model, dataset, and runtime in your local environment.
+* Refer to the [Starwhale Server Getting Started Guide](getting-started/server) to run the helloworld example's model evaluation in Starwhale Server.
+* Read the [User Guide](swcli) and [Reference Guide](reference/swcli), and refer to the [examples](https://github.com/star-whale/starwhale/tree/main/example) to explore more features and functionalities.

diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/index.md b/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/index.md
index 5b5bdaddf..97d4db90d 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/index.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/index.md
@@ -2,13 +2,19 @@
 title: 入门指南
 ---
 
-首先,您需要安装[Starwhale Client(swcli)](../swcli),可以运行如下命令:
+Starwhale 的每个部署称为一个实例。所有实例都可以通过 Starwhale Client(swcli)进行管理。您可以任选以下实例类型之一开始使用:
 
-```bash
-python3 -m pip install starwhale
-```
+* **Starwhale Standalone** - Starwhale Standalone 本质上是一套存储在本地文件系统中的数据库。它由 Starwhale Client(swcli)创建和管理。您只需安装 `swcli` 即可使用。目前,一台机器上的每个用户只能拥有一个 Starwhale Standalone 实例。我们建议您使用 Starwhale Standalone 来构建和测试您的数据集和模型,然后再将它们推送到 Starwhale Server/Cloud 实例。
+* **Starwhale Server** - Starwhale Server 是部署在您本地服务器上的服务。除了 Starwhale Client(swcli)的文本交互界面,Starwhale Server 还提供 Web UI 供您管理数据集和模型,以及在 Kubernetes 集群中运行模型并查看运行结果。
+* **Starwhale Cloud** - Starwhale Cloud 是托管在公共云上的服务。通过在注册一个账号,您就可以使用 Starwhale,而无需安装、运行和维护您自己的实例。Starwhale Cloud 还提供公共资源供您下载,例如一些流行的开源数据集、模型和运行时。查看 Starwhale Cloud 实例上的“starwhale/public”项目以获取更多详细信息。
 
-更多详细信息请参阅[swcli安装指南](../swcli/installation)。
+在您决定要使用的实例类型时,请考虑以下因素:
+
+| 实例类型 | 部署位置 | 维护者 | 用户界面 | 可扩展性 |
+| -------------- | -------------- | -------------- | -------------- | -------------- |
+| Starwhale Standalone | 您的笔记本电脑或本地服务器 | 不需要 | 命令行 | 不可扩展 |
+| Starwhale Server | 您的数据中心 | 您自己 | Web UI和命令行 | 可扩展,取决于您的 Kubernetes 集群 |
+| Starwhale Cloud | 公共云,如AWS或阿里云 | Starwhale团队 | Web UI和命令行 | 可扩展,但目前受到云上免费可用资源的限制 |
 
 根据您使用的实例类型,您可以参考以下三个入门指南:
 
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/server.md b/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/server.md
index 223ca5c94..7fbaafec9 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/server.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/server.md
@@ -2,9 +2,13 @@
 title: Starwhale Server入门指南
 ---
 
-## 安装Starwhale Server
+## 启动 Starwhale Server
 
-安装 Starwhale Server,参见[安装指南](../server/installation/index.md)。
+```bash
+swcli server start
+```
+
+更多的 Starwhale Server 安装和启动信息,请参见[安装指南](../server/installation/index.md)。
 
 ## 创建您的第一个项目
 
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/standalone.md b/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/standalone.md
index 5d184e05b..d92bf79d5 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/standalone.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/getting-started/standalone.md
@@ -2,10 +2,16 @@
 title: Starwhale Standalone入门指南
 ---
 
-当[Starwhale Client(swcli)](../swcli/)安装完成后,您就可以使用Starwhale Standalone。
-
 我们也提供对应的Jupyter Notebook例子,可以在 [Google Colab](https://colab.research.google.com/github/star-whale/starwhale/blob/main/example/notebooks/quickstart-standalone.ipynb) 或本地的 [vscode/jupyterlab](https://github.com/star-whale/starwhale/blob/main/example/notebooks/quickstart-standalone.ipynb) 中试用。
 
+## 安装 Starwhale Client
+
+```bash
+python3 -m pip install starwhale
+```
+
+详细文档参见 [Starwhale Client 安装指南](../swcli/installation)。
+
 ## 下载例子
 
 通过以下方式克隆Starwhale项目来下载Starwhale示例:
 
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/swcli/installation.md b/i18n/zh/docusaurus-plugin-content-docs/current/swcli/installation.md
index 880b88860..6d0c95c24 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/swcli/installation.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/swcli/installation.md
@@ -8,6 +8,12 @@ title: 安装指南
 非常不建议将 Starwhale 安装在系统的全局 Python 环境中,可能会导致 Python 的依赖冲突问题。使用 venv 或 conda 创建一个隔离的 Python 环境,并在其中安装 Starwhale,是 Python 推荐的做法。
 :::
 
+## 快速安装
+
+```bash
+python3 -m pip install starwhale
+```
+
 ## 先决条件
 
 * Python3.7 ~ 3.11
 
diff --git a/i18n/zh/docusaurus-plugin-content-docs/current/what-is-starwhale.md b/i18n/zh/docusaurus-plugin-content-docs/current/what-is-starwhale.md
index 011f37b9c..49c41df85 100644
--- a/i18n/zh/docusaurus-plugin-content-docs/current/what-is-starwhale.md
+++ b/i18n/zh/docusaurus-plugin-content-docs/current/what-is-starwhale.md
@@ -5,31 +5,41 @@ title: 什么是Starwhale
 ## 概述
 
-Starwhale是一个 MLOps/LLMOps平台,能够让您的模型创建、评估和发布流程变得更加轻松。它旨在为数据科学家和机器学习工程师创建一个方便的工具。
-
-Starwhale能够帮助您:
-
-* 跟踪您的训练/测试数据集历史记录,包括所有数据项及其相关标签,以便您轻松访问它们。
-* 管理您可以在团队中共享的模型包。
-* 在不同的环境中运行您的模型,无论是在 Nvidia GPU服务器上还是在嵌入式设备(如 Cherry Pi)上。
-* 为您的模型快速创建配备交互式 Web UI的在线服务。
-
-同时,Starwhale 是一个开放的平台,您可以创建插件来满足自己的需求。
-
-## 部署选项
-
-Starwhale的每个部署称为一个实例。所有实例都可以通过Starwhale Client(swcli)进行管理。
-
-您可以任选以下实例类型之一开始使用:
-
-* **Starwhale Standalone** - Starwhale Standalone 本质上是一套存储在本地文件系统中的数据库。它由 Starwhale Client(swcli)创建和管理。您只需安装 `swcli` 即可使用。目前,一台机器上的每个用户只能拥有一个Starwhale Standalone 实例。我们建议您使用 Starwhale Standalone 来构建和测试您的数据集和模型,然后再将它们推送到 Starwhale Server/Cloud 实例。
-* **Starwhale Server** - Starwhale Server 是部署在您本地服务器上的服务。除了 Starwhale Client(swcli)的文本交互界面,Starwhale Server还提供 Web UI供您管理数据集和模型,以及在Kubernetes集群中运行模型并查看运行结果。
-* **Starwhale Cloud** - Starwhale Cloud 是托管在公共云上的服务。 通过在注册一个账号,您就可以使用Starwhale,而无需安装、运行和维护您自己的实例。 Starwhale Cloud 还提供公共资源供您下载,例如一些流行的开源集数据集、模型和运行时。查看 Starwhale Cloud 实例上的 “starwhale/public”项目以获取更多详细信息。
-
-在您决定要使用的实例类型时,请考虑以下因素:
-
-| 实例类型 | 部署位置 | 维护者 | 用户界面 | 可扩展性 |
-| -------------- | -------------- | -------------- | -------------- | -------------- |
-| Starwhale Standalone | 您的笔记本电脑或本地服务器 | 不需要 | 命令行 | 不可扩展 |
-| Starwhale Server | 您的数据中心 | 您自己 | Web UI和命令行 | 可扩展,取决于您的 Kubernetes 集群 |
-| Starwhale Cloud | 公共云,如AWS或阿里云 | Starwhale团队 | Web UI和命令行 | 可扩展,但目前受到云上免费可用资源的限制 |
+Starwhale 是一个 MLOps/LLMOps 平台,面向机器学习项目提供研发运营管理能力,建立标准化的模型开发、测试、部署和运营流程,连接业务团队、AI 团队和运营团队,解决机器学习过程中模型迭代周期长、团队协作难、人力资源浪费等问题。Starwhale 提供 Standalone、Server 和 Cloud 三种实例方式,满足单机环境开发、私有化集群部署和 Starwhale 团队托管的云服务等多种部署场景。
+
+Starwhale 同时也是一个[开源的平台](https://github.com/star-whale/starwhale),使用 [Apache-2.0 协议](https://github.com/star-whale/starwhale/blob/main/LICENSE)。
+
+![products](https://starwhale-examples.oss-cn-beijing.aliyuncs.com/docs/products.png)
+
+* 平台基础:
+  * [Starwhale Model](model/index):Starwhale 模型是一种机器学习模型的标准包格式,可用于多种用途,例如模型微调、模型评估和在线服务。Starwhale 模型包含模型文件、推理代码、配置文件等。
+  * [Starwhale Dataset](dataset/index):Starwhale 数据集提供高效的数据存储、数据加载和数据可视化能力,是一款面向 ML/DL 领域的数据管理工具。
+  * [Starwhale Runtime](runtime/index):Starwhale 运行时为运行 Python 程序提供一种可复现、可分享的运行环境。使用 Starwhale 运行时,可以非常容易地与他人分享,并且能在 Starwhale Server 和 Starwhale Cloud 实例上使用。
+* 模型评测:
+  * [Model Evaluation](evaluation/index):Starwhale 模型评测能让用户通过 SDK 编写少量 Python 代码,实现复杂的、生产级别的、分布式的模型评测任务。
+  * Live Demo:通过 Web UI 方式对模型进行在线评测。
+  * Reports:编写可分享、可自动集成评测数据的报告。
+  * Tables:提供多维度的模型评测结果对比和展示,表格中支持图片、音频和视频等多媒体数据展示,能够将评测过程中通过 Starwhale Python SDK 自由记录的数据都呈现出来。
+* LLM 微调:提供面向 LLM 的全流程模型微调工具链,包括模型微调、批量评测对比、在线评测对比和模型发布功能。
+* 部署实例:
+  * Starwhale Standalone:部署在本地开发环境中,通过 `swcli` 命令行工具进行管理,满足开发调试需求。
+  * Starwhale Server:部署在私有数据中心里,依赖 Kubernetes 集群,提供集中化的、Web 交互式的、安全的服务。
+  * Starwhale Cloud:托管在公共云上的服务,访问地址为,由 Starwhale 团队负责运维,无需安装,注册账户后即可使用。
+
+## 典型使用场景
+
+* **数据集管理**:基于 Starwhale Dataset Python SDK,可以非常容易地导入、创建、分发和加载数据集,同时实现数据集细粒度的版本控制和可视化等功能。
+* **模型管理**:通过简单的打包机制,能将模型、配置文件和代码等生成 Starwhale 模型包,提供高效分发、版本管理、Model Registry 和可视化等功能,让模型包的日常管理更简单。
+* **机器学习运行环境共享**:通过导出开发环境或编写简单的 YAML 生成 Starwhale 运行时,可以在其他实例中重现该环境,获得稳定的、一致的运行时,实现一处定义、处处运行的目标。Starwhale 运行时抽象和屏蔽了一些底层依赖,用户不需要掌握 Dockerfile 编写、CUDA 安装等知识,就能非常简单地定义出满足机器学习程序运行的环境。
+* **模型评测**:借助 Starwhale Evaluation Python SDK,只需编写少量代码,就能实现分布式集群环境下高效的、大规模的、多数据集的、多阶段的模型评测,并能将评测过程中产生的数据、制品等记录到 Starwhale Tables 中,提供多种可视化方式展示。
+* **在线评测**:为 Starwhale 模型快速创建交互式的 Web UI 在线服务,可以进行快速测试。
+* **模型微调**:针对大语言模型(LLM)的微调,提供一套完整的工具链,让模型微调过程变得快速且可量化。
+
+Starwhale 是一个开放的平台,工作中可以只使用其中某些功能,也可以组合使用,核心目标是为数据科学家和机器学习工程师提供一个方便的工具,提升工作效率。
+
+## 开始 Starwhale 之旅
+
+* 5-10 分钟内完成 [Starwhale Client 的安装](swcli/installation)和 [Starwhale Server 的启动](server/installation/server-start)。
+* 参考 [Starwhale Standalone 入门指南](getting-started/standalone),在本地构建 helloworld 例子的模型、数据集和运行时,完成对 MNIST 数据集的模型效果评估。
+* 参考 [Starwhale Server 入门指南](getting-started/server),在 Starwhale Server 中运行 helloworld 例子的模型评测。
+* 阅读[用户指南](swcli)和[参考指南](reference/swcli),并参照[例子](https://github.com/star-whale/starwhale/tree/main/example),制作自己的数据集、运行时和模型包,进行模型评测等任务。
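The getting-started guides above build on the bundled examples. A shallow clone is one way to fetch them without downloading the full history; the `--depth 1` flag is an optional optimization, and the `example/` path follows the example links in these docs:

```bash
# Fetch only the latest revision of the Starwhale repository
git clone --depth 1 https://github.com/star-whale/starwhale.git
# The examples referenced by the quickstart guides live under example/
cd starwhale/example
ls
```

From here, each example directory can be built and evaluated with the swcli commands covered in the getting-started guides.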