diff --git a/cloud/PrivateLink-create-a-connection.md b/cloud/PrivateLink-create-a-connection.md new file mode 100644 index 000000000..9a68afbc6 --- /dev/null +++ b/cloud/PrivateLink-create-a-connection.md @@ -0,0 +1,68 @@ +--- +id: PrivateLink-create-a-connection +title: Create a PrivateLink connection +description: Create a PrivateLink connection. +slug: /create-a-connection +--- + +Follow the steps below to create a PrivateLink connection between RisingWave Cloud and your VPC. + +## Prerequisites + +- You need to create a cluster with the Pro plan or Enterprise plan in RisingWave Cloud: + + - See [Choose a cluster plan](/cluster-choose-a-cluster-plan.md) for more information. Please note that Developer clusters do not support PrivateLink connections. + + - The VPC you want to connect to and your cluster must be in the same region. If your preferred region is not available when creating a cluster, contact our [support team](mailto:cloud-support@risingwave-labs.com) or [sales team](mailto:sales@risingwave-labs.com). + +- You need to set up a PrivateLink service in your VPC and make sure it runs properly. The following links might be helpful: + + - For AWS, see [Share your services through AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/privatelink-share-your-services.html). + - For GCP, see [GCP Published services](https://cloud.google.com/vpc/docs/about-vpc-hosted-services). + - For Azure, see [Azure Private Link services](https://learn.microsoft.com/en-us/azure/private-link/private-link-service-overview). + + :::note + Azure Private Link integration is currently in development and will be available soon. + ::: + +## Steps + +1. Go to the [**PrivateLink**](https://cloud.risingwave.com/connection/) page and click **Create PrivateLink**. + +2. For **Platform**, select your cloud service provider. Currently, RisingWave Cloud supports **AWS** PrivateLink and **GCP** Private Service Connect. + +3. 
For **Cluster**, select the cluster you want to connect the VPC to. Ensure that the VPC and the cluster are in the same region. + +4. For **Name**, enter a descriptive name for the connection. + +5. For **Endpoint service name** or **Service attachment**: + +
If you choose AWS as the platform, enter the service name of the endpoint service. + + You can find it in the [Amazon VPC console](https://console.aws.amazon.com/vpc/) → **Endpoint services** → **Service name** section. + + AWS endpoint service name + +
+ +
If you choose GCP as the platform, enter the server target URL of the service attachment. + + You can find it in the [Google Cloud Console](https://console.cloud.google.com/) → **Network services** → **Private Service Connect**. + + GCP Service attachment + +
+ +6. Click **Confirm** to create the connection. + +## What's next + +Now, you can create a source or sink with the PrivateLink connection using SQL. + +For details on how to use the VPC endpoint to create a source with the PrivateLink connection, see [Create source with PrivateLink connection](/docs/current/ingest-from-kafka/#create-source-with-privatelink-connection); for creating a sink, see [Create sink with PrivateLink connection](/docs/current/create-sink-kafka/#create-sink-with-privatelink-connection). diff --git a/cloud/PrivateLink-drop-a-connection.md b/cloud/PrivateLink-drop-a-connection.md new file mode 100644 index 000000000..cdfac2da2 --- /dev/null +++ b/cloud/PrivateLink-drop-a-connection.md @@ -0,0 +1,12 @@ +--- +id: PrivateLink-drop-a-connection +title: Drop a PrivateLink connection +description: If you no longer need a PrivateLink connection, you can drop it. +slug: /drop-a-connection +--- + +Follow the steps below to drop a PrivateLink connection when you no longer need it. + +1. Go to the [**PrivateLink**](https://cloud.risingwave.com/connection/) page. + +2. Hover over the connection you want to drop and click the delete button, then confirm the deletion. diff --git a/cloud/PrivateLink-overview.md b/cloud/PrivateLink-overview.md new file mode 100644 index 000000000..8fdcff189 --- /dev/null +++ b/cloud/PrivateLink-overview.md @@ -0,0 +1,63 @@ +--- +id: PrivateLink-overview +title: PrivateLink connection +description: Manage PrivateLink connections. +slug: /PrivateLink-overview +--- + +## Concept + +In RisingWave Cloud, you can use the PrivateLink service to establish a private and secure connection between your RisingWave clusters and the services inside your own Virtual Private Cloud (VPC) network in the same region. 
+ +RisingWave Cloud utilizes the private connection capability of the underlying cloud vendors to establish the PrivateLink connection. In particular, the PrivateLink service is built on top of the following services: + +- [AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/what-is-privatelink.html) +- [GCP Private Service Connect](https://cloud.google.com/vpc/docs/private-service-connect) +- [Azure Private Link](https://learn.microsoft.com/en-us/azure/private-link/) + +:::note +Azure Private Link integration is currently in development and will be available soon. +::: + +The diagram below depicts a high-level overview of how the PrivateLink service works. All three platforms share the same network structure, so you can configure them in the same way. + + + +On the **RisingWave Cloud** side, RisingWave Cloud creates an endpoint (specifically, an AWS VPC endpoint, GCP Private Service Connect endpoint, or Azure private endpoint) and binds it to one running RisingWave cluster. + +On the **Customer** side, you need to set up a PrivateLink service (specifically, an AWS endpoint service, GCP published service, or Azure Private Link service) in your VPC network first. 
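+ +Once both sides are set up, the connection can also be referenced from SQL. As a rough sketch (the connection name and service name below are placeholders, and the exact syntax may differ by RisingWave version), an AWS PrivateLink connection is declared with the `CREATE CONNECTION` statement: + +```sql +-- Hypothetical example: replace the service name with the +-- endpoint service you created in your own VPC. +CREATE CONNECTION my_privatelink WITH ( + type = 'privatelink', + provider = 'aws', + service.name = 'com.amazonaws.vpce.us-east-1.vpce-svc-xxxxxxxxxxxxxxxxx' +); +``` + +A connection created this way can then be passed to `CREATE SOURCE` or `CREATE SINK` through the `connection.name` parameter, as described in the ingestion and sink guides. 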
+ + + + + + + + + + + + + + + + diff --git a/cloud/images/PrivateLink-diagram.png b/cloud/images/PrivateLink-diagram.png new file mode 100644 index 000000000..b4f72bb01 Binary files /dev/null and b/cloud/images/PrivateLink-diagram.png differ diff --git a/cloud/images/vpc-diagram.png b/cloud/images/vpc-diagram.png deleted file mode 100644 index 972fbc634..000000000 Binary files a/cloud/images/vpc-diagram.png and /dev/null differ diff --git a/cloud/vpc-create-a-connection.md b/cloud/vpc-create-a-connection.md deleted file mode 100644 index 544298ebd..000000000 --- a/cloud/vpc-create-a-connection.md +++ /dev/null @@ -1,84 +0,0 @@ ---- -id: vpc-create-a-connection -title: Create a VPC connection -description: Create a VPC connection through PrivateLink or Private Service Connect. -slug: /create-a-connection ---- - -Follow the steps below to establish a secure connection with your VPC through AWS PrivateLink or GCP Private Service Connect. - -## Prerequisites - -- You have created a cluster in RisingWave Cloud and: - - - It is created with the Standard plan or Invited plan. Developer clusters do not support VPC connections. See [Choose a cluster plan](/cluster-choose-a-cluster-plan.md) for more information. - - - The VPC you want to connect to and your cluster must be in the same region. If your preferred region is not available when creating a cluster, contact our support team. - -- The VPC, source/sink service, and endpoint service or service attachment are set up and running properly. If you are setting up new services, the following links might be helpful: - - - For AWS, see [Share your services through AWS PrivateLink](https://docs.aws.amazon.com/vpc/latest/privatelink/privatelink-share-your-services.html). - - For GCP, see [Private Service Connect](https://cloud.google.com/vpc/docs/private-service-connect). - -## Steps - -1. Go to the [**Connection**](https://cloud.risingwave.com/connection/) page and click **Create connection**. - - Create connection page - -2. 
For **Connection type**, select your cloud service provider. - - Currently, RisingWave Cloud supports **AWS** PrivateLink and **GCP** Private Service Connect. - -3. For **Cluster**, select the cluster you want to connect the VPC to. - - Ensure that the VPC and the cluster are in the same region. - -4. For **Connection name**, enter a descriptive name for the connection. - -5. Enter the service identifier. - -
For AWS, enter the service name of the endpoint service. - - You can find it in the [Amazon VPC console](https://console.aws.amazon.com/vpc/) → **Endpoint services** → **Service name** section. - - AWS endpoint service name - -
- -
For GCP, enter the server target URL of the service attachment. - - You can find it in the [Google Cloud Console](https://console.cloud.google.com/) → **Network services** → **Private Service Connect**. - - GCP Service attachment - -
- -6. Click **Confirm** to create the connection. - -## What's next - -Now, you can create a source or sink with the VPC connection using SQL. - -:::note -Guided setup for creating a source or sink with a VPC connection is coming soon. -::: - -After you created the connection, a VPC connection endpoint is generated for your cluster. You can find it in [**Connection**](https://cloud.risingwave.com/connection/). - - - -For details on how to use the VPC endpoint to create a source with the VPC connection, see [Create source with VPC connection](/docs/current/ingest-from-kafka/#create-source-with-vpc-connection). For creating a sink, see [Create sink with VPC connection](/docs/current/create-sink-kafka/#create-sink-with-vpc-connection). - diff --git a/cloud/vpc-drop-a-connection.md b/cloud/vpc-drop-a-connection.md deleted file mode 100644 index 0b5b10a47..000000000 --- a/cloud/vpc-drop-a-connection.md +++ /dev/null @@ -1,18 +0,0 @@ ---- -id: vpc-drop-a-connection -title: Drop a VPC connection -description: If you no longer need to connect to a VPC, you can drop the connection. -slug: /drop-a-connection ---- - -Follow the steps below to drop a connection to your VPC when you no longer need it. - -1. Go to the [**Connection**](https://cloud.risingwave.com/connection/) page and click **Create connection**. - - Connection page - -2. Hover over the connection you want to drop and click the delete button, then confirm the deletion. - diff --git a/cloud/vpc-overview.md b/cloud/vpc-overview.md deleted file mode 100644 index 8b7fc836f..000000000 --- a/cloud/vpc-overview.md +++ /dev/null @@ -1,60 +0,0 @@ ---- -id: vpc-overview -title: VPC connection -description: Manage VPC connections. -slug: /vpc-overview ---- - -## Concept - -If you want to connect to a cloud-hosted source or sink, there might be connectivity issues when your service is located within a virtual private cloud (VPC) that is not publicly accessible. 
- -To establish a secure and direct connection between the VPC and your clusters in RisingWave Cloud and allow RisingWave to read consumer messages from the broker or send messages to the broker, you need to establish a VPC connection. - - - -## Connecting to VPCs - -RisingWave Cloud supports VPC connections through AWS PrivateLink and GCP Private Service Connect. - -You can go to the [**Connection**](https://cloud.risingwave.com/connection/) page to manage your VPC connections. - -Connection page - - - - - - - - - - - - - - - - diff --git a/docs/get-started.md b/docs/get-started.md index ebe6f7e38..6f620f0ba 100644 --- a/docs/get-started.md +++ b/docs/get-started.md @@ -38,7 +38,7 @@ risingwave Ensure [Docker Desktop](https://docs.docker.com/get-docker/) is installed and running in your environment. ```shell -docker run -it --pull=always -p 4566:4566 -p 5691:5691 risingwavelabs/risingwave:v1.7.0-standalone single_node +docker run -it --pull=always -p 4566:4566 -p 5691:5691 risingwavelabs/risingwave:latest single_node ``` ### Homebrew @@ -47,7 +47,7 @@ Ensure [Homebrew](https://brew.sh/) is installed, and run the following commands ```shell brew tap risingwavelabs/risingwave -brew install risingwave@1.7-standalone +brew install risingwave risingwave ``` @@ -151,4 +151,4 @@ Congratulations! You've successfully started RisingWave and conducted some initi * [Example A: Ingest data from Kafka](https://github.com/risingwavelabs/awesome-stream-processing/blob/main/00-get-started/01-ingest-kafka-data.md) * [Example B: Ingest data from Postgres CDC](https://github.com/risingwavelabs/awesome-stream-processing/blob/main/00-get-started/02-ingest-pg-cdc.md) - See [this GitHub directory](https://github.com/risingwavelabs/risingwave/tree/main/integration_tests) for ready-to-run demos and integration examples. -- Read our documentation to learn about how to ingest data from data streaming sources, transform data, and deliver data to downstream systems. 
\ No newline at end of file +- Read our documentation to learn about how to ingest data from data streaming sources, transform data, and deliver data to downstream systems. diff --git a/docs/guides/create-sink-kafka.md b/docs/guides/create-sink-kafka.md index 778640670..ca225a097 100644 --- a/docs/guides/create-sink-kafka.md +++ b/docs/guides/create-sink-kafka.md @@ -214,11 +214,11 @@ FORMAT PLAIN ENCODE JSON; ``` -## Create sink with VPC connection +## Create sink with PrivateLink connection If your Kafka sink service is located in a different VPC from RisingWave, use AWS PrivateLink or GCP Private Service Connect to establish a secure and direct connection. For details on how to set up an AWS PrivateLink connection, see [Create an AWS PrivateLink connection](/sql/commands/sql-create-connection.md#create-an-aws-privatelink-connection). -To create a Kafka sink with a VPC connection, in the WITH section of your `CREATE SINK` statement, specify the following parameters. +To create a Kafka sink with a PrivateLink connection, in the WITH section of your `CREATE SINK` statement, specify the following parameters. |Parameter| Notes| |---|---| diff --git a/docs/ingest/ingest-from-iceberg.md b/docs/ingest/ingest-from-iceberg.md index 60eda31c0..9c5559cc8 100644 --- a/docs/ingest/ingest-from-iceberg.md +++ b/docs/ingest/ingest-from-iceberg.md @@ -100,6 +100,45 @@ SELECT * FROM s FOR SYSTEM_TIME AS OF 4102444800; SELECT * FROM s FOR SYSTEM_VERSION AS OF 3023402865675048688; ``` +## System tables + +We currently support system tables [`rw_iceberg_files` and `rw_iceberg_snapshots`](/sql/system-catalogs/rw_catalog.md#available-risingwave-catalogs). `rw_iceberg_files` contains the current files of the Iceberg source or table. 
Here is a simple example: + +```sql title="Read Iceberg files" + SELECT * FROM rw_iceberg_files; +``` +``` + source_id | schema_name | source_name | content | file_path | file_format | record_count | file_size_in_bytes | equality_ids | sort_order_id +-----------+-------------+----------------------------+---------+----------------------------------------------------------------------------------------------------------------------+-------------+--------------+--------------------+--------------+--------------- + 1 | public | source_merge_on_read_table | 0 | s3a://hummock001/demo_db/merge_on_read_table/data/00000-1-dcbf00a2-8722-4cd8-9d77-15880e334488-00001.parquet | parquet | 1 | 951 | | 0 + 1 | public | source_merge_on_read_table | 0 | s3a://hummock001/demo_db/merge_on_read_table/data/00000-0-0d0f5a54-b4b1-431d-8574-e13fc410296f-00001.parquet | parquet | 1 | 907 | | 0 + 1 | public | source_merge_on_read_table | 0 | s3a://hummock001/demo_db/merge_on_read_table/data/00001-1-0d0f5a54-b4b1-431d-8574-e13fc410296f-00001.parquet | parquet | 1 | 935 | | 0 + 1 | public | source_merge_on_read_table | 0 | s3a://hummock001/demo_db/merge_on_read_table/data/00000-0-66d3b597-a132-4e04-a40a-02972edcff24-00001.parquet | parquet | 1 | 907 | | 0 + 1 | public | source_merge_on_read_table | 1 | s3a://hummock001/demo_db/merge_on_read_table/data/00000-1-dcbf00a2-8722-4cd8-9d77-15880e334488-00001-deletes.parquet | parquet | 1 | 1467 | | + 1 | public | source_merge_on_read_table | 0 | s3a://hummock001/demo_db/merge_on_read_table/data/00000-0-6acaf4d1-512c-42b7-9a76-e81fc8c8ca8f-00004.parquet | parquet | 1 | 1690 | | + 1 | public | source_merge_on_read_table | 2 | s3a://hummock001/demo_db/merge_on_read_table/data/00000-0-6e0c500e-6a8f-4252-863c-7c3d6ac20264-00048-eq-del.parquet | parquet | 1 | 721 | {1} | +(7 rows) +``` + +`rw_iceberg_snapshots` contains all Iceberg snapshots in RisingWave. Based on it, you can read a specific snapshot by a time travel query. 
For example, you can use the following query to read these snapshots: + +```sql title="Read all Iceberg snapshots" +SELECT * FROM rw_iceberg_snapshots; +``` +``` + source_id | schema_name |source_name| sequence_number | snapshot_id | timestamp_ms | manifest_list | summary +-----------+-------------+-----------------------------+-----------------+---------------------+-------------------------------+-----------------------------------------------------------------------------------------------------------------------------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ + 1 | public | t | 1 | 4476030648521181855 | 2024-04-03 08:54:22.488+00:00 | s3a://hummock001/demo_db/t/metadata/snap-4476030648521181855-1-f06c9623-1ae5-4646-b117-b765d4154221.avro | {"added-data-files": "2", "added-files-size": "1814", "added-records": "2", "changed-partition-count": "1", "operation": "append", "spark.app.id": "local-1712134455383", "total-data-files": "2", "total-delete-files": "0", "total-equality-deletes": "0", "total-files-size": "1814", "total-position-deletes": "0", "total-records": "2"} + 1 | public | t | 2 | 3368332126703695368 | 2024-04-03 08:54:36.795+00:00 | s3a://hummock001/demo_db/t/metadata/snap-3368332126703695368-1-3de95014-9e9e-4704-8ea3-44449525638a.avro | {"added-data-files": "2", "added-files-size": "1842", "added-records": "2", "changed-partition-count": "1", "operation": "append", "spark.app.id": "local-1712134465484", "total-data-files": "4", "total-delete-files": "0", "total-equality-deletes": "0", "total-files-size": "3656", "total-position-deletes": "0", "total-records": "4"} + 
1 | public | t | 3 | 8015994742063949347 | 2024-04-07 02:46:38.913+00:00 | s3a://hummock001/demo_db/t/metadata/snap-8015994742063949347-1-f31760cb-5df9-452a-938e-01eec7105e94.avro | {"changed-partition-count": "1", "deleted-data-files": "1", "deleted-records": "1", "operation": "delete", "removed-files-size": "907", "spark.app.id": "local-1712457982197", "total-data-files": "3", "total-delete-files": "0", "total-equality-deletes": "0", "total-files-size": "2749", "total-position-deletes": "0", "total-records": "3"} + 1 | public | t | 4 | 4642583001850518920 | 2024-04-07 02:47:05.171+00:00 | s3a://hummock001/demo_db/t/metadata/snap-4642583001850518920-1-57d9b426-e584-4185-9b7e-eab890f20589.avro | {"added-data-files": "1", "added-delete-files": "1", "added-files-size": "2418", "added-position-delete-files": "1", "added-position-deletes": "1", "added-records": "1", "changed-partition-count": "1", "operation": "overwrite", "spark.app.id": "local-1712458005806", "total-data-files": "4", "total-delete-files": "1", "total-equality-deletes": "0", "total-files-size": "5167", "total-position-deletes": "1", "total-records": "4"} +``` + +```sql title="Read a specific snapshot" +SELECT * FROM t FOR SYSTEM_VERSION AS OF 4476030648521181855; +SELECT * FROM t FOR SYSTEM_TIME AS OF '2024-04-03 08:54:22.488+00:00'; +``` + ## Examples Firstly, create an append-only Iceberg table, see [Append-only sink from upsert source](/guides/sink-to-iceberg.md#append-only-sink-from-upsert-source) for details. diff --git a/docs/ingest/ingest-from-kafka.md b/docs/ingest/ingest-from-kafka.md index eb0d16573..9a9eaa898 100644 --- a/docs/ingest/ingest-from-kafka.md +++ b/docs/ingest/ingest-from-kafka.md @@ -328,16 +328,16 @@ Based on the compatibility type that is configured for the schema registry, some To learn about compatibility types for Schema Registry and the changes allowed, see [Compatibility Types](https://docs.confluent.io/platform/current/schema-registry/avro.html#compatibility-types). 
-## Create source with VPC connection +## Create source with PrivateLink connection If your Kafka source service is located in a different VPC from RisingWave, use AWS PrivateLink to establish a secure and direct connection. For details on how to set up an AWS PrivateLink connection, see [Create an AWS PrivateLink connection](/sql/commands/sql-create-connection.md#create-an-aws-privatelink-connection). -To create a Kafka source with a VPC connection, in the WITH section of your `CREATE SOURCE` or `CREATE TABLE` statement, specify the following parameters. +To create a Kafka source with a PrivateLink connection, in the WITH section of your `CREATE SOURCE` or `CREATE TABLE` statement, specify the following parameters. |Parameter| Notes| |---|---| |`privatelink.targets`| The PrivateLink targets that correspond to the Kafka brokers. The targets should be in JSON format. Note that each target listed corresponds to each broker specified in the `properties.bootstrap.server` field. If the order is incorrect, there will be connectivity issues. | -|`privatelink.endpoint`|The DNS name of the VPC endpoint.
If you're using RisingWave Cloud, you can find the auto-generated endpoint after you created a connection. See details in [Create a VPC connection](/cloud/create-a-connection#whats-next).| +|`privatelink.endpoint`|The DNS name of the VPC endpoint.
If you're using RisingWave Cloud, you can find the auto-generated endpoint after you create a connection. See details in [Create a PrivateLink connection](/cloud/create-a-connection#whats-next).| +|`connection.name`| The name of the connection.
This parameter should only be included if you are using a connection created with the [`CREATE CONNECTION`](/sql/commands/sql-create-connection.md) statement. Omit this parameter if you have provisioned a VPC endpoint using `privatelink.endpoint` (recommended).| Here is an example of creating a Kafka source using a PrivateLink connection. Notice that `{"port": 9094}` corresponds to the broker `broker1-endpoint`, `{"port": 9095}` corresponds to the broker `broker2-endpoint`, and `{"port": 9096}` corresponds to the broker `broker3-endpoint`. diff --git a/docs/sql/system-catalogs/rw_catalog.md b/docs/sql/system-catalogs/rw_catalog.md index 3485d45d1..f8890075b 100644 --- a/docs/sql/system-catalogs/rw_catalog.md +++ b/docs/sql/system-catalogs/rw_catalog.md @@ -101,6 +101,8 @@ SELECT name, initialized_at, created_at FROM rw_sources; rw_hummock_pinned_versions | Contains information about the pinned versions in Hummock (the storage engine in RisingWave), including the worker node ID and the minimum pinned snapshot ID. | rw_hummock_sstables | Contains information about the SSTables (Sorted String Tables) used in Hummock (the storage engine in RisingWave). | rw_hummock_version_deltas | Contains information about version deltas in Hummock (the storage engine in RisingWave). A version delta represents the modifications or differences in data between consecutive epochs.| + |rw_iceberg_files| Contains the current files of the Iceberg source or table.| + |rw_iceberg_snapshots| Contains all Iceberg snapshots in RisingWave. Based on it, you can read a specific snapshot by a time travel query.| rw_indexes | Contains information about indexes in the database, including their IDs, names, schema identifiers, definitions, and more.| rw_internal_tables | Contains information about internal tables in the database. Internal tables are tables that store intermediate results (also known as internal states) of queries. 
Equivalent to the [`SHOW INTERNAL TABLES`](/sql/commands/sql-show-internal-tables.md) command.| rw_materialized_views | Contains information about materialized views in the database, including their names, definitions, schema and owner IDs, and access control lists. diff --git a/docusaurus.config.js b/docusaurus.config.js index 95c7925cb..a65ba38f2 100644 --- a/docusaurus.config.js +++ b/docusaurus.config.js @@ -224,7 +224,7 @@ const config = { async: true, }, { - src: "https://cdn.jsdelivr.net/npm/@runllm/search-widget@0.0.1-alpha51/dist/run-llm-search-widget.es.js", + src: "https://cdn.jsdelivr.net/npm/@runllm/search-widget@0.0.1-alpha61/dist/run-llm-search-widget.es.js", id: "runllm-widget-script", type: "module", "runllm-server-address": "https://api.runllm.com", @@ -237,7 +237,7 @@ const config = { async: true, }, ], - stylesheets: ["https://cdn.jsdelivr.net/npm/@runllm/search-widget@0.0.1-alpha51/dist/main.css"], + stylesheets: ["https://cdn.jsdelivr.net/npm/@runllm/search-widget@0.0.1-alpha61/dist/main.css"], webpack: { jsLoader: (isServer) => ({ loader: require.resolve("swc-loader"), diff --git a/sidebarCloud.js b/sidebarCloud.js index 213c4799b..c05b3e8cc 100644 --- a/sidebarCloud.js +++ b/sidebarCloud.js @@ -174,23 +174,23 @@ module.exports = { }, { type: "category", - label: "VPC connection", + label: "PrivateLink connection", collapsible: true, collapsed: true, - link: { type: "doc", id: "vpc-overview" }, + link: { type: "doc", id: "PrivateLink-overview" }, items: [ { type: "doc", - id: "vpc-overview", + id: "PrivateLink-overview", label: "Overview", }, { type: "doc", - id: "vpc-create-a-connection", + id: "PrivateLink-create-a-connection", }, { type: "doc", - id: "vpc-drop-a-connection", + id: "PrivateLink-drop-a-connection", }, ], }, diff --git a/versioned_docs/version-1.8/get-started.md b/versioned_docs/version-1.8/get-started.md index ebe6f7e38..6f620f0ba 100644 --- a/versioned_docs/version-1.8/get-started.md +++ 
b/versioned_docs/version-1.8/get-started.md @@ -38,7 +38,7 @@ risingwave Ensure [Docker Desktop](https://docs.docker.com/get-docker/) is installed and running in your environment. ```shell -docker run -it --pull=always -p 4566:4566 -p 5691:5691 risingwavelabs/risingwave:v1.7.0-standalone single_node +docker run -it --pull=always -p 4566:4566 -p 5691:5691 risingwavelabs/risingwave:latest single_node ``` ### Homebrew @@ -47,7 +47,7 @@ Ensure [Homebrew](https://brew.sh/) is installed, and run the following commands ```shell brew tap risingwavelabs/risingwave -brew install risingwave@1.7-standalone +brew install risingwave risingwave ``` @@ -151,4 +151,4 @@ Congratulations! You've successfully started RisingWave and conducted some initi * [Example A: Ingest data from Kafka](https://github.com/risingwavelabs/awesome-stream-processing/blob/main/00-get-started/01-ingest-kafka-data.md) * [Example B: Ingest data from Postgres CDC](https://github.com/risingwavelabs/awesome-stream-processing/blob/main/00-get-started/02-ingest-pg-cdc.md) - See [this GitHub directory](https://github.com/risingwavelabs/risingwave/tree/main/integration_tests) for ready-to-run demos and integration examples. -- Read our documentation to learn about how to ingest data from data streaming sources, transform data, and deliver data to downstream systems. \ No newline at end of file +- Read our documentation to learn about how to ingest data from data streaming sources, transform data, and deliver data to downstream systems.