
Verify JVM configuration in Snowflake connector #24717

Open

hashhar wants to merge 2 commits into master from hashhar/snow-arrow-jvm-check
Conversation

@hashhar (Member) commented Jan 15, 2025

Description

As in BigQuery, we need additional JVM args in the Snowflake connector since it uses Arrow unconditionally.
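For illustration only, a minimal sketch of what such a startup check could look like; the class and method names are hypothetical and not taken from the Trino codebase, only the JVM flag comes from this PR's doc change:

```java
import java.lang.management.ManagementFactory;
import java.util.List;

// Hypothetical sketch: not the actual Trino implementation. The idea is to
// fail fast at connector startup when the JVM is missing the flag that
// Arrow's off-heap memory access needs.
final class SnowflakeArrowJvmCheck
{
    private static final String REQUIRED_JVM_ARG =
            "--add-opens=java.base/java.nio=ALL-UNNAMED";

    private SnowflakeArrowJvmCheck() {}

    static void verifyJvmConfiguration()
    {
        // Inspect the arguments the JVM was launched with
        List<String> jvmArgs = ManagementFactory.getRuntimeMXBean().getInputArguments();
        if (!jvmArgs.contains(REQUIRED_JVM_ARG)) {
            throw new IllegalStateException(
                    "Snowflake connector requires the JVM argument " + REQUIRED_JVM_ARG
                            + " because it reads data with Apache Arrow");
        }
    }
}
```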

Release notes

(x) This is not user-visible or is docs only, and no release notes are required.
( ) Release notes are required. Please propose a release note for me.
( ) Release notes are required, with the following suggested text:

## Section
* Fix some things. ({issue}`issuenumber`)

@hashhar requested a review from ebyhr January 15, 2025 17:59
@cla-bot added the cla-signed label Jan 15, 2025
@github-actions added the bigquery (BigQuery connector) label Jan 15, 2025
@hashhar force-pushed the hashhar/snow-arrow-jvm-check branch from 8ef0ea9 to 3be054f on January 15, 2025 18:03
@github-actions added the docs label Jan 15, 2025
hashhar and others added 2 commits January 15, 2025 23:34
This will be used in Snowflake connector too in the next commit.

Co-authored-by: Mateusz "Serafin" Gajewski <[email protected]>
Snowflake connector uses Arrow, which requires additional JVM arguments. The docs were also incorrect; Arrow is used unconditionally in the Snowflake connector.
@hashhar force-pushed the hashhar/snow-arrow-jvm-check branch from 3be054f to 5dc3c32 on January 15, 2025 18:05
@hashhar requested a review from mosabua January 15, 2025 18:05
@mosabua (Member) left a comment
The suggested doc changes are good enough. Ideally, however, we would also create a Requirements section in the connector doc, like we have in others.
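For illustration, such a Requirements section might look like the sketch below; only the JVM argument comes from this PR's diff, while the heading style and the network-access bullet are assumptions based on the pattern other connector docs follow:

```markdown
## Requirements

To connect to Snowflake, you need:

- Network access from the Trino coordinator and workers to Snowflake.
- The following additional JVM argument in the Trino [](jvm-config):
  `--add-opens=java.base/java.nio=ALL-UNNAMED`
```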

On the changed doc text:

-Using Apache Arrow serialization is disabled by default. In order to enable
-it, add `--add-opens=java.base/java.nio=ALL-UNNAMED` to the Trino
-{ref}`jvm-config`.
+Snowflake connector uses Apache Arrow as the serialization format when
@mosabua (Member):

Suggested change:

-Snowflake connector uses Apache Arrow as the serialization format when
+The Snowflake connector uses Apache Arrow as the serialization format when

On the changed doc text:

-it, add `--add-opens=java.base/java.nio=ALL-UNNAMED` to the Trino
-{ref}`jvm-config`.
+Snowflake connector uses Apache Arrow as the serialization format when
+reading from Snowflake which requires additional JVM arguments. Add
@mosabua (Member):

Suggested change:

-reading from Snowflake which requires additional JVM arguments. Add
+reading from Snowflake. Add the following required additional JVM argument to the [](jvm-config):

And then have a code block with the line.
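Presumably the code block the reviewer asks for contains the JVM argument already quoted in the diff above (an assumption, since the exact block is not spelled out in this comment):

```text
--add-opens=java.base/java.nio=ALL-UNNAMED
```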
