diff --git a/.formatter.exs b/.formatter.exs
new file mode 100644
index 0000000..d2cda26
--- /dev/null
+++ b/.formatter.exs
@@ -0,0 +1,4 @@
+# Used by "mix format"
+[
+  inputs: ["{mix,.formatter}.exs", "{config,lib,test}/**/*.{ex,exs}"]
+]
diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..34a8914
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,26 @@
+# The directory Mix will write compiled artifacts to.
+/_build/
+
+# If you run "mix test --cover", coverage assets end up here.
+/cover/
+
+# The directory Mix downloads your dependencies sources to.
+/deps/
+
+# Where third-party dependencies like ExDoc output generated docs.
+/doc/
+
+# Ignore .fetch files in case you like to edit your project deps locally.
+/.fetch
+
+# If the VM crashes, it generates a dump, let's ignore it too.
+erl_crash.dump
+
+# Also ignore archive artifacts (built via "mix archive.build").
+*.ez
+
+# Ignore package tarball (built via "mix hex.build").
+req_snowflake-*.tar
+
+# Temporary files, for example, from tests.
+/tmp/
diff --git a/README.md b/README.md
index 69bd997..86d12b4 100644
--- a/README.md
+++ b/README.md
@@ -1 +1,117 @@
-Initial blank repo, see Pull Request :)
+# ReqSnowflake
+[Req](https://github.com/wojtekmach/req) plugin for [Snowflake](https://www.snowflake.com).
+
+**WIP, NOT PRODUCTION READY YET**
+
+NOTE: THIS DRIVER/CONNECTOR IS NOT OFFICIALLY AFFILIATED WITH SNOWFLAKE, NOR DOES IT HAVE OFFICIAL SUPPORT FROM THEM.
+
+A pure-Elixir driver for [Snowflake](https://www.snowflake.com/), the cloud data platform.
+
+## Usage
+
+```elixir
+Mix.install([
+  {:req_snowflake, github: "joshuataylor/req_snowflake"}
+])
+
+# With a plain string query
+Req.new()
+|> ReqSnowflake.attach(
+  username: "rosebud",
+  password: "hunter2",
+  account_name: "foobar",
+  region: "us-east-1",
+  warehouse: "compute_wh", # optional
+  role: "myrole", # optional
+  database: "mydb", # optional
+  schema: "myschema" # optional
+)
+|> Req.post!(snowflake_query: "select L_ORDERKEY, L_PARTKEY from snowflake_sample_data.tpch_sf1.lineitem limit 2").body
+#=>
+# %ReqSnowflake.Result{
+#   columns: ["L_ORDERKEY", "L_PARTKEY"],
+#   num_rows: 2,
+#   rows: [[3_000_001, 14406], [3_000_002, 34422]],
+#   success: true
+# }
+
+# With query parameters for inserting
+Req.new()
+|> ReqSnowflake.attach(
+  username: "rosebud",
+  password: "hunter2",
+  account_name: "foobar",
+  region: "us-east-1",
+  warehouse: "compute_wh", # optional
+  role: "myrole", # optional
+  database: "mydb", # optional
+  schema: "myschema" # optional
+)
+|> Req.post!(
+  snowflake_query: "INSERT INTO \"foo\".\"bar\".\"baz\" (\"hello\") VALUES (?)",
+  bindings: %{"1" => %{type: "TEXT", value: "xxx"}}
+)
+#=>
+# %ReqSnowflake.Result{
+#   columns: ["number of rows inserted"],
+#   num_rows: 1,
+#   rows: [[1]],
+#   success: true
+# }
+```
+
+## What this is
+
+This plugin talks to Snowflake over its REST API, identifying itself as an earlier client version
+so that results come back as JSON. There isn't an Elixir Arrow library (yet!), and advertising an
+older client (the JavaScript driver) makes Snowflake return JSON results instead of Arrow. The
+REST API is what the official Python, Golang, NodeJS and other connectors use to send requests to
+Snowflake, so it is stable and shouldn't just randomly break. I've been using `snowflake_elixir`
+(predecessor to this package) in production for 18 months and the API hasn't changed once.
+
+This library does not use the [Snowflake SQL API](https://docs.snowflake.com/en/developer-guide/sql-api/index.html), which is
+limited in its implementation and features. We might as well use the REST API.
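+
+Under the hood the plugin makes two calls: a login request that returns a session token, then a
+query request that carries that token. Here's a rough sketch of the equivalent raw `Req` calls
+(the account and credential values are placeholders, and the real login payload needs more
+fields; see `ReqSnowflake.SnowflakeLogin` in this PR for the full body):
+
+```elixir
+base = "https://myaccount.us-east-1.snowflakecomputing.com"
+
+# 1. Log in and pull the session token out of the response body.
+token =
+  Req.post!("#{base}/session/v1/login-request",
+    json: %{data: %{ACCOUNT_NAME: "myaccount", LOGIN_NAME: "rosebud", PASSWORD: "hunter2"}}
+  ).body["data"]["token"]
+
+# 2. Run a query, authenticating with that token.
+Req.post!("#{base}/queries/v1/query-request?requestId=#{UUID.uuid4()}",
+  json: %{sqlText: "select 1", sequenceId: 0},
+  headers: [
+    {"accept", "application/snowflake"},
+    {"authorization", "Snowflake Token=\"#{token}\""}
+  ]
+).body
+```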
+
+Right now the library doesn't support [MFA](https://docs.snowflake.com/en/user-guide/security-mfa.html), so you'll
+need to either use [private key auth](https://docs.snowflake.com/en/user-guide/odbc-parameters.html#using-key-pair-authentication) or
+connect using a username & password. Private key auth is highly recommended, as it makes rotating credentials easier.
+
+Once I have time I will write a library that will use Arrow responses, as [apparently it's faster](https://www.snowflake.com/blog/fetching-query-results-from-snowflake-just-got-a-lot-faster-with-apache-arrow/).
+
+One important note when using Ecto: you will need to enable Snowflake's `QUOTED_IDENTIFIERS_IGNORE_CASE` setting, which is
+documented here: https://docs.snowflake.com/en/sql-reference/identifiers-syntax.html#third-party-tools-and-case-insensitive-identifier-resolution
+
+Note that this can be set at the account level or, if needed, per session (e.g. `ALTER SESSION SET QUOTED_IDENTIFIERS_IGNORE_CASE = TRUE;`).
+
+## Features
+* Queries
+
+## Short term roadmap
+- Add async queries back, and add an option to poll for the response
+- Document the different insert types
+- Add support for caching the token for the user/password/role, with default as `true`
+- Pass the review gauntlet, incorporate feedback from the community :)
+- Add this to `snowflake_elixir` as a generic Snowflake library
+- Document the library thoroughly
+
+## Medium term roadmap
+- Add support for MFA
+- Add support for private key auth
+- Add support for the way Snowflake does Arrow streaming, using Arrow's Rust implementation via Rustler; we can even provide precompiled binaries! (maybe as a different repository)
+
+## Thanks
+I want to thank the open-source community, especially db_connection/ecto/ecto_sql/postgrex for being amazing and for
+the decoding code this library borrows from them. I also want to thank [@wojtekmach](https://github.com/wojtekmach) for the awesome work on req :).
+
+## License
+
+Copyright (C) 2022 Josh Taylor
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at [http://www.apache.org/licenses/LICENSE-2.0](http://www.apache.org/licenses/LICENSE-2.0)
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
\ No newline at end of file
diff --git a/lib/req_snowflake/req_snowflake.ex b/lib/req_snowflake/req_snowflake.ex
new file mode 100644
index 0000000..e0b4861
--- /dev/null
+++ b/lib/req_snowflake/req_snowflake.ex
@@ -0,0 +1,297 @@
+defmodule ReqSnowflake do
+  @moduledoc """
+  `Req` plugin for [Snowflake](https://www.snowflake.com).
+
+  It talks to Snowflake over its REST API, identifying itself as an earlier client version so
+  that results come back as JSON. There isn't an Elixir Arrow library (yet!), and advertising an
+  older client (the JavaScript driver) makes Snowflake return JSON results instead of Arrow. The
+  REST API is what the official Python, Golang and other connectors use to send requests to
+  Snowflake, so it is stable and shouldn't just randomly break.
+
+  It is not the [Snowflake SQL API](https://docs.snowflake.com/en/developer-guide/sql-api/index.html), which is
+  limited in its implementation.
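+
+  A minimal usage sketch (mirroring the README; the credentials are placeholders):
+
+      Req.new()
+      |> ReqSnowflake.attach(
+        username: "rosebud",
+        password: "hunter2",
+        account_name: "foobar",
+        region: "us-east-1"
+      )
+      |> Req.post!(snowflake_query: "select 1").body
+      #=> %ReqSnowflake.Result{...}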
+
+  Right now the library doesn't support [MFA](https://docs.snowflake.com/en/user-guide/security-mfa.html), so you'll
+  need to either use [private key auth](https://docs.snowflake.com/en/user-guide/odbc-parameters.html#using-key-pair-authentication) or
+  connect using a username & password. Private key auth is highly recommended, as it makes rotating credentials easier.
+
+  This module handles the connection to Snowflake.
+
+  Async queries (where the query is sent to Snowflake and then polled for the result) are on the
+  roadmap. Polling means there won't (hopefully) be long-lived connections, so deploying should be
+  easier as you won't run the risk of killing running queries.
+
+  Query results are decoded into the `ReqSnowflake.Result` struct.
+
+  The struct implements the `Table.Reader` protocol and thus can be efficiently traversed by rows or columns.
+  """
+
+  alias Req.Request
+  alias ReqSnowflake.Result
+  alias ReqSnowflake.Snowflake
+
+  @allowed_options ~w(snowflake_query username password account_name region warehouse role database schema query_tag rows_per_resultset statement_timeout_in_seconds application_name bindings)a
+
+  @doc """
+  Attaches to a Req request, used for querying Snowflake.
+
+  ## Request Options
+  * `:account_name` - Required. Account name. This is usually the part of your Snowflake URL between `https://` and the region (such as `us-east-1`).
+    If unsure, run `select current_account();` in Snowflake.
+  * `:region` - Required. Your Snowflake region, found between the account name and ".snowflakecomputing.com" on the portal.
+    If unsure, run `select current_region();` in Snowflake to show it. An example is `us-east-1`. Region names and their
+    IDs can be [found here](https://docs.snowflake.com/en/user-guide/intro-regions.html).
+  """
+  @spec attach(Request.t(), keyword()) :: Request.t()
+  def attach(%Request{} = request, options \\ []) do
+    request
+    |> Request.prepend_request_steps(snowflake_run: &run/1)
+    |> Request.register_options(@allowed_options)
+    |> Request.merge_options(options)
+  end
+
+  # A query with bindings, used for parameterised inserts.
+  defp run(
+         %Request{
+           options:
+             %{
+               account_name: account_name,
+               region: region,
+               snowflake_query: query,
+               bindings: bindings
+             } = options
+         } = request
+       ) do
+    token = ReqSnowflake.SnowflakeLogin.get_snowflake_login_token(options)
+    base_url = Snowflake.snowflake_url(account_name, region)
+
+    %{request | url: URI.parse(snowflake_query_url(base_url))}
+    |> Request.merge_options(json: snowflake_insert_headers(query, bindings))
+    |> Request.put_header("accept", "application/snowflake")
+    |> Request.put_header("Authorization", "Snowflake Token=\"#{token}\"")
+    |> Request.append_response_steps(snowflake_decode_response: &decode/1)
+  end
+
+  defp run(%Request{options: %{snowflake_query: query}} = request) do
+    token = ReqSnowflake.SnowflakeLogin.get_snowflake_login_token(request.options)
+    base_url = Snowflake.snowflake_url(request.options[:account_name], request.options[:region])
+
+    %{request | url: URI.parse(snowflake_query_url(base_url))}
+    |> Request.merge_options(json: snowflake_query_body(query))
+    |> Request.put_header("accept", "application/snowflake")
+    |> Request.put_header("Authorization", "Snowflake Token=\"#{token}\"")
+    |> Request.append_response_steps(snowflake_decode_response: &decode/1)
+  end
+
+  defp run(%Request{} = request), do: request
+
+  defp decode({request, %{status: 200} = response}) do
+    {request, update_in(response.body, &decode_body/1)}
+  end
+
+  defp decode(any), do: any
+
+  defp decode_body(%{"success"
=> true} = data) do + data + |> Map.get("data") + |> Map.get("queryResultFormat") + |> process_query_result_format(data["data"]) + end + + defp process_query_result_format( + "json", + %{ + "rowset" => [], + "rowtype" => row_type, + "total" => total, + "chunks" => chunks, + "chunkHeaders" => %{ + "x-amz-server-side-encryption-customer-key" => key, + "x-amz-server-side-encryption-customer-key-md5" => md5 + } + } + ) do + urls = Enum.map(chunks, fn %{"url" => url} -> url end) + + parsed = + Task.async_stream(urls, fn url -> s3_get_json(url, key, md5) end, max_concurrency: 5) + |> Enum.map(fn {:ok, result} -> result end) + |> Enum.join(", ") + + rows = Jason.decode!("[#{parsed}]") + + row_data = process_row_data(rows, row_type) + + columns = Enum.map(row_type, fn %{"name" => name} -> name end) + + %Result{ + success: true, + rows: row_data, + columns: columns, + num_rows: total + } + end + + defp process_query_result_format( + "json", + %{ + "rowset" => rowset, + "rowtype" => row_type, + "total" => total, + "chunks" => chunks, + "chunkHeaders" => %{ + "x-amz-server-side-encryption-customer-key" => key, + "x-amz-server-side-encryption-customer-key-md5" => md5 + } + } + ) do + parsed = + chunks + |> Enum.map(fn %{"url" => url} -> url end) + |> Task.async_stream(fn url -> s3_get_json(url, key, md5) end, max_concurrency: 5) + |> Enum.map(fn {:ok, result} -> result end) + |> Enum.join(", ") + + rows = Jason.decode!("[#{parsed}]") + + row_data = process_row_data(rowset, row_type) ++ process_row_data(rows, row_type) + + columns = Enum.map(row_type, fn %{"name" => name} -> name end) + + %Result{ + success: true, + rows: row_data, + columns: columns, + num_rows: total + } + end + + defp process_query_result_format( + "json", + %{"rowset" => rows, "rowtype" => row_type, "total" => total} + ) do + row_data = process_row_data(rows, row_type) + + columns = Enum.map(row_type, fn %{"name" => name} -> name end) + + %Result{ + success: true, + rows: row_data, + columns: columns, + num_rows: total + } + end + + defp snowflake_query_url(host) do + uuid = Application.get_env(:req_snowflake, :snowflake_uuid, UUID.uuid4()) + + "#{host}/queries/v1/query-request?requestId=#{uuid}" + end + + defp snowflake_query_body(query) when is_binary(query) do + %{ + sqlText: query, + sequenceId: 0, + bindings: nil, + bindStage: nil, + describeOnly: false, + parameters: %{ + CLIENT_RESULT_CHUNK_SIZE: 48 + }, + describedJobId: nil, + isInternal: false, + asyncExec: false + } + end + + defp snowflake_insert_headers(query, bindings) when is_binary(query) and is_map(bindings) do + %{ + sqlText: query, + sequenceId: 0, + bindings: bindings, + bindStage: nil, + describeOnly: false, + parameters: %{ + CLIENT_RESULT_CHUNK_SIZE: 48 + }, + describedJobId: nil, + isInternal: false, + asyncExec: false + } + end + + defp s3_get_json(url, encryption_key, encryption_key_md5) do + Req.new(url: url) + |> Request.put_header("accept", "application/snowflake") + |> Request.put_header("x-amz-server-side-encryption-customer-key", encryption_key) + |> Request.put_header("x-amz-server-side-encryption-customer-key-md5", encryption_key_md5) + |> Req.get!() + |> Map.get(:body) + end + + defp process_row_data(rows, row_type) do + rows + |> Stream.map(fn r -> + r + |> Stream.with_index() + |> Stream.map(fn {rr, column_no} -> + decode_column(Enum.at(row_type, column_no), rr) + end) + |> Enum.to_list() + end) + |> Enum.to_list() + end + + # Decodes a column type of null to nil + def decode_column(%{"scale" => 0, "type" => "fixed", "byteLength" => nil}, nil) 
do
+    nil
+  end
+
+  def decode_column(_, nil), do: nil
+
+  # Dates arrive as the number of days since the Unix epoch.
+  def decode_column(%{"type" => "date"}, value) do
+    unix_time = String.to_integer(value) * 86400
+
+    case DateTime.from_unix(unix_time) do
+      {:ok, time} -> DateTime.to_date(time)
+      _ -> {:error, value}
+    end
+  end
+
+  # Timestamps arrive as "<epoch seconds>.<nanoseconds>"; stripping the dot and dropping the
+  # last three digits leaves an integer in microseconds.
+  def decode_column(%{"type" => "timestamp_ntz"}, value) do
+    value
+    |> String.replace(".", "")
+    |> String.slice(0..-4)
+    |> String.to_integer()
+    |> DateTime.from_unix!(:microsecond)
+  end
+
+  def decode_column(%{"type" => "timestamp_tz"}, value) do
+    value
+    |> String.split(" ")
+    |> hd
+    |> String.replace(".", "")
+    |> String.slice(0..-4)
+    |> String.to_integer()
+    |> DateTime.from_unix!(:microsecond)
+  end
+
+  def decode_column(%{"type" => "timestamp_ltz"}, value) do
+    value
+    |> String.replace(".", "")
+    |> String.slice(0..-4)
+    |> String.to_integer()
+    |> DateTime.from_unix!(:microsecond)
+  end
+
+  # Decodes an integer column type.
+  def decode_column(%{"scale" => 0, "type" => "fixed", "byteLength" => nil}, value) do
+    case Integer.parse(value) do
+      {num, ""} ->
+        num
+
+      _ ->
+        value
+    end
+  end
+
+  # For everything else, just return the value.
+  def decode_column(_, value), do: value
+end
diff --git a/lib/req_snowflake/req_snowflake_login.ex b/lib/req_snowflake/req_snowflake_login.ex
new file mode 100644
index 0000000..aa43271
--- /dev/null
+++ b/lib/req_snowflake/req_snowflake_login.ex
@@ -0,0 +1,63 @@
+defmodule ReqSnowflakeLogin do
+  @moduledoc false
+
+  alias Req.Request
+  alias ReqSnowflake.Snowflake
+
+  @allowed_options ~w(username password account_name region warehouse role database schema query_tag rows_per_resultset statement_timeout_in_seconds application_name)a
+
+  @doc """
+  Attaches to a Req request, used for logging into Snowflake.
+
+  ## Request Options
+  * `:username` - Required. The username for your account.
+  * `:password` - Required. The password for your account.
+  * `:account_name` - Required. Account name. This is usually the part of your Snowflake URL between `https://` and the region (such as `us-east-1`).
+    If unsure, run `select current_account();` in Snowflake.
+  * `:region` - Required. Your Snowflake region, found between the account name and ".snowflakecomputing.com" on the portal.
+    If unsure, run `select current_region();` in Snowflake to show it. An example is `us-east-1`. Region names and their
+    IDs can be [found here](https://docs.snowflake.com/en/user-guide/intro-regions.html).
+  * `:warehouse` - Optional. The warehouse to use on Snowflake. If none is set, the account default is used.
+  * `:role` - Optional. The role to use. If none is set, the account default is used (PUBLIC unless configured otherwise).
+  * `:database` - Optional. If set, the database to connect to by default.
+  * `:schema` - Optional. If set, the schema to connect to by default.
+  * `:query_tag` - Optional. If set, a query tag is attached to all queries run in this session. [See QUERY_TAG](https://docs.snowflake.com/en/sql-reference/parameters.html#query-tag)
+  * `:rows_per_resultset` - Optional. If set, how many rows to return per result set for the session. Default is unlimited (0). [See ROWS_PER_RESULTSET](https://docs.snowflake.com/en/sql-reference/parameters.html#rows-per-resultset)
+  * `:statement_timeout_in_seconds` - Optional. If set, the amount of time, in seconds, after which a running SQL statement (query, DDL, DML, etc.) is canceled by the system.
+    [See STATEMENT_TIMEOUT_IN_SECONDS](https://docs.snowflake.com/en/sql-reference/parameters.html#statement-timeout-in-seconds)
+  * `:application_name` - Optional.
If set, the application name will show when viewing logs in Snowflake. Defaults to `req_snowflake`.
+  * `:cache` - Optional. Defaults to true; if false, the token will not be cached in persistent storage.
+  """
+  @spec attach(Request.t(), keyword()) :: Request.t()
+  def attach(%Request{} = request, options \\ []) do
+    request
+    |> Request.register_options(@allowed_options)
+    |> Request.merge_options(options)
+    |> Request.prepend_request_steps(req_snowflake_login_parse_url: &auth_snowflake/1)
+  end
+
+  defp auth_snowflake(
+         %{
+           url: %URI{host: host},
+           options: %{account_name: account_name, region: region}
+         } = request
+       ) do
+    # Since we can't get the host before now, we have to check here.
+    if Snowflake.snowflake_host(account_name, region) == host do
+      keyword_options = Keyword.new(request.options)
+      token = ReqSnowflake.SnowflakeLogin.get_snowflake_login_token(keyword_options)
+
+      return_token(request, token)
+    else
+      request
+    end
+  end
+
+  # If the token exists, add it to the request headers.
+  defp return_token(request, token) when is_binary(token) do
+    update_in(request.headers, &[{"authorization", "Snowflake Token=\"#{token}\""} | &1])
+  end
+
+  # Otherwise return the error back upstream.
+  defp return_token(request, %RuntimeError{} = error), do: {request, error}
+end
diff --git a/lib/req_snowflake/result.ex b/lib/req_snowflake/result.ex
new file mode 100644
index 0000000..6771aaf
--- /dev/null
+++ b/lib/req_snowflake/result.ex
@@ -0,0 +1,29 @@
+defmodule ReqSnowflake.Result do
+  @type t :: %__MODULE__{
+          columns: nil | [String.t()],
+          rows: nil | [[term()]],
+          num_rows: integer,
+          metadata: [map()],
+          messages: [map()],
+          success: boolean
+        }
+
+  defstruct columns: nil,
+            rows: nil,
+            num_rows: 0,
+            metadata: [],
+            messages: [],
+            success: false
+end
+
+if Code.ensure_loaded?(Table.Reader) do
+  defimpl Table.Reader, for: ReqSnowflake.Result do
+    def init(%{columns: columns}) when columns in [nil, []] do
+      {:rows, %{columns: []}, []}
+    end
+
+    def init(result) do
+      {:rows, %{columns: result.columns}, result.rows}
+    end
+  end
+end
diff --git a/lib/req_snowflake/snowflake.ex b/lib/req_snowflake/snowflake.ex
new file mode 100644
index 0000000..93a6819
--- /dev/null
+++ b/lib/req_snowflake/snowflake.ex
@@ -0,0 +1,20 @@
+defmodule ReqSnowflake.Snowflake do
+  @moduledoc false
+
+  # The hostname and base URL are read from the application environment when set, which makes
+  # testing with Bypass easier.
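+  #
+  # For example, the test suite points both values at a local Bypass server:
+  #
+  #     Application.put_env(:req_snowflake, :snowflake_hostname, "127.0.0.1")
+  #     Application.put_env(:req_snowflake, :snowflake_url, "http://127.0.0.1:#{bypass.port}")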
+  def snowflake_host(account_name, region) do
+    Application.get_env(
+      :req_snowflake,
+      :snowflake_hostname,
+      "#{account_name}.#{region}.snowflakecomputing.com"
+    )
+  end
+
+  def snowflake_url(account_name, region) do
+    Application.get_env(
+      :req_snowflake,
+      :snowflake_url,
+      "https://#{snowflake_host(account_name, region)}"
+    )
+  end
+end
diff --git a/lib/req_snowflake/snowflake_login.ex b/lib/req_snowflake/snowflake_login.ex
new file mode 100644
index 0000000..969ea54
--- /dev/null
+++ b/lib/req_snowflake/snowflake_login.ex
@@ -0,0 +1,99 @@
+defmodule ReqSnowflake.SnowflakeLogin do
+  alias ReqSnowflake.Snowflake
+
+  def get_snowflake_login_token(options) when is_map(options) do
+    options
+    |> Keyword.new()
+    |> get_snowflake_login_token()
+  end
+
+  def get_snowflake_login_token(options) when is_list(options) do
+    data = build_snowflake_login(options)
+    url = snowflake_url(options)
+
+    Req.post!(url, json: data)
+    |> decode_response()
+  end
+
+  defp snowflake_url(options) when is_list(options) do
+    account_name = options[:account_name]
+    region = options[:region]
+
+    base_url = Snowflake.snowflake_url(account_name, region)
+
+    query_params =
+      %{
+        databaseName: Keyword.get(options, :database, nil),
+        schemaName: Keyword.get(options, :schema, nil),
+        roleName: Keyword.get(options, :role, nil),
+        warehouse: Keyword.get(options, :warehouse, nil)
+      }
+      |> Enum.filter(fn {_, v} -> v != nil end)
+      |> URI.encode_query()
+
+    "#{base_url}/session/v1/login-request"
+    |> URI.parse()
+    |> Map.put(:query, query_params)
+    |> URI.to_string()
+  end
+
+  defp decode_response(%Req.Response{status: 200, body: %{"data" => %{"token" => token}}}),
+    do: token
+
+  defp decode_response(%Req.Response{body: %{"message" => message}}),
+    do: RuntimeError.exception(message)
+
+  defp decode_response(%Req.Response{body: body}), do: RuntimeError.exception(inspect(body))
+
+  # Something unexpected happened, such as the server being down, a timeout, etc.
+  # Probably wise to later bubble this back up to the user?
+  defp decode_response(error), do: RuntimeError.exception(inspect(error))
+
+  @spec build_snowflake_login(keyword()) :: map()
+  # Builds the snowflake login map that needs to be sent as JSON.
+  # This will soon need to support many options, so it's moved to its own function.
+ # Full list of parameters needed to be supported: https://docs.snowflake.com/en/sql-reference/parameters.html + defp build_snowflake_login(options) do + %{ + data: %{ + ACCOUNT_NAME: options[:account_name], + PASSWORD: options[:password], + # This way we get JSON results + CLIENT_APP_ID: "JavaScript", + # Version supporting JSON results + CLIENT_APP_VERSION: "1.5.3", + LOGIN_NAME: options[:username], + SESSION_PARAMETERS: %{ + VALIDATE_DEFAULT_PARAMETERS: true, + QUOTED_IDENTIFIERS_IGNORE_CASE: true + }, + CLIENT_ENVIRONMENT: %{ + tracing: "DEBUG", + OS: "Linux", + OCSP_MODE: "FAIL_OPEN", + APPLICATION: Keyword.get(options, :application_name, "req_snowflake"), + serverURL: + "https://#{options[:account_name]}.#{options[:region]}.snowflakecomputing.com", + role: options[:role], + user: options[:username], + account: options[:account_name] + } + } + } + |> add_client_environment(:schema, Keyword.get(options, :schema)) + |> add_client_environment(:warehouse, Keyword.get(options, :warehouse)) + |> add_client_environment(:database, Keyword.get(options, :database)) + end + + defp add_client_environment(data, _key, nil), do: data + + defp add_client_environment(data, key, value) do + put_in(data, [:data, :CLIENT_ENVIRONMENT, key], value) + end +end diff --git a/mix.exs b/mix.exs new file mode 100644 index 0000000..43a2155 --- /dev/null +++ b/mix.exs @@ -0,0 +1,55 @@ +defmodule ReqSnowflake.MixProject do + use Mix.Project + + @version "0.1.0-dev" + @source_url "https://github.com/joshuataylor/req_snowflake" + + def project do + [ + app: :req_snowflake, + version: @version, + elixir: "~> 1.12", + start_permanent: Mix.env() == :prod, + deps: deps(), + aliases: aliases(), + docs: docs(), + preferred_cli_env: [ + "test.all": :test, + docs: :docs, + "hex.publish": :docs + ] + ] + end + + def application do + [ + extra_applications: [:logger] + ] + end + + defp docs do + [ + main: "readme", + source_url: @source_url, + source_ref: "v#{@version}", + extras: ["README.md"] + ] + end + + defp deps do + [ + {:decimal, "~> 2.0"}, + {:req, github: "wojtekmach/req"}, + {:table, "~> 0.1.1", optional: true}, + {:jason, "~> 1.2"}, + {:ex_doc, ">= 0.0.0", only: :dev, runtime: false, optional: true}, + {:elixir_uuid, "~> 1.2"}, + {:dialyxir, "~> 1.0", only: [:dev], runtime: false, optional: true}, + {:bypass, "~> 2.1", only: :test} + ] + end + + def aliases do + ["test.all": ["test --include integration"]] + end +end diff --git a/mix.lock b/mix.lock new file mode 100644 index 0000000..dcd00a2 --- /dev/null +++ b/mix.lock @@ -0,0 +1,31 @@ +%{ + "bypass": {:hex, :bypass, "2.1.0", "909782781bf8e20ee86a9cabde36b259d44af8b9f38756173e8f5e2e1fabb9b1", [:mix], [{:plug, "~> 1.7", [hex: :plug, repo: "hexpm", optional: false]}, {:plug_cowboy, "~> 2.0", [hex: :plug_cowboy, repo: "hexpm", optional: false]}, {:ranch, "~> 1.3", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "d9b5df8fa5b7a6efa08384e9bbecfe4ce61c77d28a4282f79e02f1ef78d96b80"}, + "castore": {:hex, :castore, "0.1.17", "ba672681de4e51ed8ec1f74ed624d104c0db72742ea1a5e74edbc770c815182f", [:mix], [], "hexpm", "d9844227ed52d26e7519224525cb6868650c272d4a3d327ce3ca5570c12163f9"}, + "cowboy": {:hex, :cowboy, "2.9.0", "865dd8b6607e14cf03282e10e934023a1bd8be6f6bacf921a7e2a96d800cd452", [:make, :rebar3], [{:cowlib, "2.11.0", [hex: :cowlib, repo: "hexpm", optional: false]}, {:ranch, "1.8.0", [hex: :ranch, repo: "hexpm", optional: false]}], "hexpm", "2c729f934b4e1aa149aff882f57c6372c15399a20d54f65c8d67bef583021bde"}, + "cowboy_telemetry": {:hex, :cowboy_telemetry, 
"0.4.0", "f239f68b588efa7707abce16a84d0d2acf3a0f50571f8bb7f56a15865aae820c", [:rebar3], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:telemetry, "~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "7d98bac1ee4565d31b62d59f8823dfd8356a169e7fcbb83831b8a5397404c9de"}, + "cowlib": {:hex, :cowlib, "2.11.0", "0b9ff9c346629256c42ebe1eeb769a83c6cb771a6ee5960bd110ab0b9b872063", [:make, :rebar3], [], "hexpm", "2b3e9da0b21c4565751a6d4901c20d1b4cc25cbb7fd50d91d2ab6dd287bc86a9"}, + "decimal": {:hex, :decimal, "2.0.0", "a78296e617b0f5dd4c6caf57c714431347912ffb1d0842e998e9792b5642d697", [:mix], [], "hexpm", "34666e9c55dea81013e77d9d87370fe6cb6291d1ef32f46a1600230b1d44f577"}, + "dialyxir": {:hex, :dialyxir, "1.1.0", "c5aab0d6e71e5522e77beff7ba9e08f8e02bad90dfbeffae60eaf0cb47e29488", [:mix], [{:erlex, ">= 0.2.6", [hex: :erlex, repo: "hexpm", optional: false]}], "hexpm", "07ea8e49c45f15264ebe6d5b93799d4dd56a44036cf42d0ad9c960bc266c0b9a"}, + "earmark_parser": {:hex, :earmark_parser, "1.4.25", "2024618731c55ebfcc5439d756852ec4e85978a39d0d58593763924d9a15916f", [:mix], [], "hexpm", "56749c5e1c59447f7b7a23ddb235e4b3defe276afc220a6227237f3efe83f51e"}, + "elixir_uuid": {:hex, :elixir_uuid, "1.2.1", "dce506597acb7e6b0daeaff52ff6a9043f5919a4c3315abb4143f0b00378c097", [:mix], [], "hexpm", "f7eba2ea6c3555cea09706492716b0d87397b88946e6380898c2889d68585752"}, + "erlex": {:hex, :erlex, "0.2.6", "c7987d15e899c7a2f34f5420d2a2ea0d659682c06ac607572df55a43753aa12e", [:mix], [], "hexpm", "2ed2e25711feb44d52b17d2780eabf998452f6efda104877a3881c2f8c0c0c75"}, + "ex_doc": {:hex, :ex_doc, "0.28.4", "001a0ea6beac2f810f1abc3dbf4b123e9593eaa5f00dd13ded024eae7c523298", [:mix], [{:earmark_parser, "~> 1.4.19", [hex: :earmark_parser, repo: "hexpm", optional: false]}, {:makeup_elixir, "~> 0.14", [hex: :makeup_elixir, repo: "hexpm", optional: false]}, {:makeup_erlang, "~> 0.1", [hex: :makeup_erlang, repo: "hexpm", optional: false]}], "hexpm", "bf85d003dd34911d89c8ddb8bda1a958af3471a274a4c2150a9c01c78ac3f8ed"}, + "finch": {:hex, :finch, "0.12.0", "6bbb3e0bb62dd91cd1217d9682a30f5bfc9b0b74950bf10a0b4d4399c2076892", [:mix], [{:castore, "~> 0.1", [hex: :castore, repo: "hexpm", optional: false]}, {:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:mint, "~> 1.3", [hex: :mint, repo: "hexpm", optional: false]}, {:nimble_options, "~> 0.4.0", [hex: :nimble_options, repo: "hexpm", optional: false]}, {:nimble_pool, "~> 0.2.6", [hex: :nimble_pool, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "320da3f32459e7dcb77f4271b4f2445ba6c5d32cc3c7cca8e2cff599e24be5a6"}, + "hpax": {:hex, :hpax, "0.1.1", "2396c313683ada39e98c20a75a82911592b47e5c24391363343bde74f82396ca", [:mix], [], "hexpm", "0ae7d5a0b04a8a60caf7a39fcf3ec476f35cc2cc16c05abea730d3ce6ac6c826"}, + "jason": {:hex, :jason, "1.3.0", "fa6b82a934feb176263ad2df0dbd91bf633d4a46ebfdffea0c8ae82953714946", [:mix], [{:decimal, "~> 1.0 or ~> 2.0", [hex: :decimal, repo: "hexpm", optional: true]}], "hexpm", "53fc1f51255390e0ec7e50f9cb41e751c260d065dcba2bf0d08dc51a4002c2ac"}, + "makeup": {:hex, :makeup, "1.1.0", "6b67c8bc2882a6b6a445859952a602afc1a41c2e08379ca057c0f525366fc3ca", [:mix], [{:nimble_parsec, "~> 1.2.2 or ~> 1.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "0a45ed501f4a8897f580eabf99a2e5234ea3e75a4373c8a52824f6e873be57a6"}, + "makeup_elixir": {:hex, :makeup_elixir, "0.16.0", 
"f8c570a0d33f8039513fbccaf7108c5d750f47d8defd44088371191b76492b0b", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}, {:nimble_parsec, "~> 1.2.3", [hex: :nimble_parsec, repo: "hexpm", optional: false]}], "hexpm", "28b2cbdc13960a46ae9a8858c4bebdec3c9a6d7b4b9e7f4ed1502f8159f338e7"}, + "makeup_erlang": {:hex, :makeup_erlang, "0.1.1", "3fcb7f09eb9d98dc4d208f49cc955a34218fc41ff6b84df7c75b3e6e533cc65f", [:mix], [{:makeup, "~> 1.0", [hex: :makeup, repo: "hexpm", optional: false]}], "hexpm", "174d0809e98a4ef0b3309256cbf97101c6ec01c4ab0b23e926a9e17df2077cbb"}, + "mime": {:hex, :mime, "2.0.2", "0b9e1a4c840eafb68d820b0e2158ef5c49385d17fb36855ac6e7e087d4b1dcc5", [:mix], [], "hexpm", "e6a3f76b4c277739e36c2e21a2c640778ba4c3846189d5ab19f97f126df5f9b7"}, + "mint": {:hex, :mint, "1.4.1", "49b3b6ea35a9a38836d2ad745251b01ca9ec062f7cb66f546bf22e6699137126", [:mix], [{:castore, "~> 0.1.0", [hex: :castore, repo: "hexpm", optional: true]}, {:hpax, "~> 0.1.1", [hex: :hpax, repo: "hexpm", optional: false]}], "hexpm", "cd261766e61011a9079cccf8fa9d826e7a397c24fbedf0e11b49312bea629b58"}, + "nimble_options": {:hex, :nimble_options, "0.4.0", "c89babbab52221a24b8d1ff9e7d838be70f0d871be823165c94dd3418eea728f", [:mix], [], "hexpm", "e6701c1af326a11eea9634a3b1c62b475339ace9456c1a23ec3bc9a847bca02d"}, + "nimble_parsec": {:hex, :nimble_parsec, "1.2.3", "244836e6e3f1200c7f30cb56733fd808744eca61fd182f731eac4af635cc6d0b", [:mix], [], "hexpm", "c8d789e39b9131acf7b99291e93dae60ab48ef14a7ee9d58c6964f59efb570b0"}, + "nimble_pool": {:hex, :nimble_pool, "0.2.6", "91f2f4c357da4c4a0a548286c84a3a28004f68f05609b4534526871a22053cde", [:mix], [], "hexpm", "1c715055095d3f2705c4e236c18b618420a35490da94149ff8b580a2144f653f"}, + "plug": {:hex, :plug, "1.13.6", "187beb6b67c6cec50503e940f0434ea4692b19384d47e5fdfd701e93cadb4cc2", [:mix], [{:mime, "~> 1.0 or ~> 2.0", [hex: :mime, repo: "hexpm", optional: false]}, {:plug_crypto, "~> 1.1.1 or ~> 1.2", [hex: :plug_crypto, repo: "hexpm", optional: false]}, {:telemetry, "~> 0.4.3 or ~> 1.0", [hex: :telemetry, repo: "hexpm", optional: false]}], "hexpm", "02b9c6b9955bce92c829f31d6284bf53c591ca63c4fb9ff81dfd0418667a34ff"}, + "plug_cowboy": {:hex, :plug_cowboy, "2.5.2", "62894ccd601cf9597e2c23911ff12798a8a18d237e9739f58a6b04e4988899fe", [:mix], [{:cowboy, "~> 2.7", [hex: :cowboy, repo: "hexpm", optional: false]}, {:cowboy_telemetry, "~> 0.3", [hex: :cowboy_telemetry, repo: "hexpm", optional: false]}, {:plug, "~> 1.7", [hex: :plug, repo: "hexpm", optional: false]}], "hexpm", "ea6e87f774c8608d60c8d34022a7d073bd7680a0a013f049fc62bf35efea1044"}, + "plug_crypto": {:hex, :plug_crypto, "1.2.2", "05654514ac717ff3a1843204b424477d9e60c143406aa94daf2274fdd280794d", [:mix], [], "hexpm", "87631c7ad914a5a445f0a3809f99b079113ae4ed4b867348dd9eec288cecb6db"}, + "ranch": {:hex, :ranch, "1.8.0", "8c7a100a139fd57f17327b6413e4167ac559fbc04ca7448e9be9057311597a1d", [:make, :rebar3], [], "hexpm", "49fbcfd3682fab1f5d109351b61257676da1a2fdbe295904176d5e521a2ddfe5"}, + "req": {:git, "https://github.com/wojtekmach/req.git", "caeefaea471c8e864287b4cff4c259c80ae35e68", []}, + "table": {:hex, :table, "0.1.1", "e8d3c75bb14b8dac227e17d85a626e695d96c271569e94e4a2d15494a7cb0b49", [:mix], [], "hexpm", "6a73280b2f5ad70474594feeb8c034ad490b97dbd5adaeb678f71c250cbc928c"}, + "telemetry": {:hex, :telemetry, "1.1.0", "a589817034a27eab11144ad24d5c0f9fab1f58173274b1e9bae7074af9cbee51", [:rebar3], [], "hexpm", "b727b2a1f75614774cff2d7565b64d0dfa5bd52ba517f16543e6fc7efcc0df48"}, +} diff --git 
a/priv/testing/s3/response_1.json b/priv/testing/s3/response_1.json new file mode 100644 index 0000000..36d20f3 --- /dev/null +++ b/priv/testing/s3/response_1.json @@ -0,0 +1,3 @@ +["3003586","193197","755","1","24.00","30964.56","0.03","0.02","N","O","9630","9605","9651","COLLECT COD","RAIL","quests haggle furiously regular, "], +["3003586","54440","6946","2","49.00","68327.56","0.01","0.05","N","O","9658","9636","9673","DELIVER IN PERSON","FOB","heodolites are furio"], +["3003586","164196","6713","3","37.00","46627.03","0.02","0.02","N","O","9649","9663","9675","TAKE BACK RETURN","AIR","de of the unusual, regular "] \ No newline at end of file diff --git a/priv/testing/s3/response_2.json b/priv/testing/s3/response_2.json new file mode 100644 index 0000000..c0ad506 --- /dev/null +++ b/priv/testing/s3/response_2.json @@ -0,0 +1,3 @@ +["1203681","53253","769","4","47.00","56693.75","0.05","0.06","R","F","9141","9192","9155","DELIVER IN PERSON","AIR","e. furiously ironic requests wake careful"], +["1203681","14329","1833","5","4.00","4973.28","0.00","0.08","A","F","9155","9215","9170","NONE","AIR","s. slyly final"], +["1203682","11930","6933","1","13.00","23945.09","0.06","0.04","R","F","9235","9242","9259","DELIVER IN PERSON","TRUCK","ts use above the "] \ No newline at end of file diff --git a/priv/testing/snowflake_bootstrap_data_request_response.json b/priv/testing/snowflake_bootstrap_data_request_response.json new file mode 100644 index 0000000..7f01892 --- /dev/null +++ b/priv/testing/snowflake_bootstrap_data_request_response.json @@ -0,0 +1,20 @@ +{ + "code": null, + "data": { + "databases": [ + { + "createdOn": 1111111111111, + "id": 11111111, + "name": "SNOWFLAKE_SAMPLE_DATA", + "providerName": null + } + ], + "extraConsoleElements": [ + "FOOBAR" + ], + "newNotificationCount": 0, + "serverVersion": "6.17.0" + }, + "message": null, + "success": true +} \ No newline at end of file diff --git a/priv/testing/snowflake_inline_query_response.json b/priv/testing/snowflake_inline_query_response.json new file mode 100644 index 0000000..31129eb --- /dev/null +++ b/priv/testing/snowflake_inline_query_response.json @@ -0,0 +1,26 @@ +{ + "data" : { + "parameters" : [{"name":"TIMESTAMP_OUTPUT_FORMAT","value":"YYYY-MM-DD HH24:MI:SS.FF3 TZHTZM"},{"name":"CLIENT_PREFETCH_THREADS","value":4},{"name":"JS_TREAT_INTEGER_AS_BIGINT","value":false},{"name":"TIME_OUTPUT_FORMAT","value":"HH24:MI:SS"},{"name":"TIMESTAMP_TZ_OUTPUT_FORMAT","value":""},{"name":"CLIENT_RESULT_CHUNK_SIZE","value":48},{"name":"CLIENT_SESSION_KEEP_ALIVE","value":false},{"name":"CLIENT_OUT_OF_BAND_TELEMETRY_ENABLED","value":false},{"name":"CLIENT_METADATA_USE_SESSION_DATABASE","value":false},{"name":"ENABLE_STAGE_S3_PRIVATELINK_FOR_US_EAST_1","value":false},{"name":"TIMESTAMP_NTZ_OUTPUT_FORMAT","value":"YYYY-MM-DD 
HH24:MI:SS.FF3"},{"name":"CLIENT_RESULT_PREFETCH_THREADS","value":1},{"name":"CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX","value":false},{"name":"CLIENT_HONOR_CLIENT_TZ_FOR_TIMESTAMP_NTZ","value":true},{"name":"CLIENT_MEMORY_LIMIT","value":1536},{"name":"CLIENT_TIMESTAMP_TYPE_MAPPING","value":"TIMESTAMP_LTZ"},{"name":"TIMEZONE","value":"America/Los_Angeles"},{"name":"CLIENT_RESULT_PREFETCH_SLOTS","value":2},{"name":"CLIENT_TELEMETRY_ENABLED","value":true},{"name":"CLIENT_USE_V1_QUERY_API","value":true},{"name":"CLIENT_DISABLE_INCIDENTS","value":true},{"name":"CLIENT_RESULT_COLUMN_CASE_INSENSITIVE","value":false},{"name":"BINARY_OUTPUT_FORMAT","value":"HEX"},{"name":"CLIENT_ENABLE_LOG_INFO_STATEMENT_PARAMETERS","value":false},{"name":"JS_DRIVER_DISABLE_OCSP_FOR_NON_SF_ENDPOINTS","value":false},{"name":"CLIENT_TELEMETRY_SESSIONLESS_ENABLED","value":true},{"name":"CLIENT_CONSENT_CACHE_ID_TOKEN","value":false},{"name":"CLIENT_FORCE_PROTECT_ID_TOKEN","value":true},{"name":"DATE_OUTPUT_FORMAT","value":"YYYY-MM-DD"},{"name":"CLIENT_STAGE_ARRAY_BINDING_THRESHOLD","value":65280},{"name":"CLIENT_SESSION_KEEP_ALIVE_HEARTBEAT_FREQUENCY","value":3600},{"name":"CLIENT_SESSION_CLONE","value":false},{"name":"AUTOCOMMIT","value":true},{"name":"TIMESTAMP_LTZ_OUTPUT_FORMAT","value":""} ], + "rowtype" : [{"name":"L_ORDERKEY","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":0,"precision":38,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_PARTKEY","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":0,"precision":38,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_SUPPKEY","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":0,"precision":38,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_LINENUMBER","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":0,"precision":38,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_QUANTITY","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":2,"precision":12,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_EXTENDEDPRICE","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":2,"precision":12,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_DISCOUNT","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":2,"precision":12,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_TAX","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":2,"precision":12,"type":"fixed","nullable":false,"collation":null,"length":null},{"name":"L_RETURNFLAG","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":4,"scale":null,"precision":null,"type":"text","nullable":false,"collation":null,"length":1},{"name":"L_LINESTATUS","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":4,"scale":null,"precision":null,"type":"text","nullable":false,"collation":null,"length":1},{"name":"L_SHIPDATE","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":null,"precision":null,"type":"date","nullable":false,"collation":null,"length":null},{"name":"L_COMMITDATE",
"database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":null,"precision":null,"type":"date","nullable":false,"collation":null,"length":null},{"name":"L_RECEIPTDATE","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":null,"scale":null,"precision":null,"type":"date","nullable":false,"collation":null,"length":null},{"name":"L_SHIPINSTRUCT","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":100,"scale":null,"precision":null,"type":"text","nullable":false,"collation":null,"length":25},{"name":"L_SHIPMODE","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":40,"scale":null,"precision":null,"type":"text","nullable":false,"collation":null,"length":10},{"name":"L_COMMENT","database":"SNOWFLAKE_SAMPLE_DATA","schema":"TPCH_SF1","table":"LINEITEM","byteLength":176,"scale":null,"precision":null,"type":"text","nullable":false,"collation":null,"length":44} ], + "rowset" : [["3000001","14406","4407","1","22.00","29048.80","0.02","0.06","A","F","8431","8475","8459","DELIVER IN PERSON","AIR","uriously silent patterns across the f"], + ["3000002","34422","4423","1","45.00","61038.90","0.06","0.04","N","O","9401","9369","9418","NONE","AIR","al braids wake idly regular a"] + ], + "total" : 2, + "returned" : 2, + "queryId" : "0000000-0000-0000-0000-000000000000", + "databaseProvider" : null, + "finalDatabaseName" : "SNOWFLAKE_SAMPLE_DATA", + "finalSchemaName" : "TPCH_SF1", + "finalWarehouseName" : "COMPUTE_WH", + "finalRoleName" : "FOOBAR", + "numberOfBinds" : 0, + "arrayBindSupported" : false, + "statementTypeId" : 4096, + "version" : 1, + "sendResultTime" : 1111111111111, + "queryResultFormat" : "json" + }, + "code" : null, + "message" : null, + "success" : true +} \ No newline at end of file diff --git a/priv/testing/snowflake_insert_query_response.json b/priv/testing/snowflake_insert_query_response.json new file mode 100644 index 0000000..564ea09 --- /dev/null +++ b/priv/testing/snowflake_insert_query_response.json @@ -0,0 +1,30 @@ +{ + "data" : { + "parameters" : [{"name":"TIMESTAMP_OUTPUT_FORMAT","value":"YYYY-MM-DD HH24:MI:SS.FF3 TZHTZM"},{"name":"CLIENT_PREFETCH_THREADS","value":4},{"name":"JS_TREAT_INTEGER_AS_BIGINT","value":false},{"name":"TIME_OUTPUT_FORMAT","value":"HH24:MI:SS"},{"name":"TIMESTAMP_TZ_OUTPUT_FORMAT","value":""},{"name":"CLIENT_RESULT_CHUNK_SIZE","value":48},{"name":"CLIENT_SESSION_KEEP_ALIVE","value":false},{"name":"CLIENT_OUT_OF_BAND_TELEMETRY_ENABLED","value":false},{"name":"CLIENT_METADATA_USE_SESSION_DATABASE","value":false},{"name":"ENABLE_STAGE_S3_PRIVATELINK_FOR_US_EAST_1","value":false},{"name":"TIMESTAMP_NTZ_OUTPUT_FORMAT","value":"YYYY-MM-DD 
HH24:MI:SS.FF3"},{"name":"CLIENT_RESULT_PREFETCH_THREADS","value":1},{"name":"CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX","value":false},{"name":"CLIENT_HONOR_CLIENT_TZ_FOR_TIMESTAMP_NTZ","value":true},{"name":"CLIENT_MEMORY_LIMIT","value":1536},{"name":"CLIENT_TIMESTAMP_TYPE_MAPPING","value":"TIMESTAMP_LTZ"},{"name":"TIMEZONE","value":"America/Los_Angeles"},{"name":"CLIENT_RESULT_PREFETCH_SLOTS","value":2},{"name":"CLIENT_TELEMETRY_ENABLED","value":true},{"name":"CLIENT_USE_V1_QUERY_API","value":true},{"name":"CLIENT_DISABLE_INCIDENTS","value":true},{"name":"CLIENT_RESULT_COLUMN_CASE_INSENSITIVE","value":false},{"name":"BINARY_OUTPUT_FORMAT","value":"HEX"},{"name":"CLIENT_ENABLE_LOG_INFO_STATEMENT_PARAMETERS","value":false},{"name":"JS_DRIVER_DISABLE_OCSP_FOR_NON_SF_ENDPOINTS","value":false},{"name":"CLIENT_TELEMETRY_SESSIONLESS_ENABLED","value":true},{"name":"CLIENT_CONSENT_CACHE_ID_TOKEN","value":false},{"name":"CLIENT_FORCE_PROTECT_ID_TOKEN","value":true},{"name":"DATE_OUTPUT_FORMAT","value":"YYYY-MM-DD"},{"name":"CLIENT_STAGE_ARRAY_BINDING_THRESHOLD","value":65280},{"name":"CLIENT_SESSION_KEEP_ALIVE_HEARTBEAT_FREQUENCY","value":3600},{"name":"CLIENT_SESSION_CLONE","value":false},{"name":"AUTOCOMMIT","value":true},{"name":"TIMESTAMP_LTZ_OUTPUT_FORMAT","value":""} ], + "rowtype" : [{"name":"number of rows inserted","database":"","schema":"","table":"","byteLength":null,"type":"fixed","scale":0,"precision":19,"nullable":false,"collation":null,"length":null} ], + "rowset" : [["1"] ], + "total" : 1, + "returned" : 1, + "queryId" : "11111111-1111-1111-0000-111111111111", + "databaseProvider" : null, + "finalDatabaseName" : "SNOWFLAKE_SAMPLE_DATA", + "finalSchemaName" : "TPCH_SF1", + "finalWarehouseName" : "COMPUTE_WH", + "finalRoleName" : "FOOBAR", + "numberOfBinds" : 0, + "stats" : { + "numRowsInserted" : 1, + "numRowsDeleted" : 0, + "numRowsUpdated" : 0, + "numDmlDuplicates" : 0 + }, + "arrayBindSupported" : true, + "statementTypeId" : 12544, + "version" : 1, + "sendResultTime" : 1111111111111, + "queryResultFormat" : "json" + }, + "code" : null, + "message" : null, + "success" : true +} \ No newline at end of file diff --git a/priv/testing/snowflake_s3_query_response.json b/priv/testing/snowflake_s3_query_response.json new file mode 100644 index 0000000..95760a3 --- /dev/null +++ b/priv/testing/snowflake_s3_query_response.json @@ -0,0 +1,377 @@ +{ + "data":{ + "parameters":[ + { + "name":"TIMESTAMP_OUTPUT_FORMAT", + "value":"YYYY-MM-DD HH24:MI:SS.FF3 TZHTZM" + }, + { + "name":"CLIENT_PREFETCH_THREADS", + "value":4 + }, + { + "name":"JS_TREAT_INTEGER_AS_BIGINT", + "value":false + }, + { + "name":"TIME_OUTPUT_FORMAT", + "value":"HH24:MI:SS" + }, + { + "name":"TIMESTAMP_TZ_OUTPUT_FORMAT", + "value":"" + }, + { + "name":"CLIENT_RESULT_CHUNK_SIZE", + "value":48 + }, + { + "name":"CLIENT_SESSION_KEEP_ALIVE", + "value":false + }, + { + "name":"CLIENT_OUT_OF_BAND_TELEMETRY_ENABLED", + "value":false + }, + { + "name":"CLIENT_METADATA_USE_SESSION_DATABASE", + "value":false + }, + { + "name":"ENABLE_STAGE_S3_PRIVATELINK_FOR_US_EAST_1", + "value":false + }, + { + "name":"TIMESTAMP_NTZ_OUTPUT_FORMAT", + "value":"YYYY-MM-DD HH24:MI:SS.FF3" + }, + { + "name":"CLIENT_RESULT_PREFETCH_THREADS", + "value":1 + }, + { + "name":"CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX", + "value":false + }, + { + "name":"CLIENT_HONOR_CLIENT_TZ_FOR_TIMESTAMP_NTZ", + "value":true + }, + { + "name":"CLIENT_MEMORY_LIMIT", + "value":1536 + }, + { + "name":"CLIENT_TIMESTAMP_TYPE_MAPPING", + "value":"TIMESTAMP_LTZ" + }, + { + 
"name":"TIMEZONE", + "value":"America/Los_Angeles" + }, + { + "name":"CLIENT_RESULT_PREFETCH_SLOTS", + "value":2 + }, + { + "name":"CLIENT_TELEMETRY_ENABLED", + "value":true + }, + { + "name":"CLIENT_USE_V1_QUERY_API", + "value":true + }, + { + "name":"CLIENT_DISABLE_INCIDENTS", + "value":true + }, + { + "name":"CLIENT_RESULT_COLUMN_CASE_INSENSITIVE", + "value":false + }, + { + "name":"BINARY_OUTPUT_FORMAT", + "value":"HEX" + }, + { + "name":"CLIENT_ENABLE_LOG_INFO_STATEMENT_PARAMETERS", + "value":false + }, + { + "name":"JS_DRIVER_DISABLE_OCSP_FOR_NON_SF_ENDPOINTS", + "value":false + }, + { + "name":"CLIENT_TELEMETRY_SESSIONLESS_ENABLED", + "value":true + }, + { + "name":"CLIENT_CONSENT_CACHE_ID_TOKEN", + "value":false + }, + { + "name":"CLIENT_FORCE_PROTECT_ID_TOKEN", + "value":true + }, + { + "name":"DATE_OUTPUT_FORMAT", + "value":"YYYY-MM-DD" + }, + { + "name":"CLIENT_STAGE_ARRAY_BINDING_THRESHOLD", + "value":65280 + }, + { + "name":"CLIENT_SESSION_KEEP_ALIVE_HEARTBEAT_FREQUENCY", + "value":3600 + }, + { + "name":"CLIENT_SESSION_CLONE", + "value":false + }, + { + "name":"AUTOCOMMIT", + "value":true + }, + { + "name":"TIMESTAMP_LTZ_OUTPUT_FORMAT", + "value":"" + } + ], + "rowtype":[ + { + "name":"L_ORDERKEY", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":0, + "precision":38, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_PARTKEY", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":0, + "precision":38, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_SUPPKEY", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":0, + "precision":38, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_LINENUMBER", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":0, + "precision":38, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_QUANTITY", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":2, + "precision":12, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_EXTENDEDPRICE", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":2, + "precision":12, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_DISCOUNT", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":2, + "precision":12, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_TAX", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":2, + "precision":12, + "type":"fixed", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_RETURNFLAG", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":4, + "scale":null, + "precision":null, + "type":"text", + "nullable":false, + "collation":null, + "length":1 + }, + { + "name":"L_LINESTATUS", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":4, + "scale":null, + "precision":null, + 
"type":"text", + "nullable":false, + "collation":null, + "length":1 + }, + { + "name":"L_SHIPDATE", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":null, + "precision":null, + "type":"date", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_COMMITDATE", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":null, + "precision":null, + "type":"date", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_RECEIPTDATE", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":null, + "scale":null, + "precision":null, + "type":"date", + "nullable":false, + "collation":null, + "length":null + }, + { + "name":"L_SHIPINSTRUCT", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":100, + "scale":null, + "precision":null, + "type":"text", + "nullable":false, + "collation":null, + "length":25 + }, + { + "name":"L_SHIPMODE", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":40, + "scale":null, + "precision":null, + "type":"text", + "nullable":false, + "collation":null, + "length":10 + }, + { + "name":"L_COMMENT", + "database":"SNOWFLAKE_SAMPLE_DATA", + "schema":"TPCH_SF1", + "table":"LINEITEM", + "byteLength":176, + "scale":null, + "precision":null, + "type":"text", + "nullable":false, + "collation":null, + "length":44 + } + ], + "rowset":[ + + ], + "qrmk":"grommit", + "chunkHeaders":{ + "x-amz-server-side-encryption-customer-key":"def", + "x-amz-server-side-encryption-customer-key-md5":"abc" + }, + "total":4, + "returned":4, + "queryId":"01a49cd1-3200-eecb-0000-0000c20d9019", + "databaseProvider":null, + "finalDatabaseName":"SNOWFLAKE_SAMPLE_DATA", + "finalSchemaName":"TPCH_SF1", + "finalWarehouseName":"COMPUTE_WH", + "finalRoleName":"ACCOUNTADMIN", + "numberOfBinds":0, + "arrayBindSupported":false, + "statementTypeId":4096, + "version":1, + "sendResultTime":1653915916053, + "queryResultFormat":"json" + }, + "code":null, + "message":null, + "success":true +} \ No newline at end of file diff --git a/priv/testing/snowflake_valid_login_response.json b/priv/testing/snowflake_valid_login_response.json new file mode 100644 index 0000000..2dac05c --- /dev/null +++ b/priv/testing/snowflake_valid_login_response.json @@ -0,0 +1,168 @@ +{ + "data":{ + "masterToken":"ver:1-hint:xx", + "token":"ver:1-hint:xx", + "validityInSeconds":3600, + "masterValidityInSeconds":14400, + "displayUserName":"elixir", + "serverVersion":"6.17.0", + "firstLogin":false, + "remMeToken":null, + "remMeValidityInSeconds":0, + "healthCheckInterval":45, + "newClientForUpgrade":null, + "sessionId":1, + "parameters":[ + { + "name":"TIMESTAMP_OUTPUT_FORMAT", + "value":"YYYY-MM-DD HH24:MI:SS.FF3 TZHTZM" + }, + { + "name":"CLIENT_PREFETCH_THREADS", + "value":4 + }, + { + "name":"JS_TREAT_INTEGER_AS_BIGINT", + "value":false + }, + { + "name":"TIME_OUTPUT_FORMAT", + "value":"HH24:MI:SS" + }, + { + "name":"TIMESTAMP_TZ_OUTPUT_FORMAT", + "value":"" + }, + { + "name":"CLIENT_RESULT_CHUNK_SIZE", + "value":160 + }, + { + "name":"CLIENT_SESSION_KEEP_ALIVE", + "value":false + }, + { + "name":"CLIENT_OUT_OF_BAND_TELEMETRY_ENABLED", + "value":false + }, + { + "name":"CLIENT_METADATA_USE_SESSION_DATABASE", + "value":false + }, + { + "name":"ENABLE_STAGE_S3_PRIVATELINK_FOR_US_EAST_1", + "value":false + }, + { + "name":"TIMESTAMP_NTZ_OUTPUT_FORMAT", + 
"value":"YYYY-MM-DD HH24:MI:SS.FF3" + }, + { + "name":"CLIENT_RESULT_PREFETCH_THREADS", + "value":1 + }, + { + "name":"CLIENT_METADATA_REQUEST_USE_CONNECTION_CTX", + "value":false + }, + { + "name":"CLIENT_HONOR_CLIENT_TZ_FOR_TIMESTAMP_NTZ", + "value":true + }, + { + "name":"CLIENT_MEMORY_LIMIT", + "value":1536 + }, + { + "name":"CLIENT_TIMESTAMP_TYPE_MAPPING", + "value":"TIMESTAMP_LTZ" + }, + { + "name":"TIMEZONE", + "value":"America/Los_Angeles" + }, + { + "name":"CLIENT_RESULT_PREFETCH_SLOTS", + "value":2 + }, + { + "name":"CLIENT_TELEMETRY_ENABLED", + "value":true + }, + { + "name":"CLIENT_USE_V1_QUERY_API", + "value":true + }, + { + "name":"CLIENT_DISABLE_INCIDENTS", + "value":true + }, + { + "name":"CLIENT_RESULT_COLUMN_CASE_INSENSITIVE", + "value":false + }, + { + "name":"BINARY_OUTPUT_FORMAT", + "value":"HEX" + }, + { + "name":"CLIENT_ENABLE_LOG_INFO_STATEMENT_PARAMETERS", + "value":false + }, + { + "name":"JS_DRIVER_DISABLE_OCSP_FOR_NON_SF_ENDPOINTS", + "value":false + }, + { + "name":"CLIENT_TELEMETRY_SESSIONLESS_ENABLED", + "value":true + }, + { + "name":"CLIENT_CONSENT_CACHE_ID_TOKEN", + "value":false + }, + { + "name":"CLIENT_FORCE_PROTECT_ID_TOKEN", + "value":true + }, + { + "name":"DATE_OUTPUT_FORMAT", + "value":"YYYY-MM-DD" + }, + { + "name":"CLIENT_STAGE_ARRAY_BINDING_THRESHOLD", + "value":65280 + }, + { + "name":"CLIENT_SESSION_KEEP_ALIVE_HEARTBEAT_FREQUENCY", + "value":3600 + }, + { + "name":"CLIENT_SESSION_CLONE", + "value":false + }, + { + "name":"AUTOCOMMIT", + "value":true + }, + { + "name":"TIMESTAMP_LTZ_OUTPUT_FORMAT", + "value":"" + } + ], + "sessionInfo":{ + "databaseName":"SNOWFLAKE_SAMPLE_DATA", + "schemaName":"TPCH_SF1", + "warehouseName":"COMPUTE_WH", + "roleName":"FOOBAR" + }, + "idToken":null, + "idTokenValidityInSeconds":0, + "responseData":null, + "mfaToken":null, + "mfaTokenValidityInSeconds":0 + }, + "code":null, + "message":null, + "success":true +} \ No newline at end of file diff --git a/test/snowflake_insert_test.exs b/test/snowflake_insert_test.exs new file mode 100644 index 0000000..faf12a5 --- /dev/null +++ b/test/snowflake_insert_test.exs @@ -0,0 +1,67 @@ +defmodule ReqSnowflake.InsertTest do + use ExUnit.Case, async: false + alias ReqSnowflake.Result + + setup do + bypass = Bypass.open() + Application.put_env(:req_snowflake, :snowflake_hostname, "127.0.0.1") + Application.put_env(:req_snowflake, :snowflake_url, "http://127.0.0.1:#{bypass.port}") + Application.put_env(:req_snowflake, :snowflake_uuid, "0000000-0000-0000-0000-000000000000") + + {:ok, %{bypass: bypass}} + end + + test "Can insert to Snowflake", %{bypass: bypass} do + Bypass.expect(bypass, "POST", "/session/v1/login-request", fn conn -> + File.read!( + Path.join([ + :code.priv_dir(:req_snowflake), + "testing/snowflake_valid_login_response.json" + ]) + ) + |> json(conn, 200) + end) + + Bypass.expect(bypass, "POST", "/queries/v1/query-request", fn conn -> + File.read!( + Path.join([ + :code.priv_dir(:req_snowflake), + "testing/snowflake_insert_query_response.json" + ]) + ) + |> json(conn, 200) + end) + + response = + Req.new() + |> ReqSnowflake.attach( + username: "myuser", + password: "hunter2", + account_name: "elixir", + region: "us-east-1", + warehouse: "compute_wh", + role: "somerole", + database: "snowflake_sample_data", + schema: "tpch_sf1" + ) + |> Req.post!( + snowflake_query: "INSERT INTO \"foo\".\"bar\".\"baz\" (\"hello\") VALUES (?)", + bindings: %{"1" => %{type: "TEXT", value: "xxx"}} + ) + + assert response.body == %Result{ + columns: ["number of rows inserted"], + 
messages: [], + metadata: [], + num_rows: 1, + rows: [[1]], + success: true + } + end + + defp json(data, conn, status) do + conn + |> Plug.Conn.put_resp_content_type("application/json") + |> Plug.Conn.send_resp(status, data) + end +end diff --git a/test/snowflake_login_integration_test.exs b/test/snowflake_login_integration_test.exs new file mode 100644 index 0000000..4313040 --- /dev/null +++ b/test/snowflake_login_integration_test.exs @@ -0,0 +1,108 @@ +defmodule ReqSnowflakeLogin.LoginIntegrationTest do + use ExUnit.Case, async: false + @moduletag :integration + + setup do + :application.unset_env(:req_snowflake, :snowflake_hostname) + :application.unset_env(:req_snowflake, :snowflake_url) + end + + test "Can login to Snowflake using valid credentials" do + username = System.get_env("SNOWFLAKE_USERNAME") + password = System.get_env("SNOWFLAKE_PASSWORD") + account_name = System.get_env("SNOWFLAKE_ACCOUNT_NAME") + region = System.get_env("SNOWFLAKE_REGION") + warehouse = System.get_env("SNOWFLAKE_WAREHOUSE") + role = System.get_env("SNOWFLAKE_ROLE") + database = System.get_env("SNOWFLAKE_DATABASE") + schema = System.get_env("SNOWFLAKE_SCHEMA") + + response = + Req.new(http_errors: :raise) + |> ReqSnowflakeLogin.attach( + username: username, + password: password, + account_name: account_name, + region: region, + warehouse: warehouse, + role: role, + database: database, + schema: schema + ) + |> Req.post!( + url: + "https://#{ReqSnowflake.Snowflake.snowflake_host(account_name, region)}/console/bootstrap-data-request", + json: %{"dataKinds" => ["DATABASES"]} + ) + + assert response.status == 200 + refute response.body["code"] + + assert length(response.body["data"]["databases"]) > 0 + assert hd(response.body["data"]["databases"])["name"] + assert hd(response.body["data"]["databases"])["id"] + end + + test "Attempting to login to Snowflake for a valid account returns incorrect username or password RuntimeError" do + account_name = System.get_env("SNOWFLAKE_ACCOUNT_NAME") + region = System.get_env("SNOWFLAKE_REGION") + + assert_raise RuntimeError, ~r/Incorrect username or password was specified/, fn -> + Req.new(http_errors: :raise) + |> ReqSnowflakeLogin.attach( + username: "invalid", + password: "invalid", + account_name: account_name, + region: region + ) + |> Req.post!( + url: + "https://#{ReqSnowflake.Snowflake.snowflake_host(account_name, region)}/console/bootstrap-data-request", + json: %{"dataKinds" => ["DATABASES"]} + ) + end + end + + test "Attempting to login to Snowflake for an invalid account name returns an error" do + assert_raise RuntimeError, ~r/403 Forbidden/, fn -> + Req.new(http_errors: :raise) + |> ReqSnowflakeLogin.attach( + username: "invalid", + password: "invalid", + account_name: "invalid", + region: "us-east-1" + ) + |> Req.post!( + url: "https://invalid.us-east-1.snowflakecomputing.com/console/bootstrap-data-request", + json: %{"dataKinds" => ["DATABASES"]} + ) + end + end + + test "Not going to Snowflake won't try and log you in" do + username = System.get_env("SNOWFLAKE_USERNAME") + password = System.get_env("SNOWFLAKE_PASSWORD") + account_name = System.get_env("SNOWFLAKE_ACCOUNT_NAME") + region = System.get_env("SNOWFLAKE_REGION") + warehouse = System.get_env("SNOWFLAKE_WAREHOUSE") + role = System.get_env("SNOWFLAKE_ROLE") + database = System.get_env("SNOWFLAKE_DATABASE") + schema = System.get_env("SNOWFLAKE_SCHEMA") + + response = + Req.new(http_errors: :raise) + |> ReqSnowflakeLogin.attach( + username: username, + password: password, + account_name: 
+  test "Requests not going to Snowflake won't try to log you in" do
+    username = System.get_env("SNOWFLAKE_USERNAME")
+    password = System.get_env("SNOWFLAKE_PASSWORD")
+    account_name = System.get_env("SNOWFLAKE_ACCOUNT_NAME")
+    region = System.get_env("SNOWFLAKE_REGION")
+    warehouse = System.get_env("SNOWFLAKE_WAREHOUSE")
+    role = System.get_env("SNOWFLAKE_ROLE")
+    database = System.get_env("SNOWFLAKE_DATABASE")
+    schema = System.get_env("SNOWFLAKE_SCHEMA")
+
+    response =
+      Req.new(http_errors: :raise)
+      |> ReqSnowflakeLogin.attach(
+        username: username,
+        password: password,
+        account_name: account_name,
+        region: region,
+        warehouse: warehouse,
+        role: role,
+        database: database,
+        schema: schema
+      )
+      |> Req.get!(url: "https://elixir-lang.org/")
+
+    assert response.status == 200
+  end
+end
diff --git a/test/snowflake_login_test.exs b/test/snowflake_login_test.exs
new file mode 100644
index 0000000..7303998
--- /dev/null
+++ b/test/snowflake_login_test.exs
@@ -0,0 +1,131 @@
+defmodule ReqSnowflakeLogin.LoginTest do
+  use ExUnit.Case, async: false
+
+  setup do
+    bypass = Bypass.open()
+    Application.put_env(:req_snowflake, :snowflake_hostname, "127.0.0.1")
+    Application.put_env(:req_snowflake, :snowflake_url, "http://127.0.0.1:#{bypass.port}")
+    Application.put_env(:req_snowflake, :snowflake_uuid, "0000000-0000-0000-0000-000000000000")
+
+    {:ok, %{bypass: bypass}}
+  end
+
+  test "Can log in to Snowflake using valid credentials", %{bypass: bypass} do
+    Bypass.expect(bypass, "POST", "/session/v1/login-request", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/snowflake_valid_login_response.json"
+        ])
+      )
+      |> json(conn, 200)
+    end)
+
+    Bypass.expect(bypass, "POST", "/console/bootstrap-data-request", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/snowflake_bootstrap_data_request_response.json"
+        ])
+      )
+      |> json(conn, 200)
+    end)
+
+    response =
+      Req.new(http_errors: :raise)
+      |> ReqSnowflakeLogin.attach(
+        username: "elixir",
+        password: "elixir123",
+        account_name: "elixir123",
+        region: "us-east-1",
+        warehouse: "COMPUTE_WH",
+        role: "FOOBAR",
+        database: "SNOWFLAKE_SAMPLE_DATA",
+        schema: "TPCH_SF1"
+      )
+      |> Req.post!(
+        url: "http://127.0.0.1:#{bypass.port}/console/bootstrap-data-request",
+        json: %{"dataKinds" => ["DATABASES"]}
+      )
+
+    assert response.status == 200
+    refute response.body["code"]
+
+    assert length(response.body["data"]["databases"]) > 0
+    assert hd(response.body["data"]["databases"])["name"]
+    assert hd(response.body["data"]["databases"])["id"]
+  end
+
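+  # The failure-path tests below still reference setup from the integration
+  # suite (account_name/region are undefined in this module), so they are
+  # left commented out until Bypass fixtures for those error responses exist.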
+  # test "Logging in to a valid account with bad credentials raises an incorrect username or password RuntimeError" do
+  #   assert_raise RuntimeError, ~r/Incorrect username or password was specified/, fn ->
+  #     Req.new(http_errors: :raise)
+  #     |> ReqSnowflakeLogin.attach(
+  #       username: "invalid",
+  #       password: "invalid",
+  #       account_name: account_name,
+  #       region: region
+  #     )
+  #     |> Req.post!(
+  #       url:
+  #         "https://#{account_name}.#{region}.snowflakecomputing.com/console/bootstrap-data-request",
+  #       json: %{"dataKinds" => ["DATABASES"]}
+  #     )
+  #   end
+  # end
+  #
+  # test "Logging in with an invalid account name returns an error" do
+  #   assert_raise RuntimeError, ~r/403 Forbidden/, fn ->
+  #     Req.new(http_errors: :raise)
+  #     |> ReqSnowflakeLogin.attach(
+  #       username: "invalid",
+  #       password: "invalid",
+  #       account_name: "invalid",
+  #       region: "us-east-1"
+  #     )
+  #     |> Req.post!(
+  #       url: "https://invalid.us-east-1.snowflakecomputing.com/console/bootstrap-data-request",
+  #       json: %{"dataKinds" => ["DATABASES"]}
+  #     )
+  #   end
+  # end
+
+  # test "Requests not going to Snowflake won't try to log you in" do
+  #   bypass = Bypass.open()
+  #   Application.put_env(:req_snowflake, :snowflake_hostname, "127.0.0.1")
+  #   Application.put_env(:req_snowflake, :snowflake_url, "http://127.0.0.1:#{bypass.port}")
+  #
+  #   Bypass.expect(bypass, "POST", "/session/v1/login-request", fn conn ->
+  #     File.read!(
+  #       Path.join([
+  #         :code.priv_dir(:req_snowflake),
+  #         "testing/snowflake_valid_login_response.json"
+  #       ])
+  #     )
+  #     |> json(conn, 200)
+  #   end)
+  #
+  #   response =
+  #     Req.new(http_errors: :raise)
+  #     |> ReqSnowflakeLogin.attach(
+  #       username: "x",
+  #       password: "y",
+  #       account_name: "z",
+  #       region: "x",
+  #       warehouse: "a",
+  #       role: "x",
+  #       database: "a",
+  #       schema: "a"
+  #     )
+  #     |> Req.get!(url: "https://elixir-lang.org/")
+  #
+  #   assert response.status == 200
+  # end
+
+  defp json(data, conn, status) do
+    conn
+    |> Plug.Conn.put_resp_content_type("application/json")
+    |> Plug.Conn.send_resp(status, data)
+  end
+end
diff --git a/test/snowflake_query_test.exs b/test/snowflake_query_test.exs
new file mode 100644
index 0000000..48a1c32
--- /dev/null
+++ b/test/snowflake_query_test.exs
@@ -0,0 +1,345 @@
+defmodule ReqSnowflake.QueryTest do
+  use ExUnit.Case, async: false
+
+  setup do
+    bypass = Bypass.open()
+    Application.put_env(:req_snowflake, :snowflake_hostname, "127.0.0.1")
+    Application.put_env(:req_snowflake, :snowflake_url, "http://127.0.0.1:#{bypass.port}")
+    Application.put_env(:req_snowflake, :snowflake_uuid, "0000000-0000-0000-0000-000000000000")
+
+    {:ok, %{bypass: bypass}}
+  end
+
+  test "Can query Snowflake with a valid query", %{bypass: bypass} do
+    Bypass.expect(bypass, "POST", "/session/v1/login-request", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/snowflake_valid_login_response.json"
+        ])
+      )
+      |> json(conn, 200)
+    end)
+
+    Bypass.expect(bypass, "POST", "/queries/v1/query-request", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/snowflake_inline_query_response.json"
+        ])
+      )
+      |> json(conn, 200)
+    end)
+
+    response =
+      Req.new()
+      |> ReqSnowflake.attach(
+        username: "myuser",
+        password: "hunter2",
+        account_name: "elixir",
+        region: "us-east-1",
+        warehouse: "compute_wh",
+        role: "somerole",
+        database: "snowflake_sample_data",
+        schema: "tpch_sf1"
+      )
+      |> Req.post!(
+        snowflake_query: "select * from snowflake_sample_data.tpch_sf1.lineitem limit 2"
+      )
+
+    assert response.body == %ReqSnowflake.Result{
+             columns: [
+               "L_ORDERKEY",
+               "L_PARTKEY",
+               "L_SUPPKEY",
+               "L_LINENUMBER",
+               "L_QUANTITY",
+               "L_EXTENDEDPRICE",
+               "L_DISCOUNT",
+               "L_TAX",
+               "L_RETURNFLAG",
+               "L_LINESTATUS",
+               "L_SHIPDATE",
+               "L_COMMITDATE",
+               "L_RECEIPTDATE",
+               "L_SHIPINSTRUCT",
+               "L_SHIPMODE",
+               "L_COMMENT"
+             ],
+             messages: [],
+             metadata: [],
+             num_rows: 2,
+             rows: [
+               [
+                 3_000_001,
+                 14406,
+                 4407,
+                 1,
+                 "22.00",
+                 "29048.80",
+                 "0.02",
+                 "0.06",
+                 "A",
+                 "F",
+                 Date.from_iso8601!("1993-01-31"),
+                 Date.from_iso8601!("1993-03-16"),
+                 Date.from_iso8601!("1993-02-28"),
+                 "DELIVER IN PERSON",
+                 "AIR",
+                 "uriously silent patterns across the f"
+               ],
+               [
+                 3_000_002,
+                 34422,
+                 4423,
+                 1,
+                 "45.00",
+                 "61038.90",
+                 "0.06",
+                 "0.04",
+                 "N",
+                 "O",
+                 Date.from_iso8601!("1995-09-28"),
+                 Date.from_iso8601!("1995-08-27"),
+                 Date.from_iso8601!("1995-10-15"),
+                 "NONE",
+                 "AIR",
+                 "al braids wake idly regular a"
+               ]
+             ],
+             success: true
+           }
+  end
+
+  test "Can query and get data from S3", %{bypass: bypass} do
+    Bypass.expect(bypass, "POST", "/session/v1/login-request", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/snowflake_valid_login_response.json"
+        ])
+      )
+      |> json(conn, 200)
+    end)
+
+    # Point the canned response's "chunks" at Bypass URLs, which stand in for
+    # the S3 links Snowflake would normally return.
+    chunks = [
+      %{
+        "url" => "http://127.0.0.1:#{bypass.port}/s31",
+        "rowCount" => 2909,
+        "uncompressedSize" => 426_128,
+        "compressedSize" => 110_925
+      },
+      %{
+        "url" => "http://127.0.0.1:#{bypass.port}/s32",
+        "rowCount" => 247,
+        "uncompressedSize" => 36148,
+        "compressedSize" => 10189
+      }
+    ]
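+
+    # For large result sets Snowflake returns only the first rows inline and
+    # references the remainder as chunk URLs like the ones above; the client
+    # is expected to fetch each chunk, gunzip it, and append the decoded rows
+    # to the inline rowset.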
+    Bypass.expect(bypass, "POST", "/queries/v1/query-request", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/snowflake_s3_query_response.json"
+        ])
+      )
+      |> Jason.decode!()
+      |> put_in(["data", "chunks"], chunks)
+      |> Jason.encode!()
+      |> json(conn, 200)
+    end)
+
+    Bypass.expect(bypass, "GET", "/s31", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/s3/response_1.json"
+        ])
+      )
+      |> json_gzip(conn, 200)
+    end)
+
+    Bypass.expect(bypass, "GET", "/s32", fn conn ->
+      File.read!(
+        Path.join([
+          :code.priv_dir(:req_snowflake),
+          "testing/s3/response_2.json"
+        ])
+      )
+      |> json_gzip(conn, 200)
+    end)
+
+    response =
+      Req.new()
+      |> ReqSnowflake.attach(
+        username: "myuser",
+        password: "hunter2",
+        account_name: "elixir",
+        region: "us-east-1",
+        warehouse: "compute_wh",
+        role: "somerole",
+        database: "snowflake_sample_data",
+        schema: "tpch_sf1"
+      )
+      |> Req.post!(
+        snowflake_query: "select * from snowflake_sample_data.tpch_sf1.lineitem limit 2"
+      )
+
+    assert response.body == %ReqSnowflake.Result{
+             columns: [
+               "L_ORDERKEY",
+               "L_PARTKEY",
+               "L_SUPPKEY",
+               "L_LINENUMBER",
+               "L_QUANTITY",
+               "L_EXTENDEDPRICE",
+               "L_DISCOUNT",
+               "L_TAX",
+               "L_RETURNFLAG",
+               "L_LINESTATUS",
+               "L_SHIPDATE",
+               "L_COMMITDATE",
+               "L_RECEIPTDATE",
+               "L_SHIPINSTRUCT",
+               "L_SHIPMODE",
+               "L_COMMENT"
+             ],
+             messages: [],
+             metadata: [],
+             num_rows: 4,
+             rows: [
+               [
+                 3_003_586,
+                 193_197,
+                 755,
+                 1,
+                 "24.00",
+                 "30964.56",
+                 "0.03",
+                 "0.02",
+                 "N",
+                 "O",
+                 Date.from_iso8601!("1996-05-14"),
+                 Date.from_iso8601!("1996-04-19"),
+                 Date.from_iso8601!("1996-06-04"),
+                 "COLLECT COD",
+                 "RAIL",
+                 "quests haggle furiously regular, "
+               ],
+               [
+                 3_003_586,
+                 54440,
+                 6946,
+                 2,
+                 "49.00",
+                 "68327.56",
+                 "0.01",
+                 "0.05",
+                 "N",
+                 "O",
+                 Date.from_iso8601!("1996-06-11"),
+                 Date.from_iso8601!("1996-05-20"),
+                 Date.from_iso8601!("1996-06-26"),
+                 "DELIVER IN PERSON",
+                 "FOB",
+                 "heodolites are furio"
+               ],
+               [
+                 3_003_586,
+                 164_196,
+                 6713,
+                 3,
+                 "37.00",
+                 "46627.03",
+                 "0.02",
+                 "0.02",
+                 "N",
+                 "O",
+                 Date.from_iso8601!("1996-06-02"),
+                 Date.from_iso8601!("1996-06-16"),
+                 Date.from_iso8601!("1996-06-28"),
+                 "TAKE BACK RETURN",
+                 "AIR",
+                 "de of the unusual, regular "
+               ],
+               [
+                 1_203_681,
+                 53253,
+                 769,
+                 4,
+                 "47.00",
+                 "56693.75",
+                 "0.05",
+                 "0.06",
+                 "R",
+                 "F",
+                 Date.from_iso8601!("1995-01-11"),
+                 Date.from_iso8601!("1995-03-03"),
+                 Date.from_iso8601!("1995-01-25"),
+                 "DELIVER IN PERSON",
+                 "AIR",
+                 "e. furiously ironic requests wake careful"
+               ],
+               [
+                 1_203_681,
+                 14329,
+                 1833,
+                 5,
+                 "4.00",
+                 "4973.28",
+                 "0.00",
+                 "0.08",
+                 "A",
+                 "F",
+                 Date.from_iso8601!("1995-01-25"),
+                 Date.from_iso8601!("1995-03-26"),
+                 Date.from_iso8601!("1995-02-09"),
+                 "NONE",
+                 "AIR",
+                 "s. slyly final"
+               ],
+               [
+                 1_203_682,
+                 11930,
+                 6933,
+                 1,
+                 "13.00",
+                 "23945.09",
+                 "0.06",
+                 "0.04",
+                 "R",
+                 "F",
+                 Date.from_iso8601!("1995-04-15"),
+                 Date.from_iso8601!("1995-04-22"),
+                 Date.from_iso8601!("1995-05-09"),
+                 "DELIVER IN PERSON",
+                 "TRUCK",
+                 "ts use above the "
+               ]
+             ],
+             success: true
+           }
+  end
+
+  # Snowflake serves result chunks gzip-encoded, so gzip the fixture we send
+  # back through Bypass, and include some of the other headers S3 sends, for
+  # good measure.
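+  # Req's built-in decompression step should gunzip these chunk bodies
+  # transparently based on that content-encoding header, which is the code
+  # path this test exercises.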
+  defp json_gzip(data, conn, status) do
+    conn
+    |> Plug.Conn.put_resp_content_type("binary/octet-stream")
+    |> Plug.Conn.put_resp_header("content-encoding", "gzip")
+    |> Plug.Conn.put_resp_header("x-amz-id-2", "aaa")
+    |> Plug.Conn.put_resp_header("x-amz-request-id", "xxx")
+    |> Plug.Conn.put_resp_header("x-amz-server-side-encryption-customer-algorithm", "AES256")
+    |> Plug.Conn.put_resp_header("x-amz-server-side-encryption-customer-key-md5", "abcd")
+    |> Plug.Conn.put_resp_header("accept-ranges", "bytes")
+    |> Plug.Conn.send_resp(status, :zlib.gzip(data))
+  end
+
+  defp json(data, conn, status) do
+    conn
+    |> Plug.Conn.put_resp_content_type("application/json")
+    |> Plug.Conn.send_resp(status, data)
+  end
+end
diff --git a/test/test_helper.exs b/test/test_helper.exs
new file mode 100644
index 0000000..a5a8568
--- /dev/null
+++ b/test/test_helper.exs
@@ -0,0 +1 @@
+ExUnit.start(exclude: [:integration])
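+
+# Integration tests hit a real Snowflake account, so they are excluded by
+# default; run them with the SNOWFLAKE_* environment variables set and:
+#
+#   mix test --include integration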