Merge pull request #192 from hypercerts-org/chore/setup_instructions
Update Readme
bitbeckers authored Nov 21, 2024
2 parents 9e9caa8 + cd7fcb4 commit 46530db
Showing 6 changed files with 145 additions and 19 deletions.
41 changes: 32 additions & 9 deletions .env.template
@@ -1,14 +1,37 @@
## SERVICE
INDEXER_ENVIRONMENT=test
PORT=4000
KEY=
PROOF=

SUPABASE_CACHING_DB_URL=
SUPABASE_CACHING_ANON_API_KEY=
### W3UP - https://github.com/storacha/w3up/tree/main/packages/w3up-client#bringing-your-own-agent-and-delegation
KEY=""
PROOF=""

SUPABASE_DATA_DB_URL=
SUPABASE_DATA_ANON_API_KEY=
SUPABASE_DATA_SERVICE_API_KEY=
### SUPABASE JS - https://www.npmjs.com/package/@supabase/supabase-js

SENTRY_DSN=
SUPABASE_CACHING_DB_URL="http://127.0.0.1:54321"
SUPABASE_CACHING_ANON_API_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0

INDEXER_ENVIRONMENT=test
SUPABASE_DATA_DB_URL="http://127.0.0.1:64321"
SUPABASE_DATA_ANON_API_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0
SUPABASE_DATA_SERVICE_API_KEY=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6InNlcnZpY2Vfcm9sZSIsImV4cCI6MTk4MzgxMjk5Nn0.EGIM96RAZx35lJzdJsyH-qQwv8Hdp7fsn3W0YpN81IU

### KYSELY

CACHING_DATABASE_URL=postgresql://postgres:[email protected]:54322/postgres
DATA_DATABASE_URL=postgresql://postgres:[email protected]:64322/postgres

### SENTRY
SENTRY_ENVIRONMENT=local # local | staging | production
SENTRY_DSN="" #disabled for local
SENTRY_AUTH_TOKEN="" #disabled for local

### RPCs - we implement a fallback to the first available RPC

# https://www.alchemy.com/
ALCHEMY_API_KEY=""

# https://www.infura.io/
INFURA_API_KEY=""

# https://drpc.org/
DRPC_API_KEY=""
110 changes: 107 additions & 3 deletions README.md
@@ -1,8 +1,51 @@
# Hypercerts API

The hypercerts API is the touchpoint for developers to interact with the hypercerts ecosystem. It provides endpoints for data upload and fetch, a GraphQL API for querying (on-chain) state and a health check endpoint.
The hypercerts API is the touchpoint for developers to interact with the hypercerts ecosystem. It provides endpoints for data upload and fetching, a GraphQL API for querying (on-chain) state and a health check endpoint.

## Endpoints
## Getting started

### Environment variables

#### W3UP

In `.env.template` you'll find `KEY` and `PROOF`, the [w3up](https://web3.storage/docs/w3up-client/) agent key and delegation proof. You need to set these up yourself for local development, otherwise you'll be superadmin 😉

#### Supabase JS

SupabaseJS is used to connect to both the caching and the data service. The local variables are deterministic and provided in the template.

#### Kysely

Kysely is implemented in favor of SupabaseJS as it allows for more flexibility and complexity in queries. To connect to the database you need to set the `CACHING_DATABASE_URL` and `DATA_DATABASE_URL` in the `.env` file. By default, the local variables are set to the local Supabase instance.

#### Sentry

Sentry is used for monitoring and error reporting. You can read more about it [here](https://docs.sentry.io/platforms/javascript/guides/node/configuration/env-vars/). When `SENTRY_ENVIRONMENT` is set to `local`, Sentry is disabled by default.
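The disable-when-local behavior boils down to a small predicate. The helper below is hypothetical, shown only to illustrate the gating idea, not code from this repo:

```typescript
// Hypothetical gate: error reporting stays off when no DSN is configured
// or when the environment is "local" (mirroring .env.template's defaults).
function sentryEnabled(dsn: string | undefined, environment: string): boolean {
  return Boolean(dsn) && environment !== "local";
}

// sentryEnabled("", "local")                           → false (local default)
// sentryEnabled("https://x@sentry.io/1", "production") → true
```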

#### RPCs

The API implements a fallback to the first available RPC. You can set the RPCs in the `.env` file.
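The fallback described above can be sketched as: probe each configured RPC in order and use the first one that answers. This is an illustrative sketch under assumed names (`HealthCheck` is a hypothetical probe), not the service's actual implementation:

```typescript
// Illustrative first-available RPC selection; the real service's logic
// may differ. `HealthCheck` stands in for whatever liveness probe is used.
type HealthCheck = (rpcUrl: string) => Promise<boolean>;

async function firstAvailableRpc(
  rpcUrls: string[],
  checkHealth: HealthCheck,
): Promise<string> {
  for (const url of rpcUrls) {
    try {
      if (await checkHealth(url)) return url; // first responsive RPC wins
    } catch {
      // probe failed; fall through to the next candidate
    }
  }
  throw new Error("No RPC endpoint is available");
}
```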

### Supabase

* Install Docker
* `git submodule init`
* `git submodule update --remote`
* `pnpm supabase:start:all`

This will spin up two Supabase instances in Docker: one for the indexer service (caching) and one for the data service (static data). Both are exposed by the API.

From both instances you can get their respective keys and add them to the env vars. When in doubt, run `supabase status` to fetch the keys again. By default this is not needed for local development.

### Run the API locally

`pnpm dev`

This runs a live development instance, using `swc` to compile the code and `nodemon` to restart the server on changes.

You can then find the API at `localhost:4000/spec` (Swagger instance) and the GraphQL endpoint at `localhost:4000/v1/graphql`.
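A GraphQL call against the local endpoint is a plain JSON POST. The sketch below builds such a request; the example query's field names are assumptions, so check the playground schema before relying on them:

```typescript
// Builds fetch options for a GraphQL POST. The example query's field names
// are hypothetical; consult localhost:4000/v1/graphql for the real schema.
function buildGraphqlRequest(query: string, variables?: Record<string, unknown>) {
  return {
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables }),
  };
}

// Usage (requires the API running locally):
// await fetch("http://localhost:4000/v1/graphql",
//   buildGraphqlRequest("{ hypercerts { data { hypercert_id } } }"));
```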

## Deployments

Production: `https://api.hypercerts.org/`
Staging: `https://staging-api.hypercerts.org`
@@ -16,4 +59,65 @@ Staging: `https://staging-api.hypercerts.org`
- `build`: Generates the OpenAPI specification and routes using `tsoa`, then compiles the TypeScript code into JavaScript using `swc`. The compiled code is output to the `dist` directory.
- `start`: Starts the application in production mode.
- `lint`: Runs `eslint` on the codebase to check for linting errors.
- `test`: Runs tests using `vitest`.
- `test`: Runs tests using `vitest`

## Data

The API service exposes data from two sources:

- The static data service which contains off-chain data like user data, hypercert collections, signed order messages, etc.
- The indexer service which contains on-chain data about hypercerts and the linked data on IPFS (hypercerts, ownerships, metadata, attestations, etc.)

### Static data service

The static data service is a Supabase database which is exposed by the API. This means that you can create, update and delete data through the API. For read functionality we recommend using the GraphQL API and playground to carefully compose the data model needed for your use case.

### Indexer service

The indexer service monitors our supported chains for relevant events and handles those events accordingly. All data exposed by the indexer service is available in different sources as well, like IPFS for metadata and EAS for attestations.

## Validations

The API also provides upload and validation endpoints for hypercert and allow list data; you can find them in the [live docs](https://api.hypercerts.org/spec). Generally, the `validate` endpoints let you post data for validation without uploading it to IPFS.
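As a hedged illustration of what allow list validation involves: an allow list distributes a hypercert's total units across addresses, and the entries should sum to that total. The check below is an assumption about the data shape, not the endpoint's actual logic:

```typescript
// Illustrative pre-flight check before calling the validation endpoint.
// The entry shape and the sum rule are assumptions for illustration only.
interface AllowListEntry {
  address: string;
  units: bigint;
}

function allowListSumsTo(entries: AllowListEntry[], totalUnits: bigint): boolean {
  const sum = entries.reduce((acc, e) => acc + e.units, 0n);
  return sum === totalUnits;
}
```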

## Architecture

```mermaid
graph TB
Client[Client Applications]
API[Hypercerts API :4000]
subgraph "API Endpoints"
Swagger["/spec\nSwagger Documentation"]
GraphQL["/v1/graphql\nGraphQL Endpoint"]
Upload["Upload & Validation\nEndpoints"]
end
subgraph "Data Services"
Static[("Static Data Service\n(Supabase DB)\n- User Data\n- Collections\n- Signed Orders")]
Indexer[("Indexer Service\n(Supabase DB)\n- On-chain Data\n- IPFS Data")]
end
subgraph "External Services"
IPFS[(IPFS\nMetadata Storage)]
Blockchain[(Blockchain\nSupported Chains)]
EAS[(EAS\nAttestations)]
end
Client --> API
API --> Swagger
API --> GraphQL
API --> Upload
GraphQL --> Static
GraphQL --> Indexer
Upload --> IPFS
Indexer --> Blockchain
Indexer --> IPFS
Indexer --> EAS
class Swagger,GraphQL,Upload apiEndpoint;
class Static,Indexer database;
class IPFS,Blockchain,EAS external;
```
2 changes: 1 addition & 1 deletion lib/hypercerts-indexer
4 changes: 2 additions & 2 deletions src/__generated__/routes/routes.ts

Some generated files are not rendered by default.

4 changes: 2 additions & 2 deletions src/__generated__/swagger.json


3 changes: 1 addition & 2 deletions src/instrument.mjs
@@ -2,10 +2,9 @@ import * as Sentry from '@sentry/node';
import {assertExists} from "./utils/assertExists.js";
import { nodeProfilingIntegration } from "@sentry/profiling-node";

const SENTRY_DSN = assertExists(process.env.SENTRY_DSN, "SENTRY_DSN");
// Ensure to call this before importing any other modules!
Sentry.init({
dsn: SENTRY_DSN,
dsn: process.env.SENTRY_DSN,
integrations: [
nodeProfilingIntegration(),
],
