[editor] Fix Client Request API Endpoint to Work for Any Prod Port #619

Merged 1 commit into main from pr619 on Dec 26, 2023

Conversation

@rholinshead (Contributor) commented on Dec 26, 2023


In development, the client is served by a Node process (react-scripts) at localhost:3000 by default, while the Python server runs at localhost:8080 by default.

In 'prod', the client is bundled and statically served by the Python server, on the same port as the server itself.

So, for prod we should make requests to '/api/*', which will resolve to the hostname/port of the server that served the JS.

For dev, we can just hardcode the correct server port for now.
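A minimal sketch of that request-base logic, assuming a react-scripts setup where `NODE_ENV` is `development` under `yarn start` and `production` in the built bundle (the helper name and the example endpoint below are illustrative, not the editor client's actual code):

```typescript
// Hypothetical helper -- the real client code may organize this differently.
// react-scripts sets NODE_ENV to "development" for `yarn start` and
// "production" for `yarn build`, so this resolves at build time.
const isDev = process.env.NODE_ENV === "development";

// Dev: client on :3000, server on :8080, so point at the server explicitly.
// Prod: the Python server serves the bundle itself, so a relative path
// resolves to whatever host/port served the JS -- no hardcoded port needed.
const API_BASE = isDev ? "http://localhost:8080/api" : "/api";

export function apiEndpoint(path: string): string {
  return `${API_BASE}/${path.replace(/^\//, "")}`;
}

// Usage (endpoint name is illustrative only):
// fetch(apiEndpoint("load"), { method: "POST" });
```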

Testing:

1. `yarn build` in aiconfig/python/src/aiconfig/editor/client
2. Run the server in prod mode: `aiconfig edit --aiconfig-path=cli/aiconfig-editor/travel.aiconfig.json --server-mode='prod'` and see the editor at localhost:8080 sending correct requests

Simultaneously, start another server on a different port:
`aiconfig edit --aiconfig-path=cli/aiconfig-editor/travel.aiconfig.json --server-mode='prod' --server-port=1234`
and see the editor there as well


@rholinshead merged commit 4b2b727 into main on Dec 26, 2023 (1 check passed)
Ankush-lastmile added a commit that referenced this pull request on Dec 26, 2023:

[editor] Functionality to run server on multiple configs without specifying port (#624)
## What:

Before creating the server process, check whether the port to be used is already taken; if so, find the next available one before continuing (see the sketch below).

For context: this builds on top of #619.
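A minimal sketch of that port-probing idea, written here in TypeScript/Node purely for illustration (the actual check lives in the Python server, and the helper names below are made up): try to bind the requested port, and on failure walk upward until a free port is found.

```typescript
import * as net from "net";

// Illustration only: probe ports starting at `start` until one binds.
function findAvailablePort(start: number, maxTries = 100): Promise<number> {
  return new Promise((resolve, reject) => {
    const tryPort = (port: number, remaining: number) => {
      if (remaining === 0) {
        reject(new Error(`No free port found after ${maxTries} attempts`));
        return;
      }
      const probe = net.createServer();
      // Port already taken (e.g. EADDRINUSE) -> try the next one.
      probe.once("error", () => tryPort(port + 1, remaining - 1));
      // Port is free -> release the probe and report it.
      probe.once("listening", () => probe.close(() => resolve(port)));
      probe.listen(port, "127.0.0.1");
    };
    tryPort(start, maxTries);
  });
}

// e.g. findAvailablePort(8080).then((port) => { /* start the editor server on `port` */ });
```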

## Why

When running the editor, the user should be able to open multiple configs.

Under the hood, this means spinning up another server on an unused port. The frontend already supports dynamic ports.

## Testplan

1. `yarn build` in aiconfig/python/src/aiconfig/editor/client
2. Open two aiconfigs with the "prod" editor

| Getting Started aiconfig | Chain Of Verification aiconfig |
|------------|-------------|
| ![Screenshot 1](https://github.com/lastmile-ai/aiconfig/assets/141073967/ae710711-17f6-498f-a5cb-6eee04baa8eb) | ![Screenshot 2](https://github.com/lastmile-ai/aiconfig/assets/141073967/9f2d2b85-f60e-4c9e-a76c-e22cfbecd16e) |
| ![Screenshot 3](https://github.com/lastmile-ai/aiconfig/assets/141073967/10b28a80-2d60-4352-a296-bc0231a76002) | ![Screenshot 4](https://github.com/lastmile-ai/aiconfig/assets/141073967/96a5df41-99fb-4bee-8f3a-5826e88c8e53) |

The terminals show the command used to open the editor. Each one points to a separate config file path; no port is passed in by the user.

Chain of Verification uses port 8080; Getting Started runs in a new process using port 8081.
@rossdanlm deleted the pr619 branch on December 27, 2023 at 06:25