feat: deploy ooniprobe service to prod #77
Conversation
Terraform Run Output 🤖 Format and Style 🖌

| | |
| --- | --- |
| Pusher | @DecFox |
| Action | pull_request |
| Environment | dev |
| Workflow | .github/workflows/check_terraform.yml |
| Last updated | Fri, 19 Jul 2024 18:26:25 GMT |
Ansible Run Output 🤖 Ansible Playbook Recap 🔍

Ansible playbook output 📖

| | |
| --- | --- |
| Pusher | @DecFox |
| Action | pull_request |
| Working Directory | |
| Workflow | .github/workflows/check_ansible.yml |
| Last updated | Fri, 19 Jul 2024 15:55:49 GMT |
This looks good to me. One thing we should be careful of, however, is the auto scaling group sizing: https://github.com/ooni/devops/blob/main/tf/environments/prod/main.tf#L283. I believe the current counts should be enough, but we should be cautious here: if we get the numbers wrong, we risk ending up without enough resources to keep all the services online.
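For illustration, a minimal sketch of what such an autoscaling group block looks like; the resource name, counts, and variables here are hypothetical and do not reflect the actual values in `prod/main.tf`:

```hcl
# Hypothetical example only: names, sizes, and variables are illustrative,
# not the real configuration in tf/environments/prod/main.tf.
resource "aws_autoscaling_group" "ooniapi" {
  name                = "ooniapi-asg"
  min_size            = 2 # floor of two instances, so losing one does not
  max_size            = 4 # take all the services offline
  desired_capacity    = 2
  vpc_zone_identifier = var.subnet_ids

  launch_template {
    id      = aws_launch_template.ooniapi.id
    version = "$Latest"
  }
}
```

The point of the caution above is the `min_size`/`desired_capacity` pair: set them too low and a deploy or instance failure can leave the cluster without capacity for all services.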
needed for ooni/probe#2781
```hcl
source = "../../modules/ooniapi_service"

# first_run should be set on the first run to bootstrap the task definition
first_run = true
```
Be sure to flip this after first deploy.
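As a rough sketch of why the flag needs flipping, one common way a `first_run` bootstrap flag is wired in an ECS service module is shown below. All names here (`aws_ecs_service.this`, the data source, the variables) are assumptions for illustration, not the actual internals of `modules/ooniapi_service`:

```hcl
# Illustrative sketch only; not the real ooniapi_service module code.
variable "first_run" {
  description = "Set to true only on the first apply, to bootstrap the task definition"
  type        = bool
  default     = false
}

# Look up the revision most recently registered (e.g. by the deploy pipeline).
data "aws_ecs_task_definition" "current" {
  task_definition = aws_ecs_task_definition.this.family
}

resource "aws_ecs_service" "this" {
  name          = "ooniprobe"
  cluster       = var.cluster_id
  desired_count = 2

  # On the first run, use the task definition Terraform just created;
  # afterwards, defer to whatever revision deploys have registered since,
  # so Terraform does not roll the service back on every apply.
  task_definition = var.first_run ? aws_ecs_task_definition.this.arn : data.aws_ecs_task_definition.current.arn
}
```

With this pattern, leaving `first_run = true` after the initial apply would keep pinning the service to Terraform's own task definition, which is why it has to be flipped back after the first deploy.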
LGTM
Already in prod, closing this
@LDiazN AFAIK this is not deployed in prod
Yes, you are right to point out that the VPN-related endpoints are not currently exposed from the load balancer configuration. That's because we have made the decision to only expose the […]. The most likely approach we might follow for delivering the OpenVPN configurations going forward is to support that as part of the functionality we are planning to place inside a new iteration of OONI Run v2 (see: ooni/backend#927). For the time being we are going to keep the OpenVPN configuration hardcoded and the addresses passed via DNS, pending the work on OONI Run v2.1.
This diff adds the necessary Terraform code to deploy the `ooniprobe` backend service to prod.