
Support bypassing cluster local kubeconfig in favor of explicit kubeconfig #388

Open
xrmzju opened this issue Jan 7, 2020 · 4 comments
xrmzju commented Jan 7, 2020

Update by @consideRatio

If KubeSpawner is to do work in another cluster, it needs a way to be passed the credentials to speak with that cluster's api-server. Currently our logic doesn't support this.

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Load kubernetes config here, since this is a Singleton and
        # so this __init__ will be run way before anything else gets run.
        try:
            config.load_incluster_config()
        except config.ConfigException:
            config.load_kube_config()
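The loading order above is "in-cluster first, kubeconfig only as a fallback", which is exactly what this issue asks to bypass. A minimal sketch of that selection order as a pure function (the `explicit_kubeconfig` parameter is a hypothetical new option, not an existing KubeSpawner setting; the real calls it stands in for are noted in comments):

```python
def pick_config_source(in_cluster_ok, explicit_kubeconfig=None):
    """Model the config-loading order discussed in this issue.

    Returns which loader would run:
      "incluster"  -> kubernetes.config.load_incluster_config()
      "kubeconfig" -> kubernetes.config.load_kube_config(...)

    `explicit_kubeconfig` is a hypothetical option: when set, it should
    win even inside a pod that has in-cluster credentials.
    """
    if explicit_kubeconfig is not None:
        return "kubeconfig"  # requested behaviour: bypass in-cluster config
    if in_cluster_ok:
        return "incluster"   # current behaviour inside a pod
    return "kubeconfig"      # current fallback outside a cluster


# Today: in-cluster config always wins inside a pod with a service account.
print(pick_config_source(in_cluster_ok=True))
# Requested: an explicit kubeconfig should take precedence.
print(pick_config_source(in_cluster_ok=True, explicit_kubeconfig="b.kubeconfig"))
```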

Original issue

(blank)

@consideRatio
Member

I need more details here to better think about this.

I use kubespawner from a jupyterhub living in a container in a pod in the k8s cluster it controls. I do it as part of the zero-to-jupyterhub-k8s project. This jupyterhub pod has the rights to work with the k8s api-server in the cluster via the RBAC rules configured for the pod's service account.

Please describe your use case a bit @xrmzju, that is essential for work to be done towards it.

@xrmzju
Author

xrmzju commented Jan 7, 2020

> I need more details here to better think about this.
>
> I use kubespawner from a jupyterhub living in a container in a pod in the k8s cluster it controls. I do it as part of the zero-to-jupyterhub-k8s project. This jupyterhub pod has the rights to work with the k8s api-server in the cluster via the RBAC rules configured for the pod's service account.
>
> Please describe your use case a bit @xrmzju, that is essential for work to be done towards it.

Sorry, I just opened the issue as a marker yesterday. In my case, I need to deploy JupyterHub in cluster A and create notebook pods in cluster B, so KubeSpawner needs to support loading an external cluster config (in my case, a kubeconfig). For now, it uses the default in-cluster client.
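For reference, the kind of credentials involved here: a standalone kubeconfig for cluster B that the hub running in cluster A would load explicitly. All names, the server URL, and the placeholder values below are illustrative, not defaults:

```yaml
apiVersion: v1
kind: Config
clusters:
  - name: cluster-b
    cluster:
      server: https://cluster-b.example.com:6443
      certificate-authority-data: <base64-encoded CA>
users:
  - name: hub-spawner
    user:
      token: <service-account token for cluster B>
contexts:
  - name: cluster-b
    context:
      cluster: cluster-b
      user: hub-spawner
current-context: cluster-b
```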

@manics
Member

manics commented Jan 7, 2020

@xrmzju What happens when you specify a kubeconfig at the moment?

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Load kubernetes config here, since this is a Singleton and
        # so this __init__ will be run way before anything else gets run.
        try:
            config.load_incluster_config()
        except config.ConfigException:
            config.load_kube_config()

implies it should already work.

@xrmzju
Author

xrmzju commented Jan 8, 2020

> @xrmzju What happens when you specify a kubeconfig at the moment?
>
>     def __init__(self, *args, **kwargs):
>         super().__init__(*args, **kwargs)
>         # Load kubernetes config here, since this is a Singleton and
>         # so this __init__ will be run way before anything else gets run.
>         try:
>             config.load_incluster_config()
>         except config.ConfigException:
>             config.load_kube_config()
>
> implies it should already work.

I made it work by passing an api_client at L43:

    client = Client(api_client=config.new_client_from_config(), *args, **kwargs)

@consideRatio consideRatio changed the title support load kubeconfig Support bypassing cluster local kubeconfig in favor of explicit kubeconfig Oct 25, 2020