
LLM allow specifying the prompt per port config #152

Open
ghoeffner opened this issue Jan 13, 2025 · 2 comments · Fixed by #153
Labels: enhancement (New feature or request)

@ghoeffner commented Jan 13, 2025

Is your feature request related to a problem? Please describe.
It would be great if we could override the prompt in each individual service config, to allow more flexibility when configuring the individual services.

Describe the solution you'd like
Add a new config-file option that specifies a prompt per port.

@mariocandela (Owner) commented Jan 13, 2025


Hi German,

Nice to meet you.

Sounds good to me, I will work on it in the next few days.

Thank you for your time and contribution 😄

Cheers

Mario

@mariocandela mariocandela added the enhancement New feature or request label Jan 13, 2025
@mariocandela mariocandela linked a pull request Jan 13, 2025 that will close this issue
mariocandela added a commit that referenced this issue Jan 14, 2025
* implement new feature, custom prompt

* Add doc for custom prompt
@mariocandela mariocandela reopened this Jan 14, 2025
@mariocandela (Owner) commented Jan 14, 2025

Hi @ghoeffner,

I've added a new option to the honeypot configuration that enables the use of custom prompts. You can check out the details in the docs: Honeypot Configuration - Custom Prompt.

This feature works for both SSH and HTTP honeypots.
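To illustrate, a per-service config using this option might look like the sketch below. The exact file layout and field names (in particular `plugin.prompt`, the model name, and the command regex) are assumptions based on the Beelzebub docs; check the documentation linked above for the syntax in your version:

```yaml
# Illustrative Beelzebub SSH service config (field names assumed;
# verify against the official Custom Prompt documentation).
apiVersion: "v1"
protocol: "ssh"
address: ":2222"
description: "SSH honeypot with a per-service custom LLM prompt"
commands:
  # Forward every command line to the LLM plugin.
  - regex: "^(.+)$"
    plugin: "LLMHoneypot"
serverVersion: "OpenSSH"
plugin:
  llmModel: "gpt-4o"                 # assumed model identifier
  openAISecretKey: "sk-your-key"     # placeholder secret
  # The new per-service option: overrides the default prompt
  # for this honeypot only.
  prompt: "You are an Ubuntu 22.04 server; answer every command exactly as a real shell would."
```

Because the prompt lives in the individual service file, each port (e.g. a second HTTP honeypot config) can carry its own persona without affecting the others.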

Beelzebub version: V3.3.0

Thanks for your time, and happy hacking!

Best regards,
Mario
