[Enhancement]: Updating the same project with complete opensource alternatives #2
Comments
It would also help to include DeepSeek, as I have spent a significant amount of money and have never completed a task, due to persistent issues with installing tools that repeatedly fail.
Hello @bb1nfosec
Thank you for your enhancement request. Let me address each point:

Local Language Model Integration

Currently, you can configure the following environment variables:
These variables allow PentAGI to connect to a local backend (such as vLLM) that implements the OpenAI interaction specification but is deployed locally. Additionally, you can set up a LiteLLM proxy server, which offers an OpenAI-compatible interface and forwards requests to another local server. I'm going to review the LocalAI specification and get back to you on this; it seems feasible to support it as an option for selecting a custom LLM server.

Local Search Engine

Supporting Whoosh or OpenSearch doesn't appear to be a major issue. However, the data storage structure may impose constraints on configuring such an integration. What do you think about developing a simple HTTP-based protocol with a single POST endpoint for search parameters, while the connection logic to local databases is handled separately?

Locally Installed Tools

Currently, when a job is started, a Docker image is automatically selected to serve the flow. All necessary tools are either pre-installed within the image or downloaded as needed. I considered creating a custom build based on Kali Linux, but have found a suitable image in

Self-Contained Deployment

PentAGI is currently available in a docker-compose format, divided into three parts:
You can view the installation instructions: I'm planning to create a video guide on setup and configuration, including LangFuse and the observability components. There will also be a separate video and guide on securely setting up the Docker environment, covering docker-in-docker and other deployment options. Could you provide more details on what you mean in this section?

Custom Reporting

Currently, this feature is under development. There will be a standard report for flows and separate reports for each task, in Markdown and PDF formats. If you have specific requirements regarding the format and content of the reports, please share them with us, and we will incorporate them into our feature analysis.

Thank you for helping us enhance our product.
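The single-POST search protocol floated above is not yet specified; as a hedged sketch (the field names `query` and `max_results` are my assumptions, not part of PentAGI), the contract might look roughly like this, with a trivial in-memory backend standing in for a Whoosh or OpenSearch client:

```python
import json
from dataclasses import dataclass


@dataclass
class SearchRequest:
    # Assumed request fields; the real protocol would be agreed with the maintainers.
    query: str
    max_results: int = 10


def handle_search(body: bytes, backend) -> bytes:
    """Handle the single POST endpoint: parse the JSON body, delegate to a
    local search backend, and return JSON-encoded results."""
    req = SearchRequest(**json.loads(body))
    results = backend(req.query)[: req.max_results]
    return json.dumps({"results": results}).encode()


# Example with a trivial in-memory backend standing in for Whoosh/OpenSearch:
docs = ["nmap scan basics", "sqlmap tutorial", "metasploit guide"]
backend = lambda q: [d for d in docs if q in d]
resp = handle_search(b'{"query": "sqlmap", "max_results": 5}', backend)
```

Keeping the endpoint this small means the database-specific connection logic lives entirely behind the `backend` callable, which matches the separation proposed in the comment above.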
Hello @mazamizo21
I apologize for the inconvenience you've experienced. I will try to reproduce this problem on my side. In the meantime, here's a workaround: please ensure that the Docker image

As an alternative solution, you can add to your prompt that GUI-based utilities should be excluded from use during execution, and explicitly list a blacklist of utilities that should neither be installed nor used. Currently, users can only manage the job text, but we have tasks in our backlog to expand user customization to include system prompts via the UI. For a better understanding of how your task might be executed, please take a look at the list of system prompts here: System Prompts.

Thank you for your patience and understanding.
Target Component
AI Agents (Researcher/Developer/Executor)
Enhancement Description
This enhancement proposes rewriting PentAGI to eliminate reliance on public APIs (e.g., OpenAI, Google, Tavily). The new architecture would be entirely open-source, self-hosted, and locally modifiable, enabling greater flexibility, control, and cost efficiency while ensuring data privacy.
Key Changes
Local Language Model Integration:
Replace public LLM APIs with self-hosted models (e.g., LLaMA, GPT-J).
Host the model locally using frameworks like LocalAI or FastAPI.
Local Search Engine:
Replace external search APIs (e.g., Google/Tavily) with local indexing tools such as Whoosh or OpenSearch.
Locally Installed Tools:
Use open-source pentesting tools installed on the system or Dockerized (e.g., nmap, sqlmap, Metasploit).
Self-Contained Deployment:
Use Docker Compose to manage all components, ensuring easy deployment and scalability.
Custom Reporting:
Use Jinja2 and WeasyPrint for generating detailed HTML and PDF reports.
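As a rough illustration of the proposed reporting flow, the sketch below uses only Python's standard library (`string.Template` standing in for Jinja2, with the WeasyPrint PDF step noted only in a comment); the template and field names are illustrative, not part of PentAGI:

```python
from string import Template

# Stand-in for a Jinja2 template; in the proposed pipeline this would be a
# jinja2.Template, and WeasyPrint would convert the resulting HTML to PDF.
REPORT_TEMPLATE = Template(
    "<html><body>"
    "<h1>Pentest report: $target</h1>"
    "<ul>$findings</ul>"
    "</body></html>"
)


def render_report(target: str, findings: list) -> str:
    """Render a minimal HTML report from a target name and a list of findings."""
    items = "".join("<li>%s</li>" % f for f in findings)
    return REPORT_TEMPLATE.substitute(target=target, findings=items)


# With WeasyPrint installed, the PDF step would be roughly:
#   weasyprint.HTML(string=html).write_pdf("report.pdf")
html = render_report("example.local", ["Open port 22", "Outdated TLS version"])
```

The same render function could back both the standard flow report and the per-task reports mentioned earlier, with only the template swapped.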
Technical Details
Implementation Details
Framework: Host models like GPT-J or LLaMA locally with LocalAI.
API Endpoint: Serve via http://localhost:8000/generate.
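The endpoint URL above is named, but its request schema is not; the client sketch below uses only the standard library, and the JSON fields (`prompt`, `max_tokens`) are assumptions rather than a documented contract:

```python
import json
import urllib.request


def build_generate_request(prompt: str, max_tokens: int = 256) -> urllib.request.Request:
    """Build a POST request for the proposed local /generate endpoint.

    The payload fields here ("prompt", "max_tokens") are assumed, not confirmed.
    """
    payload = json.dumps({"prompt": prompt, "max_tokens": max_tokens}).encode()
    return urllib.request.Request(
        "http://localhost:8000/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_generate_request("Summarize the scan results")
# Once the local model server is running, sending it would be:
#   urllib.request.urlopen(req).read()
```

Keeping the request construction in one helper makes it easy to swap the URL or payload shape once the actual server (LocalAI, vLLM, or a LiteLLM proxy) fixes the contract.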