Merge pull request #1 from guardrails-ai/web_sanitization_fix
Format README to template
CalebCourier authored Jul 30, 2024
2 parents 556ff15 + 62bb68d commit 3e5c415
Showing 1 changed file with 9 additions and 7 deletions.

README.md
@@ -1,28 +1,30 @@
# Overview

| Developed by | Guardrails AI |
| --- | --- |
| Date of development | Feb 15, 2024 |
| Validator type | Format |
-| Blog | - |
+| Blog | |
| License | Apache 2 |
| Input/Output | Output |

## Description

Scans LLM outputs for strings that could cause browser script execution downstream. Uses the `bleach` library to detect and escape suspect characters.

-### (Optional) Intended Use
+### Intended Use

Use this validator when you are passing the results of your LLM requests directly to a browser or other html-executable environment. It's a good idea to also implement other XSS and code injection prevention techniques.

-## Requirements
-* `bleach`
+### Requirements
+
+* Dependencies:
+    - `bleach`
+    - guardrails-ai>=0.4.0

## Installation

```bash
-guardrails hub install hub://guardrails/web_sanitization
+$ guardrails hub install hub://guardrails/web_sanitization
```

## Usage Examples
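
An aside on the mechanism: the Description in the diff above says the validator uses `bleach` to detect and escape suspect characters. One way to approximate that check is to escape all markup and flag any difference from the input, as in the minimal sketch below. This is an illustration under that assumption, not the validator's actual source; `looks_unsafe` is a hypothetical helper name.

```python
import bleach

def looks_unsafe(value: str) -> bool:
    # Allow no tags or attributes, and escape (rather than strip)
    # anything suspect, mirroring the README's description.
    cleaned = bleach.clean(value, tags=set(), attributes={}, strip=False)
    # If escaping changed the string, it contained characters a
    # browser could interpret as executable markup.
    return cleaned != value

print(looks_unsafe('<script>alert("xss")</script>'))   # True
print(looks_unsafe("A perfectly ordinary sentence."))  # False
```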
@@ -75,7 +77,7 @@ Initializes a new instance of the WebSanitization validator class.

<br>

-**`__call__(self, value, metadata={}) -> ValidationResult`**
+**`validate(self, value, metadata={}) -> ValidationResult`**

<ul>

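For readers landing here from the hub: the renamed `validate` entry point is normally exercised through a `Guard`, along the lines of the sketch below. It follows the usage pattern common to Guardrails hub validators on guardrails-ai>=0.4.0, rather than quoting this README's Usage Examples section, which is not shown in the diff.

```python
from guardrails import Guard
from guardrails.hub import WebSanitization

# Attach the validator; raise on any output that fails sanitization.
guard = Guard().use(WebSanitization, on_fail="exception")

guard.validate("Plain prose passes through untouched.")

try:
    guard.validate('Click <img src=x onerror=alert(1)> now!')
except Exception as err:
    print(f"Validation failed: {err}")
```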
