PEP 675: Mark malicious code example with red sidebar (#3574)
Co-authored-by: Aliaksei Urbanski <[email protected]>
hugovk and Jamim authored Dec 11, 2023
1 parent d9e47a2 commit 8ce4ba9
Showing 1 changed file with 16 additions and 11 deletions: peps/pep-0675.rst
@@ -1,14 +1,11 @@
 PEP: 675
 Title: Arbitrary Literal String Type
-Version: $Revision$
-Last-Modified: $Date$
 Author: Pradeep Kumar Srinivasan <[email protected]>, Graham Bleaney <[email protected]>
 Sponsor: Jelle Zijlstra <[email protected]>
 Discussions-To: https://mail.python.org/archives/list/[email protected]/thread/VB74EHNM4RODDFM64NEEEBJQVAUAWIAW/
 Status: Accepted
 Type: Standards Track
 Topic: Typing
-Content-Type: text/x-rst
 Created: 30-Nov-2021
 Python-Version: 3.11
 Post-History: 07-Feb-2022
@@ -50,7 +47,8 @@ However, the user-controlled data ``user_id`` is being mixed with the
 SQL command string, which means a malicious user could run arbitrary
 SQL commands:

-::
+.. code-block::
+    :class: bad

     # Delete the table.
     query_user(conn, "user123; DROP TABLE data;")
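As an aside to the flagged example: the standard runtime defense, which ``LiteralString`` complements rather than replaces, is parameterized queries. A minimal sketch using the stdlib ``sqlite3`` module (the table schema and this version of ``query_user`` are assumptions for illustration, not the PEP's exact code):

```python
import sqlite3

def query_user(conn: sqlite3.Connection, user_id: str) -> list:
    # The SQL text stays a fixed literal; the user input travels
    # separately as a bound parameter, so it can never become SQL syntax.
    return conn.execute(
        "SELECT * FROM data WHERE user_id = ?", (user_id,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (user_id TEXT)")
conn.execute("INSERT INTO data VALUES ('user123')")

# The injection payload is matched as a plain value: it finds no row,
# and the table survives.
rows = query_user(conn, "user123; DROP TABLE data;")
```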
@@ -883,15 +881,17 @@ or ``do_mark_safe`` in `Jinja2
 <https://github.com/pallets/jinja/blob/077b7918a7642ff6742fe48a32e54d7875140894/src/jinja2/filters.py#L1264>`_,
 which cause XSS vulnerabilities:

-::
+.. code-block::
+    :class: bad

     dangerous_string = django.utils.safestring.mark_safe(f"<script>{user_input}</script>")
     return(dangerous_string)

 This vulnerability could be prevented by updating ``mark_safe`` to
 only accept ``LiteralString``:

-::
+.. code-block::
+    :class: good

     def mark_safe(s: LiteralString) -> str: ...
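Under a signature like the one above, a type checker accepts literal arguments and rejects anything derived from user input; the annotation has no runtime effect. A runnable sketch (the stub body and the pre-3.11 import fallback are assumptions for illustration):

```python
try:
    from typing import LiteralString  # Python 3.11+
except ImportError:
    LiteralString = str  # runtime fallback; type checkers use the real type

def mark_safe(s: LiteralString) -> str:
    # Hypothetical stub mirroring the annotated signature above.
    return s

html = mark_safe("<b>trusted markup</b>")  # literal: a checker accepts this

user_input = "<script>alert(1)</script>"
# mark_safe(f"<script>{user_input}</script>")  # a checker would reject this
```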
@@ -913,7 +913,8 @@ insert expressions which execute arbitrary code and `compromise
 <https://www.onsecurity.io/blog/server-side-template-injection-with-jinja2/>`_
 the application:

-::
+.. code-block::
+    :class: bad

     malicious_str = "{{''.__class__.__base__.__subclasses__()[408]('rm - rf /',shell=True)}}"
     template = jinja2.Template(malicious_str)
@@ -923,7 +924,8 @@ the application:
 Template injection exploits like this could be prevented by updating
 the ``Template`` API to only accept ``LiteralString``:

-::
+.. code-block::
+    :class: good

     class Template:
         def __init__(self, source: LiteralString): ...
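``LiteralString`` is a static-only guarantee; at runtime the annotation is inert. As an illustration of the idea (this ``SafeTemplate`` wrapper is hypothetical, not Jinja2's API), substitution via ``str.format_map`` can only insert values, with no expression language for a payload to abuse:

```python
try:
    from typing import LiteralString  # Python 3.11+
except ImportError:
    LiteralString = str  # runtime fallback; type checkers use the real type

class SafeTemplate:
    # Hypothetical wrapper: the annotation lets a type checker reject
    # template sources built from user input.
    def __init__(self, source: LiteralString) -> None:
        self.source = source

    def render(self, **values: object) -> str:
        # format_map substitutes plain values only; unlike full Jinja2
        # rendering, nothing in the template is evaluated as code.
        return self.source.format_map(values)

greeting = SafeTemplate("Hello, {name}!").render(name="World")
```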
@@ -945,7 +947,8 @@ options which are vulnerable to Denial of Service attacks from
 externally controlled logging strings. The following example
 illustrates a simple denial of service scenario:

-::
+.. code-block::
+    :class: bad

     external_string = "%(foo)999999999s"
     ...
@@ -957,7 +960,8 @@ string passed to the logger be a ``LiteralString`` and that all
 externally controlled data be passed separately as arguments (as
 proposed in `Issue 46200 <https://bugs.python.org/issue46200>`_):

-::
+.. code-block::
+    :class: good

     def info(msg: LiteralString, *args: object) -> None:
         ...
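With a signature like the one above, the format string must be a literal and external data flows through ``*args``, where any ``%``-directives it contains are never interpreted as formatting. A small sketch with the stdlib ``logging`` module:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("demo")

external_string = "%(foo)999999999s"

# Bad (and rejected under the LiteralString signature): the
# attacker-controlled text becomes the format string, so the logger
# would try to build a gigabyte-sized padded message.
# logger.info(external_string, {"foo": "bar"})

# Good: the format string stays literal; the external data is an
# argument, so its %-directives are inert.
logger.info("user supplied: %s", external_string)

rendered = "user supplied: %s" % (external_string,)
```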
@@ -983,7 +987,8 @@ warnings about non-literal strings.
 4. Trivial functions could be constructed to convert a ``str`` to a
    ``LiteralString``:

-::
+.. code-block::
+    :class: bad

     def make_literal(s: str) -> LiteralString:
         letters: Dict[str, LiteralString] = {
Expand Down
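The diff is truncated at this point. As an illustration only (a hedged reconstruction of the laundering pattern, not the PEP's exact continuation), such a lookup table maps each character to a literal copy of itself, so a checker infers ``LiteralString`` for strings that were never literals:

```python
try:
    from typing import LiteralString  # Python 3.11+
except ImportError:
    LiteralString = str  # runtime fallback; type checkers use the real type

def make_literal(s: str) -> LiteralString:
    # Every mapped value is a literal, so the lookup result is typed
    # LiteralString -- the loophole the PEP calls out: arbitrary input
    # is laundered into a "literal". (Alphabet truncated for brevity;
    # characters outside it raise KeyError.)
    letters: dict[str, LiteralString] = {"a": "a", "b": "b", "c": "c"}
    return "".join([letters[c] for c in s])
```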
