
gh-126119: fix some crashes in code objects if co_stacksize is absurdly large #126122

Status: Open — wants to merge 15 commits into main
Conversation

@picnixz (Contributor) commented on Oct 29, 2024

Review comment on Objects/frameobject.c (outdated, resolved)
@picnixz (Contributor, Author) commented on Oct 29, 2024

I found out that this won't be sufficient, so I'm converting it to a draft for now until I patch more attackable entry points.

@picnixz picnixz marked this pull request as draft October 29, 2024 13:03
@markshannon (Member) commented

How about rejecting excessively large co_stacksize when creating code objects?
That would be a much simpler fix.

@picnixz (Contributor, Author) commented on Oct 29, 2024

> How about rejecting excessively large co_stacksize when creating code objects?

That's what I'm actually doing (or am I not?). It's just that the check expression isn't the best one, and I'm trying to see whether I can make other places in the code overflow with:

    if ((size_t)con->stacksize
        >= (INT_MAX / sizeof(PyObject *))  - FRAME_SPECIALS_SIZE - nlocalsplus)
    {
        PyErr_SetString(PyExc_OverflowError, "code: co_stacksize is too large");
        return -1;
    }
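The bound in that C check can be sanity-checked from Python. The sketch below is illustrative only: it assumes a 64-bit build where `sizeof(PyObject *) == 8`, and uses `FRAME_SPECIALS_SIZE = 10` and `nlocalsplus = 0` as hypothetical values taken from later comments in this thread.

```python
# Sketch of the rejection bound from the C check above (assumptions:
# 64-bit build, sizeof(PyObject *) == 8, FRAME_SPECIALS_SIZE == 10,
# nlocalsplus == 0 -- all illustrative, not authoritative).
INT_MAX = 2**31 - 1
SIZEOF_PYOBJECT_PTR = 8
FRAME_SPECIALS_SIZE = 10
nlocalsplus = 0

# Smallest co_stacksize the check would reject.
limit = (INT_MAX // SIZEOF_PYOBJECT_PTR) - FRAME_SPECIALS_SIZE - nlocalsplus

def is_rejected(stacksize):
    # Mirrors: (size_t)con->stacksize >= INT_MAX/sizeof(PyObject *)
    #          - FRAME_SPECIALS_SIZE - nlocalsplus
    return stacksize >= limit

print(limit)                   # 268435445 under these assumptions
print(is_rejected(limit - 1))  # False: just under the bound, accepted
print(is_rejected(limit))      # True: rejected with OverflowError in C
```

Under these assumptions the smallest rejected value, 268435445, matches the `smallest_evil_co_stacksize = (_testcapi.INT_MAX // ps) - 10` computed in a later comment.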

@picnixz (Contributor, Author) commented on Oct 29, 2024

Ok, I found different issues. It's possible to bypass the check I added in the code object validator by attacking a generator's frame. I'll need a bit more time for that, but I won't be coding any more today.

The evil size that should be checked in the code constructor is "(_testcapi.INT_MAX // sizeof(PyObject *)) - 10", but the following crashes:

        import ctypes, types

        def f(): yield  # generator whose code object we tamper with

        ps = ctypes.sizeof(ctypes.c_void_p)  # sizeof(PyObject *)
        smallest_evil_co_stacksize = (_testcapi.INT_MAX // ps) - 10
        evil = f().gi_frame.f_code.__replace__(co_stacksize=smallest_evil_co_stacksize - 1)  # ok
        evil_gi = types.FunctionType(evil, {})()  # crashes
        self.assertRaises(OverflowError, evil_gi.__sizeof__)  # crashes

With smallest_evil_co_stacksize, __replace__ raises as expected, but with smallest_evil_co_stacksize - 1 it's FunctionType that crashes.

@markshannon (Member) commented on Oct 29, 2024

I meant choose a large value, rather than checking for overflow.
That way we won't need to check for overflow anywhere else.

Something like `co_nlocals + co_stacksize < INT_MAX/2/SIZEOF_VOID_P`, or more simply `co_nlocals + co_stacksize < INT_MAX/8`.
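As a rough check of the headroom the first of those bounds would leave (a sketch assuming a 64-bit platform, i.e. `SIZEOF_VOID_P == 8`; the cutoff value is only what the inequality implies, not a value from the patch):

```python
INT_MAX = 2**31 - 1
SIZEOF_VOID_P = 8  # assumption: 64-bit platform

# Cap on co_nlocals + co_stacksize implied by the suggested inequality.
cap = INT_MAX // 2 // SIZEOF_VOID_P

# With this cap, the pointer-slot portion of a frame stays well under
# INT_MAX/2 bytes, so downstream size arithmetic in C ints has headroom
# and no per-site overflow checks are needed.
worst_case_bytes = cap * SIZEOF_VOID_P
print(cap)                                # 134217727
print(worst_case_bytes)                   # 1073741816
print(worst_case_bytes < INT_MAX // 2)    # True
```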

@picnixz (Contributor, Author) commented on Oct 29, 2024

> That way we won't need to check for overflow anywhere else.

Do you have a good value in mind? I'm not sure whether changing the stacksize bound could affect the recursion limit or tracebacks, for instance (clearly not an expert here). I'm going offline now, but we can continue this discussion tomorrow.


Also, maybe I could find a way to pass the check in the code object but still crash when creating fake generators, so the choice of co_nlocals + co_stacksize < INT_MAX / 8 may not be sufficient, I think (or am I messing up my arithmetic?). I'm a bit tired, so I'll check tomorrow.


EDIT (30th October): I'm not sure I'll be able to work on this today (I have some IRL stuff to do and won't be available much).

@picnixz (Contributor, Author) commented on Oct 31, 2024

The free-threaded build crashes (just "Segmentation fault (core dumped)", no traceback) but I don't know why... @colesbury, do you have any idea? The bad code is:

import types

def f(): yield

code = f().gi_frame.f_code
evil_code = code.__replace__(co_stacksize=268435444)  # OK
evil_gi_func = types.FunctionType(evil_code, {})      # OK
evil_gi = evil_gi_func()                              # not OK: crashes

@picnixz picnixz marked this pull request as ready for review October 31, 2024 11:25
@picnixz picnixz changed the title gh-126119: fix an overflow on co_framesize if co_stacksize is absurdely large gh-126119: fix some crashes in code objects if co_stacksize is absurdely large Oct 31, 2024
@ZeroIntensity (Member) commented

I'll take a look at the free-threaded failures.

@ZeroIntensity (Member) commented

Ah @picnixz, the problem is here. We need an overflow check, or perhaps we should prevent the code object from being able to do that in the first place.

@picnixz (Contributor, Author) commented on Oct 31, 2024

> We need an overflow check, or perhaps we should prevent the code object from being able to do that in the first place.

That's what I'm actually doing; it's exactly the check I'm performing. But I wonder why _PyCode_Validate is not called prior to this point! It seems that some validation step is being skipped, because code->co_nlocalsplus + code->co_stacksize is < co_framesize, and I do check co_framesize when constructing the code object. I'll try to dive deeper into it.

(Thanks for finding it btw)

@picnixz (Contributor, Author) commented on Oct 31, 2024

For the evil example above, I have code->co_nlocalsplus + code->co_stacksize = 268435454 (which is co_stacksize + FRAME_SPECIALS_SIZE, with FRAME_SPECIALS_SIZE = 10). By the way, the crash happens at i = 4595. Isn't that close to some recursion limit or stack limit?
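The numbers quoted here line up with the earlier reproducer (a quick arithmetic check, assuming 64-bit pointers so each frame slot is 8 bytes; FRAME_SPECIALS_SIZE = 10 is taken from the comment above):

```python
INT_MAX = 2**31 - 1
FRAME_SPECIALS_SIZE = 10          # from the comment above
co_stacksize = 268435444          # value used in the crashing reproducer

# Total frame slots as reported in this comment.
framesize_slots = co_stacksize + FRAME_SPECIALS_SIZE
print(framesize_slots)            # 268435454, matching the reported value

# At 8 bytes per slot (64-bit assumption), the byte size squeaks in
# just under INT_MAX, which is why an int-based size check can pass
# while the allocation is still unreasonably large.
framesize_bytes = framesize_slots * 8
print(framesize_bytes)            # 2147483632
print(framesize_bytes < INT_MAX)  # True, by only 15 bytes
```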

Review comment on Lib/test/test_frame.py (outdated, resolved)
@colesbury colesbury changed the title gh-126119: fix some crashes in code objects if co_stacksize is absurdely large gh-126119: fix some crashes in code objects if co_stacksize is absurdly large Nov 7, 2024