Possible unnecessary OverflowError in random.getrandbits #133489
Comments
@serhiy-storchaka The overflow occurs in the upstream call to … The argument clinic spec reads: …
Should …
Why not have an int32 converter in the AC? Now we have PyLong_AsInt32/AsUInt32/etc.
@skirpichev We want something like ssize_t so that we can ask for arrays as big as we can allocate. The limit should be driven by memory constraints rather than function argument constraints. 32-bit signed is insufficient.
It was in my plans. For this case it is better to use the size_t converter (automatic ValueError for a negative value), or even the uint64 converter (to be able to create bytes objects up to 2 GiB on 32-bit platforms).
Yes, int32 was just a typo - I meant converters for fixed-size (unsigned) integers. An 8-byte type looks better, of course, for the given case.
See #133583.
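For context, a minimal sketch of the limit under discussion, assuming an interpreter that still parses the getrandbits() argument as a signed 32-bit C int (the exact error wording varies by version):

```python
import random

# On an affected interpreter, getrandbits() parses its argument with a signed
# 32-bit C int converter, so k >= 2**31 is rejected during argument parsing
# rather than by memory pressure.  On an interpreter with GH-133658 applied,
# this call instead succeeds (and allocates roughly 256 MiB).
try:
    random.getrandbits(2**31)
except OverflowError as exc:
    print("rejected:", exc)  # message mentions the C int limit; wording varies by version
```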
…tes() random.getrandbits() can now generate more than 2**31 bits. random.randbytes() can now generate more than 256 MiB.
#133658 makes … What should we do with other versions, @Raymond? I think that it is not too late to backport it to 3.14, and the backport is straightforward, because the uint64 converter exists in 3.14. In 3.13 we could use unsigned_long_long, which is practically the same. Or we can document the limitation if it is too late for a backport.
Raising OverflowError feels like a bug, so backporting all the way to 3.13 as a bug fix makes sense to me.
…H-133658) random.getrandbits() can now generate more than 2**31 bits. random.randbytes() can now generate more than 256 MiB.
…tes() (pythonGH-133658) random.getrandbits() can now generate more than 2**31 bits. random.randbytes() can now generate more than 256 MiB. (cherry picked from commit 68784fe) Co-authored-by: Serhiy Storchaka <[email protected]>
… randbytes() (pythonGH-133658) random.getrandbits() can now generate more than 2**31 bits. random.randbytes() can now generate more than 256 MiB. (cherry picked from commit 68784fe) Co-authored-by: Serhiy Storchaka <[email protected]>
…ytes() (GH-133658) (#134964) gh-133489: Remove size restrictions on getrandbits() and randbytes() (GH-133658) random.getrandbits() can now generate more than 2**31 bits. random.randbytes() can now generate more than 256 MiB. (cherry picked from commit 68784fe) Co-authored-by: Serhiy Storchaka <[email protected]>
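To illustrate the change described in these commits, a quick check assuming an interpreter that includes GH-133658 (note that both calls below allocate hundreds of MiB):

```python
import random

# With the size restrictions removed, the practical limit is available memory,
# not the argument converter.
n = random.getrandbits(2**31)   # previously rejected with OverflowError
b = random.randbytes(2**28)     # previously capped at 2**28 - 1 bytes
print(n.bit_length(), len(b))
```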
There is an (unintended?) API break. In Python 3.13.3, … but in Python 3.13.4, … This causes trouble for sagemath, where integers are by default a special type.
Can you please open a new issue to track that?
Ah, that's because _PyLong_UnsignedLongLong_Converter() has no fallback to index-like objects. In 3.14+ we are using PyLong_AsNativeBytes for the uint64 converter, so it's not an issue there. Maybe just add a uint64 converter for 3.13 (it has PyLong_AsNativeBytes), or is this too much?
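A hedged sketch of the regression described above, using a hypothetical IndexLike class that defines __index__ as a stand-in for an integer-like type such as SageMath's Integer (the rejection most likely surfaces as a TypeError):

```python
import random

class IndexLike:
    """Toy stand-in for an integer-like type that defines __index__."""
    def __init__(self, value):
        self._value = value
    def __index__(self):
        return self._value

try:
    # Works on 3.13.3 and on 3.14+ (PyLong_AsNativeBytes-based converter),
    # but is rejected on 3.13.4, whose converter does not fall back to
    # __index__, as described in the comment above.
    print(random.getrandbits(IndexLike(8)))
except TypeError as exc:
    print("regression:", exc)
```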
Documentation
Problem
The actual upper bound of a random.Random.randbytes argument is 2**28 - 1.
Following the docs and the exception message, I would expect the limit to be 2**32 - 1, as for a C 4-byte integer.
Reproduced on
Python 3.9.4 (tags/v3.9.4:1f2e308, Apr 6 2021, 13:40:21) [MSC v.1928 64 bit (AMD64)] on win32
Python 3.11.2 (main, Jul 19 2024, 12:24:02) [GCC 12.2.0] on linux
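A minimal reproduction consistent with the bound above (in CPython, Random.randbytes(n) delegates to getrandbits(n * 8), so the 2**31 - 1 bit cap on getrandbits() translates into a 2**28 - 1 byte cap); note that the first call allocates close to 256 MiB:

```python
import random

r = random.Random()

r.randbytes(2**28 - 1)   # accepted: 8 * (2**28 - 1) bits still fits in a signed C int
try:
    r.randbytes(2**28)   # rejected on the affected versions listed above
except OverflowError as exc:
    print(exc)
```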
Could you please point out whether this is a bug in the docs or the implementation, or only my misunderstanding?
Thank you in advance for your patience.
Linked PRs
random.randbytes bounds. #133529