I created the following program in C++, and it didn’t produce a memory error.
int data[100000];

int main() {
    return 0;
}
I also created this program in python:
data = []
for i in range(10000):
    data.append(0)
The Python program crashes, even though it should end up taking less than a tenth of the memory. What causes this? Thanks in advance.
This is the error that is produced by python:
Traceback (most recent call last):
  File "userpy", line 42, in <module>
MemoryError: memory allocation failed, allocating 65536 bytes
Don't you just love C++ compiler optimization?
data was simply optimized away because it was never used.
Python is relatively inefficient at storing objects. Anyway, on IQ the Python heap is (IIRC) only 56k of memory, and some of that is already used for compiled bytecode etc.
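To put numbers on "not very efficient": on desktop CPython you can measure the per-object cost with sys.getsizeof. These are CPython figures for illustration only; a calculator's MicroPython-style interpreter uses a smaller object model, so the exact sizes will differ, but the point stands that "just numbers" cost much more than 4 bytes each.

```python
import sys

# CPython sizes, for illustration; MicroPython's are smaller but nonzero.
n = 10000
data = [0] * n

int_size = sys.getsizeof(0)      # one CPython int object (~28 bytes)
list_size = sys.getsizeof(data)  # the list's own array of references

print("bytes per int object:", int_size)
print("bytes for the list's slot array:", list_size)
# Each list slot is a machine word (a reference), on top of
# whatever the referenced objects themselves cost.
```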
If Python has 56k of memory, does C++ have more? Also, Python seems to crash after 8192 (2^13) elements in the array. Is there a per-array memory cap?
It’s complicated, but effectively C++ would have access to more memory.
Not that I remember, but even if you are storing just numbers, each one will use at least 4 bytes, and appending to a list can force the underlying array to be reallocated and copied. There are ways to show the free memory, but I don't have any of that information at hand tonight.
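The reallocation idea actually lines up with the numbers in the traceback: 8192 elements at 4 bytes each is 32768 bytes, and if the interpreter grows the array by doubling (a guess on my part, but a common strategy), the next buffer would be exactly the 65536 bytes the MemoryError reports. You can watch this amortized growth on desktop CPython, where sys.getsizeof shows the buffer resizing only occasionally rather than on every append:

```python
import sys

# Count how often the list's buffer actually grows during 1000 appends.
# CPython over-allocates geometrically, so most appends copy nothing;
# a calculator interpreter likely does something similar, just with far
# less heap to grow into.
data = []
resizes = 0
last = sys.getsizeof(data)
for i in range(1000):
    data.append(0)
    size = sys.getsizeof(data)
    if size != last:
        resizes += 1
        last = size

print("appends:", len(data), "buffer resizes:", resizes)
# Far fewer resizes than appends: growth cost is amortized.
```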
Bottom line is that IQ is limited in what it can do when it comes to handling large data sets.
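Following up on the "ways to show the free memory" point: if the calculator's Python is MicroPython-based (the traceback format suggests it is, though I'm not certain), the real gc.mem_free() function reports the free heap. This sketch falls back gracefully on interpreters that lack it:

```python
import gc

# gc.mem_free() exists in MicroPython; CPython's gc module has no such
# function, so look it up defensively instead of calling it blindly.
mem_free = getattr(gc, "mem_free", None)
if callable(mem_free):
    print("free heap bytes:", mem_free())  # MicroPython / calculator
else:
    print("gc.mem_free() not available here (CPython)")
```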