In today's lecture, we talked about the reason Linux uses linked lists for managing threads, instead of arrays (with bitmaps indicating each element's state).

The main reason seems to be that when the number of elements to be stored exceeds the array's size, a new allocation (and a copy of the old contents) is needed, which is undesirable.
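For context, the kernel threads its task list through the task structures themselves using an intrusive list: `struct task_struct` embeds a `struct list_head` (the `tasks` field), so insertion and removal are O(1) and never require a separate node allocation. Below is a minimal user-space sketch of that idea, modeled on the kernel's `list_head` API from `include/linux/list.h`; it is an illustrative reimplementation, not the kernel code itself:

```c
#include <stdio.h>
#include <stddef.h>

/* Intrusive doubly linked list node, in the spirit of the
 * kernel's struct list_head. */
struct list_head {
    struct list_head *next, *prev;
};

static void list_init(struct list_head *head)
{
    head->next = head->prev = head;
}

/* Insert 'new' right after 'head': O(1), no allocation needed. */
static void list_add(struct list_head *new, struct list_head *head)
{
    new->next = head->next;
    new->prev = head;
    head->next->prev = new;
    head->next = new;
}

/* Unlink 'entry' from whatever list it is on: O(1). */
static void list_del(struct list_head *entry)
{
    entry->prev->next = entry->next;
    entry->next->prev = entry->prev;
}

/* A toy "task": the list node lives inside the object itself,
 * so linking a task costs nothing beyond the task's own storage. */
struct task {
    int pid;
    struct list_head tasks;  /* like task_struct's 'tasks' field */
};

int main(void)
{
    struct list_head threads;
    struct task a = { .pid = 1 }, b = { .pid = 2 };

    list_init(&threads);
    list_add(&a.tasks, &threads);  /* list is now: head -> a */
    list_add(&b.tasks, &threads);  /* list is now: head -> b -> a */
    list_del(&a.tasks);            /* list is now: head -> b */

    /* Recover the enclosing task from the embedded node
     * (what the kernel's container_of() macro does). */
    struct task *first = (struct task *)((char *)threads.next -
                                         offsetof(struct task, tasks));
    printf("first task: pid %d\n", first->pid);
    return 0;
}
```

The point is that growing or shrinking the set of tasks never triggers a reallocation or a copy, which is exactly the cost the array approach has to pay.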
But I'd like to ask why we don't use a pre-allocated array that is large enough to hold any possible number of elements.
My question is based on the following assumptions:

- The maximum number of elements is usually bounded, e.g. by physical memory size or by system configuration (in the case of the number of threads).
- Most OSes (including Linux) allocate physical memory lazily, so a large but untouched array would not create a memory-utilization problem (see the sketch after this list).
- The 64-bit virtual address space is vast.
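To make the lazy-allocation assumption concrete, here is a minimal user-space sketch: it maps a large anonymous region and touches only a few pages, so only those pages get physical backing. The 1 GiB size, the 4 KiB page-size constant, and the touch pattern are illustrative assumptions, not anything from the lecture:

```c
#include <stdio.h>
#include <sys/mman.h>

int main(void)
{
    /* Reserve 1 GiB of virtual address space. MAP_NORESERVE asks the
     * kernel not to reserve swap up front; anonymous pages only get
     * physical backing when they are first written. */
    size_t len = 1UL << 30;
    char *arr = mmap(NULL, len, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS | MAP_NORESERVE, -1, 0);
    if (arr == MAP_FAILED) {
        perror("mmap");
        return 1;
    }

    /* Touch only a handful of pages (assuming 4 KiB pages): resident
     * memory stays tiny even though 1 GiB of virtual space is mapped,
     * as can be checked via RSS in top/ps. */
    for (size_t i = 0; i < 16; i++)
        arr[i * 4096] = 1;

    munmap(arr, len);
    return 0;
}
```

This is the behavior the "pre-allocate a huge array" idea would rely on: the virtual reservation is nearly free, and physical cost is proportional only to the slots actually used.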