Description
When a _Nt_checked array whose elements don't contain checked pointers is allocated on the stack, the compiler enforces that if the array has an initializer, the initializer must end with a null terminator. However, the compiler allows the array to be left entirely uninitialized, with no null terminator at all. This can allow the bounds of a pointer to the array to be widened indefinitely, far past the array's actual extent.
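To illustrate the inconsistency, here is a minimal sketch of the three declaration forms (Checked C syntax; the variable names are illustrative, and the accept/reject behavior is as described above):

```
// Accepted: the initializer ends with a null terminator.
long ok _Nt_checked[2] = { 42, 0 };

// Rejected by the compiler: the initializer lacks a null terminator.
// long bad _Nt_checked[2] = { 42, 1 };

// Also accepted (the hole): no initializer, so nothing guarantees
// that any element is ever zero.
long uninit _Nt_checked[1];
```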
Example program:
#pragma CHECKED_SCOPE on
const int BIG = 123456789;
void scan(_Nt_array_ptr<long> p : count(i), long i) {
  if (p[i]) {
    p[i] = BIG;
    scan(p, i + 1);
  }
}
void test(long sz, _Array_ptr<long> outer : count(sz + 1)) {
  long arr _Nt_checked[1]; // Not null-terminated!
  scan(arr, 0);            // Overwrites sz on the stack.
  outer[sz]++;             // Segmentation fault
}
int main(int argc, _Nt_array_ptr<_Nt_array_ptr<char>> argv : count(argc)) {
  long outer _Checked[1];
  test(0, outer);
  return 0;
}
This compiles with one warning, which I think is due to an unrelated compiler bug:
stack-array-init.c:6:10: warning: cannot prove argument meets declared bounds for 1st parameter [-Wcheck-bounds-decls-checked-scope]
scan(p, i + 1);
^
stack-array-init.c:6:10: note: (expanded) expected argument bounds are 'bounds(p, p + i + 1)'
stack-array-init.c:6:10: note: (expanded) inferred bounds are 'bounds(p, p + i + 1)'
scan(p, i + 1);
^
1 warning generated.
Then at runtime, I get a segmentation fault on the indicated line. Of course, the results may depend on the system's stack layout, and it may not be possible to demonstrate a problem on every system. On some systems, it may only be possible to detect a slightly out-of-bounds stack read or write using Valgrind, rather than to provoke a dereference of a wildly out-of-bounds pointer. I'm using Ubuntu Linux 20.04, x86_64, and compiling the example program with clang -g and no other options.