I was reading around and stumbled on an article from Dr Dobb's online (I have a long history with Dr Dobb's - had it not been for them, you wouldn't be reading this!). The article described a 'feature' of Windows 7 that I have mixed feelings about. The same sort of mixed feelings (well, not so mixed - I lean far to one side on the use of this feature) I have about cursor_sharing being set to anything other than EXACT.
Here is the article
Ok, so why don't I like it? It seems to be a way to 'self correct' a program. It "seems" like a "good thing".
I don't like it because it won't help get the problem fixed (same with cursor_sharing :( ). In fact, it will promote *more*
code being developed that suffers from heap overwrites. It lets bad developers develop bad code even faster
and distribute it - thinking they are seriously good developers. That is, it leads to delusion - and to bad coders getting more senior without ever learning from their mistakes.
In short, it instills a false sense of "hey, I'm pretty good" in developers that probably shouldn't have that sense.
It could definitely lead to some really strange issues - think about it: a program that used to crash stops crashing - for a while - then starts crashing again (as the overwrite occasionally grows bigger than normal, exceeding the extra "pad" bytes). And who knows how allowing a memory overwrite to propagate into other bits of the code will affect it. I prefer code that works right - not code that sometimes seems to work.
I'm reminded of a code review I did some 20 years ago. I asked the developer "why do you have this massive static array defined - we don't seem to use it". The answer "if you take it out, the program crashes, the compiler must have a bug". The look on my face - I wish I had a picture - it would have been priceless.
I'm not a fan of this "let's try to fix it for you and let you pretend you know what you're doing" approach to software. We rely on software way too much.
Oh well, just a 30 second thought - I just read the article and felt the need to gripe... Things like this scare me.
I have to say - I wrote something similar myself many years ago. It was a C library I called xalloc. It replaced the malloc, calloc, realloc, free, etc. functions of C. It worked by allocating the requested memory plus a few extra bytes. It would set some magic bytes at the FRONT and the END of the allocated memory, record the source code file and line number that allocated the memory, and return a pointer to the memory to be used by the program. Every time you called any of the xalloc functions, it would inspect all of the allocated memory and CRASH the program if any of the magic bytes in front of or at the end of any memory block had been changed. When the program exited, it would report on all allocated memory that wasn't freed. You could turn off the checking with an environment variable if you wanted, but it was always ready to be "on". I made everyone that worked on my team use it - it saved us countless hours (and it found the bug in the code of the person that "needed" to allocate that big static array in a few seconds)...
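The xalloc idea described above can be sketched in C roughly like this. This is a minimal illustration, not the original library: the names (xmalloc, xfree, xcheck_all, xreport_leaks), the 8-byte guard size, and the fill pattern are my own assumptions, and the environment-variable on/off switch is left out for brevity.

```c
/* Sketch of a guard-byte allocator in the spirit of xalloc.
   Each block gets a hidden header (with the allocating file/line and a
   front guard) plus trailing guard bytes; every call re-checks all live
   blocks and aborts the program the instant corruption is found. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define GUARD_BYTES 8
#define GUARD_FILL  0xAB   /* "magic" pattern written before and after the block */

typedef struct header {
    struct header *next, *prev;       /* doubly linked list of live blocks */
    size_t         size;              /* bytes the caller asked for        */
    const char    *file;              /* source file of the allocation     */
    int            line;              /* source line of the allocation     */
    unsigned char  front[GUARD_BYTES];/* front guard, just before user data */
} header;

static header *live = NULL;           /* every outstanding allocation */

static unsigned char *end_guard(header *h) {
    return (unsigned char *)(h + 1) + h->size;   /* bytes after user data */
}

static void check_block(header *h) {
    unsigned char *e = end_guard(h);
    for (int i = 0; i < GUARD_BYTES; i++) {
        if (h->front[i] != GUARD_FILL || e[i] != GUARD_FILL) {
            fprintf(stderr, "heap overwrite of block from %s:%d - aborting\n",
                    h->file, h->line);
            abort();                  /* crash now rather than limp on */
        }
    }
}

void xcheck_all(void) {               /* inspect every live allocation */
    for (header *h = live; h; h = h->next) check_block(h);
}

void *xmalloc_at(size_t n, const char *file, int line) {
    xcheck_all();                     /* every xalloc call re-checks the heap */
    header *h = malloc(sizeof *h + n + GUARD_BYTES);
    if (!h) return NULL;
    h->size = n; h->file = file; h->line = line;
    memset(h->front, GUARD_FILL, GUARD_BYTES);
    memset(end_guard(h), GUARD_FILL, GUARD_BYTES);
    h->prev = NULL; h->next = live;
    if (live) live->prev = h;
    live = h;
    return h + 1;                     /* caller sees only the usable region */
}

void xfree(void *p) {
    if (!p) return;
    xcheck_all();                     /* verify everything, including p */
    header *h = (header *)p - 1;
    if (h->prev) h->prev->next = h->next; else live = h->next;
    if (h->next) h->next->prev = h->prev;
    free(h);
}

void xreport_leaks(void) {            /* call at exit: list unfreed blocks */
    for (header *h = live; h; h = h->next)
        fprintf(stderr, "leak: %zu bytes from %s:%d\n", h->size, h->file, h->line);
}

/* Macro so call sites automatically record where the allocation happened. */
#define xmalloc(n) xmalloc_at((n), __FILE__, __LINE__)
```

A real version would wrap calloc and realloc the same way; the key design point is the one the post makes - the checker calls abort() the moment a guard byte is wrong, instead of quietly padding the block and letting the program stagger on.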
My approach differed from Windows 7's in that I would rather a program crash and burn immediately than live for another second after making such a grievous error. I'd still rather the program die than continue today...