initializing a variable does not cost much in terms of performance.
I ALWAYS have range checks on during debugging. I ALWAYS resolve all related warnings and hints.
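To make that concrete, here is a minimal sketch of what {$R+} buys you (the program name, the array and the values are made up for illustration):

program RangeCheckDemo;
{$mode objfpc}
{$R+}   { range checking on; typically only enabled in debug builds }
var
  a: array[0..4] of Integer;
  i, n: Integer;
begin
  n := 5;                 { one past the last valid index }
  for i := 0 to n do
    a[i] := i;            { with $R+ FPC stops here with runtime error 201 when i = 5;
                            with $R- it silently overwrites adjacent memory }
end.

With $R- the same loop just scribbles over whatever happens to sit next to the array, and the bug surfaces somewhere else entirely.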
Quote: "stack overflow check: do we still need it? Maybe. But I remember I used it when programming on 8-bit machines for DOS using Turbo Pascal 5.5. Now that we're using 32 or even 64 bits, do we still have stack overflow issues?"

Yes, we still need it, because cocky programmers like recursion. And multi-threading.
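A minimal sketch of why (Recurse is a made-up example, built as a plain debug build): unbounded recursion exhausts the stack just as happily on 64-bit machines, and in FPC {$S+} turns the crash into a clean runtime error instead of a raw access violation:

program StackCheckDemo;
{$mode objfpc}
{$S+}   { stack checking on }

{ Unbounded recursion: every call eats another stack frame. }
function Recurse(depth: Int64): Int64;
begin
  Result := Recurse(depth + 1);   { never returns; the stack runs out first }
end;

begin
  Recurse(0);   { with $S+ this ends in runtime error 202 (stack overflow)
                  rather than an access violation from the OS }
end.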
Quote: "stack overflow check: do we still need it? Maybe. But I remember I used it when programming on 8-bit machines for DOS using Turbo Pascal 5.5. Now that we're using 32 or even 64 bits, do we still have stack overflow issues?"
Some errors can't be auto-detected from the sources: array bounds, invalid pointers, access to destroyed objects, wrong typecasts, wrong record alignment. It's a shame that after 15-20 years of language development and usage it still doesn't have any protection against routines that can cause random critical runtime errors. Smart pointers for objects by default could solve most of the casual troubles. In the '90s, with limited computing resources, extra runtime checks were a burden, but now that is not a problem even for pocket devices. The difference in performance between debug and release builds is unnoticeable.
Well, how about:
- Using Low(), High() and Length() for arrays. If you don't, you are the cause of your own problems.
- Testing Assigned(), Assert() and nil for pointers will help.
- Hard (i.e. wrong) typecasts can never be made safe. That is a programmer instruction by definition.
- Smart pointers should not be on by default; they are possible, though, in trunk.
- Record alignment is a programmer's task. If writing to/from a storage medium is required, a record should be packed and the programmer
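A small sketch of the first two points (BoundsDemo, Sum and Report are made-up names): Low()/High() keeps the loop inside whatever bounds the array really has, and Assigned() guards a possibly-nil reference before it is used:

program BoundsDemo;
{$mode objfpc}
uses
  SysUtils, Classes;

{ Iterate with Low()/High(): the loop always follows the array's real bounds. }
function Sum(const Values: array of Integer): Integer;
var
  i: Integer;
begin
  Result := 0;
  for i := Low(Values) to High(Values) do
    Inc(Result, Values[i]);
end;

{ Check Assigned() before dereferencing something that may be nil. }
procedure Report(Log: TStrings; const Msg: string);
begin
  if Assigned(Log) then
    Log.Add(Msg);
end;

var
  Log: TStringList;
begin
  Log := TStringList.Create;
  try
    Report(Log, 'sum = ' + IntToStr(Sum([1, 2, 3, 4])));
    Report(nil, 'silently skipped');   { nil target: the Assigned() guard avoids a crash }
    WriteLn(Log.Text);
  finally
    Log.Free;
  end;
end.

And for the last point, declaring the record as a packed record before streaming it keeps compiler-dependent padding out of the on-disk layout.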