Here are some reasons why an optimized program might produce different
results from one that has not undergone the optimization process:
- Optimized code can fail if a program contains code that is not valid. The
optimization process relies on your application conforming to language standards;
a short sketch of such invalid code appears after this list.
- If a program that works without optimization fails when you optimize,
check the cross-reference listing and the execution flow of the program for
variables that are used before they are initialized. Compile with the -qinitauto=hex_value option
to try to produce the incorrect results consistently. For example, using -qinitauto=FF gives variables an initial
value of "negative not a number" (-NAN). Any operations on these variables
will also result in NAN values. Other bit patterns (hex_value)
may yield different results and provide further clues as to what is going
on. Programs with uninitialized variables can appear to work properly when
compiled without optimization, because of the default assumptions the compiler
makes, but can fail when you optimize. Similarly, a program can appear to
execute correctly after optimization, but fail at lower optimization levels
or when run in a different environment. A sketch of this failure mode also
follows this list.
- A variation on uninitialized storage occurs when an automatic-storage
variable is referred to by its address after its owning function has returned.
The reference then points to a memory location that can be overwritten as other
automatic variables come into scope when new functions are called; the last
sketch after this list illustrates this case.
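
As a hedged illustration of the first point, the sketch below contains code
that is not valid C: the loop's induction variable overflows a signed int,
which is undefined behavior. The function name and values are made up for
illustration; an optimizing compiler is free to assume the overflow never
happens, so the optimized program may loop forever or stop at a different
count than an unoptimized build.

```c
#include <stdio.h>

/* Invalid code: i *= 2 eventually overflows a signed int, which is
 * undefined behavior.  Without optimization the value typically wraps
 * to a negative number and the loop stops; with optimization the
 * compiler may assume a positive i stays positive and produce a loop
 * that never terminates, or one that stops at a different count. */
int count_doublings(void)
{
    int count = 0;
    for (int i = 1; i > 0; i *= 2)
        count++;
    return count;
}

int main(void)
{
    printf("doublings: %d\n", count_doublings());
    return 0;
}
```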
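
The uninitialized-variable case might look like the following sketch; the
function and variable names are hypothetical. Without optimization, the stack
slot that sum occupies often happens to contain zero, so the bug hides; once
the optimizer keeps sum in a register or reorders stores, leftover values
surface. Compiling with -qinitauto=FF fills automatic variables with a
recognizable 0xFF pattern so the failure reproduces consistently.

```c
#include <stdio.h>

/* BUG: 'sum' is never initialized.  At low optimization levels the
 * stack slot it lands in may happen to hold zero, so the function
 * appears to work; after optimization the variable may live in a
 * register containing leftover data, giving different results.
 * Compiling with -qinitauto=FF initializes it to a NAN-like bit
 * pattern so the error shows up consistently. */
double average(const double *values, int n)
{
    double sum;                  /* should be: double sum = 0.0; */
    for (int i = 0; i < n; i++)
        sum += values[i];
    return sum / n;
}

int main(void)
{
    double data[] = { 1.0, 2.0, 3.0, 4.0 };
    printf("average = %f\n", average(data, 4));
    return 0;
}
```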
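
The dangling-reference variation might look like this sketch (the names are
hypothetical): the address of an automatic variable is saved and used after
its owning function returns, so a later call can overwrite the storage it
occupied.

```c
#include <stdio.h>

static int *saved;     /* keeps the address of an automatic variable */

void remember(void)
{
    int local = 42;
    saved = &local;    /* 'local' ceases to exist when remember() returns */
}

void clobber(void)
{
    int other = -1;    /* may reuse the stack slot that held 'local' */
    (void)other;
}

int main(void)
{
    remember();
    clobber();
    /* Undefined behavior: this may print 42 without optimization but
     * -1, garbage, or a crash once the optimizer rearranges the stack. */
    printf("%d\n", *saved);
    return 0;
}
```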
Use debugging techniques that rely on examining values in storage with
caution. The compiler might have deleted or moved a common expression
evaluation, and it might have assigned some variables to registers, so that
they do not appear in storage at all.