__OPTIMIZATION_LEVEL__ seems to be set to the optimization level given on the command line, but it does not change when the optimization level for a specific routine is overridden with an attribute such as __attribute__((optimize("O0"))). Is there a way to detect the optimization level that is actually in effect for a particular function?
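
For illustration, here is a minimal sketch of the behaviour being described, assuming a compiler that defines __OPTIMIZATION_LEVEL__ as an integer reflecting the -O level passed on the command line and that supports GCC-style function attributes. Because the macro is substituted by the preprocessor, long before the per-function attribute is applied, both functions report the same value:

    #include <stdio.h>

    /* Attribute asks for this routine to be compiled at -O0,
     * but the macro below was already expanded by the preprocessor. */
    __attribute__((optimize("O0")))
    static int level_in_unoptimized_function(void)
    {
        return __OPTIMIZATION_LEVEL__;   /* expanded at preprocessing time */
    }

    static int level_in_default_function(void)
    {
        return __OPTIMIZATION_LEVEL__;   /* same expansion, same value */
    }

    int main(void)
    {
        printf("attribute O0 function: %d\n", level_in_unoptimized_function());
        printf("default function:      %d\n", level_in_default_function());
        return 0;
    }

Compiled with, say, -O2, both printf calls print the command-line level; the optimize("O0") attribute has no effect on the macro's value.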