Slide 4 of 16
EXAMPLE: finlib date-string parsing: more careful checking of input data doubled parse speed and eliminated crashes on bad dates. Cross-platform automated test harnesses gave a better understanding of issues and allowed more aggressive optimisation for all targets.
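The finlib code itself is not shown here; as a minimal sketch of the idea, the hypothetical routine below validates a "YYYY-MM-DD" date string completely up front, so no later code can crash on a malformed date and downstream paths need no re-checking (the names `parse_date` and `days_in_month` are illustrative, not from finlib):

```c
#include <ctype.h>
#include <string.h>

/* Illustrative helper: days in month m of year y, with leap years. */
static int days_in_month(int y, int m)
{
    static const int d[] = { 31,28,31,30,31,30,31,31,30,31,30,31 };
    if (m == 2 && ((y % 4 == 0 && y % 100 != 0) || y % 400 == 0))
        return 29;
    return d[m - 1];
}

/* Parse "YYYY-MM-DD". Returns 1 and fills y/m/d on success,
 * 0 on any malformed input - no crash, no partial result. */
int parse_date(const char *s, int *y, int *m, int *d)
{
    int i;
    if (s == NULL || strlen(s) != 10 || s[4] != '-' || s[7] != '-')
        return 0;
    for (i = 0; i < 10; i++)
        if (i != 4 && i != 7 && !isdigit((unsigned char)s[i]))
            return 0;
    *y = (s[0]-'0')*1000 + (s[1]-'0')*100 + (s[2]-'0')*10 + (s[3]-'0');
    *m = (s[5]-'0')*10 + (s[6]-'0');
    *d = (s[8]-'0')*10 + (s[9]-'0');
    if (*m < 1 || *m > 12 || *d < 1 || *d > days_in_month(*y, *m))
        return 0;
    return 1;
}
```

Checking everything once at the interface is also what enables the speed-up: the inner parsing loop can then run without defensive re-checks.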
Note 1: the tidier, better designed, better understood, and better documented your system is, the faster and safer it is likely to be.
Note 2: your automated harnesses can serve general testing or performance evaluation.
- Are your interfaces adequately documented and defined?
- Do you have good version control and audit trails for your code (e.g. SCCS, RCS, CVS, Continuus, SNiFF+) to give you a more stable environment for optimisation?
- Do you check data thoroughly at interfaces to eliminate doubt and danger later?
- Do you apply similar care to internal and external interfaces?
- Have you eliminated code made redundant by your (data) checking?
- Are your interfaces thread-safe?
- Do you have automated test harnesses for all builds (remember compilers have bugs even if your code doesn’t) and for success and failure cases (to make sure you recover as you expect)?
- Do you add new test cases for things that have failed in the past (that may indicate weak points or edge cases)?
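The last two checklist points can be sketched as a small table-driven harness: success cases, failure cases (to confirm you reject and recover as expected), and regression cases added after past failures. The function under test and its cases are invented for illustration:

```c
#include <stdio.h>
#include <string.h>

/* Trivial stand-in for a function under test: maps "JAN".."DEC"
 * to 1..12, returns 0 for anything it does not recognise. */
static int month_number(const char *s)
{
    static const char *names[] = { "JAN","FEB","MAR","APR","MAY","JUN",
                                   "JUL","AUG","SEP","OCT","NOV","DEC" };
    int i;
    if (s == NULL) return 0;
    for (i = 0; i < 12; i++)
        if (strcmp(s, names[i]) == 0) return i + 1;
    return 0;
}

struct test_case { const char *input; int expected; const char *why; };

/* Table-driven harness: returns the number of failing cases,
 * so 0 means every success, failure, and regression case passed. */
int run_tests(void)
{
    static const struct test_case cases[] = {
        { "JAN", 1,  "success: first month" },
        { "DEC", 12, "success: last month" },
        { "XXX", 0,  "failure: unknown name must be rejected" },
        { NULL,  0,  "failure: NULL input must not crash" },
        { "jan", 0,  "regression: lowercase once wrongly accepted" },
    };
    int i, failures = 0;
    for (i = 0; i < (int)(sizeof cases / sizeof cases[0]); i++) {
        int got = month_number(cases[i].input);
        if (got != cases[i].expected) {
            printf("FAIL: %s (got %d, expected %d)\n",
                   cases[i].why, got, cases[i].expected);
            failures++;
        }
    }
    return failures;
}
```

Because the cases are data rather than code, adding a new regression test after a field failure is a one-line change, and the same table can be rerun on every build and target to catch compiler as well as code bugs.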