Testing Documentation Checklist for Internal Audits

Listed below are the items auditors and other internal control compliance professionals should include in their documentation for each control activity, in order to demonstrate the efficiency and effectiveness of internal audit procedures and to lower the time and resource requirements of regulatory compliance and attestation procedures.

Some of the items below may appear intuitive; however, most of them are consistently omitted from the testing documentation we review. To improve the efficiency of internal and external review processes, we recommend consistently including the key workpaper elements below in all testing documentation:

Description of inquiry testing performed, including:
  • Date of the inquiry
  • Name and title of the person the auditor spoke with
  • Brief description of the relevant information collected during inquiry
Examples:
- Inquired of [Name], [Title], on [Date]
- Per discussion with [Name], [Title], on [Date]
- Corroborated with [Name], [Title] and [Name], [Title] on [Date]
Description of the inspection of the relevant standard operating procedure or other policy documentation (when available), including, as appropriate:
  • Effective date and version number
  • Relevant excerpts of the policy as it relates to the control being tested
Example:
On [Date] inspected Company’s Password Policy dated [Date], noting the following: XYZ.
Description of the population & testing sample, including as appropriate:
  • Nature of the system-generated population (e.g., a list of active users generated on xx/xx/xx)
  • If no system-generated population is available, an explanation of how an alternative population was identified
  • Total population size for the time period under review (e.g., 01/01/10 – 12/31/10), along with the method used to select a representative sample
  • Note that the sample selection should be representative of the time period under review; selections should be made throughout the year (see sampling guidelines below)
  • If the number of samples selected does not follow the prescribed guidance based on population size, the rationale for the sample size chosen
  • Sample selection rationale (e.g., using random number generator software, judgmentally, etc.)
  • Description of the scope of the sample, if applicable (e.g., "A sample of XYZ users with account creation dates during XYZ period was selected across the following XYZ applications: XYZ")
Example:
On [Date] obtained from [Name], [Title], a system-generated listing of user IDs created in [System Name] between [Date] and [Date], noting [Total Number of User IDs] users. Haphazardly selected [Number of Selections] of [Total Number of User IDs] user IDs for testing.
The sample selection should be based on the frequency or number of occurrences, with suggested minimum sample sizes as follows:

Frequency of the Control Performed    Population Size       Recommended Minimum Sample Size
Annually/Quarterly                    1 to 4                1
Monthly                               5 to 12               2
Weekly                                13 to 52              5
Daily                                 53 to 260             15
Many Times a Day                      Greater than 260      25
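
As an illustration only, the following Python sketch applies the table above and makes a reproducible random selection; the dictionary, function name, and seed are hypothetical conveniences, not part of this guidance. Recording the seed (or the random number generator output) in the workpaper allows someone re-performing the test to regenerate the same sample.

import random

# Suggested minimum sample sizes from the table above, keyed by control frequency.
# The population-size ranges are noted for reference; frequency drives the lookup here.
RECOMMENDED_MINIMUM_SAMPLE_SIZE = {
    "annually/quarterly": 1,   # population of 1 to 4
    "monthly": 2,              # population of 5 to 12
    "weekly": 5,               # population of 13 to 52
    "daily": 15,               # population of 53 to 260
    "many times a day": 25,    # population greater than 260
}

def select_sample(population, frequency, seed=20100101):
    # Look up the recommended minimum; if the population is smaller than the
    # minimum, test the entire population (an assumption, not prescribed above).
    minimum = RECOMMENDED_MINIMUM_SAMPLE_SIZE[frequency.lower()]
    sample_size = min(minimum, len(population))
    # A fixed, documented seed makes the selection re-performable.
    rng = random.Random(seed)
    return sorted(rng.sample(list(population), sample_size))

# Hypothetical example: 300 user IDs created during the period under review.
user_ids = [f"USR{n:04d}" for n in range(1, 301)]
selection = select_sample(user_ids, "many times a day")
print(f"Selected {len(selection)} of {len(user_ids)} user IDs: {selection}")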

Description of the inspection testing, including, as appropriate:
  • Detailed steps of testing performed, including attributes tested for each sample
  • Reference to a supplemental spreadsheet that contains sample detail. Sample detail should include enough information for each sample such that someone could re-perform all of the test steps
  • Documentation should show that sufficient competent audit evidence was obtained to provide a reasonable basis for the conclusions reached (retain evidence!)
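
To show one possible level of sample detail, the sketch below writes a supplemental spreadsheet as a CSV file; the file name, column headings, and attributes are illustrative placeholders rather than a required format.

import csv

# Hypothetical layout for a supplemental sample-detail spreadsheet; the column
# names and attributes below are placeholders, not a prescribed template.
FIELDNAMES = [
    "sample_id",            # e.g., the user ID selected for testing
    "evidence_reference",   # where the retained evidence is filed
    "attribute_1_result",   # e.g., access request approved before account creation
    "attribute_2_result",   # e.g., approver held an authorized role
    "exception",            # Yes/No, cross-referenced to the exception summary
]

rows = [
    {
        "sample_id": "USR0042",
        "evidence_reference": "Workpaper reference and screenshot dated [Date]",
        "attribute_1_result": "Pass",
        "attribute_2_result": "Pass",
        "exception": "No",
    },
]

with open("sample_detail.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
    writer.writeheader()
    writer.writerows(rows)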

Description of conclusion on the effectiveness of each control activity tested, including as appropriate:
  • Clearly document relevant issues (findings)
  • Make sure conclusions are consistent with the results of the work performed
  • When a preliminary issue or finding was deemed not to be an exception after further research or explanation, document the rationale in sufficient detail that someone re-performing the test could arrive at a similar conclusion
  • Mitigating controls identified and tested, if applicable.

Summarize issues/findings:
  • For issues noted during testing, refer to a document containing a summary of all exceptions
  • In the summary document, include enough information for any exception to “stand on its own” (i.e., one does not need to go back to the testing workpaper to understand it)
  • For issues deemed low risk, include rationale for reaching such a conclusion.