TestSummary
TestSummary[symbol]
gives a summary of all tests associated with symbol.
TestSummary["symbol"]
gives a summary of all tests associated with symbol name symbol.
TestSummary["symbol/name"]
gives a summary of all tests associated with symbol and categorized according to name.
TestSummary[PacletObject[…]]
gives a summary of all tests in a paclet.
TestSummary[report]
gives a summary of report.
TestSummary[{report1,report2,…}]
gives a summary by merging summaries of each reporti.
TestSummary[{summary1,summary2,…}]
gives a summary by merging all summaries summaryi.
TestSummary[{test1,test2,…}]
gives a summary of the results of the testi.
TestSummary[File["file"]]
gives a summary of all tests in file.
Details and Options
- TestSummary returns a TestSummaryObject.
- TestSummary produces a more succinct summary than a full TestReport.
- TestSummary can be used either to obtain testing information on a package symbol or paclet, or to combine existing summaries, reports and tests.
- TestSummary[symbol] gives a summary of all test files within a location of the form "root/symbol" where root resolves to the "TestFiles" directory of the paclet containing symbol's context.
- TestSummary["symbol/name"] gives a summary of all test files within a location of the form "root/symbol/name" either from a similarly-named file or else from all the files in a similarly-named directory.
- TestSummary["symbol/name"] is equivalent to TestSummary[TestFiles["symbol/name"]] if TestFiles["symbol/name"] is non-empty, and otherwise to TestSummary[TestFile["symbol/name"]] if TestFile["symbol/name"] is non-empty. In either case, whether .wlt or .nb test files are used to generate the test summary depends on the option "TestFileExtension".
- In TestSummary[PacletObject[…]], the summary is produced from all test files in the subdirectories of the paclet's "TestFiles" directory, as defined in its PacletInfo.wl file by the Extensions entry {"TestFiles", "Root" -> "TestFiles"}.
- TestSummary["symbol/dir/…/name"] also supports finer-grained test categorizations.
- TestSummary stores aggregatable properties such as "TestsCount", "SuccessRate", "MemoryUsed", "MeanTimeElapsed" and "MaxMemoryUsed" in the resulting TestSummaryObject.
- TestSummary does not retain the original tests in a TestSummaryObject.
- TestSummary is designed to be used in concert with TestReport during systematic testing.
- TestSummary handles both .wlt test files and .nb testing notebooks.
- Source data types can be mixed so that combinations like TestSummary[{symbol,"symbol/name",paclet,report,summary,file,{test1,test2,…}}] or TestSummary[{symbol,report, {report1, report2},{summary1,summary2}, {file1,file2}}] are all supported.
- TestSummary[PacletObject[…]] is equivalent to TestSummary[TestFiles[PacletObject[…],Infinity]].
- In TestSummary[{test1,test2,…}], the testi take the form of TestObject expressions as generated by TestCreate or VerificationTest.
- In TestSummary[…,"Name"->"name"], the resulting TestSummaryObject displays name.
  "Name"   None   name assigned to the TestSummaryObject
- In TestSummary[File["file"]], the "Name" option is automatically set to the name of file.
- In TestSummary[symbol], TestSummary["symbol"] and TestSummary[PacletObject[…]], the type of test files located and the names of excluded test files are given by the following options respectively:
  "TestFileExtension"   ".wlt"   the test file extension to search for
  "ExcludedTestFiles"   $ExcludedTestFiles   test files excluded during the search
- Possible settings for "TestFileExtension" include:
  ".wlt"   search only for test files with extension .wlt
  ".nb"   search only for test files with extension .nb
  All   search for test files with extension .wlt or .nb
  Automatic   equivalent to the setting ".wlt"
- Possible settings for "ExcludedTestFiles" include:
  patt   excluded test files match the string pattern patt
  RegularExpression["regex"]   excluded test files match the regular expression
  {patt1,patt2,…}   excluded test files match any of the patti
  None   no excluded test files
- $ExcludedTestFiles is set by default to "*FileSystemModify.*".
- In TestSummary[report], TestSummary[{report1, report2, …}] or TestSummary[{summary1,summary2, …}], the summary generated does not involve evaluating new tests and instead summarizes tests previously evaluated.
- In TestSummary[symbol], TestSummary[PacletObject[…]], TestSummary[File["file"]], TestSummary[{test1,test2,…}] etc., generating the summary involves evaluating new tests, and options affect and log those tests just as they do during TestReport invocations. TestSummary therefore takes the same options as TestReport to influence the tests run during its execution:
  HandlerFunctions   <||>   how to handle generated events
  HandlerFunctionsKeys   Automatic   what parameters to supply to handler functions
  MemoryConstraint   Infinity   memory (in bytes) each test is allowed to use
  ProgressReporting   $ProgressReporting   whether to report progress
  SameTest   SameQ   function to compare actual and expected outputs
  TestEvaluationFunction   TestEvaluate   function to evaluate created tests
  TimeConstraint   Infinity   time (in seconds) each test is allowed to use
- During the execution of TestSummary, the following events can be generated:
  "FileStarted"   test file started
  "FileCompleted"   test file completed
  "ReportStarted"   test report started
  "ReportCompleted"   test report completed
  "RuntimeFailure"   runtime failure encountered
  "TestCreated"   test created
  "TestEvaluated"   test evaluated
- With the specification HandlerFunctions-><|…,"eventi"->fi,…|>, fi[assoc] is evaluated whenever eventi is generated. The elements of assoc have keys specified by the setting for HandlerFunctionsKeys.
- Possible keys specified by HandlerFunctionsKeys include:
  "EventName"   name of the event being handled
  "EventID"   unique ID of the event
  "Failure"   failure object associated with the event
  "FailureType"   failure type associated with the test
  "Outcome"   outcome associated with the test
  "TestFileName"   test file name associated with the event
  "TestObject"   test object associated with the event
  "Title"   name of the test report
- TestSummary sets $TestFileName to the name of the file from which a test is being run.
- CreateNotebook["Testing"] opens a .nb testing notebook that can also be saved as a .wlt file for use in the testing framework.
- CreateDeveloperPaclet["name","package",{fn1,fn2,…}] creates a paclet configured with a .paclet file, TestFiles directory and test files ready for use in this testing framework built around TestSummary.
Examples
Basic Examples (1)
Give a test summary of all tests associated with TestSummary.
Give a test summary of all exception-handling tests associated with TestSummary.
Give a test summary of all tests in the CodeAssurance paclet.
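The three calls above might look as follows; the "Exceptions" category name is an illustrative assumption, while the TestSummary symbol and the CodeAssurance paclet are named on this page:

```wolfram
(* all tests associated with the TestSummary symbol *)
TestSummary[TestSummary]

(* all exception-handling tests, assuming they are categorized under "Exceptions" *)
TestSummary["TestSummary/Exceptions"]

(* all tests in the CodeAssurance paclet *)
TestSummary[PacletObject["CodeAssurance"]]
```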
Introduce a toy test file with half of its tests being successful.
Generate a test summary of the toy file.
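A sketch of these two steps; the file path and the toy test contents are illustrative:

```wolfram
(* write a toy .wlt file in which one of two tests succeeds *)
toyFile = FileNameJoin[{$TemporaryDirectory, "toy.wlt"}];
Export[toyFile, "VerificationTest[1 + 1, 2]\nVerificationTest[1 + 1, 3]", "Text"];

(* summarize the toy file; File[…] marks a raw file location *)
TestSummary[File[toyFile]]
```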
Handler functions are of particular importance in TestSummary invocations since they provide the ability to identify failing tests while maintaining the function's light footprint.
Define a handler to print failed tests along with their containing files.
Apply the handler to the toy test file to identify test failures.
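One way such a handler might be written, assuming toyFile holds the path of the toy test file introduced above; the keys used follow the HandlerFunctionsKeys table in Details:

```wolfram
(* print each failing test together with its containing file *)
printFailure[assoc_] := If[assoc["Outcome"] =!= "Success",
   Print[assoc["TestFileName"], ": ", assoc["TestObject"]]];

TestSummary[File[toyFile],
  HandlerFunctions -> <|"TestEvaluated" -> printFailure|>,
  HandlerFunctionsKeys -> {"Outcome", "TestFileName", "TestObject"}]
```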
The built-in testing framework uses test reports that also integrate with test summaries.
Generate a simulated report containing a hundred tests.
Generate a summary of this report.
A test summary has properties in common with a test report including whether or not all tests succeeded.
A test summary contains some properties not available in a test report as well as generating more human-readable output for other properties. Here, determine time and memory resources used in running tests including mean values of time and memory resources.
A report summary has a much smaller memory footprint than a test report due to not storing all of the original tests and their specifications.
Scope (2)
Generate two reports each populated with 100 toy tests.
Create a combined summary of the two reports and name it "Combined".
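A sketch of these steps with placeholder tests:

```wolfram
report1 = TestReport[Table[VerificationTest[1 + 1, 2], 100]];
report2 = TestReport[Table[VerificationTest[2 + 2, 4], 100]];

(* merge both reports into a single named summary *)
TestSummary[{report1, report2}, "Name" -> "Combined"]
```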
The disparity between the memory footprints of summaries and reports grows as a system scales. This is because the memory footprint of a test report grows with the number of tests, whereas the memory footprint of a test summary stays roughly constant. Test summaries therefore become increasingly useful as a system expands.
Compare the use of a test summary with that of a test report.
Generate a test report of all the tests of TestSummary, followed by generating a test summary of this report.
Extract core statistical properties.
A test summary possesses some additional properties.
Generate a test report and test summary of the tests associated with TestSummaryObject.
Generate a test report and a test summary that combines the tests of TestSummary and TestSummaryObject.
Compare the test summaries and test reports of TestSummary and TestSummaryObject across selected properties.
Although a test summary does not contain the original tests, it nonetheless retains sufficient information for descriptive statistics to be precisely reconstructed.
Reconstruct some descriptive statistics for the property "CPUTimeUsed".
Confirm that these statistics have been precisely reconstructed by using the full data from the original tests that were stored in the original test reports.
Aggregate values for the "CPUTimeUsed" property for all tests in the respective reports of TestSummary and TestSummaryObject before running standard statistics on this list.
This illustrates that a test summary, even though a substantially lighter version of the report from which it originates, can nonetheless yield similar distributional insights from this abbreviated form, an outcome with significant scaling implications. Naturally, analyses requiring the original tests themselves require a full test report.
The original tests are not stored in a test summary.
This suggests an optimal workflow that combines both test summaries and test reports.
Options (1)
Applications (1)
Create a function that plots a property of the test summary of a CodeAssurance symbol, broken down by test file.
Plot the TestsCount property for all the symbols of CodeAssurance.
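A possible shape for such a function, assuming TestFiles[symbol] lists a symbol's test files (as in Details) and that a TestSummaryObject yields a property via summary[prop]; plotProperty is a hypothetical helper name:

```wolfram
(* hypothetical helper: chart a summary property for symbol, by test file *)
plotProperty[symbol_, prop_] := Module[{files = TestFiles[symbol]},
   BarChart[TestSummary[File[#]][prop] & /@ files,
     ChartLabels -> (FileBaseName /@ files)]];

plotProperty[TestSummary, "TestsCount"]
```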
The number of tests, or TestsCount, measures one dimension of a function's complexity but assumes that the complexity of each test remains relatively constant. This is often a reasonable assumption, given that a "unit test" is so named in order to signal the testing of a "unit" of functionality. For pragmatic purposes, however, the coverage of a test suite can often be significantly improved by corralling multiple bits of functionality into a single test (see also IntermediateTest). In such scenarios, the complexity of a function is potentially more accurately measured through the aggregate size of all of a function's tests, a measure given by the TestSize property.
Plot the (aggregated) TestSize property of the tests associated with the symbols of CodeAssurance.
The broad complexity picture remains as before save for a few changes such as, for example, the extra aggregate size of the tests in TestSummary.
Properties & Relations (1)
Because a test summary can be generated from a variety of sources, it may not be immediately apparent whether or not all tests were run during the same invocation. Establishing such simultaneity ensures that tests have at least succeeded in a given in-situ environment.
Introduce a toy test file with half its tests correct.
Generate a test summary of this file.
Generate a summary from the toy tests.
All of the tests in this combined summary were not, however, conducted in the same invocation.
Generate the same summary but this time in a single invocation.
All of the tests in the combined summary were run as part of the same invocation.
Possible Issues (6)
By including all of the original tests, TestReport can identify failed tests.
Create a test suite with a single failing test.
In contrast, a TestSummaryObject, while efficiently detecting a test failure …
… is by default unable to identify the actual test that failed along with any accompanying details.
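Schematically, with a deliberately failing test; the property names mirror those of TestReportObject, which the summary shares per the text above:

```wolfram
tests = {VerificationTest[1 + 1, 2], VerificationTest[1 + 1, 3]};

report = TestReport[tests];
report["TestsFailedWrongResults"]  (* the report retains the failing test itself *)

summary = TestSummary[tests];
summary["AllTestsSucceeded"]       (* records that a failure occurred, but not which test *)
```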
In general, TestSummary identifies failing tests through its HandlerFunctions option, but in some cases this means preparing tests that are not immediately evaluated.
Create an event handling function.
Create the tests using TestCreate instead of VerificationTest while also injecting values to overcome the former's HoldAllComplete attribute.
Generate a test summary while also showing failing tests.
It is therefore best practice to use both TestReport and TestSummary in concert; the former to establish a complete record of unit-testing performed and the latter to efficiently measure incremental unit-testing progress as a system evolves.
The property "MemoryUsed" aggregates the memory used in all tests and hence is a measure of memory requirements to the extent tests are run in parallel. Typically however, tests are run sequentially with memory recycled after each test meaning that "MaxMemoryUsed" is likely to be more indicative of memory resources needed.
TestSummary when applied to a file runs TestReport on the file before summarizing the resulting report. Because some of a report's properties will vary from run to run, slightly different summaries may result despite being derived from the same source file. First create a summary from an existing report.
Now create a summary directly from the source file itself.
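Assuming toyFile names a source test file, the two routes might be compared as follows:

```wolfram
(* summary derived from an existing report *)
summaryFromReport = TestSummary[TestReport[toyFile]];

(* summary created directly from the source file *)
summaryFromFile = TestSummary[File[toyFile]];

(* counts agree, while timing properties may differ slightly *)
summaryFromReport["TestsCount"] === summaryFromFile["TestsCount"]
```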
Some properties are identical.
Other properties can differ marginally such as, for example, TimeElapsed.
Test reports automatically remove duplicate tests.
Test summaries, however, do not retain the original tests, so duplicates cannot be similarly removed.
TestSummary generally operates on the same types of input as TestReport, except for some file-based differences.
Define a file location according to a string.
TestReport can operate on a raw location address.
TestSummary, on the other hand, reserves textual input for symbol-related locations so that raw file locations require a File wrapper.
Conversely, TestSummary omits File wrappers when specifying a symbol-related test file.
TestReport requires a TestFile wrapper on symbol-related test files.
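The contrast can be sketched as follows; the path and the "Exceptions" category name are illustrative:

```wolfram
loc = FileNameJoin[{$TemporaryDirectory, "toy.wlt"}];

TestReport[loc];        (* TestReport accepts the raw path string *)
TestSummary[File[loc]]; (* TestSummary requires the File wrapper for raw paths *)

(* for symbol-related test files the conventions reverse *)
TestSummary["TestSummary/Exceptions"];
TestReport[TestFile["TestSummary/Exceptions"]]
```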
Creating a test summary for the tests associated with a global variable fails due to its evaluation.
Create a test summary for the tests associated with a global variable by using its string name.
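The distinction can be sketched as follows, with x an illustrative variable:

```wolfram
x = 5;  (* a global variable with an assigned value *)

TestSummary[x]    (* fails: x evaluates to 5 before TestSummary sees it *)
TestSummary["x"]  (* the string name avoids evaluation *)
```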