by Simon Carstensen and Joel Aufrecht
OpenACS docs are written by the named authors, and may be edited
by OpenACS documentation staff.
It seems to me that a lot of people have been asking for some guidelines on how to write automated tests. I've done several tests by now and have found the process to be extremely easy and useful. It's a joy to work with automated testing once you get the hang of it.
Create the directory that will contain the test script and edit the script file. The directory location and file name follow standards recognized by the automated testing package:
[$OPENACS_SERVICE_NAME www]$ mkdir /var/lib/aolserver/$OPENACS_SERVICE_NAME/packages/myfirstpackage/tcl/test
[$OPENACS_SERVICE_NAME www]$ cd /var/lib/aolserver/$OPENACS_SERVICE_NAME/packages/myfirstpackage/tcl/test
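Relative to the server root, the steps above amount to something like the following sketch. The test script's file name is an assumption here, following the common OpenACS convention that Tcl library files end in -procs.tcl:

```shell
# Create the standard test directory and the test script inside it.
# The file name "myfirstpackage-procs.tcl" is an assumption, following
# the usual convention that Tcl library files end in -procs.tcl.
mkdir -p packages/myfirstpackage/tcl/test
touch packages/myfirstpackage/tcl/test/myfirstpackage-procs.tcl
```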
To create a test case you call
aa_register_case test_case_name.
Once you've created the test case you start writing the needed logic.
We'll use the tutorial package, "myfirstpackage," as an example.
Let's say you just wrote an API for adding and deleting notes in the
notes packages and wanted to test that. You'd probably want to write a
test that first creates a note, then verifies that it was inserted,
call to aa_run_with_teardown, which basically means that all the
inserts, deletes, and updates will be rolled back once the test has
been executed. A very useful feature. Instead of inserting bogus data
like set name "Simon", I tend to generate a random string in order to avoid inserting a value that's already in the database:
set name [ad_generate_random_string]
Here's how the test case looks so far:
aa_register_case mfp_basic_test {
    My test
} {
    aa_run_with_teardown \
        -rollback \
        -test_code {
        }
}
Now let's look at the actual test code. That's the code that
goes inside -test_code {}. We want to implement test case API-001, "Given an object id from API-001, invoke mfp::note::get. Proc should return the specific word in the title."
set name [ad_generate_random_string]
set new_id [mfp::note::add -title $name]
aa_true "Note add succeeded" [exists_and_not_null new_id]
To test our simple case, we must load the test file into the system (just as with the /tcl file in the basic tutorial: since the file didn't exist when the system started, the system doesn't know about it). To make this file take effect, go to the APM and choose "Reload changed" for "MyFirstPackage". Since we'll be changing it frequently, select "watch this file" on the next page. This causes the system to check the file every time any page is requested, which is bad for production systems but convenient during development. We can also add some aa_register_case flags to make it easier to run the test. The -procs flag, which indicates which procs are tested by this test case, makes it easier to find procs in your package that aren't tested at all. The -cats flag, which sets categories, makes it easier to control which tests to run. The smoke category means that this is a basic test case that can and should be run any time you are doing any test.
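With those flags, the registration might look like this sketch. The category names follow acs-automated-testing's conventions; the proc list should name the procs your own test actually exercises:

```tcl
aa_register_case \
    -cats {api smoke} \
    -procs {mfp::note::add mfp::note::get mfp::note::delete} \
    mfp_basic_test {
    My test
} {
    # test body as before
}
```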
Once the file is loaded, go to ACS Automated Testing and click on myfirstpackage. You should see your test case. Run it and examine the results.
API testing covers only part of our package: it doesn't exercise the code in our adp/tcl pairs. For this we can use TCLwebtest, which provides a library of functions that make it easy to call a page through HTTP, examine the results, and drive forms. (TCLwebtest must be installed for these tests to work.) TCLwebtest's functions overlap slightly with acs-automated-testing; see the example provided for one approach to integrating them.
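For orientation, here is a sketch of the kind of TCLwebtest calls used below, driving a hypothetical form. The page URL, form name, and field name are illustrative assumptions, not part of the tutorial package:

```tcl
# Fetch a page over HTTP (URL assumed for illustration)
tclwebtest::do_request "http://yourserver/myfirstpackage/note-edit"

# Drive the form on that page; the form and field names
# ("note", "title") are assumptions for this sketch
tclwebtest::form find ~n note
tclwebtest::field find ~n title
tclwebtest::field fill "My test note"
tclwebtest::form submit

# Examine the resulting page
aa_true "Title appears in response" \
    [string match "*My test note*" [tclwebtest::response body]]
```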
Now we can add the rest of the API tests, including a test with deliberately bad data. The complete test looks like:
ad_library {
Test cases for my first package.
}
-test_code {
    set name [ad_generate_random_string]
    set new_id [mfp::note::add -title $name]

    aa_true "Note add succeeded" [exists_and_not_null new_id]

    mfp::note::get -item_id $new_id -array note_array
    aa_true "Note contains correct title" [string equal $note_array(title) $name]

    mfp::note::delete -item_id $new_id
    set get_again [catch {mfp::note::get -item_id $new_id -array note_array}]
    aa_false "After deleting a note, retrieving it fails" [expr $get_again == 0]
}
}
set name {-Bad [BAD] \077 { $Bad}}
append name [ad_generate_random_string]
set new_id [mfp::note::add -title $name]

aa_true "Note add succeeded" [exists_and_not_null new_id]

mfp::note::get -item_id $new_id -array note_array
aa_true "Note contains correct title" [string equal $note_array(title) $name]
aa_log "Title is $name"

mfp::note::delete -item_id $new_id
set get_again [catch {mfp::note::get -item_id $new_id -array note_array}]
aa_false "After deleting a note, retrieving it fails" [expr $get_again == 0]
}
}
# Request note-edit page
set package_uri [apm_package_url_from_key myfirstpackage]
set edit_uri "${package_uri}note-edit"
aa_log "[twt::server_url]$edit_uri"
twt::do_request "[twt::server_url]$edit_uri"
# Submit a new note
# Request index page and verify that note is in listing
tclwebtest::do_request $package_uri
aa_true "New note with title \"$note_title\" is found in index page" \
    [string match "*${note_title}*" [tclwebtest::response body]]
#-------------------------------------------------------------
# Delete Note
# 3) screen-scrape for the ID
# all options are problematic. We'll do #1 in this example:
set note_id [db_string get_note_id_from_name "
    select item_id
    from cr_items
    where name = :note_title
    and content_type = 'mfp_note'
" -default 0]

aa_log "Deleting note with id $note_id"

set delete_uri "${package_uri}note-delete?item_id=${note_id}"
twt::do_request $delete_uri

# Request index page and verify that note is no longer in listing
tclwebtest::do_request $package_uri
aa_true "Note with title \"$note_title\" is not found in index page after deletion." \
    ![string match "*${note_title}*" [tclwebtest::response body]]
} -teardown_code {
twt::user::delete -user_id $user_id
}
}
See also the section called “Automated Testing”.