9th WORKSHOP ON TEACHING SOFTWARE TESTING (WTST 2010)
JANUARY 29 – 31, 2010
At the HARRIS INSTITUTE FOR ASSURED INFORMATION
CENTER FOR SOFTWARE TESTING EDUCATION & RESEARCH
ADVISORY BOARD MEETING, JANUARY 28, 2010
We recommend the Residence Inn by Marriott (http://www.melbourneresidenceinn.com/), located at 1430 S. Babcock Street in Melbourne, FL. Please call 321-723-5740 to book your room by January 13 and ask for the special WTST rates.
- Wed. & Thur. : $109.00 per room per night plus 11% tax
- Fri. & Sat. : $89.00 per room per night plus 11% tax
All reservations must be guaranteed with a credit card by January 13, 2010, at 6:00 pm. Rooms not reserved by then will be released for general sale; after that date, reservations can be made only based on availability.
If you cancel your reservation after 5:00 pm five days prior to arrival, you will forfeit a deposit of one night's room charge as a cancellation fee.
Check-in time is 4:00 p.m. and check-out time is 12:00 noon; early check-in and late check-out are not available.
ROOMS AND AMENITIES:
- Studio Room With A King Bed & Sofabed
- Complimentary Hot Breakfast Buffet
- Complimentary Wireless/Wired Internet Access
- Daily Housekeeping Service
- Fully Equipped Kitchens In All Rooms:
  - Full-Size Refrigerator, Microwave, Stovetop, and Coffee Maker
- Outdoor Heated Pool
- Exercise Facility
- Outdoor SportCourt
- 100% Nonsmoking Hotel
- LG 32″ Flat Panel LCD HDTV
This property also has one- and two-bedroom suites but does not offer discounted rates for them. You may nevertheless prefer to split the cost of a larger room with another attendee to save money. Contact the hotel to make arrangements.
CALL FOR PARTICIPATION
The Workshop on Teaching Software Testing is concerned with the practical aspects of teaching university-caliber software testing courses to academic or commercial students.
This year, we are particularly interested in teaching test-driven programming or other approaches to implementation-level testing.
We invite participation by:
- academics who have experience teaching testing courses or programming courses with significant testing components
- practitioners who teach professional seminars on software testing
- academic or practitioner instructors with significant online teaching experience and wisdom
- one or two graduate students
- a few seasoned teachers or testers who are beginning to build their strengths in teaching software testing.
There is no fee to attend this meeting. You pay for your seat through the value of your participation. Participation in the workshop is by invitation based on a proposal. We expect to accept 15 participants with an absolute upper bound of 25.
WTST is a workshop, not a typical conference. Our presentations serve to drive discussion. The target readers of workshop papers are the other participants, not archival readers. We are glad to start from already-published papers, if they are presented by the author and they would serve as a strong focus for valuable discussion.
In a typical presentation, the presenter speaks 10 to 90 minutes, followed by discussion. There is no fixed time for discussion. Past sessions’ discussions have run from 1 minute to 4 hours. During the discussion, a participant might ask the presenter simple or detailed questions, describe consistent or contrary experiences or data, present a different approach to the same problem, or (respectfully and collegially) argue with the presenter. In 20 hours of formal sessions, we expect to cover six to eight presentations.
We also have lightning presentations, time-limited to 5 minutes (plus discussion). These are fun and they often stimulate extended discussions over lunch and at night.
Presenters must provide materials that they share with the workshop under a Creative Commons license, allowing reuse by other teachers. Such materials will be posted at http://wtst.org.
BACKGROUND AND SUGGESTED TOPICS
Our focus is on implementation-level testing. We are interested in the types of tests that programmers write to understand and assess their own code. These include unit tests and lower-level integration tests. Presentations on “acceptance-test driven development” are out of scope of this meeting.
We were excited by the rise of interest in implementation-level testing that came with the agile development movement. Unfortunately, test-driven development seems to have been lightly adopted by the agile development community; even less formal approaches to unit testing appear to be minority practices (see, for example, http://www.agilealliance.org/show/1546, http://www.techworld.com.au/article/256619/unit_testing_doomed, and http://agile.dzone.com/videos/scott-ambler-agile-2009). Our impression is that unit testing has been gradually becoming less visible and less central in the agile community. Practitioners believe it is hard to do well (http://www.ambysoft.com/surveys/practices2009.html), and we've been finding that it is hard to teach well.
Perhaps part of the problem is that the books that introduced unit testing to new and relatively junior programmers are out of date. The newer books and online articles that we’ve seen are written for more experienced programmers. Most writing that we’ve seen focuses on test implementation (such as how to use the tools, how to create maintainable test suites, how to test private variables, etc.) rather than on what tests to run and why to run them. We haven’t seen much that is suitable for university courses that would be appropriate for average students or for commercial introductions to the practice of TDD.
So, what can we use, or what can we develop for use in our courses?
Here are *examples* of ideas that might be interesting to the participants at WTST:
- Course design: We’re looking for experience reports, not theory. We’re interested in your report if you have made a significant effort at teaching implementation-level testing and have insight into your successes, failures and challenges.
- Online course design: There are a lot of programming courses online. Are any of them good? Can we adapt their instructional methods to create online courses that emphasize implementation-level testing? (Not many universities will teach such courses; some good online courses could create learning opportunities for a broader pool of university students and commercial students.)
- Instructive examples: Do you have particularly successful activities or assignments? What are their details? What do students learn? How do you know? What problems do students have in attempting these and how do you recommend that we deal with them (if we reuse your activity)?
- Resources for implementation-level test-related activities and assignments: we have all heard of MERLOT and NSDL and several other repositories of learning objects. Have you found any good resources for software testing in any of these repositories? What have you found? How did you search? Can you give a demo, including your search strategy?
- Assessment: What techniques should we use to determine whether our assignments and activities are working? Have you used these assessment techniques? Can you give examples?
- Qualitative assessment methods: From sloppy anecdotal reports to rigorous qualitative design. How can we use qualitative methods to conduct research on the teaching of computing, including software testing?
- Differences in characteristics of learners that predict differences in effectiveness of activities or assignments
TO ATTEND AS A PRESENTER
Please send a proposal BY DECEMBER 15, 2009 to Cem Kaner <firstname.lastname@example.org> that identifies who you are, what your background is, what you would like to present, how long the presentation will take, any special equipment needs, and what written materials you will provide. Along with traditional presentations, we will gladly consider proposed activities and interactive demonstrations.
We will begin reviewing proposals on December 1. We encourage early submissions. It is unlikely but possible that we will have accepted a full set of presentation proposals by December 15.
Proposals should be between two and four pages long, in PDF format. We will post accepted proposals to http://wtst.org.
We review proposals in terms of their contribution to knowledge of HOW TO TEACH software testing. Proposals that present a purely theoretical advance in software testing, with weak ties to teaching and application, will not be accepted. Presentations that reiterate materials you have presented elsewhere might be welcome, but it is imperative that you identify the publication history of such work.
By submitting your proposal, you agree that, if we accept your proposal, you will submit a scholarly paper for discussion at the workshop by January 15, 2010. Workshop papers may be of any length and follow any standard scholarly style. We will post these at http://wtst.org as they are received, for workshop participants to review before the workshop.
TO ATTEND AS A NON-PRESENTING PARTICIPANT:
Please send a message by DECEMBER 15, 2009, to Cem Kaner <email@example.com> that describes your background and interest in teaching software testing. What skills or knowledge do you bring to the meeting that would be of interest to the other participants?
ADVISORY BOARD MEETING
Florida Tech’s Center for Software Testing Education & Research has been developing a collection of hybrid and online course materials for teaching black box software testing. We have NSF funding to adapt these materials for implementation by a broader audience. We have formed an Advisory Board to guide this adaptation and the associated research on the effectiveness of the materials in diverse contexts. We are interested in having a few new members. The Board will meet before WTST, on January 28, 2010.
This year’s meeting will focus on supporting/creating research collaborations by members of the Board. Our primary interest lies in expanding the community doing research/development on software testing education.
- The Center is interested in being involved in new proposals, but we want to foster good ideas at this meeting whether they involve the Center or not.
- The Center is interested in encouraging adoption, evaluation and improvement of the course materials that we’ve developed, but again, we are primarily interested in fostering good ideas at this meeting, not in promoting any particular set of ideas or materials.
- The Center’s work has primarily focused on system-level black box testing. The Advisory Board meeting is open to system-level ideas as well as implementation-level.
If you are interested in attending as a Board Member:
- If you are not already a member of the Board, please read this invitation and submit an application.
- If you are already a member and are willing to come on January 28, please let us know ASAP.
- In either case, please let us know whether you plan to stay for WTST.
Most of our NSF funding has been exhausted. We have some additional money from donations. However, at this point, we can cover only the hotel costs of Advisory Board members who attend the meeting and WTST; we cannot reimburse airfare. We'll discuss this in more detail in correspondence with the Advisory Board.
Support for this meeting comes from the National Science Foundation, the Association for Software Testing and the Harris Institute for Assured Information at the Florida Institute of Technology.
The hosts of the meeting are:
- Scott Barber (http://www.perftestplus.com)
- Rebecca Fiedler (http://www.beckyfiedler.com)
- Cem Kaner (http://www.kaner.com and http://www.testingeducation.org)
| Name | Organization | Paper (if applicable) |
| --- | --- | --- |
| Gentleman, Morven | Dalhousie | A Failure in Teaching Test Driven Design |
| Hoffman, Daniel | University of Victoria | A new tool for evaluating and improving skill in code reading |
| Kaner, Cem | Florida Tech | Experiences in teaching test-driven design–Part II [Presentation] [Course Syllabus] [Instructor's Course Evaluation] |
| Smith, Mike | University of Calgary | Experiences in using TDD as a key component in lectures and laboratories when teaching hardware / software co-design of embedded systems |
| Chen, Deborah K. | Norfolk State University | |
| Hoffman, Doug | Software Quality Methods | |
| Kiper, James | Miami University | |
| Nelson, Maria Augusta Vieira | Pontifical Catholic University of Minas Gerais, Brazil | |
| Tuya, Javier | Universidad de Oviedo | |
| Hagar, Jon | Lockheed Martin | Objectives for a unit testing course |
| Kaplan, Randy M. | University of Kutztown | Testability attributes of programming languages |
| Bhallamudi, Pushparani | Florida Tech | |
| Fiedler, Rebecca L. | Indiana State | |
| Kabbani, Nawwar | Florida Tech | Experiences in teaching TDD [Paper] [Slides] |
| Clarke, Peter | Florida International University | WReSTT testing repository |
| King, Tariq | Florida International University | (with Clarke) |
| Pizzica, Dee Anne | TerpSys | |
| Tilley, Scott | Florida Tech | Grant planning at advisory board meeting |
| Xie, Tao | North Carolina State University | |
| Betancur Alvarez, Jose Alejandro | Intergrupo | |
| Adam | | (with Dee Anne) |
OUR INTELLECTUAL PROPERTY AGREEMENT
- If we achieve our goal, the resulting work will be very interesting to the general testing and academic communities. Each of us will probably have our own take on what was learned. Participants (all people in the room) agree to the following:
- Any of us can publish the results as we see them. None of us is the official reporter of the meeting unless we decide at the meeting that we want a reporter.
- Any materials initially presented at the meeting or developed at the meeting may be posted to any of our web sites or quoted in any other of our publications, without further permission. That is, if I write a paper, you can put it on your web site. If you write a paper, I can put it on my web site. If we make flipchart notes, those can go up on the web sites too. None of us has exclusive control over this material. Restrictions of rights must be identified on the paper itself.
- NOTE: Some papers are circulated that are already published or are headed to another publisher. If you want to limit republication of a paper or slide set, please note the rights you are reserving on your document. The shared license to republish is our default rule, which applies in the absence of an asserted restriction.
- The usual rules of attribution apply. If you write a paper or develop an idea or method, anyone who quotes or summarizes your work should attribute it to you. However, many ideas will develop in discussion and will be hard (and not necessary) to attribute to one person.
- Any publication of material from this meeting should list all attendees as contributors to the ideas published, as well as the hosting organization.
- Articles will be circulated to WTST6 attendees before being published when possible. Circulation will be via posting to the wtst6 moodle site. At a minimum, notification of publication will be circulated.
- Any attendee may request that his or her name be removed from the list of attendees identified on a specific paper.
- If you have information that you consider proprietary, or that otherwise shouldn't be disclosed in light of these publication rules, please do not reveal that information to the group.
Funding for WTST 1-5 came primarily from the National Science Foundation, under grant EIA-0113539 ITR/SY+PE "Improving the Education of Software Testers." Partial funding for the Advisory Board meetings at WTST 6-8 came from the National Science Foundation, under grant CCLI-0717613 "Adaptation & Implementation of an Activity-Based Online or Hybrid Course in Software Testing." WTST 3-5 were hosted by the Florida Institute of Technology.
Opinions expressed at WTST or published in connection with WTST do not necessarily reflect the views of NSF. WTST is often co-sponsored by the Association for Software Testing (AST).
WTST is a peer conference in the tradition of the Los Altos Workshops on Software Testing (TM). We thank the AST for its encouragement and financial assistance in running WTST.