TUSDStats Online Survey Procedures

To support the opinion research needs of the Tucson Unified School District, Department of Accountability and Research staff created an Internet-based survey application. The application combines .NET web programming with Structured Query Language (SQL) Server database automation to create a flexible, modular survey tool.

Prior to conducting a survey, department staff works with the survey authors to write effective questions and response options and to draft the text of the invitation emails and a survey purpose statement. Staff also works with the survey authors to identify the survey’s target population. On occasion, Human Resources staff is included in the identification process when population membership depends on data that resides in the PeopleSoft HR database system.

Working in partnership with the survey authors, department staff constructs an electronic “roster” of participants. The roster includes participant names, email addresses, and any additional demographic data relevant to the survey topic. Staff then uses an automated process to send a personalized email to each person on the roster. The email explains the survey topic and requests the recipient’s participation, providing a link to the survey webpage. Embedded in the link is a unique “key” that allows the respondent to complete the survey only once.
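
For illustration only, the automated invitation step might look something like the C# sketch below. The application’s actual code and schema are not reproduced here; the SurveyRoster table, its columns, the connection string, the mail server, and all addresses are hypothetical.

  // Illustrative sketch of the automated invitation step.
  // All table, column, server, and address names are hypothetical.
  using System;
  using System.Data.SqlClient;
  using System.Net.Mail;

  class InvitationMailer
  {
      static void Main()
      {
          string connStr = "Server=dbserver;Database=Surveys;Integrated Security=SSPI;";
          using (SqlConnection conn = new SqlConnection(connStr))
          {
              conn.Open();
              // Read each participant from the roster.
              SqlCommand cmd = new SqlCommand(
                  "SELECT Name, Email, SurveyKey FROM SurveyRoster WHERE Completed = 0",
                  conn);
              SmtpClient smtp = new SmtpClient("mailserver");
              using (SqlDataReader rdr = cmd.ExecuteReader())
              {
                  while (rdr.Read())
                  {
                      // The unique key embedded in the link lets the server
                      // accept only one completed survey per participant.
                      string link = "http://tusdstats.example/survey.aspx?key="
                          + rdr.GetGuid(2).ToString();
                      MailMessage msg = new MailMessage(
                          "surveys@example.org",    // sender (hypothetical)
                          rdr.GetString(1),         // participant email
                          "TUSD Survey Invitation",
                          "Dear " + rdr.GetString(0) + ",\n\n"
                          + "Please take a few minutes to complete this survey:\n"
                          + link);
                      smtp.Send(msg);
                  }
              }
          }
      }
  }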

When respondents click on the link to the survey page, the .NET application builds the survey page dynamically from a list of questions and response options stored in a SQL Server database. All formatting is done automatically by the server, eliminating the need for repeated design work by department staff. Once the survey window closes, the same server process automatically computes and formats the results.
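
The dynamic page build can be sketched as follows, again with a hypothetical schema (a SurveyQuestion table). In the actual application the response options are likewise read from the database; they are hardcoded here only to keep the sketch short.

  // Sketch of dynamic survey rendering in an ASP.NET page.
  // The SurveyQuestion table and its columns are hypothetical.
  using System;
  using System.Data.SqlClient;
  using System.Web.UI;
  using System.Web.UI.WebControls;

  public partial class SurveyPage : Page
  {
      protected void Page_Load(object sender, EventArgs e)
      {
          string connStr = "Server=dbserver;Database=Surveys;Integrated Security=SSPI;";
          using (SqlConnection conn = new SqlConnection(connStr))
          {
              conn.Open();
              SqlCommand cmd = new SqlCommand(
                  "SELECT QuestionId, QuestionText FROM SurveyQuestion ORDER BY SortOrder",
                  conn);
              using (SqlDataReader rdr = cmd.ExecuteReader())
              {
                  while (rdr.Read())
                  {
                      // One label and one radio-button group per question;
                      // the server does all of the formatting.
                      Controls.Add(new LiteralControl("<p>" + rdr.GetString(1) + "</p>"));
                      RadioButtonList options = new RadioButtonList();
                      options.ID = "q" + rdr.GetInt32(0);
                      // Response options would also come from the database;
                      // hardcoded here for brevity.
                      options.Items.Add("Strongly agree");
                      options.Items.Add("Agree");
                      options.Items.Add("Disagree");
                      options.Items.Add("Strongly disagree");
                      Controls.Add(options);
                  }
              }
          }
      }
  }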

When respondents successfully complete a survey, their completion is automatically recorded on the roster, and their answers to the survey questions are recorded in a separate, unconnected table, maintaining respondent anonymity. The roster table allows staff to keep track of who has completed the survey, so that a targeted follow-up email can be sent several days later, reminding non-completers that they still have time to respond before the administration window closes. Once the survey window closes, response-rate statistics can be drawn from the roster that allow staff to evaluate whether the response group is representative of the target group.
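
The separation that preserves anonymity might be sketched as follows (hypothetical tables and columns): the roster row records who completed the survey, while the answers are stored in a table with no column linking back to the respondent.

  // Sketch of anonymous response storage; all names are hypothetical.
  // The roster records WHO completed the survey; the response table
  // records WHAT was answered; no column joins the two.
  using System;
  using System.Data.SqlClient;

  class ResponseRecorder
  {
      public static void Record(SqlConnection conn, Guid surveyKey,
                                int questionId, string answer)
      {
          // 1. Mark completion on the roster (identity, no answers).
          SqlCommand mark = new SqlCommand(
              "UPDATE SurveyRoster SET Completed = 1 WHERE SurveyKey = @key", conn);
          mark.Parameters.AddWithValue("@key", surveyKey);
          mark.ExecuteNonQuery();

          // 2. Store the answer in a separate table (answers, no identity).
          SqlCommand save = new SqlCommand(
              "INSERT INTO SurveyResponse (QuestionId, Answer) VALUES (@q, @a)", conn);
          save.Parameters.AddWithValue("@q", questionId);
          save.Parameters.AddWithValue("@a", answer);
          save.ExecuteNonQuery();
      }
  }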

The dynamic nature of the online survey application makes it possible for department staff to begin administering an online survey immediately after the survey questions and participant roster have been finalized. Also, results are available immediately after the survey window closes. This last point is particularly important because it gives survey participants and other TUSD stakeholders immediate access to the information gathered through the survey process.


TUSDStats Data Analysis

Surveys conducted by Accountability and Research collect both quantitative and qualitative data by asking respondents a combination of closed- and open-ended questions. Focusing on both data types strengthens the survey results.

Quantitative Data Analysis

Quantitative data collected in surveys typically take the form of statistical or measurable summaries. In general, responses to a survey question are collected on a scale measuring the degree of consensus among respondents. For example, one scale used to measure responses to a question might be: strongly agree, agree, disagree, and strongly disagree.

This process of quantitative data analysis has been automated by Accountability and Research staff using a SQL Server database to compute results (see the discussion above). The automation makes survey results available immediately after the survey has been closed.
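
In essence, the automated tally reduces to a grouped count. A minimal sketch, assuming the hypothetical SurveyResponse table from above:

  // Sketch of the automated tally: a count and percentage per
  // response option for one question (hypothetical schema).
  using System;
  using System.Data.SqlClient;

  class ResultTally
  {
      public static void PrintResults(SqlConnection conn, int questionId)
      {
          SqlCommand total = new SqlCommand(
              "SELECT COUNT(*) FROM SurveyResponse WHERE QuestionId = @q", conn);
          total.Parameters.AddWithValue("@q", questionId);
          int n = (int)total.ExecuteScalar();
          if (n == 0) return; // no responses to summarize

          SqlCommand byOption = new SqlCommand(
              "SELECT Answer, COUNT(*) FROM SurveyResponse " +
              "WHERE QuestionId = @q GROUP BY Answer", conn);
          byOption.Parameters.AddWithValue("@q", questionId);
          using (SqlDataReader rdr = byOption.ExecuteReader())
          {
              while (rdr.Read())
              {
                  int count = rdr.GetInt32(1);
                  Console.WriteLine("{0}: {1} ({2:P0})",
                      rdr.GetString(0), count, (double)count / n);
              }
          }
      }
  }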

Qualitative Data Analysis

Accountability and Research developed a web-based application, the Survey Comment Coding Tool, to qualitatively analyze comment data collected from respondents who participated in online surveys. The main purpose of the Survey Comment Coding Tool is to systematically analyze unstructured or descriptive information and distinguish patterns that may emerge once coding is completed.

The tool aids in understanding and interpreting the meaning of comments made by survey participants by allowing the analyst to assign one or more codes (a topic heading, keyword, or phrase) to each comment. Coding refers to grouping the participants’ comments into a limited number of categories. New categories are added as needed as the analyst reads each comment; therefore, each set of codes is unique to the series of comments responding to a specific question. Table 1 below is an example of how codes are assigned to specific comments. In this example, four participants comprised the total survey population. The results are summarized in Table 2.

Given the above example, we can deduce that three of the four participants who responded to the survey question with a comment felt that hiring a new principal was an accomplishment.
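
Mechanically, producing a summary like Table 2 reduces to a frequency count over the assigned codes. The sketch below is illustrative only; the code labels and counts echo the four-participant example above, not actual survey data.

  // Sketch of the code-tally step: once the analyst has assigned one or
  // more codes to each comment, summarizing is a frequency count.
  // The data below are illustrative, mirroring the example in the text.
  using System;
  using System.Collections.Generic;

  class CodeTally
  {
      static void Main()
      {
          // codesByComment[i] holds the analyst-assigned codes for comment i.
          List<string[]> codesByComment = new List<string[]>();
          codesByComment.Add(new string[] { "Hired new principal" });
          codesByComment.Add(new string[] { "Hired new principal", "Communication" });
          codesByComment.Add(new string[] { "Hired new principal" });
          codesByComment.Add(new string[] { "Other accomplishment" });

          // Tally how many comments received each code.
          Dictionary<string, int> tally = new Dictionary<string, int>();
          foreach (string[] codes in codesByComment)
              foreach (string code in codes)
                  tally[code] = tally.ContainsKey(code) ? tally[code] + 1 : 1;

          foreach (KeyValuePair<string, int> kv in tally)
              Console.WriteLine("{0}: {1} of {2} comments",
                  kv.Key, kv.Value, codesByComment.Count);
      }
  }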

Conclusion

The automated online survey process and the quantitative and qualitative analysis procedures described above ensure that each survey administered by Accountability and Research is conducted in a replicable, repeatable manner. Additionally, the established survey procedures maintain participant anonymity, which is important when attempting to solicit valid feedback from a survey population. Furthermore, the automated survey process improves the timeliness of both conducting the survey and analyzing the results. Because the survey population is invited to participate through an automated email system, and statistical results are available immediately upon conclusion of the survey, survey findings can be provided more readily to decision makers.

In addition to automating the survey process, Accountability and Research conducted a survey to determine how to improve the quality of future surveys and to better understand the population that does not respond. Based on the findings, the Accountability and Research Department can follow a set of steps that might help achieve higher response rates in the future:

  1. Keep the survey short and to the point,
  2. Evaluate the survey population list prior to emailing invitations to ensure that the right population is being targeted,
  3. Clearly mark who the survey is from, followed by a brief title,
  4. Provide the participant with a detailed description of who the survey is targeting and what the data will be used for, and
  5. Following closure of the survey, send a final email inviting participants to review survey findings.

With these steps, and the suggestions provided by survey participants, it is possible to reduce non-response and minimize the potential for non-response bias.

For a more detailed description of the results of the TUSD Non Responders Survey: Understanding Response Bias, see:
View final Summary Report
View Summary Statistics


Department Contact Information

Office of Accountability and Research
442 East 7th Street Tucson, AZ 85705
Phone: (520) 225-5418 Fax: (520) 225-5226


Last updated July 2006