Tests that matter: Measuring the PDS Difference

We asked… They told.

The High School Survey of Student Engagement (HSSSE, commonly pronounced "Hessie") is a highly regarded survey measuring the academic, social, and emotional engagement of high school students across the United States. It is administered annually by the Center for Evaluation and Education Policy at Indiana University. Since the survey's inception, over 500,000 students nationally have participated in the HSSSE.

Among its purposes are:

1. To help schools explore, understand, and strengthen student engagement, and

2. To conduct rigorous research on issues of student engagement.

We administered the survey to all PDS high school students last spring. It takes 20 to 30 minutes and consists of questions designed to investigate levels of student engagement across three dimensions of life in school:

  • Cognitive, intellectual, and academic engagement – the work students do and the ways they go about their work
  • Social and behavioral engagement and participation – the ways they interact with the school community
  • Emotional engagement – how students experience life in school and how they feel

Many of the questions asked on the HSSSE align with our vision of a PDS graduate, so for us it is a key test of how well we are living up to our mission. I will give some specific examples of the questions in a later post.

The intention as explained on the actual question bubble sheet itself is that student responses will help the school “better understand (student needs) in order to create a school environment that is engaging, challenging and productive….”

We now have our results – a thick binder stuffed with data, details, comparisons, and charts. It's all a bit overwhelming. But as we begin to mine it for information we can use, several things become apparent – chief among them that, in national comparisons, PDS students score very highly on engagement across all three dimensions. This is not a surprise, of course, but it is useful to have statistical confirmation of our anecdotal evidence and gut feeling.

That said – there is never time for complacency, and we will use our students' responses and this data to identify areas for ongoing focus. There are areas we want to take a close look at, and we need to ask students what the numbers might mean in terms of life at PDS.

So: What did we learn? What did our students tell us?

Here – as a very preliminary stab at sharing these complex results – is a summary of the three dimensions outlined above.

In reading the chart, take a look at the "Effect" column. Effect size indicates the practical significance of the mean difference between the groups being compared. In educational research, it is most common to find effect sizes between 0.10 and 0.40. As a rule of thumb: .20–.49 is small, .50–.79 is medium, and .80 or above is large.
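For readers curious about the arithmetic behind that column: effect sizes of this kind are commonly calculated as Cohen's d – the difference between two group means divided by their pooled standard deviation. A minimal sketch follows; note that the numbers and group labels are invented for illustration and are not taken from our HSSSE report.

```python
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b, n_a, n_b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2) / (n_a + n_b - 2)
    )
    return (mean_a - mean_b) / pooled_sd

# Hypothetical example: a school mean of 3.4 vs. a comparison mean of 3.0,
# both groups with a standard deviation of 0.5 and 100 respondents each.
d = cohens_d(3.4, 3.0, 0.5, 0.5, 100, 100)
print(round(d, 2))  # 0.8 -> "large" by the thresholds above
```

The takeaway is that an effect size is expressed in standard-deviation units, which is why the same thresholds apply regardless of the survey's raw scale.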

Now check the results in the three dimensions, paying particular attention to the effect size.

Your reactions and questions welcome.


  1. Dierj:

    These results show that which we all know – that dedication and heartfelt engagement with a task lead to the best possible results. Well done.

  2. admin:

    Hi Dierj: Pretty astonishing results. If a 0.8 standard deviation difference counts as large, then what do we make of a difference of 1.64!
