
Rider checks on student attitudes with survey
by Jess Decina
About how often have you attended an art exhibit this year? How often have you asked questions in class or contributed to class discussions? How much has your coursework emphasized memorization of facts?
Freshmen and seniors might be looking at questions such as these if they choose to participate in the 2008 National Survey of Student Engagement (NSSE), which was distributed to students via e-mail this week.
The survey, which Rider has participated in since 2001, aims to assess student attitudes in five areas: level of academic challenge, amount of student interaction with faculty, degree of support in the campus environment, amount of active and collaborative work in and out of class, and frequency of out-of-class educational enrichment. The NSSE is a survey designed “to check the way that students are engaged on campus,” said Eileen Corrigan, the University’s research analyst.
“[NSSE] wants to know how often students are participating in class, working on projects with professors, how often they’re prepared for class [or] how often they are working with other students,” she said. “We’re just trying to put together trend data.”
Ron Walker, associate vice president of institutional analysis, said he sees NSSE as a way to gauge information about Rider that goes beyond “how many books you have in the library.”
“It allows a student to see what’s going on in the teaching environment,” he said. “College is a complex experience for everyone, and this tries to redirect the focus to an educational and co-curricular aspect.”
In order to gauge this information, the first half of the student survey asks quantitative questions, such as how much time students spend doing homework or how many papers of a specific length students have written in the past year.
“I thought it was really random,” said freshman Jacqui Lehman, who completed the survey earlier this week. “The questions asked made sense, but it seemed way too general.”
The survey results won’t be available until later in the year; usually, the University has sorted out the data by early August, Walker said.
“We like to do our own analysis,” he said. “You get your score but it gets matched against schools like yours. At that point you can say, ‘We scored this; the national mean was this. What are we doing right or wrong?’ The one thing about data is that it allows you to measure things accurately. It is what it is. Data grounds you.”
Some students, like freshman Danny Viola, are eager to share their experiences so far.
“I don’t give back to Rider a lot and I figure a survey is an easy, anonymous way to do it,” he said. “I feel that it’s the only way I can get my feelings out. Normally I wouldn’t take the survey, but when it’s about Rider, I can get out real answers so that they have something real to go on.”
However, not all seniors and freshmen have received the survey. This may be because of the number of credits students have accrued, as some may be behind or ahead of their classmates in credits earned.
Walker and Corrigan are aiming for a response rate of at least 25 percent among students who received the survey, though “more is better,” Walker said.
“It fits right in with the accountability factor,” he said. “It’s measuring why you go to school.”