Academics and Staff Set Continuous Goals

02/26/2010

People who are looking for online degree programs make quick decisions. That's why the Online Campus Admission Department welcomed the idea of starting a tracking process, with the goal of responding to inquiries from prospective students within 24 hours.

When Library staff started their own tracking process, they discovered students in certain departments heavily rely on inter-library loans. That gave them solid information as a basis for making decisions.

The Human Resources department also started tracking, and determined which parts of the New Employee Orientation the new hires found most useful.

These developments came about because The Chicago School of Professional Psychology (TCSPP) is analyzing data to improve administrative functions in much the same way as faculty have been assessing student performance data to improve student learning for years.

Starting in 2008, the school asked departments to examine how they do their work, gather data on their performance, and see whether that data pointed to ways of working more effectively, said Kathryn Talley, director of the Office of Institutional Research, who led the effort.

This effort, called the Institutional Effectiveness Initiative, asked heads of administrative departments to gather data on one or two of their functions. The data helped departments measure their response times, their productivity, or other criteria that gave them insight in evaluating how effectively they were meeting their goals. Academic departments were already using data in their annual program reviews.

"If you set high standards, people will try to reach them," said Talley. "I'm thrilled at the number of people who want to do excellent work here at our school."

Quicker turnarounds, more usefulness

The project was large in scope, asking 29 administrative/operational units to begin tracking their results with data.

"I thought it was a very good process," said Julie Bechtold, system director of career services. She chaired the Administrative Effectiveness Review Committee, which reviewed the units' findings. "It was helpful for people here to see they were making progress and that there's room for improvement. They were able to benchmark from that and move forward instead of feeling like, 'I'm just spinning my wheels.'"

Pari Pinyo, director of admission for the online degree division, said the process motivated his department to use software to track its response times to inquiries from prospective students.

"Online students make decisions to enroll quickly," he said. "It is vital that we respond to all inquiries in a timely fashion, and this process helped us move toward our goal of responding within 24 hours."
Indu Aggarwal, director of libraries, also said she found the process helpful.

"It was interesting to see how our users were using our library services and how they were taking advantage of interlibrary loans," she said.

In addition, the Facilities Department tracked its overtime hours and efficiency, and Financial Aid surveyed students about their perception of the staff's availability and service.

A shift towards tracking

The idea of measuring functions with data is relatively rare in the administrative/operational side of academia, Talley said.

The academic side must already meet accreditor-required standards for assessing and improving student learning, but those standards only partially take administrative functions into account.

But across most of the functions on the staff side, such as marketing, information technology, or facilities, no government agency or regulatory bureau is measuring performance, she explained.

TCSPP realized earlier than most in higher education that efficiency on the administrative side greatly contributes to a school's overall quality. Back in October 2007, it established a task force to develop a continuous quality improvement plan.

A roadmap to quality

The task force developed a roadmap, the Institutional Effectiveness Initiative, and the President's Cabinet approved it in June 2008.

Talley's office kicked the Initiative into gear, asking administrative units to follow academic programs' lead by identifying at least one performance indicator they could measure, then collect data and analyze it. The units submitted their findings in reports by a March 2009 deadline.

Committees gave the reports a preliminary review in April and provided constructive feedback to the units on how they could refine their measurements and improve their performance. In July, an overarching committee representing both academic and administrative functions, the Institutional Effectiveness Review Committee, performed a big-picture review.

"It was a constructive, eye-opening experience to work with such a variety of departments and data. We summarized the findings and sent our conclusions and recommendations on to senior leadership," Talley said.

Continuing the process annually

And so began the inaugural year of the Institutional Effectiveness Initiative. It is now an annual process devoted to continuous improvement. This year's efforts are well underway, and many will build on baselines established last year.

"Everybody learned that tracking can help you do your job better," Talley said. "There were tons of processes that improved after doing this only one time."

While managers did have to devote time and thought to the process, Talley said it streamlined the workload for many, and gave others a better grasp of what they were accomplishing.

"It's a lot of work, and no external agency is making us do it, but it's a result of people here saying we want this institution to be as good as it can be," she said.
