David Schleich, PhD
Our colleges and programs within higher education institutions are moving steadily from being small, private educational enterprises to having both national and international profiles. As we move unrelentingly into the higher education realm, not only do accreditation processes become increasingly top of mind, but so too does the need for institutional data to inform strategic planning. That realm shapes the formation of our profession in that such institutional research data become instrumental in programmatic and regional institutional accreditation. Having timely data about every nook and cranny of our schools is more important than ever as we reach wider into historically less accessible applicant pools. Within those pools are potential ND, MSOM and MAc students who want to know a lot about us before signing up.
At the same time, not only are the declared and perceived needs of recruiting new people to build the profession as important as ever (Volkwein, 1999), but the data we need for accreditation compliance reporting are becoming more complex and more expensive to collect. In such a climate, institutional research capacity becomes critical to our success going forward. Institutional Research (IR) is not to be confused with academic or clinical research; rather, it is research about the institution itself. It is as much about enrollment management, assessment of academic effectiveness and resource management as it is about mission, services and quality indicators.
Focus of IR
Overall, the focus of institutional research is to “improve and to redesign” (Peterson, 1999, p. 83). The earlier function of institutional research at our schools, though, often tended to address an immediate operational or political crisis. Moving from a reactive to a more proactive approach to planning has been an inevitable process for our colleges. In this regard, Keller (1983) clarified early on the “strategic imperative” of colleges and universities, which “increasingly must respond to their environments” (p. 88). Those responses demand good data.
In this new century, growing avalanches of data point to many challenges that are “colliding rapidly” with the historical role and practices of our programs and colleges. These challenges include such variables as the bargaining power of students, suppliers and staff; the threat of new organizational entrants; the threat of substitute services; cultural diversity; the telematics revolution; metamorphosing students; and globalization (Peterson, 1999, pp. 91-95).
That increasingly busy path can be better navigated with the help of abundant, reliable, timely information and data, and these in turn are best generated by a strong IR unit. Even so, what is the likely receptiveness to such an organizational unit? Most often the early manifestations of our colleges and programs were preoccupied with establishing fiscally and operationally solid program delivery, and thus had not made a priority of the systematic mustering of what Terenzini calls “organizational intelligence” (1993, p. 21). Those days are gone.
Volkwein (1999) explains that at an early point in rapidly developing organizations, systematic intelligence gathering within the institution reveals that internal demands are likely in tension with external ones. In our case, the priority of developing operational viability and fiscal stability did not mean that the external expectations of “the profession” (for input, accountability, consistent educational standards) were ignored; rather, our schools had simply outgrown the “institutional memory” of their earlier manifestations, and priorities had multiplied: some internal, many external. At the same time, a newer cadre of staff brought to their management and leadership roles at the colleges networks and skill sets entirely different from those that had been operating before. Old relationship patterns with the local, state and provincial professional associations, for example, rapidly transformed as the colleges expanded their recruitment reach and their local impact.
During these transformative periods the academic culture in our colleges inevitably diverged from an administrative culture characterized by volunteerism, part-time engagement, scarce resources and less formal curriculum arrangements. This divergence was accelerated as the colleges added large numbers of full-time, permanent teaching and clinical staff. Inevitably our colleges’ institutional needs began to vary from what some leaders in the external profession thought the college’s priorities ought to be. While our attention in those days to relationship building outside the institution expanded with key external stakeholders, new, internal “impediments to change” began to emerge (Volkwein, 1999, p. 9). External professional members and other stakeholders continued to understand the college through the lens of their own memories and experiences as students or suppliers or board members some years earlier, a lens that invariably no longer matched the markedly different institution emerging before them. The need became ever greater for current institutional research to make available to the profession new data and other information about these new and different colleges.
For example, the Bastyr University of 1998 was a different place from the John Bastyr College of 1978, and different again from the Bastyr University of 2008. The Ontario College of Naturopathic Medicine (OCNM) of 1979 was a vastly different place from the CCNM of 1999. Similarly, the appearance of a naturopathic program at University of Bridgeport involved a somewhat different governance model and accountability framework than the one that appeared at SCNM a half decade earlier.
Launching an IR Strategy
The work of Peterson, Mets and Vega (1991) is relevant in helping us understand this emerging need for institutional research within these changing frameworks. Their comprehensive, annotated bibliography of theory and applications of institutional research clarifies to a certain extent what path naturopathic colleges might want to chart in order to launch or improve an IR strategy. No doubt, as our colleges continue to incorporate the many complex layers of “research” and “evaluation,” the process will assist in strengthening and maturing these organizations’ internal systems and external relationships.
Designing and implementing such capacity could derail, though, if newly minted data are sometimes used in less than helpful ways. Where a college’s new or improved IR initiative focuses its evaluative lens on the ND program, for example, some faculty and students, as Rossi and Freeman (1993) suggest, may tend to limit their attention and efforts to an assessment of the processes and standards of their immediate program, since their principal goal is to graduate and enter practice. They will, as Freeman predicts, ask whether the external expectations of their program (that is, expectations of the regulatory boards who license practitioners in the state or province) have been achieved and then move on to judge the “worth and utility” of what has been achieved.
Other faculty and students from the same cohort, however, may see in the accumulating information and data evidence that something undesirable may be occurring right now; for example, that a perceived shift from a holistic curriculum to a scientific and evidence-based one is underway, and they will want to use the same data and information to influence policy decisions of the Board around curriculum content and design. Not to worry, for at the same time as the evaluative lens gets pointed and polished, so too will the research lens be doing its work. Research, Freeman explains, is a process within this larger assessment and evaluative effort. “Research” will be going about its business of collecting data to “stretch the envelope of what is known in order to prove or disprove a hypothesis or presupposition” (Boulmetis and Dutwin, 2000, p. 4). The very articulation of such hypotheses or presuppositions (e.g., that there is an unwelcome shift toward science-based medicine in naturopathic college curriculum) becomes an essential starting point and justification for an IR strategy. Concomitantly, the longer view, which the layers of research and evaluation encourage, becomes more attractive, even critical.
IR Potential
If all goes well, an IR capacity can assist the institution to propel itself toward an increasingly strategic approach in its operational planning and gradually let go its grip on political reactivity and short-term targets. With regard to the example cited above, any debate over a scientific vs. holistic curriculum may be seen in the larger context of the integration of primary care medicine, for example. The point is that the principal objective of a formal IR department would be to choreograph the college’s or program’s evaluation, assessment and research activities within a framework whose goal is not second-guessing historical performance or assuming that some current activity exactly defines a long-term trend, but enabling and supporting future strategic development based on reliable, accurate historical data and information.
Ideally, a well-conceived and implemented approach to IR can generate for our colleges a broad front of activities “directed at describing the full spectrum of functions (educational, administrative and support) occurring within the college” (Middaugh, 1994, p. 1). The upshot is that institutional research can then “examine those functions in their broadest definitions, and embrace data collection and analytical strategies in support of decision-making at the institution” (Middaugh, 1994, p. 1). In the end, whatever IR platform is established and whatever its assessment and self-evaluation techniques, inputs and outputs, the impulse to do so for reasons of “external accountability” (to the profession, in particular) may well give way, in priority and purpose, to the different goal of internal enhancement that benefits students and patients, thus advancing the college’s real mission.
Some of our colleges and programs are at a stage of their development where an IR capacity will greatly help them move through what Volkwein calls the “collision of policy issues.” An institutional awareness of this phenomenon of policy issues collision is not new to our colleges. However, it is a dynamic that our now higher-profile schools have in common with other post-secondary institutions; namely, the intersection of such issues in the daily life of the campus. This, Volkwein explains, “challenges all of us in higher education” (Volkwein, 1999, p. 11).
This collision of policy concerns converging on our schools (e.g., scope of practice issues influencing clinical education curriculum; erosion of naturopathic philosophy and history as interest in a more scientific approach to naturopathic medicine grows; and variations in state and provincial regulations, to name a few) underscores the need for information to respond creatively and decisively to the increasing frequency of fender benders among the emerging sector called “medical academics,” the clinical practitioners of the profession at large, the students in training and the regulatory agencies governing the profession. Such bumps and wrinkles characterize the primary healthcare landscape in Canada and the U.S. generally and include issues around cost, productivity, access, effectiveness and accountability (Volkwein, 1999). Where such issues recur, IR cannot be too far away in response.
Generally speaking, the goal for our colleges in implementing an effective IR operation is to bring rationality to decision-making in institutions (Friedland, 1974, p. 27). For certain, we will want to chart a path that moves our changing organizations away from merely reacting to our various environments to a more exciting “trip tik” that would bring the schools to a different destination: the evolution of a strategic organization completely committed to professional formation. A strong IR capability built on increasingly sophisticated organizational intelligence would be ideal.
Next month we will consider the key elements of an institutional research program. What are the main inputs and outputs of such institutional intelligence gathering for our naturopathic programs and colleges? A complex stew.
David Schleich, PhD is president and CEO of NCNM, former president of Truestar Health and former CEO and president of CCNM, where he served from 1996 to 2003. Other previous posts have included appointments as vice president academic of Niagara College, and administrative and teaching positions at St. Lawrence College, Swinburne University (Australia) and the University of Alberta. His academic credentials have been earned from the University of Western Ontario (BA), the University of Alberta (MA), Queen’s University (BEd) and the University of Toronto (PhD).
REFERENCES
Boulmetis J and Dutwin P: The ABCs of Evaluation: Timeless Techniques for Program and Project Managers. San Francisco, 2000, Jossey-Bass.
Volkwein JF (ed): What Is Institutional Research All About? A Critical and Comprehensive Assessment of the Profession. New Directions for Institutional Research, No. 104. San Francisco, 1999, Jossey-Bass.
Peterson MW: The role of institutional research: from improvement to redesign. In JF Volkwein (ed), What Is Institutional Research All About? A Critical and Comprehensive Assessment of the Profession. New Directions for Institutional Research, No. 104. San Francisco, 1999, Jossey-Bass.
Keller G: Academic Strategy: The Management Revolution in American Higher Education. Baltimore, 1983, Johns Hopkins University Press.
Terenzini PT: On the nature of institutional research and the knowledge and skills it requires. In JF Volkwein (ed), What Is Institutional Research All About? A Critical and Comprehensive Assessment of the Profession. New Directions for Institutional Research, No. 104. San Francisco, 1999, Jossey-Bass.
Peterson MW et al: Theory and applications of institutional research. In WR Fendley and LT Seeloff (eds), Reference Sources: An Annotated Bibliography. Tallahassee, 1991, Association for Institutional Research.
Rossi PH and Freeman HE: Evaluation: A Systematic Approach. Newbury Park, 1993, Sage.
Middaugh MF et al: Strategies for the practice of institutional research: concepts, resources and applications, Resources in Institutional Research, No. 9. Tallahassee, 1994, The Association for Institutional Research, Florida State University.
Friedland EI: Introduction to the Concept of Rationality in Political Science. Morristown, 1974, General Learning Press.