Institutional Research Structure: The Nuts and Bolts of an Ideal IR Model for Naturopathic Colleges

 In Education

David Schleich, PhD

In recent months we have been considering the value to our naturopathic colleges of developing an institutional research (IR) department. In this process it is important to take into account what Middaugh describes as “possible interactions of measures” in a well-known framework, the “input-process-output,” or IPO, model. Its categories of measures are as follows (a brief illustrative sketch follows the list):

  • Inputs (college selection survey; advanced placement tests; analysis of student characteristics; financial aid)
  • Intersecting inputs and processes (survey of housing needs; surveys of special population needs; survey of provincial and national demographics affecting entry to practice)
  • Process (student satisfaction surveys, faculty and staff satisfaction surveys, student attitude surveys, faculty and staff attitude surveys, program evaluation)
  • Intersecting inputs, process and outputs (longitudinal studies of overall effect of college regionally, nationally, internationally; retention analysis; career plans and employment; portfolio assessment)
  • Outputs (post-graduate credential awarded; NPLEX and entry to practice/licensing results; alumni surveys)
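
By way of illustration only, the sketch below shows one way an IR office might catalog its assessment instruments against these IPO categories so that coverage gaps are easy to spot. The catalog structure, the dictionary keys and the instrument names are simply assumptions made for this example; they are not part of Middaugh's model.

# A minimal sketch (Python) of an instrument catalog keyed by IPO category.
# Instrument names are illustrative only.
IPO_CATALOG = {
    "inputs": [
        "college selection survey", "advanced placement tests",
        "student characteristics analysis", "financial aid analysis",
    ],
    "inputs x processes": [
        "housing needs survey", "special population needs surveys",
        "entry-to-practice demographics survey",
    ],
    "processes": [
        "student satisfaction survey", "faculty/staff satisfaction survey",
        "attitude surveys", "program evaluation",
    ],
    "inputs x processes x outputs": [
        "longitudinal impact studies", "retention analysis",
        "career plans and employment", "portfolio assessment",
    ],
    "outputs": [
        "post-graduate credential awarded", "NPLEX/licensing results",
        "alumni surveys",
    ],
}

def uncovered_categories(catalog):
    """Return IPO categories with no assessment instrument assigned yet."""
    return [category for category, instruments in catalog.items() if not instruments]

for category, instruments in IPO_CATALOG.items():
    print(f"{category}: {len(instruments)} instrument(s)")
print("categories needing instruments:", uncovered_categories(IPO_CATALOG))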

Because a critical element of any assessment activity within this framework is “high quality and, therefore, useful” information (Middaugh, 1994, p. 46), a general set of steps is important to incorporate into the design of the IR function. Davis (1989) recommends the following general approach for any assessment.

  • Focus the evaluation by identifying goals and constraints from the client
  • Identify stakeholders and audiences
  • Generate questions of interest to the stakeholders
  • Refine and limit questions through negotiation with the vested parties
  • Determine the methodology: For each question specify the instrument or data source, the sample from whom data have been or need to be collected, the time frame for collection of data, methods of analysis and the intended use of the results
  • Communicate the findings to stakeholders in ways that they can use the results

A naturopathic college IR department would not only make use of the guidelines and the IPO model outlined above; it could also be organized as a centralized model at our colleges. Slark (1992) describes a successful model of this kind established in the 1990s at Rancho Santiago College. He describes its IR program as “an effective, integral part of [his] college’s decision-making process” (p. 5). Slark goes on to explain that theirs is a model that “assumes a broad base of responsibility for institutional research” (p. 6). The model includes formal relationships between the IR office and other departments. As such, the “research activities become credible because they reflect the institutional priorities established by his college’s councils, the chancellor and the cabinet. The director of research and planning is an institutional leader and active member of the college team, rather than an isolated actor” (pp. 6-7). Strong co-leadership of projects is implicit in the structure. This centralized approach “has the obvious advantage of closely coordinating an institution’s research activities” (p. 10). Further, it avoids duplication, reveals gaps in the institution’s research knowledge and “allows a focus on the changes that will result from the research” (p. 11). The disadvantage of the research function being separate, and of its director sometimes being perceived as an outsider, can be overcome by strong connections to the programs and services being assessed.

Principal Functions

The following principal functions can be the basis for a naturopathic college IR department: research/evaluation services; planning and market assessment services; management information services; technical support and consulting services.

Research/Evaluation Services

  • Systematic, biannual program review
  • Annual student evaluation of instructors and college services, including the library, cafeteria, student services, residences/dormitories, plant and property, and classroom/clinical experience
  • Annual student evaluation of counselors and advisors
  • Ongoing patient evaluation of clinical services, information services, customer service
  • Conduct of needs assessments and feasibility studies
  • Facilitation of educational research support services in conjunction with the office of the associate dean of research and graduate programs

Planning and Market Assessment Services

  • Periodic surveys of college students and graduates
  • Periodic surveys of the profession locally, provincially, nationally
  • Continuous monitoring of student and patient demographics

 

Management Information Services

  • Monitoring and reporting of college enrollments
  • Coordination and completion of provincial, federal and other reports and surveys
  • Provision of data for college and committee initiatives and projects
  • Assessment of institutional effectiveness and student outcomes

Technical Support and Consulting Services

  • Technical support/consulting for faculty research

Within these parameters, the following core indicators will be routinely assessed and reported on:

Student Services

  • Distribution of entering and exiting grade averages
  • Acceptance and yield rates: entry from undergraduate background; professional second entry; entry from graduate background; special admission (a worked example of these rates follows this list)
  • Percent of U.S., Canadian, international students
  • Percent of students holding scholarly awards
  • Student retention, attrition and progress toward the ND credential
  • Assessment of student services provided: recruitment, financial aid, counseling, student life
  • Time to graduation
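
To make the admissions and retention indicators above concrete, here is a minimal sketch of how they might be computed. The cohort figures are invented for illustration and do not describe any actual college.

# Hypothetical admissions funnel and first-year retention for one cohort.
def acceptance_rate(admitted: int, applied: int) -> float:
    """Share of applicants offered admission."""
    return admitted / applied

def yield_rate(enrolled: int, admitted: int) -> float:
    """Share of admitted applicants who actually enroll."""
    return enrolled / admitted

def retention_rate(returning: int, entering_cohort: int) -> float:
    """Share of an entering cohort still enrolled at the next census date."""
    return returning / entering_cohort

applied, admitted, enrolled, returning = 420, 180, 95, 88  # invented numbers
print(f"acceptance rate:      {acceptance_rate(admitted, applied):.1%}")   # 42.9%
print(f"yield rate:           {yield_rate(enrolled, admitted):.1%}")       # 52.8%
print(f"first-year retention: {retention_rate(returning, enrolled):.1%}")  # 92.6%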

Academic Services

  • Full-time faculty: graduate and professional credentials
  • Library resources: volumes acquired/held; online services; utilization rates; total spending
  • Percent of faculty holding scholarly awards
  • Percent of faculty publishing
  • Percent of faculty with education credentials
  • Courses listed/courses actually scheduled
  • Percent of faculty with doctorates
  • Percent of faculty with ND doctorate (first professional degree)
  • Percent of faculty with other healthcare profession designation
  • Percent of faculty licensed/registered to practice as NDs
  • Instructional load: teacher/student ratio; student contact hours; class size; ratio of sections to courses (see the sketch following this list)
  • Academic staff on research or clinical practice leave
  • Research grants per professor
  • Research grants and contracts as a percentage of operating revenue
  • Educational technology: audiovisual collection per student; audiovisual collection per course; audiovisual collection with copyright; multimedia classrooms; courses with computer applications
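
As a further illustration, the sketch below derives the instructional load measures named above (sections per course, average class size, weekly student contact hours) from a small, fabricated course roster. The course names and all numbers are assumptions made for the example.

# Each entry: (course, sections scheduled, students per section,
# weekly contact hours per section). Figures are fabricated.
courses = [
    ("Botanical Medicine I", 2, 38, 3),
    ("Clinical Nutrition", 1, 45, 4),
    ("Hydrotherapy Lab", 3, 15, 2),
]

total_sections = sum(sections for _, sections, _, _ in courses)
total_enrollment = sum(sections * size for _, sections, size, _ in courses)
contact_hours = sum(sections * size * hours for _, sections, size, hours in courses)

print("sections per course         :", round(total_sections / len(courses), 2))      # 2.0
print("average class size          :", round(total_enrollment / total_sections, 1))  # 27.7
print("weekly student contact hours:", contact_hours)                                # 498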

Plant and Property

  • Space allocation/purpose
  • Space utilization: classrooms; teaching laboratories; research laboratories; academic department offices; central administration offices; library; college support and services; maintenance shops

Finances

  • Comparison of the growth rate in educational and general (E&G) revenues with the growth rate in E&G expenditures
  • Ratio analysis: revenue contribution ratios; expenditure demand ratios
  • Salary and compensation comparisons
  • Annual reports on the economic status of the profession
  • Salary equity analyses
  • Instructional productivity analysis, semester-based and annualized
  • Research and service productivity
  • Cost analysis: expenditures by object and function; direct instructional cost per student credit hour, by semester and annually (a worked example follows this list)
  • Administration productivity
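
Two of these finance indicators lend themselves to a brief worked example: direct instructional cost per student credit hour, and a simple revenue contribution ratio. The dollar figures below are hypothetical and are included only to show the arithmetic.

# Direct instructional cost per student credit hour (SCH) and the share of
# E&G revenue contributed by one source (e.g., tuition). Figures invented.
def cost_per_student_credit_hour(direct_instructional_cost: float,
                                 student_credit_hours: float) -> float:
    return direct_instructional_cost / student_credit_hours

def revenue_contribution_ratio(source_revenue: float, total_eg_revenue: float) -> float:
    return source_revenue / total_eg_revenue

print(f"direct cost per SCH:        ${cost_per_student_credit_hour(1_800_000, 12_000):,.2f}")  # $150.00
print(f"tuition contribution ratio: {revenue_contribution_ratio(7_200_000, 9_000_000):.1%}")   # 80.0%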

IR Implementation Guidelines

Middaugh et al. (1994) propose basic guidelines for the implementation of an institutional research capacity, all of which are relevant and useful for us:

  • Firmly establish the centrality of our naturopathic colleges’ respective offices of IR “in coordinating campus databases and disseminating institutional data”
  • Establish a friendly, professional relationship with the manager of management information systems and his or her staff
  • Wherever possible, obtain data from primary sources
  • Minimize the use of surveys where possible and, where surveys are used, favor professionally prepared, commercially available instruments
  • Write reports that are easy to understand, brief and concise, and that are focused on influencing change, always moving from information to improvement
  • IR should never be complacent; the IR department must constantly seek new ways to generate factual information that leads to concrete policy activity

With the above design elements, framework and implementation guidelines in mind, let us now consider how to activate an IR capacity at our colleges. See the accompanying article, Suggested Model IR Implementation Project.

Purpose

To generate a framework for implementing an ongoing, comprehensive assessment of our colleges’ operational and strategic activities by utilizing the resources of systematic IR.

Theoretical/Conceptual Framework and Rationale

Basing the development and implementation of the project on theoretical considerations about IR, in particular those of Volkwein and Peterson (1999), our colleges could strike steering committees to design and introduce into the culture an office of IR that will have a clear mandate and be properly resourced. We can turn to the work of Middaugh et al. (1994) for pragmatic resources to assist us in the design and implementation process. Of particular use can be the work of Peterson et al. (1991), Saupe (1990) and Norris and Poulton (1987) on the theory and application of IR; the Society for College and University Planning (SCUP) for a focus on planning issues; Dunn (1989) for assistance with those assessment elements relating to electronic media and information sharing; Krol (1992) for help with Internet considerations; Kalton (1984) and Suskie (1992) for “profitable references” about survey research; Scannell (1992) for pragmatic analysis of admissions and financial aid processes; Teeter and Brinkman (1992) for guidelines about identifying and using peer institutions for IR; Astin (1993) and Tinto (1987) for references on student development; Halpern (1987) and Noel et al. (1985) on student retention; and, finally, Melchiori (1989) on alumni survey research.

The steering committee could incorporate existing data-collection files and efforts into its resources and begin immediately to design new procedures as necessary. Finally, the steering committee could establish at least two pilot projects during the first year at any college choosing to implement this approach. For example, the first project could be in the student services area, and the second in the clinic operations area.

Criteria for Assessment

There are four key indicators of the successful design and implementation of this IR strategy:

  • Staff, faculty, student consensus on the model developed
  • Successful design by the end of month one of an eight-month project
  • Successful implementation of a multi-phase rollout strategy by the end of the ninth month of the project
  • Articulation of operationally defined outcomes: appropriate use of measurable behavioral terms

Significance and Implications

If our colleges are to take their proper place among the distinguished colleges and universities of naturopathic medicine evolving in North America, they must develop an IR capacity that can assist with continuous improvement objectives, help them respond to external pressures from primary healthcare professionals, their agencies and associations, and government, and help them celebrate successes by reporting outcomes accurately and reliably as they are achieved.

Anticipated Outcomes or Expectations

The naturopathic community will welcome an IR resource, and will especially value objective, reliable data describing the efficacy of its inputs, processes and outputs. In particular, the profession’s inconsistent response to state and provincial variations in health policy, and to the reforms that are part of the healthcare landscape, can be countered more effectively with reliable data about the quality of the professional education that nourishes each new wave of professionals at the point of entry to practice.


David Schleich, PhD, is president and CEO of NCNM, former president of Truestar Health, and former CEO and president of CCNM, where he served from 1996 to 2003. Other previous posts have included appointments as vice president academic of Niagara College, and administrative and teaching positions at St. Lawrence College, Swinburne University (Australia) and the University of Alberta. His academic credentials were earned at the University of Western Ontario (BA), the University of Alberta (MA), Queen’s University (BEd) and the University of Toronto (PhD).

References

Astin A: What Matters in College. San Francisco, 1993, Jossey-Bass.

Davis BG: Demystifying assessment: learning from the field of evaluation. In PJ Gray (ed), Achieving Assessment Goals Using Evaluation Techniques. New Directions for Higher Education, No. 67. San Francisco, 1989, Jossey-Bass.

Dunn JA: Electronic media and information sharing. In P Ewell (ed), Enhancing Information Use in Decision Making. New Directions for Institutional Research, No. 64. San Francisco, 1989, Jossey-Bass.

Gray ST: Evaluation With Power: A New Approach to Organizational Effectiveness, Empowerment, and Excellence. San Francisco, 1998, Jossey-Bass.

Halpern DF (ed): Student Outcomes Assessment: What Institutions Stand to Gain. New Directions for Higher Education, No. 59. San Francisco, 1987, Jossey-Bass.

Kalton G: Introduction to Survey Sampling. Beverly Hills, 1984, Sage Publications.

Krol E: The Whole Internet: User’s Guide and Catalog. Sebastopol, 1992, O’Reilly and Associates.

Melchiori G: Alumni research: methods and applications. In New Directions for Institutional Research. San Francisco, 1989, Jossey-Bass.

Middaugh MF et al: Strategies for the practice of institutional research: concepts, resources and applications. Resources in Institutional Research, No. 9. Tallahassee, 1994, Association for Institutional Research, Florida State University.

Noel L et al. (eds): Increasing Student Retention. San Francisco, 1985, Jossey-Bass.

Norris DM and Poulton NL: A Guide for New Planners. Ann Arbor, 1987, Society for College and University Planning.

Peters R: Accountability and the end(s) of higher education, Change Nov-Dec:16-23, 1994.

Peterson MW et al: Theory and applications of institutional research. In WR Fendley and LT Seeloff (eds), Reference Sources: An Annotated Bibliography. Tallahassee, 1991, Association for Institutional Research.

Peterson MW: The role of institutional research: from improvement to redesign. In JF Volkwein (ed), What Is Institutional Research All About? A Critical and Comprehensive Assessment of the Profession. New Directions for Institutional Research, No. 104. San Francisco, 1999, Jossey-Bass.

Popham WJ (ed): Evaluation in Education: Current Applications. Berkeley, 1974, McCutchan.

Saupe JL: The Functions of Institutional Research (2nd ed). Tallahassee, 1990, Association for Institutional Research.

Scannell J: The Effect of Financial Aid Policies on Admission and Enrollment. New York, 1992, College Entrance Examination Board.

Society for College and University Planning (SCUP): www.scup.org, Ann Arbor, University of Michigan.

Slark J: The traditional centralized model of institutional research, New Directions for Community Colleges, No. 72:5-11, 1992.

Suskie LA: Questionnaire Survey Research: What Works. Tallahassee, 1992, Association for Institutional Research.

Teeter D and Brinkman P: Peer institutions. In M Whitely et al. (eds), The Primer for Institutional Research. Tallahassee, 1992, The Association for Institutional Research.

Tinto V: Leaving College: Rethinking the Causes and Cures of Student Attrition. Chicago, 1987, University of Chicago Press.

Terenzini PT: On the nature of institutional research and the knowledge and skills it requires. In JF Volkwein (ed), What Is Institutional Research All About? A Critical and Comprehensive Assessment of the Profession. New Directions for Institutional Research, No. 104. San Francisco, 1999, Jossey-Bass.

 
