Appendices

Appendix 1: List of Legislation Administered by the Ministry

Accountants (Certified General)

Accountants (Chartered)

Accountants (Management)

Applied Science Technologists and Technicians

Architects

Architects (Landscape)

British Columbia Innovation Council

College and Institute

Degree Authorization

Engineers and Geoscientists

Music Teachers (Registered)

Open Learning Agency

Private Career Training Institutions

Public Education Flexibility and Choice (Part 1)

Royal Roads University

Scholarship

Sea to Sky University

Thompson Rivers University

Trinity Western University Foundation

University

University Foundations

Workers Compensation (s. 3 (6))

Appendix 2: Report on Accountability Framework Measures

Institutions submit data on measures in the Accountability Framework that are not included in the Ministry Service Plan, but that provide additional contextual information:

Annual educational activity occurring between May and August

The Ministry is committed to ensuring that public post-secondary institutions maximize the efficient use of existing publicly funded facilities before additional funds are allocated for capital expansion. One of many possible ways to increase efficiency is to promote year-round use of facilities for student instruction. For many reasons, the period of May through August has historically been a time of reduced instructional activity at most institutions (although it may also be a period of increased activity of other types, such as research). Institutions that are able to offer more instructional activity during this period may ease some of the difficulty of meeting student demand during the fall and winter and may make more efficient use of resources and capacity.

This measure is intended to provide an indication of overall system progress in this regard. It is the percentage of annual instructional activity conducted during the summer academic period rather than the fall and winter academic periods, and it is determined using data from public post-secondary institutions. Universities provide data showing equivalent enrolments taught (EETs) through The University Presidents' Council of British Columbia; colleges, university colleges and institutes provide student contact hour data to the Ministry. The rate is calculated by dividing the program activity that occurs in the months of May to August by the total annual activity.
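As an illustration only, the sketch below applies this calculation to hypothetical activity figures; the variable names and numbers are assumed for demonstration and are not Ministry data.

    # Illustrative sketch of the summer activity rate calculation.
    # Figures are hypothetical; actual inputs are EETs (universities) or
    # student contact hours (colleges, university colleges and institutes).
    may_to_august_activity = 1_500       # hypothetical May-August activity
    september_to_april_activity = 8_500  # hypothetical fall/winter activity
    total_annual_activity = may_to_august_activity + september_to_april_activity

    summer_rate = may_to_august_activity / total_annual_activity
    print(f"{summer_rate:.1%}")  # 15.0%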

Sector          2001/02         2002/03         2003/04         2004/05
                Academic Year   Academic Year   Academic Year   Academic Year
University      15.2%           15.8%           15.9%           15.5%
College         11.5%           13.1%           14.2%           14.2%
System total    13.3%           14.4%           15.0%           14.9%

As the table above shows, the percentage of activity occurring during the May to August period decreased slightly from the previous academic year, but has increased by 1.6 percentage points since the 2001/02 academic year.

Beginning with the 2005/06 fiscal year, this measure is no longer included in the Ministry Service Plan because the Ministry views overall facilities utilization as a more meaningful measure of efficiency than utilization during one particular period. However, because institutions are required to demonstrate a trend toward greater summer facilities utilization before requesting capital expansion, the measure has been retained in institutional service plans issued under the Accountability Framework for British Columbia's Public Post-Secondary Education System. The Ministry will continue to explore methods by which institutions' year-round efficiency in the use of physical capacity can be measured appropriately and accurately.

Quality of Instruction

This measure is the percentage of former public post-secondary students who, when surveyed, rated the quality of instruction in their education program as very good or good. It is based on data obtained from annual student outcomes surveys.

Decisions concerning instructional policies and procedures are made exclusively by institutions. Consequently, beginning with the 2005/06 fiscal year, this measure is no longer included in the Ministry Service Plan, although it has been retained as a measure for institutional service plans issued under the Accountability Framework for British Columbia's Public Post-Secondary Education System.

The latest data show that former students continue to rate the quality of instruction very highly. Because the two sectors use different rating scales, no comparison should be made between them. B.C. College and Institute Student Outcomes Survey respondents were asked to rate their quality of instruction on the following five-point scale: Very Good, Good, Adequate, Poor, Very Poor. College and institute respondents also had the option to respond "Not Applicable," "Don't Know," and "Refused." B.C. University Baccalaureate Graduate Survey respondents, on the other hand, were asked to rate their quality of instruction on the following four-point scale: Very Good, Good, Poor, Very Poor. These respondents also had the option to respond "Don't Know" and "Refused."

Historical Data (1)

Survey Year   Colleges, University Colleges and Institutes (%)   Universities (%)
2000          n/a                                                 95.0
2001          80.3                                                n/a (2)
2002          79.3                                                95.5
2003          81.3                                                n/a (2)
2004          83.5                                                95.8
2005          83.4                                                95.4

(1) The margins of error are less than 1% at the 95% confidence level.
(2) The 2001 and 2003 B.C. University Baccalaureate Graduate Surveys did not ask about quality of instruction.

Appendix 3: Updated Method of Counting Student FTEs for Post-secondary Central Data Warehouse (CDW) Reporting Institutions

For many years, the Ministry used student FTEs as the measurement unit in several performance measures. Beginning in fiscal year 2005/06, an updated method was implemented to calculate FTEs at the 21 institutions that submit student data to the Ministry through the B.C. Post-secondary Central Data Warehouse. 2005/06 marks a transition between the previous and updated methods. The 2005/06 – 2007/08 Service Plan Update showed targets based on the previous counting method, while institutions reported their 2005/06 actual student FTEs using the updated method. The actual results for 2005/06 were therefore recalculated to be consistent with the original targets.

Drivers for Change

During the past decade, the delivery models for post-secondary education have changed dramatically. Online and distance learning programs have experienced significant enrolment increases, and lifelong learning has become a reality. Today's campus also delivers a wider range of educational programs, such as continuing and cooperative education.

As a result of the changes in post-secondary education, the FTE reporting methodology had become outdated, requiring a new, more appropriate reporting method. The Ministry has developed the updated FTE reporting method in consultation with post-secondary institutions.

Updated Reporting Methodology

The fundamental change to the FTE reporting method is a more comprehensive accounting of instructional delivery at colleges, university colleges and institutes, whereby all educational instruction is included in FTE calculations. In the past, some instruction, such as continuing education programs, was excluded from FTE calculations. An example of this change follows:

Previously, the Hospital Unit Clerk program at one institution was considered "base funded" and counted in that institution's FTEs, but the Hospital Unit Clerk programs at three other institutions were not included.

In addition, the new counting method more closely resembles the way FTEs are counted in the university sector. Further changes include new program coding to align with other provincial jurisdictions, measurement of learning units in either credits or hours, and the introduction of divisors based on program length. A divisor identifies the normal full course load of a program, as approved through the institution's educational approval processes and expressed in learning units. An example of this change follows:

Previously, 36 full-time students in Dental Assisting at one university college generated 42 FTEs, while 18 full-time students in entry-level Carpentry at a college generated 16.2 FTEs. Under the updated method, a full-time student in a full-time program for an academic year generates one FTE, so the 36 full-time students in Dental Assisting count as 36 FTEs and the 18 full-time students in entry-level Carpentry count as 18 FTEs.
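As a minimal sketch of the divisor concept, the example below assumes that a student's FTE contribution is the learning units taken divided by the program's approved normal full load; the function name and unit values are hypothetical and do not come from Ministry documentation.

    # Hypothetical sketch of the divisor-based FTE calculation.
    # The divisor is the normal full course load for the program,
    # expressed in learning units (credits or hours) and approved
    # through the institution's educational approval processes.
    def student_fte(units_taken: float, divisor: float) -> float:
        return units_taken / divisor

    # A full-time student taking the full approved load generates one FTE.
    print(student_fte(units_taken=30.0, divisor=30.0))    # 1.0 (credit-based program)
    print(student_fte(units_taken=450.0, divisor=900.0))  # 0.5 (hour-based, half load)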

Comparison of Reporting Methods

To ensure openness and transparency of performance information, the FTE measures presented in this service plan report are reproduced in the following tables to reflect both the previous and updated methods.

The updated FTE methodology also expands the scope of reporting: college sector institutions may now count qualifying continuing education and contract training toward their FTE targets (almost half of the institutions were already reporting their continuing education and contract training). The 2003/04 baseline has been revised to recognize the increased scope, and the Strategic Investment Plan (SIP) growth of 25,000 seats by 2010 has been added to the revised baseline. The calculation of the 2009/10 target under the previous and updated methodologies is shown below.

                      Previous Method   Updated Method
2003/04 baseline:     160,848 FTEs      165,846 FTEs
SIP growth:           +25,000 FTEs      +25,000 FTEs
2009/10 target:       185,848 FTEs      190,846 FTEs

The following tables include FTEs, as calculated under both the previous and updated methodologies, for the following:

  • 2004/05 actual FTEs
  • 2005/06 targets and actual FTEs
  • 2005/06 utilization

Impact of Change in FTE Counting Method:

A. On Total FTE Targets

Fiscal Year            Previous Method   Updated Method
2004/05 Actuals        161,681           166,247
2005/06 Target         168,265           173,263
2005/06 Actuals        165,739           169,243
2005/06 Utilization    98.5%             97.7%

B. On Computer Science, Electrical and Computer Engineering Targets

Fiscal Year            Previous Method   Updated Method
2004/05 Actuals        6,331             6,317
2005/06 Target         7,934             7,907
2005/06 Actuals        6,168             6,129
2005/06 Utilization    77.7%             77.5%

C. On Social Work Targets

Fiscal Year            Previous Method   Updated Method
2004/05 Actuals        1,149             1,001
2005/06 Target         1,050             1,039
2005/06 Actuals        1,076             1,068
2005/06 Utilization    102.4%            102.8%

D. On Nursing and Allied Health Targets

Fiscal Year            Previous Method   Updated Method
2004/05 Actuals        10,526            10,111
2005/06 Target         11,053            10,500
2005/06 Actuals        11,653            10,797
2005/06 Utilization    105.4%            102.8%

E. On Developmental Programs

Fiscal Year            Previous Method   Updated Method
2004/05 Actuals        12,711            12,096
2005/06 Target         13,275            12,793
2005/06 Actuals        13,362            12,511
2005/06 Utilization    100.7%            97.8%

Notes:

  1. 2005/06 was the first year of a revised student FTE reporting method for the 21 institutions that report through the Post-secondary Central Data Warehouse (all institutions except five universities). As a result, FTE figures in this report will be subject to review and amendments will be published if required.
  2. Figures include Entry Level Trades Training FTEs, but not Apprenticeship FTEs.
  3. The updated method will result in a decrease in some program areas, such as Nursing and Computer Science, if students completed their elective courses before enrolling in the program.
  4. Allied Health programs include increased scope resulting from newly identified health programs at some institutions.

Appendix 4: Report on Industry Training Authority Measures

The Industry Training Authority (ITA) now reports to the Ministry of Economic Development and, as of the 2005/06 – 2007/08 Service Plan Update published in September 2005, is no longer included in the performance measures for the Ministry of Advanced Education. However, the Ministry of Advanced Education has committed to reporting on progress on two measures that were included in the 2005/06 – 2007/08 Service Plan published in February 2005.

Number of Trainees in Industry Training

This measure indicates whether the ITA was able to meet a targeted increase in the number of participants in industry training programs. The numbers reflect only registered trainees/apprentices in recognized and accredited industry training programs. Other vocational training in post-secondary institutions has been incorporated into the total student spaces measure.

                                           2004/05 Results   2005/06 Target   2005/06 Result
Number of Trainees in Industry Training    14,676            24,000           26,525

The ITA achieved its target of 24,000 registered trainees/apprentices in recognized and accredited industry training programs. Year-end results show an increase to 26,525 trainees/apprentices, more than 10 per cent above the target.

Student Satisfaction with Education

The Ministry of Advanced Education's 2005/06 – 2007/08 Service Plan (February 2005) identified the need to measure student satisfaction with education. Due to the nature of the ITA's programs, the ITA has interpreted this as student satisfaction with technical training, which is the component of training that occurs within educational institutions. On the latest survey of trainees, 91 per cent of respondents were satisfied or very satisfied with their technical training, up from 83 per cent last year.

The Ministry of Advanced Education and the ITA undertook an apprenticeship pilot survey in early 2005, which included all former apprenticeship students who completed their apprenticeship technical training in a B.C. post-secondary institution between July 2003 and June 2004. Twenty-two institutions participated in this pilot project: 13 public and 9 private.

The survey found that 82 per cent of former apprenticeship students were 'completely' or 'mainly' satisfied with their in-school training.

Appendix 5: Student Outcomes Surveys in British Columbia

The following is a brief discussion of student outcomes surveys in British Columbia, the results of which are used for several performance measures identified in this report (performance measures 9, 11, 12, 14 and 15).

Student outcomes surveys have been undertaken for the university college, college and institute sector since 1988, and for the university sector since 1995. These telephone surveys provide data on various aspects of former students' post-secondary education experience, further education undertaken, and labour market experience (employment outcomes). A sample of former college, university college and institute students is surveyed annually, between nine and 20 months after completion (or near completion) of their education program. A sample of university baccalaureate graduates is surveyed two years and five years after graduation. Starting in 2005, the baccalaureate survey will provide two-year-out data annually. Prior to this, the B.C. University Baccalaureate Graduate Survey provided two-year-out data only every second year (the 2004, 2002 and 2000 surveys focused on graduates two years after graduation, whereas the 2003 and 2001 surveys focused on graduates five years after graduation).

By their nature, all surveys are subject to potential error due to sampling, questionnaire design and response bias. The amount of potential error (i.e., margin of error) in any survey result is estimated based on the level of confidence that the sample result accurately reflects what the true result would have been if the entire target population had been surveyed. For most performance measures that utilize student outcomes survey data, the margin of error is less than one per cent at the 95 per cent confidence level; in other words, the Ministry is 95 per cent confident that the results of the sample survey are less than one per cent different from what the true result would have been if the entire target population had been surveyed. Consequently, the Ministry believes the results of the student outcomes surveys are a reliable basis for performance measurement.
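For illustration, the sketch below computes a margin of error for a surveyed proportion using the standard normal-approximation formula at the 95 per cent confidence level; the proportion and sample size are hypothetical, and this is not necessarily the exact estimation method used for the outcomes surveys.

    # Illustrative margin-of-error calculation for a surveyed proportion
    # (normal approximation, 95 per cent confidence). Inputs are hypothetical.
    import math

    p = 0.95    # observed proportion (e.g., a satisfaction rate)
    n = 20_000  # number of respondents
    margin_of_error = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"{margin_of_error:.2%}")  # about 0.30% for these inputs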

For some of the performance measures that utilize student outcomes survey data, the baselines identified in this report are different from the baselines identified in the 2004/05 – 2006/07 Service Plan. These baseline revisions, which were first identified in the 2005/06 – 2007/08 Service Plan, were made to facilitate the trend line analysis required for determining whether the targets were achieved, and to improve consistency with other measures, many of which have the 2001/02 (fiscal or academic) year as their baseline. The general principle for the revisions was to establish the baselines using the most recent survey data available in the 2001/02 year. For the surveys of former college, university college and institute students, the baselines were revised to reflect results of the 2001 survey (or the 2002 survey, if there was no result from the 2001 survey). For the surveys of university baccalaureate graduates, the baselines were revised to reflect results of the 2000 survey (or the 2002 survey, if there was no result from the 2000 survey).

For most of the performance measures that utilize student outcomes survey data, the 2005/06 target was to "maintain high level of satisfaction or student assessment (benchmark level of 85 or 90 per cent), or demonstrate performance improvement over time." Demonstrated performance improvement over time is based on a trend line calculated from annual performance (plus or minus a margin of error) over the period between the baseline and the most recent year.
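As a rough sketch of how such a trend line could be derived, the example below fits an ordinary least-squares line to the college and institute quality-of-instruction results reported in Appendix 2; the fitting approach is an assumption for illustration and is not necessarily the Ministry's exact procedure.

    # Hypothetical sketch: fit a least-squares trend line to annual results
    # and inspect the slope for improvement over the period.
    import numpy as np

    years = np.array([2001, 2002, 2003, 2004, 2005])
    results = np.array([80.3, 79.3, 81.3, 83.5, 83.4])  # survey results, per cent

    slope, intercept = np.polyfit(years, results, 1)
    print(f"trend: {slope:+.2f} percentage points per year")  # about +1.04 here
    # A positive slope (beyond the margin of error) would indicate
    # demonstrated performance improvement over time.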
