Ministry of Children and Family Development
Annual Service Plan Report 2004/05

Appendix 3: Additional Performance Information

Performance Measure: Satisfaction of individuals with services received
Why did we choose to measure it?
  • Client satisfaction information supports quality improvement efforts and demonstrates value to clients to ensure that programs, products and services are delivered as effectively and efficiently as possible.
How was the target selected?
  • This was a new measure; the baseline and associated target were therefore under development.
What is the multi-year trend?
  • This is a new measure; multi-year trend data is therefore not available.
What are the things to keep in mind when reading the results?
  • The accreditation process assesses client satisfaction with programs, products and services more efficiently. This measure helps to determine how well programs are working from the client's perspective and what changes might be required.
How did we measure it?
  • As the intended survey was replaced with the progressive implementation of accreditation, which provides a more comprehensive measure of service quality, the ministry did not proceed with measurement of satisfaction in this manner.
Where did we get the data?
  • Data was not collected because the originally intended satisfaction survey was not completed.
Performance Measure: Number of adults and families of children with special needs who receive direct or individualized funding (IF/DF)
Why did we choose to measure it?
  • The number of adults and families of children with special needs who are using IF/DF tells the ministry about the usage of this type of financial support.
  • IF/DF puts service providers in the position of having to respond directly and be accountable to the consumers of their services, as well as providing increased options and flexibility to individuals and their families.
  • Funding is provided directly to families of children with special needs to purchase intervention or support services.
  • IF/DF aims to improve outcomes for adults with developmental disabilities and for children and youth with special needs.
  • Individualized funding was a key strategy in the 2004/05 Service Plan and an important element of CLBC's proposed service transformation.
How was the target selected?
  • The new measure in the 2004/05 – 2006/07 Service Plan had a baseline of 3,150 families of children with special needs for 2003/04. The target for 2004/05 was 4,200 individuals (adults with developmental disabilities and families of children with special needs) receiving IF/DF.
  • IF/DF began as a joint initiative with the Interim Authority for Community Living B.C. in 2003/04. The target was selected to determine the uptake of new and existing individualized and direct funding programs in MCFD.
What is the multi-year trend?
  • This is a new measure; multi-year trend data is therefore not available.
How did we measure it?
  • Number of families of children with special needs who receive direct or individualized funding (Autism Funding: Under Age Six, Autism Funding: Ages 6 – 18, At Home Respite and Supported Child Development).
What are the things to keep in mind when reading the results?
  • Although implementation of IF/DF for adults has been put on hold, it is a priority for CLBC.
  • IF/DF is intended to improve outcomes by allowing families and individuals to tailor services to their unique needs.
Performance Measure: Percentage of children (aged 4 – 6) and youth (aged 17 – 19) with special needs that have completed transition plans
Why did we choose to measure it?
  • For children and youth with special needs and their families, transitions at key developmental stages can be challenging. Having a transition plan in place for these individuals will increase their chances for success.
How was the target selected?
  • It was anticipated that the number of individuals that completed transition plans in 2004/05 would be double that of the previous year.
What is the multi-year trend?
  • This is a new measure; the multi-year trend is therefore not available.
How did we measure it?
  • N/A
What are the things to keep in mind when reading the results?
  • N/A
Where did we get the data?
  • N/A
Performance Measure: Percentage of individuals served in family model homes
Why did we choose to measure it?
  • Family model homes are the residential setting of choice for adults with developmental disabilities.
  • Encouraging these placements when appropriate is best practice and one of the keys to ensuring community-based, inclusive and sustainable CLS services in the future.
How was the target selected?
  • The target represented a goal to support more individuals in living in settings that matched their assessed needs with the type of resource.
What is the multi-year trend?
  • This is a new measure; the multi-year trend is therefore not available.
How did we measure it?
  • The measure reflects data received by the regions on a monthly basis. From the data, an overall percentage of occupancy for the province was calculated.
  • The measure for 2004/05 represents a change in methodology from how the baseline (November 2003) was measured.
What are the things to keep in mind when reading the results?
  • This measure will continue to evolve as innovations to service delivery are developed based on best practices.
  • As regions place individuals, when appropriate, into family model homes and semi-independent living arrangements, overall occupancy proportions will eventually reach the target originally established.
Where did we get the data?
  • Resource and Payment Systems (RAPS) and monthly regional reporting.
Performance Measure: Number of new public/private partnerships to raise awareness and commitment to Fetal Alcohol Spectrum Disorder (FASD) prevention
Why did we choose to measure it?
  • This measure was chosen because long-term, sustainable capacity building in the early childhood sector depends on investment from the broader community, beyond government. Public/private partnerships encourage this investment from community and corporate stakeholders.
How was the target selected?
  • Public/private partnerships in the social service sector are a relatively new phenomenon. We expect to see continued growth in the number and extent of such partnerships but must have realistic expectations about the potential for growth in this area.
What is the multi-year trend?
  • This is a new measure; the multi-year trend is therefore not available.
How did we measure it?
  • The count of the new and existing partnerships at the end of the fiscal year.
What are the things to keep in mind when reading the results?
  • See comments under "How was the target selected?"
Where did we get the data?
  • Early Childhood Development Branch — MCFD.
Performance Measure: Percentage (number) of children up to age six on the wait list for supported child development (formerly Supported Child Care)
Why did we choose to measure it?
  • SCD permits children with special needs to participate in regular child care. The intention was to increase access to services and reduce SCD waitlists. The refocused program provides inclusive, community-based child care options for children with special needs, with the goal of increasing social inclusion and school readiness.
How was the target selected?
  • Estimation of additional services that could be provided with additional funding available under the 2003 – 2008 Early Learning and Child Care multi-lateral agreement.
What is the multi-year trend?
  • This is a new measure; the multi-year trend is therefore not available.
How did we measure it?
  • Survey of providers to determine waitlist numbers and services provided.
What are the things to keep in mind when reading the results?
  • Of the 2,922 services provided to children under age six in 2004/05, 539 were provided to Aboriginal children.
  • MCFD tracks the number of services provided to children and children may receive more than one service.
  • The waitlist includes children waiting for new SCD services and those already being served but needing extra staffing supports.
  • Waitlists fluctuate throughout the year, based on timing of agency reporting.
  • Data is reported manually by SCD agencies. Development of a common data management system will standardize reporting, allowing for a profile of all services accessed by individual children and their families.
Where did we get the data?
  • Annual survey and regional reporting.
Performance Measure: Number of community-based initiatives designed to prevent Fetal Alcohol Spectrum Disorder (FASD)
Why did we choose to measure it?
  • This new measure is based on the provincial FASD strategic plan.
  • FASD prevention initiatives are expected to contribute to a reduction in the incidence of FASD, resulting in improved health status and reduced life-long costs that would otherwise be associated with FASD at the community level.
How was the target selected?
  • The target was selected based on the 2004/05 funding for FASD initiatives.
What is the multi-year trend?
  • This is a new measure; no multi-year trend data is therefore available.
How did we measure it?
  • Annual count of the number of FASD prevention-focussed initiatives funded by MCFD.
What are the things to keep in mind when reading the results?
  • Community awareness and prevention activities rely on strong grassroots partnerships among service providers.
  • This measure supports the ongoing development of a stronger knowledge base so community members will learn of a wider range of methods to reduce the number of infants prenatally exposed to alcohol and other drugs, subsequently leading to an increased number of healthy pregnancies.
Where did we get the data?
  • Early Childhood Development Branch — MCFD.
Performance Measure: Number of Aboriginal communities with early childhood development (ECD) initiatives
Why did we choose to measure it?
  • This measure represents progress toward building capacity in Aboriginal communities to support early childhood and family development.
How was the target selected?
  • The target was chosen in order to encourage all five regions to support ECD initiatives.
What is the multi-year trend?
  • The number of Aboriginal communities with early childhood development programs continues to surpass the targets set, highlighting growth and program sustainability over time.
How did we measure it?
  • Yearly count of the number of Aboriginal communities with funded ECD initiatives.
What are the things to keep in mind when reading the results?
  • An Aboriginal community is defined by the particular community itself.
  • An initiative in a single community may have several project components.
Where did we get the data?
  • Early Childhood Development Branch — MCFD.
Performance Measure: Number of out-of-care placements
Why did we choose to measure it?
  • This new measure for 2004/05 was based on the Child and Family Development Service Transformation initiative.
  • An increase in out-of-care placements builds on family strengths and maintains the continuity of family and community relationships, contributing to better outcomes for children, youth and families.
How was the target selected?
  • The target was selected through consultation with the regions, based on practice experience and evidence from implementation in previous years.
What is the multi-year trend?
  • Multi-year trend is not available.
How did we measure it?
  • Counts from MCFD regions.
What are the things to keep in mind when reading the results?
  • Out-of-care placements were introduced in 2002/03. In 2003/04 social workers used out-of-care placements for two populations: children already in care who would be better served through an out-of-care placement; and children who might otherwise have been brought into care. As a result, the children who were in care and who would be better served with an out-of-care placement were identified in 2003/04 and supported in a planned transition to an out-of-care placement. During that same time period, children were not admitted into care where an appropriate out-of-care placement was identified as a viable option.
  • By 2004/05, there were significantly fewer children in care who could or should be moved to an out-of-care placement. Therefore the focus is now on seeking out-of-care placements for children rather than bringing them into care.
  • The data collection methodology was refined and the new baseline reflects the improved methodology.
Where did we get the data?
  • The revised baseline data and the 2004/05 totals are from the ministry's information system.
Performance Measure: Percentage (number) of child welfare interventions that are resolved through alternative dispute resolution processes (ADR)
Why did we choose to measure it?
  • To demonstrate the efficacy of utilizing ADR as a case management tool. ADR processes have been demonstrated to be effective and efficient in resolving family and community issues, leading to better and more timely outcomes for children and families.
How was the target selected?
  • The target was selected based on staff capacity, additional training to be delivered in 2004, and improvements in the reporting and collection of ADR statistics.
What is the multi-year trend?
  • This is a new measure; multi-year trend data is therefore not available.
How did we measure it?
  • The number of completed mediations was added to the number of completed Family Group Conferences to give the total number of ADR events completed in the fiscal year.
What are the things to keep in mind when reading the results?
  • The originally stated measure was "the percentage of child welfare interventions resolved through ADR". This measure was modified because of issues that arose in defining it and developing the data collection methodology. ADR is used to resolve a variety of disputes in child welfare, including disputes over the legal status of the child, placement, and short- and long-term plans for children and families. ADR is also used to resolve issues between MCFD and families, foster parents and others. Given this breadth, it was not possible to accurately identify and count the number of child welfare interventions needed to calculate the original measure.
  • The reported numbers reflect the number of ADR events completed. These numbers do not include cases referred to ADR where the mediation or family group conference has not proceeded, although in some cases the preliminary work of the mediator or family group conference coordinator has assisted in resolving the matter, and the ADR is no longer needed.
  • Not all of the completed ADR events lead to resolution of all of the issues. However, research shows that ADR is an effective tool for resolving disputes. For example, in the review and evaluation of the Surrey Court Project, 83 per cent of cases had all issues resolved, 12 per cent had some issues resolved, and only five per cent had no issues resolved.
Where did we get the data?
  • The source information for the number of mediations completed is reports generated by the Ministry of Attorney General's Dispute Resolution Office. The source information for the number of completed Family Group Conferences is manual counts conducted in each region.
Performance Measure: Number of service delivery sites where collaborative service approaches are in place
Why did we choose to measure it?
  • Integrated service delivery approaches that include alternate community-based programs reduce the fragmentation of child and family services and are more responsive to the needs of the community, promoting better outcomes for children, youth and families.
How was the target selected?
  • There was no baseline information available for this measure. The target was set to ensure that a reasonable minimum number of collaborative service sites were in place before the end of the 2004/05 fiscal year.
What is the multi-year trend?
  • This was a new measure; multi-year trend data is therefore not available.
How did we measure it?
  • Manual counts conducted by regional staff.
What are the things to keep in mind when reading the results?
  • The data includes a range of approaches from a relatively simple co-location of two ministries or agencies to an elaborate collaborative service delivery centre involving numerous organizations.
Where did we get the data?
  • Child and Family Development Division — MCFD.
Performance Measure: Percentage of Aboriginal children in care served by delegated Aboriginal agencies
Why did we choose to measure it?
  • This measure indicates progress toward our goal of having all Aboriginal children in care served by delegated Aboriginal agencies.
How was the target selected?
  • The ministry's plan in 2002/03 was to transfer 1,500 children in care to the care of Aboriginal agencies. Given that there were more than 4,200 Aboriginal children and youth in the ministry's care at the time of the establishment of the target, the target should have been 35 per cent based on the number of agencies in development.
What is the multi-year trend?
  • The trend is moving in the desired direction with almost one third of Aboriginal children in care served by delegated agencies.
How did we measure it?
  • The count was made on a monthly basis from the Management Information System (MIS) and the Social Worker Information System (SWS).
What are the things to keep in mind when reading the results?
  • The development of delegated agencies through the tripartite process has taken longer than anticipated. It is expected that more agencies will be delegated to take on guardianship responsibilities in 2005/06, allowing for the transfer of more children in care.
  • Some delegated agencies have not yet assumed responsibility to deliver a full range of services.
Where did we get the data?
  • SWS/MIS, Child and Family Development Division — MCFD.
Performance Measure: Percentage of Aboriginal children in care of the ministry who are being cared for by Aboriginal families
Why did we choose to measure it?
  • This is a new measure for the 2004/05 – 2006/07 Service Plan, based on the CFD Service Transformation initiative; it had previously been monitored internally.
  • Caring for Aboriginal children in care in Aboriginal families is an important way of providing effective and culturally appropriate supports and services.
  • The ministry aims to increase the number of Aboriginal children served by the ministry who are cared for by Aboriginal families to help retain connections with their communities, extended family and cultural heritage.
How was the target selected?
  • The baseline was revised to 20 per cent for 2004/05 in the 2005/06-2007/08 Service Plan.
  • The target is based on information from a number of sources:
    • from our joint work with the Federation of Aboriginal Foster Parents to attract and refer more Aboriginal foster parents;
    • from our work on the Roots Project, which examined plans of care for Aboriginal children; and
    • from ministry staff who are aware of the benefits of placing Aboriginal children in Aboriginal homes.
What is the multi-year trend?
  • Since 2001, the percentage of Aboriginal children in care who are being cared for by Aboriginal families has generally increased.
How did we measure it?
  • The results are tracked on a monthly basis in the Resource and Payment System (RAPS).
What are the things to keep in mind when reading the results?
  • 50 per cent of Aboriginal children adopted in 2004/05 were adopted by Aboriginal families (69 of 131 adoptions). Approximately 50 per cent of these placements were originally foster placements thereby reducing the number of available Aboriginal foster homes.
  • This measure is dependent on ministry staff completing the information on RAPS and identifying caregivers as Aboriginal.
  • Delegated agencies have not had access to RAPS, but as of September 2005 the ministry hopes to have corrected the inconsistency between ministry and delegated agency electronic systems.
Where did we get the data?
  • RAPS
Performance Measure: Rate of youth in custody based on a proportion of all 12 – 17 year olds (per 10,000)
Why did we choose to measure it?
  • This is a continuing measure from the 2004/05 – 2006/07 Service Plan.
  • This measure gauges how much the youth correctional system relies on incarceration and indicates the effectiveness of community-based alternatives to custody.
How was the target selected?
  • The target was selected because it reflects historical trends in B.C. and provides a comparison to national and other provincial rates.
What is the multi-year trend?
  • The number of youth in the justice system is declining, demonstrating progress toward the goal of minimizing youth involvement in the criminal justice system by providing treatment services and community-based alternatives to custody. There was a steady decrease in youth custody rates that stabilized in 2003/04. A modest increase (three to five per cent) is projected as a result of demographic growth and adjustments in the justice system to the Youth Criminal Justice Act.
How did we measure it?
  • Custody centre daily counts are collected and reported via the Management Analysis and Reporting System (MARS). General population figures for 12 to 17 year old youth are extracted from B.C. Stats.
What are the things to keep in mind when reading the results?
  • B.C. has the lowest youth custody rate in Canada.
  • Although the overall numbers have declined, the needs of youth in custody have become more complex. Diverting youth with low needs and minor offences to community justice programs has left a higher concentration of youth in custody with special needs (e.g., FASD or mental health issues) and more serious offence histories. Although there has been a reduction in system capacity, and consequently in overall staffing, there has been no reduction in per-client staffing and services for the smaller number of youth in custody.
Where did we get the data?
  • MARS and B.C. Stats.
Performance Measure: Number of authorities established
Why did we choose to measure it?
  • This is a continuing measure from the 2004/05 – 2006/07 Service Plan.
How was the target selected?
  • The initial target was set following the Core Services Review as part of the ministry's strategic shifts.
What is the multi-year trend?
  • A permanent authority has not yet been established.
How did we measure it?
  • N/A
What are the things to keep in mind when reading the results?
  • The Boyd Report assessed the readiness for transferring services and resulted in a planned delay to ensure a comprehensive, no-risk approach to the transfer.
  • Significant planning and implementation work is underway to successfully create CLBC through legislation and the transfer of services in summer 2005.
Where did we get the data?
  • N/A
Performance Measure: Reduce the ministry's regulatory burden by 40 per cent
Why did we choose to measure it?
  • Government set targets and timelines for all ministries to reduce their regulatory burdens.
How was the target selected?
  • MCFD's target was a 40 per cent reduction by June 2004.
What is the multi-year trend?
  • Deregulation was a three-year project, from June 2001 to June 2004.
How did we measure it?
  • Consistent with deregulation office guidelines.
What are the things to keep in mind when reading the results?
  • A significant portion of the reduction came from eliminating duplication in Child, Family and Community Service policy.
Where did we get the data?
  • MCFD's deregulation database.
Performance Measure: Ministry rating of Enterprise-wide Risk Management implementation (based on government-endorsed Risk Maturity Index rating)
Why did we choose to measure it?
  • The ministry recognizes risk management as critical to the achievement of its goals and objectives. Enterprise-wide Risk Management is a sound practice for managing risk effectively and for incorporating risk awareness and treatment into the processes used to pursue ministry objectives.
  • It is necessary to have a clear view of the ministry’s approach to risk and be able to benchmark its present organizational maturity using a generally accepted framework.
How was the target selected?
  • The target was based on government's recommended Risk Maturity Index which is an assessment tool designed to measure risk management capability and provide objectives for improvement.
What is the multi-year trend?
  • This is a new measure; multi-year trend data is therefore not available.
How did we measure it?
  • The ministry used the definitions and criteria in the risk maturity index to assess its level of risk maturity.
  • The ministry developed a three-year Enterprise-wide Risk Management (ERM) Plan and set level 2 risk maturity as the target for fiscal year 2004/05. For the next two fiscal years, the target is to achieve level 3 maturity where risk management becomes a routine business process.
What are the things to keep in mind when reading the results?
  • Effective risk management requires a cultural change to ensure that this approach informs all of our decision making.
Where did we get the data?
  • We reviewed business processes in core areas of the ministry against the government's core policy and best practices.
Performance Measure: Number of funded child care facilities
Why did we choose to measure it?
  • The number of facilities receiving this funding is one indicator of government's support for child care service providers.
How was the target selected?
  • Streamlining of the process was expected to lead to a modest increase.
How did we measure it?
  • Count of all the active facilities funded under the Child Care Operating Funding as of March 31, 2005.
What is the multi-year trend?
  • Multi-year trend data is not available because of the amalgamation of funding initiatives in April 2003.
What are the things to keep in mind when reading the results?
  • "Funded" means funded under the Child Care Operating Fund and does not include other funding programs, e.g., capital.
Where did we get the data?
  • Child Care Operating Fund report of April 20, 2005, listing facilities funded as of March 31, 2005.
Performance Measure: Number of licensed child care spaces available for families in B.C.
Why did we choose to measure it?
  • Availability of child care is a key need of many families.
  • Measures the extent to which government supports operators to provide quality licensed child care spaces.
How was the target selected?
  • The target was based on the projected participation in the program.
How did we measure it?
  • N/A
What is the multi-year trend?
  • Multi-year trend data is not available.
What are the things to keep in mind when reading the results?
  • Data on the number of licensed child care spaces is not currently available; the most recent data is from March 2003.
  • Some licensed child care facilities are not eligible for child care operating funding. This includes child-minding services (e.g., at ski hills and shopping malls) and facilities that choose not to participate.
Where did we get the data?
  • N/A
Performance Measure: Per cent of licensed family child care facilities that are funded
Why did we choose to measure it?
  • Monitoring the number of licensed family and funded centre-based providers on an ongoing basis informs future planning.
How was the target selected?
  • The target was based on the projected participation in the program.
How did we measure it?
  • N/A
What is the multi-year trend?
  • Multi-year trend data is not available because of the amalgamation of funding initiatives in April 2003.
What are the things to keep in mind when reading the results?
  • Current data on the number of licensed child care facilities and spaces are not available.
Where did we get the data?
  • N/A
Performance Measure: Per cent of eligible child care centre-based facilities that are funded
Why did we choose to measure it?
  • Monitoring the number of licensed family and funded centre-based providers on an ongoing basis informs future planning.
How was the target selected?
  • The target was based on the projected participation in the program.
How did we measure it?
  • N/A
What is the multi-year trend?
  • Multi-year trend data is not available because of the amalgamation of funding initiatives in April 2003.
What are the things to keep in mind when reading the results?
  • Current data on the number of licensed child care facilities and spaces are not available.
Where did we get the data?
  • N/A
Performance Measure: Number of child care subsidies for children of eligible parents
Why did we choose to measure it?
  • Subsidy utilization measures government support to low-income parents for quality child care and allows the ministry to monitor trends in order to plan for future needs.
How was the target selected?
  • The target reflected expected uptake based on expanded subsidy eligibility criteria.
How did we measure it?
  • The measure is based on claims submitted by providers.
What is the multi-year trend?
  • There has been a downward trend in the numbers of children receiving subsidy despite broadened eligibility criteria.
What are the things to keep in mind when reading the results?
  • The target for 2004/05 was established before the subsidy forecasting model was developed.
Where did we get the data?
  • Data is provided by the Ministry of Human Resources' management information system.
Performance Measure: Implement new child care funding based on allocation of federal funding
Why did we choose to measure it?
  • It was chosen in order to be accountable for the allocation of federal funding as per the multi-lateral agreement.
How was the target selected?
  • The target is discrete.
What is the multi-year trend?
  • New measure.
How did we measure it?
  • Budgetary monitoring that compares allocations and expenditures.
What are the things to keep in mind when reading the results?
  • This measures funding provided in the existing 2003 – 2008 ELCC agreement.
Where did we get the data?
  • Finance and Administration Branch — MCFD.
Performance Measure: Child care subsidy forecasting model developed
Why did we choose to measure it?
  • Trend data is essential for planning.
How was the target selected?
  • N/A
What is the multi-year trend?
  • As the target is discrete, there is no multi-year trend.
How did we measure it?
  • N/A
What are the things to keep in mind when reading the results?
  • The subsidy forecasting model is an internal tool that is likely to continue to evolve as data quality improves.
  • Success of the model depends on the quality of the data.
Where did we get the data?
  • N/A
Performance Measure: Evaluate the following programs: Child Care Operating Funding, Capital Projects, Resource and Referral Centre, and Child Care Subsidy (formerly Parent Subsidy)
Why did we choose to measure it?
  • Ongoing evaluation of programs is important to ensure that programs continue to deliver effective and efficient services.
How was the target selected?
  • The target is discrete.
What is the multi-year trend?
  • N/A
How did we measure it?
  • N/A
What are the things to keep in mind when reading the results?
  • N/A
Where did we get the data?
  • N/A
     