Part 3—Annual performance statements

Results by performance criteria

Performance criterion 1—Number and types of visitor interactions

DPS is the custodian of APH as the home of the Parliament, a working symbol of Australian democracy, and a significant destination for our citizens and international visitors alike. Not only is it an iconic building of national significance, but part of DPS’ purpose is to make the building, and the important activity that takes place within it, accessible to the public. Visitors to APH are encouraged to view proceedings of both the Senate and House of Representatives from the public galleries or via the ParlView website to witness democracy in action. They can also learn about the work of the Parliament through a range of tours available to all visitors.

DPS measures visitor numbers for four types of visitor interactions which reflect the different modes of access to the building and the activities that take place. These are:

  • number of visitors
  • number of virtual visitors
  • number of visitors for DPS school tours, and
  • number of participants in DPS organised tours and events.

Enhancing the Parliament’s engagement with the community is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • enhance our visitor experience and community engagement including the use of social media and emerging technologies, and
  • enhance electronic access to parliamentary information for the community to engage easily with the parliamentary process.

Criterion source

  • Program 1, 2016–17 Portfolio Budget Statement, p14
  • Program 1, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 2: Number and types of visitor interactions
Target—Equivalent or greater to same period last year

Measure | 2014–15 results | 2015–16 results | 2016–17 results
Number of visitors | 759,483 (720,759) [5] | 725,992 | 759,005
Number of virtual visitors | 3,979,949 | 4,706,404 | 5,190,519
Number of visitors for DPS school tours [6] | 132,781 [7] | 127,292 | 127,176
Number of participants in DPS organised tours and events [7] | - | 74,829 | 73,879
Participants in general public tours | 55,893 | - [8] | -
Participants in other tours | 7,384 | - [9] | -
Number of functions and events in Parliament House [10]
Official visits | 37 | - | -
Parliamentary | 331 | - | -
Non-parliamentary | 813 | - | -

Methodology

The number of visitors to APH is the number of people experiencing the building as general and business visitors. It is calculated as the total magnetometer count at the main entrance of APH minus the number of pass holder swipes.
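
The calculation described above can be sketched as follows. This is a minimal illustration only; the function and parameter names are assumptions for the example and do not correspond to any DPS system.

```python
def count_visitors(magnetometer_count: int, pass_holder_swipes: int) -> int:
    """Visitor count per the method adopted in 2015-16: the total
    magnetometer count at the main entrance minus pass holder swipes,
    so that building occupants are not counted as visitors."""
    if pass_holder_swipes > magnetometer_count:
        raise ValueError("pass holder swipes cannot exceed the magnetometer count")
    return magnetometer_count - pass_holder_swipes

# Hypothetical daily figures, for illustration only.
count_visitors(3_250, 1_100)  # 2,150 visitors for the day
```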

The number of virtual visitors is the number of people who visit the ‘Visit Parliament’ webpage. The calculation is based on a Google Analytics report.

The number of visitors for DPS school tours counts all school tour participants, including students and accompanying adults. These participants are counted manually by our Visitor Services Officers.

The number of participants in DPS organised tours and events measures the number of people who actively participate in tours and events organised by DPS. These participants are counted manually by our Visitor Services Officers.

In previous years, reported visitor numbers included pass holders entering APH through the main foyer. In 2015–16 a more accurate method of calculation was adopted that excludes pass holder numbers, ensuring the results reflect visitor numbers to APH rather than occupancy numbers.

Participant numbers for tours and events were reported separately in previous years, and the events data included event bookings taken by an external provider. In 2015–16 DPS applied a single KPI covering only participants in DPS organised tours and events, and this has continued for the 2016–17 year.

Analysis

Number of visitors

Visitor Services Officers actively engage visitors with the work, stories and collections of APH through the tour program, including a suite of nine different tours and the delivery of concierge services. This, combined with eight high-quality exhibitions, saw APH visitor numbers increase in 2016–17 by 33,013, or 4.5 per cent.

Number of virtual visitors

Our virtual visitor numbers continue to grow and this year’s results are a 10 per cent increase on the 2015–16 figures. As part of the new APH website, a new dedicated Visit Parliament area was launched featuring more content for online visitors.

Number of visitors for DPS school tours

School tours of APH are available to all Australian primary and secondary schools. Bookings are managed by the Serjeant-at-Arms’ office in the Department of the House of Representatives. The number of participants in school tours is slightly down from last year, by 116 participants (0.1 per cent); however, this year’s figure is still above the past five years’ average school visitation number of 126,813. We will continue working with the Parliamentary Education Office and external stakeholders, including the National Capital Educational Tourism Project (NCETP), to undertake collaborative marketing activities with a view to increasing school visitation numbers. To boost school visitation numbers, a Civics and Citizenship Education Poster was developed in conjunction with the NCETP and sent to all schools in February 2017. Further to this, the NCETP filmed a promotional video in APH as part of a broader promotional video that will be shown to tour operators as a marketing tool later in 2017.

Figure 3: Annual schools visitation figures

Due to the complexity of this document no alternative description has been provided. Please contact the Department of Parliamentary Services at www.aph.gov.au/dps for an alternative description.

*Annual schools visitation figures include students, teachers, carers and accompanying parents.

Number of participants in DPS organised tours and events

DPS event and tour numbers are slightly below the annual performance target (by 1.3 per cent). The decrease is mainly due to the APH Open Day falling in the 2015–16 financial year, which attracted approximately 5,000 visitors. Work has commenced through the Visitor Experience Improvement Project on public programs and events planned for late 2017 and anniversary events in 2018, which aim to increase event numbers and virtual traffic to the Visit Parliament webpage.

A number of free and paid tours and events were conducted at APH during 2016–17, which were well received by visitors.

Daily tours

APH visitors participated in a number of daily guided tours in 2016–17 including:

  • Welcome Tours—offered five times a day to introduce visitors to the most significant features of APH. The tours include a visit to the Chambers of Parliament on non-sitting days and viewing of the extensive Parliament House Art Collection on show, including in the Great Hall, the Marble Foyer and the Members’ Hall, and
  • Behind the Scenes Tours (Discovery Tours on sitting days)—offered three times a day to give visitors an exclusive chance to visit some of the private spaces of APH. Visitors have the opportunity to stand beneath the Australian flag and to hear of the events that have shaped Australia and the unique building of APH. On sitting days, Discovery Tours are offered instead, and access to the private spaces is not available.

Seasonal and subject-based tours

APH visitors participated in various tours and events held in 2016–17, including:

  • Spring Glory Tours in September and October 2016, which focussed on the hidden courtyards and landscapes of APH. These tours highlighted the courtyards on the Senate and House of Representatives sides of the building. They also featured the springtime foliage of the large and small trees in the courtyards of APH
  • the Enlighten festival saw the new APH Catering and Events team present events on the roof and in the Members and Guests Dining Room, which were extremely popular, and
  • Autumn Colours in the Courtyards Tours in April and May 2017, which highlighted the spectacular changing landscape in the hidden courtyards of APH.

Orientation tours

Orientation tours for building occupants were also conducted throughout 2016–17.

Community events included a Remembrance Day ceremony held for building occupants and visitors with the Last Post and the Rouse played by Corporal Justin Lingard from the Royal Military College Band. The annual ‘Christmas comes to APH’ program, launched by the Presiding Officers, featured free public performances of Christmas carols and the DPS giving tree in the Marble Foyer. Four school choirs participated in the performances, and $778.55 and $828.00 were raised for the charities Bravehearts and the Australian Indigenous Doctors’ Association respectively.

Performance criterion 2—Visitor satisfaction with the APH Experience

DPS aims to offer high-quality food and beverage services and engaging and innovative programs to enhance our visitor experience and community engagement, making APH a destination of choice and a showcase for the best products of our surrounding region. Regular and on-going feedback is essential to understanding visitor satisfaction with the APH experience.

DPS measures visitor satisfaction for four types of visitor interactions which reflect the different modes of access to the building and the activities that take place within it. These are:

  • % of visitor feedback indicating their visit met or exceeded expectations
  • % of virtual visitor feedback indicating their visit met or exceeded expectations
  • % of school visitor feedback indicating their visit met expectations, and
  • % of participants attending DPS tours and events indicating their visit met or exceeded expectations.

Enhancing the Parliament’s engagement with the community is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • enhance our visitor experience and community engagement including the use of social media and emerging technologies, and
  • enhance electronic access to parliamentary information for the community to engage easily with the parliamentary process.

Criterion source

  • Program 1, 2016–17 Portfolio Budget Statement, p14
  • Program 1, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 3: Visitor satisfaction with Australian Parliament House Experience
Target—85% satisfaction

Measure | 2014–15 results | 2015–16 results | 2016–17 results
% of visitor feedback indicating their visit met or exceeded expectations | - | 97.40% | 96.70%
% of virtual visitor feedback indicating their visit met or exceeded expectations | - | 81.00% | 85.71%
% of school visitor feedback indicating their visit met expectations [11] | - | 99.50% | 96.62%
% of participants attending DPS tours and events indicating their visit met or exceeded expectations | - | 99.30% | 97.94%
Visitor satisfaction
Visitor services—tours and information, The Parliament Shop and visitor catering, building access and parking | 84.74% | - [12] | -
Website | 65.49% | - [13] | -

Methodology

Visitors’ satisfaction with their experience at APH is measured through the percentage of visitor feedback indicating the visit met or exceeded expectations. Feedback is collected through visitor comment cards available throughout the building. Visitors rate their overall experience from 1 (poor) to 5 (excellent); a score of 3 or above indicates satisfaction. Visitors can also provide written comments, which are monitored and addressed as part of a separate process.

The percentage of virtual visitors’ feedback indicating their visit met or exceeded expectations measures virtual visitor satisfaction with their interaction with the website. A visitor satisfaction survey on the ‘Visit Parliament’ webpage prompts visitors to answer the question ‘Did you find the information you wanted easily?’ The rating runs from 1 (strongly disagree) to 5 (strongly agree); a score of 3 or above indicates satisfaction with the interaction.

The percentage of school visitors’ feedback indicating their visit met expectations is measured by capturing the satisfaction of school educators with the experience of APH. Comment cards are provided to the teacher to rate the experience from 1 to 5 against two statements; a score of 3 or above indicates satisfaction. The two statements are:

  • the tour engaged students, and
  • the information in the tour will assist students with their learning.

The percentage of participants attending DPS tours and events indicating their visit met or exceeded expectations is measured by capturing the satisfaction of participants attending DPS tours and events. Comment cards are available from the Visitor Services Officers leading the tours and events. Visitors rate their overall experience from 1 (poor) to 5 (excellent); a score of 3 or above indicates satisfaction. Visitors can also provide written comments, which are monitored and addressed as part of a separate process.

In the 2016–17 reporting period, DPS used multiple mechanisms to measure visitor satisfaction including comment cards (general visitor, school tours and tours) and TripAdvisor.
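
The satisfaction measure used across the comment-card mechanisms above can be sketched as follows. This is an illustrative sketch only, assuming ratings are collected as whole numbers from 1 to 5; the function name is not a DPS system.

```python
def satisfaction_rate(ratings: list[int]) -> float:
    """Percentage of comment-card ratings (1 = poor ... 5 = excellent)
    indicating satisfaction, i.e. a score of 3 or above."""
    if not ratings:
        raise ValueError("no ratings collected")
    satisfied = sum(1 for r in ratings if r >= 3)
    return 100 * satisfied / len(ratings)

# Hypothetical ratings, for illustration only.
satisfaction_rate([5, 4, 4, 3, 2])  # 80.0 per cent
```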

Analysis

Visitor feedback

At 96.70 per cent, we achieved a high level of visitor satisfaction in 2016–17 and received excellent feedback from a variety of independent sources, demonstrating the quality and relevance of our programs for visitors. The TripAdvisor report of 1 July 2017 shows that APH maintained its number two ranking among ‘218 things to do in Canberra’ for the sixth consecutive month. In 2016, APH received a TripAdvisor Travellers Choice Award, its third consecutive award from TripAdvisor for superior service, and tourists ranked APH second of ‘218 things to do’ in Canberra.

Virtual visitor feedback

Virtual visitor satisfaction was 85.71 per cent for 2016–17. Virtual visitor feedback continues to be limited, with only 161 virtual visitors providing feedback via the website.

School visitor feedback

During 2016–17, 127,176 school students and teachers were provided with a tour of APH, with school tour satisfaction results of 96.62 per cent demonstrating that these tours are highly valued by participants.

DPS tours and events feedback

DPS tour and event visitor satisfaction achieved a high result for 2016–17 of 97.94 per cent, indicating that these tours and events are a valued experience for participants.

A sample of visitor comments for tours and events held in 2016–17 appears below:

Welcome Tour

  • We did indeed make the 3:30 tour and it was fantastic. May we commend Heather for her great presentation, informative, educational, and humorous! And she wasn’t disturbed by our somewhat rambunctious two year old. Thanks so much, it was definitely a highlight of our trip.
  • Shane our tour guide, impeccable presentation and very informative. Really great person.
  • This is a short note to recognise the very high quality of service provided by all members of staff with whom we came into contact during a visit to Parliament House today. I attended Question Time with my mother, who is immobile following an injury and currently getting around in a wheelchair. We found the staff with whom we had contact to be exceptional in their manner, helpfulness and consideration, and appreciate the opportunity this provides to all visitors to attend parliamentary sittings and experience the House.
  • We had a beautiful tour with Rosie—she was incredibly interesting, helpful, and managed our crazy six year old calmly and respectfully. I believe we got some lovely photos. A wonderful visit, and really understanding assistance from you. I can’t thank you all enough.

School Tour

  • I would like to commend the guide who supported our children during our visit, her name was Monique and she was just wonderful. The way she spoke to our students, encouraged them to share their knowledge, answered their questions and informed them of all the details and information was outstanding. The students came away with a wealth of knowledge and a real sense of pride in what they knew. She also got the message across to the students about the privilege of voting and the responsibility it entails. As a teacher I have visited Parliament on many occasions and Monique is by far the best guide we have had. Could you please pass on to her our thanks and appreciation.
  • Nathan asked me ahead of time what I wanted included and made sure all information was presented. He was very positive with the children and managed them impressively.
  • Gina was very informative, a wealth of information. Excellent overview of HoR and Portraits of former PMs + Senate.
  • Stephen was very engaging and always answered the children’s questions diligently. The information in the tour backed what we had been learning & opened new inquiries.
  • Emma was friendly, engaging and knowledgeable. Best tour guide we have had in 4 years! Thank you.
  • Tour was informative and engaging. Rosie was extremely passionate with her information and delivery.

Behind the Scenes Tour

  • Michael gave us the most magical experience, amazing knowledge and remarkable ability to hold every member of the group while sharing so much information…thank you!
  • Very interesting and insightful. Was impressed by Marie who added stories and personal experiences which made it exciting.
  • Sascha was a wonderful guide, behind the scenes tour. Her knowledge and enthusiasm shone through and was infectious.
  • Lori gave me one of the best tours I have ever received of a government building. That’s high praise as I have been on tours all over the US and Europe.

Spring Glory Tour

  • Tour guide was excellent in breadth of knowledge and information provided about both garden design and the individual plants.
  • A fabulous tour viewing the gardens. Andrew was a terrific guide full of lots of information. An asset to APH.
  • Monique was amazing. So much knowledge! Monique was engaging and interactive with our children and made it relevant to them and easy for them to understand. Well done!
  • Have visited Parliament house several times but this was exceptional. The commentary was extremely interesting and professional. Great!!! Marie was most excellent guide. The gardens are magnificent and good to see another side of our beautiful Parliament House.

Performance criterion 3—Building occupant satisfaction with timeliness and quality of DPS services

DPS is responsible for the delivery of a broad range of services directly and through facilitated arrangements. To continue to improve our services, it is important to gauge building occupant satisfaction with the timeliness and quality of DPS services.

DPS measures building occupant satisfaction with timeliness and quality of DPS services across a number of service categories. These are:

  • food and beverage/catering services
  • retail/sporting services
  • maintenance/cleaning services (including gardens and landscaping)
  • security services
  • parliamentary reporting and recording services
  • ICT services
  • visitor/art services
  • nurses centre services, and
  • loading dock services.

Responding to the changing needs of the Parliament is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended result for this performance criterion is to:

  • implement efficient and effective infrastructure, systems and services to respond to the changing needs of the Parliament and our parliamentarians.

Criterion source

  • Program 1, 2016–17 Portfolio Budget Statement, p14
  • Program 1, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 4: Building occupant satisfaction with timeliness and quality of DPS services
Target—75% satisfaction

Measure | 2014–15 results | 2015–16 results | 2016–17 results
% of building occupant feedback indicating a satisfied or neutral rating with timeliness and quality of DPS services (by DPS service category) | - | 89.40% | 90.17%
Breakdown by service category
Food and beverage/catering services | - | 75.80% | 88.34%
Retail/sporting services | - | 92.70% | 94.85%
Maintenance/cleaning services (including gardens and landscaping) | - | 89.50% | 89.34%
Security services | - | 94.10% | 93.08%
Parliamentary reporting and recording services | - | 95.90% | 97.67%
ICT services | - | 94.40% | 86.05%
Visitor/Art services | - | 83.30% | 94.98%
Nurses centre services | - | - | 80.95%
Loading dock services | - | - | 93.91%
ICT Services (ICT emails) | 97.00% | - | -
Parliamentary Library (Library 2015 client evaluation survey) | 93.00% | - | -
Hansard (ANAO/DPS survey results) | 97.00% | - | -
Broadcasting (ANAO/DPS survey results) | 97.00% | - | -
Security (ANAO/DPS survey results) | 91.00% | - | -
Building maintenance (ANAO/DPS survey results) | 75.00% | - | -
Other services (ANAO/DPS survey results):
Cleaning | 87.00% | - | -
Gardens and Landscaping | 94.00% | - | -
Art Services | 75.00% | - | -
Heritage Management | 69.00% | - | -

Methodology

The satisfaction with timeliness and quality of DPS services is measured annually through a building occupant survey. Building occupants are asked to anonymously rate their level of satisfaction with DPS services and are encouraged to provide any comments or suggestions as to how the services could be improved in the free text fields.

To calculate the response count, any ‘Unsure / N/A’ responses are treated as not having accessed the service and are excluded from the figures. To calculate the satisfaction rate, the ‘Very Satisfied’, ‘Satisfied’ and ‘Neutral’ (neither satisfied nor dissatisfied) responses are totalled and expressed as a percentage of the response count.
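
The two-step calculation above can be sketched as follows. This is an illustrative sketch only, assuming responses arrive as the literal survey labels; the function name is not a DPS system.

```python
def occupant_satisfaction(responses: list[str]) -> float:
    """Satisfaction rate per the survey method: 'Unsure / N/A' responses
    are excluded from the response count, and 'Very Satisfied', 'Satisfied'
    and 'Neutral' responses are expressed as a percentage of that count."""
    counted = [r for r in responses if r != "Unsure / N/A"]
    if not counted:
        raise ValueError("no usable responses for this service category")
    satisfied = sum(r in ("Very Satisfied", "Satisfied", "Neutral") for r in counted)
    return 100 * satisfied / len(counted)

# Hypothetical responses, for illustration only.
occupant_satisfaction(
    ["Very Satisfied", "Neutral", "Dissatisfied", "Unsure / N/A"]
)  # 66.67 per cent (the 'Unsure / N/A' response is excluded)
```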

Analysis

We received a total of 509 responses to the building occupant survey from a distribution list of approximately 5,500 email addresses, which is a return of 9.25 per cent. The majority of respondents, 80.7 per cent, worked for one of the four parliamentary departments.

The target for building occupant satisfaction with timeliness and quality of our services is 75 per cent. The overall rating of building occupant satisfaction for the 2017 survey was above the target at 90.17 per cent. The target was achieved for all individual service categories in the 2017 survey.

Two additional service categories were included in the 2017 survey:

  • Nurses Centre services, and
  • Loading Dock services.

Three service categories, although achieving the target of 75 per cent, recorded a decreased satisfaction rate compared to 2016:

  • Maintenance and Cleaning services (including gardens and landscaping)
  • Security services, and
  • ICT services.

Analysis for each service category is summarised below, including the percentage achieved for each service category.

Food and Beverage / Catering services

Food and Beverage / Catering services achieved an overall satisfaction rating of 88.34 per cent. Due to the change from a contracted service (IHG Catering) to an in-house service (APH Catering and Events) from 1 January 2017, respondents were asked to rate their experiences in 2017 only. Food and Beverage / Catering services were broken down into seven categories:

  • Staff Dining Room
  • Queen’s Terrace Café
  • Coffee cart
  • Catering—Members’ Private Dining Room
  • Catering—Members and Guests Dining Room
  • Catering—Function catering, and
  • Catering—Room delivery.

Respondents were asked two questions in relation to Food and Beverage / Catering Services:

  1. Do you have any comments about the food and beverage / catering services and/or how they could be improved?
  2. Do you have any comments regarding the change to the food and beverage / catering services since January 2017?

Both positive and negative comments were received in relation to the food and beverage / catering services.

Positive comments centred on the:

  • quality of the food
  • general improvement in services, and
  • quality of the coffee.

Negative comments centred on:

  • particular issues or experiences with individual dishes or services
  • negative interactions with staff
  • speed of service, and
  • the price of food.

In addition, there were a number of comments on how to improve the food and beverage / catering services. Common suggestions included:

  • more gluten free and vegetarian/vegan options
  • greater variety in the menu, and
  • reintroducing the sandwich bar.

The 2017 result of 88.34 per cent is an increase of 12.51 percentage points compared to 2016. All sub-categories recorded an increase in satisfaction rates with the largest rises occurring for the Staff Dining Room and the Queen’s Terrace Café.

Retail and Sporting services

Retail and Sporting services achieved an overall satisfaction rating of 94.85 per cent. The services were broken down into six categories:

  • banking
  • physiotherapist
  • hairdresser
  • travel agent
  • The Parliament Shop, and
  • gym and sporting facilities.

The 2017 result of 94.85 per cent is an increase of 2.11 percentage points compared to 2016. Banking, the physiotherapist, the hairdresser, the travel agent and The Parliament Shop all recorded increased satisfaction rates, while the gym and sporting facilities recorded a decreased rate.

Respondents expressed satisfaction with the new timetable in the Health and Recreation Centre (HRC), although there were a number of requests for the HRC to have longer opening hours. Other common themes raised in the feedback were:

  • replacement of outdated equipment in the HRC
  • the limited and expensive range of products in The Parliament Shop, and
  • a greater variety of banking services and ATMs.

Maintenance and Cleaning services (including gardens and landscaping)

Maintenance and Cleaning services achieved an overall satisfaction rating of 89.34 per cent. The services were broken down into five categories:

  • gardens and landscaping
  • cleaning of your Parliament House office/work area
  • cleaning of other Parliament House facilities
  • maintenance of your Parliament House office/work area, and
  • maintenance of Parliament House.

Both positive and negative comments were received in relation to the services. Common themes raised in the feedback were:

  • the excellent condition of the gardens and landscaping, and
  • the need for more cleaning throughout the building.

The 2017 result of 89.34 per cent is a decrease of 0.15 percentage points compared to 2016. Gardens and landscaping, cleaning of your Parliament House office/work area, and maintenance of Parliament House all recorded increased satisfaction rates, while cleaning of other Parliament House facilities and maintenance of your Parliament House office/work area recorded decreased rates.

Security services

Security services achieved an overall satisfaction rating of 93.08 per cent, which is a decrease of 1.02 percentage points compared to 2016. Although quite a few comments reflected an improvement in the attitude and friendliness of security staff, there were still comments about negative attitudes and manner which are reflected in the results.

Parliamentary Recording and Reporting services

Parliamentary Recording and Reporting services achieved an overall satisfaction rating of 97.67 per cent. The services were broken down into three categories:

  • Parliamentary reporting (Hansard) services—timeliness
  • Parliamentary reporting (Hansard) services—accuracy, and
  • Audio-visual (Broadcasting) services (including House monitoring).

Feedback was mostly positive, especially towards both Hansard and Broadcasting staff. The 2017 result of 97.67 per cent is an increase of 1.74 percentage points compared to 2016. All subcategories recorded an increase in satisfaction.

Information and Communication Technology (ICT) services

ICT services achieved an overall satisfaction rating of 86.05 per cent. The services were broken down into four categories:

  • ICT equipment and services
  • 2020 helpdesk—timeliness
  • 2020 helpdesk—adequate resolution, and
  • availability/coverage of WiFi network for work purposes.

The 2017 result of 86.05 per cent is a decrease of 8.35 percentage points compared to 2016. All sub-categories recorded a decrease in satisfaction rate, except for availability/coverage of WiFi network for work purposes, which was a new sub-category.

Feedback was mixed, with many comments praising ICT staff for their knowledge and helpfulness, but many also reporting the opposite, indicating possible issues with consistency of service. Other themes raised in the feedback were:

  • computers and the internet are very slow
  • WiFi is slow and inconsistent
  • ICT equipment is old and outdated, and
  • resolution of issues can be lengthy.

This decrease in satisfaction also reflects the decrease in performance against criterion five—ICT Service Standards are achieved. Performance criterion five is broken down into 15 individual performance criteria, each measuring one of the key services that support the efficient and effective operations of the Parliament, the parliamentary departments and APH. Failure to meet the performance targets for these key services would contribute to decreased satisfaction with ICT services at APH. For further analysis see performance criterion five—ICT Service Standards are achieved.

Visitor and Art services

Visitor and Art services achieved an overall satisfaction rating of 94.98 per cent. The services were broken down into two categories:

  • building tours for staff and constituents, and
  • exhibitions and displays (including artwork in general circulation areas).

Comments were mostly positive, especially in relation to Visitor Service Officers, the Art collection, and exhibitions. Other themes raised in the feedback were:

  • some exhibitions need to be updated, and
  • more staff tours during non-sitting weeks.

Parliament House Art Collection Feedback Card results

The service providing artwork to parliamentarians’ offices achieved an overall satisfaction rating of 96.14 per cent.

Artwork provided in parliamentarians’ offices from the Parliament House Art Collection is no longer captured in the Building Occupant Satisfaction Survey, but rather through feedback cards left at the time the service is provided. The services were broken down into three categories through the questions:

  • How would you rate the overall quality of the service provided by DPS Art Collection and Exhibition staff?
  • How would you rate the timeliness of our services?
  • How would you rate the installation and presentation of the artwork within your suite and Parliament House generally?

The response rate from parliamentarians has increased from 6 responses in 2016 through the Building Occupant Satisfaction Survey to 57 responses with the use of the feedback cards. The new method has proven successful at achieving a greater response rate and providing DPS with improved data.

Nurses Centre services

The Nurses Centre services category achieved an overall satisfaction rating of 80.95 per cent. Two common themes raised in the feedback were:

  • appreciation for the Nurses Centre service and staff, and
  • the Centre should be open every day, not just sitting days.

The Nurses Centre services were not surveyed in 2016 so no comparative data is available.

Loading Dock services

Loading Dock services achieved an overall satisfaction rating of 93.91 per cent. The two common themes raised in the feedback were:

  • appreciation for the helpfulness and friendliness of Loading Dock staff, and
  • customer service could be improved.

The Loading Dock services were not surveyed in 2016 so no comparative data is available.

After rating and commenting on specific services, respondents were asked ‘Do you have any comments about your satisfaction with any other services DPS provides and/or how they could be improved?’ The most common theme was satisfaction with the new parking arrangements.

Respondents were also asked ‘Have you contacted DPS in the past 12 months about any of these services?’ and ‘What was the purpose of your contact(s)?’ The majority of contact was to provide feedback on a service or services, particularly car parking and catering.

There were also 75 comments in response to the final question of the survey, ‘Do you have any other comments about your experience with DPS services?’ Many of the comments were positive in relation to DPS staff and services.

The results of the survey, including both satisfaction ratings and individual comments, were provided to the relevant DPS Assistant Secretary. Where necessary, action plans have been developed in response to the survey results and comments, and all actions associated with the Building Occupant Satisfaction Survey for the current and previous years will be tracked by the Executive Committee over the coming year.

In the 2015–16 Annual Performance Statement DPS commented on the appropriateness of the 75 per cent satisfaction target and indicated we would look to increase the target. Due to the change from contracted service (IHG) to an in-house service (APH Catering and Events), we made the decision to continue with the 75 per cent target for the implementation period. Further analysis of the appropriateness of the target will be undertaken for the 2018–19 reporting period.

Performance criterion 4—Parliamentary Library Service KPIs are achieved

The Parliamentary Library Service metric is an index to capture all of the service standards or key performance indicators for the Parliamentary Library that are approved by the Presiding Officers in the annual Library Resource Agreement.

The office and functions of the Parliamentary Librarian are established by the Parliamentary Service Act 1999 (PS Act) (sections 38A and 38). In the DPS Corporate Plan, the Library’s activities fall under the strategic theme ‘Respond to the changing needs of the Parliament’. The relevant objective for this performance criterion is to:

  • retain the Parliamentary Library’s position as our clients’ preferred and trusted source of high quality information, analysis and advice.

Criterion source

  • Program 1, 2016–17 Portfolio Budget Statement, p15
  • Program 1, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 5: Parliamentary Library Service KPIs are achieved
Target—90%
2014–15 results 2015–16 results 2016–17 results
% of Library Service KPIs set out in the annual Library Resource Agreement that are achieved - 93.30% 90.00%
% of Library Service Standards set out in the annual Library Resource Agreement that are achieved14 89.76% - -

Methodology

Key priorities and performance indicators for the Parliamentary Library are approved each year by the Presiding Officers as part of the Library’s annual Resource Agreement (PS Act, section 38G). The KPIs in each Resource Agreement set out the outcomes and key deliverables for that year and also measure the:

  • percentage of clients using the Library’s services
  • customer satisfaction
  • number of completed client requests
  • number of publications produced
  • number of online uses of the Library’s publications
  • attendance at training courses and events
  • timeliness of research and library services
  • number of items added to the Library’s Electronic Media Monitoring Service (EMMS) and ParlInfo databases
  • number of new titles added to the catalogue
  • percentage of the collection available online, and
  • use of the Library’s collections, databases and media portal.

The Library uses the RefTracker Information Request Management System to manage client requests and other client related work. This provides a rich array of client related data, including number of requests, usage, and timeliness. Satisfaction data is derived primarily from a formal evaluation of the Library’s services conducted once in every Parliament, the most recent being undertaken in 2015. (The evaluation for the 45th Parliament will be conducted in calendar year 2017.) Data regarding the number of publications produced and the number of items added to the EMMS and ParlInfo Search databases is obtained from ParlInfo Search.

Data relating to visits to the Library client portal (intranet) are captured by Sitecore’s engagement analytics. The Parliamentary Library currently uses Google Analytics and the Splunk web-analytics application to analyse statistics for use of publications and collection items. A manual count is used to report on attendance at training courses and events and on new titles added to the Library catalogue. Reports generated from the Integrated Library System provide information on the percentage of titles in the Library’s collection available online in full text. Statistics on the use of the Library’s collections and databases are compiled from Integrated Library System reports, Splunk data and vendor-provided usage statistics.

Analysis

In 2016–17 the Library met 90 per cent of its key deliverables and targets.

Significant initiatives completed in the reporting period included: a program of orientation and training for new senators, members and their staff; a Request for Tender for news services for the Parliament; implementation of a new social media monitoring system; procurement and implementation of a new digital repository for the Library’s collection (and associated program of data remediation); the development of a new framework for the digital delivery of Library services and a digital preservation framework and policy; and completion of the first stage of the online Parliamentary Handbook project.

In regard to service benchmarks, the Library achieved a rating of 93 per cent for client satisfaction against its target of 95 per cent (based on data from the most recent client evaluation, conducted in 2015). It completed 11,681 individual client requests (target 13,000), reflecting in part the cyclical dip in client requests in election years. Hours spent on senators’ and members’ client requests amounted to 42,178, reflecting the increasing complexity of client requests. The Library met its client usage target of 100 per cent (consistent with the previous financial year) and received two complaints (again, consistent with the previous financial year). In regard to client requests, the Library achieved a timeliness outcome of nearly 98 per cent (target 90 per cent). The Library recorded 6.4 million uses of its online publications (target 5.4 million) through ParlInfo Search and the internet, and 1,101 clients attended training courses and events (target 500). It also released 280 research publications (target 260).

In regard to library collections and databases, over 168,000 items were added to the EMMS and ParlInfo databases (target 150,000), and 6,575 new titles were added to the catalogue (target 5,000). Ninety-six per cent of parliamentarian offices used the media portal (target 80 per cent) and 56 per cent the social media monitoring service (target 20 per cent). There were a little over 3.81 million uses of the Library collections and databases, slightly below the target of 4 million; the Library will monitor usage closely over the coming year. The percentage of the Library’s collection available online increased to 42.2 per cent (target 42 per cent). In regard to timeliness, the 100 per cent target was met for urgent new titles added to the Library’s catalogue. However, the Library recorded only 94.4 per cent against its timeliness target of 95 per cent for new titles added to the EMMS and newspaper clippings databases; in both cases, the shortfall was due to technical issues.

Detailed discussion of the Library’s performance is contained in the Parliamentary Librarian’s annual report, which is included in the DPS Annual Report as required by section 65(1)(c) of the PS Act.

Performance criterion 5—ICT Service Standards are achieved

The ICT Service Standard is an index composed of 15 individual performance criteria each of which is measured monthly. Each criterion measures the delivery of key services in support of the effective and efficient operations of the Parliament, Electorate Offices, the parliamentary departments and Parliament House.

Responding to the changing needs of the Parliament is the strategic theme which links this performance criterion to the achievement of our purpose. The intended results for this performance criterion are to:

  • implement efficient and effective infrastructure, systems and services to respond to the changing needs of the Parliament and our parliamentarians, and
  • explore and develop innovative technology and systems for the delivery of timely information and services to parliamentarians.

Criterion source

  • Program 1, 2016–17 Portfolio Budget Statement, p15
  • Program 1, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 6: ICT Service standards are achieved
Target—90%
2014–15 results 2015–16 results 2016–17 results
% of ICT Service Standards outlined in the ICT SLA that are achieved - 91.66% 88.33%
% of calls answered in 30 seconds 93.50%15 - -
IT services-incident resolution 96.10% - -

Methodology

Information Services uses the ServiceNow IT Service Management System to capture and manage client interactions received via telephone, email, self-service and face-to-face contacts. Client Interactions are classified and prioritised appropriately before being assigned to the relevant support group for resolution. Data specifically relating to the management and handling of telephone calls to the 2020 Service Desk is obtained from the Alcatel-Lucent Call Management System.

Availability statistics for key ICT systems and infrastructure are obtained directly from event logging and monitoring software systems. Manual methods are used to calculate the availability of Broadcasting Services, due to the analogue nature of these systems; their availability is determined through a combination of regular scheduled testing, monitoring and incidents raised by clients directly with the 2020 Service Desk.

On 2 July 2017, DPS transitioned to the Whole of Government Secure Internet Gateway. Availability of this service is reported to DPS by the vendor.

Analysis

In 2016–17 we achieved a number of the targets for the performance criteria which form the ICT Service Standards (see Table 7). A combination of factors contributed to some of the performance criteria not being achieved at several points through the year. A number of appropriate corrective actions have been implemented to continue to work towards achieving the performance criteria in the 2017–18 reporting period.

Following the 2016 election we responded to a higher than anticipated demand for services. Additional resources were engaged to assist with the transition to the 45th Parliament and a number of activities including relocation of 56 electorate offices across Australia and 146 Parliamentarian suite relocations within APH.

While a number of performance criteria were impacted throughout the year, we continued to achieve the First Call Resolution target, ensuring a high number of client interactions were resolved at the first point of contact. During periods of high demand our staff remained focussed on resolving incidents and requests directly impacting the Parliament, parliamentarians and their staff. This was achieved by redirecting available resources towards the resolution of high priority requests with a target resolution time of four hours or less. As a result, a tactical decision was taken to accept that lower priority requests would not meet their service level agreement targets.

A number of outages experienced by both third party vendors and ICT Services during the 2016–17 reporting period had a direct impact on the availability of key systems and services, including internet and email within APH and electorate offices. These outages are not typical of the stability of ICT systems within the Parliament: over the 12 month reporting period, ICT Services maintained over 99 per cent uptime of the key systems supporting the chambers and communications systems, including email. The outages also affected the ‘percentage of calls answered in 30 seconds’ performance criterion, as the increased volume of calls relating to them led to a number of calls not being answered within the 30 second time frame.

The implementation of an Identity and Access Management system will replace the current manual User Account Creation process and address a number of issues resulting from manual data entry processes. This will remove the current manual processes that have led to the User Accounts creation target not being met for two of the 12 months in this reporting period.

Table 7: 2016–17 ICT results
All 15 individual criteria are outlined below along with an explanation of their performance for the year.
Monthly target achieved Service Target Jul Aug Sep Oct Nov Dec Jan Feb Mar Apr May Jun
Broadcasting services
This criterion requires that the target availability for the Chamber Sound Reinforcement, Division Bells and Lights, Clocks and “Live to Air” broadcasts is 100 per cent during sitting times.
12/12 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100%
Chamber services
This criterion measures the availability of specific systems used by the chamber departments to ensure the effective and efficient operation of the systems supporting the parliamentary chambers. These systems include the Table Office Production System, Dynamic Red and Live Minutes and the target average availability of these systems is 100 per cent during sitting periods.
10/12 100% 100% 100% 100% 99.62% 100% 100% 100% 100% 100% 100% 99.90% 100%
Customer Engagement Model
A customer request is a ‘request’ for a new solution (application or hardware) that is currently not part of the DPS existing service offering. This criterion requires that 100 per cent of customer requests are responded to within 6 days of lodgement and that 100 per cent of customer requests have undergone a detailed assessment within a timeframe agreed to with the customer.
11/12 100% 100% 100% 95% 100% 100% 100% 100% 100% 100% 100% 100% 100%
Email
This criterion requires that the target average availability for email and mail exchange services provided by DPS is 99 per cent.
11/12 99% 100% 100% 99.69% 99.62% 99% 100% 100% 100% 98.40% 100% 100% 99.98%
File and print
This criterion requires the target availability for file and print services provided to our customers is 99 per cent.
12/12 99% 100% 100% 100% 99.62% 100% 100% 100% 100% 100% 100% 100% 100%
First call resolution
This criterion requires that 70 per cent of calls to the ICT 2020 Service Desk are resolved on first contact.
11/12 70% 65.20% 72.10% 71.32% 71.31% 73.32% 72.62% 73.29% 72.70% 70.10% 73.90% 72.00% 71.70%
Gateway availability
This criterion requires that gateway services are available 24 hours a day, 7 days a week (excluding scheduled downtime). The target availability for gateway services is 99 per cent.
12/12 99% 100% 100% 99.92% 100% 99% 100% 100% 100% 100% 100% 100% 99.98%
Incident resolution
This criterion requires that 95 per cent of all incidents reported to the ICT 2020 Service Desk are resolved within the agreed resolution times.
8/12 95% 97.00% 92.92% 95.90% 94.78% 92.69% 97.69% 96.97% 97.92% 92.73% 98.30% 98.30% 97.50%
Information services
This criterion measures the availability of the EMMS and ParlView services. The target availability is 99 per cent.
12/12 99% 100% 100% 99.69% 100% 100% 100% 100% 100% 100% 100% 100% 99.89%
Internet
This criterion requires that the target average availability of internet services for our customers is 99 per cent.
11/12 99% 100% 100% 99.79% 100% 99% 100% 100% 99% 98.92% 99.81% 100% 99.92%
Mobile Device Management
A Mobile Device Management application is used to provide security to mobile devices. This criterion requires that this service is available 24 hours a day, 7 days a week (excluding scheduled downtime).
12/12 99% 100% 100% 100% 99% 100% 100% 100% 100% 100% 100% 100% 100%
PABX availability
This criterion requires that the target availability of the core telephone system (PABX) is 100 per cent.
12/12 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100%
Percentage of calls answered in 30 seconds
This criterion requires that 90 per cent of calls made to the ICT 2020 Service Desk are answered within 30 seconds.
3/12 90% 87.10% 74.10% 80.10% 74.10% 79.70% 88.80% 88.00% 86.60% 91.70% 91.90% 91.70% 89.50%
Telephone—technical
This criterion requires that each telephone handset meets target availability of 99 per cent.
12/12 99% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100% 100%
User accounts
This criterion requires that 100 per cent of all new user accounts are created within 24 hours.
10/12 100% 100% 99.43% 100% 99.40% 100% 100% 100% 100% 100% 100% 100% 100%

Performance criterion 6—Hansard Service KPIs are achieved

DPS performs a critical function in supporting the work of the Parliament through the Parliamentary Recording and Reporting Branch (PRRB). This performance criterion demonstrates timely reporting of chamber and committee proceedings through the production of Hansard.

Responding to the changing needs of the Parliament is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • implement efficient and effective infrastructure, systems and services to respond to the changing needs of the Parliament and our parliamentarians, and
  • explore and develop innovative technology and systems for the delivery of timely information and services to parliamentarians.

Criterion source

  • Program 1, 2016–17 Portfolio Budget Statement, p15
  • Program 1, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 8: Hansard Service KPIs are achieved
Target
% of individual draft speeches delivered within two hours of speech finishing—85%
% of electronic proof Hansard reports delivered within agreed timeframes—95%
% of committee transcripts delivered within agreed timeframes—95%
2014–15 results 2015–16 results 2016–17 results
% of Individual draft speeches delivered within two hours of speech finishing 86.50% 85.74% 91.74%
% of electronic proof Hansard reports delivered within agreed timeframes 94.40% 92.79% 90.09%
% of committee transcripts delivered within agreed timeframes 97.20% 92.90% 98.78%

Methodology

The Hansard Production System (HPS) is used to obtain the data for the performance criterion results on a monthly basis. The HPS records and reports when Hansard documents are delivered. An individual draft speech (known as a ‘pink’ or ‘green’) is considered to have been delivered on time if the entire speech reaches the office of the senator or member within two hours of the speech finishing. Proof Hansard reports are reported as on time if published in full within three hours of the chamber rising. Committee transcripts are reported as on time if published within the timeframe agreed with the committee.
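The timeliness rules described in the methodology can be sketched as simple checks; the function and parameter names below are hypothetical illustrations, not drawn from the Hansard Production System:

```python
from datetime import datetime, timedelta

# Illustrative sketch of the Hansard timeliness rules. Names are
# hypothetical; only the two-hour and three-hour windows come from the text.
DRAFT_SPEECH_WINDOW = timedelta(hours=2)   # 'pink'/'green' to the member's office
PROOF_HANSARD_WINDOW = timedelta(hours=3)  # proof published after the chamber rises

def draft_on_time(speech_finished: datetime, delivered: datetime) -> bool:
    """On time if the entire speech reaches the senator's or member's
    office within two hours of the speech finishing."""
    return delivered - speech_finished <= DRAFT_SPEECH_WINDOW

def proof_on_time(chamber_rose: datetime, published: datetime) -> bool:
    """On time if the proof Hansard is published in full within
    three hours of the chamber rising."""
    return published - chamber_rose <= PROOF_HANSARD_WINDOW

# Example: a draft delivered 90 minutes after the speech finished is on time.
finish = datetime(2017, 6, 22, 14, 0)
print(draft_on_time(finish, finish + timedelta(minutes=90)))  # True
```

Committee transcripts are omitted from the sketch because their deadline is negotiated per committee rather than fixed.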

Analysis

We exceeded the performance targets for timely delivery of both individual draft speeches (delivered within two hours of the speech finishing) and committee transcripts (delivered within agreed timeframes). We have well-established resourcing and negotiate timeframes with committees, enabling us to consistently meet these targets. We will continue to progress staff training to enable more editors to send draft speeches directly to senators and members, which helps us to achieve our targets.

Hansard did not achieve, but was within five percentage points of, the performance criterion for percentage of electronic Proof Hansard reports delivered within agreed timeframes. When a chamber sits significantly later than its normal sitting hours, the practice is to suspend transcription late in the evening, publish the transcript in part, and complete the remainder of the transcript the following day. We achieved the performance criterion on all sitting days where chambers sat normal hours. There were 11 occasions in 2016–17 where a chamber sat late, which account for all of the instances of this performance criterion not being met (Senate—15 September 2016, 22 November 2016, 29 November 2016, 2 December 2016, 15 February 2017, 23 March 2017, 11 May 2017, 22 June 2017, and 23 June 2017; House of Representatives—1 December 2016 and 23 June 2017).

We remain committed to supporting the work of the Parliament by publishing an accurate account of the proceedings of the Parliament and its committees for parliamentarians, their staff, and the Australian public in a timely manner. We will continue to enhance electronic access to parliamentary information for the community to engage easily with the parliamentary process.

Performance criterion 7—Continuity of Design Integrity

The Continuity of Design Integrity criterion measures the percentage of projects with a material impact on the design integrity of the building where design integrity is maintained or improved. In particular, it measures the number of capital works plan projects with design intent implications that were considered and approved in accordance with DPS policy.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended result for this performance criterion is to:

  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles.

Criterion source

  • Program 2, 2016–17 Portfolio Budget Statement, p16
  • Program 2, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 9: Continuity of Design Integrity
Target—90%
2014–15 results 2015–16 results 2016–17 results
% of projects that have a material impact on design integrity of the building where design integrity is maintained or improved16 - - 90.00%
% of significant building changes (not-like-for-like) resulting in an unchanged or positive impact to the Design Integrity of the Building17 89.98% 89.75% -

Methodology

The Continuity of Design Integrity methodology was reviewed during 2015–16 due to concerns with its appropriateness as a measure of design integrity.

The methodology applied in 2016–17 requires the Design Integrity and Archives Unit, in partnership with the responsible Division, to identify projects in the capital works plan with design intent implications and to assess whether those implications were referred through the Design Integrity and Archives Unit, which facilitates design integrity consultation processes, before the project commenced.

The review is undertaken annually through manual inspection of documentation (i.e. desktop review) and processes.

Further review of this criterion has been under way throughout the 2016–17 year and this is reflected in the 2017–18 Portfolio Budget Statements (PBS) and in the 2017–18 Corporate Plan.

Analysis

The target for Continuity of Design Integrity was achieved in 2016–17, with a score of 90 per cent. The revised performance criterion was implemented, and the methodology provides a more appropriate and accurate rating than in previous years.

As noted in last year’s annual report, the Secretary has established a regular design integrity consultation process with Ms Pamille Berg AO Hon FRAIA and Mr Hal Guida LFRAIA AIA. Ms Berg and Mr Guida are the nominated moral rights representatives of the late Mr Romaldo Giurgola AO (principal architect of Mitchell/Giurgola & Thorp (MGT) Architects responsible for the design of APH); they were also key members of the original MGT design team.

Quarterly design integrity meetings are held to discuss, among other things, projects which may have a material impact on the design intent of APH. Ad hoc meetings are also arranged as required. In addition, we consult with Ms Berg and Mr Guida on design intent matters as they arise, with contact occurring either directly between them and the relevant project officers or as facilitated by the Design Integrity and Archives Unit. Consequently, any potential design difficulties which may arise are addressed at the earliest opportunity.

The design integrity consultation processes are separate and additional to the formal moral rights notification procedures under the Copyright Act 1968. As Mr Giurgola’s nominated moral rights representatives, Ms Berg and Mr Guida are also formally consulted about proposed changes in accordance with that Act’s requirements.

During 2016–17, we undertook numerous capital works projects at APH. Of those, 25 minor and major capitals works projects were identified in consultation with Ms Berg and Mr Guida as potentially impacting on the design intent of APH. These projects are at various stages of development, from very early conceptual phase through to those that are in advanced design and/or construction or have been completed. All of these projects have been the subject of ongoing consultation with Ms Berg and Mr Guida throughout the year in line with agreed consultation processes.

Early in July 2016, work also recommenced on finalising the Central Reference Document. This work is being undertaken by Ms Berg (with assistance from Mr Guida and Mrs Lesley McKay) and will provide a comprehensive record of the Parliament’s intent for APH and the architect’s design response. Work is also well under way on a project to re-establish and future-proof a central register of fabrics, with similar registers for carpets and leathers expected to commence in early 2017–18. These projects will become invaluable resources, assisting staff to increase their knowledge and understanding of important design intent issues.

The 90 per cent result reflects the balance we have struck in addressing design integrity matters in the current security environment, to ensure the security of all who work at and visit APH is of the highest standard. In relation to the current security works, every effort has been made to undertake the works in a way that minimises, to the extent possible, the impact on the design integrity of APH. As the Speaker of the House of Representatives advised the Chamber:

To ensure the moral rights of the late Romaldo Giurgola in the design of Australian Parliament House have been considered, the Department of Parliamentary Services conducted appropriate consultation with the nominated administrator. It is important to acknowledge that these works do have an impact on the original design intent of Parliament House. However, it is also important to acknowledge the world has changed since the original design brief for Parliament was created in the late 1970s. Where possible the measures and comments put forward by the nominated administrator of Mr Giurgola’s moral rights have been noted and implemented carefully balancing risk mitigation requirements.18

Performance criterion 8—Building Condition Rating

DPS measures the Building Condition Rating (BCR) by the percentage of building areas reviewed that are assessed as being in good or better condition.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles, and
  • effectively manage all assets within APH including collections.

Criterion source

  • Program 2, 2016–17 Portfolio Budget Statement, p16
  • Program 2, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 10: Building Condition Rating
Target—80%
2014–15 results 2015–16 results 2016–17 results
% of building areas reviewed that are assessed as being in good or better condition19 - - 80.88%
% of building areas reviewed that are assessed as being in fair or better condition 88.50% 88.45% -

Methodology

As a result of recommendations from an internal audit, the BCR was changed slightly in 2016–17. The changes ensured the rating scale provided a full range of scores from 0 to 100 per cent, amended the rating threshold from ‘fair’ to ‘good’, and lowered the target from 90 per cent to 80 per cent. The combined effect of the change from ‘fair’ to ‘good’ and the lower target is to more effectively articulate the department’s expectations of performance: rather than requiring a large proportion of the building to be in only fair condition, the measure now expects a smaller proportion of the building to be in good or better condition, as befits an iconic building of national significance.

The BCR measures the current condition of the building fabric of APH, expressed as a percentage of the original condition. The BCR is determined by a visual inspection of the building and fabric surfaces for deterioration and damage caused by general wear and tear.

For the purposes of the BCR the building is divided into eight zones, and an inspection of each zone is carried out over the course of the 12 month reporting period using the building condition rating methodology. Each zone has up to 31 elements, each of which is scored from zero (disrepair) to 100 (excellent).

A total percentage score is calculated by dividing the total of the given scores for each zone by the total of the potential optimum scores for each zone.
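The calculation described above can be sketched as follows; the zone names and element scores are illustrative placeholders, not actual inspection data:

```python
# Minimal sketch of the BCR calculation: elements in each zone are scored
# from 0 (disrepair) to 100 (excellent), and the BCR is the total awarded
# score divided by the total potential score, expressed as a percentage.
# Scores below are illustrative only.
zone_scores = {
    "zone_1": [85, 90, 78],   # in practice up to 31 element scores per zone
    "zone_2": [70, 95, 88],
}

def building_condition_rating(zones: dict) -> float:
    """Total awarded score over total potential score, as a percentage."""
    awarded = sum(sum(scores) for scores in zones.values())
    potential = sum(100 * len(scores) for scores in zones.values())
    return 100 * awarded / potential

print(f"{building_condition_rating(zone_scores):.2f}%")  # 84.33%
```

With eight zones of up to 31 elements each, the same ratio yields the reported 80.88 per cent for 2016–17.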

Analysis

At 80.88 per cent, we achieved the new target of 80 per cent for 2016–17. This result is substantially lower (7.57 percentage points) than the previous year, due to the application of the revised methodology which provides a more accurate reflection of the building condition of APH.

In 2016–17 the BCR was assessed over 453 inspections, 147 inspections fewer than the 2015–16 year. The reduction was the result of the implementation of the new methodology for this rating along with staff turnover within the team.

A number of areas of the building condition have seen improvements in 2016–17 which have helped us achieve our target. The following work was undertaken in 2016–17:

  • the refurbishment of 19 suites with new carpet and paint
  • painting of over 66,000 m² throughout APH
  • replacement of 5,100 m² of carpet in suites and general circulation spaces
  • replacement of wall tiles in 23 toilets
  • sealant replacement of 2,500 lineal metres, and
  • conservation, refurbishment, reupholstering or manufacture of a total of 235 global and commissioned furniture items.

Building Fabric Services conduct ongoing maintenance in all areas of APH through assigned work orders and provide a quick response to urgent and unforeseen requests. Building Fabric Services completed 4,463 work orders through 2016–17, which included:

  • lock maintenance
  • carpentry maintenance
  • leather maintenance
  • sign maintenance
  • clear finish maintenance
  • furniture maintenance, and
  • stone/marble maintenance.

While the new target of 80 per cent was met, there are still areas for improvement to the building condition. The following activities are planned for 2017–18 to improve the building condition of APH:

Building Fabric Services

  • ongoing suite refurbishment works, including recarpeting and repainting of approximately 30 suites
  • ongoing wall tiling replacement works to ensuites and toilets
  • sealant replacement throughout APH of approximately 1,500 lineal metres
  • on-going carpet replacement, and
  • bulk maintenance painting of general circulation spaces and suites.

Furniture Manager

A total of 282 global and commissioned furniture items are scheduled to be conserved, refurbished, reupholstered, or manufactured in 2017–18, including:

  • proposed conservation of 13 pieces of commissioned furniture
  • proposed refurbishment of 262 pieces of global furniture, and
  • proposed manufacture of seven pieces of global furniture.

In 2017–18 we will continue with the delivery of accommodation refurbishments, fitouts and relocations and undertake repairs and maintenance of status A and B furniture to improve the building condition of APH.

Performance criterion 9—Landscape Condition Rating

The Landscape Condition Rating (LCR) measures the current condition of the landscape surrounding APH.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles, and
  • effectively manage all assets within APH including collections.

Criterion source

  • Program 2, 2016–17 Portfolio Budget Statement, p16
  • Program 2, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 11: Landscape Condition Rating
Target—85%
2014–15 results 2015–16 results 2016–17 results
% of landscaped areas reviewed that are assessed as being in good or better condition - - 87.67%
% of landscaped areas reviewed that are assessed as being in fair or better condition 82.00% 83.00%20 -

Methodology

As a result of recommendations from an internal audit, the LCR was changed in 2016–17. The internal audit recommended that the 2015–16 LCR calculation methodology be reviewed. The review resulted in a number of changes, including:

  • an expanded rating table
  • a revised method of calculating the score to ensure all elements of the landscape were reflected in the result
  • a change to the rating itself, from ‘fair’ to ‘good’
  • a reduction in the target from 90 per cent in previous years to 85 per cent, and
  • a change to the timing of the annual assessment to March/April rather than October.

Changes to the methodology are intended to ensure that the basis for ratings across types of landscape elements is well understood, that all landscape elements are represented equally in the final result, and that the annual assessment is undertaken closer to the reporting period. The combined effect of the change from ‘fair’ to ‘good’ and the reduction of the target from 90 to 85 per cent more accurately reflects the desired condition of the landscape (i.e. DPS expects a higher performance standard, albeit for a slightly smaller proportion of the landscape).

The LCR is expressed as a percentage and is measured annually. For the purposes of the assessment process, the landscape is divided into 10 zones which include up to 10 separate elements such as lawns, trees and hard surfaces. Each element is manually assessed by a team of five Landscape Services staff. The assessment takes into account variables such as the intended purpose, lifecycle, planned maintenance levels and seasonal variations. The agreed scores are provided against each element for each zone and the total score achieved (across all elements and zones) is expressed as a percentage of the total possible score.
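The scoring described above can be sketched in code. This is an illustrative example only: the zone names, element names, scores and the assumed maximum score per element are hypothetical and do not represent actual assessment data or the exact rating table DPS uses.

```python
# Illustrative sketch of the LCR calculation: the total score achieved
# across all elements and zones, expressed as a percentage of the
# total possible score. All data below is hypothetical.

MAX_SCORE_PER_ELEMENT = 10  # assumed maximum agreed score per element

# Each zone maps landscape elements (lawns, trees, hard surfaces, etc.)
# to the agreed score for that element.
zones = {
    "Zone 1": {"lawns": 9, "trees": 8, "hard surfaces": 7},
    "Zone 2": {"lawns": 8, "trees": 9},
}

def landscape_condition_rating(zones):
    """Total score achieved across all elements and zones as a
    percentage of the total possible score."""
    total = sum(score for elements in zones.values()
                for score in elements.values())
    possible = sum(len(elements) * MAX_SCORE_PER_ELEMENT
                   for elements in zones.values())
    return 100 * total / possible

print(round(landscape_condition_rating(zones), 2))  # 82.0 for this sample data
```

Because every element contributes the same maximum to the denominator, each element is weighted equally in the final percentage, consistent with the intent of the revised methodology.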

The methodology for 2016–17 is designed to give a fair representation of the overall landscape condition.

Analysis

We achieved the target of 85 per cent with a rating of 87.67 per cent. The LCR was assessed on 29 March 2017 through a full site assessment of the condition of the landscape around the Parliamentary precincts.

A number of areas within the landscape have seen improvements in 2016–17 which have helped us achieve our target. The areas of improvement are:

  • the Senate tennis courts were resurfaced between August and September 2016; this was an area of concern in 2015–16
  • the turf condition across the Parliamentary precincts has improved, and
  • planting in sections of the peripheral gardens has improved the appearance of the area.

While we achieved the target for the LCR there are still areas in which we could improve the condition of the landscape. The areas we assessed as requiring further improvement are:

  • the replacement of mature trees that are in poor condition for the location. The replacement will be undertaken in consultation with the moral rights holders and at a suitable time of the year
  • replanting in areas where shrubs have not been growing well. This is an ongoing process across the landscape within the Parliamentary precincts
  • the forecourt paving requires repair to the pond and to the grouting in the pavers. This will be addressed as part of a Forecourt redevelopment project, with works due to start in the second quarter of 2018, and
  • the Members’ Terrace garden is leaking into the Marble Foyer and requires repair to limit potential damage to the building fabric. This has been identified and further consultation is required before further action is taken.

Building occupants also provided satisfaction ratings for the landscape as part of the 2016–17 Building Occupant Satisfaction Survey. The satisfaction rating achieved was 99.32 per cent which is reflective of the high regard in which the DPS Landscape Services team are held. This satisfaction rating, which represents the views of the building users, reinforces the integrity of the LCR which is the professional view of the Landscape Services team.

Performance criterion 10—Engineering Systems Condition Rating

The Engineering Systems Condition Rating (ESCR) measures the current operation and condition of the engineering systems in APH against the expected decline of those systems through their lifecycles.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles
  • ensure a secure environment while maintaining public accessibility, and
  • effectively manage all assets within APH including collections.

Criterion source

  • Program 2, 2016–17 Portfolio Budget Statement, p17
  • Program 2, 2016–17 Corporate Plan, p11

Results against performance criterion

Table 12: Engineering Systems Condition Rating
Target—90%
2014–15 results 2015–16 results 2016–17 results
% of critical engineering systems reviewed that are assessed as being in good or better condition - - 49.56%
% of critical engineering systems reviewed that are assessed as being in fair or better condition 88.68% 88.73% -

Methodology

DPS engaged a third party to undertake a complete review of the methodologies associated with the assessment of the condition of its essential building services assets. DPS recognised the need to provide a more comprehensive measurement of asset condition and to implement a system that supports a rolling review of the condition of assets as a means of ensuring effective maintenance. This approach reflects how operating equipment, with typical lifecycles of between 10 and 25 years, is maintained in a building with an operational life of 200 years.

The new performance criterion system considers four key factors when assessing an asset: physical condition, operating condition, obsolescence and residual life. Ranking is now assessed over nine system categories and 410 sub-system categories providing a substantial increase in the resolution of the systems being examined and a more useful and accurate KPI.

Targets have been reset to use two assessment rankings:

  • good condition: definition—90 per cent reliability and performance, with minimal faults
  • fair or average condition: definition—asset performing up to 75 per cent reliability and performance, with repairable faults.

While each subsystem category was scored against each of the four key factors, residual life was excluded when calculating the total final rating for this performance criterion because it does not reflect the performance of the engineering systems; it will instead be used for capital expenditure planning purposes.

Once a score was allocated to each of the three remaining factors against each subcategory, an overall score for the category was generated against each factor by counting the subcategories assessed as good or better and converting that count to a percentage. The overall result for each system category was then determined by averaging the percentage scores across the three factors, and the total result was calculated by averaging the overall result for each system category. Using this methodology, each of the nine system categories can be assessed individually against the target, as well as providing a total result across all engineering systems.
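The averaging described above can be sketched in code. This is an illustrative example only: the category names, subcategory counts and good-or-better ratings are hypothetical, and each subcategory is reduced to a simple True/False per factor for clarity.

```python
# Illustrative sketch of the ESCR calculation. All data is hypothetical.
# Residual life is excluded; only the three performance factors count.

FACTORS = ("physical", "operating", "obsolescence")

# For each system category, each subcategory records whether it was
# assessed as good or better against each factor.
categories = {
    "Electrical": [
        {"physical": True, "operating": True, "obsolescence": False},
        {"physical": True, "operating": False, "obsolescence": False},
    ],
    "Mechanical": [
        {"physical": True, "operating": True, "obsolescence": True},
        {"physical": False, "operating": True, "obsolescence": True},
    ],
}

def category_result(subcategories):
    """Average, across the three factors, of the percentage of
    subcategories assessed as good or better."""
    pcts = [100 * sum(sub[f] for sub in subcategories) / len(subcategories)
            for f in FACTORS]
    return sum(pcts) / len(pcts)

def total_result(categories):
    """Average of the overall result for each system category."""
    results = [category_result(subs) for subs in categories.values()]
    return sum(results) / len(results)
```

Because the total is an average of per-category results rather than a pooled count, each of the nine system categories contributes equally to the final rating regardless of how many subcategories it contains, and each category can also be reported individually against the target.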

Table 13: New assessment targets
Target 2016–17
Good condition: 70% 49.56%
Fair or average condition: 95% 87.25%

Due to the substantial changes made to the asset ranking criteria and methodology a meaningful comparison cannot be made against the previously reported values.

Analysis

The overall assessment provides a substantially lower value for ‘good’ than the 2016–17 PBS and Corporate Plan targets for this performance criterion. This is because much of the equipment is near the end of its life, whether through wear and tear, obsolescence or the unavailability of spare parts. The majority of capital replacement projects have been committed; however, due to the size and extent of these systems, the replacement program generally runs over a 2–3 year period.

A synopsis of the capital works programs that will impact performance in this area over the next few years is as follows:

Electrical Services

  • replacement of all air circuit breakers in main switchboards, due for completion June 2018
  • electrical distribution boards replacement, due for completion June 2020
  • emergency lighting monitoring, due for completion June 2020
  • light fittings upgrade to low energy luminaires, due for completion June 2020
  • lighting control upgrade, due for completion November 2020, and
  • review of electrical essential power requirements and replacement/upgrade of emergency power generators, due for completion June 2018.

Fire Services

  • emergency warning system upgrade, due to be completed in October 2017
  • sliding fire door replacement, due for completion September 2019, and
  • fire sprinkler services upgrade design contract due for completion February 2018.

Mechanical Services

  • upgrade to trade waste systems, completed May 2017
  • replacement of all mechanical switchboards, due for completion June 2020
  • replacement of refrigeration chillers 4 & 5, due for completion October 2017
  • replacement of all major heating, ventilation and air-conditioning air handling plant, due for completion June 2020, and
  • replacement of the boiler plant, due for completion June 2019.

Lifts

  • progressive replacement of the 41 lifts over four years commencing later in 2017.

As these works progress and their revised rankings are applied to the asset database, there will be observable improvements in the calculated rankings towards their targets.

Footnotes:

5 720,759 is the adjusted 2014–15 visitor number based on the 2015–16 methodology.

6 This was reported as ‘Participants in school tours’ in previous years. This performance measure has been renamed ‘Visitors for DPS school tours’ in the 2016–17 PBS to more accurately reflect the type of visitors and the nature of their visitor experience.

7 Includes public tours, other tours and event numbers.

8 Results now included in DPS organised tours and events.

9 Results now included in DPS organised tours and events.

10 This is not a key performance indicator and is reported on separately at pages 86 and 87 of the Annual Report.

11 Previously reported as ‘% of school/education visitor feedback indicating their visit met or exceeded expectations’. This performance measure has been renamed in the 2016–17 PBS to more accurately reflect the type of visitors and the nature of their visitor experience. Specifically that they are visitors from schools, and DPS does not provide ‘educational’ tours. (The provision of educational services is the role of the Parliamentary Education Office in the Department of the Senate).

12 Previously reported as Visitor services—tours and information, The Parliament Shop and visitor catering, building access and parking.

13 Previously reported as ‘Website’.

14 The data from 2014–15 is for a different KPI—timeliness for research services.

15 Previously reported in 2014–15 as Help Desk calls (answered before 30 seconds).

16 This is a new measure, which has been introduced due to concerns about appropriateness of the previous measure.

17 This measure was discontinued following a review in 2015–16 of its effectiveness.

18 T. Smith, ‘Parliamentary zone’, House of Representatives, Debates 1 December 2016, p5090.

19 The building condition rating target was amended from 90% in 2014–15 and 2015–16 to 80% in 2016–17 based on an internal review process.

20 The landscape target was amended from 90% in 2014–15 and 2015–16 to 85% in 2016–17 based on an internal review process.