In Search Of Operational Excellence in Education

Learning institutions rely on many complex processes working together to meet the needs of stakeholders. And it’s these processes that come under scrutiny when institutions are asked to achieve more with less, be more accountable or flexible, or improve service to those stakeholders.

The recent Ernst & Young Universities of the Future report highlighted the need to streamline large back office operations. How a learning institution manages its many academic processes like course development, curriculum revision and advising, and administrative processes such as enrolling students can make all the difference when facing rising costs, reduced funding, falling enrolments of foreign students, increasing demands for flexible learning, and other pressures.

We’re all in search of that Holy Grail, operational excellence. And yet, the closer a trained eye looks at current processes, the more gaps appear, and the further off the goal appears.

This shouldn’t surprise us. We rarely encounter anyone who really understands how things are supposed to work from start to finish, and even fewer who can tell us what the organisational policy is for the existence and guidance of the process. Such a poor starting point inevitably leads to poorly conceived or implemented ‘enhancements’, often based on information systems that deliver expensive yet disappointing results. The outcome can often be poor service delivery, high cost and low flexibility, exactly the opposite of the ‘excellence’ being sought.

It needn’t be this way; our overwhelming message is one of hope and encouragement. From process-improvement experience across a dozen Australian universities and TAFEs, we’ve seen how common processes can be quickly, significantly and sustainably improved. It just takes the right approach.

Process Improvement Basics

Improving processes begins with mapping the existing processes and quantitatively ‘base-lining’ key operational metrics. Complicated? Not really, if one knows what to ask. Metrics such as cycle time, cost, seasonal impacts, and staff requirements, can be comfortably developed and compared. It is no secret that these are the main metrics that contribute to improving enterprise value.
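As a sketch of what quantitative base-lining can look like, the fragment below computes cycle time, unit cost and first-pass yield for a hypothetical three-step process. Every step name, duration, error rate and the hourly staff cost is invented for illustration only.

```python
# Illustrative baseline of a hypothetical administrative process.
# All step names, durations, error rates and costs are invented.

steps = [
    {"name": "receive form",  "minutes": 5,  "error_rate": 0.02},
    {"name": "check details", "minutes": 15, "error_rate": 0.10},
    {"name": "enter data",    "minutes": 10, "error_rate": 0.05},
]
HOURLY_RATE = 40.0  # assumed fully loaded staff cost, $/hour

cycle_time = sum(s["minutes"] for s in steps)   # minutes per transaction
cost = cycle_time / 60 * HOURLY_RATE            # $ per transaction

first_pass_yield = 1.0                          # share completed with no rework
for s in steps:
    first_pass_yield *= 1 - s["error_rate"]

print(f"cycle time: {cycle_time} min, cost: ${cost:.2f}, "
      f"first-pass yield: {first_pass_yield:.1%}")
```

Even a rough baseline like this gives the team something concrete to compare against after each improvement cycle.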

Next, opportunities for improvement should be critically examined by team leaders, supervisors and administrators who understand how the process (or their part of it) is meant to work.

Operational weaknesses usually arise as one of four types:

  • Process weakness – a poor design of how a series of actions is supposed to occur
  • Policy weakness – a lack of operational guidance that justifies and ‘steers’ the process
  • People weakness – inadequate management of the human input
  • Technology weakness – poor support from computer-based systems within the process.

Interestingly, we find the same weaknesses appear regardless of the type of process – so much so that we call them ‘recurring themes’. They can be illustrated in a simple Process Requirements Model, used to contextualise components and allow measurement and comparison of, for want of a better term, the ‘health’ of each step in the process.

Process Requirements Model

The four themes are shown, broken down further into sub-themes. For example, the Process theme contains the additional dimensions of (a) clear requirements, (b) well-defined inputs and outputs, (c) avoidance of duplication, (d) design to minimise duration and effort, (e) good communications, and (f) control over the speed, quality and cost of the process output.

Each of the other three themes is similarly broken down, and the resulting model provides a practical checklist for analysis and improvement. Let’s briefly examine what this model has revealed in the past.

Policy – “Tell me how you’ll measure me, and I’ll tell you how I’ll behave”

Findings under the Process and Technology themes no longer surprise us, and the People theme is usually under-reported by operational staff, perhaps owing to human nature and the tendency to downplay self-critique.

Most surprising, however, is the recurrence of the Policy theme. Here we find that the guiding ‘statements of intent’ – the business rules and decision-making criteria that make up a typical policy – are often out of date, informally applied or missing altogether.

Organisational policy significantly impacts how processes function. Very often, 10%–30% of the gaps we find are due not to poor processes per se but to a weak policy environment: (1) The Obsolete Policy – the policies no longer apply; (2) The Missing Policy – there are none; (3) The Unknown Policy – they exist, but staff don’t know them.

A few years ago, a large Victorian university was procuring most goods and services via purchase order, even though this cost eight times more than paying by credit card. The solution? Update the policy: change P.O. minimum order amounts and transaction types, then communicate, implement and monitor the updated policy.

Process – “A sequence of activities designed to achieve a specific outcome”

The design of a process plays a key role in optimising quality, cycle time, risk, and cost. We find five common themes characterising badly designed and maintained processes: (1) Inefficiency – the process requires too much effort (and thus cost); (2) Poor standard of input/output – work fed into or received out of a process doesn’t meet requirements; (3) Lengthy process duration – too much time is taken from start to finish (cycle time); (4) Poor process management – the process itself may be fine, but it isn’t running at full capacity; (5) Poor communications – the ‘team isn’t talking to each other on the field’.

A large TAFE Student Administration initiative uncovered several process inefficiencies: laborious matching of student ID records, document duplication when students are enrolled over multiple terms, creation of classes that were never used, maintenance of two sources of course information (a detailed handbook and a summarised web version containing differing data), and downloading of admissions system information into a spreadsheet for manipulation and reporting.

According to a Quality and Compliance Manager, “achieving operational excellence requires constant scrutiny and challenge, for example, Justification – Should this process even exist in the first place? Simplification – Is it as streamlined and uncluttered as possible? Optimisation – Is it designed for maximum speed with minimum effort? Standardisation – Is there a single way it can be completed? Centralisation – Where sensible, can we consolidate activity into a single location?”

Technology – “If you apply technology to a bad process, all you get is a fast, bad process”

There’s nothing quite like the use of inappropriate information technology to kill the effectiveness of a process. Conversely, we also find that not every process needs, or benefits from, high-tech, fully automated computer systems – sometimes the best information systems are simple paper-based checklists, tick sheets and whiteboards.

In broken technology-related processes we generally find two recurring themes: (1) Poor system functionality – existing computerised systems don’t do what’s required; or (2) Poor system integration – each computerised application works fine on its own, but manual interventions are required to make the end-to-end process work.

At a large TAFE, student enrolment involved completing enrolment forms by hand, so the process was prone to error and required extensive data checking, completion and correction. Student numbers were also allocated manually, outside the administration system (which was itself quite capable of generating unique numbers), using a separate database and stickers – increasing work effort, duration and error rate.

The solution? Deployment of an online enrolment form with inbuilt data validation, plus a return to the administration system’s standard functionality for student number generation.
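As an illustration only (the field names and rules below are hypothetical, not the TAFE’s actual form), inbuilt data validation of this kind might look like:

```python
# Hypothetical sketch of inbuilt enrolment-form validation.
# Field names and rules are invented for illustration.
import re
from datetime import date

def validate_enrolment(form: dict) -> list[str]:
    """Return a list of validation errors (empty if the form is clean)."""
    errors = []
    if not form.get("family_name", "").strip():
        errors.append("family name is required")
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", form.get("email", "")):
        errors.append("email address is invalid")
    try:
        dob = date.fromisoformat(form.get("date_of_birth", ""))
        if dob >= date.today():
            errors.append("date of birth must be in the past")
    except ValueError:
        errors.append("date of birth must be YYYY-MM-DD")
    return errors

print(validate_enrolment({"family_name": "Ng", "email": "s.ng@example.edu",
                          "date_of_birth": "2001-03-14"}))  # []
```

Catching errors at the point of entry removes the downstream checking and correction effort entirely.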

People – “the most costly and least managed enabler of any process”

This key process enabler requires active management like no other, and yet low skill levels, mismatched staff allocation and poor seasonality management are allowed to impact service quality, cost and risk. Two people themes are common: (1) Under-skilled labour – people are not sufficiently trained to do what’s required of them; or (2) Insufficient labour – not enough trained staff, either overall or at key times.

A large NSW University’s Student Administration area had an exam results and grading-related process that was required for only four weeks each semester. According to their Faculty General Manager, “such a large ‘volume spike’ required a ‘flying circus’ type operation; the setup, operation and dismantling of the process on a scale that tested the ability to source casual and part time staff in sufficient quantity to handle the peak loads required for very short periods, and without sufficient cross skilling being available. It’s an epically chaotic affair, and everyone hates it”.

The solution? Understand and quantify the additional resource required well beforehand, identify non-critical work tasks, identify back-fillable staff, and formulate plans to redeploy staff from non-critical activities for these high pressure periods.

The Bottom Line

Australian learning institutions are under relentless efficiency and effectiveness pressure.

The good news is that when institutions undertake considered, systematic process-improvement initiatives, they routinely find 50% improvements in service delivery and efficiency savings of at least 20%, sometimes 30% – often accompanied by ‘quick win’ programs to smooth the transition.

A Senior Executive of a large and well respected Queensland University reviewed the success factors in their operational excellence journey. “A good place to start was to establish the highest possible level of commitment for ongoing process change. Next, we established a competent, continuous improvement function, one empowered to boldly challenge any operational status quo. Then, we adopted a collaborative approach; (a) bottom-up, involving operators identifying improvement, building understanding and fuelling commitment for improvement, and (b) top-down, in which managers impart clear, strategic requirements into future processes. We then took the most insightful step of them all; we simply looked at two of our processes—one we believed to be in good shape and another known to be dysfunctional. Then, through some straightforward, systematic analysis, we amazed ourselves at the opportunities for improvement that we discovered in both of them”.

It seems like sound advice for the rest of us on the operational excellence journey.

Deepening Customer Understanding with Clustering

Historically, marketing communications comprised largely generic messaging. Little personalisation was employed, if any; the aim was simply to distribute a message to as many people as possible, as efficiently as possible, and low-cost tools made this easy. This is an example of failing to consider digital transformation properly: simply replacing an analogue process with a low-cost digital one was too hard to resist. This approach is no longer effective. Businesses and consumers have rejected generic messaging and frequently unsubscribe from these communications. Data-driven, highly personalised content experiences are the new standard; failure to adopt them means the market is more likely to disengage from your brand.

The conclusion from this trend is that delightful customer experience (CX) is vital to future success. If we cannot communicate with our market in a simple, straightforward manner, with content relevant to the recipient, we are unlikely to provide a satisfactory customer experience. To achieve this, however, we need a deep understanding of our customers and the journeys they follow.

In this blog we will demonstrate both traditional and contemporary methods for understanding customer segments. Further, we will show the substantial improvements possible when these methods are combined.

Thinking about Customer Segmentation

Customer segmentation is the practice of organising customers into discrete groups with similar characteristics. This is typically done in one of two ways: Personas and Clustering, which are otherwise known as opinion-based and fact-based respectively.

Personas: Opinion-Based

A persona is a conceptual model of a person with a name, characteristics and a specific way of doing things; it is a fictitious or imaginary person. From a marketing point of view, the aim of a persona is to be able to see your products and services from their perspective.

Persona development is a collaborative venture drawing on input from several people in the organisation who “understand” or “know” the customer. This can involve people from marketing, sales and customer service. The thoughts and experiences of the contributors determine the personas chosen; the more diverse their perspectives, the better the outcome.

There are several drawbacks to developing personas:

  • Contributors with many similar perspectives can lead to skew or bias;
  • Subjective views and a shallow understanding will directly impact the quality of the outcome;
  • Changes in customer behaviour can take time to register. It is therefore difficult to maintain an up to date view due to this latency; and
  • The process is not fact based. Opinions and assumptions drive the responses.

Opinion based persona development is a tried and tested approach. Despite the drawbacks it is nonetheless a good starting point. A strong focus on the quality and balance of contributors is key to the value of the outcome.

Clustering: Fact-Based

An alternative method of understanding customers is cluster analysis. This leverages machine learning, a sub-domain of Artificial Intelligence (AI). This method mathematically identifies groups that exist within a data set and are reflective of the customer base. This fact-based approach immediately limits the errors and limitations of the opinion-based, human approach. It can segment customers over many attribute dimensions and create homogeneous groups.
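To make the idea concrete, here is a toy k-means clustering sketch in pure Python over two invented customer attributes (annual spend and visits per month). A real analysis would use a library such as scikit-learn and far richer data; the numbers below are purely illustrative.

```python
# Toy k-means sketch on two made-up customer attributes.
import random

def kmeans(points, k, iters=50, seed=0):
    random.seed(seed)
    centroids = random.sample(points, k)          # initial centroids
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared distance).
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            groups[i].append(p)
        # Move each centroid to the mean of its group.
        centroids = [
            tuple(sum(vals) / len(g) for vals in zip(*g)) if g else centroids[i]
            for i, g in enumerate(groups)
        ]
    return centroids, groups

customers = [(200, 1), (220, 2), (180, 1),    # low spend, infrequent
             (950, 8), (1000, 9), (900, 7)]   # high spend, frequent
centroids, groups = kmeans(customers, k=2)
print(sorted(len(g) for g in groups))  # two groups of three
```

The algorithm recovers the two obvious segments without any human opinion about where the boundary lies, which is precisely its appeal.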

Just as the opinion-based persona approach depends on the reliability of the opinions and assumptions behind it, this approach is only as good as the quality of the data used; it is truly garbage in, garbage out. There are also limitations in the application of machine learning: it is not yet a common approach, and the quality and experience of the person interpreting the results will impact the outcome.


Which Method?

On the face of it, the fact-based method will provide a more reliable outcome. If we wish to avoid the limitations of human interference, a fact-based approach is surely better? However, the opinion-based approach also has merits and is the current go-to approach for creating personas.

Below we contrast the two approaches:

Two is Sometimes Better than One

Despite the strong differences between an opinion-based and a fact-based approach, one does not preclude the other; the approaches are complementary. To accurately realise customer segments, leverage the established insights of opinion-based personas with impartial, fact-based clusters. The approach taken depends on whether a defined set of personas exists:

  • Validation Method: This is applied when personas already exist. These personas are then validated with results from a cluster analysis. The validation method provides an unbiased perspective to confirm persona hypotheses.
  • Generation Method: This is applied when no personas exist. Cluster analysis is used to gain an understanding of customers, which is then augmented using personas. In this way a much improved customer understanding is achieved.
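A minimal sketch of the validation method, using invented persona labels and cluster assignments: cross-tabulating the two shows whether each persona’s members concentrate in a single cluster.

```python
# Cross-tabulate existing persona labels against cluster assignments.
# Labels, cluster IDs and counts are invented for illustration.
from collections import Counter

personas = ["Bargain Hunter", "Loyalist", "Loyalist",
            "Bargain Hunter", "Loyalist", "Bargain Hunter"]
clusters = [0, 1, 1, 0, 1, 0]   # output of a prior cluster analysis

crosstab = Counter(zip(personas, clusters))
for (persona, cluster), n in sorted(crosstab.items()):
    print(f"{persona:15s} cluster {cluster}: {n}")
```

A persona hypothesis is supported when its members land overwhelmingly in one cluster; a persona spread evenly across clusters is a signal that the opinion-based model needs revisiting.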


Optimal experiences are key to customer success. Through a deeper, richer understanding of customers we can deliver more personalised communications and improve brand engagement. This improved understanding of customers is achieved by combining traditional, opinion-based personas with fact-based clusters.

How to Embed Design into Your Organisation

The pace of change and disruption presents an increasing challenge for organisations. In particular, opportunities presented by technology are driving much of this change. Organisations require an agile means of defining and designing solutions that recognise the predominantly digital nature of these transformations.

Design4Digital is BAC Partners’ proprietary methodology for managing organisational design and innovation: the creation of viable new offerings. Based on a Design Thinking approach, this methodology guides the process of understanding opportunities and challenges, defining areas of focus, designing a range of options to address these, and delivering a chosen design.


Discover

The discover phase uncovers the “What Is” situation. It is a research phase to ensure complete understanding; it is situational analysis. The objective of this phase is to understand the scope in terms of opportunities and challenges. The starting point is a challenge brief, which is based on an idea or inspiration that often stems from some type of discovery process.

This phase consists largely of research activity and data gathering. As a human-centred approach, it is important to be empathetic, i.e. to understand and share the feelings of a person in a specific situation – for example, someone interacting with a business in the role of a buyer. This group of activities requires divergent thinking.


Starting from the Challenge Brief, we ensure a shared understanding of the brief, then gather further information to define the scope. Methods such as stakeholder mapping, cognitive mapping and cognitive walkthroughs are used to better understand the situation. Personas and journey maps are important outcomes of this process.

Research (Primary)

Primary research supports the main process of triage. It is used to collect information to better understand service interactions and behaviour, focusing on customers, competitors and other stakeholders. The preferred research method is digital ethnography: ethnographic research conducted in a digital space. The main concerns of a digital ethnographer are identifying relevant subjects and learning about their “language” and culture.

Research (Secondary)

Secondary research also supports the main process of triage. It comprises information collected and synthesised from existing data rather than original material sourced through primary research with participants.


Define

The define phase develops a “What If” situation. It is essentially an analysis and synthesis phase to ensure an opportunity definition. The purpose of this phase is to define a clear problem to be solved in a properly framed context. It is the interpretation and selection of an idea.

Gain Insights

Gain insights supports the main process of identify white space. The objective is to derive insight through analysis and synthesis of the prior research outcomes; the result is a set of key findings in support of identifying white space.

Choose Themes

Choose themes also supports the main process of identify white space. It identifies clear design themes, based on the insights derived, in support of the opportunity brief. These themes feed into the identify white space component.

Identify White Space

The usual definition of white space is the spacing between words in a document or between design objects on a canvas. We use white space to mean addressing the unknown: a process of identifying unmet and unarticulated needs. It is where products and services don’t yet exist, based on the present understanding of values, the definition of the business, or even existing competencies.

How Might it Work (HMW)

This component defines opportunities in terms of how they might work. It is a “To Be” or “What If” scenario that is based on the learning acquired throughout the discover and define phases.


Design

The design phase develops a “What Could Be” situation. It takes an opportunity brief and generates design-led solutions in an iterative process. This phase is known as ideation in design-thinking circles.

Design Options

The design options component takes the opportunity brief output from the define phase and evaluates the design options available. The activities make use of methods such as design workshops, Kano modelling, design charrettes and others to generate design options. This is an innovative, iterative process, hence its divergent nature. No evaluation of options is carried out in this component – all options are valid at this point. The idea is to generate as many ideas as possible.

Determine Architectures

This component is a support process for evaluate choices. The aim is to consider the architectural constructs required to support the design options, whether digital or physical touchpoints. From a digital perspective, this is the conceptual model that defines the structure, behaviour and attributes of a system, described in a way that supports reasoning about the structures and behaviours of the system.

Determine DFV

This component is also a support process for evaluate choices. DFV stands for desirability, feasibility and viability: a set of activities that assesses design options to assist in making choices. The most successful design outcomes lie at the intersection of desirability, feasibility, and viability. Design options are assessed and reassessed to ensure we deliver an appropriate, actionable, and tangible strategy, searching for innovative avenues for growth that are grounded in business viability and market desirability.
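The DFV assessment can be sketched as a simple weighted score. The option names, weights and scores below are purely illustrative and not part of the methodology; in practice the weights would be agreed with stakeholders.

```python
# Hypothetical DFV (desirability, feasibility, viability) scoring of
# design options; all names, weights and scores are invented.
options = {
    "mobile self-service": {"desirability": 8, "feasibility": 6, "viability": 7},
    "kiosk network":       {"desirability": 5, "feasibility": 8, "viability": 4},
}
weights = {"desirability": 0.4, "feasibility": 0.3, "viability": 0.3}

def dfv_score(scores):
    """Weighted sum across the three DFV dimensions."""
    return sum(weights[dim] * val for dim, val in scores.items())

ranked = sorted(options, key=lambda o: dfv_score(options[o]), reverse=True)
print(ranked[0])  # the option nearest the DFV "sweet spot" under these weights
```

A structured score like this makes the trade-offs between options explicit before they reach the evaluate choices step.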

Evaluate Choices

This component is a core process aimed at selecting the most appropriate design option(s) for further consideration. It is not necessarily a single design option being considered; several options may be selected for further evaluation. The determine architecture and determine DFV support processes assist this evaluation step.


Deliver

The deliver phase develops a “What Is Preferred” outcome. It takes a set of design choices and uses prototyping to select the most attractive options. Through this process, candidate options are evaluated and a design outcome selected.

Understand Criteria

This component is a support process for prototyping. The objective is to clearly understand the criteria that the final design must meet. For example, a high-level criterion might stipulate that the interactions involved are mobile in nature, and the design must therefore reflect this; further, that certain throughput levels are required for viability. The criteria are defined and used to guide the prototyping process.

Choose Enablers

This component is a support process for prototyping. The objective is to identify digital and analogue artefacts that meet the design choice(s). For example, if the design requires that a particular digital touchpoint is social in nature, is the enabler required a public or private facility? Facebook could be considered if the criteria stipulate a public facility; otherwise, inter-company messaging for private social interaction.


Prototype

This component is a core process aimed at producing a working prototype of a set of design choices. This is often referred to as ideation, since it is an iterative process aimed at creating fresh ideas to be tried. Prototyping is the tangible representation of artefacts at various levels of resolution. Experience prototyping differs from more passive approaches designed to convey a concept: it specifically promotes active participation between stakeholders, who experience a live interaction with a system or service.

Evaluate and Select

This component is a core process that considers the various prototypes of design choices and evaluates them against the defined criteria and the set of enablers. The evaluation process leads to the selection of a nominee for the chosen design. There are many design methods available in support of this activity. The more thorough the evaluation and testing at this stage, the better the overall service experience the final design will deliver.