The Japanese use the term “Kaizen” to describe the ongoing process of continuous improvement.
While the concepts of continuous improvement and total quality management require a complete overhaul of management philosophy and organizational culture, managers can rely on several specific tools and techniques for improving quality. Some of the approaches and methodologies for quality improvement are discussed as follows:
Benchmarking is the continuous process of comparing a company’s strategies, products, and processes with those of similar organizations that are the best in their class, in order to learn how they achieved excellence, and then changing one’s own strategies, products, and processes first to match them and then to surpass them.
A benchmark demonstrates the degree to which the customers of other similar organizations are satisfied. It identifies an organization whose operations are so superior that it enjoys the highest degree of customer satisfaction. The goal is to beat such an organization in performance.
The benchmarking process usually involves the following steps.
1. Identify a critical area in your own organization that needs improvement.
2. Identify some other organization which excels in quality in that area.
3. That organization then becomes your benchmark for that area of improvement. Study the organization carefully, especially its benchmark activity.
4. Analyze the data so gathered from the benchmark organization and compare it with your own activity.
5. Improve the critical area at your own organization.
Selecting an industry leader provides an insight into what successful competitors are doing. It also allows organizations to set realistic and rigorous new performance targets based on what they learn from industry leaders.
Benchmarking also provides a vehicle whereby products and services are redesigned to achieve outcomes that meet or exceed customer expectations. Since the employees take pride in their organization being the “best”, it further motivates them to do their best to continue to be the best.
Outsourcing is the process of subcontracting operations and services to other firms that specialize in them and can perform them better. Since an organization is involved in a number of operational and administrative functions, it is quite possible that some of these functions are not performed optimally because of a lack of resources or expertise. If such inefficient areas can be identified and outsourced, the organization can realize higher-quality services or operations.
Speed refers to the time the organization needs to get something accomplished without sacrificing quality. An organization that produces faster, distributes faster, and adapts to new ways of doing things faster will stay ahead of the competition. One survey identified speed as the number-one strategic issue confronting managers in the 1990s.
Quality function deployment (QFD) defines the relationship between the customers’ desires and the products supplied. Defining this relationship clearly is the first step in building a world-class production system. The products and processes can then be built with the features the customers desire.
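As an illustration, this relationship can be sketched as a small weighted matrix in Python. All desires, features, weights, and the 0/1/3/9 relationship scale below are hypothetical examples, not taken from the text:

```python
# Hypothetical "house of quality" fragment for a food-packaging product.
# Customer desires carry importance weights; each candidate product
# feature relates to each desire on a 0/1/3/9 strength scale.
desires = {"easy to open": 5, "keeps food fresh": 4, "low cost": 3}

relationships = {
    "resealable zip":   {"easy to open": 9, "keeps food fresh": 3, "low cost": 1},
    "thicker film":     {"easy to open": 0, "keeps food fresh": 9, "low cost": 0},
    "cheaper laminate": {"easy to open": 0, "keeps food fresh": 1, "low cost": 9},
}

# A feature's priority is the weighted sum of its relationships, so the
# features to build first are those most strongly tied to what customers want.
scores = {
    feature: sum(desires[d] * rel.get(d, 0) for d in desires)
    for feature, rel in relationships.items()
}
for feature, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{feature:<16} {score}")
```

Here the resealable zip scores highest because it serves the most heavily weighted customer desire, which is exactly the kind of conclusion the deployment matrix is meant to make visible.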
Named after the Japanese engineer Genichi Taguchi, this approach is built around three concepts: quality robustness, the quality loss function, and target-oriented quality.
“Quality robust” products retain their quality even under adverse manufacturing and environmental conditions. Taguchi’s idea is to remove the “effects” of adverse conditions instead of removing the causes. Removing causes can be very costly and time consuming, so it may be cheaper and faster to remove their effects.
The “quality loss function” (QLF) identifies all costs connected with poor quality including the costs of customer dissatisfaction, warranties and services, scraps, waste and repairs and possibly some social costs. The quality loss function is defined as:
L = D²C, where
L = loss
D = deviation from the target value
C = cost of avoiding the deviation
The squared term D² shows that the farther the product is from the target value, the more severe the loss.
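With hypothetical values for D and C, the loss function can be computed directly; note how doubling the deviation quadruples the loss:

```python
def quality_loss(deviation, cost):
    """Taguchi quality loss L = D²C: loss grows with the square of the
    deviation D from the target value, scaled by the cost C of avoiding
    one unit of deviation."""
    return deviation ** 2 * cost

# Hypothetical figures: a unit of deviation costs 50 to avoid.
print(quality_loss(1.0, 50.0))  # 50.0
print(quality_loss(2.0, 50.0))  # 200.0
```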
“Target-oriented quality” is a philosophy of continuous improvement to bring the product up to the most realistic but high quality target.
A flow diagram is a visual representation of a system or process. It lets one see the flow of steps in a process from beginning to end and serves as a kind of road map for locating and solving problems to improve quality. For example, a simple flow chart could map each step in producing a new item from start to finish.
Vilfredo Pareto, a nineteenth-century economist, suggested that 80 percent of problems are the result of only 20 percent of causes. Pareto analysis organizes errors, problems, or defects so that attention can be focused on the most important problem areas. The idea is to classify problems according to their importance so that the most important ones are resolved first. The 80-20 rule, as stated above, suggests that by removing 20 percent of the causes, 80 percent of the errors can be removed. For example, if 80 percent of machine breakdowns come from 20 percent of the machines, then attention should be focused on those 20 percent of machines.
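A minimal Python sketch of such an analysis, using an entirely hypothetical breakdown log, sorts causes by frequency and accumulates their share of all errors:

```python
from collections import Counter

# Hypothetical log: number of machine breakdowns attributed to each cause.
defects = Counter({
    "misaligned feed": 80,
    "worn bearing": 45,
    "operator error": 12,
    "power fluctuation": 8,
    "bad raw material": 5,
})

total = sum(defects.values())
cumulative = 0
for cause, count in defects.most_common():
    cumulative += count
    print(f"{cause:<18} {cumulative / total:6.1%} of all breakdowns so far")
```

In this made-up data, the top two of five causes account for more than 80 percent of the breakdowns, so Pareto analysis says to tackle those two first.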
The cause-and-effect diagrams offer a structured approach to problem solving. These are also known as “fishbone diagrams” because of their shape. The diagrams help organize problem-solving efforts by providing several layers of categories that may be factors in causing problems. The four major such categories are methods, manpower, materials, and machines. Each category can then provide more information about specific causes of problems in that category. A simple fishbone diagram may be as follows.
Let us take an example of the problem of a dissatisfied airline passenger to illustrate this technique. Each bone in the fishbone structure represents a possible source of error.
When such a chart is properly built and in detail, then the possible quality problems are highlighted and proper steps can be taken to solve these problems to the customer’s satisfaction.
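The chart’s layers of categories can be sketched as a simple nested structure. The specific causes below are hypothetical illustrations for the airline example, not taken from the text:

```python
# Hypothetical causes for the dissatisfied-passenger example, grouped
# under the four major fishbone categories named above.
fishbone = {
    "Methods": ["overbooking policy", "slow check-in procedure"],
    "Manpower": ["undertrained gate staff", "short-handed cabin crew"],
    "Materials": ["stale in-flight meals", "worn seat cushions"],
    "Machines": ["delayed baggage belt", "faulty reading lights"],
}

problem = "dissatisfied airline passenger"
print(f"Effect: {problem}")
for category, causes in fishbone.items():
    print(f"  {category}:")        # one major "bone" of the fish
    for cause in causes:
        print(f"    - {cause}")    # specific causes along that bone
```

Each top-level key is one major bone; each listed cause is a smaller bone branching off it, which is all the diagram itself encodes.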
Statistical process control is primarily concerned with managing quality rather than improving it. It consists of a set of statistical techniques used to monitor quality. It relies on control charts, which are graphic presentations of data over time with upper and lower limits of acceptable quality.
It monitors standards, makes measurements and takes corrective action as a product or service is being produced. Samples of process outputs are taken and analyzed and if they are within the acceptable limits, the process is considered to be under control.
The variation of the quality measurements within the acceptable limits must be random. If, however, the values move in a pattern in one direction or the other, or fall outside the control limits, the process is not considered to be under control, and corrective action is taken to bring it back under control.
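A minimal sketch of such a control check, assuming hypothetical baseline measurements and the conventional three-sigma control limits (the filling-process figures below are invented for illustration):

```python
import statistics

# Hypothetical baseline: fill volumes (ml) recorded while the process
# was known to be in control.
baseline = [500.1, 499.9, 500.2, 499.8, 500.0, 500.1, 499.9, 500.0]

center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
upper, lower = center + 3 * sigma, center - 3 * sigma  # control limits

# New samples from the running process, checked against the limits.
new_samples = [500.1, 499.8, 500.2, 501.5]
for x in new_samples:
    status = "in control" if lower <= x <= upper else "OUT OF CONTROL"
    print(f"{x:7.1f}  {status}")
```

The last sample falls outside the upper control limit, which is the signal to stop and take corrective action; a full implementation would also test for non-random runs within the limits.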