MBA Knowledge Base

Business • Management • Technology

Case Study: Quality Management System at Coca Cola Company

Coca-Cola’s history can be traced back to Asa Candler, who bought the formula for the drink from a pharmacist named John Stith Pemberton. Two years later, Asa founded his business and started producing soft drinks based on the formula he had bought. From then, the company grew to become the biggest producer of soft drinks, with more than five hundred brands sold and consumed in more than two hundred nations worldwide.

Although the company is said to be the biggest bottler of soft drinks, it actually bottles very little itself. Instead, the Coca-Cola Company manufactures a syrup concentrate, which is bought by bottlers all over the world. This distribution system ensures the soft drink is bottled by these smaller firms according to the company’s standards and guidelines. Although this franchised method of distribution is the primary one, the parent company also operates a key bottler in North America, Coca-Cola Refreshments.

In addition to its flagship soft drinks, the company also produces diet soft drinks: variations of the original drinks with reduced sugar content and improved nutritional value. Saccharin replaced sugar in 1963 so that the drinks could appeal to health-conscious consumers. A major cause for concern was inter-product competition, which saw sales of some products dwindle in favor of others.

Coca-Cola started diversifying its products during the Second World War, when ‘Fanta’ was introduced. The head of Coca-Cola’s operation in Nazi Germany needed a new soft drink for the market: with the war ongoing, promoting an American brand in Germany was not acceptable, so he launched the drink under a new name, and ‘Fanta’ was born. The creation was successful, and production continued even after the war. ‘Sprite’ followed soon after.

In the 1990s, health concerns among consumers of soft drinks forced their manufacturers to consider altering the energy content of these products. ‘Minute Maid’ juices, ‘PowerAde’ sports drinks, and a few flavored tea variants were Coca-Cola’s initial responses to this new interest. Although most of these new products were well received, some did not perform as well. One example was the mid-calorie cola dubbed Coca-Cola C2.

The Coca-Cola Company has been successful for more than a century. This can be attributed partly to the nature of its products, since soft drinks will always appeal to people. In addition, Coca-Cola has one of the best commercial and public relations programs in the world; the company’s products appear in adverts in virtually every corner of the globe. This success has led to its support for a wide range of sporting activities: soccer, baseball, ice hockey, athletics and basketball are some of the sports in which Coca-Cola is involved.

The Quality Management System at Coca Cola

It is very important that every product Coca-Cola produces meets a high quality standard, so that each product is exactly the same. This matters because the company wants to meet customer requirements and expectations, and with the brand having such a global presence, it is vital that these checks are consistently applied. A standardized bottle of Coca-Cola has elements that need to be checked on the production line to make sure a high quality standard is being met; the most common checks cover ingredients, packaging and distribution. Much of the testing takes place during the production process, as machines and a small team of employees monitor progress. Checking quality is the responsibility of all of Coca-Cola’s staff, from hygiene operators to product and packaging inspectors. These constant checks require staff to be on the lookout for problems and to take responsibility for them, ensuring that quality is maintained.

Coca-Cola uses inspection throughout its production process, especially in the testing of the Coca-Cola formula, to ensure that each product meets specific requirements. Inspection normally refers to the sampling of a product after production so that corrective action can be taken to maintain quality. Coca-Cola has incorporated this method into its organisational structure because it helps eliminate mistakes and maintain high quality standards, reducing the chance of a product recall. It is also easy to implement and cost effective.

Coca-Cola uses both Quality Control (QC) and Quality Assurance (QA) throughout its production process. QC focuses mainly on the production line itself, whereas QA covers the entire operations process and related functions, addressing potential problems quickly. In both, state-of-the-art computers check all aspects of production, maintaining consistency and quality by verifying the consistency of the formula, the creation (blowing) of the bottle, the fill level of each bottle and its labeling; this increases the speed of both production and quality checks, ensuring that product demand is met. QC and QA help reduce the risk of defective products reaching a customer, because problems are found and resolved within the production process: bottles considered defective, for example, are placed in a waiting area for inspection. QA also covers the quality of goods supplied to Coca-Cola, such as sugar from Tate & Lyle; the company reports that it has never had a problem with its suppliers. QA further involves staff training, ensuring that employees understand how to operate machinery. Coca-Cola ensures that all members of staff receive training prior to their employment so that they can operate machinery efficiently, and machinery is kept under constant maintenance by highly skilled engineers, helping Coca-Cola maintain high output.

Every bottle is also checked for the correct fill level and the correct label by a computer through which every bottle passes during the production process. Any faulty products are taken off the main production line. Should the quality control measures find any errors, the line is stopped and output is traced back to the last good check that was made. The bottling plant also tracks the utilization level of each production line using a scorecard system, which shows the percentage of the line being utilized and allows managers to increase a line’s production levels if necessary.
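The fill-level and label check described here is essentially a tolerance test applied to every bottle. A minimal Python sketch of the idea; the 500 ml target, the tolerance, and the data layout are illustrative assumptions, not Coca-Cola’s actual parameters:

```python
# Hypothetical in-line fill-level and label check.
# Target, tolerance and data layout are illustrative assumptions.

TARGET_FILL_ML = 500.0
FILL_TOLERANCE_ML = 5.0  # assumed acceptable deviation from target

def passes_inspection(fill_ml, label_ok):
    """A bottle passes if its fill level is within tolerance and its label is correct."""
    return abs(fill_ml - TARGET_FILL_ML) <= FILL_TOLERANCE_ML and label_ok

def inspect_line(bottles):
    """Split a run of (fill_ml, label_ok) readings into accepted bottles and rejects."""
    accepted, rejects = [], []
    for bottle in bottles:
        (accepted if passes_inspection(*bottle) else rejects).append(bottle)
    return accepted, rejects

batch = [(499.2, True), (493.0, True), (501.7, False), (500.4, True)]
accepted, rejects = inspect_line(batch)
# Rejected bottles would be diverted to a waiting area for manual inspection.
```

The rejects here correspond to the bottles placed in a waiting area for inspection; a utilization scorecard could similarly be derived from the ratio of bottles processed to line capacity.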

Coca-Cola also uses Total Quality Management (TQM), which involves the management of quality at every level of the organisation, including suppliers, production and customers. This allows Coca-Cola to retain (or regain) competitiveness and achieve increased customer satisfaction, and the company uses the method to continuously improve the quality of its products. Teamwork is very important: Coca-Cola ensures that every member of staff is involved in the production process and understands their role, which improves morale and motivation and thereby productivity. TQM practices can also increase customer involvement, as many organisations, including Coca-Cola, relish the opportunity to receive feedback and information from their consumers. Overall, reducing waste and costs gives Coca-Cola a competitive advantage.

The Production Process

Before production starts on the line, cleaning tasks are performed to rinse internal pipelines, machines and equipment. This is often done during a switch-over of lines, for example from Coke to Diet Coke, so that residue from one product does not affect the taste of the next. The check serves both hygiene and product quality. Once these checks are complete, the production process can begin.

Coca-Cola uses a database system called Questar, which enables checks on the line. For example, all materials are coded, and each line is issued with a bill of materials before the process starts, ensuring that the correct materials are put on the line. This check is designed to eliminate problems on the production line and is audited regularly; without it, product quality could not be assessed at this level. Other quality checks on the line cover packaging and carbonation, which are monitored by an operator who records the values to ensure they meet standards.
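A bill-of-materials check of the kind described above amounts to comparing the coded materials staged on a line against the bill issued for that run. A sketch under that assumption; the material codes are invented for illustration:

```python
# Hypothetical bill-of-materials check before a line start.
# Material codes are invented for illustration.

def bom_discrepancies(bill_of_materials, staged_materials):
    """Return (missing, unexpected) material codes for a production line."""
    required = set(bill_of_materials)
    staged = set(staged_materials)
    missing = sorted(required - staged)      # required by the bill but not on the line
    unexpected = sorted(staged - required)   # on the line but not in the bill
    return missing, unexpected

bill = ["SYRUP-001", "CO2-010", "BOTTLE-500", "CAP-STD", "LABEL-COKE"]
staged = ["SYRUP-001", "CO2-010", "BOTTLE-500", "CAP-STD", "LABEL-DIET"]
missing, unexpected = bom_discrepancies(bill, staged)
# A clean line start requires both lists to be empty; here the wrong label is staged.
```

Reporting both the missing and the unexpected codes makes the audit trail explicit: either list being non-empty is grounds for holding the line start.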

To test product quality further, lab technicians carry out over 2,000 spot checks a day to ensure quality and consistency. These can take place before or during production and can involve taking a sample of bottles off the production line. Quality tests include CO2 and sugar values, micro testing, packaging quality and cap tightness. The tests are designed so that total quality management ideas can be put forward. For example, one way in which Coca-Cola improved its production process was at the wrapping stage at the end of the line: the machine performed revolutions around the products, wrapping them in plastic until the contents were secure. One initiative meant that one fewer revolution was needed; the change did not affect the quality of the packaging or the product, and therefore saved large amounts of money on packaging costs.

Continuous improvement can also be used to adhere to the environmental and social principles the company has a responsibility to abide by. Continuous improvement opportunities are sometimes easy to identify but can lead to big changes within the organisation; the idea is to reveal opportunities to change the way something is performed. Any source of waste, scrap or rework is a potential improvement project.
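Each spot check of this kind boils down to testing a sampled measurement against a specification range. A minimal sketch; the specification limits below are invented, not Coca-Cola’s actual values:

```python
# Hypothetical spot-check against specification ranges.
# All limits are invented for illustration.

SPECS = {
    "co2_volumes": (3.4, 3.9),    # carbonation, volumes of CO2
    "sugar_brix": (10.2, 10.8),   # sugar content, degrees Brix
    "cap_torque_nm": (1.2, 2.4),  # cap tightness, newton-metres
}

def out_of_spec(measurements):
    """Return the names of any measured values falling outside their spec range."""
    return [name for name, (low, high) in SPECS.items()
            if not low <= measurements[name] <= high]

sample = {"co2_volumes": 3.7, "sugar_brix": 10.9, "cap_torque_nm": 1.8}
failures = out_of_spec(sample)  # the sugar reading is above its upper limit
```

Logging which measurement failed, rather than just a pass/fail flag, is what lets a lab feed results back into continuous-improvement projects.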

The success of this system can be measured by assessing the consistency of product quality. Coca-Cola says that ‘Our Company’s Global Product Quality Index rating has consistently reached averages near 94 since 2007, with a 94.3 in 2010, while our Company Global Package Quality Index has steadily increased since 2007 to a 92.6 rating in 2010, our highest value to date’. This is a clear indication that the quality system is working well throughout the organisation, and the rising index shows that the consistency of the products is being recognized by consumers.


Total quality management: three case studies from around the world

With organisations to run and big orders to fill, it’s easy to see how some CEOs inadvertently sacrifice quality for quantity. By integrating a system of total quality management, it’s possible to have both.


There are few boardrooms in the world whose inhabitants don’t salivate at the thought of engaging in a little aggressive expansion. After all, there’s little room in a contemporary, fast-paced business environment for any firm whose leaders don’t subscribe to ambitions of bigger factories, healthier accounts and stronger turnarounds. Yet too often such tales of excess go hand-in-hand with complaints of a severe drop in quality.

Food and entertainment markets are riddled with cautionary tales, but service sectors such as health and education aren’t immune to the disappointing by-products of unsustainable growth either. As always, the first steps in avoiding a catastrophic forsaking of quality begin with good management.

There are plenty of methods and models geared at managing the quality of a particular company’s goods or services. Yet very few of those models take into consideration the widely held belief that any company is only as strong as its weakest link. With that in mind, management consultant W. Edwards Deming developed an entirely new set of methods with which to address quality.

Deming, whose managerial work revolutionised the titanic Japanese manufacturing industry, perceived quality management to be more of a philosophy than anything else. Top-to-bottom improvement, he reckoned, required uninterrupted participation of all key employees and stakeholders. Thus, the total quality management (TQM) approach was born.

All in

Similar to the Six Sigma improvement process, TQM ensures long-term success by enforcing all-encompassing internal guidelines and process standards to reduce errors. By way of serious, in-depth auditing – as well as some well-orchestrated soul-searching – TQM ensures firms meet stakeholder needs and expectations efficiently and effectively, without forsaking ethical values.

By opting to reframe the way employees think about the company’s goals and processes, TQM allows CEOs to make sure certain things are done right from day one. According to Teresa Whitacre, of international consulting firm ASQ, proper quality management also boosts a company’s profitability.

“Total quality management allows the company to look at their management system as a whole entity — not just an output of the quality department,” she says. “Total quality means the organisation looks at all inputs, human resources, engineering, production, service, distribution, sales, finance, all functions, and their impact on the quality of all products or services of the organisation. TQM can improve a company’s processes and bottom line.”

Embracing the entire process sees companies strive to improve in several core areas, including: customer focus, total employee involvement, process-centred thinking, systematic approaches, good communication and leadership and integrated systems. Yet Whitacre is quick to point out that companies stand to gain very little from TQM unless they’re willing to go all-in.

“Companies need to consider the inputs of each department and determine which inputs relate to its governance system. Then, the company needs to look at the same inputs and determine if those inputs are yielding the desired results,” she says. “For example, ISO 9001 requires management reviews occur at least annually. Aside from minimum standard requirements, the company is free to review what they feel is best for them. While implementing TQM, they can add to their management review the most critical metrics for their business, such as customer complaints, returns, cost of products, and more.”

The customer knows best: AtlantiCare

TQM isn’t an easy management strategy to introduce into a business; in fact, many attempts tend to fall flat. More often than not, it’s because firms maintain natural barriers to full involvement. Middle managers, for example, tend to complain their authority is being challenged when boots on the ground are encouraged to speak up in the early stages of TQM. Yet in a culture of constant quality enhancement, the views of any given workforce are invaluable.

AtlantiCare in numbers

5,000 Employees

$280m Revenue before quality improvement strategy was implemented

$650m Revenue after quality improvement strategy

One firm that’s proven the merit of TQM is New Jersey-based healthcare provider AtlantiCare. Managing 5,000 employees at 25 locations, AtlantiCare is a serious business that’s boasted a respectable turnaround for nearly two decades. Yet in order to increase that margin further still, managers wanted to implement improvements across the board. Because patient satisfaction is the single most important aspect of the healthcare industry, engaging in a renewed campaign of TQM proved a natural fit. The firm chose to adopt a ‘plan-do-check-act’ cycle, which revealed gaps in staff communication – gaps that meant longer patient waiting times and more complaints. To tackle this, managers explored a sideways method of internal communications: instead of information trickling down from top to bottom, all of the company’s employees were given freedom to provide vital feedback at each and every level.

AtlantiCare decided to ensure all new employees understood this quality culture from the outset. At orientation, staff now receive a crash course in the company’s performance excellence framework – a management system that organises the firm’s processes into five key areas: quality, customer service, people and workplace, growth, and financial performance. As employees rise through the ranks, this emphasis on improvement follows, so managers can operate within the company’s tight-loose-tight process management style.

After creating benchmark goals for employees to achieve at all levels – including better engagement at the point of delivery, increasing clinical communication and identifying and prioritising service opportunities – AtlantiCare was able to thrive. The number of repeat customers at the firm tripled, and its market share hit a six-year high. Profits unsurprisingly followed. The firm’s revenues shot up from $280m to $650m after implementing the quality improvement strategies, and the number of patients being serviced dwarfed state numbers.

Hitting the right notes: Santa Cruz Guitar Co

For companies further removed from the long-term satisfaction of customers, it’s easier to let quality control slide. Yet there are plenty of ways in which growing manufacturers can pursue both quality and sales volumes simultaneously. Artisan instrument maker the Santa Cruz Guitar Co (SCGC) is a salient example. Although the California-based company is still a small-scale manufacturing operation, SCGC has grown in recent years from a basement operation to a serious business.

SCGC in numbers

14 Craftsmen employed by SCGC

800 Custom guitars produced each year

Owner Dan Roberts now employs 14 expert craftsmen, who create over 800 custom guitars each year. In order to ensure the continued quality of his instruments, Roberts has created an environment that improves with each sale. To keep things efficient (as TQM must), the shop floor is divided into six workstations in which guitars are partially assembled and then moved to the next station. Each bench is manned by a senior craftsman, and no guitar leaves a builder’s station until he is 100 percent happy with its quality. The process is akin to a traditional assembly line; however, unlike in a traditional, top-to-bottom factory, Roberts is intimately involved in all phases of instrument construction.

Utilising this doting method of quality management, it’s difficult to see how customers wouldn’t be satisfied with the artisans’ work. Yet even when issues do arise, Roberts and other senior managers spend much of their day personally answering web queries about the instruments. According to the managers, customers tend to be pleasantly surprised to find the company’s senior leaders answering their technical questions and concerns. While Roberts has no intention of taking his manufacturing company to industrial heights, the quality of his instruments and high levels of customer satisfaction speak for themselves: the company currently boasts a lengthy backlog of orders.

A quality education: Ramaiah Institute of Management Studies

Although it may appear easier to find success with TQM at a boutique-sized endeavour, the philosophy’s principles hold true in virtually every sector. Educational institutions, for example, have utilised quality management in much the same way – albeit to tackle decidedly different problems.

The global financial crisis hit higher education harder than many might have expected, and nowhere are the odds stacked higher than in India. The nation is home to one of the world’s fastest-growing markets for business education. Yet in recent years the relevance of business education in India has come into question: a report by one recruiter recently asserted that just one in four Indian MBAs were adequately prepared for the business world.

RIMS in numbers

9% Increase in test scores post total quality management strategy

22% Increase in number of recruiters hiring from the school

$20,000 Increase in the salary offered to graduates

$50,000 Rise in placement revenue

At the Ramaiah Institute of Management Studies (RIMS) in Bangalore, recruiters and accreditation bodies specifically called into question the quality of students’ educations. Although the relatively small school has always struggled to compete with India’s renowned Xavier Labour Relations Institute, the faculty finally began to notice clear hindrances to the success of graduates. The RIMS board decided it was time for a serious reassessment of quality management.

The school nominated Chief Academic Advisor Dr Krishnamurthy to head a volunteer team that would audit, analyse and implement process changes that would improve quality throughout (all in a particularly academic fashion). The team was tasked with looking at three key dimensions: assurance of learning, research and productivity, and quality of placements. Each member underwent extensive training to learn about action plans, quality auditing skills and continuous improvement tools – such as the ‘plan-do-study-act’ cycle.

Once faculty members were trained, the team’s first task was to identify the school’s key stakeholders, processes and their importance at the institute. Unsurprisingly, the most vital processes were identified as student intake, research, knowledge dissemination, outcomes evaluation and recruiter acceptance. From there, Krishnamurthy’s team used a fishbone diagram to help identify potential root causes of the issues plaguing these vital processes. To illustrate just how bad things were at the school, the team selected control groups and administered domain-based knowledge tests.

The deficits were disappointing: RIMS students’ knowledge base was rated at just 36 percent, while students at Harvard rated 95 percent. Likewise, students’ critical thinking abilities rated nine percent, versus 93 percent at MIT. Worse yet, the mean salary of graduating students averaged $36,000, versus $150,000 for students from Kellogg. Krishnamurthy’s team had their work cut out.

To tackle these issues, Krishnamurthy created an employability team, developed strategic architecture and designed pilot studies to improve the school’s curriculum and make it more competitive. In order to do so, he needed absolutely every employee and student on board – and there was some resistance at the outset. Yet the educator asserts it didn’t actually take long to convince the school’s stakeholders that the changes were extremely beneficial.

“Once students started seeing the results, buy-in became complete and unconditional,” he says. Acceptance was also achieved by maintaining clearer levels of communication with stakeholders: the school started to provide them with detailed plans and projections. It then proceeded with a variety of new methods, such as incorporating case studies into the curriculum, which increased general test scores by almost 10 percent. Administrators also introduced a mandate that students must be certified in English by the British Council – increasing scores from 42 percent to 51 percent.

By improving those test scores, the perceived quality of RIMS skyrocketed. The number of top 100 businesses recruiting from the school shot up by 22 percent, while the average salary offers graduates were receiving increased by $20,000. Placement revenue rose by an impressive $50,000, and RIMS has since skyrocketed up domestic and international education tables.

No matter the business, total quality management can and will work. Yet this philosophical take on quality control will only impact firms that are in it for the long haul. Every employee must be in tune with the company’s ideologies and desires to improve, and customer satisfaction must reign supreme.


Making quality assurance smart

For decades, outside forces have dictated how pharmaceutical and medtech companies approach quality assurance. The most influential force remains regulatory requirements. Both individual interpretations of regulations and feedback received during regulatory inspections have shaped quality assurance systems and processes. At the same time, mergers and acquisitions, along with the proliferation of different IT solutions and quality software, have resulted in a diverse and complicated quality management system (QMS) landscape. Historically, the cost of consolidating and upgrading legacy IT systems has been prohibitively expensive. Further challenged by a scarcity of IT support, many quality teams have learned to rely on the processes and workflows provided by off-the-shelf software without questioning whether they actually fit their company’s needs and evolving regulatory requirements.

In recent years, however, several developments have enabled a better way. New digital and analytics technologies make it easier for quality teams to access data from different sources and in various formats, without replacing existing systems. Companies can now build dynamic user experiences in web applications at a fraction of the cost of traditional, enterprise desktop software; this development raises the prospect of more customized, user-friendly solutions. Moreover, regulators such as the FDA are increasingly focused on quality systems and process maturity (see the MDIC Case for Quality program). The FDA has also identified the enablement of innovative technologies as a strategic priority (see its Technology Modernization Action Plan), thereby opening the door for constructive dialogue about potential changes.

Smart quality at a glance

“Smart quality” is a framework that pharma and medtech companies can apply to redesign key quality assurance processes and create value for the organization.

Smart quality has explicit objectives:

  • to perceive and deliver on multifaceted and ever-changing customer needs
  • to deploy user-friendly processes built organically into business workflows, reimagined with leading-edge technologies
  • to leapfrog existing quality management systems with breakthrough innovation, naturally fulfilling the spirit—not just the letter—of the regulations

The new ways in which smart quality achieves its objectives can be categorized in five building blocks (exhibit).

To learn more about smart quality and how leading companies are reimagining the quality function, please see “Smart quality: Reimagining the way quality works.”

The time has arrived for pharmaceutical and medtech companies to act boldly and reimagine the quality function. Through our work on large-scale quality transformation projects and our conversations with executives, we have developed a new approach we call “smart quality” (see sidebar, “Smart quality at a glance”). With this approach, companies can redesign key quality processes using design-thinking methodology (to make processes more efficient and user-friendly), automation and digitization (to deliver speed and transparency), and advanced analytics (to provide deep insights into process capability and product performance).

The quality assurance function thereby becomes a driver of value in the organization and a source of competitive advantage—improving patient safety and health outcomes while operating efficiently, effectively, and fully aligned with regulatory expectations. In our experience, companies applying smart quality principles to quality assurance can quickly generate returns that outweigh investments in new systems, including line-of-sight impact on profit; a 30 percent improvement in time to market; and a significant increase in manufacturing and supply chain reliability. Equally significant are improvements in customer satisfaction and employee engagement, along with reductions in compliance risk.

Revolutionizing quality assurance processes

The following four use cases illustrate how pharmaceutical and medtech companies can apply smart quality to transform core quality assurance processes—including complaints management, quality management review, deviations investigations, and supplier risk management, among others.

1. Complaint management

Responding swiftly and effectively to complaints is not only a compliance requirement but also a business necessity. Assessing and reacting to feedback from the market can have an immediate impact on patient safety and product performance. Today, a pharmaceutical or medtech company may believe it is handling complaints well if it has a single software deployed around the globe for complaint management, with some elements of automation (for example, flagging reportable malfunctions in medical devices) and several processing steps happening offshore (such as intake, triage, and regulatory reporting).

Yet, for most quality teams, the average investigation and closure cycle time hovers around 60 days; a few adverse events are reported late every month, and negative trends are addressed two or more months after the signals come in. It can take quality assurance teams even longer to identify complaints that collectively point to negative trends for a particular product or device. At the same time, less than 5 percent of incoming complaints are truly new events that have never been seen before. The remainder can usually be categorized as well-known issues within expected limits, or as previously investigated issues whose root causes have been identified and are already being addressed.
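The idea of a complaint category being “within expected limits” can be made concrete with a simple control-limit check. The sketch below is illustrative only: it assumes roughly Poisson-distributed monthly counts and a hypothetical three-sigma limit, not any validated methodology from the article.

```python
import math

def upper_control_limit(historical_counts, z=3.0):
    """Poisson c-chart-style upper limit: mean + z * sqrt(mean)."""
    mean = sum(historical_counts) / len(historical_counts)
    return mean + z * math.sqrt(mean)

# Illustrative monthly counts for a known, low-risk complaint category
history = [12, 9, 14, 11, 10, 13]
ucl = upper_control_limit(history)

def within_expected_limits(this_month, ucl=ucl):
    return this_month <= ucl

print(within_expected_limits(15))  # below the limit: auto-trend and close
print(within_expected_limits(30))  # out of trend: escalate for investigation
```

Counts under the limit can be trended and closed automatically; counts above it become the out-of-trend signals the article describes.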

The smart quality approach improves customer engagement and speed

By applying smart quality principles and the latest technologies, companies can reduce turnaround times and improve the customer experience. They can create an automated complaint management process that reduces costs yet applies the highest standards:

  • For every complaint, the information required for a precise assessment is captured at intake, and the event is automatically categorized.
  • High-risk issues are immediately escalated by the system, with autogenerated reports ready for submission.
  • New types of complaints and out-of-trend problems are escalated and investigated quickly.
  • Low-risk, known issues are automatically trended and closed if they are within expected limits or already being addressed.
  • Customer responses and updates are automatically available.
  • Trending reports are available in real time for any insights or analyses.

To transform the complaint management process, companies should start by defining a new process and ensuring it meets regulatory requirements. The foundation for the new process can lie in a structured event assessment that allows automated issue categorization based on the risk level defined in the company’s risk management documentation. A critical technological component is the automation of customer complaint intake; a dynamic front-end application can guide a customer through a series of questions (Exhibit 1). The application captures only information relevant to a specific complaint evaluation, investigation, and—if necessary—regulatory report. Real-time trending can quickly identify signals that indicate issues exceeding expected limits. In addition, companies can use machine learning to scan text and identify potential high-risk complaints. Finally, risk-tailored investigation pathways, automated reporting, and customer response solutions complete the smart quality process. Successful companies maintain robust procedures and documentation that clearly explain how the new process reliably meets specific regulatory requirements. Usually, a minimum viable product (MVP) for the new process can be built within two to four months for the first high-volume product family.
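As a rough illustration of the structured event assessment and risk-based categorization described above, the following sketch routes complaints with a simple rule set. The risk matrix, categories, and keyword flags are hypothetical placeholders standing in for a company’s risk management documentation and for the machine-learning text scan.

```python
from dataclasses import dataclass

# Illustrative mapping of event categories to risk levels; in practice this
# comes from the company's risk management documentation.
RISK_MATRIX = {
    "labeling": "low",
    "packaging": "low",
    "device_malfunction": "high",
    "adverse_event": "high",
}

# Stand-in for a machine-learning text scan: flag high-risk language.
HIGH_RISK_TERMS = {"injury", "hospital", "malfunction", "serious"}

@dataclass
class Complaint:
    category: str
    description: str

def triage(complaint: Complaint) -> str:
    """Return a routing decision: 'escalate' or 'auto-trend'."""
    risk = RISK_MATRIX.get(complaint.category, "unknown")
    text_flag = any(t in complaint.description.lower() for t in HIGH_RISK_TERMS)
    if risk in ("high", "unknown") or text_flag:
        return "escalate"   # immediate review, possible regulatory report
    return "auto-trend"     # known low-risk issue: trend and close

print(triage(Complaint("labeling", "Smudged lot number on carton")))
print(triage(Complaint("labeling", "Patient reported an injury")))
```

The text flag deliberately overrides the category lookup so that high-risk language in a nominally low-risk category still escalates, mirroring the escalation behavior described in the bullets above.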

In our experience, companies that redesign the complaint management process can respond more swiftly, often within a few hours, to reduce patient risk and minimize the scale and impact of potential issues in the field. For example, one medtech company that adopted the new complaint management approach can now automatically assess all complaints and close more than 55 percent of them within 24 hours without human intervention. And few, if any, reportable events miss submission deadlines. Subject matter experts are now free to focus on investigating new or high-risk issues, understanding root causes, and developing the most effective corrective and preventive actions. The company also reports that its customers prefer digital interfaces to paper forms and are pleased to be promptly updated on the status and resolution of their complaints.

2. Quality management review

Real-time performance monitoring is crucial to executive decision making at pharmaceutical and medtech companies. During a 2019 McKinsey roundtable discussion, 62 percent of quality assurance executives rated it as a high priority for the company, exceeding all other options.

For many companies today, the quality review process involves significant manual data collection and chart creation. Often, performance metrics focus on quality compliance outcomes and quality systems—such as deviation cycle times—at the expense of leading indicators and connection to culture and cost. Managers and executives frequently find themselves engaged in lengthy discussions, trying to interpret individual metrics and often missing the big picture.

Although many existing QMS solutions offer automated data-pull and visualization features, the interpretation of complex metric systems and trends remains largely a manual process. A team may quickly address one performance metric or trend, only to learn several months later that the change negatively affected another metric.

The smart quality approach speeds up decision making and action

By applying smart quality principles and the latest digital technologies, companies can get a comprehensive view of quality management in real time. This approach to performance monitoring allows companies to do the following:

  • automatically collect, analyze, and visualize relevant leading indicators and outcomes on a simple and intuitive dashboard
  • quickly identify areas of potential risk and emerging trends, as well as review their underlying metrics and connections to different areas
  • rapidly make decisions to address existing or emerging issues and monitor the results
  • adjust metrics and targets to further improve performance as goals are achieved
  • view the entire value chain and create transparency for all functions, not just quality

To transform the process, companies should start by reimagining the design of the process and settling on a set of metrics that balances leading and lagging indicators. A key technical enabler of the system is establishing an interconnected metrics structure that automates data pull and visualization and digitizes analysis and interpretation (Exhibit 2). Key business processes, such as regular quality management reviews, may require changes to include a wider range of functional stakeholders and to streamline the review cascade.
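An interconnected metrics structure of the kind described here can be sketched as a small graph of metrics, each with a target and links to related metrics, so that a drill-down follows the connections rather than inspecting indicators in isolation. The metric names, values, targets, and links below are invented for illustration.

```python
# Illustrative interconnected metrics: values, targets, and links between
# related indicators (all numbers are hypothetical).
METRICS = {
    "late_regulatory_reports": {"value": 4, "target": 0,
                                "linked": ["complaint_closure_days"]},
    "complaint_closure_days":  {"value": 38, "target": 30,
                                "linked": ["open_complaints"]},
    "open_complaints":         {"value": 120, "target": 100, "linked": []},
}

def off_target(name):
    m = METRICS[name]
    return m["value"] > m["target"]

def drill_down(name, seen=None):
    """Walk linked metrics to surface connected off-target indicators."""
    seen = seen or set()
    if name in seen:
        return []
    seen.add(name)
    hits = [name] if off_target(name) else []
    for link in METRICS[name]["linked"]:
        hits += drill_down(link, seen)
    return hits

# Starting from a lagging indicator, the drill-down surfaces the chain of
# connected leading indicators that are also off target.
print(drill_down("late_regulatory_reports"))
```

This mirrors the example in the text: late regulatory reporting traces back through complaint closure times to the volume of open complaints, so a review can align on one targeted action instead of debating each metric separately.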

Healthcare companies can use smart quality to redesign the quality management review process and see results quickly. At one pharmaceutical and medtech company, smart visualization of connected, cross-functional metrics significantly improved the effectiveness and efficiency of quality management review at all levels. Functions throughout the organization reported feeling better positioned to ascertain the quality situation quickly, support decision making, and take necessary actions. Because of connected metrics, management can not only see alarming trends but also link them to other metrics and quickly align on targeted improvement actions. For example, during a quarterly quality management review, the executive team linked late regulatory reporting challenges to an increase in delayed complaint submissions in some geographic regions. Following the review, commercial leaders raised attention to this issue in their respective regions, and in less than three months, late regulatory reporting was reduced to zero. Although the company is still in the process of fully automating data collection, it has already noticed a significant shift in its work. The quality team no longer spends the majority of its time on data processing but has pivoted to understanding, interpreting, and addressing complex and interrelated trends to reduce risks associated with quality and compliance.

3. Deviation or nonconformance investigations

Deviation or nonconformance management is a critical topic for companies today because unaddressed issues can lead to product recalls and reputational damage. More often, deviations or nonconformances affect a company’s product-release process, capacity, and lead times. As many quality teams can attest, the most challenging and time-consuming part of a deviation or nonconformance investigation is often the root cause analysis. In the best of circumstances, investigators use a tracking and trending system to identify similar occurrences. However, more often than not, these systems lack good classification of root causes and similarities. Searching them can become another hurdle for quality teams, resulting in longer lead times and ineffective root cause assessment. Not meeting the standards defined by regulators for deviation or nonconformance categorization and root cause analysis is one of the main causes of warning letters or consent decrees.

The smart quality approach improves effectiveness and reduces lead times

Our research shows companies that use smart quality principles to revamp the investigation process may reap these benefits:

  • all pertinent information related to processes and equipment is easily accessible in a continuously updated data lake
  • self-learning algorithms predict the most likely root cause of new deviations, thereby automating the review of process data and statements

In our experience, advanced analytics is the linchpin of transforming the investigation process. The most successful companies start by building a real-time data model from local and global systems that continuously refreshes and improves the model over time. Natural language processing can generate additional classifications of deviations or nonconformances to improve the quality and accuracy of insights. Digitization ensures investigators can easily access graphical interfaces that are linked to all data sources. With these tools in place, companies can readily identify the most probable root cause for deviation or nonconformance and provide a fact base for the decision. Automation also frees quality assurance professionals to focus on corrective and preventive action (Exhibit 3).
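A heavily simplified sketch of root-cause prediction: score each historical root cause by token overlap between a new deviation description and already-classified records. A production system would apply natural language processing over a continuously refreshed data lake; the records and cause categories here are hypothetical.

```python
from collections import Counter

# Illustrative historical deviations with already-classified root causes.
HISTORY = [
    ("temperature excursion in cold room during transfer", "equipment_failure"),
    ("operator skipped line clearance step", "procedure_not_followed"),
    ("sensor drift caused out-of-spec reading", "equipment_failure"),
    ("wrong label applied after manual changeover", "procedure_not_followed"),
]

def predict_root_cause(description: str) -> str:
    """Return the historical root cause with the highest word overlap."""
    words = set(description.lower().split())
    scores = Counter()
    for text, cause in HISTORY:
        scores[cause] += len(words & set(text.split()))
    return scores.most_common(1)[0][0]

print(predict_root_cause("out-of-spec reading after sensor calibration"))
```

Even this toy scorer shows the shape of the technique: the prediction comes with the matching historical records as a fact base, which is what frees investigators to focus on corrective and preventive actions.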

Pharmaceutical and medtech companies that apply these innovative technologies and smart quality principles can see significant results. Our work with several companies shows that identifying, explaining, and eliminating the root causes of recurring deviations and nonconformances can reduce the overall volume of issues by 65 percent. Companies that use the data and models to determine which unexpected factors in processes and products influence the end quality are able to control for them, thereby achieving product and process mastery. What’s more, by predicting the most likely root causes and their underlying drivers, these companies can reduce the investigation cycle time for deviations and nonconformances by 90 percent.

4. Supplier quality risk management

Drug and medical device supply chains have become increasingly global, complex, and opaque as more pharmaceutical and medtech companies outsource major parts of production to suppliers and contract manufacturing organizations (CMOs). More recently, the introduction of new, complex modalities, such as cell therapy and gene editing, has further increased pressure to ensure the quality of supplier products. Against this backdrop, it is critical to have a robust supplier quality program that can proactively identify and mitigate supplier risks or vulnerabilities before they become material issues.

Today, many companies conduct supplier risk management manually and at one specific point in time, such as at the beginning of a contract or annually. Typically, risk assessments are done in silos across the organization; every function completes individual reports and rarely looks at supplier risk as a whole. Because the results are often rolled up and individual risk signals can become diluted, companies focus more on increasing controls than addressing underlying challenges.

The smart quality approach reduces quality issues and optimizes resources

Companies that break down silos and apply a more holistic risk lens across the organization have a better chance of proactively identifying supplier quality risks. With smart quality assurance, companies can do the following:

  • identify vulnerabilities by utilizing advanced analytics on a holistic set of internal and external supplier and product data
  • ensure real-time updates and reviews to signal improvements in supplier quality and any changes that may pose an additional risk
  • optimize resource allocation and urgency of action, based on the importance and risk level of the supplier or CMO

Current technologies make it simpler than ever to automatically collect meaningful data. They also make it possible to analyze the data, identify risk signals, and present information in an actionable format. Internal and supplier data can include financials, productivity, and compliance metrics. Such information can be further enhanced by publicly available external sources—such as regulatory reporting, financial statements, and press releases—that provide additional insights into supplier quality risks. For example, using natural language processing to search the web for negative press releases is a simple yet powerful method to identify risks.
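A holistic supplier risk score of this kind might blend internal metrics with external signals. In the sketch below, the weights, fields, and negative-press keyword scan are illustrative assumptions, not a recommended scoring model; a real implementation would calibrate them against historical supplier performance.

```python
# Stand-in for an NLP search of external sources: count negative headlines.
NEGATIVE_TERMS = {"recall", "warning letter", "lawsuit", "shortage"}

def press_risk(headlines):
    """Count headlines containing a negative term."""
    return sum(any(t in h.lower() for t in NEGATIVE_TERMS) for h in headlines)

def supplier_risk_score(on_time_rate, batch_reject_rate, open_capas, headlines):
    """Blend internal metrics and external signals into one score
    (weights are illustrative; higher means more risk)."""
    score = 0.0
    score += (1.0 - on_time_rate) * 40   # delivery reliability
    score += batch_reject_rate * 40      # incoming quality
    score += min(open_capas, 10) * 2     # unresolved corrective actions
    score += press_risk(headlines) * 10  # external signals
    return round(score, 1)

score = supplier_risk_score(
    on_time_rate=0.92,
    batch_reject_rate=0.05,
    open_capas=3,
    headlines=["Supplier X issues recall of raw material lot"],
)
print(score)  # higher scores warrant more monitoring or mitigation
```

Because the score updates whenever a new metric or headline arrives, it supports the real-time review and risk-based resource allocation described in the bullets above.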


Once a company has identified quality risks, it must establish a robust process for managing these risks. Mitigation actions can include additional monitoring with digital tools, supporting the supplier to address the sources of issues, or deciding to switch to a different supplier. In our experience, companies that have a deep understanding of the level of quality risk, as well as the financial exposure, have an easier time identifying the appropriate mitigation action. Companies that identify risks and proactively mitigate them are less likely to experience potentially large supply disruptions or compliance findings.

Many pharmaceutical and medtech companies have taken steps to improve visibility into supplier quality risks by using smart quality principles. For example, a large pharmaceutical company that implemented this data-driven approach eliminated, in less than two years, the major CMO and supplier findings that had been identified during audits. In addition, during the COVID-19 pandemic, a global medtech company was able to proactively prevent supply chain disruptions by drawing on insights derived from smart quality supplier risk management.

Getting started

Pharmaceutical and medtech companies can approach quality assurance redesign in multiple ways. In our experience, starting with two or three processes, codifying the approach, and then rolling it out to more quality systems accelerates the overall transformation and time to value.

Smart quality assurance starts with clean-sheet design. By deploying modern design techniques, organizations can better understand user needs and overcome constraints. To define the solution space, we encourage companies to draw upon a range of potential process, IT, and analytics solutions from numerous industries. In cases where the new process is substantially different from the legacy process, we find it beneficial to engage regulators in an open dialogue and solicit their early feedback to support the future-state design.

Once an MVP that includes digital and automation elements is ready, companies can test and refine new solutions in targeted pilots. Throughout the process, we encourage companies to remain mindful of training and transition planning. Plans should include details on ensuring uninterrupted operations and maintaining compliance during the transition period.

The examples in this article are not exceptions. We believe that any quality assurance process can be significantly improved by applying a smart quality approach and the latest technologies. Pharmaceutical and medtech companies that are willing to make the organizational commitment to rethink quality assurance can significantly reduce quality risks, improve their speed and effectiveness in handling issues, and see long-term financial benefits.

Note: The insights and concepts presented here have not been validated or independently verified, and future results may differ materially from any statements of expectation, forecasts, or projections. Recipients are solely responsible for all of their decisions, use of these materials, and compliance with applicable laws, rules, and regulations. Consider seeking advice of legal and other relevant certified/licensed experts prior to taking any specific steps.


  • 11 Apr 2023
  • Cold Call Podcast

A Rose by Any Other Name: Supply Chains and Carbon Emissions in the Flower Industry

Headquartered in Kitengela, Kenya, Sian Flowers exports roses to Europe. Because cut flowers have a limited shelf life and consumers want them to retain their appearance for as long as possible, Sian and its distributors used international air cargo to transport them to Amsterdam, where they were sold at auction and trucked to markets across Europe. But when the Covid-19 pandemic caused huge increases in shipping costs, Sian launched experiments to ship roses by ocean using refrigerated containers. The company reduced its costs and cut its carbon emissions, but is a flower that travels halfway around the world truly a “low-carbon rose”? Harvard Business School professors Willy Shih and Mike Toffel debate these questions and more in their case, “Sian Flowers: Fresher by Sea?”


  • 17 Sep 2019

How a New Leader Broke Through a Culture of Accuse, Blame, and Criticize

Children’s Hospital & Clinics COO Julie Morath sets out to change the culture by instituting a policy of blameless reporting, which encourages employees to report anything that goes wrong or seems substandard, without fear of reprisal. Professor Amy Edmondson discusses getting an organization into the “High Performance Zone.”


  • 27 Feb 2019
  • Research & Ideas

The Hidden Cost of a Product Recall

Product failures create managerial challenges for companies but market opportunities for competitors, says Ariel Dora Stern. The stakes have only grown higher.


  • 31 Mar 2018
  • Working Paper Summaries

Expected Stock Returns Worldwide: A Log-Linear Present-Value Approach

Over the last 20 years, shortcomings of classical asset-pricing models have motivated research in developing alternative methods for measuring ex ante expected stock returns. This study evaluates the main paradigms for deriving firm-level expected return proxies (ERPs) and proposes a new framework for estimating them.

  • 26 Apr 2017

Assessing the Quality of Quality Assessment: The Role of Scheduling

Accurate inspections enable companies to assess the quality, safety, and environmental practices of their business partners, and enable regulators to protect consumers, workers, and the environment. This study finds that inspectors are less stringent later in their workday and after visiting workplaces with fewer problems. Managers and regulators can improve inspection accuracy by mitigating these biases and their consequences.

  • 23 Sep 2013

Status: When and Why It Matters

Status plays a key role in everything from the things we buy to the partnerships we make. Professor Daniel Malter explores when status matters most.

  • 16 May 2011

What Loyalty? High-End Customers are First to Flee

Companies offering top-drawer customer service might have a nasty surprise awaiting them when a new competitor comes to town: their best customers might be the first to defect. Research by Harvard Business School's Ryan W. Buell, Dennis Campbell, and Frances X. Frei. Key concepts include:

  • Companies that offer high levels of customer service can't expect too much loyalty if a new competitor offers even better service.
  • High-end businesses must avoid complacency and continue to proactively increase relative service levels when they're faced with even the potential threat of increased service competition.
  • Even though high-end customers can be fickle, a company that sustains a superior service position in its local market can attract and retain customers who are more valuable over time.
  • Firms rated lower in service quality are more or less immune from the high-end challenger.

  • 08 Dec 2008

Thinking Twice About Supply-Chain Layoffs

Cutting the wrong employees can be counterproductive for retailers, according to research from Zeynep Ton. One suggestion: pay special attention to staff who handle mundane tasks such as stocking and labeling. Your customers do.

  • 01 Dec 2006
  • What Do You Think?

How Important Is Quality of Labor? And How Is It Achieved?

A new book by Gregory Clark identifies "labor quality" as the major enticement for capital flows that lead to economic prosperity. By defining labor quality in terms of discipline and attitudes toward work, this argument minimizes the long-term threat of outsourcing to developed economies. By understanding labor quality, can we better confront anxieties about outsourcing and immigration?

  • 20 Sep 2004

How Consumers Value Global Brands

What do consumers expect of global brands? Does it hurt to be an American brand? This Harvard Business Review excerpt co-written by HBS professor John A. Quelch identifies the three characteristics consumers look for to make purchase decisions.

Total Quality Management from Theory to Practice: A Case Study

International Journal of Quality & Reliability Management

ISSN : 0265-671X

Article publication date: 1 May 1993

Most quality professionals recommend a core set of attributes as the nucleus of any quality improvement process. These attributes include: (1) clarifying job expectations; (2) setting quality standards; (3) measuring quality improvement; (4) effective supervision; (5) listening by management; (6) feedback by management; and (7) effective training. Based on a survey of employees at a medium-sized manufacturing firm in the United States, it was found that management philosophy and actions can undermine even a proven total quality management (TQM) programme. For the many firms which hire outside consultants to set up a TQM programme, the article makes recommendations to management to ensure successful implementation.


Longenecker, C.O. and Scazzero, J.A. (1993), "Total Quality Management from Theory to Practice: A Case Study", International Journal of Quality & Reliability Management , Vol. 10 No. 5.

Copyright © 1993, MCB UP Limited


Organizational approach to Total Quality Management: a case study

Rafikul Islam

Related Papers

Afizan Amer


Management Science Letters

Yuni Pambreni

Assoc. Prof. Cross Ogohi Daniel

This study examines the impact of Total Quality Management (TQM) on organisational performance. TQM is defined as a policy aimed at establishing and delivering high-quality products and services that meet clients' demands and achieve a high level of customer satisfaction. It is an administrative approach for firms focused on quality, built on the cooperation of all individuals, and aims at long-term success through customer loyalty and benefits to all members of the organisation and society. The failure of many organisations to implement TQM properly at all management levels, and the difficulty of organising frequent employee training, have been persistent problems. This research attempts to find out the effect of TQM execution on management involvement, the challenges disturbing its usage, and the impact of employee training and TQM standard application on the accomplishment of organisational goals. The key findings showed that the organisations studied practise TQM but have yet to implement it to the level of subscribing to a quality reward system; implementation remains at the quality assurance level. It was found that management inaction undermined leadership commitment to quality and rendered TQM practice insufficient. Data were collected from the organisations through questionnaires and oral interviews, with reference to journals, related books, and the internet. The organisations concurred that TQM has a role in organisational performance.

Proceedings of International Conference on Business Management

Dr. S.T.W.S. Yapa

Present-day customers are very conscious of the quality of products and services, and they are ready to pay a higher price for a quality product or service. A company that meets such demands gains a competitive advantage in the market over its competitors. One of the best approaches to addressing this challenge is the implementation of Total Quality Management (TQM). TQM, a systematic management approach and a journey to meet competitive and technological challenges, has been accepted by both service and manufacturing organizations globally. It is commonly agreed that by adopting TQM, the overall effectiveness and performance of organizations can be improved. Although TQM offers numerous benefits, it is not an easy task to implement; implementation is generally experienced as hard and painful due to certain barriers that inhibit success. Understanding the factors that are likely to obstruct TQM implementation enables managers to develop more ef...

International Public Management Journal

Teddy Lian Kok Fei , Hal Rainey

This research highlights the factors that have contributed to the implementation and impact of Total Quality Management (TQM) in Malaysian government agencies and compares agencies that have won quality awards with those that have not.

Quality and Quantity

Ahamad Bahari

Maged Awwad

In the current market economy, companies constantly struggle to achieve a sustained competitive advantage that will enable them to improve performance, which results in increased competitiveness and, of course, profit. Among the few competitive advantages that can become sustainable, quality plays a crucial role. Recent research shows that about 90% of buyers in the international market consider quality at least as important as price when making the decision to purchase. In the opinion of some specialists in economic theory and practice, total quality refers to a holistic approach to quality, meaning one that addresses all economic, social, and technical aspects of quality. This holistic, organisation-wide approach involves a procedural approach to quality; the study focuses on this procedural approach, taking into account the strategic aspects of the continuous improvement of quality, which is in fact quality management. Total Quality Management is seen as a way to make some economies more competitive than others. However, Total Quality Management does not produce results overnight, and it is not a panacea for all the problems facing an organization. It requires a change in organizational culture, which must focus on meeting customer expectations and increasing the involvement of all employees in meeting this objective, as an expression of the ethics of continuous improvement. In general, research on quality aims to identify why an organization should adopt the principles of total quality management, while attempts to identify why companies fail in implementing those principles are less visible.
Companies' concerns about introducing quality management systems are becoming more pronounced; therefore, in this study we try to identify and present the main barriers that prevent achieving quality and implementing a total quality management system.

Aliza Ramli

Haile Shitahun Mengistie

The main purpose of this paper was to investigate the effect of Total Quality Management practices on organizational performance in the case of Bahir Dar Textile SC. It adopted an explanatory research design. A sample of 71 respondents was drawn using a stratified random sampling technique. Correlation analysis showed that all constructs of total quality management (customer focus, employee empowerment, top management commitment, continuous improvement, supplier quality management, and process approach) positively and significantly affected organizational performance. Multiple regression analysis showed that the elements of total quality management practice explain 49.4% of the observed variation in organizational performance (adjusted R² = 0.494). The study also reveals that, of the six major elements of total quality management practice, customer focus, top management commitment, continuous improvement, employee empowerment, and supplier quality management have a positive effect on organizational performance, while the process approach does not have a significant effect.


Machine Learning and image analysis towards improved energy management in Industry 4.0: a practical case study on quality control

  • Original Article
  • Open access
  • Published: 13 May 2024
  • Volume 17, article number 48 (2024)


  • Mattia Casini 1 ,
  • Paolo De Angelis 1 ,
  • Marco Porrati 2 ,
  • Paolo Vigo 1 ,
  • Matteo Fasano 1 ,
  • Eliodoro Chiavazzo 1 &
  • Luca Bergamasco 1

With the advent of Industry 4.0, Artificial Intelligence (AI) has created a favorable environment for the digitalization of manufacturing and processing, helping industries to automate and optimize operations. In this work, we focus on a practical case study of a brake caliper quality control operation, which is usually accomplished by human inspection and requires a dedicated handling system, with a slow production rate and thus inefficient energy usage. We report on a Machine Learning (ML) methodology, based on Deep Convolutional Neural Networks (D-CNNs), that automatically extracts information from images in order to automate the process. A complete workflow has been developed on the target industrial test case. To find the best compromise between accuracy and computational demand, several D-CNN architectures have been tested. The results show that a judicious choice of the ML model, with proper training, allows fast and accurate quality control; the proposed workflow could thus be implemented for an ML-powered version of the considered problem. This would eventually enable better management of the available resources, in terms of time consumption and energy usage.



An efficient use of energy resources in industry is key for a sustainable future (Bilgen, 2014; Ocampo-Martinez et al., 2019). The advent of Industry 4.0 and of Artificial Intelligence has created a favorable context for the digitalisation of manufacturing processes. In this view, Machine Learning (ML) techniques have the potential to assist industries in a better and smarter usage of the available data, helping to automate and improve operations (Narciso & Martins, 2020; Mazzei & Ramjattan, 2022). For example, ML tools can be used to analyze sensor data from industrial equipment for predictive maintenance (Carvalho et al., 2019; Dalzochio et al., 2020), which allows potential failures to be identified in advance, and thus maintenance operations to be better planned with reduced downtime. Similarly, energy consumption optimization (Shen et al., 2020; Qin et al., 2020) can be achieved via ML-enabled analysis of available consumption data, with consequent adjustments of the operating parameters, schedules, or configurations to minimize energy consumption while maintaining optimal production efficiency. Energy consumption forecasts (Liu et al., 2019; Zhang et al., 2018) can also be improved, especially in industrial plants relying on renewable energy sources (Bologna et al., 2020; Ismail et al., 2021), by analysis of historical data on weather patterns and forecasts, to optimize the usage of energy resources, avoid energy peaks, and leverage alternative energy sources or storage systems (Li & Zheng, 2016; Ribezzo et al., 2022; Fasano et al., 2019; Trezza et al., 2022; Mishra et al., 2023). Finally, ML tools can also serve for fault or anomaly detection (Angelopoulos et al., 2019; Md et al., 2022), which allows prompt corrective actions to optimize energy usage and prevent energy inefficiencies.
Within this context, ML techniques for image analysis (Casini et al., 2024 ) are also gaining increasing interest (Chen et al., 2023 ), for their application to e.g. materials design and optimization (Choudhury, 2021 ), quality control (Badmos et al., 2020 ), process monitoring (Ho et al., 2021 ), or detection of machine failures by converting time series data from sensors to 2D images (Wen et al., 2017 ).

Incorporating digitalisation and ML techniques into Industry 4.0 has led to significant energy savings (Maggiore et al., 2021 ; Nota et al., 2020 ). Projects adopting these technologies can achieve an average of 15% to 25% improvement in energy efficiency in the processes where they were implemented (Arana-Landín et al., 2023 ). For instance, in predictive maintenance, ML can reduce energy consumption by optimizing the operation of machinery (Agrawal et al., 2023 ; Pan et al., 2024 ). In process optimization, ML algorithms can improve energy efficiency by 10-20% by analyzing and adjusting machine operations for optimal performance, thereby reducing unnecessary energy usage (Leong et al., 2020 ). Furthermore, the implementation of ML algorithms for optimal control can lead to energy savings of 30%, because these systems can make real-time adjustments to production lines, ensuring that machines operate at peak energy efficiency (Rahul & Chiddarwar, 2023 ).

In automotive manufacturing, ML-driven quality control can lead to energy savings by reducing the need for redoing parts or running inefficient production cycles (Vater et al., 2019 ). In high-volume production environments such as consumer electronics, novel computer-based vision models for automated detection and classification of damaged packages from intact packages can speed up operations and reduce waste (Shahin et al., 2023 ). In heavy industries like steel or chemical manufacturing, ML can optimize the energy consumption of large machinery. By predicting the optimal operating conditions and maintenance schedules, these systems can save energy costs (Mypati et al., 2023 ). Compressed air is one of the most energy-intensive processes in manufacturing. ML can optimize the performance of these systems, potentially leading to energy savings by continuously monitoring and adjusting the air compressors for peak efficiency, avoiding energy losses due to leaks or inefficient operation (Benedetti et al., 2019 ). ML can also contribute to reducing energy consumption and minimizing incorrectly produced parts in polymer processing enterprises (Willenbacher et al., 2021 ).

Here we focus on a practical industrial case study of brake caliper processing. In detail, we focus on the quality control operation, which is typically accomplished by human visual inspection and requires a dedicated handling system. This eventually implies a slower production rate and inefficient energy usage. We thus propose the integration of an ML-based system to perform the quality control operation automatically, without the need for a dedicated handling system and thus with reduced operation time. To this end, we rely on ML tools able to analyze and extract information from images, namely deep convolutional neural networks, D-CNNs (Alzubaidi et al., 2021; Chai et al., 2021).

figure 1

Sample 3D model (GrabCAD ) of the considered brake caliper: (a) part without defects, and (b) part with three sample defects, namely a scratch, a partially missing letter in the logo, and a circular painting defect (shown by the yellow squares, from left to right respectively)

A complete workflow for the purpose has been developed and tested on a real industrial test case. This includes: a dedicated pre-processing of the brake caliper images; their labelling and analysis using two dedicated D-CNN architectures (one for background removal, and one for defect identification); and post-processing and analysis of the neural network output. Several different D-CNN architectures have been tested, in order to find the best model in terms of accuracy and computational demand. The results show that a judicious choice of the ML model, with proper training, enables fast and accurate recognition of possible defects. The best-performing models indeed reach over 98% accuracy on the target criteria for quality control, and take only a few seconds to analyze each image. These results make the proposed workflow compliant with typical industrial expectations; in perspective, it could therefore be implemented for an ML-powered version of the considered industrial problem. This would eventually enable better performance of the manufacturing process and, ultimately, better management of the available resources in terms of time consumption and energy expense.

figure 2

Different neural network architectures: convolutional encoder (a) and encoder-decoder (b)

The industrial quality control process that we target is the visual inspection of manufactured components, to verify the absence of possible defects. Due to industrial confidentiality reasons, a representative open-source 3D geometry (GrabCAD) of the considered parts, similar to the original one, is shown in Fig. 1. For illustrative purposes, the clean geometry without defects (Fig. 1(a)) is compared to the geometry with three possible sample defects, namely: a scratch on the surface of the brake caliper, a partially missing letter in the logo, and a circular painting defect (highlighted by the yellow squares, from left to right respectively, in Fig. 1(b)). Note that one or more defects may be present on the geometry, and that other types of defects may also be considered.

Within the industrial production line, this quality control is typically time consuming, and requires a dedicated handling system with the associated slow production rate and energy inefficiencies. Thus, we developed a methodology to achieve an ML-powered version of the control process. The method relies on data analysis and, in particular, on information extraction from images of the brake calipers via Deep Convolutional Neural Networks, D-CNNs (Alzubaidi et al., 2021 ). The designed workflow for defect recognition is implemented in the following two steps: 1) removal of the background from the image of the caliper, in order to reduce noise and irrelevant features in the image, ultimately rendering the algorithms more flexible with respect to the background environment; 2) analysis of the geometry of the caliper to identify the different possible defects. These two serial steps are accomplished via two different and dedicated neural networks, whose architecture is discussed in the next section.
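The two serial steps described above can be sketched in a few lines of Python. This is a minimal illustrative sketch, not the authors' code: the segmentation mask and the defect probability are passed in as stand-ins for the outputs of the two trained D-CNNs, which are not reproduced here.

```python
import numpy as np

def remove_background(image, mask):
    """Step 1: black out background pixels using a segmentation mask.
    In the paper this mask is predicted by an encoder-decoder D-CNN;
    here it is passed in directly to keep the sketch self-contained."""
    return image * mask[..., None]

def classify(filtered_image, defect_probability):
    """Step 2: classify the part from the background-filtered image.
    A real implementation would run an encoder D-CNN on the image;
    here the predicted defect probability is a stand-in argument."""
    return "defective" if defect_probability >= 0.5 else "acceptable"

# Toy 4x4 RGB image; the mask marks the central 2x2 region as caliper
image = np.ones((4, 4, 3))
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0

filtered = remove_background(image, mask)
label = classify(filtered, defect_probability=0.1)
```

The point of the sketch is the serial structure: the classifier only ever sees the background-filtered image produced by the first stage.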

Convolutional Neural Networks (CNNs) pertain to a particular class of deep neural networks for information extraction from images. The feature extraction is accomplished via convolution operations; thus, the algorithms receive an image as an input, analyze it across several (deep) neural layers to identify target features, and provide the obtained information as an output (Casini et al., 2024 ). Regarding this latter output, different formats can be retrieved based on the considered architecture of the neural network. For a numerical data output, such as that required to obtain a classification of the content of an image (Bhatt et al., 2021 ), e.g. correct or defective caliper in our case, a typical layout of the network involving a convolutional backbone, and a fully-connected network can be adopted (see Fig. 2 (a)). On the other hand, if the required output is still an image, a more complex architecture with a convolutional backbone (encoder) and a deconvolutional head (decoder) can be used (see Fig. 2 (b)).

As previously introduced, our workflow targets the analysis of the brake calipers in a two-step procedure: first, the background is removed from the input image (e.g. Fig. 1); second, the geometry of the caliper is analyzed and the part is classified as acceptable or not, depending on the absence or presence of any defect, respectively. Thus, in the first step of the procedure, a dedicated encoder-decoder network (Minaee et al., 2021) is adopted to classify the pixels in the input image as brake or background. The output of this model is then a new version of the input image, where the background pixels are blacked. This helps the algorithms in the subsequent analysis to achieve better performance, and avoids bias due to possibly different environments in the input image. In the second step of the workflow, a dedicated encoder architecture is adopted. Here, the background-filtered image is fed to the convolutional network, and the geometry of the caliper is analyzed to spot possible defects and thus classify the part as acceptable or not. In this work, both deep learning models are supervised, that is, the algorithms are trained with the help of human-labeled data (LeCun et al., 2015). In particular, the first algorithm, for background removal, is fed with the original image as well as with a ground truth (i.e. a binary image, also called a mask, consisting of black and white pixels), which instructs the algorithm to learn which pixels pertain to the brake and which to the background. This task is usually called semantic segmentation in Machine Learning and Deep Learning (Géron, 2022). Analogously, the second algorithm is fed with the original image (without the background) along with an associated mask, which provides the neural network with proper instructions to identify possible defects on the target geometry.
The required pre-processing of the input images, as well as their use for training and validation of the developed algorithms, are explained in the next sections.

Image pre-processing

Machine Learning approaches rely on data analysis; thus, the quality of the final results is well known to depend strongly on the amount and quality of the data available for training of the algorithms (Banko & Brill, 2001; Chen et al., 2021). In our case, the input images should be well-representative of the target analysis and include adequate variability of the possible features, to allow the neural networks to produce the correct output. In this view, the original images should include, e.g., different possible backgrounds, different viewing angles of the considered geometry, and different light exposures (as local light reflections may affect the color of the geometry and thus the analysis). The creation of such a proper dataset for specific cases is not always straightforward; in our case, for example, it would imply a systematic acquisition of a large set of images in many different conditions. This would require, in turn, having all the possible target defects available on real parts, and an automatic acquisition system, e.g., a robotic arm with an integrated camera. Given that, in our case, the initial dataset could not be generated on real parts, we chose to generate a well-balanced dataset of images in silico, that is, based on image renderings of the real geometry. The key idea was that, if the rendered geometry is sufficiently close to a real photograph, the algorithms may be instructed on artificially-generated images and then tested on a few real ones. This approach, if properly automatized, makes it easy to produce a large number of images in all the different conditions required for the analysis.

In a first step, starting from the CAD file of the brake calipers, we worked manually in the open-source software Blender (Blender), modifying the material properties to achieve a realistic rendering. After that, defects were generated by means of Boolean (subtraction) operations between the geometry of the brake caliper and ad-hoc geometries for each defect. Fine tuning of the generated defects allowed for a realistic representation of the different defects. Once the results were satisfactory, we developed an automated Python code for these procedures, to generate the renderings in different conditions. The Python code can: load a given CAD geometry, change the material properties, set different viewing angles for the geometry, add different types of defects (with given size, rotation, and location on the geometry of the brake caliper), add a custom background, change the lighting conditions, render the scene, and save it as an image.
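The batch-generation logic behind such a script can be sketched as follows. All parameter names and ranges here are illustrative assumptions (the paper does not publish them), and the actual Blender calls via the `bpy` API are omitted: only the randomized scene-configuration stage is shown.

```python
import random

def render_conditions(n, seed=0):
    """Generate n randomized scene configurations: a viewing angle,
    an intensity for each light source, and an optional defect type.
    Ranges and names are illustrative, not the paper's values."""
    rng = random.Random(seed)  # fixed seed for reproducible batches
    scenes = []
    for _ in range(n):
        scenes.append({
            "view_angle_deg": rng.uniform(0.0, 360.0),
            "daylight_intensity": rng.uniform(0.5, 1.5),
            "lamp_1_intensity": rng.uniform(0.0, 1.0),
            "lamp_2_intensity": rng.uniform(0.0, 1.0),
            "defect": rng.choice([None, "scratch", "logo", "painting"]),
        })
    return scenes

# One configuration per rendered image; each dict would drive a
# bpy render call in the real pipeline.
batch = render_conditions(1000)
```

Each configuration dictionary would parameterize one rendering pass, so a single loop yields a full training batch.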

In order to make the dataset as varied as possible, we introduced three light sources into the rendering environment: a diffuse natural light to simulate daylight conditions, and two additional artificial lights. The intensity of each light source and the viewing angle were then varied randomly, to mimic different daylight conditions and illuminations of the object. This procedure was designed to provide different situations akin to real use, and to make the model invariant to lighting conditions and camera position. Moreover, to provide additional flexibility to the model, the training dataset was virtually expanded using data augmentation (Mumuni & Mumuni, 2022), where saturation, brightness, and contrast were varied randomly during training operations. This procedure considerably increased the number and variety of the images in the training dataset.
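A minimal NumPy sketch of such saturation/brightness/contrast jitter is shown below. The jitter ranges are assumptions for illustration; a production pipeline would typically use a library transform (e.g. in a deep-learning framework) instead.

```python
import numpy as np

def augment(image, rng):
    """Randomly jitter brightness, contrast, and saturation of an
    RGB image with values in [0, 1]. Ranges are illustrative."""
    img = image.astype(float)
    # Brightness: additive shift of all channels
    img = img + rng.uniform(-0.1, 0.1)
    # Contrast: scale deviations around the global mean
    mean = img.mean()
    img = (img - mean) * rng.uniform(0.8, 1.2) + mean
    # Saturation: blend each pixel with its grayscale value
    gray = img.mean(axis=-1, keepdims=True)
    img = gray + (img - gray) * rng.uniform(0.7, 1.3)
    # Clip back to the valid intensity range
    return np.clip(img, 0.0, 1.0)

rng = np.random.default_rng(42)
augmented = augment(np.full((8, 8, 3), 0.5), rng)
```

Applying such transforms on the fly during training effectively multiplies the dataset size without storing extra images.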

The developed automated pre-processing steps easily allow for batch generation of thousands of different images to be used for training of the neural networks. This possibility is key for proper training of the neural networks, as the variability of the input images allows the models to learn all the possible features and details that may change under real operating conditions.

figure 3

Examples of the ground truth for the two target tasks: background removal (a) and defects recognition (b)

The first tests using this virtual database showed that, although the generated images were very similar to real photographs, the models were not able to properly recognize the target features in the real images. Thus, in an attempt to get closer to a proper set of real images, we decided to adopt a hybrid dataset, where the virtually generated images were mixed with the few available real ones. However, given that some possible defects were missing in the real images, we also decided to manipulate the images to introduce virtual defects on real images. The obtained dataset finally included more than 4,000 images, of which 90% were rendered and 10% were obtained from real images. To avoid possible bias in the training dataset, defects were present in 50% of the cases in both the rendered and real image sets. Thus, in the overall dataset, the real original images with no defects were 5% of the total.

Along with the code for the rendering and manipulation of the images, dedicated Python routines were developed to generate the corresponding data labelling for the supervised training of the networks, namely the image masks. In particular, two masks were generated for each input image: one for the background removal operation, and one for the defect identification. In both cases, the masks consist of a binary (i.e. black and white) image where all the pixels of a target feature (i.e. the geometry or defect) are assigned unitary values (white), whereas all the remaining pixels are black (zero values). An example of these masks in relation to the geometry in Fig. 1 is shown in Fig. 3.
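Building such a binary ground-truth mask is straightforward; the sketch below (with a hypothetical `make_mask` helper and a toy pixel list) illustrates the convention of white (1) for the target feature and black (0) elsewhere.

```python
import numpy as np

def make_mask(height, width, feature_pixels):
    """Build a binary ground-truth mask: 1 (white) for pixels of the
    target feature (caliper or defect), 0 (black) elsewhere."""
    mask = np.zeros((height, width), dtype=np.uint8)
    for row, col in feature_pixels:
        mask[row, col] = 1
    return mask

# Toy 4x4 mask whose central 2x2 block is the target feature
mask = make_mask(4, 4, [(1, 1), (1, 2), (2, 1), (2, 2)])
```

In the real pipeline the feature pixels come from the rendering step itself, since the defect geometry and its placement are known exactly.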

All the generated images were then down-sampled, that is, their resolution was reduced to avoid unnecessarily large computational times and (RAM) memory usage, while maintaining the required level of detail for training of the neural networks. Finally, the input images and the related masks were split into a mosaic of smaller tiles, to achieve a size suitable for feeding to the neural networks with even lower RAM requirements. All the tiles were processed, and the whole image was reconstructed at the end of the process to visualize the overall final results.
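The tile-and-reassemble step can be sketched with NumPy. This assumes, for simplicity, image sides that are exact multiples of the tile size; the function names are illustrative, not the paper's.

```python
import numpy as np

def split_tiles(image, tile):
    """Split an image whose sides are multiples of `tile` into a
    list of (row, col, tile_array) triples."""
    h, w = image.shape[:2]
    return [(r, c, image[r:r + tile, c:c + tile])
            for r in range(0, h, tile)
            for c in range(0, w, tile)]

def reassemble(tiles, shape, tile):
    """Stitch (possibly processed) tiles back into a full image."""
    out = np.zeros(shape, dtype=tiles[0][2].dtype)
    for r, c, t in tiles:
        out[r:r + tile, c:c + tile] = t
    return out

img = np.arange(64).reshape(8, 8)
tiles = split_tiles(img, 4)          # 4 tiles of 4x4
restored = reassemble(tiles, img.shape, 4)
```

In the workflow, each tile would pass through the network between the split and the reassembly, and the stitched output gives the full-image result.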

figure 4

Confusion matrix for accuracy assessment of the neural networks models

Choice of the model

Within the scope of the present application, a wide range of possibly suitable models is available (Chen et al., 2021). In general, the choice of the best model for a given problem should be made on a case-by-case basis, considering an acceptable compromise between the achievable accuracy and the computational complexity/cost. Models that are too simple can indeed respond very fast, yet with reduced accuracy. On the other hand, more complex models can generally provide more accurate results, although they typically require larger amounts of data for training, and thus longer computational times and energy expense. Hence, testing plays the crucial role of identifying the best trade-off between these two extremes. A benchmark for model accuracy can generally be defined in terms of a confusion matrix, where the model response is summarized into the following possibilities: True Positives (TP), True Negatives (TN), False Positives (FP) and False Negatives (FN). This concept is summarized in Fig. 4. For the background removal, Positive (P) stands for pixels belonging to the brake caliper, while Negative (N) stands for background pixels. For the defect identification model, Positive (P) stands for non-defective geometries, whereas Negative (N) stands for defective geometries. With respect to these two cases, the True/False statements stand for correct or incorrect identification, respectively. The model accuracy can therefore be assessed as (Géron, 2022)

\(A = \frac{TP + TN}{TP + TN + FP + FN}\)

Based on these metrics, the accuracy of different models can then be evaluated on a given dataset, where typically 80% of the data is used for training and the remaining 20% for validation. For the defect recognition stage, the following models were tested: VGG-16 (Simonyan & Zisserman, 2014), ResNet50, ResNet101, ResNet152 (He et al., 2016), Inception V1 (Szegedy et al., 2015), Inception V4 and InceptionResNet V2 (Szegedy et al., 2017). Details on the assessment procedure for the different models are provided in the Supplementary Information file. For the background removal stage, the DeepLabV3\(+\) (Chen et al., 2018) model was chosen as the first option, and no additional models were tested, as it directly provided satisfactory results in terms of accuracy and processing time. This gives a preliminary indication that, in terms of task complexity, the defect identification stage is more demanding than the background removal operation for the case study at hand. Besides the assessment of accuracy according to, e.g., the metrics discussed above, additional information can generally be collected, such as an overly low accuracy (indicating an insufficient amount of training data), possible bias of the models on the data (indicating a non-well-balanced training dataset), or other specific issues related to missing representative data in the training dataset (Géron, 2022). This information helps both to correctly shape the training dataset, and to gather useful indications for the fine tuning of the model after its choice has been made.
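The accuracy metric built from the four confusion-matrix counts can be computed in one line; the counts below are illustrative, not the paper's data.

```python
def accuracy(tp, tn, fp, fn):
    """Standard accuracy from confusion-matrix counts:
    correct predictions over all predictions."""
    return (tp + tn) / (tp + tn + fp + fn)

# Illustrative counts for a 2,000-part evaluation set
acc = accuracy(tp=950, tn=940, fp=60, fn=50)
```

A perfectly classified set (FP = FN = 0) gives an accuracy of 1.0, while class imbalance is why the dataset was deliberately balanced at 50% defective parts.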

Background removal

An initial bias of the model for background removal arose from the color of the original target geometry (red). The model was indeed identifying possible red spots on the background as part of the target geometry, an unwanted output. To improve the model's flexibility, and thus its accuracy in identifying the background, the training dataset was expanded using data augmentation (Géron, 2022). This technique artificially increases the size of the training dataset by applying various transformations to the available images, with the goal of improving the performance and generalization ability of the models. This approach typically involves applying geometric and/or color transformations to the original images; in our case, to account for different viewing angles of the geometry, different light exposures, and different color reflections and shadowing effects. These improvements of the training dataset proved effective for the background removal operation, with a validation accuracy finally ranging above 99% and a model response time of around 1-2 seconds. An example of the output of this operation for the geometry in Fig. 1 is shown in Fig. 5.

While the results were satisfactory for the original (red) color of the calipers, we decided to test whether the model could also be applied to brake calipers of other colors. To this end, the model was trained and tested on a grayscale version of the images of the calipers, which completely removes any possible bias of the model toward a specific color. In this case, the validation accuracy of the model still ranged above 99%; this approach thus proved particularly useful for making the model suitable for the background removal operation even on images of calipers of different colors.

figure 5

Target geometry after background removal

Defect recognition

An overview of the performance of the tested models for the defect recognition operation on the original geometry of the caliper is reported in Table 1 (see also the Supplementary Information file for more details on the assessment of the different models). The results report the achieved validation accuracy (\(A_v\)) and the number of parameters (\(N_p\)), the latter being the total number of trainable parameters of each model (Géron, 2022). Here, this quantity is adopted as an indicator of the complexity of each model.

figure 6

Accuracy (a) and loss function (b) curves for the Resnet101 model during training

As the results in Table 1 show, the VGG-16 model was quite imprecise on our dataset, eventually showing underfitting (Géron, 2022). Thus, we opted for the ResNet and Inception families of models. Both families proved suitable for handling our dataset, with slightly less accurate results provided by ResNet50 and Inception V1. The best results were obtained using ResNet101 and Inception V4, with very high final accuracy and fast processing times (on the order of \(\sim \)1 second). Finally, the ResNet152 and InceptionResNet V2 models proved slightly too complex, or slower, for our case; they provided excellent results, but with longer response times (on the order of \(\sim \)3-5 seconds). The response time is indeed affected by the complexity (\(N_p\)) of the model itself, and by the hardware used. In our work, GPUs were used for training and testing all the models, and the hardware conditions were kept the same for all models.

Based on the results obtained, the ResNet101 model was chosen as the best solution for our application, in terms of accuracy and reduced complexity. After fine-tuning operations, the accuracy obtained with this model reached nearly 99%, on both the validation and test datasets. The latter includes real target images that the models had never seen before; it can thus be used to test the ability of the models to generalize the information learnt during the training/validation phase.

The trends of the accuracy increase and of the loss function decrease during training of the ResNet101 model on the original geometry are shown in Fig. 6(a) and (b), respectively. The loss function quantifies the error between the predicted output of the model during training and the actual target values in the dataset. In our case, the loss function is computed using the cross-entropy function and the Adam optimiser (Géron, 2022). The error is expected to decrease during training, which eventually leads to more accurate predictions of the model on previously-unseen data. The combination of accuracy and loss-function trends, along with other control parameters, is typically monitored to evaluate the training process and avoid, e.g., under- or over-fitting problems (Géron, 2022). As Fig. 6(a) shows, the accuracy experiences a sudden step increase during the very first training phase (epochs, that is, the number of times the complete database is repeatedly scrutinized by the model during its training (Géron, 2022)). The accuracy then increases smoothly with the epochs, until an asymptotic value is reached for both the training and validation accuracy. These trends can generally be associated with proper training; in particular, the closeness of the two accuracy curves may be interpreted as an absence of under-fitting problems. Similarly, Fig. 6(b) shows that the loss function curves are close to each other, with a monotonically-decreasing trend. This can be interpreted as an absence of over-fitting problems, and thus as evidence of proper training of the model.
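The cross-entropy loss tracked by the training curves can be illustrated for the binary case. This is a hedged sketch (the paper does not specify the exact variant used): per-sample binary cross-entropy, averaged over the batch, with clamping to avoid log(0).

```python
import math

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean binary cross-entropy between 0/1 labels and predicted
    probabilities. Probabilities are clamped away from 0 and 1
    so the logarithm stays finite."""
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(y_true)

# Confident correct predictions yield a lower loss than hesitant ones
confident = binary_cross_entropy([1, 0, 1, 0], [0.9, 0.1, 0.8, 0.2])
hesitant = binary_cross_entropy([1, 0, 1, 0], [0.6, 0.4, 0.6, 0.4])
```

As predictions concentrate on the correct label, the loss decreases toward zero, which is the monotonic trend the training curves are expected to show.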

figure 7

Final results of the analysis on the defect identification: (a) considered input geometry, (b), (c) and (d) identification of a scratch on the surface, partially missing logo, and painting defect respectively (highlighted in the red frames)

Finally, an example output of the overall analysis is shown in Fig. 7, where the considered input geometry is shown in (a), along with the identification of the defects (b), (c) and (d) obtained from the developed protocol. Note that here the different defects have been separated into several panels for illustrative purposes; the analysis, however, yields the identification of all defects on one single image. In this work, a binary classification was performed on the considered brake calipers, where the output of the models discriminates between defective and non-defective components based on the presence or absence of any of the considered defects. Note that the fine tuning of this discrimination ultimately rests with the user's requirements. Indeed, the model output is the probability (from 0 to 100%) of the presence of defects; the discrimination between a defective and a non-defective part therefore ultimately depends on the user's choice of the acceptance threshold for the considered part (50% in our case). Stricter or looser criteria can thus be readily adopted. Eventually, for particularly complex cases, multiple models may also be used concurrently for the same task, with the final output defined based on a cross-comparison of the results from the different models. As a last remark on the proposed procedure, note that here we adopted a binary classification based on the presence or absence of any defect; a further classification could, however, also be implemented, to distinguish among different types of defects (multi-class classification) on the brake calipers.
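The thresholding step described above reduces to a one-line decision rule. The 50% default matches the paper's acceptance threshold; the stricter-threshold call is an illustration of how a plant could tighten the criterion.

```python
def classify_part(defect_probability, threshold=0.5):
    """Map the model's predicted defect probability (0-1) to an
    accept/reject decision. The 0.5 default is the paper's
    acceptance threshold; lower values are stricter."""
    return "defective" if defect_probability >= threshold else "acceptable"

# Same model output, two different plant policies
default_call = classify_part(0.3)                 # paper's threshold
strict_call = classify_part(0.3, threshold=0.2)   # stricter criterion
```

The same part can thus be accepted under the default policy and rejected under a stricter one, purely by the user's choice of threshold.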

Energy saving

Illustrative scenarios

Given that the proposed tools have not yet been implemented and tested within a real industrial production line, we analyze here three prospective scenarios to provide a practical example of the potential for energy savings in an industrial context. Specifically, we consider a generic brake caliper assembly line formed by 14 stations, as outlined in Table 1 of the work by Burduk and Górnicka (2017). This assembly line features a critical inspection station dedicated to defect detection, around which we construct three distinct scenarios to compare traditional human-based control operations with a quality control system augmented by the proposed Machine Learning (ML) tools, namely:

First Scenario (S1): Human-Based Inspection. The traditional approach involves a human operator responsible for the inspection tasks.

Second Scenario (S2): Hybrid Inspection. This scenario introduces a hybrid inspection system where our proposed ML-based automatic detection tool assists the human inspector. The ML tool analyzes the brake calipers and alerts the human inspector only when it encounters difficulties in identifying defects, specifically when the probability of a defect being present or absent falls below a certain threshold. This collaborative approach aims to combine the precision of ML algorithms with the experience of human inspectors, and can be seen as a possible transition scenario between the human-based and a fully-automated quality control operation.

Third Scenario (S3): Fully Automated Inspection. In the final scenario, we conceive a completely automated defect inspection station powered exclusively by our ML-based detection system. This setup eliminates the need for human intervention, relying entirely on the capabilities of the ML tools to identify defects.
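The routing rule of the hybrid scenario S2 can be sketched as follows (a minimal illustration; the function name is ours, and the 90% confidence threshold is the one used later in the text):

```python
def hybrid_inspection(defect_probability, confidence_threshold=0.9):
    """Scenario S2 routing: escalate to the human inspector when uncertain.

    The model is considered confident when the probability of the defect
    being present or absent reaches `confidence_threshold`.
    """
    confidence = max(defect_probability, 1.0 - defect_probability)
    if confidence >= confidence_threshold:
        return ("ML", defect_probability >= 0.5)   # automatic decision
    return ("human", None)                          # human inspector decides
```

Parts with a clear-cut model output are thus handled automatically, while ambiguous cases (here, probabilities between 10% and 90%) are forwarded to the human inspector.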

For simplicity, we assume that all the stations are aligned in series without buffers, which avoids unnecessary complications in our estimations. To quantify the beneficial effects of implementing ML-based quality control, we adopt the Overall Equipment Effectiveness (OEE) as the primary metric for the analysis. The OEE is a comprehensive measure derived from the product of three factors, as outlined by Nota et al. (2020): Availability (the ratio of operating time to planned production time); Performance (the ratio of actual output to the theoretical maximum output); and Quality (the ratio of good units to the total units produced). In the following, we discuss how each of these factors is calculated for the various scenarios.

To calculate the Availability ( A ), we consider an 8-hour work shift ( \(t_{shift}\) ) with 30 minutes of breaks ( \(t_{break}\) ), during which we assume a production stop (except for the fully automated scenario), and 30 minutes of scheduled downtime ( \(t_{sched}\) ) required for machine cleaning and startup procedures. For the unscheduled downtime ( \(t_{unsched}\) ), primarily due to machine breakdowns, we assume an average breakdown probability ( \(\rho _{down}\) ) of 5% for each machine, with an average repair time of one hour per incident ( \(t_{down}\) ). Based on these assumptions, since the Availability represents the ratio of the run time ( \(t_{run}\) ) to the production time ( \(t_{pt}\) ), it can be calculated as:

\[ A = \frac{t_{run}}{t_{pt}} = \frac{t_{pt} - t_{unsched}}{t_{pt}}, \qquad t_{pt} = t_{shift} - t_{break} - t_{sched}, \]

with the unscheduled downtime being computed as:

\[ t_{unsched} = \left[ 1 - \left( 1 - \rho _{down} \right)^{N} \right] t_{down}, \]

where N is the number of machines in the production line and \(1-\left( 1-\rho _{down}\right) ^{N}\) represents the probability that at least one machine breaks down during the work shift. For the sake of simplicity, \(t_{down}\) is assumed constant regardless of the number of failures.
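Under the stated assumptions, the Availability can be sketched in a few lines (a minimal illustration; function and parameter names are ours, with the baseline values from the text as defaults):

```python
def availability(t_shift=8 * 3600, t_break=30 * 60, t_sched=30 * 60,
                 t_down=3600, rho_down=0.05, n_machines=14):
    """Availability A = t_run / t_pt under the stated assumptions (times in s)."""
    t_pt = t_shift - t_break - t_sched             # planned production time
    p_any_failure = 1 - (1 - rho_down) ** n_machines
    t_unsched = p_any_failure * t_down             # expected unscheduled downtime
    t_run = t_pt - t_unsched
    return t_run / t_pt
```

The number of machines is a parameter, so the additional automated station of scenarios S2 and S3 can be accounted for by setting `n_machines=15`.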

Table 2 presents the numerical values used to calculate the Availability in the three scenarios. In the second scenario, integrating the automated station leads to a decrease in this first factor of the OEE analysis, which can be attributed to the additional station for automated quality control (and its related potential failure), ultimately increasing the estimated unscheduled downtime. In the third scenario, the detrimental effect of the additional station is balanced by the beneficial effect of the automated quality control in removing the need for production pauses during operator breaks; the Availability of the third scenario is thus substantially equivalent to that of the first one (baseline).

The second factor of the OEE, Performance ( P ), assesses the operational efficiency of the production equipment relative to its maximum designed speed, corresponding to the minimum total line time ( \(t_{line}\) ). This evaluation accounts for reductions in cycle speed and minor stoppages, collectively termed speed losses . These losses are challenging to estimate in advance, as Performance is typically measured using historical data from the production line. For this analysis, we hypothesize a reasonable estimate of 60 seconds lost to speed losses ( \(t_{losses}\) ) in each work cycle. Although this assumption may appear strong, it will become evident that, within the context of this analysis (particularly regarding the impact of automated inspection on energy savings), the Performance, like the Availability, is only marginally influenced by the introduction of an automated inspection station. To account for the effect of automated inspection on the assembly line speed, we keep the time required by the other 13 stations ( \(t^*_{line}\) ) constant, while varying the time allocated for visual inspection ( \(t_{inspect}\) ). According to Burduk and Górnicka ( 2017 ), the total operation time of the production line, excluding inspection, is 1263 seconds, with manual visual inspection taking 38 seconds. For the fully automated third scenario, we assume an inspection time of 5 seconds, which encloses the photo collection, pre-processing, ML analysis, and post-processing steps. In the second scenario, instead, we add to the purely automatic case an additional time accounting for the cases in which the confidence of the ML model falls below 90%. We assume this happens once in every 10 inspections, a conservative estimate, higher than what we observed during model testing; this results in adding 10% of the human inspection time to the fully automated time. Thus, when \(t_{losses}\) is known, the Performance can be expressed as:

\[ P = \frac{t_{line}}{t_{line} + t_{losses}}, \qquad t_{line} = t^{*}_{line} + t_{inspect}. \]

The calculated values of the Performance are presented in Table 3. We can note that the modification of the inspection time has a negligible impact on this factor, since it does not affect the speed losses (at least, to our knowledge, there is no clear evidence suggesting that the introduction of a new inspection station would alter these losses). Moreover, given the specific linear layout of the considered production line, the change in inspection time has only a marginal effect on enhancing the production speed. This approach, however, could potentially bias our scenarios towards always favouring automation. To evaluate this hypothesis, a sensitivity analysis exploring scenarios where the production line operates at a faster pace is discussed in the next subsection.
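The Performance estimate can be sketched as follows, assuming P is the ratio of the ideal cycle time to the actual cycle time including speed losses (our reading of the factor; the exact formula is not reproduced here, and the defaults correspond to the baseline scenario S1):

```python
def performance(t_star_line=1263.0, t_inspect=38.0, t_losses=60.0):
    """Performance P as ideal cycle time over actual cycle time (times in s).

    Defaults reproduce the baseline scenario S1: 1263 s for the other 13
    stations, 38 s of manual inspection, 60 s of assumed speed losses.
    """
    t_line = t_star_line + t_inspect   # total ideal line time incl. inspection
    return t_line / (t_line + t_losses)
```

Swapping the 38 s manual inspection for the 5 s automated one changes P only marginally, consistent with the observation above.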

The last factor, Quality ( Q ), quantifies the ratio of compliant products to the total products manufactured, effectively filtering out items that fail to meet the quality standards due to defects. Given the objective of our automated algorithm, we anticipate this factor of the OEE to be the one most significantly enhanced by implementing the ML-based automated inspection station. To estimate it, we assume a constant defect probability for the production line ( \(\rho _{def}\) ) of 5%. Consequently, the number of defective products ( \(N_{def}\) ) during the work shift is calculated as \(N_{unit} \cdot \rho _{def}\) , where \(N_{unit}\) represents the average number of units (brake calipers) assembled on the production line, defined as:

\[ N_{unit} = \frac{t_{run}}{t_{line} + t_{losses}}, \]

i.e. the run time over the actual cycle time.

To quantify the defective units identified, we consider the inspection accuracy ( \(\rho _{acc}\) ): for human visual inspection, a typical accuracy is 80% (Sundaram & Zeid, 2023 ), while for the ML-based station we use the accuracy of our best model, i.e. 99%. Additionally, we account for the probability that the station mistakenly identifies a defective caliper as defect-free, i.e. the false negative rate ( \(\rho _{FN}\) ).

In the absence of any reasonable evidence to justify a bias towards one type of mistake over the other, we assume a uniform distribution of the errors for both human and automated inspections, i.e. we set \(\rho ^{H}_{FN} = \rho ^{ML}_{FN} = \rho _{FN} = 50\%\) . Thus, the number of final compliant goods ( \(N_{goods}\) ), i.e., the calipers that are identified as quality-compliant, can be calculated as:

\[ N_{goods} = N_{unit} - N_{detect}, \]

where \(N_{detect}\) is the total number of detected defective units, comprising TN (true negatives, i.e. correctly identified defective calipers) and FN (false negatives, i.e. calipers mistakenly identified as defect-free). The Quality factor can then be computed as:

\[ Q = \frac{N_{goods}}{N_{unit}}. \]

Table 4 summarizes the Quality factor calculation, showcasing the substantial improvement brought by the ML-based inspection station, due to its higher accuracy compared to human operators.
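Since the exact bookkeeping of the Quality factor is not reproduced here, the following sketch uses our own assumed error model, in which inspection errors are split between false negatives (defective parts passed on) and false positives (good parts rejected) according to \(\rho_{FN}\); it is illustrative only:

```python
def quality(n_unit, rho_def=0.05, rho_acc=0.99, rho_fn=0.5):
    """Quality Q: compliant units over total units leaving the line.

    Assumed error model (our reading of the text): inspection errors are
    split between false negatives (defective parts passed on, fraction
    rho_fn) and false positives (good parts rejected, fraction 1 - rho_fn).
    """
    n_def = n_unit * rho_def                                # defective units
    tn = n_def * rho_acc                                    # defects caught
    fn = n_def * (1 - rho_acc) * rho_fn                     # defects passed on
    fp = (n_unit - n_def) * (1 - rho_acc) * (1 - rho_fn)    # good parts rejected
    n_shipped = n_unit - tn - fp                            # units shipped
    return (n_shipped - fn) / n_shipped                     # compliant fraction
```

Under this model, moving the inspection accuracy from the human 80% to the ML-based 99% directly increases Q, as reported in Table 4.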

Figure 8: Overall Equipment Effectiveness (OEE) analysis for the three scenarios (S1: Human-Based Inspection; S2: Hybrid Inspection; S3: Fully Automated Inspection). The height of the bars represents the percentage of the three factors A (Availability), P (Performance), and Q (Quality), read on the left axis. The green bars indicate the OEE value, derived from the product of these three factors. The red line shows the recall rate, i.e. the probability that a defective product is rejected by the client, with values displayed on the right (red) axis.

Finally, we can determine the Overall Equipment Effectiveness by multiplying the three factors computed above. Additionally, we can estimate the recall rate ( \(\rho _{R}\) ), which reflects the rate at which a customer might reject products. This is derived from the difference between the total number of defective units, \(N_{def}\) , and the number of units correctly identified as defective, TN , and thus indicates the potential for defective brake calipers to bypass the inspection process. In Fig. 8 we summarize the outcomes of the three scenarios. It is crucial to note that the scenarios incorporating the automated defect detector, S2 and S3, significantly enhance the Overall Equipment Effectiveness, primarily through substantial improvements in the Quality factor. Among these, the fully automated inspection scenario, S3, emerges as a slightly superior option, thanks to the additional benefits of removing the production pauses during breaks and increasing the speed of the line. However, given the several assumptions required for this OEE study, these results should be interpreted as illustrative, and considered primarily as a comparison against the baseline scenario. To analyze the sensitivity of the outlined scenarios to the adopted assumptions, we investigate the influence of the line speed and of the human accuracy on the results in the next subsection.
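The final combination of the factors, and the recall-rate estimate, can be sketched as follows (function names are ours; the OEE is simply the product of the three factors, as defined above):

```python
def oee(a, p, q):
    """Overall Equipment Effectiveness: product of the three factors."""
    return a * p * q

def recall_rate(n_def, tn):
    """Share of defective units that bypass inspection and reach the client."""
    return (n_def - tn) / n_def
```

For instance, with 50 defective units per shift of which 40 are caught, one in five defective calipers would reach the client.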

Sensitivity analysis

The scenarios described previously are illustrative and based on several simplifying hypotheses. One such hypothesis is that the production line operates entirely in series, with each station awaiting the arrival of the workpiece from the preceding station, resulting in a relatively slow production rate (1263 seconds). This setup can be quite different from reality, where slower operations can be accelerated by installing additional machines in parallel to balance the workload and enhance productivity. Moreover, we utilized a literature value of 80% for the accuracy of the human visual inspector, as reported by Sundaram and Zeid ( 2023 ). However, this accuracy can vary significantly due to factors such as the experience of the inspector and the defect type.

Figure 9: Effect of the assembly time of the stations (excluding visual inspection), \(t^*_{line}\), and of the human inspection accuracy, \(\rho_{acc}\), on the OEE analysis. Subplot (a) shows the difference between scenario S2 (Hybrid Inspection) and the baseline scenario S1 (Human-Based Inspection), while subplot (b) displays the difference between scenario S3 (Fully Automated Inspection) and the baseline. The maps indicate in red the values of \(t^*_{line}\) and \(\rho_{acc}\) for which the integration of automated inspection stations can significantly improve the OEE, and in blue those for which it may lower the score. The dashed lines denote the break-even points, and the circled points pinpoint the values used in the "Illustrative scenarios" subsection.

A sensitivity analysis on these two factors was conducted to address these variations. The assembly time of the stations (excluding visual inspection), \(t^*_{line}\) , was varied from 60 s to 1500 s, and the human inspection accuracy, \(\rho _{acc}\) , ranged from 50% (akin to a random guesser) to 100% (representing an ideal visual inspector), while the other variables were kept fixed.
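The grid sweep described above can be sketched as a simple parametric scan (an illustrative scaffold only; `delta_oee_fn` stands for any user-supplied function returning the OEE difference between an automated scenario and the baseline at a given grid point):

```python
def sensitivity_grid(delta_oee_fn, t_star_values, rho_acc_values):
    """Evaluate the OEE difference over the (t*_line, rho_acc) grid of Fig. 9."""
    return [[delta_oee_fn(t_star, rho_acc) for rho_acc in rho_acc_values]
            for t_star in t_star_values]

def breakeven_cells(grid):
    """Indices of grid cells where automation improves the OEE (positive delta)."""
    return [(i, j) for i, row in enumerate(grid)
            for j, delta in enumerate(row) if delta > 0]
```

The red regions of Fig. 9 then correspond to the cells returned by `breakeven_cells`, and the dashed break-even lines to the zero contour of the grid.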

The comparison of the OEE enhancement for the two scenarios employing ML-based inspection against the baseline scenario is displayed in the two maps of Fig. 9. As the figure shows, thanks to the high accuracy and rapid response of the proposed automated inspection station, the regions where the process may benefit from energy savings in the assembly line (indicated in red shades) are significantly larger than the regions where its introduction could degrade performance (indicated in blue shades). However, it can also be observed that the automated inspection could be superfluous or even detrimental in those scenarios where human accuracy and assembly speed are very high, indicating an already highly accurate workflow. In these cases, and particularly for very fast production lines, short quality-control times can be expected to be the key factor (beyond accuracy) for the optimization.

Finally, it is important to remark that the blue regions (areas below the dashed break-even lines) might expand if the accuracy of the neural networks for defect detection turns out to be lower when implemented on a real production line. This indicates the necessity of new rounds of active learning and of an increase in the ratio of real images in the database, to eventually enhance the performance of the ML model.


Conclusions

Industrial quality-control processes on manufactured parts are typically carried out by human visual inspection. This usually requires a dedicated handling system and generally results in a slower production rate, with an associated non-optimal use of the energy resources. Based on a practical test case for quality control in brake caliper manufacturing, in this work we have reported on a workflow developed to integrate Machine Learning methods and automate the process. The proposed approach relies on image analysis via Deep Convolutional Neural Networks. These models allow efficient extraction of information from images, and may thus represent a valuable alternative to human inspection.

The proposed workflow relies on a two-step procedure applied to the images of the brake calipers: first, the background is removed from the image; second, the geometry is inspected to identify possible defects. These two steps are accomplished by two dedicated neural network models: an encoder-decoder network and an encoder network, respectively. Training of these neural networks typically requires a large number of images representative of the problem. Given that such a database is not always readily available, we have presented and discussed an alternative methodology for the generation of the input database using 3D renderings. While integration of the database with real photographs was required for optimal results, this approach has allowed fast and flexible generation of a large base of representative images. The pre-processing steps required to feed the data to the neural networks, and the training of the networks, have also been discussed.
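The two-step procedure can be summarized as a simple pipeline (a minimal sketch; the callables stand for the trained encoder-decoder and encoder models, which are not reproduced here):

```python
def inspect_image(image, background_remover, defect_classifier):
    """Two-step inspection: (1) remove the background with an encoder-decoder
    (segmentation) model; (2) classify the cleaned image with an encoder
    (classification) network. Both steps are passed in as callables."""
    foreground = background_remover(image)   # step 1: background removal
    return defect_classifier(foreground)     # step 2: defect identification
```

Keeping the two models as interchangeable callables mirrors the modularity of the workflow: either network can be retrained or swapped without touching the rest of the pipeline.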

Several models have been tested and evaluated, and the best one for the considered case has been identified. The obtained accuracy for defect identification reaches \(\sim \) 99% on the tested cases. Moreover, the response of the models on each image is fast (on the order of a few seconds), which makes them compatible with the most typical industrial expectations.

In order to provide a practical example of the possible energy savings when implementing the proposed ML-based methodology for quality control, we have analyzed three prospective industrial scenarios: a baseline scenario, where quality-control tasks are performed by a human inspector; a hybrid scenario, where the proposed ML automatic detection tool assists the human inspector; and a fully automated scenario, where we envision a completely automated defect inspection. The results show that the proposed tools may help increase the Overall Equipment Effectiveness by up to \(\sim \) 10% with respect to the considered baseline scenario. However, a sensitivity analysis on the speed of the production line and on the accuracy of the human inspector has also shown that the automated inspection could be superfluous or even detrimental in those cases where human accuracy and assembly speed are very high. In these cases, reducing the time required for quality control can be expected to be the major controlling parameter (beyond accuracy) for the optimization.

Overall, the results show that, with proper tuning, these models may represent a valuable resource for integration into production lines, with positive outcomes on the overall effectiveness, ultimately leading to a better use of the energy resources. To this end, while the practical implementation of the proposed tools can be expected to require limited investments (e.g. a portable camera, a dedicated workstation, and an operator with proper training), in-field tests on a real industrial line would be required to confirm the potential of the proposed technology.

References

Agrawal, R., Majumdar, A., Kumar, A., & Luthra, S. (2023). Integration of artificial intelligence in sustainable manufacturing: Current status and future opportunities. Operations Management Research, 1–22.

Alzubaidi, L., Zhang, J., Humaidi, A. J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., Santamaría, J., Fadhel, M. A., Al-Amidie, M., & Farhan, L. (2021). Review of deep learning: Concepts, cnn architectures, challenges, applications, future directions. Journal of big Data, 8 , 1–74.


Angelopoulos, A., Michailidis, E. T., Nomikos, N., Trakadas, P., Hatziefremidis, A., Voliotis, S., & Zahariadis, T. (2019). Tackling faults in the industry 4.0 era-a survey of machine—learning solutions and key aspects. Sensors, 20 (1), 109.

Arana-Landín, G., Uriarte-Gallastegi, N., Landeta-Manzano, B., & Laskurain-Iturbe, I. (2023). The contribution of lean management—industry 4.0 technologies to improving energy efficiency. Energies, 16 (5), 2124.

Badmos, O., Kopp, A., Bernthaler, T., & Schneider, G. (2020). Image-based defect detection in lithium-ion battery electrode using convolutional neural networks. Journal of Intelligent Manufacturing, 31 , 885–897.

Banko, M., & Brill, E. (2001). Scaling to very very large corpora for natural language disambiguation. In Proceedings of the 39th annual meeting of the association for computational linguistics (pp. 26–33).

Benedetti, M., Bonfà, F., Introna, V., Santolamazza, A., & Ubertini, S. (2019). Real time energy performance control for industrial compressed air systems: Methodology and applications. Energies, 12 (20), 3935.

Bhatt, D., Patel, C., Talsania, H., Patel, J., Vaghela, R., Pandya, S., Modi, K., & Ghayvat, H. (2021). Cnn variants for computer vision: History, architecture, application, challenges and future scope. Electronics, 10 (20), 2470.

Bilgen, S. (2014). Structure and environmental impact of global energy consumption. Renewable and Sustainable Energy Reviews, 38 , 890–902.

Blender. (2023). Open-source software. Accessed 18 Apr 2023.

Bologna, A., Fasano, M., Bergamasco, L., Morciano, M., Bersani, F., Asinari, P., Meucci, L., & Chiavazzo, E. (2020). Techno-economic analysis of a solar thermal plant for large-scale water pasteurization. Applied Sciences, 10 (14), 4771.

Burduk, A., & Górnicka, D. (2017). Reduction of waste through reorganization of the component shipment logistics. Research in Logistics & Production, 7 (2), 77–90.

Carvalho, T. P., Soares, F. A., Vita, R., Francisco, R. d. P., Basto, J. P., & Alcalá, S. G. (2019). A systematic literature review of machine learning methods applied to predictive maintenance. Computers & Industrial Engineering, 137 , 106024.

Casini, M., De Angelis, P., Chiavazzo, E., & Bergamasco, L. (2024). Current trends on the use of deep learning methods for image analysis in energy applications. Energy and AI, 15 , 100330.

Chai, J., Zeng, H., Li, A., & Ngai, E. W. (2021). Deep learning in computer vision: A critical review of emerging techniques and application scenarios. Machine Learning with Applications, 6 , 100134.

Chen, L. C., Zhu, Y., Papandreou, G., Schroff, F., Adam, H. (2018). Encoder-decoder with atrous separable convolution for semantic image segmentation. In Proceedings of the European conference on computer vision (ECCV) (pp. 801–818).

Chen, L., Li, S., Bai, Q., Yang, J., Jiang, S., & Miao, Y. (2021). Review of image classification algorithms based on convolutional neural networks. Remote Sensing, 13 (22), 4712.

Chen, T., Sampath, V., May, M. C., Shan, S., Jorg, O. J., Aguilar Martín, J. J., Stamer, F., Fantoni, G., Tosello, G., & Calaon, M. (2023). Machine learning in manufacturing towards industry 4.0: From ‘for now’ to ‘four-know’. Applied Sciences, 13 (3), 1903.

Choudhury, A. (2021). The role of machine learning algorithms in materials science: A state of art review on industry 4.0. Archives of Computational Methods in Engineering, 28 (5), 3361–3381.

Dalzochio, J., Kunst, R., Pignaton, E., Binotto, A., Sanyal, S., Favilla, J., & Barbosa, J. (2020). Machine learning and reasoning for predictive maintenance in industry 4.0: Current status and challenges. Computers in Industry, 123 , 103298.

Fasano, M., Bergamasco, L., Lombardo, A., Zanini, M., Chiavazzo, E., & Asinari, P. (2019). Water/ethanol and 13x zeolite pairs for long-term thermal energy storage at ambient pressure. Frontiers in Energy Research, 7 , 148.

Géron, A. (2022). Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow . O’Reilly Media, Inc.

GrabCAD. (2023). Brake caliper 3D model by Mitulkumar Sakariya from the GrabCAD free library (non-commercial public use). Accessed 18 Apr 2023.

He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep residual learning for image recognition. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 770–778).

Ho, S., Zhang, W., Young, W., Buchholz, M., Al Jufout, S., Dajani, K., Bian, L., & Mozumdar, M. (2021). Dlam: Deep learning based real-time porosity prediction for additive manufacturing using thermal images of the melt pool. IEEE Access, 9 , 115100–115114.

Ismail, M. I., Yunus, N. A., & Hashim, H. (2021). Integration of solar heating systems for low-temperature heat demand in food processing industry-a review. Renewable and Sustainable Energy Reviews, 147 , 111192.

LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521 (7553), 436–444.

Leong, W. D., Teng, S. Y., How, B. S., Ngan, S. L., Abd Rahman, A., Tan, C. P., Ponnambalam, S., & Lam, H. L. (2020). Enhancing the adaptability: Lean and green strategy towards the industry revolution 4.0. Journal of cleaner production, 273 , 122870.

Liu, Z., Wang, X., Zhang, Q., & Huang, C. (2019). Empirical mode decomposition based hybrid ensemble model for electrical energy consumption forecasting of the cement grinding process. Measurement, 138 , 314–324.

Li, G., & Zheng, X. (2016). Thermal energy storage system integration forms for a sustainable future. Renewable and Sustainable Energy Reviews, 62 , 736–757.

Maggiore, S., Realini, A., Zagano, C., & Bazzocchi, F. (2021). Energy efficiency in industry 4.0: Assessing the potential of industry 4.0 to achieve 2030 decarbonisation targets. International Journal of Energy Production and Management, 6 (4), 371–381.

Mazzei, D., & Ramjattan, R. (2022). Machine learning for industry 4.0: A systematic review using deep learning-based topic modelling. Sensors, 22 (22), 8641.

Md, A. Q., Jha, K., Haneef, S., Sivaraman, A. K., & Tee, K. F. (2022). A review on data-driven quality prediction in the production process with machine learning for industry 4.0. Processes, 10 (10), 1966.

Minaee, S., Boykov, Y., Porikli, F., Plaza, A., Kehtarnavaz, N., & Terzopoulos, D. (2021). Image segmentation using deep learning: A survey. IEEE transactions on pattern analysis and machine intelligence, 44 (7), 3523–3542.


Mishra, S., Srivastava, R., Muhammad, A., Amit, A., Chiavazzo, E., Fasano, M., & Asinari, P. (2023). The impact of physicochemical features of carbon electrodes on the capacitive performance of supercapacitors: a machine learning approach. Scientific Reports, 13 (1), 6494.

Mumuni, A., & Mumuni, F. (2022). Data augmentation: A comprehensive survey of modern approaches. Array, 16 , 100258.

Mypati, O., Mukherjee, A., Mishra, D., Pal, S. K., Chakrabarti, P. P., & Pal, A. (2023). A critical review on applications of artificial intelligence in manufacturing. Artificial Intelligence Review, 56 (Suppl 1), 661–768.

Narciso, D. A., & Martins, F. (2020). Application of machine learning tools for energy efficiency in industry: A review. Energy Reports, 6 , 1181–1199.

Nota, G., Nota, F. D., Peluso, D., & Toro Lazo, A. (2020). Energy efficiency in industry 4.0: The case of batch production processes. Sustainability, 12 (16), 6631.

Ocampo-Martinez, C., et al. (2019). Energy efficiency in discrete-manufacturing systems: Insights, trends, and control strategies. Journal of Manufacturing Systems, 52 , 131–145.

Pan, Y., Hao, L., He, J., Ding, K., Yu, Q., & Wang, Y. (2024). Deep convolutional neural network based on self-distillation for tool wear recognition. Engineering Applications of Artificial Intelligence, 132 , 107851.

Qin, J., Liu, Y., Grosvenor, R., Lacan, F., & Jiang, Z. (2020). Deep learning-driven particle swarm optimisation for additive manufacturing energy optimisation. Journal of Cleaner Production, 245 , 118702.

Rahul, M., & Chiddarwar, S. S. (2023). Integrating virtual twin and deep neural networks for efficient and energy-aware robotic deburring in industry 4.0. International Journal of Precision Engineering and Manufacturing, 24 (9), 1517–1534.

Ribezzo, A., Falciani, G., Bergamasco, L., Fasano, M., & Chiavazzo, E. (2022). An overview on the use of additives and preparation procedure in phase change materials for thermal energy storage with a focus on long term applications. Journal of Energy Storage, 53 , 105140.

Shahin, M., Chen, F. F., Hosseinzadeh, A., Bouzary, H., & Shahin, A. (2023). Waste reduction via image classification algorithms: Beyond the human eye with an ai-based vision. International Journal of Production Research, 1–19.

Shen, F., Zhao, L., Du, W., Zhong, W., & Qian, F. (2020). Large-scale industrial energy systems optimization under uncertainty: A data-driven robust optimization approach. Applied Energy, 259 , 114199.

Simonyan, K., & Zisserman, A. (2014). Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556 .

Sundaram, S., & Zeid, A. (2023). Artificial Intelligence-Based Smart Quality Inspection for Manufacturing. Micromachines, 14 (3), 570.

Szegedy, C., Ioffe, S., Vanhoucke, V., & Alemi, A. (2017). Inception-v4, inception-resnet and the impact of residual connections on learning. In Proceedings of the AAAI conference on artificial intelligence (vol. 31).

Szegedy, C., Liu, W., Jia, Y., Sermanet, P., Reed, S., Anguelov, D., Erhan, D., Vanhoucke, V., & Rabinovich, A. (2015). Going deeper with convolutions. In Proceedings of the IEEE conference on computer vision and pattern recognition (pp. 1–9).

Trezza, G., Bergamasco, L., Fasano, M., & Chiavazzo, E. (2022). Minimal crystallographic descriptors of sorption properties in hypothetical mofs and role in sequential learning optimization. npj Computational Materials, 8 (1), 123.

Vater, J., Schamberger, P., Knoll, A., & Winkle, D. (2019). Fault classification and correction based on convolutional neural networks exemplified by laser welding of hairpin windings. In 2019 9th International Electric Drives Production Conference (EDPC) (pp. 1–8). IEEE.

Wen, L., Li, X., Gao, L., & Zhang, Y. (2017). A new convolutional neural network-based data-driven fault diagnosis method. IEEE Transactions on Industrial Electronics, 65 (7), 5990–5998.

Willenbacher, M., Scholten, J., & Wohlgemuth, V. (2021). Machine learning for optimization of energy and plastic consumption in the production of thermoplastic parts in sme. Sustainability, 13 (12), 6800.

Zhang, X. H., Zhu, Q. X., He, Y. L., & Xu, Y. (2018). Energy modeling using an effective latent variable based functional link learning machine. Energy, 162 , 883–891.



Acknowledgements

This work has been supported by GEFIT S.p.a.

Open access funding provided by Politecnico di Torino within the CRUI-CARE Agreement.

Author information

Authors and Affiliations

Department of Energy, Politecnico di Torino, Turin, Italy

Mattia Casini, Paolo De Angelis, Paolo Vigo, Matteo Fasano, Eliodoro Chiavazzo & Luca Bergamasco

R &D Department, GEFIT S.p.a., Alessandria, Italy

Marco Porrati


Corresponding author

Correspondence to Luca Bergamasco .

Ethics declarations

Conflict of interest statement

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (pdf 354 KB)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit .


About this article

Casini, M., De Angelis, P., Porrati, M. et al. Machine Learning and image analysis towards improved energy management in Industry 4.0: a practical case study on quality control. Energy Efficiency 17 , 48 (2024).


Received : 22 July 2023

Accepted : 28 April 2024

Published : 13 May 2024



  • Industry 4.0
  • Energy management
  • Artificial intelligence
  • Machine learning
  • Deep learning
  • Convolutional neural networks
  • Computer vision
  • Find a journal
  • Publish with us
  • Track your research

Top 10 Project Management Case Studies with Examples 2024

  1. NASA's Mars Exploration Rover: Innovative project management in space exploration.
  2. Apple's iPhone Development: Delivering revolutionary products with precision.
  3. Tesla's Gigafactory Construction: Exemplary project execution in renewable energy.
  4. Netflix's Content Expansion: Agile management in the entertainment industry.
  5. Amazon's Prime Air Drone Delivery: Pioneering logistics project management.
  6. Google's Waymo Self-Driving Cars: Cutting-edge technology meets project efficiency.
  7. McDonald's Digital Transformation: Adaptive project management in fast food.
  8. IKEA's Sustainable Store Design: Eco-friendly project implementation in retail.
  9. UNICEF's Vaccine Distribution: Humanitarian project management at scale.
  10. SpaceX's Starlink Satellite Network: Revolutionizing global connectivity with project prowess.



Case Studies

  1. case study about quality
  2. Solved Part II: Mini case study - Quality Management (5 Marks)
  3. Chapter 4: Case Studies of Alternative Quality Management
  4. healthcare case study quality management
  5. Case Study: How to Get Started with Quality Performance Management
  6. Quality Management Case Study


  1. Quality Fundamentals and ISO 9001:2015 Quality Management System Implementation Course, by Professor Hani Al-Omari

  2. Presentation Case Study Quality Control

  3. SAP S4Hana Tutorial/Follow-Along -- QM Case Study Step 19 updated

  4. SAP S4Hana Tutorial/Follow-Along -- QM Case Study Step 18 updated

  5. SAP S4Hana Tutorial/Follow-Along -- QM Case Study Step 12 updated

  6. SAP S4Hana Tutorial/Follow-Along -- QM Case Study Step 4 updated


  1. Case Studies

    Using DMAIC to Improve Nursing Shift-Change Assignments. In this case study involving an anonymous hospital, nursing department leaders sought to improve the efficiency of their staff's shift-change assignments. After value-stream mapping the process, team members found that the shift nursing report took 43 minutes on average to complete.

  2. Case Study: Quality Management System at Coca Cola Company

    The success of this system can be measured by the consistency of product quality. Coca-Cola reports: 'Our Company's Global Product Quality Index rating has consistently reached averages near 94 since 2007, with a 94.3 in 2010, while our Company Global Package Quality Index has steadily increased since 2007 to a 92.6 rating in 2010, our highest value to date.'

  3. Total quality management: three case studies from around the world

    According to Teresa Whitacre, of international consulting firm ASQ, proper quality management also boosts a company's profitability. "Total quality management allows the company to look at their management system as a whole entity — not just an output of the quality department," she says. "Total quality means the organisation looks at ...

  4. Quality management

    Manage Your Human Sigma. Organizational Development magazine article by John H. Fleming, Curt Coffman, and James K. Harter. If sales and service organizations are to improve, they must learn to measure ...

  5. Smart quality assurance approach

    Case study. Healthcare companies can use smart quality to redesign the quality management review process and see results quickly. At one pharmaceutical and medtech company, smart visualization of connected, cross-functional metrics significantly improved the effectiveness and efficiency of quality management review at all levels.

  6. Case Studies of Quality Improvement Initiatives

    This case study (PDF, 1.2 MB) focuses on the successful quality improvement methods employed by a health plan to improve customer service for its members. Published by RAND, August 2007.

  7. Quality: Articles, Research, & Case Studies on Quality

    by Jim Heskett. A new book by Gregory Clark identifies "labor quality" as the major enticement for capital flows that lead to economic prosperity. By defining labor quality in terms of discipline and attitudes toward work, this argument minimizes the long-term threat of outsourcing to developed economies.

  8. Total Quality Management : Key Concepts and Case Studies

    Butterworth-Heinemann, Oct 28, 2016 - Technology & Engineering - 580 pages. Total Quality Management: Key Concepts and Case Studies provides the full range of management principles and practices that govern the quality function. The book covers the fundamentals and background needed, as well as industry case studies and comprehensive topic ...

  9. The Pillars of Project Management Quality: Insights, Case Studies, and

    Quality Objective serves as one of the essential pillars holding up the structure of a project's success. It is a performance goal that defines the efficiency, effectiveness, and reliability of a ...

  10. PDF Building a Business Case for Quality Management Transformation

    Quality/Regulations. Transforming quality management is the underlying goal of Quality 4.0 (1), a global, cross-industry initiative that pursues quality excellence through digital transformation. Disruptive technologies that connect systems, processes, and people provide the foundation for quality ...

  11. Total Quality Management from Theory to Practice: A Case Study

    These attributes include: (1) clarifying job expectations; (2) setting quality standards; (3) measuring quality improvement; (4) effective supervision; (5) listening by management; (6) feedback by management; and (7) effective training. Based on a survey of employees at a medium-sized manufacturing firm in the United States, it was found ...

  12. 4 Case Studies in Quality Management Systems

    MSDS. Deviations. Corrective action. Incidents. Using a quality management system not only increased the size of this company, but also increased profits. Implementing the electronic workflows ...

  13. Reducing the Costs of Poor Quality: A Manufacturing Case Study

    improvement strategies aligned with the total quality management (TQM) model (Drohomeretski, Gouvea da Costa, Pinheiro de Lima, & Garbuio, 2014; Hunold, 2014; Parihar, Bhar, & Kumar, 2015). In this single case study, I explore strategies manufacturing managers can use to lower COPQ and increase profits.

  14. Total Quality Management : Text with Cases

    Books. Total Quality Management: Text with Cases. John S. Oakland. Routledge, 2003 - Business & Economics - 483 pages. 'TQM: Text with Cases' is clearly written in a logical manner and points are supported by real life case studies. Professor Oakland demonstrates how a Total Quality Management strategy can be applied in all business activities ...

  15. (PDF) Toyota Quality System case study

    Toyota Quality System case study. Introduction. Toyota from the early 1960s alongside their supplier network consolidated the way in which they were able to refine their production system ...

  16. Organizational approach to Total Quality Management: a case study

    Striving towards Continuous Quality Improvement: A Case Study of St. Mary's Hospital. Health Care Manager 18(2), 33-40. Prajogo, D.I. and Sohal, A.S., 2004. The Sustainability and Evolution of Quality Improvement Programs - An Australian Case Study. Total Quality Management and Business Excellence 15(2), 205-220.

  17. Healthcare Quality Management

    Paperback, 404 pages. ISBN: 9780826145130; eBook ISBN: 9780826145147. Healthcare Quality Management: A Case Study Approach is the first comprehensive case-based text combining essential quality management knowledge with real-world scenarios.

  18. PDF Quality Risk Management Principles and Industry Case Studies

    Case study utilizes recognized quality risk management tools. Case study is appropriately simple and succinct to assure clear understanding. Case study provides areas for decreased and increased response actions. 7. Case study avoids excessive redundancy in subject and tools as compared to other planned models. 8.

  19. Quality 2030: quality management for the future

    6.1. General conclusions. Quality 2030 consists of five collectively designed themes for future QM research and practice: (a) systems perspectives applied, (b) stability in change, (c) models for smart self-organising, (d) integrating sustainable development, and (e) higher purpose as QM booster.

  20. Quality Management Case Study

    Kuprenas, J. A., Kendall, R. L., & Madjidi, F. (1999). A quality management case study: defects in spacecraft electronics components. Project Management Journal, 30(2), 14-21. This paper presents a project quality management case study for the production of spacecraft electronics components.

  21. Perspective: State‐of‐the‐Art: The Quality of Case Study Research in

    In contrast to innovation management, there are lively and ongoing debates about the role and quality of case study research in other management disciplines, such as operations management (e.g., Barratt, Choi, and Li, 2011), industrial marketing (e.g., Beverland and Lindgreen, 2010; Piekkari, Plakoyiannaki, and Welch, 2010), and information ...

  22. AHRQ Seeks Examples of Impact for Development of Impact Case Studies

    Available online and searchable via an interactive map, the Impact Case Studies help to tell the story of how AHRQ-funded research ... Agency for Healthcare Research and Quality, 5600 Fishers Lane, Rockville, MD 20857. Telephone: (301) 427-1364 ...

  23. Machine Learning and image analysis towards improved energy management

    With the advent of Industry 4.0, Artificial Intelligence (AI) has created a favorable environment for the digitalization of manufacturing and processing, helping industries to automate and optimize operations. In this work, we focus on a practical case study of a brake caliper quality control operation, which is usually accomplished by human inspection and requires a dedicated handling system ...

  24. Oilseed Radish: Nitrogen and Sulfur Management Strategies for Seed

    Nitrogen (N) and sulfur (S) fertilization significantly affect seed yield and quality in Brassica oilseed crops. The effect of N and S management on the crop parameters (plant height, stem-base diameter, and number of branches), yield (seed yield components, seed and straw yields, harvest index—HI), and the quality of the seeds and oil (crude fat—CF, total protein—TP, crude fiber—CFR ...

  25. Full article: Effectiveness of the Assessment of Burden of Chronic

    The current study builds on the evidence base surrounding the Assessment of Burden of COPD tool, the predecessor of the ABCC-tool, which has been shown to be effective in improving quality of life and perceived quality of care [8-10]. The primary aim of the current study was to assess the effectiveness of using the ABCC-tool in patients ...

  26. Top 10 Project Management Case Studies with Examples 2024

    Top 10 Project Management Case Studies with Examples 2024. 1. NASA's Mars Exploration Rover: Innovative project management in space exploration. 2. Apple's iPhone Development: Delivering revolutionary products with precision. 3. Tesla's Gigafactory Construction: Exemplary project execution in renewable energy.

  27. Eficode US

    Specialized partner: We have reached Specialization in all 3 areas - IT Service Management, Cloud, and Agile at Scale - proof that we offer high-quality and tailored services to solve your business challenges. Vast experience: Our 20+ years of experience as an Atlassian Partner gives us a deep understanding of your unique challenges and goals ...

  28. The Role of Payment Systems in Chronic Disease Care: A Review

    The new WHO publication "The role of purchasing arrangements for quality chronic care: a scoping review" examines how different methods of paying healthcare providers affect the quality of care for chronic diseases. ... "Chronic disease management programme (PROLANIS) in Indonesia: case study" describes ...
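The DMAIC entry above (item 1) rests on a routine Measure-phase step: establishing a baseline cycle time from timed observations before attempting improvement. A minimal sketch of that step in Python follows; the individual timings are invented for illustration, since the source reports only the 43-minute average.

```python
import statistics

def baseline_cycle_time(durations_min):
    """Summarize observed task durations (minutes) for a DMAIC Measure phase."""
    return {
        "mean": statistics.mean(durations_min),
        "median": statistics.median(durations_min),
        "stdev": statistics.stdev(durations_min),  # sample standard deviation
    }

# Hypothetical timings of nursing shift-change reports, in minutes.
observed = [41, 47, 39, 45, 43, 44, 42]
stats = baseline_cycle_time(observed)
print(f"mean={stats['mean']:.1f} min, stdev={stats['stdev']:.1f} min")
# prints: mean=43.0 min, stdev=2.6 min
```

The mean gives the baseline the improvement target is measured against; the standard deviation indicates how much of the 43 minutes is systemic versus run-to-run noise.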
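Item 23 above describes replacing human visual inspection with automated image analysis. The study itself trains convolutional neural networks, but the core decision, classifying a part from pixel data, can be illustrated with a deliberately simplified stand-in: a fixed-threshold defect detector over a grayscale image held as a nested list. Every name, threshold, and image here is invented for illustration and is not the paper's method.

```python
def defect_fraction(image, dark_threshold=50):
    """Fraction of pixels darker than the threshold (0-255 grayscale).

    A crude proxy for surface defects: scratches and voids often image
    as dark regions against a brighter machined surface.
    """
    pixels = [p for row in image for p in row]
    dark = sum(1 for p in pixels if p < dark_threshold)
    return dark / len(pixels)

def classify_part(image, max_defect_fraction=0.05):
    """Pass/fail decision standing in for a trained CNN classifier."""
    return "pass" if defect_fraction(image) <= max_defect_fraction else "fail"

# Two toy 4x4 "images": a clean surface and one with dark scratch pixels.
clean = [[200] * 4 for _ in range(4)]
scratched = [[200, 200, 10, 200],
             [200, 10, 200, 200],
             [10, 200, 200, 200],
             [200, 200, 200, 10]]
print(classify_part(clean), classify_part(scratched))
# prints: pass fail
```

A learned classifier replaces the hand-picked thresholds with parameters fitted to labeled pass/fail images, which is what makes the CNN approach in the study robust to lighting and part-orientation variation that defeats fixed rules like this one.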