Kevin Price is a technical product evangelist at Infor EAM. Michael Malecha is CEO of Creative Business Development LLC. The views expressed in this article are solely those of the authors and do not necessarily reflect those of FreightWaves.
Engine oil is the lifeblood of an engine, just as data is the lifeblood of your company and, more importantly, your maintenance operations. Drivers are required to check the oil level daily; as a responsible steward of your company’s safety and profitability, you should check your data quality just as regularly.
When checking oil, you look not only at the level but also for contaminants and quality. The same applies to the data you use every day to make maintenance, safety and economic decisions for your company. Collecting data isn’t enough; you must also be mindful of its quality.
In the era of mechanical and early electronic technology, I helped develop a 35,000-mile service interval for our fleet. Part of our cost management effort was to modify our maintenance practices so we did not have to bring the truck in for maintenance between service intervals. We were quite successful, with unscheduled maintenance visits averaging 25,000-27,000 miles. Today I see this number as low as 3,700 miles and typically in the 6,500-mile range. What happened, and what can we do to manage and improve this critical metric as part of our continuous improvement process?
What happened can be largely explained by increased system complexity – or, “as parts counts increase, reliability goes down.” There is intuitive logic to this statement, and it is validated in a general sense by the way miles between required shop visits have deteriorated over the years. A decrease in miles between unscheduled maintenance visits means reliability is going down.
The increase in parts count in trucks and trailers can be traced to an effort by the industry and government to improve safety, mileage and emissions. Increasing autonomous truck functions add to already-high parts counts and system complexity. However, with the eventual movement toward electric drive trucks, as we have seen on the automotive side, we can expect a significant reduction in parts count due to simplified drivetrain design.
Many fleet managers are adopting enterprise asset management (EAM) systems to improve their management of these increasingly complex vehicle systems. EAM functionality ranges from simple whiteboard replacements to full-featured, world-class information systems providing predictive analytics and a foundation for machine learning and artificial intelligence. Foundational to any EAM solution is data that is accurate and delivered to the right place, at the right time, to the right person.
A recent web search turned up several fascinating and insightful articles on data quality. One of the most striking was a Harvard Business Review (HBR) article reporting that only 3% of companies surveyed had data quality meeting an acceptable standard of 97 accurate records out of 100. To quote, “Our analyses confirm that data is in far worse shape than most managers realize – and worse than we feared – and carry enormous implications for managers everywhere.”
For a quick test of how this applies to maintenance data, look at your parts master data and count how many duplicate part numbers are in your parts catalog or inventory system. On the operational side, look at your company master file and find out how many times the same company/shipper/consignee/bill-to appears in your system under separate codes. These examples only scratch the surface of data quality and are indicative of how pervasive the problem is.
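To make the test concrete, here is a minimal sketch in Python of the duplicate-part-number check described above. The sample part numbers, descriptions and normalization rules are illustrative assumptions, not a reference to any particular catalog or EAM system.

```python
from collections import defaultdict

def normalize(part_number: str) -> str:
    """Collapse common formatting differences so near-duplicates match."""
    return "".join(ch for ch in part_number.upper() if ch.isalnum())

def find_duplicates(parts):
    """Group records whose part numbers normalize to the same key."""
    groups = defaultdict(list)
    for number, description in parts:
        groups[normalize(number)].append((number, description))
    return {key: recs for key, recs in groups.items() if len(recs) > 1}

parts_master = [
    ("FF-5320", "Fuel filter"),
    ("ff5320", "Filter, fuel"),   # the same part under a second code
    ("BP-1002", "Brake pad set"),
]

for key, records in find_duplicates(parts_master).items():
    print(f"{key}: {len(records)} records -> {records}")
```

The same normalize-and-group approach can be pointed at company/shipper/consignee/bill-to records, keyed on name and address, to run the operational-side test.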
The most common methods I have encountered to address this issue are data cleansing after the fact or accommodating bad data within the calculation process or report structure. These and other after-the-fact corrections are expensive in time and effort and still leave inaccurate data behind. The consensus among data quality professionals is that such correction methods cost up to 10 times as much as doing it right at the beginning. Yet companies rely on this inaccurate information to make business and personnel decisions with long-term effects on their economic viability and culture.
Once companies realize the damage and commit to transforming the way they view data quality from an afterthought to a priority, there are plenty of resources to help. These include Donald Wheeler’s books on defining processes and understanding data variation via methods developed and promoted by Walter Shewhart, W. Edwards Deming and others. Larry English’s “The 14 Points of Information Quality Transformation” is also a valuable resource for data quality management. Let’s look at a few data quality basics:
- Everyone throughout the organization has a role in data quality: data owner, data steward, data manager or data user. Excellent descriptions of these roles are available online. You must understand and manage these roles within your company to ensure data accuracy.
- The most accurate and robust data is collected at the time and source of the transaction. Details are lost and context is missing when data is entered after the fact. Of particular note here is the value of time attributes, or metadata: only with accurate time metadata can we begin to analyze processes for efficiency (see the sketch following this list).
- Process consistency, management and monitoring across the organization are critical to data quality.
- Deming said, “Management is prediction.” I would add that accurate prediction is the essence of management, and we cannot predict accurately without accurate data.
- Data capture costs money – more so when the source is manual, but not so much when data integrations are part of your EAM system. Do you know what data you need for analysis, and are you capturing it accurately?
- Data entry should be required only once. Inaccuracies accumulate when the same data has to be entered multiple times in an EAM or transportation management system.
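As promised above, here is a minimal sketch of capturing a transaction at its time and source, with the time metadata stamped automatically rather than typed in after the fact. The WorkEvent record, its fields and the sample values are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class WorkEvent:
    """One transaction, entered once, at the point of work."""
    unit_id: str
    vmrs_code: str
    technician: str
    # Time metadata stamped automatically at capture, never typed in later.
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Recorded at the toolbox the moment work starts, not at end of shift:
event = WorkEvent(unit_id="TRK-1042", vmrs_code="013-001-023",
                  technician="J. Ortiz")
print(event)
```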
Data for EAM comes from several sources: vehicles, telematics, vendors (parts and service), technician activity, drivers, inspection reports and other business systems. Understanding the origin of this data and the processes that generate it is critical to evaluating its accuracy.
Of all data sources in maintenance, the mileage meter is the most important, whether the reading comes from the odometer or the engine control module (ECM). ECM hours, power takeoff (PTO) hours and fuel consumption are all useful analysis denominators and preventative maintenance triggers. The best practice is to integrate with telematics providers for continuous meter updates in your EAM system.
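As a sketch of that best practice, the fragment below applies a telematics meter feed to EAM meter records, with a simple sanity check so a bad reading cannot roll a meter backward. The feed’s field names and shape are assumptions; a real integration would use your telematics provider’s documented API.

```python
def update_meters(eam_meters: dict, readings: list) -> dict:
    """Apply telematics readings, rejecting values that roll backward."""
    for r in readings:
        current = eam_meters.get(r["unit_id"], 0)
        if r["odometer_mi"] >= current:
            eam_meters[r["unit_id"]] = r["odometer_mi"]
        else:
            # A reading below the stored meter is bad data, not progress.
            print(f"Rejected meter rollback for {r['unit_id']}: {r}")
    return eam_meters

# Sample feed as a telematics provider might deliver it (shape assumed):
eam_meters = {"TRK-1042": 512_300}
feed = [{"unit_id": "TRK-1042", "odometer_mi": 512_450},
        {"unit_id": "TRK-1042", "odometer_mi": 498_000}]  # bad data point
print(update_meters(eam_meters, feed))
```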
Telematics data is becoming more comprehensive due to electronic logging device (ELD) legislation. An ELD synchronizes with a vehicle’s engine to automatically record driving time for easier, more accurate hours-of-service (HOS) recording. The ELD rule is intended to help create a safer work environment for drivers and to make it easier and faster to accurately track, manage and share records of duty status (RODS) data. Richer data includes fault codes; sensor conditions such as temperature, voltage and voltage drop; seatbelt use; roll stability; anti-lock brake use; and other operational data. Capturing and using this data enables the predictive analytics that enhance preventative maintenance.
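A minimal sketch of such a predictive trigger follows; the sensor field names, thresholds and sample fault code are illustrative assumptions rather than published standards.

```python
LOW_VOLTAGE_V = 12.2   # assumed flag point for a 12 V battery system
MAX_COOLANT_F = 230    # assumed flag point for coolant temperature

def pm_flags(snapshot: dict) -> list:
    """Return reasons a unit should be pulled in for inspection."""
    flags = []
    if snapshot.get("battery_v", float("inf")) < LOW_VOLTAGE_V:
        flags.append("low battery voltage")
    if snapshot.get("coolant_f", 0) > MAX_COOLANT_F:
        flags.append("coolant over temperature")
    flags.extend(f"active fault code {c}"
                 for c in snapshot.get("fault_codes", []))
    return flags

print(pm_flags({"battery_v": 11.9, "coolant_f": 212,
                "fault_codes": ["SPN 100"]}))
# -> ['low battery voltage', 'active fault code SPN 100']
```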
Technician activity captured through devices at the toolbox provides critical time metadata along with Vehicle Maintenance Reporting Standards (VMRS) labor and parts cost allocation. VMRS guidelines also distinguish direct labor (time spent on actual repair activity) from indirect labor, providing data for time study and technician optimization. Technicians embrace technology that makes capturing their activity more efficient: voice-to-text, barcode/RFID integration and photo documentation of their work all provide excellent data for costing analysis. Detailed data capture, enhanced by well-thought-out and well-executed user interfaces, separates best-in-class EAM systems. Additionally, accurate information at the technician level enables better planning through more accurate estimated times and objective technician evaluation, among other benefits.
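As an illustration of the direct/indirect split, the sketch below totals timestamped technician events into direct and indirect labor hours. The event shape and activity names are assumptions; a real system would classify activities by VMRS code.

```python
from datetime import datetime

# Timestamped activity events as a toolbox device might capture them:
events = [  # (activity, start, end)
    ("repair",       datetime(2023, 5, 1,  8,  0), datetime(2023, 5, 1, 10, 30)),
    ("parts_pickup", datetime(2023, 5, 1, 10, 30), datetime(2023, 5, 1, 10, 50)),
    ("repair",       datetime(2023, 5, 1, 10, 50), datetime(2023, 5, 1, 12,  0)),
]

DIRECT = {"repair"}  # everything else counts as indirect labor here

totals = {"direct_hrs": 0.0, "indirect_hrs": 0.0}
for activity, start, end in events:
    hours = (end - start).total_seconds() / 3600
    totals["direct_hrs" if activity in DIRECT else "indirect_hrs"] += hours

print(totals)  # basis for time study and technician optimization
```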
Another benefit of excellent user interface design, and the accurate technician data capture it facilitates, is the ability to evaluate your technicians objectively. Their value to the company and their sense of self-worth increase when objective measurements of their activity guide management in continual employee development. Here it is critical to have accurate data, both for practical results and for company protection. How does yours stack up?
Bi-directional integrations with master data sources are critical to maintaining data accuracy. Examples include parts catalog or shop inventory integrations with vendors, vehicle manufacturers and other second- and third-tier parts sources, as well as fuel data feeds. All alleviate the manual task of generating master records, and accuracy is greatly enhanced when VMRS codes are part of the record.
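To illustrate how such an integration alleviates manual master record creation, here is a minimal sketch of one direction of a catalog sync: vendor records are merged into the parts master under a canonical key. The field names, sample records and VMRS code are illustrative assumptions.

```python
def upsert_parts(master: dict, vendor_feed: list) -> dict:
    """Merge a vendor catalog feed into the parts master by part number."""
    for rec in vendor_feed:
        key = rec["part_number"].upper()      # one canonical key, no retyping
        master.setdefault(key, {}).update(
            description=rec["description"],
            vmrs_code=rec.get("vmrs_code"),   # accuracy improves when present
            vendor_price=rec["price"],
        )
    return master

master = {"FF-5320": {"description": "Fuel filter"}}
feed = [{"part_number": "ff-5320", "description": "Filter, fuel",
         "vmrs_code": "044-001-001", "price": 18.40}]
print(upsert_parts(master, feed))
```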
Accurate data and careful analysis enable fleets to reverse the trend of fewer miles between unscheduled repairs. Learn, observe and distribute responsibility for accurate data throughout your organization. Challenge your software providers and redesign business processes to incorporate data quality best practices. Reliable data will strengthen the analysis that drives increased revenue and profitability.