"In God we trust, all others must bring data"
Since the beginning of the IT era, data has been processed digitally into information on a numeric basis. The quality of the results has always depended on the quality and reliability of the input data, and this will remain true long into the future. At the same time, the underlying calculation model is vitally important, as it is what turns the input data into useful and efficient results. There is no way around digitization, especially when the goal is to continuously process ever more data for an ever better warehouse operation.
The most essential data, above all valid material master data, holds enormous potential for controlling a warehouse as efficiently and effectively as possible.
Generating Valid Material Master Data
The following questions can help determine the quality and the potential for improvement of data:
- Are the bins used in the warehouse matched to the item sizes, and do the stored quantities meet the warehouse requirements?
- What options exist to (recurrently) check the contents and filling levels on the basis of known data?
- What is the rate of shortfalls (or excess stock) resulting from quantity registration problems?
- Are applicable legal requirements (for example, prohibitions on mixed storage of certain goods) continuously and verifiably met?
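Plausibility checks like these can be automated against the master data itself. The sketch below illustrates the idea; the record fields (`length_mm`, `weight_g`, and so on) are illustrative assumptions, not a reference to any particular WMS schema:

```python
from dataclasses import dataclass

@dataclass
class Item:
    """Hypothetical material master record (field names are assumptions)."""
    sku: str
    length_mm: float
    width_mm: float
    height_mm: float
    weight_g: float

@dataclass
class Bin:
    """Hypothetical storage bin with inner dimensions."""
    length_mm: float
    width_mm: float
    height_mm: float

def fits_in_bin(item: Item, bin_: Bin) -> bool:
    """Check whether one item fits the bin in its given orientation."""
    return (item.length_mm <= bin_.length_mm
            and item.width_mm <= bin_.width_mm
            and item.height_mm <= bin_.height_mm)

def master_data_issues(item: Item) -> list[str]:
    """Flag obviously implausible master data before it drives decisions."""
    issues = []
    if min(item.length_mm, item.width_mm, item.height_mm) <= 0:
        issues.append("non-positive dimension")
    if item.weight_g <= 0:
        issues.append("non-positive weight")
    return issues
```

Run once over the full item master, a report like this quickly shows where bin assignments and data quality need attention.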
Avoiding Inefficient Warehousing
But what happens if not enough attention is paid to certain warehousing parameters? What is the impact of inefficient warehousing and the associated deficient material master data?
- High expenditure because the bin size is not matched to the replenishment quantity: supply containers that have been opened but not yet completely emptied must be returned to the large warehouse, because during repacking the delivered quantities destined for the more efficient small-parts warehouse do not fit into the incorrectly selected bin.
- The hazard of structural rack instability due to overload: if item weights are not registered correctly, weight indications are missing or load calculations are wrong, and areas may be overloaded. This may not be noticed until it is, quite literally, too late.
- Multiple storage of the identical item at different storage locations due to unrecognized double or multiple registrations: a comparison based on the manufacturer code linked to the item could help save space.
- Damaged items due to missing information about fragility or similar properties.
- Frequent selection of oversized shipment loading units and enormous consumption of filling material during packing.
- Spoiled goods because the best-before date (BBD) or other maximum storage intervals were exceeded.
- Destroyed goods due to missing data regarding special storage requirements (environmental conditions, electrostatically protected areas, etc.).
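The double-registration problem mentioned above can be tackled with a simple grouping pass over the item master. The sketch below assumes records carrying a manufacturer name and a manufacturer part number (the keys `manufacturer` and `mpn` are hypothetical); any group with more than one internal SKU is a candidate duplicate:

```python
from collections import defaultdict

def find_duplicate_items(items):
    """Group item records by (manufacturer, manufacturer part number).

    `items` is a list of dicts with hypothetical keys 'sku',
    'manufacturer' and 'mpn'. Keys are normalized (trimmed, upper-cased)
    so that minor entry differences still match. Returns only groups
    with more than one internal SKU, i.e. candidate double registrations.
    """
    groups = defaultdict(list)
    for item in items:
        key = (item["manufacturer"].strip().upper(),
               item["mpn"].strip().upper())
        groups[key].append(item["sku"])
    return {key: skus for key, skus in groups.items() if len(skus) > 1}
```

Candidates found this way still need a human decision, since two SKUs may legitimately share a manufacturer code (e.g. different packaging units).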
Questioning Intralogistical Processes and Recognizing Optimization Potential
It is therefore crucial to raise awareness of the optimization potential in the warehouse. Only this awareness of the opportunity to increase efficiency will encourage warehouse operators to question, and consequently optimize, their intralogistical processes.
Warehouse operators who apply this valuable know-how are usually rewarded with competitive logistics costs. Such warehouses, for instance, allow:
- Higher volume utilization through more efficient container use.
- Automated and reliable packing patterns and packing-image calculations, since the dimensions and weights fed into the optimization algorithms are correct.
- An automated preliminary check of picked orders based on total weight, thanks to reliable item weight specifications.
- Efficient replenishment processes based on current product classifications and thus exact quantity determination for each replenishment process.
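The weight-based order check from the list above reduces to comparing a scale reading against the sum of master-data item weights. A minimal sketch, assuming a configurable relative tolerance (the 2% default is an illustrative assumption to absorb scale and packaging variance):

```python
def check_order_weight(expected_item_weights_g, measured_weight_g,
                       tare_g=0.0, tolerance=0.02):
    """Preliminary completeness check of a picked order by total weight.

    expected_item_weights_g: master-data weights of all picked items (grams)
    measured_weight_g: scale reading for the complete order (grams)
    tare_g: weight of the container/packaging, if known
    tolerance: allowed relative deviation (assumption: 2% by default)
    Returns True if the measured weight is within tolerance of the
    expected total, i.e. the order passes the preliminary check.
    """
    expected = sum(expected_item_weights_g) + tare_g
    deviation = abs(measured_weight_g - expected)
    return deviation <= tolerance * expected
```

Note that this check is only as good as the item weights behind it: with unreliable master data, it either raises false alarms or, worse, passes incomplete orders.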
The question 'To what extent is my warehouse prepared for Industry 4.0?' also implies the question 'How well prepared is my data foundation for Industry 4.0?'. An implementation partner can only create the framework conditions for future-proof warehouse operations if a certain level of data quality is ensured.
N.B.: This quote is attributed to W. Edwards Deming; however, the authorship is disputed.
About the author:
Markus Klug graduated from TU Wien in applied mathematics. He conducted postgraduate research in Glasgow on kernel-based methods and their possible applications to discrete-event simulation models. He then managed national and international research and innovation projects on transport logistics, site logistics and worldwide supply chains at the applied industrial research center Seibersdorf.
Markus Klug has been with SSI Schaefer since 2013, where he is responsible for the use of data analysis and simulation, a role that has since grown to encompass data science and artificial intelligence/machine learning.