Augmented Reality

In our first article, we established the ERP as the “conductor” of the smart warehouse. Now, let’s look at the first instrument in the orchestra: Augmented Reality (AR).
In any warehouse (Automotive, Industrial, or A&D), the highest cost and the single biggest source of errors often lie in the “last 10 meters”: the manual pick.
In an environment where speed and precision have become dominant drivers, operators have long worked head-down, constantly looking away from the task to check a paper list or a handheld RF scanner. This context-switching is inefficient and a root cause of mistakes (wrong item, wrong quantity, wrong bin).

AR Vision Picking
Augmented Reality (AR) solves this by putting the data directly into the operator’s line of sight. We are talking about a fundamental process change that creates a “hands-free” and “eyes-up” operation.
It’s important to understand why this is becoming mainstream now. Early industrial smart glasses, like the first Google Glass or the Vuzix M-series, were pioneers but often bulky: single-eye displays, limited battery life, and a narrow field of view. The hardware unveiled in 2024 and 2025, such as the Vuzix Z1 or the XReal Air series, represents a massive leap. The new baseline is:
- Lightweight: Often indistinguishable from standard safety glasses, designed for all-day comfort.
- Binocular: Full-color displays for both eyes, creating a true augmented view.
- Wider Field of View: Allowing more information to be displayed without blocking the operator’s vision.
- On-board AI: Better processing for faster object and text recognition.
This evolution to a comfortable, powerful wearable is the catalyst for making “Vision Picking” a scalable reality today.

How AR Transforms the LN Picking Flow
Let’s simulate a real case.
In a traditional warehouse, an operator uses an RF scanner to consult the picking list.
They are “head-down”: reading a small screen, walking to the aisle, scanning the bin, checking the screen again for the quantity, picking, and then confirming. Now, let’s apply AR, orchestrated by Infor LN:
- LN generates the Picking List based on an order. This process does not change; the ERP remains the “conductor.”
- ION sends the first pick instruction (e.g., “Pick 4 units of item X from Aisle 05, Bin 02”) to the AR software (a sketch of this payload follows the list).
- AR Guided Execution:
  - “Eyes-Up” Navigation: The operator sees “Aisle 05, Bin 02” in their vision with a digital arrow guiding them. They walk “hands-free” without looking down.
  - Visual Confirmation: Upon arriving, the operator looks at the bin’s barcode. The smart glasses’ camera scans it instantly, and the operator sees a green “check” in their vision. This confirms the location.
  - Item Verification: The display shows “Pick 4” and an image of the item (pulled from Infor LN’s item master data session). This visual check prevents picking the wrong item.
  - Hands-Free Action: The operator picks the 4 items using both hands, remaining focused.
- Real-Time Confirmation: The operator confirms the pick via a wearable ring scanner or voice command. This action is sent back to Infor LN in real time, closing the loop on that specific picking line.
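To make the handoff concrete, here is a minimal TypeScript sketch of the kind of pick instruction ION might forward to the AR software, together with the on-device bin check from the “Visual Confirmation” step. Every type, field, and function name here is an illustrative assumption, not the actual Infor LN/ION schema or any AR vendor’s SDK.

```typescript
// Hypothetical payload shape: field names are illustrative assumptions,
// not the real ION BOD or Infor LN schema.
interface PickInstruction {
  orderLine: string;    // the LN picking line this instruction closes out
  itemCode: string;     // e.g. "ITEM-X"
  quantity: number;     // e.g. 4
  aisle: string;        // e.g. "05"
  bin: string;          // e.g. "02"
  binBarcode: string;   // barcode expected on the physical bin
  itemImageUrl: string; // image pulled from LN's item master data
}

// Stand-in for the AR vendor's display API (assumed, not a real SDK call).
declare function renderOverlay(message: string, success: boolean): void;

// On-device check: compare the barcode the glasses just scanned against
// the bin this instruction expects, then show the green or red overlay.
function verifyBin(instr: PickInstruction, scannedBarcode: string): boolean {
  const ok = scannedBarcode === instr.binBarcode;
  renderOverlay(
    ok ? `✓ ${instr.aisle}/${instr.bin} confirmed: pick ${instr.quantity}` : "✗ Wrong bin",
    ok,
  );
  return ok;
}
```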
The operator is already receiving the next instruction (“Aisle 07, Bin 04”) before they have turned around. This eliminates context-switching, minimizes human error, and optimizes movement.
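The “no context-switching” effect comes from pipelining: the confirmation of pick N and the delivery of pick N+1 travel in the same round trip. A rough sketch of that loop, reusing the hypothetical PickInstruction type above, with placeholder endpoint and response shapes standing in for the real ION integration:

```typescript
// Hypothetical confirm-and-fetch loop. The endpoint path and response
// shape are placeholders, not a documented ION or Infor LN API.
async function confirmAndGetNext(
  baseUrl: string,
  completed: { orderLine: string; quantityPicked: number },
): Promise<PickInstruction | null> {
  // Post the confirmation so Infor LN can close the picking line...
  const res = await fetch(`${baseUrl}/picks/confirm`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(completed),
  });
  // ...and receive the next instruction in the same response, so the
  // operator sees "Aisle 07, Bin 04" before they have turned around.
  const next: { instruction: PickInstruction | null } = await res.json();
  return next.instruction; // null when the picking list is complete
}
```

This technology delivers measurable ROI in logistics, as two real deployments show.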
Case 1: The Classic Warehouse (DHL Supply Chain)
DHL is the pioneer here.
Their initial pilot project in the Netherlands (Bergen op Zoom), conducted for their customer Ricoh, proved the concept. They implemented “Vision Picking” using Vuzix M100 smart glasses and Ubimax software. Their operators received order information directly in their field of vision, guiding them to the correct location while keeping their hands free.
- The Result: The pilot was highly successful. As detailed in their press release, DHL achieved a 25% efficiency increase in the picking process. They also noted high accuracy and positive feedback from employees, who found the system easy to use. The solution was so successful that it was rolled out globally.
Case 2: A Modern Case (Amazon “Last Mile”)
This Amazon case is fascinating and very modern, as it moves AR from the warehouse to the delivery van. The principle is identical, but the “operator” is the driver. Amazon developed this new AR capability in-house to improve the last mile.
The Use Case: Drivers use smart glasses (custom hardware) that are fed data about their route and cargo. When a driver arrives at a stop, the AR display provides navigation cues and delivery details. The crucial aspect, however, is that it helps them locate the specific package inside the van, a step that traditionally wastes a lot of time. Drivers see package information (like the address) as an overlay, allowing them to identify the right box more quickly.

- The Result: A “hands-free” solution that saves time and improves accuracy. It also increases safety, as drivers are not constantly looking down at a handheld device.
- Link: About Amazon: How smart glasses help Amazon delivery drivers
Whether in the aisle or in the delivery van, Augmented Reality gives the human operator the “superpower” of real-time data without the cognitive load of looking away. It is the perfect blend of human flexibility and digital accuracy, all orchestrated by the ERP.
Next Up: What happens when we eliminate human travel time entirely? We’ll discuss the “goods-to-person” revolution: Autonomous Mobile Robots (AMR).
Written by Andrea Guaccio
November 10, 2025