
Data-Driven Autonomous Stock Management Robots with Advanced Human-Machine Collaboration (HMC) Features

Agus SUKOCO

Stock Management & Robot

Due to limited delivery options, stock management in groceries, companies, and offices is logistically inefficient. Items are commonly left at security posts, forcing recipients to travel long retrieval distances and leaving security officers with large handling volumes. These inefficiencies disrupt workflows and lower productivity, while manual inter-departmental deliveries add further delays and divert staff from their core tasks. Data-driven autonomous delivery robots with Human-Machine Collaboration (HMC) features address this problem with powerful information-processing techniques. These outdoor robots navigate and detect obstacles by analyzing real-time data from sensors such as LiDAR and GNSS (Ahmad et al., 2023; Hossain, 2023; Madani & Ndiaye, 2022). Through predictive models and decision-making algorithms, the HMC features enable emergency interventions and route optimization.

Dynamic outdoor conditions challenge autonomous delivery robot navigation (Lee et al., 2022). LiDAR and GNSS sensors improve localization and safe route planning. Other studies emphasize conflict-free route optimization for Automated Guided Vehicles (AGVs) to ensure timely deliveries (Nishida & Nishi, 2022). FOODIEBOT illustrates the rise of specialized autonomous robots for food and light-item delivery (Moshayedi et al., 2024). Distributed resource-allocation methods for cooperative robots have performed well in large-scale pickup and delivery without central coordination (Camisa et al., 2022). On the technical side, land-based delivery vehicle research emphasizes the Dynamic Parameter A* (DP-A*) algorithm for dynamic learning in environments with changing obstacles (Gan et al., 2023). Autonomous delivery robots advance logistics and the information sciences through machine learning, data integration, and IoT-enabled communication. These technologies improve efficiency, adaptability, and collaboration, laying the groundwork for intelligent, sustainable logistics systems in modern institutions. This research aims to create an autonomous stock management delivery robot with advanced HMC features. The project focuses on data-driven systems and AI-based decision-making to improve logistics, sustainability, and user productivity.

The goal is a prototype that works reliably under varied outdoor conditions. Modern logistics solutions such as the Smart System-based Delivery Robot distribute items efficiently, safely, and sustainably in campus and office settings. A Smart System 4-Layer method was used to construct this robot:

  1. Instrumentation — real-time environmental monitoring via LiDAR, GNSS, and cameras.
  2. Information System — an architecture for route planning, status monitoring, and logistics reporting built on integrated data processing.
  3. AI — machine learning techniques for autonomous decision-making, hazard identification, and environmental adaptation.
  4. Gamification — engaging user interactions such as package tracking, interactive notifications, and feedback, delivered through an informative and entertaining user interface.
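The four layers above can be sketched as a minimal pipeline. This is an illustrative assumption, not the project's actual code: every class and function name below is hypothetical, and each layer is reduced to a trivial stand-in.

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    """Layer 1 (Instrumentation): one raw observation."""
    source: str      # e.g. "lidar", "gnss", "camera"
    value: object

@dataclass
class DeliveryStatus:
    """Layer 2 (Information System): processed logistics state."""
    position: tuple
    obstacles: list = field(default_factory=list)

def ai_decide(status: DeliveryStatus) -> str:
    """Layer 3 (AI): trivial stand-in for ML-based decision-making."""
    return "stop" if status.obstacles else "proceed"

def notify_user(decision: str) -> str:
    """Layer 4 (Gamification): user-facing delivery notification."""
    return f"Your package robot will {decision}."

# Wire the layers together for one control tick:
readings = [SensorReading("gnss", (7.29, 112.73)), SensorReading("lidar", [])]
status = DeliveryStatus(position=readings[0].value, obstacles=readings[1].value)
print(notify_user(ai_decide(status)))  # no obstacles -> "proceed"
```

The point of the sketch is the layering itself: each layer consumes only the output of the layer below, which is what makes the system modular and scalable.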

Beyond making deliveries, this robot illustrates how an intelligent system can apply digital literacy to tackle real-world problems sustainably. Figure 1 shows how the modular, scalable system can adapt to diverse outdoor operating conditions.

Figure 1. Oregon State University research team developing autonomous delivery robot prototype

The proposed delivery robot operates autonomously in outdoor settings such as campuses and offices. It can move safely, efficiently, and reliably without human involvement, although manual intervention remains available.

As shown in Figure 2, the system has several main components:

Figure 2. Designing an autonomous delivery robot


System Control and Administration

With this component, the operator can watch the robot's movements, receive status reports, and intervene manually in emergencies. The robot and the control center communicate via LoRa through a web-based or mobile interface.

Autonomous System

  1. Planning: Using environmental data and the delivery destination, the robot automatically plans its journey. Route planning algorithms provide the fastest and safest route.
  2. Perception: The perception and planning system gathers data on roads, obstructions, and pedestrians. It maps the surroundings, detects obstacles, and accurately localizes the robot using LiDAR, cameras, GNSS, and ultrasonic sensors.
  3. Control: The control system executes the plan by regulating speed, direction, and obstacle avoidance.
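The planning step can be illustrated with a minimal sketch. The text only says "route planning algorithms", so breadth-first search over a grid map with blocked cells stands in here as an assumption for brevity; the real system's algorithm may differ.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over a grid map: 0 = free cell, 1 = obstacle.
    Returns the shortest list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:         # walk parents back to start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
               and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # no safe route exists

grid = [[0, 0, 0],
        [1, 1, 0],   # a wall of obstacles forces a detour
        [0, 0, 0]]
print(plan_route(grid, (0, 0), (2, 0)))
# -> [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0)]
```

BFS guarantees the shortest route in cell count; a production planner would weight edges by safety and travel time as the text describes.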

Support Systems

  1. Load Locker: Protects items from falling or being stolen during delivery.
  2. Power System: Maintains system stability by powering all components.

Actuators

Actuators translate control system commands into steering, acceleration, and braking.
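A hedged sketch of that translation follows; the command names, value ranges, and the 30-degree steering limit are all assumptions for illustration, not measured properties of the vehicle.

```python
def to_actuator_commands(decision: dict) -> dict:
    """Translate a high-level control decision into low-level
    actuator set-points (steering angle, throttle, brake)."""
    if decision.get("action") == "stop":
        # Hard stop: neutral steering, no throttle, full brake.
        return {"steering_deg": 0.0, "throttle": 0.0, "brake": 1.0}
    return {
        # Clamp to an assumed +/-30 degree mechanical steering limit.
        "steering_deg": max(-30.0, min(30.0, decision.get("heading_deg", 0.0))),
        "throttle": max(0.0, min(1.0, decision.get("speed", 0.0))),
        "brake": 0.0,
    }

print(to_actuator_commands({"action": "go", "heading_deg": 45, "speed": 0.4}))
# -> {'steering_deg': 30.0, 'throttle': 0.4, 'brake': 0.0}
```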

Interfaces with End Users

Users send and receive items via the robot. A mobile app provides notifications, delivery status, and the robot's location.

This solution solves logistical problems and demonstrates the role of smart technology and automation in daily life.


The vehicle's mechanical design begins with sketches and technical drawings to visualize the concept. Design options are evaluated on performance, efficiency, ergonomics, and aesthetics, and the best candidate is refined and optimized in CAD/CAE software to produce the final design. Technical sketches and prototype images document the design's development through to the final product, which was chosen for its technical specifications, production simplicity, and aesthetics.

Materials for the engine, chassis, and superstructure are chosen for strength, durability, weight, and cost. The chassis is an aluminum alloy, which reduces weight while preserving strength; the body and interior are likewise developed with material choices in mind. Subcomponents are either ready-made or designed specifically for this project.

The production procedures were chosen for efficiency, quality, and scalability. The key operations are cutting, shaping, assembling, and finishing the vehicle parts. Plastic vehicle bodies are made via thermoforming, injection molding, and compression molding; body panels are thermoformed on a plastic sheet forming machine. Assembly involves welding, connecting, and fitting the parts.

The vehicle frame is still made on a metal bending machine. Final steps include painting, accessory installation, and product inspection to assure quality. The chosen production processes save money and time while maintaining vehicle quality.


Physical Features

Figure 3. Physical features


Figure 3 shows a wheeled robotic platform capable of autonomous movement. The boxy, compact vehicle can maneuver easily in many conditions. Although its exact weight is not specified, the platform appears light. The vehicle is simple but functional, offering good mobility for autonomous robotic applications such as inspection, surveillance, and logistics.

Digital Design Process

Figure 4. Digital Design Process

This system uses sensors and cameras to detect and track objects for autonomous vehicles. Ultrasonic sensors measure distance, while cameras running the YOLO detector classify objects. The RabbitMQ message broker forwards the sensor and camera detection data to the Environmental Awareness module, which projects object positions in four directions using a Cartesian coordinate transformation.
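A sketch of the detection message that might travel through the broker. The field names are assumptions, and the commented-out `pika` call merely indicates where a real RabbitMQ publish would go; it is not the project's actual messaging code.

```python
import json

def make_detection_message(label, confidence, distance_m, bearing_deg):
    """Serialize one fused camera + ultrasonic observation for the
    Environmental Awareness module (field names are illustrative)."""
    return json.dumps({
        "label": label,             # YOLO class name
        "confidence": confidence,   # YOLO score, 0..1
        "distance_m": distance_m,   # ultrasonic range reading
        "bearing_deg": bearing_deg, # camera-relative bearing
    })

msg = make_detection_message("person", 0.91, 2.5, -10.0)
# With a live broker this body would be published via pika, e.g.:
# channel.basic_publish(exchange="", routing_key="detections", body=msg)
print(msg)
```

Keeping the message a flat JSON document lets any consumer, in any language, subscribe to the queue without sharing code with the publisher.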

The Vehicle Control module analyses the projected object positions and trajectories to avoid collisions, rerouting the trajectory when necessary. The resulting control decisions steer the autonomous vehicle in real time. Key elements of this system include:

  1. Multi-camera integration for a complete view of the environment and robust object detection and tracking.
  2. Cartesian projection to align radar and camera data for spatial precision.
  3. Radar to supplement optical data in low-visibility settings, ensuring system reliability in bad weather.
  4. Accurate distance measurements and historical data visualization for monitoring object positions.

This technology improves object detection in varied environmental circumstances for autonomous driving by merging many sensors and powerful algorithms.
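The Cartesian projection into four directions can be sketched as follows. This is a minimal version under stated assumptions: x points forward, y points right, and bearings are measured clockwise from the robot's forward axis; the real module's frame conventions are not given in the text.

```python
import math

def project(distance_m, bearing_deg):
    """Polar (range, bearing) -> Cartesian (x forward, y right).
    Bearing is clockwise-positive from the forward axis (assumed)."""
    theta = math.radians(bearing_deg)
    return distance_m * math.cos(theta), distance_m * math.sin(theta)

def quadrant(x, y):
    """Assign the object to front/right/rear/left by its dominant axis."""
    if abs(x) >= abs(y):
        return "front" if x >= 0 else "rear"
    return "right" if y > 0 else "left"

x, y = project(2.0, 90.0)          # object directly to the right
print(round(x, 6), round(y, 6), quadrant(x, y))
# -> 0.0 2.0 right
```

Once every detection is expressed in this single vehicle-centered frame, readings from cameras, ultrasonic sensors, and radar can be fused and compared directly.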


Algorithms design

Figure 5. Algorithms design



The diagram shows the autonomous vehicle control algorithm design: first, the camera captures images of the environment, and the YOLO algorithm detects objects. The detection data reach the Environmental Awareness module via the RabbitMQ messaging queue. That module projects object positions in four directions (front, left, right, and rear) to build an understanding of the vehicle's surroundings, then passes the object locations to the Vehicle Control module. Vehicle Control analyses the object positions and trajectories to determine the appropriate action; if an object blocks the trajectory, the program reroutes to avoid a collision. The resulting control decisions are sent to move the vehicle. Safe and dependable vehicle control thus rests on YOLO object recognition, environmental comprehension, object trajectory analysis, and adaptive decision-making.

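The flow just described can be sketched end to end. As assumptions for illustration, `queue.Queue` stands in for the RabbitMQ messaging queue, plain dictionaries stand in for YOLO detections, and the direction boundaries are arbitrary 90-degree sectors.

```python
from queue import Queue

detections_q = Queue()  # stand-in for the RabbitMQ messaging queue

def camera_and_yolo():
    """Step 1: pretend the camera + YOLO produced two detections."""
    for det in ({"label": "person", "bearing_deg": 5},
                {"label": "tree", "bearing_deg": 120}):
        detections_q.put(det)

def environmental_awareness():
    """Step 2: project each detection into one of four directions."""
    projected = []
    while not detections_q.empty():
        det = detections_q.get()
        b = det["bearing_deg"] % 360
        det["direction"] = ("front" if b < 45 or b >= 315 else
                            "right" if b < 135 else
                            "rear" if b < 225 else "left")
        projected.append(det)
    return projected

def vehicle_control(projected):
    """Step 3: reroute if anything blocks the forward trajectory."""
    blocked = any(d["direction"] == "front" for d in projected)
    return "stop_and_reroute" if blocked else "continue"

camera_and_yolo()
print(vehicle_control(environmental_awareness()))
# person ahead -> stop_and_reroute
```

Because the modules touch only the queue, each stage can be swapped out (e.g. the real broker for the `Queue`) without changing the others.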

The team developed a software design approach for the wheeled robot platform's control, navigation, and guidance algorithms. The procedure has several steps:

  1. Object detection using YOLO (You Only Look Once): the unit detects and tracks objects with the YOLO algorithm, and the detection data are published to RabbitMQ.
  2. Vehicle Environmental Awareness Module: receives object detection data from the messaging queue and projects each object's position in four directions (front, left, right, rear) using camera data. The positions are forwarded to the vehicle control module.
  3. Vehicle Control Module: receives object position and trajectory data and moves the vehicle accordingly. The vehicle stops and reroutes if an object blocks its path.
  4. Implementation language: Python was chosen because it offers many tools and frameworks for robotics development, such as OpenCV, ROS, and NumPy, and its large community and gentle learning curve ease code development and maintenance.

With this systematic software design method, the wheeled robot platform should operate safely and responsively while identifying and avoiding obstacles.
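Step 3's stop-and-reroute rule can be sketched with a tiny trajectory analysis. Everything here is an illustrative assumption: the robot is taken to drive along the +x axis with y as lateral offset, and two successive object positions stand in for a tracked trajectory.

```python
def control_decision(obj_prev, obj_now, lane_halfwidth=0.5):
    """Decide the vehicle's action from an object's last two observed
    (x, y) positions. Frame assumption: robot drives along +x, y is
    lateral offset, and the driving lane is |y| <= lane_halfwidth."""
    x, y = obj_now
    in_lane = x > 0 and abs(y) <= lane_halfwidth   # blocking the path now
    closing = abs(y) < abs(obj_prev[1])            # drifting toward the lane
    if in_lane:
        return "stop_and_reroute"
    if closing and x > 0:
        return "slow_down"
    return "continue"

print(control_decision(obj_prev=(4.0, 2.0), obj_now=(3.5, 1.2)))
# pedestrian ahead, drifting toward the lane -> slow_down
```

A real implementation would track many objects over longer histories, but the decision structure (blocking now, approaching, or irrelevant) stays the same.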


External Interface

Figure 6. External Interface


This interface provides detailed information about the mobile robot's position and environment to help it avoid obstacles. With 360-degree front, left, right, and rear camera views, the robot can monitor its surroundings in real time. The obstacle graph depicts the robot's surroundings and allows the system to avoid collisions. The robot's situational awareness and navigation in complex situations depend on this interface design: with complete and clear position data, the control system can make better decisions and reduce collisions. Industrial and rover robots that need precise navigation can benefit from this interface.

A multi-sensor system that integrates object detection, video data, and radar data helps the robot understand its environment. Cartesian projection and data fusion improve object-localization accuracy. Experiments show the system can detect and track objects from diverse angles and measure distance accurately. Safety must be a priority in the mobile robot's design and operation. Ways to meet this competition's safety standards include:

  1. Use safe, non-toxic materials for the robot's structure and shield all electronic components to prevent electric shock; mechanically resistant materials reduce the risk of damage and injury.
  2. Fit emergency detection and stop systems that detect people and objects near the robot and halt its movement quickly in dangerous situations, and ensure the control system responds swiftly to emergency stop commands.
  3. Provide a physical barrier or guard around the robot to prevent direct contact with users or objects, and make sure the robot has no dangerous exposed parts.
  4. Give robot users clear training and operational protocols, including a safe-operation handbook and standard operating procedures.

By taking these precautions, the mobile robot can operate safely without causing injury.
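The emergency detection and stop requirement above can be sketched as a simple watchdog. The 0.5 m stop distance and the class structure are assumptions for illustration; the actual system's thresholds are not given in the text.

```python
def emergency_check(ranges_m, stop_distance_m=0.5):
    """Trigger an e-stop if any sensor reports an object closer than
    the stop distance. Negative values are invalid readings, ignored."""
    return any(0 <= r < stop_distance_m for r in ranges_m)

class Robot:
    """Minimal vehicle state: the e-stop latches until a reset."""
    def __init__(self):
        self.moving = True

    def tick(self, ranges_m):
        if emergency_check(ranges_m):
            self.moving = False   # hard stop overrides all other commands
        return self.moving

bot = Robot()
print(bot.tick([1.2, 0.9, 2.0]))  # all clear -> True (still moving)
print(bot.tick([1.2, 0.3, 2.0]))  # object within 0.5 m -> False (stopped)
```

Latching the stop (rather than resuming automatically) matches the handbook-and-procedures point: a human should confirm the hazard is cleared before motion restarts.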

The vehicle's navigation, stability, and control systems were tested. The design was evaluated in a controlled environment for operational compatibility, and the drivetrain, sensors, and electronic systems were tested for reliability. Test results were compared against the design specifications to guarantee compliance. The vehicle's response was measured with speed, acceleration, and distance monitoring devices and an environmental simulator.

The team had many wins and setbacks while designing, building, and testing this wheeled robotic platform. Camera latency made it difficult to build a reliable navigation and object detection system; the team addressed this by optimizing camera data processing and integrating LiDAR sensors. Electronic components often broke or came loose during production; the researchers solved this by strengthening sensor mountings and choosing vibration-resistant components.


References:

Ahmad, A., et al. 2023. “A Review on Autonomous Delivery Robots.” In 2023 2nd International Conference on Multidisciplinary Engineering and Applied Science (ICMEAS), Abuja, Nigeria, 1-6.

Hossain, M. 2023. “Autonomous Delivery Robots: A Literature Review.” IEEE Engineering Management Review 51 (4): 77-89.

Madani, B., and M. Ndiaye. 2022. “Hybrid Truck-Drone Delivery Systems: A Systematic Literature Review.” IEEE Access 10: 92854-92878.

Lee, J., G. Park, I. Cho, K. Kang, D. Pyo, S. Cho, M. Cho, and W. Chung. 2022. “ODS-Bot: Mobile Robot Navigation for Outdoor Delivery Services.” IEEE Access 10: 107250–107258.

Nishida, K., and T. Nishi. 2022. “Dynamic Optimization of Conflict-Free Routing of Automated Guided Vehicles for Just-in-Time Delivery.” IEEE Transactions on Automation Science and Engineering 20 (3): 2099–2114.

Moshayedi, A. J., A. S. Roy, L. Liao, A. S. Khan, A. Kolahdooz, and A. Eftekhari. 2024. “Design and Development of FOODIEBOT Robot: From Simulation to Design.” IEEE Access 12.

Camisa, A., A. Testa, and G. Notarstefano. 2022. “Multi-Robot Pickup and Delivery via Distributed Resource Allocation.” IEEE Transactions on Robotics 39 (2): 1106–1118.

Gan, X., Z. Huo, and W. Li. 2023. “DP-A*: For Path Planning of UGV and Contactless Delivery.” IEEE Transactions on Intelligent Transportation Systems 25 (1): 907–919.

Ravi. 2017. “Inventory Management System Using PHP and MySQL.” International Journal of Innovative Research in Computer Science & Technology 5 (4): 87-93. https://doi.org/10.17148/IJIRCCT.2017.5412.

Patil, D., M. Rawal, S. Bandgar, M. Pathan, and J. T. Patil. 2023. “Stock Management System.” International Research Journal of Engineering and Technology (IRJET) 10 (3).


Author Information

Agus SUKOCO, August 12, 2025, 7:19 AM
