Due to limited delivery options, stock management in grocery stores, companies, and offices is logistically inefficient. Items are commonly left at security posts, forcing recipients to travel long distances for retrieval and burdening security officers with large handling volumes. These inefficiencies disrupt workflows and lower productivity, while manual inter-departmental deliveries add further delays and divert staff from their duties. Data-driven autonomous delivery robots with Human-Machine Collaboration (HMC) features apply powerful information-processing techniques to this problem. These outdoor robots navigate and detect obstacles using real-time analytics of sensor data such as LiDAR and GNSS (Ahmad et al. 2023; Hossain 2023; Madani and Ndiaye 2022). Through predictive models and decision-making algorithms, HMC features enable emergency interventions and route optimization.
Dynamic outdoor conditions challenge autonomous delivery robot navigation (Lee et al. 2022). LiDAR and GNSS sensors improve localization and safe route planning. Other studies emphasize conflict-free route optimization for Automated Guided Vehicles (AGVs) to ensure timely deliveries (Nishida and Nishi 2022). FOODIEBOT illustrates the rise of specialized autonomous robots for food and light-item delivery (Moshayedi et al. 2024). Distributed resource allocation methods for cooperative robots have performed well in large-scale pickup and delivery without central coordination (Camisa et al. 2022). On the technical side, research on land-based delivery vehicles emphasizes Dynamic Parameter A* (DP-A*) algorithms for dynamic learning in environments with changing obstacles (Gan, Huo, and Li 2023). Autonomous delivery robots advance logistics and information sciences through machine learning, data integration, and IoT-enabled communication. These technologies improve efficiency, adaptability, and collaboration, laying the groundwork for intelligent, sustainable logistics systems in modern institutions. This research aims to create an autonomous stock management delivery robot with advanced HMC features. The project focuses on data-driven systems and AI-based decision-making to improve logistics, sustainability, and user productivity.
The goal is a prototype that works reliably under varied outdoor conditions. Modern logistics solutions such as the Smart System-based Delivery Robot distribute items efficiently, safely, and sustainably in campus and office settings. A Smart System 4-Layer method was used to construct the robot:

Figure 1. Oregon State University research team developing autonomous delivery robot prototype

Figure 2. Designing an autonomous delivery robot
Through this component, the operator can monitor the robot's movements, receive status reports, and intervene manually in emergencies. The robot and control center communicate via LoRa and a web-based or mobile interface.
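As a minimal sketch of the status reports this component might exchange with the control center, the snippet below defines a hypothetical telemetry payload; the field names, states, and compact-JSON encoding are illustrative assumptions, not the article's actual LoRa message format.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical status report the robot could send to the control center.
# All field names are illustrative; the real payload format is not
# specified in the article.
@dataclass
class StatusReport:
    robot_id: str
    latitude: float
    longitude: float
    battery_pct: float
    state: str          # e.g. "delivering", "idle", "emergency"

def encode_report(report: StatusReport) -> bytes:
    """Serialize a status report to compact JSON for transmission."""
    return json.dumps(asdict(report), separators=(",", ":")).encode()

def decode_report(payload: bytes) -> StatusReport:
    """Reconstruct a StatusReport on the control-center side."""
    return StatusReport(**json.loads(payload))

report = StatusReport("robot-01", -6.2001, 106.8166, 87.5, "delivering")
assert decode_report(encode_report(report)) == report
```

A compact, self-describing payload like this keeps messages small enough for low-bandwidth links such as LoRa while remaining easy to parse in a web or mobile dashboard.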
Actuators translate control system commands into steering, acceleration, and braking.
Users send and receive items via the robot. A mobile app provides notifications, delivery status, and the robot's location.
This solution solves logistical problems and demonstrates smart technology and automation in everyday life.
Motor vehicle mechanical design begins with sketches and technical drawings to visualize the design concept. Performance, efficiency, ergonomics, and aesthetics are considered when evaluating design options. The best option is refined and optimized in CAD/CAE software to produce the final design, and technical sketches and prototype images document its development into the final product. The final design was chosen for its technical specifications, ease of production, and aesthetics.

Materials for the engine, chassis, and superstructure were chosen based on strength, durability, weight, and cost. The chassis is an aluminum alloy, which reduces weight while preserving strength. The body and interior were likewise developed with material choices in mind, and the ready-made or custom subcomponents used in this project are also listed.

The vehicle's production processes were chosen for efficiency, quality, and scalability. The key operations are cutting, shaping, assembling, and finishing vehicle parts. Plastic vehicle bodies are made via thermoforming, injection molding, and compression molding; body panels are thermoformed on a plastic-sheet forming machine. Assembly involves welding, connecting, and fitting the parts.
The vehicle frame is still made on a metal bending machine. Final steps include painting, accessory installation, and product inspection to assure quality. The chosen production processes save money and time while maintaining vehicle quality.

Figure 3. Physical feature

Figure 4. Digital Design Process
This system uses sensors and cameras to detect and track objects for the autonomous vehicle. Ultrasonic sensors measure distance, while cameras running the YOLO algorithm classify objects. A RabbitMQ message broker forwards sensor and camera detection data to the Environmental Awareness module, which projects object positions in four directions using a Cartesian coordinate transformation.
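The Cartesian projection step can be illustrated with a small sketch. The bearing convention (0° = straight ahead, positive clockwise) and the 90-degree sector boundaries are assumptions for illustration, not specifications from the article.

```python
import math

def project_object(bearing_deg: float, distance_m: float):
    """Project a detection into robot-frame Cartesian coordinates.

    bearing_deg: angle of the object relative to the robot's heading
                 (0 = straight ahead, positive = clockwise toward the right).
    Returns ((x, y), sector), where x is forward and y is to the right.
    """
    rad = math.radians(bearing_deg)
    x = distance_m * math.cos(rad)   # forward component
    y = distance_m * math.sin(rad)   # rightward component

    # Classify the object into one of four 90-degree sectors.
    b = bearing_deg % 360
    if b <= 45 or b > 315:
        sector = "front"
    elif b <= 135:
        sector = "right"
    elif b <= 225:
        sector = "rear"
    else:
        sector = "left"
    return (x, y), sector

xy, sector = project_object(90.0, 2.0)
assert sector == "right"
assert abs(xy[1] - 2.0) < 1e-9
```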
The Vehicle Control module analyses the projected object positions and trajectories to avoid collisions, rerouting the planned trajectory when necessary. The resulting control decisions steer the autonomous vehicle in real time. Key elements of this system include:
This technology improves object detection in varied environmental circumstances for autonomous driving by merging many sensors and powerful algorithms.

Figure 5. Algorithm design
The diagram shows the autonomous vehicle control algorithm design process. First, the camera captures images of the environment, and the YOLO algorithm detects objects in them. Detection data reaches the Environmental Awareness module via the RabbitMQ messaging queue. That module projects the detected objects' positions in four directions (front, left, right, and rear) to model the vehicle's surroundings, then passes the object location data to the Vehicle Control module. Vehicle Control analyses object positions and trajectories to determine the appropriate action; if an object blocks the planned trajectory, the program reroutes it to avoid a collision. The resulting control decisions are then sent to move the vehicle. This algorithm design combines YOLO object recognition, environmental comprehension, object trajectory analysis, and adaptive control decision-making to ensure safe and dependable vehicle control.
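The pipeline above can be sketched end to end. For a self-contained example, a standard-library queue stands in for the RabbitMQ broker (a real deployment would publish through a broker client such as pika), and the detection labels, bearings, and safety distance are illustrative assumptions.

```python
import queue

# Stand-in for the RabbitMQ message queue; in the real system a broker
# client would publish and consume these messages instead.
detections = queue.Queue()

def camera_node(objects):
    """Publish YOLO-style detections: (label, bearing_deg, distance_m)."""
    for obj in objects:
        detections.put(obj)

def environmental_awareness():
    """Consume detections and project each into one of four directions."""
    projected = []
    while not detections.empty():
        label, bearing, dist = detections.get()
        b = bearing % 360
        if b <= 45 or b > 315:
            sector = "front"
        elif b <= 135:
            sector = "right"
        elif b <= 225:
            sector = "rear"
        else:
            sector = "left"
        projected.append((label, sector, dist))
    return projected

def vehicle_control(projected, safety_m=1.5):
    """Reroute if any object ahead is closer than the safety distance."""
    for label, sector, dist in projected:
        if sector == "front" and dist < safety_m:
            return "reroute"
    return "continue"

camera_node([("person", 5.0, 1.0), ("tree", 180.0, 0.5)])
decision = vehicle_control(environmental_awareness())
assert decision == "reroute"
```

Decoupling the camera, awareness, and control stages through a message queue mirrors the article's architecture: each module can run at its own rate and be replaced independently.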
The team developed a software design approach for the wheeled robot platform's control, navigation, and guidance algorithms. The procedure comprises several steps:
With this systematic software design method, the wheeled robot platform should operate safely and responsively while identifying and avoiding obstacles.
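One way the detect-and-avoid behavior might look in code is a simple reactive policy over the ultrasonic readings; the distance thresholds and action names below are assumptions for illustration, not the team's actual algorithm.

```python
def avoidance_step(front_cm: float, left_cm: float, right_cm: float,
                   stop_cm: float = 30.0, slow_cm: float = 80.0) -> str:
    """One iteration of a simple reactive avoidance policy.

    Distances are ultrasonic readings in centimetres; the thresholds
    are illustrative, not values from the article.
    """
    if front_cm < stop_cm:
        # Too close to proceed: turn toward the more open side.
        return "turn_left" if left_cm > right_cm else "turn_right"
    if front_cm < slow_cm:
        return "slow"
    return "forward"

assert avoidance_step(20.0, 120.0, 40.0) == "turn_left"
assert avoidance_step(60.0, 100.0, 100.0) == "slow"
assert avoidance_step(200.0, 100.0, 100.0) == "forward"
```

In the full system, a policy like this would run inside the Vehicle Control loop, consuming the projected object positions produced by the Environmental Awareness module.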

Figure 6. External Interface
This interface provides detailed information about the mobile robot's position and environment to help it avoid obstacles. With 360-degree front, left, right, and rear camera views, the robot can monitor its surroundings in real time, and the obstacle graph depicts those surroundings so the system can avoid collisions. The robot's situational awareness and navigation in complex situations depend on this interface design: with complete and clear position data, the control system can make better decisions and reduce collisions. Industrial and rover robots that require precision navigation can also benefit from this interface.
A multi-sensor system integrating object detection, video data, and radar data helps the robot understand its environment. Cartesian projection and data fusion improve object localization accuracy; experiments show the system can detect and track objects from diverse angles and determine distances accurately. Safety must be a priority in mobile robot design and operation, and several measures were taken to meet the competition's safety standards:
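The data-fusion step mentioned above can be sketched as inverse-variance weighting, one common way to combine distance estimates from multiple sensors; the sensor variances below are assumed for illustration, not measured specifications.

```python
def fuse_distances(estimates):
    """Inverse-variance weighted fusion of distance estimates.

    estimates: list of (distance_m, variance) pairs from different
    sensors (e.g. ultrasonic, camera, radar). Lower-variance sensors
    receive more weight, and the fused variance is always smaller
    than any single sensor's variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(d * w for (d, _), w in zip(estimates, weights)) / total
    variance = 1.0 / total
    return fused, variance

# Ultrasonic says 2.0 m (low noise), camera says 2.4 m (higher noise).
fused, var = fuse_distances([(2.0, 0.01), (2.4, 0.04)])
assert 2.0 < fused < 2.4
assert var < 0.01  # fused estimate is more certain than either sensor
```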
The vehicle's navigation, stability, and control systems were tested in a controlled environment for operational compatibility. The drivetrain, sensors, and electronic systems were also tested for reliability. Test results were compared against the design specifications to verify compliance, and the vehicle's response was measured using speed, acceleration, and distance monitoring devices together with an environmental simulator.
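The comparison of measured results against design specifications might be sketched as a simple tolerance check; the metric names, targets, and tolerance below are hypothetical, not the article's actual specification.

```python
def check_spec(measured: dict, spec: dict, tol: float = 0.10) -> dict:
    """Compare measured test values against design targets.

    A metric passes if its measurement is within `tol` (relative)
    of the target; missing measurements fail. The metric names and
    targets used here are illustrative assumptions.
    """
    results = {}
    for name, target in spec.items():
        value = measured.get(name)
        ok = value is not None and abs(value - target) <= tol * target
        results[name] = ok
    return results

spec = {"top_speed_mps": 1.5, "stop_distance_m": 0.5}
measured = {"top_speed_mps": 1.45, "stop_distance_m": 0.7}
report = check_spec(measured, spec)
assert report["top_speed_mps"] is True
assert report["stop_distance_m"] is False
```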
The team experienced both successes and setbacks while designing, building, and testing the wheeled robotic platform. Camera latency made it difficult to build an effective navigation and object detection system; the team addressed this by optimizing camera data processing and integrating LiDAR sensors. Electronic components also frequently broke or came loose during production, which the team solved by strengthening the sensor mounts and choosing vibration-resistant components.
References:
Ahmad, A., et al. 2023. “A Review on Autonomous Delivery Robots.” In 2023 2nd International Conference on Multidisciplinary Engineering and Applied Science (ICMEAS), Abuja, Nigeria, 1-6.
Hossain, M. 2023. “Autonomous Delivery Robots: A Literature Review.” IEEE Engineering Management Review 51 (4): 77-89.
Madani, B., and M. Ndiaye. 2022. “Hybrid Truck-Drone Delivery Systems: A Systematic Literature Review.” IEEE Access 10: 92854-92878.
Lee, J., G. Park, I. Cho, K. Kang, D. Pyo, S. Cho, M. Cho, and W. Chung. 2022. “Ods-bot: Mobile Robot Navigation for Outdoor Delivery Services.” IEEE Access 10: 107250–107258.
Nishida, K., and T. Nishi. 2022. “Dynamic Optimization of Conflict-Free Routing of Automated Guided Vehicles for Just-in-Time Delivery.” IEEE Transactions on Automation Science and Engineering 20 (3): 2099–2114.
Moshayedi, A. J., A. S. Roy, L. Liao, A. S. Khan, A. Kolahdooz, and A. Eftekhari. 2024. “Design and Development of FOODIEBOT Robot: From Simulation to Design.” IEEE Access 12.
Camisa, A., A. Testa, and G. Notarstefano. 2022. “Multi-Robot Pickup and Delivery via Distributed Resource Allocation.” IEEE Transactions on Robotics 39 (2): 1106–1118.
Gan, X., Z. Huo, and W. Li. 2023. “Dp-a*: For Path Planning of UGV and Contactless Delivery.” IEEE Transactions on Intelligent Transportation Systems 25 (1): 907–919.
Ravi. 2017. “Inventory Management System Using PHP and MySQL.” International Journal of Innovative Research in Computer Science & Technology 5 (4): 87-93. https://doi.org/10.17148/IJIRCCT.2017.5412.
Patil, D., M. Rawal, S. Bandgar, M. Pathan, and J. T. Patil. 2023. “Stock Management System.” International Research Journal of Engineering and Technology (IRJET) 10 (3).
Stock Management & Robot
This robot both delivers items and illustrates how an intelligent system can apply digital literacy to tackle real-world problems sustainably. Figure 1 shows how the modular, scalable system can adapt to diverse outdoor operating conditions.
The proposed delivery robot can operate autonomously in outdoor settings such as campuses and offices. It can move safely, efficiently, and reliably without human involvement, although manual intervention remains available.
As shown in Figure 2, the system has several main components:
System Control and Administration
Autonomous System
Support Systems
Actuators
Interfaces with End Users
Physical feature
Figure 3 shows the wheeled robotic platform, a vehicle that can move autonomously. The boxy, compact body can maneuver easily in many conditions. Although its weight is not specified, the platform appears light and self-contained. The vehicle is simple but functional, offering good mobility for autonomous robotic applications such as inspection, surveillance, and logistics.