Lidar-based Obstacle Avoidance for the Autonomous Mobile Robot
Conference Paper · July 2019
DOI: 10.1109/ICTS.2019.8850952
12th International Conference on Information & Communication Technology and System (ICTS) 2019
Lidar-based Obstacle Avoidance for the
Autonomous Mobile Robot

Dony Hutabarat
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
donyhutabarat.18071@mhs.its.ac.id

Djoko Purwanto
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
djoko@ee.its.ac.id
Abstract- In conditions that are dangerous for humans and their environment, the use of robots can be a solution to overcome these problems. Various sensors are used to determine the obstacle-free path and the exact position of the robot. However, conventional sensors have limitations in terms of detection distance, spatial resolution, and processing complexity. In this study, an autonomous mobile robot equipped with a Light Detection and Ranging (LiDAR) sensor has been developed to avoid obstacles. The Braitenberg vehicle strategy is used to navigate the movements of the robot. Sensor data collection and the control algorithm are implemented on a Raspberry Pi 3 single-board computer. The experimental results show that this sensor measures distance consistently, unaffected by object color and ambient light intensity. The mobile robot can avoid colored objects of different sizes. This autonomous mobile robot can also navigate inside a room without any impact on the walls or the obstacles.

Keywords- Autonomous mobile robot, Braitenberg vehicle, LiDAR, Obstacle avoidance
I. INTRODUCTION
Some dangerous situations, such as chemical industries, polluted environments, and mining areas, can endanger workers and nearby residents. These can cause both financial and life losses. Therefore, the use of robots can be a solution to overcome these problems [1]. Mobile robots equipped with several sensors can investigate in a room or in an open area [2], as well as in industrial fields [3, 4]. The autonomous mobile robots that have been developed are equipped with obstacle avoidance features as one type of intelligent robot [5].
The ability to detect walls and surrounding obstacles in order to predict collision-free paths automatically is the main feature of autonomous mobile robots [6]. Various sensors such as infrared and ultrasonic range finders, cameras, and GPS are used to determine the obstacle-free path and the exact position of the mobile robot [7]. However, these conventional sensors have limitations in terms of detection distance, spatial resolution, and processing complexity. For instance, blanking intervals and angular uncertainty are limitations of ultrasonic distance sensors [8].
In this study, Light Detection and Ranging (LiDAR) technology is used as an obstacle avoidance system on an autonomous mobile robot. LiDAR has several advantages,
978-1-7281-2133-8/19/$31.00 ©2019 IEEE

Muhammad Rivai
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
muhammad_rivai@ee.its.ac.id

Harjuno Hutomo
Department of Electrical Engineering
Institut Teknologi Sepuluh Nopember
Surabaya, Indonesia
harjuno270396@gmail.com
such as a high level of precision and a long detection distance [5, 7].
II. LITERATURE REVIEW
A. Light Detection and Ranging
LiDAR is an optical scanning technology that measures the properties of radiated light to find the distance and other information of a target. Shooting pulses of laser light onto the object's surface is one method for determining the distance of an object, as shown in Figure 1. Light travels at a speed of about 300,000 kilometers per second, or 0.3 meters per nanosecond, in vacuum. The time difference between the light emitted and received back can be used to determine the distance of the object [9-11].
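The time-of-flight relation described above can be sketched in a few lines of Python (an illustrative calculation, not code from the paper); the factor of two accounts for the pulse traveling to the object and back:

```python
# Time-of-flight ranging: light travels ~0.3 m per nanosecond in vacuum,
# and the pulse covers the sensor-object distance twice (out and back).
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_ns: float) -> float:
    """Distance in meters from a round-trip time measured in nanoseconds."""
    return C * (round_trip_time_ns * 1e-9) / 2.0

# A pulse echoed after ~6.67 ns corresponds to roughly 1 m.
print(round(tof_distance(6.67), 3))  # → 1.0
```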
The standard output of a LiDAR is an arrangement of points in polar coordinates [8]. This distance-measuring technology is widely used in the fields of geodesy, archeology, geography, geology, geomorphology, seismology, and remote sensing [12]. In this study, we use the YDLiDAR X4, a LiDAR equipped with a motor so that it can rotate to provide a 360-degree view; it has a working voltage of 5 V and a measurement range from 0.12 m to 10 m indoors. Data communication uses a serial port or a USB adapter.
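Since the LiDAR reports points in polar coordinates, converting a return to Cartesian coordinates is often the first processing step; the helper below is an illustrative sketch, not part of the paper's implementation:

```python
import math

def polar_to_cartesian(distance_m: float, angle_deg: float) -> tuple:
    """Convert one LiDAR return (range, bearing) to x, y in the sensor frame."""
    theta = math.radians(angle_deg)
    return (distance_m * math.cos(theta), distance_m * math.sin(theta))

# An object 2 m away, dead ahead (0 degrees), lies at (2, 0).
x, y = polar_to_cartesian(2.0, 0.0)
```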
Fig. 1. The principles of LiDAR: a laser pulse is emitted toward the object, a photo detector receives the reflection, and the time difference is converted to distance.
Fig. 3. The differential drive kinematics model [12].
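As a numeric companion to the model in Figure 3, the standard differential-drive relations can be sketched as follows (a reconstruction under the usual kinematic assumptions; the function name is illustrative):

```python
def diff_drive_kinematics(v_r: float, v_l: float, b: float):
    """Linear speed V, angular speed w, and turn radius R for a
    differential-drive robot with wheel separation b."""
    v = (v_r + v_l) / 2.0                    # linear speed of the midpoint
    w = (v_r - v_l) / b                      # angular speed about the ICC
    r = float('inf') if w == 0 else v / w    # straight line when wheels match
    return v, w, r
```

With equal wheel speeds the turn radius is infinite (straight motion); a faster right wheel gives a finite radius and a left turn.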
C. Mobile Robot
Many kinds of sensors can be implemented on a mobile robot to determine its environmental conditions. The sensors commonly used are ultrasonic, infrared, camera, and others [15]. In this study, a LiDAR sensor is implemented on a differential drive mobile robot, which has two wheels, each driven by a DC motor, and a free wheel [16].
The difference in speed between the right and left wheels is used to control the robot in this steering type [15, 17, 18]. The speed difference results in a kinematic motion that forms a rotation angle when the mobile robot changes direction. The kinematics model for differential steering is shown in Figure 3, where ICC is the Instantaneous Center of Curvature, R is the distance from the ICC to the midpoint between the two robot wheels, B is the width of the robot, V_R is the linear speed of the right wheel, and V_L is the linear speed of the left wheel. The relationship between the linear and angular velocities is as follows:
V = ω R                                   (1)
ω = (V_R − V_L) / B                       (2)
R = (B/2) (V_R + V_L) / (V_R − V_L)       (3)

B. Raspberry Pi
Raspberry Pi is a single-board computer that can be used in various applications. The Raspberry Pi 3 works at a voltage of 5 V using a micro USB connector. This computer uses the Broadcom BCM2837 chipset processor with 1 GB of memory, as shown in Table 1. The Raspberry Pi uses an SD memory card for data storage, as shown in Figure 2.
Table 1. Raspberry Pi 3 technical specifications [13].

Microprocessor:                 Broadcom BCM2837 64-bit quad-core processor
Processor operating voltage:    3.3 V
Raw voltage input:              5 V, 2 A power source
Maximum current per I/O pin:    16 mA
Maximum total current, all I/O: 54 mA
Flash memory (operating system): 16 GB SD memory card
Internal RAM:                   1 GB DDR2
Clock frequency:                1.2 GHz
GPU:                            Dual-core VideoCore IV multimedia co-processor;
                                provides OpenGL ES 2.0, hardware-accelerated
                                OpenVG, and 1080p30 H.264 high-profile decode;
                                capable of 1 Gpixel/s, 1.5 Gtexel/s, or 24 GFLOPs
                                with texture filtering and DMA infrastructure
Ethernet:                       10/100 Ethernet
Wireless connectivity:          BCM43143 (802.11 b/g/n wireless LAN and Bluetooth 4.1)
Operating temperature:          -40 °C to +85 °C
Fig. 2. Block diagram of the Raspberry Pi 3 [14].
D. Braitenberg Vehicle
A Braitenberg vehicle is a vehicle equipped with two sensors and two motors, but with different connections between them [19]. Figure 4 shows Braitenberg vehicle types 2a and 2b. Type 2a is used for obstacle avoidance, while type 2b is used for tracking. The '+' sign indicates that a higher sensor signal increases the DC motor speed.
In the Braitenberg algorithm, a number of sensors are connected to motor settings such that the motor speed is affected by the sensor input [20]. The algorithm uses a weight matrix that converts sensor inputs into motor speeds. The matrix is a two-dimensional array whose columns and rows correspond to the number of sensors and the number of motors, respectively.
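The weighted-matrix idea can be sketched as follows; the weights and the linear slow-down rule here are illustrative assumptions, not the paper's actual values:

```python
# Braitenberg-style control: a weight matrix maps sensor readings to motor
# speeds. Each row belongs to one motor, each column to one sensor.
def braitenberg_speeds(sensors, weights, v_max, s_max):
    """Each motor slows in proportion to its weighted sensor activations."""
    speeds = []
    for motor_weights in weights:            # one row of the matrix per motor
        activation = sum(w * s / s_max for w, s in zip(motor_weights, sensors))
        speeds.append(v_max - activation)
    return speeds

# Crossed wiring (type 2b in the paper): the left sensor drives the right motor.
weights = [[0.0, 50.0],   # left motor listens to the right sensor
           [50.0, 0.0]]   # right motor listens to the left sensor
left, right = braitenberg_speeds([0.8, 0.2], weights, v_max=100.0, s_max=1.0)
```

With a strong reading on the left sensor, the right motor slows more than the left, which steers the robot.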
Fig. 4. Braitenberg vehicles 2a and 2b [16].
Equation (4) represents the weight matrix, where W_LS is the weight of sensor S used to adjust the left motor speed, and equation (5) gives the sensor values. The motor speed can be calculated as shown in equation (6), where S_max is the maximum sensor value.

(4)
(5)
(6)

Fig. 5. Block diagram of the mobile robot system.
Fig. 6. The YDLIDAR X4 with 360° scanning [21].
If the distance is less than 0.5 m at an angle between 90° and 0°, the right motor speed will gradually decrease until the distance reaches 0.15 m, at which point the motor speed is at its minimum and the robot turns right, and vice versa. The flow diagram of the obstacle avoidance system is shown in Figure 7.

III. RESEARCH METHODS
Figure 5 depicts the overall block diagram of the obstacle avoidance system. The values used as references in this system are the angle and distance coordinates of the object obtained from the LiDAR. Robot Operating System (ROS) and the YDLiDAR driver are installed on the Raspberry Pi 3 single-board computer. This study applies the Braitenberg vehicle 2b method, where the sensor on the left side of the mobile robot is used to control the right motor and vice versa. The Pulse Width Modulation (PWM) method is used to set the motor speed.
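A PWM duty cycle is typically derived from the commanded speed by simple scaling and clamping; the helper below is an illustrative sketch (the 0-100% range is an assumption about the motor driver, not a detail from the paper):

```python
def speed_to_duty_cycle(v: float, v_max: float) -> float:
    """Map a commanded wheel speed to a PWM duty cycle in percent,
    clamped to the 0-100 range a typical motor driver accepts."""
    duty = 100.0 * v / v_max
    return max(0.0, min(100.0, duty))
```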
Figure 6 shows the LiDAR, which acts as a sensor that scans object distances in a clockwise direction. In this study, only 360 data points are used, covering the range from −90° to 90° in half-degree steps. The weight values and the distance measurement results are presented in equations (7) and (8), respectively. Equations (9) and (10) are used to adjust the motor speed, where V_max is the motor speed at a PWM duty cycle of 100%, d_min = 0.15 m is the minimum distance value, and d_max = 0.5 m is the maximum distance value.

(7)
(8)

V_L = V_max − W_L [1 − (d − d_min)/(d_max − d_min)]    (9)
V_R = V_max − W_R [1 − (d − d_min)/(d_max − d_min)]    (10)
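Using the reconstruction of equations (9) and (10) with d_min = 0.15 m and d_max = 0.5 m, the per-wheel speed rule can be written as follows (an illustrative sketch, not the paper's code):

```python
D_MIN, D_MAX = 0.15, 0.5   # distance thresholds from the paper (meters)

def wheel_speed(d: float, v_max: float, weight: float) -> float:
    """The wheel slows as the obstacle distance d approaches D_MIN;
    at d >= D_MAX the weight term vanishes and the wheel runs at v_max."""
    d = max(D_MIN, min(D_MAX, d))            # clamp into the active band
    return v_max - weight * (1.0 - (d - D_MIN) / (D_MAX - D_MIN))
```

At d = 0.5 m the speed is v_max; at d = 0.15 m it reaches its minimum, v_max minus the weight, which matches the gradual slow-down described in the text.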
Fig. 7. Flowchart of the obstacle avoidance system: each side of the robot is scanned in turn, and an obstacle on one side reduces the speed of the opposite motor.
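The decision logic of the flowchart in Fig. 7 can be sketched as follows (the function name and return labels are illustrative, not from the paper):

```python
def avoidance_step(left_min: float, right_min: float, threshold: float = 0.5) -> str:
    """One pass of the obstacle-avoidance loop: an obstacle on one side
    slows the crossed (opposite) motor so the robot steers away."""
    if left_min < threshold:
        return "reduce right motor"
    if right_min < threshold:
        return "reduce left motor"
    return "full speed"
```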
Fig. 8. The mobile robot platform used in this experiment.
IV. RESULTS AND DISCUSSION
A. LiDAR measurement
The mobile robot platform used in this experiment is shown in Figure 8. The robot consists of the LiDAR as a 360° scanning sensor for obtaining the angle and distance of objects around the robot, a Raspberry Pi 3 as the data collector from the LiDAR and as the motor controller, a motor driver module, an 11.1 V Lithium Polymer (LiPo) battery pack, and a buck step-down converter module as a 5 V power supply. Table 2 shows that the minimum and maximum distance measurements of the LiDAR are 0.12 m and 10.5 m, respectively, with an average error of 0.9%. To assess the reliability of the data obtained by the LiDAR, measurements were carried out under various ambient light intensities and with different object colors. Six object colors were used in each experiment, namely red, green, blue, orange, purple, and black, at object distances of 1 m and 2 m. The results of the LiDAR measurements at light intensities of 83 lux, 15.5 lux, and 0.04 lux are shown in Table 3, Table 4, and Table 5, respectively. The results of this experiment show that the ambient light intensity and the surface color of the object do not significantly influence the LiDAR measurements.
Table 2. The LiDAR measurements.

Distance (m)   Read Distance (m)   Error (%)
0-0.11         0                   -
0.12           0.121               0.83
0.16           0.161               0.83
0.20           0.201               0.83
0.24           0.241               0.83
0.5            0.505               1
1.5            1.495               0.33
2.5            2.509               0.36
3.5            3.517               0.48
4.5            4.531               0.68
5.5            5.525               0.45
6.5            6.561               0.93
7.5            7.606               1.41
8.5            8.600               1.17
9.5            9.679               1.88
10.5           10.662              1.54
11.5           0                   -
12.5           0                   -
Average                            0.9
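The error column in Table 2 follows the usual relative-error formula; a quick check (illustrative code, not from the paper):

```python
def percent_error(read_m: float, actual_m: float) -> float:
    """Measurement error as used in Table 2: |read - actual| / actual * 100."""
    return abs(read_m - actual_m) / actual_m * 100.0

# The 0.12 m row: a reading of 0.121 m is a 0.83 % error.
err = round(percent_error(0.121, 0.12), 2)
```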
Table 3. The LiDAR measurements at a light intensity of 83 lux.

Object     Object distance: 1 m        Object distance: 2 m
Color      Measured (m)   Error (%)    Measured (m)   Error (%)
Red        0.994          0.6          2.016          0.8
Green      0.984          1.6          2.008          0.4
Blue       0.980          2            2.013          0.65
Orange     0.992          0.8          2.006          0.3
Purple     0.991          0.9          2.001          0.05
Black      1.003          0.3          2.033          1.65
Average    0.990          1.03         2.013          0.64
Table 4. The LiDAR measurements at a light intensity of 15.5 lux.

Object     Object distance: 1 m        Object distance: 2 m
Color      Measured (m)   Error (%)    Measured (m)   Error (%)
Red        0.992          0.8          2.099          0.49
Green      0.991          0.9          2.091          0.45
Blue       0.992          0.8          2.069          0.34
Orange     0.980          2            2.005          0.25
Purple     0.991          0.9          2.088          0.44
Black      1.003          0.3          2.022          1.1
Average    0.991          0.95         2.062          0.51
Table 5. The LiDAR measurements at a light intensity of 0.04 lux.

Object     Object distance: 1 m        Object distance: 2 m
Color      Measured (m)   Error (%)    Measured (m)   Error (%)
Red        0.992          0.8          2.013          0.65
Green      0.993          0.7          1.978          1.1
Blue       0.983          1.7          1.994          0.3
Orange     0.991          0.9          1.982          0.9
Purple     0.995          0.5          1.990          0.5
Black      1.002          0.2          2.018          0.9
Average    0.992          0.80         1.996          0.72
Table 6. The mobile robot avoidance of various obstacle colors.

Object Color   Object distance: 25 cm   Object distance: 50 cm
Red            No collision             No collision
Green          No collision             No collision
Blue           No collision             No collision
Orange         No collision             No collision
Purple         No collision             No collision
Black          No collision             No collision
Total (%)      0 collided / 100 avoided   0 collided / 100 avoided
Table 7. The mobile robot avoidance of various obstacle widths.

Object Width (cm)   Object distance: 25 cm   Object distance: 50 cm
5                   No collision             No collision
10                  No collision             No collision
15                  No collision             No collision
20                  No collision             No collision
25                  No collision             No collision
30                  No collision             No collision
Total (%)           0 collided / 100 avoided   0 collided / 100 avoided
B. Avoidance of the mobile robot for various obstacles
In the first experiment, avoidance was tested with obstacles of various colors at distances of 25 cm and 50 cm. Table 6 shows that the mobile robot can avoid objects of various colors with a success rate of 100%. In the second experiment, avoidance was tested with obstacles of various widths at distances of 25 cm and 50 cm. Table 7 shows that the mobile robot can avoid objects of various widths with a success rate of 100%.
The third experiment was carried out with obstacles of various colors and sizes at distances of 25 cm and 50 cm. Table 8 shows that the mobile robot can avoid colored objects of different sizes, namely a glass bottle, a jerry can, and a cardboard box. However, the robot cannot avoid transparent objects such as acrylic and a polycarbonate bottle. This can be caused by the laser pulses emitted by the LiDAR not being reflected back by the object. The overall experiment includes navigation of the autonomous mobile robot through an indoor room, both with and without obstacles. The results of the two experiments show that the mobile robot manages to navigate the room without hitting or crashing into the walls or obstacles. The movements of the mobile robot are shown in Figure 9 and Figure 10.
Table 8. The mobile robot avoidance of various obstacles.

Object                 Object distance: 25 cm   Object distance: 50 cm
Acrylic                Collision                Collision
Polycarbonate bottle   Collision                Collision
Glass bottle           No collision             No collision
Jerry can              No collision             No collision
Cardboard              No collision             No collision
Fig. 9. The mobile robot navigation in a room without obstacles.
Fig. 10. The mobile robot navigation in a room with obstacles.
V. CONCLUSION
In this study, a differential drive autonomous mobile robot equipped with LiDAR has been developed to avoid obstacles. The Braitenberg vehicle strategy is used to navigate the movements of the robot. The distance measurements and the control algorithm are implemented on a Raspberry Pi 3 single-board computer. The experimental results show that the LiDAR can measure distances between 0.12 m and 10.5 m with an error rate of 0.9%. The color of the object and the intensity of ambient light do not affect the measurements obtained from the LiDAR. The autonomous mobile robot can avoid colored objects of different sizes; however, it cannot avoid transparent objects. This autonomous mobile robot can navigate an indoor room, both with and without obstacles, without hitting the walls or the obstacles.
ACKNOWLEDGMENT
This research was carried out with financial support from the Ministry of Research, Technology and Higher Education of the Republic of Indonesia (Kemenristekdikti RI).
REFERENCES
[1] M. Vuka, E. Schaffernicht, M. Schmuker, V. H. Bennetts, F. Amigoni, A. J. Lilienthal, "Exploration and localization of a gas source with MOX gas sensors on a mobile robot - A Gaussian regression bout amplitude approach", ISOEN - ISOCS/IEEE Int. Symp. Olfaction Electron. Nose, 2017, pp. 3-5.
[2] H. Widyantara, M. Rivai, D. Purwanto, "Wind direction sensor based on thermal anemometer for olfactory mobile robot", Indonesian Journal of Electrical Engineering and Computer Science, vol. 13, no. 2, 2019, pp. 475-484.
[3] M. Rivai, Rendyansyah, D. Purwanto, "Implementation of fuzzy logic control in robot arm for searching location of gas leak", International Seminar on Intelligent Technology and Its Application, 2015, pp. 69-74.
[4] G. A. Rahardi, M. Rivai, D. Purwanto, "Implementation of hot-wire anemometer on olfactory mobile robot to localize gas source", International Conference on Information and Communications Technology, 2018, pp. 412-417.
[5] Y. Peng, D. Qu, Y. Zhong, S. Xie, J. Luo, J. Gu, "The obstacle detection and obstacle avoidance algorithm based on 2-D lidar", IEEE Int. Conf. Inf. Autom. (ICIA), in conjunction with IEEE Int. Conf. Autom. Logist., 2015, pp. 1648-1653.
[6] M. Bengel, K. Pfeiffer, B. Graf, A. Bubeck, A. Verl, "Mobile robots for offshore inspection and manipulation", IEEE/RSJ Int. Conf. Intell. Robot. Syst., 2009, pp. 3317-3322.
[7] K. H. Hsia, H. Guo, K. L. Su, "Motion guidance of mobile robot using laser range finder", iFUZZY - Int. Conf. Fuzzy Theory Its Appl., 2013, pp. 293-298.
[8] N. A. Rahim, M. A. Markom, A. H. Adom, S. A. A. Shukor, A. Y. M. Shakaff, E. S. M. M. Tan, "A mapping mobile robot using RP Lidar scanner", 2016, pp. 87-92.
[9] M. D. Adams, "Lidar design, use, and calibration concepts for correct environmental detection", IEEE Trans. Robot. Autom., vol. 16, no. 6, 2000, pp. 753-761.
[10] M. M. Atia, S. Liu, H. Nematallah, T. B. Karamat, A. Noureldin, "Integrated indoor navigation system for ground vehicles with automatic 3-D alignment and position initialization", IEEE Trans. Veh. Technol., vol. 64, no. 4, 2015, pp. 1279-1292.
[11] J. Liu, Q. Sun, Z. Fan, "TOF Lidar Development in Autonomous Vehicle", IEEE 3rd Optoelectron. Glob. Conf., 2018, pp. 185-190.
[12] J. Zhang, S. Singh, "Visual-lidar odometry and mapping: Low-drift, robust, and fast", Proc. IEEE Int. Conf. Robot. Autom., 2015, pp. 2174-2181.
[13] "Raspberry Pi 3 Pinout, Features, Specifications & Datasheet." [Online]. Available: https://components101.com/microcontrollers/raspberry-pi-3-pinout-features-datasheet. [Accessed: 03-Jul-2019].
[14] "pi3-block-diagram-rev4.png (1259x900)." [Online]. Available: http://doc.xdevs.com/doc/RPi/pi3-block-diagram-rev4.png. [Accessed: 03-Jul-2019].
[15] M. S. Saidonr, H. Desa, M. N. Rudzuan, "A differential steering control with proportional controller for an autonomous mobile robot", Proc. IEEE 7th Int. Colloq. Signal Process. Its Appl. (CSPA), 2011, pp. 90-94.
[16] J. Singh, P. S. Chouhan, "A new approach for line following robot using radius of path curvature and differential drive kinematics", 6th Int. Conf. Comput. Appl. Electr. Eng. - Recent Adv. (CERA), 2017, pp. 497-502.
[17] R. Watiasih, M. Rivai, R. A. Wibowo, O. Penangsang, "Path planning mobile robot using waypoint for gas level mapping", International Seminar on Intelligent Technology and Its Application, 2017, pp. 244-249.
[18] R. Watiasih, M. Rivai, O. Penangsang, F. Budiman, Tukadi, Y. Izza, "Online Gas Mapping in Outdoor Environment using Solar Powered Mobile Robot", International Conference on Computer Engineering, Network and Intelligent Multimedia, 2018, pp. 245-250.
[19] X. Yang, R. V. Patel, M. Moallem, "A fuzzy-braitenberg navigation strategy for differential drive mobile robots", J. Intell. Robot. Syst. Theory Appl., vol. 47, no. 2, 2006, pp. 101-124.
[20] E. Fabregas, G. Farias, E. Peralta, H. Vargas, S. Dormido, "Teaching control in mobile robotics with V-REP and a Khepera IV library", IEEE Conf. Control Appl. (CCA), 2016, pp. 821-826.
[21] "YDLIDAR X4 DATASHEET," 2015.