EO-36: Paper accepted after refereeing

10º Congreso Nacional de Ingeniería Electromecánica y de Sistemas, November 26-30, 2007, México, D.F.

Developing a dynamic optical 3D measurement system for measurement of respiratory patterns

M. Bandala1, M. J. Joyce1
1 Department of Engineering, Lancaster University, Lancaster LA1 4YR, UK
Tel: (44) 1524 593326; Fax: (44) 1524 5381707; E-mail: [email protected]

Abstract –– In this paper we present a scanning system capable of acquiring the shape of 3D objects. A line of light is projected onto the physical object. The image from a video camera is analysed to estimate the position of the surface profile, so that the shape and volume of the object can be derived. The system scans objects at reasonable speeds and creates good-resolution images of the scanned object. The technique of laser triangulation is implemented with a high-resolution camera, a laser diode, and electronics incorporated into a small sensor package that rotates from a fixed position. The resultant data are transmitted to a PC. This 3D system can be used as an auxiliary tool for radiotherapy gating methods, which are typically based on one-dimensional measurements.

Keywords –– Surface scanning, optical measurement, laser triangulation.

I. INTRODUCTION

In order to optimize external-beam conformal radiotherapy, patient movement during treatment must be taken into account. For treatment of the upper torso, the target organs are known to move substantially due to patient respiration [1]. When chest motion is present during a radiotherapy procedure, physicians usually require a method to monitor the breathing patterns in order to deliver radiation more accurately to the moving targets. Many of the available techniques use surrogate breathing signals taken from patients by systems based on sensors such as thermocouples, thermistors or strain gauges [2]. Another common technique is the combination of infrared-sensitive cameras with reflective markers mounted on the abdomen of the patient, as well as audio or visual prompting methods that instruct patients to breathe in and out at periodic intervals in order to deduce the patients' own breathing patterns [3]-[5]. Some of these methods can be complex, time consuming and very expensive. Most importantly, some patients find them difficult to tolerate and uncomfortable. For this reason, we are proposing the development of a different system that could be used to track and predict organ motion using a non-invasive technique [6].

II. SCANNING PRINCIPLE

The laser beam is aimed at the object of interest and a camera grabs an image. The software analyses the image and finds the laser line in order to estimate the 3D position of all the points illuminated by the laser. Some thresholding has to be done to eliminate any remaining noise in the picture. The enhanced image is used to triangulate the points, or pixels, in space. Finally, a computer program meshes the points and creates a 3D view of the object of interest. Figure 1 shows the basic configuration of the system; a minimal sketch of the laser-line extraction step follows the figure.

Figure 1. Motion tracking system basic setup (line laser, IEEE1394 camera, RS232 interface board and PC, with the person to be scanned in front of the sensor).
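The laser-line search just described (threshold the image, then locate the illuminated pixel on every row) can be sketched as follows. This is a minimal Python illustration rather than the authors' LabVIEW implementation; the grayscale input and the threshold value are assumptions.

```python
import numpy as np

def extract_laser_line(gray_image, threshold=200):
    """Return, for every image row, the column of the laser stripe (or None).

    gray_image: 2D numpy array of pixel intensities (0-255).
    threshold: minimum intensity treated as laser light; the value is illustrative.
    """
    rows, _ = gray_image.shape
    line = []
    for n in range(rows):
        row = gray_image[n].astype(float)
        row[row < threshold] = 0.0          # simple thresholding to suppress noise
        if row.sum() == 0.0:
            line.append(None)               # no laser light found on this row
            continue
        # intensity-weighted centroid gives a sub-pixel estimate of the stripe column
        m = float(np.average(np.arange(row.size), weights=row))
        line.append(m)
    return line
```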

The main setup of the system is shown in figure 2. As can be seen, the classic triangulation method is deployed here. For simplicity, let us derive the equations for the tracking of a single point in space, where C is the matrix that represents the camera pixel direction vectors, d is the position of the laser with respect to the camera, L1 is laser beam vector 1 and L2 is laser beam vector 2. The vector C extends from the camera to the point of interest, so its components give that point's 3D position:

\mathbf{C} = \mathbf{d} + \mathbf{L}_1 + \mathbf{L}_2    (1)


Figure 2. Motion tracking system basic setup.

Note that C is modelled in matrix form because it represents all the vectors associated with the camera pixels that make up the camera image. This is better explained by figures 3 and 4. Of course, the triangulation can be solved by simple algebraic equations; however, all the components of the C vectors must be known, and those components depend on the position of the associated pixel.

Figure 3. Camera pixel vectors representation.

Figure 4. Position of the camera pixel vectors in relation to the X-Z plane.

III. OPTICAL CONSIDERATIONS

During this study we found that the camera angle of view is crucial for finding the components of the C vectors. The horizontal and vertical angles of view Φ and φ, which are the angles formed by the two lines from the secondary principal point to the image sensor (figure 5), were obtained by applying the formulae given below:

\Phi = 2 \arctan\left(\frac{D_1}{2f}\right)    (2)

\varphi = 2 \arctan\left(\frac{D_2}{2f}\right)    (3)

where D1 and D2 are the actual horizontal and vertical sizes (in micrometers) of the image sensor and f is the focal length. In some cases D1 ≠ D2, and therefore Φ ≠ φ. This calculation had to be done because the angles of interest Φ and φ are not a standard camera attribute provided by manufacturers (a numeric illustration of (2) and (3) is given after figure 7).

Figure 5. Principle used to obtain Φ and φ.

If (1) is expressed in vector form, it is still necessary to find the component angles of the vectors C, L1 and L2:

\mathbf{C} = \begin{bmatrix} x \\ y \\ z \end{bmatrix} = d \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + \mathbf{L}_1 + \mathbf{L}_2    (4)

Figure 6 is the representation of the coordinate system associated with the laser. The laser vectors L1 and L2 can then be expressed in terms of the known angle θ. So

\mathbf{L}_1 = |\mathbf{L}_1| \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}    (5)

\mathbf{L}_2 = |\mathbf{L}_2| \begin{bmatrix} \cos\theta \\ \sin\theta \\ 0 \end{bmatrix}    (6)

Figure 6. Laser coordinate system.

Figure 7 shows the relationship between the vector component angles α, β and γ and their projection angles θ1, θ2 and θ3.

Figure 7. Coordinate system, point P(x, y, z) and angles.
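As a quick numeric illustration of equations (2) and (3), the short Python sketch below computes Φ and φ for an assumed sensor; the sensor dimensions and focal length are hypothetical example values, not taken from the paper.

```python
import math

def angles_of_view(d1_um, d2_um, f_um):
    """Horizontal and vertical angles of view from sensor size and focal length (eqs. 2-3)."""
    phi_h = 2.0 * math.atan(d1_um / (2.0 * f_um))   # horizontal angle of view (rad)
    phi_v = 2.0 * math.atan(d2_um / (2.0 * f_um))   # vertical angle of view (rad)
    return math.degrees(phi_h), math.degrees(phi_v)

# Hypothetical 1/3" sensor (4800 um x 3600 um) with a 6 mm lens:
print(angles_of_view(4800.0, 3600.0, 6000.0))       # roughly (43.6, 33.4) degrees
```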


Finding the direction vectors from an image is done by scanning every horizontal pixel line in the image and searching for the pixel with the laser projection on it. So, for every image z-position, an associated x-position is sought. For a given vector C_{m,n}, the projection angles can be found from (7), (8) and (9):

\theta_1 = \Phi \left(\frac{m}{T_c} - \frac{1}{2}\right)    (7)

\theta_2 = \varphi \left(\frac{n}{T_r} - \frac{1}{2}\right)    (8)

\theta_3 = \arctan\left(\frac{m}{n}\right)    (9)

where
Φ - camera's widest horizontal angle of view,
φ - camera's widest vertical angle of view,
m - column associated with the x-axis,
Tc - total number of pixel columns in the image,
n - row associated with the z-axis,
Tr - total number of pixel rows in the image.

Once the projection angles are known, the component angles are found with (10), (11) and (12), which express the tangents of the component angles α, β and γ in terms of cosecants and cotangents of the projection angles θ1, θ2 and θ3 (see figure 7).

Expression (1) can now be represented in a form that can be solved:

|\mathbf{C}| \begin{bmatrix} \cos\alpha \\ \cos\beta \\ \cos\gamma \end{bmatrix} = \begin{bmatrix} d \\ 0 \\ 0 \end{bmatrix} + |\mathbf{L}_1| \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} + |\mathbf{L}_2| \begin{bmatrix} \cos\theta \\ \sin\theta \\ 0 \end{bmatrix}    (13)

So

|\mathbf{C}| = \frac{d \sin\theta}{\cos\alpha \sin\theta - \cos\beta \cos\theta}    (14)

Note that θ here is not any of the projection angles θ1, θ2 or θ3 but the angle of the laser beam (see figure 6). The practical implementation of a prototype is shown in figure 8 and the final implementation in figure 9. A short numeric sketch of (14) is given after figure 8.

Figure 8. System prototype.
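The closed-form range in (14) is easy to evaluate numerically. The sketch below is a minimal Python illustration, not the authors' code; the baseline d, beam angle θ and direction cosines are made-up example values.

```python
import math

def point_from_direction(d, theta_deg, cos_a, cos_b, cos_g):
    """Solve eqs. (13)/(14) for a single laser-illuminated point.

    d         -- camera-to-laser baseline (same length unit as the result)
    theta_deg -- laser beam angle theta in degrees (see figure 6)
    cos_a/b/g -- direction cosines of the camera ray C for the illuminated pixel
    """
    theta = math.radians(theta_deg)
    # eq. (14): magnitude of C along the camera ray
    mag_c = d * math.sin(theta) / (cos_a * math.sin(theta) - cos_b * math.cos(theta))
    # the 3D point is the ray direction scaled by that magnitude
    return (mag_c * cos_a, mag_c * cos_b, mag_c * cos_g)

# Made-up example: 300 mm baseline, 60 degree beam angle, an arbitrary unit ray
print(point_from_direction(300.0, 60.0, 0.80, 0.25,
                           math.sqrt(1 - 0.80**2 - 0.25**2)))
```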


Figure 9. Lancaster University 3D Scanner.

IV. ALGORITHM

Finding the direction vectors is done by scanning every horizontal line in the image and looking for the laser projection. That is, for every image line (row n) an associated illuminated column m is required. The camera vector C_{m,n} is found by using (14). Since d, Φ, φ, Tc and Tr are known variables, for a position (m, n) the angles θ1, θ2 and θ3 are needed to obtain α, β and γ. The algorithm that fulfils this task is illustrated in figure 10; a compact sketch of the same per-image loop is given below.
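This Python sketch of the loop in figure 10 assumes the extract_laser_line and point_from_direction helpers from the earlier sketches are available, uses a simple pinhole mapping in place of the full chain (7)-(12) to obtain the direction cosines, and takes placeholder camera and geometry parameters rather than values from the paper.

```python
import math

def pixel_to_direction_cosines(m, n, tc, tr, fov_h_deg, fov_v_deg):
    """Approximate direction cosines of the camera ray through pixel (m, n).

    Columns map to the x-axis, rows to the z-axis and the camera looks along +y;
    the projection angles follow the linear mappings of eqs. (7) and (8).
    """
    th1 = math.radians(fov_h_deg) * (m / tc - 0.5)   # horizontal angle, eq. (7)
    th2 = math.radians(fov_v_deg) * (n / tr - 0.5)   # vertical angle, eq. (8)
    v = (math.tan(th1), 1.0, math.tan(th2))
    norm = math.sqrt(sum(c * c for c in v))
    return tuple(c / norm for c in v)                # (cos_a, cos_b, cos_g)

def scan_frame(gray_image, d, theta_deg, tc, tr, fov_h_deg, fov_v_deg):
    """One laser position: laser stripe -> one 3D point per image row (figure 10)."""
    points = []
    for n, m in enumerate(extract_laser_line(gray_image)):   # stripe column per row
        if m is None:
            continue                                          # row without laser light
        cos_a, cos_b, cos_g = pixel_to_direction_cosines(m, n, tc, tr,
                                                         fov_h_deg, fov_v_deg)
        points.append(point_from_direction(d, theta_deg, cos_a, cos_b, cos_g))
    return points
```

Cycling this over successive θ positions and concatenating the returned points corresponds to the meshing step described in section V.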

Figure 10. Algorithm to obtain C.

V. RESULTS

Figure 11 shows how the laser line is projected over a dummy face. The camera acquires single images and the software finds the vector positions for the laser-illuminated pixels. A program based on LabVIEW™ generates the x, y and z values and traces them in a 3D graph. It is possible to acquire and trace an entire 3D object by changing the position of the laser beam. This process has to be repeated in order to mesh many single scanned lines and draw a complete 3D shape.

Figure 11. 3D scanning of a dummy face.

The calibration of the system was performed with the method proposed by [7]. Figure 12 shows the scan of a box marked with squares at a known distance from each other. The accuracy of the system was assessed by comparing the actual square distances with the ones measured by the scanner (a sketch of this check follows figure 12).

Figure 12. Top: calibration box. Bottom: scan of the square-marked box during the calibration process.
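The accuracy check described above amounts to comparing the known square spacings on the calibration box with the spacings recovered from the scan. A minimal sketch, with hypothetical measured values (the paper does not list the individual distances):

```python
def calibration_error(known_mm, measured_mm):
    """Mean absolute error and worst-case error between known and scanned spacings."""
    errors = [abs(k - s) for k, s in zip(known_mm, measured_mm)]
    return sum(errors) / len(errors), max(errors)

# Hypothetical square spacings on the calibration box (mm) and the scanned estimates
known    = [20.0, 20.0, 20.0, 20.0]
measured = [19.6, 20.3, 20.1, 19.8]
print(calibration_error(known, measured))   # (mean error, max error) in mm
```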


Solid static bodies, such as a dummy head, are easily scanned with good accuracy; the scanning speed to obtain the image in figure 11 was approximately 4.5 seconds. A method for dynamic analysis of chest wall motion does not require the level of detail achieved here; therefore the resolution and the θ angle stepping can be modified so that a section of interest of the human chest can be scanned much faster, as the rough estimate below illustrates.
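As an illustrative estimate of the speed-up: the 4.5 s figure is from the paper, but the number of laser positions per scan and the reduced count for a chest band are assumptions.

```python
# Illustrative arithmetic only: assume the 4.5 s full scan used about 90 laser
# positions (hypothetical), i.e. roughly 50 ms per position including processing.
full_scan_s = 4.5
full_positions = 90                      # assumed number of theta steps for the face scan
per_position_s = full_scan_s / full_positions

chest_positions = 20                     # coarser stepping over a chest band (assumed)
print(chest_positions * per_position_s)  # about 1.0 s per chest sweep
```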



On this basis a statistical model can be constructed, incorporating predictive variables and derived constants, that can explain the volume changes when scanning a breathing chest. It is worth mentioning that the faster the respiratory cycle, the more difficult the scanning process becomes and the lower the accuracy of predictions based on previous data.



VI. CONCLUSION

A 3D method to monitor respiratory motion is presented as an alternative to the current 1D methods used in gated radiotherapy. This optical system has considerable potential for rapid and accurate assessment of chest wall movements during gated radiotherapy, for example for the prediction of the position of internal organs based on external measurements. It allows complex movements to be followed on a within-breath basis, which could be related to muscle activity and respiratory pressures, and gives a more detailed view of events in the respiratory cycle.

VII. ACKNOWLEDGMENT

We acknowledge the support of the Mexican National Council for Scientific and Technological Development and the Faculty of Science and Technology at Lancaster University.

VIII. REFERENCES

[1] Murphy, M.J., Tracking moving organs in real time. Seminars in Radiation Oncology, 2004. 14(1): p. 91.
[2] Kubo, H.D. and B.C. Hill, Respiration gated radiotherapy treatment: a technical study. Physics in Medicine and Biology, 1996. 41: p. 83.
[3] Shimizu, S., et al., Detection of lung tumor movement in real-time tumor-tracking radiotherapy. International Journal of Radiation Oncology*Biology*Physics, 2001. 51(2): p. 304.
[4] Serago, C.F., et al., Initial experience with ultrasound localization for positioning prostate cancer patients for external beam radiotherapy. International Journal of Radiation Oncology*Biology*Physics, 2002. 53(5): p. 1130.
[5] Seiler, P.G., et al., A novel tracking technique for the continuous precise measurement of tumour positions in conformal radiotherapy. Physics in Medicine and Biology, 2000. 45: p. 103.
[6] Berson, A.M., et al., Clinical experience using respiratory gated radiation therapy: comparison of free-breathing and breath-hold techniques. International Journal of Radiation Oncology*Biology*Physics, 2004. 60(2): p. 419.
[7] Drummond, G.B. and N.D. Duffy, A video-based optical system for rapid measurements of chest wall movement. Physiological Measurement, 2001. 22: p. 489-503.

BIOGRAPHIES

Manuel Bandala received his B.Eng. (Hons) in Electronics Engineering from the Instituto Tecnológico de Puebla in 2001. He is currently a PhD candidate at Lancaster University, supported by the Mexican National Council for Scientific and Technological Development. His research interests include 3D laser scanning, body signal monitoring, wireless inertial navigation systems and microelectronics design.

Malcolm J. Joyce received his B.Sc. (Hons) in Physics and his PhD in Nuclear Physics from the University of Liverpool, UK, in 1990 and 1993, respectively. He is currently Senior Lecturer in the Department of Engineering at Lancaster University, UK. His research interests include medical radiotherapy, neutron and gamma-ray spectrometry, and nuclear instrumentation. He is a Chartered Member of the Institute of Physics and the Institution of Nuclear Engineers in the UK.
