Follicular Unit Classification Method Using Angle Variation of Boundary Vector for Automatic Hair Implant System

Hwi Gang Kim, Tae Wuk Bae, Kyu Hyung Kim, Hyung Soo Lee, and Soo In Lee

This paper presents a novel follicular unit (FU) classification method based on the angle variation of a boundary vector according to the number of hairs in FU images. The recently developed robotic FU harvesting system, ARTAS, uses digital imaging to classify the FU type based on the number of hairs, using defects in the contour and outline profile of the FU of interest.

However, this method has a drawback in that the FU classification is inaccurate because it causes unintended defects in the outline profile of the FU. To overcome this drawback, the proposed method classifies the FU's type by the number of variation points calculated using the angle variation of a boundary vector. The experimental results show that the proposed method is robust and accurate for various FU shapes compared with the contour-outline profile FU classification method of the ARTAS system.

Keywords: Hair transplantation, follicular unit (FU), classification, FU harvest, angle variation.

Manuscript received Aug. 18, 2014; revised June 9, 2015; accepted July 22, 2015.

This work was supported by ETRI R&D Program (16ZC3110, Local-based medical device/robot development & medical IT convergence for small and medium enterprise revitalization project) funded by the Government of Korea.

Hwi Gang Kim (hwigangkim@etri.re.kr), Tae Wuk Bae (corresponding author, twbae@etri.re.kr), Kyu Hyung Kim (jaykim@etri.re.kr), and Hyung Soo Lee (hsulee@etri.re.kr) are with the IT Convergence Research Laboratory, ETRI, Daegu, Rep. of Korea.

Soo In Lee (silee@etri.re.kr) is with the Broadcasting & Telecommunications Media Research Laboratory, ETRI, Daejeon, Rep. of Korea.

I. Introduction

Hair transplantation is a surgical technique that moves individual hair follicles from one part of the body, called the donor site, to a bald or balding part of the body, known as the recipient site [1], [2]. In this minimally invasive procedure, grafts containing hair follicles are transplanted to a recipient site.

Generally, grafts contain almost all of the epidermis and dermis surrounding the hair follicle, and many tiny grafts, rather than a single strip of skin, are transplanted [3], [4]. Because hair grows naturally in groupings of one to five hairs (groupings of 4–5 hairs are much less common than groupings of 1–3 hairs), most advanced techniques harvest and transplant the more naturally occurring follicular units (FUs) of 1–3 hairs (see Fig. 1).

Generally, this type of hair transplant procedure is called follicular unit transplantation (FUT) [4]. The donor hair can be harvested in two different ways: strip harvesting and follicular unit extraction (FUE).

Fig. 1. FU types: 1 hair, 2 hairs, 3 hairs.

Using strip harvesting, a surgeon harvests a strip of skin from the posterior scalp within an area of good hair growth [1], [2].

The excised strip is about 1 cm × 15 cm to 1.5 cm × 30 cm in size. While closing the resulting wound, assistants begin to dissect individual FU grafts, which are small, naturally formed groupings of hair follicles, from the strip. Working with binocular stereo-microscopes, they carefully remove excess fibrous and fatty tissue while trying to avoid damage to the follicular cells that will be used for grafting.

Through FUE harvesting, individual FUs containing one to three hairs are removed under local anesthesia; this micro removal typically uses tiny punches between 0.6 mm and 1.0 mm in diameter, as shown in Fig. 2(a) [5]–[8]. The punch size is selected based on the FU density; the greater the density, the smaller the required punch size. If possible, the smallest punch is chosen to minimize trauma to the scalp [9]. The surgeon then uses very small micro blades or fine needles to puncture the sites that receive the grafts, placing them at a predetermined density and pattern and angling the wounds consistently to promote a realistic hair pattern. Technicians generally conduct the final part of the procedure, inserting the individual grafts in place. Although FUE is considered more time consuming (depending on the operator's skill) and places restrictions on patient candidacy, it has many advantages, including a small donor wound, less pain, and a slender graft without extra surrounding tissue. After the appropriate number of units has been removed, technicians separate the grafts into units of one to three hairs, and these grafts are implanted much the same way as in the “strip”

method [10]. An improved FUE procedure using a motor-driven handpiece (that is, the powered FUE (P-FUE) method) was introduced in [11].

The newest and most advanced method of hair transplantation is called NeoGraft, which is an automated FUE and implantation system [9], [12]. Through the NeoGraft hair transplant medical system, the difficulties associated with a manual FUE procedure have been eliminated. An FUE hair transplant using NeoGraft can be conducted within the same amount of time as a traditional strip hair transplant, reducing the cost of a long manual FUE transplant. In addition, to reduce the time required for the hair transplant, NeoGraft uses pneumatic pressure to extract the follicles and precisely implant them. This eliminates the need for touching the follicles with forceps during harvesting, leaving them in a more robust state.

The ARTAS robotic system has recently been developed to automatically harvest FUs using imaging technology and robotics [13]–[21]. The ARTAS system automates the traditional FUE harvesting method, as shown in Fig. 2(b). Its features include high-resolution digital imaging, micron-level targeting accuracy using image-guided robotic alignment, and precision beyond that of manual techniques; its minimally invasive dissection delivers healthy, intact grafts with nearly undetectable harvest sites [17]. Through high-resolution digital imaging, it visualizes the surface of the scalp in three dimensions with micron-level precision; the stereo digital imaging maps the coordinates, angle orientation, and direction of each FU and determines the density and distribution of the FUs [11]. Its algorithms set the alignment and depth for individual FU harvesting.

Fig. 2. Harvesting device: (a) FUE and (b) ARTAS.

As previously mentioned, hair transplantation techniques have been researched and advanced through developments in harvesting devices [11], [18]–[20]. In particular, the recently developed FU harvesting system, the ARTAS robotic system, uses digital image processing techniques to obtain FU information (type, angle orientation, and direction) for the donor area and automatically selects and harvests the FU grafts without human intervention [14].

Among the related image processing techniques, the FU classification technique is very important for future automatic FU harvesting and transplantation systems because punch blades of different sizes, chosen according to the FU type, should be used to minimize bleeding during automatic FU harvesting [11], [19], [20]. Furthermore, for future automatic FU transplantation using the FUE method, the exact number of hairs and the hair density of the recipient area must be known for precise hair transplantation [21]–[23]. For these reasons, this paper presents a novel FU classification method using boundary vector variations to improve upon the contour-based FU classification of the ARTAS system.

II. Related Work

The ARTAS system includes a method for classifying FUs using digital imaging and processing techniques for use in hair transplantation procedures [15], [16]. The method for classifying FUs based on the number of hairs in an FU of interest is as follows: (a) acquire an image of the body surface containing the FU of interest; (b) process the image to calculate the contour of the FU and an outline profile that disregards the concavities in the contour; and (c) determine the number of defects in the outline profile to determine the number of hairs in the FU.

Fig. 3. Information on FU in ARTAS: (a) contour and (b) outline profile.

From a segmented image of an FU through binarization, a contour around the outer perimeter of the hairs of an FU may be calculated as shown in Fig. 3(a). For example, for an F1 (an FU with one hair), a contour will generally be a line or surface following the outer surface of a single hair. For relatively straight hair, the contour will look like a rectangle. For an F2 (an FU with two hairs), the hairs typically form a “V” shape such that the contour looks like a block lettered “V.” The segmented image also allows the calculation of an outline profile of the FU, as shown in Fig. 3(b). The outline profile disregards concavities in the contour of the image. For instance, for an F2, there is a concavity or “inwardly curved” portion in the contour formed by the descent in the contour from one side of the top of the “V” to the vertex of the “V” and back up to the other side of the top of the “V.” The calculated profile disregards this concavity such that the resulting outline profile looks like a triangle with one of the vertices of the triangle generally tracing the vertex of the “V” of the contour of the FU.

The outline profile is then compared to the contour to determine the number of “defects,” as shown in Fig. 3(b), in the outline profile. A defect in the outline profile may be defined, for example, as each of the concavities in the outline profile that divert from the contour. In the F2 example, there is one defect in the outline profile represented by the concavity formed by the “V” shape. In an F3 (an FU with three hairs), the contour will generally be shaped like two Vs sharing a common vertex, and with one line forming on one side of both Vs. The outline profile of an F3 will also have a generally triangular shape (although it may be a wider triangle than that found in an F2). Thus, an F3 will have two defects.

Therefore, it can be seen that the number of defects has a direct relationship with the type of FU (TOF). In this case, the number of hairs for the FU equals the number of defects plus one. In one embodiment of the method for classifying FUs, the outline profile may be determined by calculating a convex hull contour pursuant to well-known image processing techniques.
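As a rough illustration of this related-work pipeline (not the ARTAS implementation itself), the outline profile and its defects can be computed with a convex hull and convexity-defect analysis in a standard image processing library. The sketch below assumes a hypothetical binary FU mask file, fu_mask.png, and an arbitrary minimum defect depth; both are illustrative choices, not values from the paper.

```python
# Sketch of defect-based FU classification via convex hull (hairs = defects + 1).
import cv2

# Hypothetical binary mask of a segmented FU (white FU on black background).
mask = cv2.imread("fu_mask.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)

# Contour around the outer perimeter of the hairs of the FU.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cnt = max(contours, key=cv2.contourArea)

# Outline profile: the convex hull disregards concavities in the contour.
hull = cv2.convexHull(cnt, returnPoints=False)
defects = cv2.convexityDefects(cnt, hull)

# Count only concavities deeper than an assumed threshold, so that shallow
# ripples in the contour are not treated as defects.
MIN_DEPTH = 10.0  # pixels; an illustrative tuning parameter
n_defects = 0
if defects is not None:
    for start, end, far, depth_fp in defects[:, 0]:
        if depth_fp / 256.0 > MIN_DEPTH:  # depth is stored as fixed point
            n_defects += 1

print("estimated number of hairs:", n_defects + 1)
```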

Fig. 4. Drawback of contour-outline profile method: (a) input FU image, (b) contour, (c) outline profile, and (d) unintended defects.

The aforementioned contour-outline profile method of the ARTAS system for classifying FUs has a drawback in that it is sensitive to the shape of the FUs. Figure 4 shows that the contour-outline profile method may cause unintended defects in addition to the expected defect in the outline profile of an F2.

When the body of an FU is long and irregular, it may be more difficult to classify its type based on the number of defects owing to the existence of unintended defects. To overcome this drawback, we present a novel FU classification method using the angle variation of the boundary vectors in a segmented image of an FU.

III. Proposed FU Classification Method

The proposed FU classification method, which uses the angle variation of boundary vectors in a segmented image of an FU, is shown in Fig. 5. The method consists of six steps: (a) image binarization and segmentation; (b) FU boundary tracing; (c) angle estimation of the boundary vector direction; (d) Gaussian smoothing for noise reduction; (e) difference of the smoothed angles; and (f) extraction of the boundary vector variation. These steps can also be categorized into two major steps: FU boundary detection and extraction of the angle variation of the boundary vector.

Fig. 5. Proposed FU classification method (flowchart: input FU image → image binarization and segmentation → FU boundary trace → angle estimation of boundary vector direction → Gaussian smoothing → difference of smoothed angles → extraction of boundary vector variation → FU type).

1. FU Boundary Detection

A. Image Binarization and Segmentation

For an input FU image, as shown in Fig. 6(a), only the FU region is segmented through binarization and complementation of the input image, as shown in Fig. 6(b). Generally, FU images consist of a follicle area at the hair root from the human scalp and a hair area leading off from the hair root. In addition, we can classify an FU according to the number of hairs, such as in a one-, two-, or three-hair FU (F1, F2, and F3, respectively). In this paper, the follicle and hair are called the “head” and “tail,” respectively, as shown in Fig. 6(b).

Fig. 6. Segmentation of FU image: (a) input FU image and (b) segmented image showing the head and tail of the FU.
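A minimal sketch of this step, assuming a grayscale FU image (hypothetical file fu_image.png) in which the hair and follicle are darker than the surrounding scalp; Otsu thresholding is used here as a stand-in for whatever binarization rule the authors applied.

```python
# Sketch of step (a): binarization and complementation of the input FU image.
import cv2

img = cv2.imread("fu_image.png", cv2.IMREAD_GRAYSCALE)

# Global Otsu threshold, then complement so the FU region becomes foreground.
_, bw = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
fu_mask = cv2.bitwise_not(bw)  # segmented FU region (head and tail in white)
```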

B. FU Boundary Trace

For the segmented FU image, the boundary pixels of the FU region can be detected using a boundary trace filter [24]. The filter structure for detecting the boundary pixels of an FU region is shown in Fig. 7. Using the boundary trace filter, the boundary of an FU is detected, as shown in Fig. 8(a). In the detected FU boundary, a boundary pixel and its neighboring pixel can be regarded as a boundary vector of the FU boundary with a given length and angle. Such a boundary vector is represented by a column vector, which identifies the row and column coordinates of the point on the FU boundary. The starting point for detecting the boundary vectors of an FU is the left- and upper-most object pixel, with the trace taken in a clockwise direction, as shown in Fig. 8(b).

Fig. 7. Boundary trace filter (8-connected neighborhood: NW, N, NE, W, E, SW, S, SE; boundary vector angle range −π to π).

Fig. 8. Example of FU boundary trace: (a) FU boundary trace and (b) its starting point.
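The paper performs this step with MATLAB's bwtraceboundary [24]. As a sketch of the same idea in Python, OpenCV's contour extraction also returns the boundary pixels of an object as an ordered sequence, although its starting point and trace-direction conventions differ from those described above.

```python
# Sketch of the FU boundary trace: ordered boundary pixels of the FU region.
import cv2
import numpy as np

def trace_fu_boundary(fu_mask: np.ndarray) -> np.ndarray:
    """Return the ordered (x, y) boundary pixels of the largest object."""
    contours, _ = cv2.findContours(fu_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)  # keep every boundary pixel
    cnt = max(contours, key=cv2.contourArea)
    return cnt.reshape(-1, 2)  # N x 2 array of boundary coordinates
```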

2. Angle Variation Estimation of Boundary Vector

A. Angle Estimation of Boundary Vector Direction

The proposed FU classification method is based on the angle variations of the boundary vectors for counting the number of hairs in an FU. For this procedure, the angle θ(n) of the boundary vector for a current boundary pixel (x_n, y_n) is computed using the previous boundary pixel (x_{n−1}, y_{n−1}) as follows:

$$\theta(n) = \tan^{-1}\!\left(\frac{y_n - y_{n-1}}{x_n - x_{n-1}}\right), \qquad (1)$$

where n (n ≥ 2) indexes the boundary pixels. In (1), n represents the order of the boundary vectors according to the boundary pixels of an FU. In addition, the angles of the boundary vectors are in the range [−π, π], as shown in Fig. 7. As shown in (1), the angle θ of the boundary vector is defined as the angle of the slope of the boundary vector formed by two neighboring boundary pixels on an FU.
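A minimal sketch of (1); np.arctan2 is used so that the full [−π, π] range of Fig. 7 is obtained (a plain arctangent would fold the angles into (−π/2, π/2)), which is an implementation choice rather than a detail stated in the paper.

```python
# Sketch of (1): angle theta(n) of each boundary vector formed by two
# neighboring boundary pixels, reported in degrees as in Fig. 9(a).
import numpy as np

def boundary_angles(boundary: np.ndarray) -> np.ndarray:
    """boundary: N x 2 array of (x, y) boundary pixels; returns theta(n)."""
    dx = np.diff(boundary[:, 0].astype(float))  # x_n - x_(n-1)
    dy = np.diff(boundary[:, 1].astype(float))  # y_n - y_(n-1)
    return np.degrees(np.arctan2(dy, dx))       # theta(n), n = 2, ..., N
```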

Fig. 9. Angles of boundary vector: (a) angle graph (θ(n), in degrees, vs. boundary points) and (b) numbering of the significant variation points (head and tail) on the FU image.

Figure 9(a) shows the angles of the boundary vectors of the FU with two hairs shown in Fig. 6. The red numbers in Fig. 9(a) mark rapidly changing sections and represent significant variation points, which are indicated in Fig. 9(b) with the same numbering. Furthermore, these variation points are related to the head (that is, no. 3 in Fig. 9(b)) and tail areas (that is, nos. 2 and 4), as shown in Fig. 6(b). Focusing on these significant angle variations of the boundary vector of an FU, the method described in this paper determines the TOF based on the number of hairs by extracting the angle variation points (head and tail) of the FU from the boundary vectors.

B. Gaussian Smoothing

In Fig. 9(a), the obtained angles of the boundary vectors contain camera noise caused by dense pixel variations of the FU boundary points, and this noise makes it difficult to extract the angle variation points exactly. Therefore, we generate a smooth signal through convolution with a Gaussian window to smooth the angle variations of the boundary vectors as follows:

$$\theta_{\mathrm{sm}}(n) = G(x) * \theta(n), \qquad (2)$$

$$G(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-x^{2}/2\sigma^{2}}, \qquad (3)$$

where the subscript “sm” indicates smoothing, and the Gaussian window in (3) is normalized using a 20-point window size.

Gaussian smoothing is performed on the rough angle variation curve to estimate the angle variation at the head or tail of an FU exactly. Figure 10 shows the smoothed angles for the original boundary vector angles in Fig. 9(a); the resulting graph is smoother in shape.
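A minimal sketch of (2) and (3) using the stated 20-point window; the value of σ is an assumed tuning parameter, as the paper does not specify it.

```python
# Sketch of (2)-(3): smooth theta(n) by convolving it with a normalized
# Gaussian window to suppress camera noise.
import numpy as np

def smooth_angles(theta: np.ndarray, win: int = 20, sigma: float = 3.0) -> np.ndarray:
    x = np.arange(win) - (win - 1) / 2.0
    g = np.exp(-x**2 / (2.0 * sigma**2))
    g /= g.sum()                               # normalized Gaussian window, (3)
    return np.convolve(theta, g, mode="same")  # theta_sm(n), (2)
```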

C. Difference in Smoothed Angle

To extract the significant variation points, we calculate the difference in the smoothed angles, Δθ_dsm, as follows:

$$\Delta\theta_{\mathrm{dsm}}(n) = \theta_{\mathrm{sm}}(n) - \theta_{\mathrm{sm}}(n-1), \qquad (4)$$

where n = 2, …, N and N is the number of boundary points. The difference in the smoothed angles is shown in Fig. 11. We can see that the significant variation points (head and tail) appear in the maximum (or minimum) sections obtained by subtracting neighboring smoothed angles. Based on the information regarding these important variation points, we can determine the TOF based on the number of hairs by counting the maximum (or minimum) sections.
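A minimal sketch of (4): the first difference of the smoothed angle sequence.

```python
# Sketch of (4): delta_theta_dsm(n) = theta_sm(n) - theta_sm(n-1).
import numpy as np

def smoothed_angle_difference(theta_sm: np.ndarray) -> np.ndarray:
    return np.diff(theta_sm)  # one element shorter than theta_sm
```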

D. Extraction of Boundary Vector Variation

Fig. 10. Gaussian smoothing (smoothed angle θ_sm(n) vs. boundary points), with numbering based on Fig. 9.

Fig. 11. Difference in smoothed angles (Δθ_dsm vs. boundary points), with numbering based on Fig. 9.

Fig. 12. Absolute values of the angle differences (|Δθ_dsm| vs. boundary points).

This step takes the absolute value of the difference in the smoothed angles, |Δθ_dsm|, to easily extract the angle variations at the head and tail of the FU. Using the absolute-value graph, the method extracts and counts the number of peaks over a threshold value; the threshold is determined as half of the maximum value. Finally, our method classifies the TOF, that is, the number of hairs in an FU image, as one, two, or three hairs from half of the number of peaks, N_peak, counted from the absolute-value graph:

$$\mathrm{TOF} = \frac{N_{\mathrm{peak}}}{2}. \qquad (5)$$

For example, in the case of the FU with two hairs in Fig. 9, the method counts four peaks, as shown in Fig. 12, and classifies it as a two-hair FU (N_peak/2 = 4/2 = 2).
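A minimal sketch of this final step and (5), combining the absolute difference, the half-of-maximum threshold, and peak counting; scipy.signal.find_peaks stands in for the peak detector, whose exact form the paper does not specify.

```python
# Sketch of (5): count peaks of |delta_theta_dsm| above half its maximum
# and classify the FU as TOF = N_peak / 2 hairs.
import numpy as np
from scipy.signal import find_peaks

def classify_fu(theta_sm: np.ndarray) -> int:
    d = np.abs(np.diff(theta_sm))   # |delta_theta_dsm(n)|
    threshold = 0.5 * d.max()       # half of the maximum value
    peaks, _ = find_peaks(d, height=threshold)
    return len(peaks) // 2          # TOF: 1, 2, or 3 hairs
```

For the two-hair FU of Fig. 9, this procedure would count the four peaks shown in Fig. 12 and return 2.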

IV. Experimental Results

Computer simulations were conducted using the proposed FU classification method to confirm its performance on an FU image set. The tested FU image set consists of six, seventeen, and four images for one, two, and three hairs, respectively.

Generally, two-hair FUs occur with very high frequency, regardless of the type of patient. The FU image set was generated under various conditions; that is, different camera resolutions, image sizes, and hair slopes were used.

First, the ARTAS FU classification algorithm (that is, the contour-outline profile method) was tested using the test image set. Table 1 shows the classification results of the ARTAS system based on the contour-outline profile. The FU images are used to determine the hair type based on the number of defects. As mentioned in Section II, the hair type of an FU image equals the number of defects plus one. In addition, the defects are identified by their triangular shape and area in the difference image between the contour image and the outline profile image of the FU. The second column in Table 1 lists the input FU images for each hair type, and the third column lists the resulting images along with their defects in the same order. Detailed captions are attached to the resulting images, and their defects and unintended defects are marked on the resulting images by red and yellow dotted lines, respectively. In the case of the one-hair type, there should be no defects; however, there are two unintended defects in the wide white area of the second image, and this one-hair FU is therefore classified as a three-hair type despite the image showing one hair. Moreover, the two-hair case results in six false classifications for 17 images; some unintended defects have a triangular shape similar in size to that of an accurate defect. In particular, the third image of the three-hair type has one defect and is classified as a two-hair type. This false classification occurs because the two defects are connected into one by the ARTAS system. We can also see unintended defects in the resulting images of the three-hair FU type.

Second, the proposed classification method was tested using the same test image set, as shown in Table 2. Table 2 shows the classification results, the FU boundary images, and graphs of the peak detection results obtained using the proposed method. The FU images are classified by counting the number of peak points in the resulting graphs, and the hair type of each FU image is determined as one-half of the counted number of peaks, that is, one to three hairs. In Table 2, the peak detection graphs show 2, 4, and 6 peaks when the FU images are of one, two, and three hairs, respectively. We can see that the proposed method classifies the FU images successfully according to the number of hairs, except for one failure in the case of the two-hair FU type.

As a result, we can see that the proposed method is superior in terms of its classification performance compared with the contour-outline profile method. Furthermore, the new FU classification method for an FU harvesting system solves the problem inherent to the ARTAS algorithm by using the angle variation of the boundary vector. On the other hand, this classification method is only useful for “V” or “two-V” shapes and cannot be applied to other FU shapes, such as an “II” or “X” shape.


Table 1. Experimental results of the ARTAS (contour-outline profile method) classification system. Classified hair type = number of defects + 1 (input FU images and defect-marked result images omitted).

1 hair (6 images): no defect (1 hair) ×5; 2 unintended defects (3 hairs) ×1. Classification result: 5 succeeded, 1 failed.

2 hairs (17 images): 1 defect (2 hairs) ×11; 1 defect, 1 unintended defect (3 hairs) ×3; 1 defect, 2 unintended defects (4 hairs) ×3. Classification result: 11 succeeded, 6 failed.

3 hairs (4 images): 2 defects (3 hairs) ×2; 2 defects, 1 unintended defect (4 hairs) ×1; 1 defect (2 hairs) ×1. Classification result: 2 succeeded, 2 failed.

Table 2. Experimental results of the proposed FU classification method. Classified hair type = number of peaks / 2 (FU boundary images and peak detection graphs omitted).

1 hair (6 images): 2 peaks (= 1 hair). Classification result: 6 succeeded.

2 hairs (17 images): 4 peaks (= 2 hairs). Classification result: 16 succeeded, 1 failed.

3 hairs (4 images): 6 peaks (= 3 hairs). Classification result: 4 succeeded.

V. Conclusion

This paper presented a novel FU classification method based on the angle variation of a boundary vector for classifying an FU according to the number of hairs. ARTAS, a recently developed FU harvesting system, classifies the TOF based on the number of hairs, using the number of defects derived from the difference between the contour and the outline profile of the FU of interest. However, this method has a drawback in that the FU classification is inaccurate because it causes unintended defects in the outline profile of the FU. To solve this drawback, the proposed method classifies the TOF of an FU image by extracting the number of angle variation peak points of the boundary vector. The experimental results show that the proposed method is robust and accurate for a limited number of FU shapes compared with the contour-outline profile FU classification method of ARTAS. We are currently developing a modified algorithm for classifying other shapes, such as “II” or “X” shapes; thus, the peak counting presented in this paper may not yet provide the most accurate classification result for all FU shapes.

References

[1] S. Okuda, “The Study of Clinical Experiments of Hair Transplantation,” Japanese J. Dermatology, vol. 46, no. 135, 1939.

[2] W.P. Unger, “Delineating the Safe Donor Area for Hair Transplanting,” American J. Cosmetic Surgery, vol. 11, 1994, pp. 239–243.

[3] N. Orentreich, “Autografts in Alopecias and Other Selected Dermatological Conditions,” Annals New York Academy Sci., vol. 83, Nov. 1959, pp. 463–479.

[4] R.M. Bernstein and W.R. Rassman, “Follicular Unit Transplantation,” Issue Adv. Cosmetic Surgery, Dermatologic Clinics, vol. 23, no. 3, 2005, pp. 393–414.

[5] W.R. Rassman et al., “Follicular Unit Extraction: Minimally Invasive Surgery for Hair Transplantation,” Dermatologic Surgery, vol. 28, no. 8, Aug. 2002, pp. 720–728.

[6] FUE – A Few Words about Transplant Methods, The Blog, SURE HAIR INTERNATIONAL. Accessed Nov. 9, 2014. http://surehairtransplants.com/fue-words-transplant-methods

[7] Bauman Medical Group, P.A., Hair Restoration for Men and Women, BAUMAN MEDICAL GROUP. Accessed Nov. 9, 2014. http://baumanmedical.com

[8] Follicular Unit Extraction (FUE) Hair Transplant, Follicular Unit Extraction, Hair Transplant Network.com. Accessed Nov. 9, 2014. http://www.hairtransplantnetwork.com/Hair-Loss-Treatments/follicular-unit-extraction.asp

[9] L.M. Bicknell et al., “Follicular Unit Extraction Hair Transplant Harvest: A Review of Current Recommendations and Future Considerations,” Dermatology Online J., vol. 20, no. 3, Mar. 2014.

[10] R.M. Rashid and B.L.T. Morgan, “Follicular Unit Extraction Hair Transplant Automation: Options in Overcoming Challenges of the Latest Technology in Hair Restoration with the Goal of Avoiding the Line Scar,” Dermatology Online J., vol. 18, no. 9, Sept. 2012.

[11] M. Onda et al., “Novel Technique of Follicular Unit Extraction Hair Transplantation with a Powered Punching Device,” Dermatology Surgery, vol. 34, no. 12, Dec. 2008, pp. 1683–1688.

[12] NeoGraft® Automated FUE Hair Transplant System, Procedure Overview, NeoGraft. Accessed Nov. 9, 2014. http://neograftdocs.com/procedure-overview/neograft-automated-fue-hair-transplant-System/

[13] Minimally Invasive Dissection, ARTAS Robotic System, RESTORATION ROBOTICS. Accessed Nov. 9, 2014. http://restorationrobotics.com/minimally-invasive-dissection/

[14] R.M. Rashid et al., “Follicular Unit Extraction with Artas Robotic Hair Transplant System: An Evaluation of FUE Yield,” Dermatology Online J., vol. 20, no. 4, Apr. 2014.

[15] S.A. Qureshi and M. Bodduluri, System and Method for Classifying Follicular Units, US Patent 7,477,782 B2, filed Aug. 25, 2006, issued Jan. 13, 2009.

[16] S.A. Qureshi and M. Bodduluri, System and Method for Classifying Follicular Units, US Patent 7,627,157 B2, filed Aug. 24, 2007, issued Dec. 1, 2009.

[17] M. Bodduluri, P.L. Gildenberg, and D.E. Caddes, System and Methods for Aligning a Tool with a Desired Location or Object, US Patent 7,962,192 B2, filed Apr. 28, 2006, issued June 14, 2011.

[18] S.A. Qureshi and M.G. Canales, System and Method for Harvesting and Implanting Hair Using Image-Generated Topological Skin Models, US Patent 8,048,090 B2, filed Mar. 11, 2009, issued Nov. 1, 2011.

[19] S.A. Qureshi and M.G. Canales, System and Method for Harvesting and Implanting Hair Using Image-Generated Topological Skin Models, US Patent 8,361,085 B2, filed Sept. 23, 2011, issued Jan. 29, 2013.

[20] M. Bodduluri et al., System and Method for Harvesting and Implanting Hair Using Image-Generated Topological Skin Models, US Patent 8,454,627 B2, filed Jan. 6, 2012, issued June 4, 2013.

[21] S.A. Qureshi and M. Bodduluri, System and Method for Counting Follicular Units, US Patent 8,199,983 B2, filed Aug. 24, 2007, issued June 12, 2012.

[22] S.A. Qureshi and M. Bodduluri, System and Method for Counting Follicular Units, US Patent 8,290,229 B2, filed May 16, 2012, issued Oct. 16, 2012.

[23] J.W. Shin et al., “Characteristics of Robotically Harvested Hair Follicles in Koreans,” J. American Academy Dermatology, vol. 72, no. 1, Jan. 2015, pp. 146–150.

[24] Trace Object in Binary Image - MATLAB bwtraceboundary, Documentation, MathWorks. Accessed Nov. 9, 2014. http://kr.mathworks.com/help/images/ref/bwtraceboundary.html

Hwi Gang Kim received his BS and MS degrees in electrical engineering from Kyungpook National University, Daegu, Rep. of Korea, in 2008 and 2010, respectively. He is currently working at ETRI, Daegu, Rep. of Korea. His research interests include image processing, computer vision, and medical signal processing.

Tae Wuk Bae received his BS, MS, and PhD degrees in electrical engineering from Kyungpook National University, Daegu, Rep. of Korea, in 2004, 2006, and 2010, respectively. He is currently working at ETRI, Daegu, Rep. of Korea. His research interests include medical devices and medical signal processing.

Kyu Hyung Kim received his BS and MS degrees in electrical engineering, and his PhD degree in computer science and engineering, from Kyungpook National University, Daegu, Rep. of Korea, in 1998, 2000, and 2014, respectively. He is currently working at ETRI, Daegu, Rep. of Korea. His research interests include IoT technology and medical devices.

Hyung Soo Lee received his BS degree in electrical engineering from Kyungpook National University, Daegu, Rep. of Korea, in 1980 and his PhD degree in IT engineering from Sungkyunkwan University, Suwon, Rep. of Korea, in 1996. Since 1983, he has been a principal researcher at ETRI, Daegu, Rep. of Korea. His research interests are spectrum engineering, WPAN system design, and biomedical IT convergence devices.

Soo In Lee received his MS and PhD degrees in electronics engineering from Kyungpook National University, Daegu, Rep. of Korea, in 1989 and 1996, respectively. Since 1990, he has been with ETRI, where he is currently a principal researcher with the Broadcasting System Research Department. His main research interests include digital broadcasting systems and IT convergence technology.
