Natural Language Text Retrieval Based on Neural Networks (February 1999)

[Korean abstract; only fragments survive extraction. From the surviving terms and the English abstract: a Support Vector Machine (SVM) text classifier is implemented, grounded in Structural Risk Minimization (SRM) and trained by Convex Programming; on the 5 most frequent Reuters-21578 topics it reaches over 97% accuracy and a better break-even point than a Naïve Bayesian classifier, and a network trained only on the Support Vectors (SVs) is also examined.]

Keywords: Support Vector Machine, Structural Risk Minimization, Convex Programming

[Table of contents, list of tables, and list of figures: the Korean titles were lost in extraction. Recoverable headings: 3.1.3 RBF (Radial Basis Function); 3.2 Support Vector Machines (SVMs); 4.1.2 Structural Risk Minimization (SRM); 4.2.1 the linearly separable case; 5.3.1 SVM; 5.3.2 Naïve Bayesian.]

1. [Introduction: Korean text lost. Surviving terms: the 1950s, the World Wide Web (WWW), [Yang 97], the SVM (Support Vector Machine) and the SVs (Support Vectors) it produces.]

1.1

1.2 [Korean text largely lost.] The Naïve Bayesian classifier assigns a document Doc the category

  v_NB = argmax_{v_j ∈ V} P(v_j) · Π_{i ∈ positions} P(a_i | v_j)

where a_i is the word at position i of Doc [Mitchell 97]. Other classifiers in the literature include Quinlan's C4.5 decision trees [Quinlan 93] and the k-NN (k-Nearest Neighbor) rule.
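The Naïve Bayesian decision rule above can be sketched in a few lines; the toy documents, topic names, and the Laplace smoothing choice below are illustrative, not from the thesis's experiments.

```python
import math
from collections import Counter, defaultdict

# A minimal sketch of v_NB = argmax_v P(v) * prod_i P(a_i | v), with Laplace
# smoothing; training data here are two tiny illustrative documents.
def train_nb(docs):
    """docs: list of (tokens, label). Returns priors, word counts, totals, vocab."""
    priors, word_counts, totals = Counter(), defaultdict(Counter), Counter()
    for tokens, label in docs:
        priors[label] += 1
        word_counts[label].update(tokens)
        totals[label] += len(tokens)
    vocab = {w for c in word_counts.values() for w in c}
    return priors, word_counts, totals, vocab

def classify_nb(tokens, model):
    priors, word_counts, totals, vocab = model
    n_docs = sum(priors.values())
    best, best_score = None, float("-inf")
    for label in priors:
        # log P(v) + sum_i log P(a_i | v), Laplace-smoothed over the vocabulary
        score = math.log(priors[label] / n_docs)
        for w in tokens:
            score += math.log((word_counts[label][w] + 1) / (totals[label] + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

model = train_nb([(["wheat", "grain", "export"], "grain"),
                  (["profit", "earn", "dividend"], "earn")])
print(classify_nb(["grain", "export"], model))  # grain
```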

1.3 [Organization of the thesis: Korean text lost; the surviving numbers outline Chapters 2 through 6, with the SVM treated in Chapters 3 and 4 and the experiments in Chapter 5.]

2. [Chapter title lost: document preprocessing and representation.]

2.1 Training data arrive as pairs {doc_i, category-value_i}: the setting of supervised learning. [Korean details lost.]

Stemming reduces morphological variants to a common stem, e.g. engineering, engineered, engineer → engineer, typically with the Porter stemmer. A stop list removes high-frequency function words such as the, of, and, to; removing stop words shrinks the index by roughly 20~30% [Frakes et al 92]. Each remaining term w_i of a document doc_j is then weighted, starting from its TF (Term Frequency).

The term frequency is

  TF(w_i, doc_j) = (count of w_i occurring in document doc_j).

TF alone overweights terms that are frequent everywhere, so it is combined with the IDF (Inverse Document Frequency) [Frakes et al 92]:

  IDF(w_i) = log( n / DF(w_i) )

where n is the number of documents and DF (Document Frequency) is

  DF(w_i) = (number of documents where w_i is occurring).

The tfidf weight is the product TF · IDF. A term occurring in every document has DF = n, hence IDF = 0 and tfidf weight 0.
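The tf-idf weighting above can be sketched directly; the three toy token lists are illustrative.

```python
import math

# A small sketch of tfidf(w, doc) = TF(w, doc) * log(n / DF(w)); the corpus
# below is illustrative, not a Reuters sample.
def tfidf(corpus):
    """corpus: list of token lists. Returns one {term: weight} dict per document."""
    n = len(corpus)
    df = {}
    for doc in corpus:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    out = []
    for doc in corpus:
        weights = {}
        for term in set(doc):
            tf = doc.count(term)          # TF(w, doc)
            idf = math.log(n / df[term])  # IDF(w) = log(n / DF(w))
            weights[term] = tf * idf
        out.append(weights)
    return out

docs = [["wheat", "wheat", "export"], ["export", "profit"], ["profit", "earn"]]
weights = tfidf(docs)
print(weights[0]["wheat"])   # 2 * log(3/1): twice in doc 0, in one of three documents
print(weights[0]["export"])  # 1 * log(3/2): "export" appears in two of three documents
```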

2.2 [Korean text lost.] With a vocabulary on the order of 20,000 terms, each document becomes a roughly 20,000-dimensional sparse vector, most of whose entries are 0.
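Vectors of that dimensionality with mostly zero entries are naturally stored sparsely; a minimal dict-backed sketch (the index values are illustrative):

```python
# Store only the nonzero entries of a vocabulary-sized vector as {index: value};
# the dot product then touches only the smaller vector's entries.
def sparse_dot(u, v):
    """u, v: {index: value} dicts over a shared (e.g. ~20,000-term) vocabulary."""
    if len(u) > len(v):  # iterate over the smaller of the two vectors
        u, v = v, u
    return sum(val * v.get(i, 0.0) for i, val in u.items())

u = {3: 1.5, 907: 2.0}   # a document touching 2 of ~20,000 dimensions
v = {3: 2.0, 118: 4.0}
print(sparse_dot(u, v))  # 1.5*2.0 = 3.0
```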

3. [Chapter title lost: neural-network classifiers.]

3.1
3.1.1 The perceptron of Rosenblatt (1958) models a neuron by a synaptic weights vector and a bias (Figure 3-1):

  v = Σ_{i=1}^{m} w_i x_i + b

where w_1, …, w_m are the weights, x_1, …, x_m the inputs, and b the bias. The output is thresholded:

  y = φ(v) = +1 if v ≥ 0,  -1 if v < 0.

The decision boundary is the hyperplane wᵀx + b = 0 with w = (w_1, …, w_n); hyperplanes of this form reappear in Section 4.2.
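The thresholded perceptron above, as a sketch (the weights are illustrative, not trained):

```python
# v = sum_i w_i * x_i + b, then y = +1 if v >= 0 else -1; weights are assumed,
# not the output of any learning rule.
def perceptron(x, w, b):
    v = sum(wi * xi for wi, xi in zip(w, x)) + b  # induced local field
    return 1 if v >= 0 else -1                    # threshold activation

w, b = [2.0, -1.0], -0.5
print(perceptron([1.0, 1.0], w, b))  # v = 2.0 - 1.0 - 0.5 = 0.5  -> +1
print(perceptron([0.0, 1.0], w, b))  # v = -1.0 - 0.5 = -1.5      -> -1
```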

3.1.2 The multilayer perceptron (Figure 3-2) is a network of an input layer, one or more hidden layers, and an output layer; the hidden units apply a nonlinear activation function.

Common activation functions are the

  Logistic function:  φ(v) = 1 / (1 + exp(-a·v))

  Hyperbolic tangent function:  φ(v) = a · tanh(b·v).

The weights are trained by the backpropagation algorithm. [Korean discussion lost.]

3.1.3 RBF (Radial Basis Function) networks (Figure 3-3).

Figure 3-3: an RBF network. Each hidden unit applies a radial-basis function

  φ(x) = exp( -||x - t_i||² / (2σ²) )

centered at t_i with width σ.

The network output is

  F(x) = Σ_{i=1}^{m1} w_i φ_i(x) + b.

3.2 Support Vector Machines (SVMs)

[Korean text largely lost. Surviving points:] backpropagation training can settle in a local minimum instead of the global minimum, and the radial-basis centers of an RBF network are typically chosen by trial-and-error; the SVM instead identifies the support vectors (SVs) [Haykin 98].
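The RBF network output F(x) = Σ w_i φ_i(x) + b can be sketched as below; the centers and weights are illustrative, not chosen by any training procedure.

```python
import math

# phi(x) = exp(-||x - t||^2 / (2 sigma^2)); F(x) = sum_i w_i * phi_i(x) + b.
def rbf(x, t, sigma):
    sq = sum((xi - ti) ** 2 for xi, ti in zip(x, t))
    return math.exp(-sq / (2 * sigma ** 2))

def rbf_net(x, centers, weights, b, sigma=1.0):
    return sum(wi * rbf(x, t, sigma) for wi, t in zip(weights, centers)) + b

centers = [[0.0, 0.0], [1.0, 1.0]]
print(rbf_net([0.0, 0.0], centers, [1.0, -1.0], b=0.0))  # 1 - exp(-1), about 0.632
```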

[Korean text lost. Surviving terms: PCA (Principal Component Analysis), SOM (Self-Organizing Map), VQ (Vector Quantizer); the SVM rests on SRM (Structural Risk Minimization), developed in Chapter 4.]

4. The SVM

4.1
4.1.1 Given training pairs (x_1, d_1), …, (x_l, d_l) with x_i ∈ R^N and d_i ∈ {-1, +1}, a learning machine selects a function F(x, w) from a family {F(x, w) : w ∈ W}, where each F(·, w) : R^N → {-1, +1} and w denotes the free parameters.

The quality of F(x, w*) is measured by the risk functional (expected risk)

  R(w) = ∫ |d - F(x, w)| dF_D(x, d)

where F_D(x, d) is the joint probability distribution of x and d. Since F_D is unknown, R(w) is estimated over the l samples by the empirical risk

  R_emp(w) = (1/l) Σ_{i=1}^{l} |d_i - F(x_i, w)|.

For minimizing R_emp(w) to approach the minimizer w* of R(w), the estimate must be consistent:

  P( sup_w |R(w) - R_emp(w)| > ε ) → 0 as l → ∞.

Vapnik and Chervonenkis characterized this through the VC (Vapnik-Chervonenkis) dimension h of the family F(x, w): with probability 1 - η, for all w ∈ W,

  (4.1)  R(w) ≤ R_emp(w) + sqrt( [ h( ln(2l/h) + 1 ) - ln(η/4) ] / l ).
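The empirical risk can be sketched for a ±1-valued classifier; the sign classifier and the four sample points below are illustrative.

```python
# R_emp(w) = (1/l) * sum_i |d_i - F(x_i, w)| for a classifier with outputs in {-1, +1};
# the classifier F and the labeled samples are assumed for illustration.
def empirical_risk(F, samples):
    """samples: list of (x, d) with d in {-1, +1}; F maps x to {-1, +1}."""
    return sum(abs(d - F(x)) for x, d in samples) / len(samples)

F = lambda x: 1 if x >= 0 else -1
samples = [(2.0, 1), (-1.0, -1), (0.5, -1), (-3.0, -1)]
print(empirical_risk(F, samples))  # one of four points misclassified: |−1 − (+1)| / 4 = 0.5
```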

In (4.1), h is the VC dimension. The bound on R(w) is R_emp(w) plus a VC confidence term

  sqrt( [ h( ln(2l/h) + 1 ) - ln(η/4) ] / l )

which grows with the VC dimension h and shrinks with the sample size l; driving R_emp(w) down alone therefore does not drive the bound down.

4.1.2 Structural Risk Minimization (SRM)

To control the VC dimension, Vapnik proposed Structural Risk Minimization (SRM): the true error is bounded by the empirical risk plus the VC confidence, and both should be minimized together (Figure 4-1).

Figure 4-1: SRM. Consider a sequence of function families F_k = {F_k(x, w) : w ∈ W_k}, k = 1, 2, …, n,

nested as F_1 ⊂ F_2 ⊂ … ⊂ F_n, with VC dimensions h_1 ≤ h_2 ≤ … ≤ h_n. SRM selects the family that minimizes the bound (4.1), balancing empirical risk against VC confidence; the SVM realizes SRM by keeping the VC dimension under control.

4.2 The SVM

4.2.1 The linearly separable case.

With labels in {-1, +1}, write S_+ = {x_i : (x_i, d_i), d_i = +1} and S_- = {x_i : (x_i, d_i), d_i = -1}. A separating hyperplane (footnote 2)

  (4.2)  wᵀx + b = 0

with weight vector w and bias b satisfies wᵀx_i + b ≥ 0 for x_i ∈ S_+ and wᵀx_i + b ≤ 0 for x_i ∈ S_-; rescaling w and b, the SVM requires

  (4.3)  wᵀx_i + b ≥ +1 for x_i ∈ S_+,  wᵀx_i + b ≤ -1 for x_i ∈ S_-.

(Footnote 2: a hyperplane in R^n is a set H = {x ∈ R^n : aᵀx = α}.)

The distance from a point achieving equality in (4.3) to the hyperplane is

  r = |wᵀx_i + b| / ||w|| = 1 / ||w||,

so the margin of separation between the two classes is

  (4.4)  ρ = 2r = 2 / ||w||.

The points at distance r = 1/||w||, i.e. those meeting (4.3) with equality, are the support vectors; the SVM maximizes ρ, equivalently minimizes ||w|| (Figure 4-2).
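The point-to-hyperplane distance and the margin ρ = 2/||w|| can be checked numerically; the weight vector below is illustrative.

```python
import math

# distance(x) = |w.x + b| / ||w||; margin rho = 2 / ||w||. The hyperplane
# 3x + 4y - 5 = 0 is assumed for illustration.
def distance(x, w, b):
    norm = math.sqrt(sum(wi * wi for wi in w))
    return abs(sum(wi * xi for wi, xi in zip(w, x)) + b) / norm

def margin(w):
    return 2.0 / math.sqrt(sum(wi * wi for wi in w))

w, b = [3.0, 4.0], -5.0            # ||w|| = 5
print(distance([0.0, 0.0], w, b))  # |-5| / 5 = 1.0
print(margin(w))                   # 2 / 5 = 0.4
```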

Figure 4-2: the maximum-margin hyperplane and its margin ρ.

4.2.2 By Section 4.1, bounding ||w|| bounds the VC dimension of the hyperplane family, so margin maximization is an instance of SRM.

Vapnik: if the points x_1, x_2, …, x_l lie in a ball (footnote 3) of radius R and F_k = {wᵀx + b : ||w||² ≤ A_k²}, then the VC dimension h_k of F_k obeys

  h_k ≤ min{ R² A_k², m_0 } + 1

where m_0 is the input dimension; shrinking ||w|| therefore shrinks the VC bound, independently of m_0. The SRM-motivated primal problem for (x_1, d_1), …, (x_l, d_l), x_i ∈ R^N, d_i ∈ {-1, +1}, is

  (4.5)  Minimize over w, b:  Φ(w) = ½ wᵀw
         subject to  d_i(wᵀx_i + b) ≥ 1,  i = 1, 2, …, l.

(Footnote 3: the ball of radius r around x_o is B(x_o, r) = {x ∈ R^n : ||x - x_o|| < r}.)

4.2.3 The non-separable case [Korean text lost]. Figure 4-3: training points that cannot be separated without error.

4 SVM { } l ξ slack = 1 T (46) d ( w x + b) 1 ξ = 12 l ξ Φ( ) = ξ l = 1 424 (nonlnear surface) SVM (feature space) 4-4 25

Figure 4-4. A map φ from the m_0-dimensional input space to an m_1-dimensional feature space, φ(x) = (φ_1(x), …, φ_{m_1}(x)), makes the decision surface a feature-space hyperplane wᵀφ(x) + b = 0, so the linear SVM machinery applies there.

The soft-margin primal problem is

  (4.7)  Minimize over w, b:  Φ(w, ξ) = ½ wᵀw + C Σ_{i=1}^{l} ξ_i
         subject to  d_i(wᵀφ(x_i) + b) ≥ 1 - ξ_i,  i = 1, 2, …, l,
                     ξ_i ≥ 0,  i = 1, 2, …, l,

where the constant C trades the margin on w against the training violations.

4.2.5 Problem (4.7) is a Convex Programming problem [Peressini et al 88]: a function f is convex when f(λx_1 + [1-λ]x_2) ≤ λf(x_1) + [1-λ]f(x_2), and in (4.7) both the target function and the constraint set are convex. Convex programs are solved through Lagrangian duality as a Dual Problem [Nash et al 97]. The Lagrangian primal function of (4.7) is

  (4.8)  L(w, b, ξ, λ, γ) = ½ wᵀw + C Σ_{i=1}^{l} ξ_i - Σ_{i=1}^{l} λ_i { d_i(wᵀφ(x_i) + b) - 1 + ξ_i } - Σ_{i=1}^{l} γ_i ξ_i

with Lagrange multipliers λ_i ≥ 0 and γ_i ≥ 0.

By min-max duality (footnote 4) the Dual Problem is

  (4.9)  maximize over λ ≥ 0, γ ≥ 0:  min over w, b, ξ of L(w, b, ξ, λ, γ).

Because (4.8) is convex, the inner minimum is attained where

  cond1:  ∂L/∂w = w - Σ_{i=1}^{l} λ_i d_i φ(x_i) = 0
  cond2:  ∂L/∂b = - Σ_{i=1}^{l} λ_i d_i = 0
  cond3:  ∂L/∂ξ_i = C - λ_i - γ_i = 0.

Substituting cond1 through cond3 into (4.9) leaves the dual in λ alone:

  (4.10)  maximize  L*(λ) = Σ_{i=1}^{l} λ_i - ½ Σ_{i=1}^{l} Σ_{j=1}^{l} λ_i λ_j d_i d_j φᵀ(x_i) φ(x_j)
          subject to  Σ_{i=1}^{l} λ_i d_i = 0,  0 ≤ λ_i ≤ C.

In the SVM, the inner product φᵀ(x_i) φ(x_j) appearing in (4.10) is written as the kernel K(x_i, x_j).

(Footnote 4: (x*, y*) is a saddle point when F(x*, y) ≤ F(x*, y*) ≤ F(x, y*), in which case max_{y ∈ Y} min_{x ∈ X} F(x, y) = min_{x ∈ X} max_{y ∈ Y} F(x, y).)

The kernel satisfies K(x_i, x_j) = φᵀ(x_i) φ(x_j) = φᵀ(x_j) φ(x_i) = Σ_k a_k φ_k(x_i) φ_k(x_j). Kernels used with the SVM (Table 4-1):

  polynomial:  (1 + xᵀy)^p
  RBF:         exp( -||x - x_i||² / (2σ²) )
  sigmoid:     tanh( β_0 xᵀx_i + β_1 )

From the dual solution λ of (4.10), the machine is recovered as

  w = Σ_{i=1}^{Ns} λ_i d_i φ(x_i)   (Ns = Number of Support Vectors)

with decision boundary Σ_{i=1}^{Ns} λ_i d_i K(x, x_i) + b = 0, and the bias comes from any x_0 with 0 < λ_0 < C:

  b = d_0 - Σ_{i=1}^{Ns} λ_i d_i K(x_i, x_0).

The optimum is a saddle point (footnote 5) (w*, b*, ξ*, λ*, γ*) characterized by the Karush-Kuhn-Tucker (KKT) conditions

  λ_i [ d_i(wᵀφ(x_i) + b) - 1 + ξ_i ] = 0,  i = 1, 2, …, l
  γ_i ξ_i = 0,  i = 1, 2, …, l.

For λ_i < C, cond3 gives γ_i > 0 and hence ξ_i = 0; a support vector with 0 < λ_i < C therefore lies exactly on the margin.

(Footnote 5: (x*, λ*) is a saddle point of the Lagrangian L when L(x*, λ) ≤ L(x*, λ*) ≤ L(x, λ*).)

For a support vector x_i with 0 < λ_i < C, then, d_i(wᵀφ(x_i) + b) - 1 = 0, which determines b. The dual (4.10) is a Convex Programming problem of degree 2, i.e. a Quadratic Programming problem [Nash et al 97]; in matrix form,

  (4.11)  Minimize  F(Λ) = -Λᵀ1 + ½ ΛᵀHΛ
          subject to  Λᵀd = 0,  Λ ≤ C·1,  Λ ≥ 0,

with H_ij = d_i d_j K(x_i, x_j). The Quadratic Programming solver used here is LOQO [Vanderbei 97].
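As a minimal check of the dual formulation, the following sketch solves (4.10) for a toy 1-D problem with two points and a linear kernel by grid search; the data, the grid, and the closed-form expectations are illustrative, not from the thesis.

```python
# Toy dual (4.10): two 1-D points x1 = +1 (d = +1), x2 = -1 (d = -1), linear
# kernel K(a, b) = a*b. The constraint sum_i lambda_i d_i = 0 forces
# lambda_1 = lambda_2 = lam, so the objective reduces to 2*lam - 2*lam**2,
# maximized at lam = 0.5.
x, d = [1.0, -1.0], [1, -1]
K = lambda a, b: a * b

def dual_objective(lam):
    lams = [lam, lam]  # feasible: lam*(+1) + lam*(-1) = 0
    return (sum(lams)
            - 0.5 * sum(lams[i] * lams[j] * d[i] * d[j] * K(x[i], x[j])
                        for i in range(2) for j in range(2)))

# crude grid search over 0 <= lam <= C, with C = 1000 as in the experiments
best = max((k / 1000.0 for k in range(0, 1001)), key=dual_objective)
w = sum(best * d[i] * x[i] for i in range(2))  # w = sum_i lambda_i d_i x_i (cond1)
b = d[0] - w * x[0]                            # bias from a support vector
print(best, w, b)  # 0.5 1.0 0.0
```

The recovered hyperplane w·x + b = 0 is x = 0, with both points sitting exactly on the margin, as the KKT conditions require.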

5. [Chapter title lost: experiments.]

5.1 The Reuters-21578 collection consists of Reuters newswire stories. It grew out of the Reuters-22173 collection compiled by David Lewis (then at Carnegie); removing 595 duplicate documents produced Reuters-21578. The documents are marked up in SGML, and the collection carries the category sets shown in Table 5-1.

Table 5-1. Category sets of Reuters-21578:

  EXCHANGES 39, ORGS 56, PEOPLE 267, PLACES 175, TOPICS 135.

The experiments use the TOPICS set. Reuters-21578 also defines standard training/test splits (Table 5-2):

  ModLewis: training (13,625): LEWISSPLIT="TRAIN", TOPICS="YES" or "NO";
            test (6,188): LEWISSPLIT="TEST", TOPICS="YES" or "NO";
            not used (1,765): LEWISSPLIT="NOT-USED" or TOPICS="BYPASS".
  ModApte:  training (9,603): LEWISSPLIT="TRAIN", TOPICS="YES";
            test (3,299): LEWISSPLIT="TEST", TOPICS="YES";
            not used (8,676).
  ModHayes: training (20,856): CGISPLIT="TRAINING-SET";
            test (722): CGISPLIT="PUBLISHED-TESTSET";
            not used (0).

5.2 The ModApte split of Reuters-21578 is used. Terms are taken from between <BODY> and </BODY> of each Reuters newsline; about 100,000 distinct terms occur before stemming. Table 5-3 gives the DF (Document Frequency) statistics.

After stemming, stop-word removal, and an IDF-based cutoff, 8,754 terms remain, so each document is presented to the SVM as an 8,754-dimensional feature vector (0 for an absent term, a tfidf-based weight otherwise). Of the ModApte 9,603 training and 3,299 test documents, those with a non-empty <BODY> number 8,762 and 3,009 respectively. From the 135 TOPICS categories, the 10 most frequent are retained (Table 5-4).

Table 5-4. The 10 most frequent topics (training / test documents):

  Earn 2839/1043, Acq 1611/672, Money-fx 513/143, Grain 415/125, Crude 370/156,
  Trade 351/102, Interest 323/97, Wheat 198/61, Ship 175/66, Corn 152/36.

Of these ten, the five topics "corn", "crude", "earn", "grain", "interest" are used in the experiments.

5.3 Evaluation measures and results.

(1) Accuracy: the fraction of classification decisions that are correct.

(2) Precision/Recall break-even point. For one topic, the decisions form the contingency table of Table 5-5:

                   actual +1   actual -1
  assigned +1          a           b
  assigned -1          c           d

  recall = a / (a + c),  precision = a / (a + b).

The precision/recall break-even point is the value at which precision and recall coincide; where they differ it is taken as the average

  break-even point = (precision + recall) / 2,

and Accuracy = (a + d) / (a + b + c + d).

5.3.1 SVM results. The SVM is trained with C = 1000; Figures 5-1, 5-2, 5-3 show the results.
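The measures above, computed from assumed contingency counts (the numbers are illustrative, not the thesis's results):

```python
# precision = a/(a+b), recall = a/(a+c), accuracy = (a+d)/(a+b+c+d), and the
# averaged precision/recall break-even value (precision + recall) / 2.
def measures(a, b, c, d):
    precision = a / (a + b)
    recall = a / (a + c)
    accuracy = (a + d) / (a + b + c + d)
    break_even = (precision + recall) / 2
    return precision, recall, accuracy, break_even

# illustrative counts: 80 true positives, 20 false positives, 20 false
# negatives, 880 true negatives
p, r, acc, be = measures(a=80, b=20, c=20, d=880)
print(p, r, acc, be)  # 0.8 0.8 0.96 0.8
```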

[Figure 5-1: results on the 5 topics. Figure 5-2: results with the RBF kernel, σ = 1.]

[Figure 5-3: results with the sigmoid kernel, β_0 = 2, β_1 = 1.]

Table 5-6. Number of SVs per topic (kernel labels partly lost; the middle row is the RBF kernel):

  topic:  corn   crude   earn   grain   interest
  (…):    1540   2248    3297   2566    1517
  RBF:     943   1491    2469   1749    1083
  (…):     124    179     419    209     286

Across Figures 5-1, 5-2, 5-3 the SVM accuracy exceeds 97%, best on "earn"; Figure 5-4 covers the topics corn, interest, crude, grain, earn.

5.3.2 Naïve Bayesian comparison. [Korean text lost.] Figure 5-4 compares the Naïve Bayesian classifier with the SVM: the RBF-kernel SVM outperforms the Naïve Bayesian classifier.

5.3.3 Training on SVs only.

[Korean text largely lost. From the surviving fragments and the abstract: instead of all 8,762 training documents, a multilayer perceptron is trained only on the SVs of Table 5-6, e.g. the RBF-kernel SVs; its results, shown in Figures 5-5 and 5-6, are comparable to the full-data results of Figures 5-1 and 5-2, while training time drops markedly.]

6. [Conclusion: Korean text lost. From the surviving fragments and the abstract: an SVM text classifier was implemented and compared with a Naïve Bayesian classifier; a multilayer perceptron trained only on the SVs performed well at much lower training cost; and since SVM training reduces to Quadratic Programming, a fast Quadratic Optimizer is the practical bottleneck.]

References

[Cherkassky et al 98] Vladimir Cherkassky and Filip Mulier, Learning From Data: Concepts, Theory, and Methods, John Wiley & Sons, 1998.
[Frakes et al 92] William B. Frakes and Ricardo Baeza-Yates, Information Retrieval: Data Structures & Algorithms, Prentice-Hall, 1992.
[Haykin 98] Simon Haykin, Neural Networks: A Comprehensive Foundation, 2nd edition, Prentice-Hall, 1998.
[Lewis 91] David D. Lewis, "Evaluating Text Categorization", Proceedings of the Speech and Natural Language Workshop, pp. 312-317, 1991.
[Liere 97] Ray Liere and Prasad Tadepalli, "Active Learning with Committees for Text Categorization", Proceedings of AAAI-97, pp. 591-596, 1997.
[Mitchell 97] Tom M. Mitchell, Machine Learning, McGraw-Hill, 1997.
[Nash et al 97] Stephen G. Nash and Ariela Sofer, Linear and Nonlinear Programming, McGraw-Hill, 1996.
[Osuna et al 97] Edgar E. Osuna, Robert Freund, and Federico Girosi, "Support Vector Machines: Training and Applications", AI Memo, MIT AI Lab, 1997.
[Peressini et al 88] A. L. Peressini, F. E. Sullivan, and J. J. Uhl, Jr., The Mathematics of Nonlinear Programming, Springer-Verlag, 1988.
[Quinlan 93] J. Ross Quinlan, C4.5: Programs for Machine Learning, Morgan Kaufmann, 1993.
[Stitson 96] M. O. Stitson, J. A. E. Weston, A. Gammerman, V. Vovk, and V. Vapnik, "Theory of Support Vector Machines", Technical Report CSD-TR-96-17, Royal Holloway, University of London, 1996.
[Vanderbei 97] Robert J. Vanderbei, "LOQO User's Manual, Version 3.10", Technical Report SOR-97-08, Princeton University, 1997.
[Vapnik 95] V. Vapnik, The Nature of Statistical Learning Theory, Springer-Verlag, 1995.
[Yang 97] Yiming Yang, "An Evaluation of Statistical Approaches to Text Categorization", Technical Report CMU-CS-97-127, Carnegie Mellon University, 1997.

ABSTRACT

Now that the world is connected by online networks, it is an age of a flood of information. It is difficult and time-consuming to classify, according to a user's interests, the enormous information pouring in from online networks. Therefore, if the classification system can be built automatically using machine learning techniques, it will be very efficient. The problem of classifying texts has a very high-dimensional input space, and the information that the text itself contains is sparse. In this paper, the Support Vector Machine (SVM), an algorithm suitable for problems having these characteristics, is implemented. In order to experiment with the effect of the Support Vectors (SVs) which the SVM produces, a multilayer perceptron network is trained over the reduced data set using only SVs. The SVM is a very strong algorithm based on the Structural Risk Minimization (SRM) principle of statistical learning theory. In addition, the SVM's learning process, which searches for optimal solutions, is a mathematically well-modeled process called Convex Programming. In the experiment on the 5 most frequent topics of the Reuters-21578 document set, it is remarkable that the resulting accuracy is higher than 97%, and the SVM shows a better break-even point than the Naïve Bayesian classifier's. In addition, the multilayer perceptron network trained using only SVs not only shows good performance but also reduces training time remarkably.

Keywords: Text Classification, Multilayer Perceptron Network, SVM, SRM, Convex Programming
