Online Monitoring Methods

1. Introduction

Modern chemical plants are equipped with DCS (Distributed Control System) installations that collect and store large amounts of process data. On-line monitoring of these data aims at early fault detection and at diagnosis of the causes of abnormal operation. The traditional approach is to monitor a few key quality variables individually using Shewhart charts, CUSUM charts, and EWMA charts, collectively known as univariate Statistical Process Control (SPC): measured values of a quality variable are plotted against time and compared with control limits derived from its in-control behaviour.
Univariate SPC charts such as the one in Fig. 1, however, treat each variable separately and cannot account for the strong correlation among process variables. For this reason, chemometric projection methods such as Principal Component Analysis (PCA) and Partial Least Squares (PLS) have been applied to process monitoring. Fevotte and McKenna applied PCA to parameter estimation and on-line monitoring of polymerisation reactors; Nomikos and Kosanovich used Multiway PCA for batch processes; Piovoso applied PLS to process monitoring, Miller to photographic paper sensitization, and Skagerberg applied Multiblock PLS to an LDPE process; Kresta applied these methods to a fluidized bed reactor and an extractive distillation column. At the '96 European Symposium on Computer Aided Process Engineering, about ten papers dealt with chemometrics-based monitoring, reflecting the growing interest in the area.
2. Monitoring Methods

2-1. Statistical Process Control (SPC)

Statistical Process Control (SPC) is a body of techniques for monitoring product quality and keeping a process on target [4,5,6]. SPC goes back to the 1930s, when Walter Shewhart [1] introduced the control chart that bears his name. The monitored quality variable is modelled as

y_t = μ + ε_t

where y_t is the measurement at time t, μ is the target (mean) value, and ε_t is a random error with variance σ². As shown in Fig. 1, control lines corresponding approximately to the 99% and 95% confidence limits (UCL, UWL, CL, LWL, LCL) are drawn about the target. As long as the plotted points remain within these limits, the process is in a state of statistical control; points outside the limits indicate that the process is out of control.

Fig. 1 Shewhart chart (measured variable versus sample number or time, with control lines UCL, UWL, CL, LWL, LCL).
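As a concrete illustration, the following minimal sketch (simulated data; the target and spread are assumed to be estimated from an in-control reference run) computes warning and control limits of roughly the 95% and 99% coverage shown in Fig. 1 and flags new samples that fall outside them.

```python
# Minimal sketch of Shewhart chart limits for y_t = mu + eps_t,
# assuming mu and sigma are estimated from an in-control reference run.
import numpy as np

rng = np.random.default_rng(0)
reference = 50.0 + rng.normal(0.0, 2.0, size=200)   # in-control data (mu = 50, sigma = 2)

mu = reference.mean()
sigma = reference.std(ddof=1)

# Warning limits (~95%, +/- 2 sigma) and control limits (~99%, +/- 3 sigma)
UWL, LWL = mu + 2 * sigma, mu - 2 * sigma
UCL, LCL = mu + 3 * sigma, mu - 3 * sigma

new_samples = 50.0 + rng.normal(0.0, 2.0, size=20)
new_samples[12] += 9.0                               # simulated shift -> out of control

for t, y in enumerate(new_samples):
    if y > UCL or y < LCL:
        print(f"t={t}: y={y:.2f} out of control")
    elif y > UWL or y < LWL:
        print(f"t={t}: y={y:.2f} beyond warning limit")
```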
Process monitoring with a Shewhart chart such as that of Fig. 1 involves three activities:
( i ) Fault detection: detecting from the chart that the process has moved out of statistical control.
( ii ) Diagnosis: identifying the assignable cause of the deviation, e.g. tank leaking, valve malfunction, controller malfunction.
( iii ) Correction: removing the cause and bringing the process back into statistical control [4].

2-2. Multivariate Statistical Process Control (MSPC) [6]

Univariate SPC monitors each quality variable on its own chart, but process data are inherently multivariate and the variables are highly correlated, so applying a separate Shewhart chart to every variable can be misleading. Consider two correlated quality variables, temperature (T) and pressure (P), each monitored with its own Shewhart chart; Fig. 2 combines the two charts into a single bivariate plot of the (T, P) observations.
In Fig. 2 the observation marked x lies within the individual Shewhart control limits of both variables, so neither univariate chart signals a problem; viewed jointly, however, x clearly violates the correlation structure of the data and corresponds to abnormal operation. Detecting such events requires control limits defined on the joint behaviour of the variables rather than on each variable separately.

Fig. 2 Bivariate plot of P versus T (the point x lies within both sets of univariate limits but outside the joint confidence region).
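The situation of Fig. 2 can be reproduced numerically. The sketch below uses simulated, strongly correlated (T, P) data and a hypothetical observation that stays inside both univariate 3σ bands yet lies far from the data cloud in the joint sense (large squared Mahalanobis distance); the specific numbers are assumptions of the sketch.

```python
# Minimal sketch of the bivariate example of Fig. 2 on simulated data:
# two highly correlated variables (T, P); the test observation is inside
# both univariate 3-sigma limits but violates the correlation structure.
import numpy as np

rng = np.random.default_rng(1)
n = 200
T = rng.normal(0.0, 1.0, n)
P = 0.9 * T + rng.normal(0.0, np.sqrt(1 - 0.9**2), n)   # corr(T, P) ~ 0.9
X = np.column_stack([T, P])

mean = X.mean(axis=0)
std = X.std(axis=0, ddof=1)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))

x_new = np.array([2.0, -2.0])        # against the correlation, yet inside each univariate band

within_univariate = np.all(np.abs(x_new - mean) < 3 * std)
T2 = (x_new - mean) @ S_inv @ (x_new - mean)             # squared Mahalanobis distance

print("inside both univariate 3-sigma limits:", within_univariate)   # True
print("squared Mahalanobis distance:", round(T2, 1))                 # large -> jointly abnormal
```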
The multivariate projection methods used to construct such joint limits are Principal Component Analysis (PCA) and Partial Least Squares (PLS), described below.

2-3. Principal Component Analysis (PCA)

PCA was introduced by Pearson [2] and developed into its present statistical form by Hotelling [3]; a tutorial review is given by Wold et al. [9,14]. PCA operates on a data matrix X:

X = [ x_11  x_12  ...  x_1K
      x_21  x_22  ...  x_2K
        :     :          :
      x_N1  x_N2  ...  x_NK ]

whose N rows are objects (observations such as samples or time points) and whose K columns are variables (e.g. temperature, pressure, pH). PCA projects the K-dimensional observations onto a lower-dimensional subspace spanned by the principal components (PCs). Fig. 4 illustrates this for a system with three measured variables: the first PC is the direction of greatest variability in the data, and the coordinates of the observations along a PC form its score vector. Each PC is a linear combination of the original variables, so a few PCs summarise the systematic behaviour of the system.
Fig. 4 Projection of a three-variable data set onto its principal components (axes: Variable 1, Variable 2, Variable 3; arrows mark the 1st and 2nd PC).

Using a PCs, the N × K data matrix X is decomposed, as shown in Fig. 5, into a sum of rank-one matrices

X = M_1 + M_2 + M_3 + ... + M_a

where each M_i is the outer product of a score vector t_i (N × 1) and a loading vector p_i^T (1 × K), so that

X = t_1 p_1^T + t_2 p_2^T + ... + t_a p_a^T = T_a P_a^T

Fig. 5 Decomposition of X into a PCs, each the product of a score vector and a loading vector.
Each score vector t_i holds the coordinates of the N observations along the i-th PC, and the sum of these a rank-one terms reconstructs the systematic part of X, as in Fig. 5. For the three-variable system of Fig. 4, for example, two PCs already describe essentially all of the variability. The first PC explains the largest amount of variability in the data, the second PC the next largest, and so on, so that a small number of PCs written as a linear sum captures most of the systematic variation of the system; this compression is the essence of PCA.

Fig. 6 A single score vector t (N × 1) and loading vector p^T (1 × K), forming one rank-one term of the PCA decomposition.

Before PCA is applied, each column of the data matrix (X) is mean-centered to zero and scaled to unit variance (centering and scaling).
After centering and scaling of (X), the covariance matrix is proportional to S = X^T X. The loading vectors are the eigenvectors of this covariance matrix and span the column space of X, and the eigenvalue associated with each loading vector equals the amount of variance explained by the corresponding PC. The PCs are therefore ordered by decreasing eigenvalue: the first PC explains the largest portion of the variance, the second PC the next largest, and so on. Retaining a PCs, the decomposition of Fig. 5 is written

X = T_a P_a^T + E    (E : residual matrix)

E = X - T_a P_a^T

SSQ = Σ_{n=1}^{N} Σ_{k=1}^{K} E(n,k)²

so that PCA describes the systematic variation of the system with a small number (a) of PCs. The number of PCs is chosen by validating the model, for example by cross-validation or an F-test on the residuals, so that the retained PCs capture the systematic variation and the residual (E) contains only random error.
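A minimal numerical sketch of this decomposition, on simulated data, is given below: the loadings are taken as the leading eigenvectors of S = X^T X after centering and scaling, and the residual SSQ quantifies what the a retained PCs leave unexplained. The data set and the choice a = 2 are assumptions of the sketch.

```python
# Minimal sketch of X = T_a P_a^T + E via eigendecomposition of S = X^T X
# on simulated, centered and scaled data.
import numpy as np

rng = np.random.default_rng(2)
N, K, a = 100, 6, 2
latent = rng.normal(size=(N, 2))
X = latent @ rng.normal(size=(2, K)) + 0.1 * rng.normal(size=(N, K))   # ~2-dimensional data

# centering and scaling to zero mean and unit variance
X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

S = X.T @ X                                 # (scaled) covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)        # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
P_a = eigvecs[:, order[:a]]                 # loading vectors = leading eigenvectors
T_a = X @ P_a                               # score vectors

E = X - T_a @ P_a.T                         # residual matrix
SSQ = np.sum(E**2)
explained = 1 - SSQ / np.sum(X**2)
print(f"SSQ = {SSQ:.2f}, variance captured by {a} PCs = {explained:.3f}")
```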
In practice the PCs are computed with the NIPALS (Nonlinear Iterative Partial Least Squares) algorithm, which extracts one PC at a time: the first score vector t_1 and loading vector p_1^T are computed from X, the residual E_1 = X - t_1 p_1^T is formed, the second pair (t_2, p_2^T) is computed from E_1, and so on until, after a PCs, the residual contains only random error:

E_1 = X - t_1 p_1^T,  E_2 = E_1 - t_2 p_2^T,  ...,  E_h = E_{h-1} - t_h p_h^T

The NIPALS algorithm proceeds as follows:
1. Take a column x_j of X as the starting score vector: t_h = x_j
2. p_h^T = t_h^T X / (t_h^T t_h)
3. Normalize p_h^T to length 1: p_h,new^T = p_h^T / ||p_h^T||
4. t_h = X p_h,new / (p_h,new^T p_h,new)
5. Compare the t_h used in step 2 with that obtained in step 4. If they are the same, stop (the iteration has converged); if they still differ, go to step 2.
6. E = X - t_h p_h^T ; set X = E
7. Go to step 1 for the next PC.
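The listing below is a direct sketch of these NIPALS steps in Python; the data matrix is assumed to be centered and scaled beforehand, and column 0 is used as the arbitrary starting score vector.

```python
# Minimal sketch of the NIPALS iteration above, extracting PCs one at a
# time by deflation (assumes X is already centered and scaled).
import numpy as np

def nipals_pca(X, n_pcs, tol=1e-10, max_iter=500):
    X = X.copy()
    scores, loadings = [], []
    for _ in range(n_pcs):
        t = X[:, 0].copy()                          # step 1: start from a column of X
        for _ in range(max_iter):
            p = X.T @ t / (t @ t)                   # step 2: regress columns of X on t
            p = p / np.linalg.norm(p)               # step 3: normalize loading to length 1
            t_new = X @ p / (p @ p)                 # step 4: new score vector
            converged = np.linalg.norm(t_new - t) < tol
            t = t_new
            if converged:                           # step 5: convergence check
                break
        X = X - np.outer(t, p)                      # step 6: deflate, E = X - t p^T
        scores.append(t)
        loadings.append(p)
    return np.column_stack(scores), np.column_stack(loadings)

# usage on random, mean-centered data
rng = np.random.default_rng(3)
X = rng.normal(size=(50, 5))
X = X - X.mean(axis=0)
T, P = nipals_pca(X, n_pcs=2)
print(T.shape, P.shape)   # (50, 2), (5, 2)
```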
Once the PCs have been extracted, the score vectors of the retained PCs can be monitored with conventional SPC charts, and their joint behaviour can be monitored with Hotelling's T² statistic.

Hotelling's T² statistic [3]. Given n observation vectors x_i with mean vector x̄, the sample covariance matrix is

S = (n-1)^{-1} Σ_{i=1}^{n} (x_i - x̄)(x_i - x̄)^T    (3)

For an observation vector x, Hotelling's T² statistic is (Jackson [8])

T² = (x - τ)^T S^{-1} (x - τ)    (4)

where τ is the target (mean) value vector. The upper control limit (UCL) of T² is

T²_UCL = [(n-1)(n+1) a / (n (n-a))] F_α(a, n-a)    (5)

where F_α(a, n-a) is the upper 100α % critical point of the F distribution with a and n-a degrees of freedom, n is the number of samples, a is the number of retained PCs, and α is the significance level. For example, with a = 2 PCs and n = 15 samples at the 95% level (α = 0.05), F_{2,13,0.05} = 3.81 and Eq. (5) gives a T² UCL of about 8.75.
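The sketch below applies Eqs. (4) and (5) to the scores of a PCA model; the reference scores are simulated and the number of PCs a is assumed to have been fixed in advance.

```python
# Minimal sketch of T^2 monitoring on PC scores with the UCL of Eq. (5);
# data are simulated and a (number of retained PCs) is assumed given.
import numpy as np
from scipy import stats

def t2_ucl(n, a, alpha=0.05):
    # Eq. (5): UCL = (n-1)(n+1) a / (n (n-a)) * F_alpha(a, n-a)
    F = stats.f.ppf(1 - alpha, a, n - a)
    return (n - 1) * (n + 1) * a / (n * (n - a)) * F

rng = np.random.default_rng(4)
n, a = 15, 2
scores = rng.normal(size=(n, a))                     # reference-set PC scores
mean = scores.mean(axis=0)
S_inv = np.linalg.inv(np.cov(scores, rowvar=False))

ucl = t2_ucl(n, a, alpha=0.05)

new_score = np.array([3.5, -3.0])                    # scores of a new observation
T2 = (new_score - mean) @ S_inv @ (new_score - mean) # Eq. (4) with tau = mean
print(f"T2 = {T2:.2f}, UCL = {ucl:.2f}, out of control: {T2 > ucl}")
```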
Fig. 7 summarises the main applications of PCA: simplification, data reduction, modeling, outlier detection, variable selection, classification, and prediction.

Fig. 7 Applications of PCA.

Because PCA compresses the systematic variation of a system into a small number (a) of PCs, it serves directly for simplification and data reduction, and a model relating inputs to outputs can be used for prediction. The score plot (t_1 vs t_2) is used for classifying the observations into groups and for detecting outliers, while the loading vector plot (p_1^T vs p_2^T) shows which variables contribute to the observed patterns and outliers, making it useful for outlier detection and variable selection [8, 9]. Reported applications of PCA include Dunia et al. [10] and Tong and Crowe [11] for sensor fault identification and gross error detection, Heyen et al. [23] for sensitivity calculations and variance analysis in measurement reconciliation, and Fevotte and McKenna [24] for parameter estimation and on-line monitoring of polymerisation reactors.

2-4. Multiway Principal Component Analysis (MPCA)

Ordinary PCA operates on two-way data matrices, but batch process data have an additional dimension: for each batch (i = 1, 2, ..., I), J variables (j = 1, 2, ..., J) are measured at K time intervals (k = 1, 2, ..., K). MPCA extends PCA, and hence MSPC, to such three-way arrays [12, 15, 16]; it can also be applied, for instance, to image data. The batch data are arranged in a three-way array X of dimension (I × J × K), as shown in Fig. 8.

Fig. 8 MPCA: arrangement of the three-way array X (batches I × measurements J × time K) and its decomposition X = Σ_{r=1}^{R} t_r ⊗ P_r + E, with t_r of dimension I × 1 × 1, P_r of dimension 1 × J × K and E of dimension I × J × K.
Each horizontal slice of this array is a (K × J) matrix holding the trajectories of all J variables over the K time intervals for one batch (i), so each batch run contributes one slice to X. MPCA is statistically and algorithmically equivalent to ordinary PCA: the three-way array X is unfolded into a two-dimensional matrix and PCA is applied to the unfolded matrix. As shown in Fig. 9, X is unfolded so that each of its I rows contains the complete measurement history of one batch, giving an (I × KJ) matrix; analysing this matrix focuses MPCA on the batch-to-batch variability.

Fig. 9 Unfolding of the three-way array X into an (I × KJ) matrix and its PCA decomposition X = Σ t_r p_r^T + E.
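A minimal sketch of this unfolding approach on simulated batch data is shown below; the ordering of variables and times within each unfolded row, and the use of an SVD in place of the NIPALS iteration, are choices of the sketch rather than requirements of MPCA.

```python
# Minimal sketch of MPCA by unfolding on simulated batch data:
# a three-way array (I batches x J variables x K times) is unfolded to
# (I x JK) and ordinary PCA is applied to the mean-centered result.
import numpy as np

rng = np.random.default_rng(5)
I, J, K = 30, 4, 50
X3 = rng.normal(size=(I, J, K))                  # simulated batch trajectories

X = X3.reshape(I, J * K)                         # unfold: each row = one whole batch
X = X - X.mean(axis=0)                           # remove the average batch trajectory

# PCA on the unfolded matrix via SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
R = 2                                            # number of components retained
T = U[:, :R] * s[:R]                             # batch scores (I x R)
P = Vt[:R].T                                     # loadings (JK x R)
E = X - T @ P.T                                  # residual part

SPE = np.sum(E**2, axis=1)                       # squared prediction error per batch
print(T.shape, P.shape, SPE.shape)
```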
As in ordinary PCA (Fig. 6), MPCA decomposes the unfolded X into a systematic part, the sum of products of score vectors (t_r) and loading matrices (P_r), Σ_{r=1}^{R} t_r p_r^T, and a residual part (E). Nomikos and MacGregor [15, 16] developed on-line batch monitoring charts based on the SPE (squared prediction error) and the scores of the MPCA model, of the type shown in Fig. 10; other applications of MPCA include Kosanovich (1996) and Gallagher and Wise [12], who applied it to nuclear waste storage tank monitoring.

Fig. 10 MPCA monitoring charts for a batch process.

2-5. Partial Least Squares or Projection to Latent Structures (PLS) [14, 17, 18]

PCA finds the PCs that describe the systematic variation within a single data block (X). In process monitoring, however, one often wants to relate the process variables to the product quality; PLS is the projection method for this purpose. Whereas PCA models the variation in the process data (X) alone, PLS models the relationship between the process data (X) and the quality data (Y), so that the quality variables (Y) can be predicted and monitored from the process measurements (X).
Methods for constructing a mapping (transfer matrix) from X to Y include, besides PLS, Multiple Linear Regression (MLR) and Principal Component Regression (PCR). Even after centering and scaling, MLR suffers from collinearity and singularity problems when the X variables are correlated (Wold, S. et al., 1984 [17]). PCR avoids collinearity and singularity by regressing y on the score vectors (or PCs) of X instead of on X directly; however, the PCs are chosen to describe the variation of X only, without any reference to Y, so the directions used by PCR are not necessarily those most relevant for predicting Y. Partial Least Squares (PLS) resolves this by decomposing both X and Y, as in PCA, according to Fig. 11; these decompositions are called the outer relations.

Fig. 11 Outer relations of PLS: X = T_a P_a^T + E and Y = U_a Q_a^T + F*.

Unlike PCR, PLS then links the X score vectors (t_h) and the Y score vectors (u_h) through the inner relation u_h = b_h t_h, so that the X scores carry the information in X that is predictive of Y. Moreover, the X score vectors are computed using weights (w_h) determined from Y, so that each latent variable reflects the contribution of the X variables to Y.
Because PLS regresses on score vectors, it avoids the collinearity and singularity problems of MLR, and because the weights reflect the contribution of the X variables to Y, its latent variables are better suited for prediction than the PCs used in PCR. Whereas the PCA loading vectors are eigenvectors of the covariance matrix (S = X^T X), the PLS weight vectors are related to the eigenvectors of (X^T Y)(Y^T X). Like PCA, PLS is computed with a NIPALS-type algorithm. The outer relations have the same form as in PCA,

X = T P^T + E,  Y = U C^T + F,

and the inner relation is U = T B + G with B = (T^T T)^{-1} T^T U. The algorithm for one latent variable is:

1. Start: set u equal to a column of Y
2. w^T = u^T X / (u^T u) (regress columns of X on u)
3. Normalize w to unit length
4. t = Xw / (w^T w) (calculate the scores)
5. q^T = t^T Y / (t^T t) (regress columns of Y on t)
6. Normalize q to unit length
7. u = Yq / (q^T q) (calculate new u vector)
8. Check convergence: if yes go to step 9, if no go to step 2
9. X loadings: p = X^T t / (t^T t)
10. Regression (inner relation): b = u^T t / (t^T t)
11. Calculate residual matrices: E = X - t p^T and F = Y - b t q^T
12. To calculate the next set of latent vectors, replace X and Y by E and F and repeat.

Steps 1-8 are iterated until the X score vector t converges, and step 9 then computes the X loadings. Steps 1-8 correspond to the NIPALS iteration of PCA except that the score vector t is exchanged with the Y block at every pass; it is this exchange that makes the PLS latent variables different from the PCs of PCA and PCR. Step 10 then provides the inner-relation coefficient b that links the X scores to the Y scores and hence carries the predictive part of the model.
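The following sketch implements the twelve steps above for one X block and one Y block; X and Y are assumed to be centered and scaled, and column 0 of Y is used as the starting u.

```python
# Minimal sketch of the PLS (NIPALS) algorithm listed above.
import numpy as np

def nipals_pls(X, Y, n_lv, tol=1e-10, max_iter=500):
    X, Y = X.copy(), Y.copy()
    T, W, P, Q, B = [], [], [], [], []
    for _ in range(n_lv):
        u = Y[:, 0].copy()                          # 1. start from a column of Y
        for _ in range(max_iter):
            w = X.T @ u / (u @ u)                   # 2. regress columns of X on u
            w = w / np.linalg.norm(w)               # 3. normalize w
            t = X @ w                               # 4. X scores (w has unit length)
            q = Y.T @ t / (t @ t)                   # 5. regress columns of Y on t
            q = q / np.linalg.norm(q)               # 6. normalize q
            u_new = Y @ q                           # 7. new Y scores (q has unit length)
            converged = np.linalg.norm(u_new - u) < tol
            u = u_new
            if converged:                           # 8. convergence check
                break
        p = X.T @ t / (t @ t)                       # 9. X loadings
        b = (u @ t) / (t @ t)                       # 10. inner relation u_h = b_h t_h
        X = X - np.outer(t, p)                      # 11. E = X - t p^T
        Y = Y - b * np.outer(t, q)                  # 11. F = Y - b t q^T
        T.append(t); W.append(w); P.append(p); Q.append(q); B.append(b)
    return (np.column_stack(T), np.column_stack(W),
            np.column_stack(P), np.column_stack(Q), np.array(B))

# usage on simulated, centered data
rng = np.random.default_rng(6)
X = rng.normal(size=(40, 6)); X -= X.mean(axis=0)
Y = X[:, :2] @ rng.normal(size=(2, 2)) + 0.1 * rng.normal(size=(40, 2)); Y -= Y.mean(axis=0)
T, W, P, Q, B = nipals_pls(X, Y, n_lv=2)
print(T.shape, P.shape, Q.shape, B)
```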
As in PCA, the number of latent variables (a) retained in PLS is determined by cross-validation, using the PRESS statistic. Reported applications of PLS [17, 19] include Wise (industrial ceramic melter), Piovoso and Slama (fluid bed catalytic cracking fractionation section), Miller (photographic paper sensitization), Dayal (industrial pulp digester), and Hodouin (mineral crushing, grinding, and flotation circuits).

2-6. Multiblock Partial Least Squares or Projection to Latent Structures (MPLS) [19]

When the process variables naturally divide into several blocks, for instance several process sections, ordinary PLS can be extended to multiblock PLS (MPLS). Multiblock projection methods thus include multiway PCA and multiblock PLS: MPCA extends PCA to three-way arrays, and MPLS extends PLS to several X blocks (the multiblock idea goes back to Herman Wold [4] and Svante Wold [4]). Whereas MPCA unfolds the three-way array before applying ordinary PCA, MPLS keeps the blocks separate and links them within the algorithm itself.
The MPLS algorithm is a NIPALS-type iteration like that of PLS. In each iteration, score vectors are computed for the individual X blocks, these block scores are collected and projected to form a consolidated score vector (t_c), and t_c is related to the Y score vector exactly as in ordinary PLS. MPLS can be used off-line, for example to relate the process yield (quality) to several blocks of process data. For the case of two X blocks (X1, X2) and one Y block, the MPLS algorithm is:
1) Set u equal to a column of Y.
2) Compute (w1, t1) and (w2, t2) from X1 and X2 by regressing each block on u, as in PLS.
3) Collect the block score vectors t1 and t2 into the matrix T.
4) Treating the T matrix as the X block of an ordinary PLS step, compute its loading vector v, the consolidated score vector t_c, the Y-matrix loading vector q, and a new score vector u.
5) Check the convergence of u; if it has not converged, return to step 2).
6) Compute the loading vectors of X1 and X2: p1 = X1^T t1 / (t1^T t1), p2 = X2^T t2 / (t2^T t2).
7) Compute the residual matrices: E1 = X1 - t1 p1^T, E2 = X2 - t2 p2^T, F = Y - t_c q^T.
8) Replace X1, X2, Y by E1, E2, F and return to step 1) for the next latent variable.
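The sketch below follows steps 1)-8) for two X blocks and one Y block on simulated data; the normalization choices for the block weights and for the super-level loading vector v are assumptions of the sketch rather than details fixed by the description above.

```python
# Minimal sketch of one two-block MPLS latent variable (steps 1-8 above),
# assuming X1, X2 and Y are centered and scaled.
import numpy as np

def multiblock_pls_lv(X1, X2, Y, tol=1e-10, max_iter=500):
    u = Y[:, 0].copy()                                    # 1) start from a column of Y
    for _ in range(max_iter):
        w1 = X1.T @ u / (u @ u); w1 /= np.linalg.norm(w1) # 2) block weights and
        w2 = X2.T @ u / (u @ u); w2 /= np.linalg.norm(w2) #    block scores, as in PLS
        t1, t2 = X1 @ w1, X2 @ w2
        T = np.column_stack([t1, t2])                     # 3) collect block scores
        v = T.T @ u / (u @ u); v /= np.linalg.norm(v)     # 4) PLS step on T vs Y
        tc = T @ v                                        #    consolidated score t_c
        q = Y.T @ tc / (tc @ tc)                          #    Y loading vector
        u_new = Y @ q / (q @ q)                           #    new Y score vector
        converged = np.linalg.norm(u_new - u) < tol
        u = u_new
        if converged:                                     # 5) convergence check on u
            break
    p1 = X1.T @ t1 / (t1 @ t1)                            # 6) block loading vectors
    p2 = X2.T @ t2 / (t2 @ t2)
    E1 = X1 - np.outer(t1, p1)                            # 7) residual matrices
    E2 = X2 - np.outer(t2, p2)
    F = Y - np.outer(tc, q)
    return (t1, t2, tc), (p1, p2, q), (E1, E2, F)         # 8) repeat on E1, E2, F

# usage on simulated, centered blocks
rng = np.random.default_rng(7)
X1 = rng.normal(size=(30, 4)); X2 = rng.normal(size=(30, 3))
Y = np.column_stack([X1[:, 0] + X2[:, 1], X1[:, 2]]) + 0.1 * rng.normal(size=(30, 2))
X1 -= X1.mean(axis=0); X2 -= X2.mean(axis=0); Y -= Y.mean(axis=0)
scores, loadings, residuals = multiblock_pls_lv(X1, X2, Y)
```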
Fig. 12 illustrates multiblock PLS. With A latent variables, the multiblock decomposition is

X1 = Σ_{a=1}^{A} t_{1a} p_{1a}^T + E1,  X2 = Σ_{a=1}^{A} t_{2a} p_{2a}^T + E2,  Ŷ = Σ_{a=1}^{A} t_{ca} q_a^T

MacGregor et al. [19] applied multiblock PLS (1) to an LDPE process divided into two blocks, using contribution plots to diagnose the variables responsible for deviations, and (2) to a batch polymerization process, where the batch data were split into a batch set-up data block and a batch trajectory data block so that an abnormal batch could be traced to one of the two blocks. Kresta et al. (1991) [20] applied these monitoring methods to a fluidized bed reactor and an extractive distillation column.

3. Conclusions
Chemometric methods (PCA, PLS) achieve data compression and complexity reduction [10, 25] by building empirical models from the process data themselves rather than from deterministic first-principles models. They have also been used for data reconciliation [23], gross error detection, and parameter estimation [24], and have been extended to MPCA and MPLS for batch and multiblock data. Research activity in this area has grown rapidly since the early 1990s; domestic work includes a PLS study (1995) and a study on chemometric gross error detection presented in '96. For continuous process monitoring, the approaches of Kresta [20] and Jackson [8] are representative, and for batch processes those of Nomikos and MacGregor [15, 16]. PLS is also applied to on-line quality monitoring, and PCA and MPCA to monitoring of the process system itself. In summary, PCA is well suited to classification, outlier detection, and variable selection, while PLS is suited to prediction; in both cases the number of latent variables is chosen by cross-validation using PRESS. MacGregor's review paper provides a broad survey of these applications.
References

1. Shewhart, W. A., Economic Control of Quality of Manufactured Product, Van Nostrand, Princeton, NJ (1931)
2. Pearson, K., On lines and planes of closest fit to systems of points in space, Phil. Mag., Ser. 6, 2, pp. 559~572 (1901)
3. Hotelling, H., Analysis of a complex of statistical variables into principal components, J. Educat. Psychol., 24, pp. 417~441 (1933)
4. MacGregor, J. F., Statistical Process Control for the Process Industries, The 4th International Symp. on PSE, Montebello, Quebec, Canada, August 5-9 (1991)
5. Montgomery, D. C., Introduction to Statistical Quality Control, 2nd Ed., John Wiley & Sons, Inc., New York (1991)
6. Kourti, T., J. Lee and J. F. MacGregor, Experiences with industrial applications of projection methods for multivariate statistical process control, Comp. Chem. Eng., Vol. 20, Suppl., pp. S745~S750 (1996)
7. Neter, J., W. Wasserman and M. H. Kutner, Applied Linear Statistical Models, 3rd Ed., Richard D. Irwin, Inc. (1990)
8. Jackson, J. E., A User's Guide to Principal Components, John Wiley & Sons, Inc. (1991)
9. Wold, S., K. Esbensen and P. Geladi, Principal Component Analysis, Chemometrics and Intell. Lab. Sys., 2, 37~52 (1987)
10. Dunia, R., S. J. Qin, T. F. Edgar and T. J. McAvoy, Use of principal component analysis for sensor fault identification, Comp. Chem. Eng., Vol. 20, Suppl., pp. S713~S718 (1996)
11. Tong, H. and C. M. Crowe, Detection of persistent gross errors by sequential analysis of principal components, Comp. Chem. Eng., Vol. 20, Suppl., pp. S733~S738 (1996)
12. Gallagher, N. B. and B. M. Wise, Application of multi-way principal components analysis to nuclear waste storage tank monitoring, Comp. Chem. Eng., Vol. 20, Suppl., pp. S739~S744 (1996)
13. Martin, E. B., A. J. Morris, M. C. Papazoglou and C. Kiparissides, Batch process monitoring for consistent production, Comp. Chem. Eng., Vol. 20, Suppl., pp. S599~S604 (1996)
14. Wold, S., P. Geladi, K. Esbensen and J. Ohman, Multi-way Principal Components and PLS-Analysis, J. Chemometrics, 1, 41~56 (1987)
15. Nomikos, P. and J. F. MacGregor, Monitoring Batch Processes Using Multiway Principal Component Analysis, AIChE J., 40, 8, 1361~1375 (1994)
16. Nomikos, P. and J. F. MacGregor, Multivariate SPC Charts for Monitoring Batch Processes, Technometrics, 37, 1, 41~59 (1995)
17. Wold, S., A. Ruhe, H. Wold and W. J. Dunn III, The Collinearity Problem in Linear Regression. The Partial Least Squares (PLS) Approach to Generalized Inverses, SIAM J. Sci. Stat. Comput., 5, 3 (1984)
18. Geladi, P. and B. R. Kowalski, Partial Least-Squares Regression: A Tutorial, Analytica Chimica Acta, 185, 1~17 (1986)
19. MacGregor, J. F., C. Jaeckle, C. Kiparissides and M. Koutoudi, Process Monitoring and Diagnosis by Multi-block PLS Methods, AIChE J., 40, 5, 826~838 (1994)
20. Kresta, J. V., J. F. MacGregor and T. E. Marlin, Multivariate Statistical Monitoring of Process Operating Performance, The Can. J. Chem. Eng., 69, 35~47 (1991)
21. Zullo, L., Validation and verification of continuous plants operating modes using multivariate statistical methods, Comp. Chem. Eng., Vol. 20, Suppl., pp. S683~S688 (1996)
22. Bandoni, J. C. A. and J. A. Romagnoli, Robust statistical process monitoring, Comp. Chem. Eng., Vol. 20, Suppl., pp. S497~S502 (1996)
23. Heyen, G., E. Marechal and B. Kalitventzeff, Sensitivity calculations and variance analysis in plant measurement reconciliation, Comp. Chem. Eng., Vol. 20, Suppl., pp. S539~S544 (1996)
24. Fevotte, G., I. Varudio and T. F. McKenna, Computer-aided parameter estimation and on-line monitoring of emulsion and solution polymerisation reactors, Comp. Chem. Eng., Vol. 20, Suppl., pp. S581~S586 (1996)
25. Raich, A. and A. Cinar, Statistical Process Monitoring and Disturbance Diagnosis in Multivariable Continuous Processes, AIChE J., 42, 4, pp. 995~1009 (1996)