Weather and Climate Prediction with Icosahedral-Hexagonal Model at PKNU Jul. 23, 2012 Jai-Ho Oh Pukyong National University, Korea jhoh@pknu.ac.kr
Contents
- Desire for the next NWP model
- Common Computational Environment for Sustainable Further Development
- Computational Performance of the NWP Model
- Performance of GME for Medium-Range Forecast
- Seasonal Prediction
- Time-slice Run with a High-Resolution Global Model based on IPCC AR4 Simulations
Desire for the next NWP model
Future Weather Information: Detailed (globally mesoscale, 1 km; locally microscale, 1 m)
More detailed: from 10 km down to 1 km.
Typhoon MAN-YI case (0704): simulations at 240 km, 40 km, 20 km, and 10 km resolution. Total precipitation (mm/3 hours), color scale from 0 to 30 mm.
Future Weather Information (continued): Accurate (0~2 days: perfect; 1 week: 85% forecast).
Accuracy of long-term rainfall prediction, by method (plotted against spatial and time scale, from ~20 km/1 hour to ~6,000 km/1 month, around the meso-β scale):
(1) Linear extrapolation
(2) Volume-scanning radar + conceptual rainfall model
(3) Meso-scale numerical model
(4) Meso-scale numerical model assimilated with radar information
(5) GCM
(6) Ensemble forecast
Future Weather Information: Global Weather Services in 2025 (R. Anthes, UCAR)
- Detailed: globally mesoscale (1 km), locally microscale (1 m)
- Accurate: 0~2 days perfect; 1 week 85% forecast
- Goal-oriented: time-weather travelling
- Prompt: instant delivery
- User-centered: two-way, web-based service
Pipeline: Observations
Observation networks in Korea: the KMA Integrated Observation System (IOS). Data from the COMS geostationary satellite (2008; with MTSAT and FY cloud imagery), satellite vertical sounders, aviation observations (AMDAR), radiosondes, weather radar, wind profilers, oceanic buoys, ship observations, and surface observations (AWS/ASOS) are collected and analyzed at KMA Headquarters.
Pipeline (continued): Observations → Data Assimilation
Data Assimilation Methods
Pipeline (continued): Observations → Data Assimilation → Numerical Models
Dilemma of NWP Models: a global NWP model must balance conservation, accuracy, and efficiency. (International Workshop on the Next Generation Numerical Weather Prediction Model)
Global NWP model: candidate grids under consideration are lat/lon, icosahedral/hexagonal, cubed-sphere, and yin-yang. Develop prototypes and assess their accuracy and efficiency.
Numerical weather and climate prediction model
- Target phenomena: cm to 1,000 km
- Governing equations: ~10 differential equations
- Number of mesh cells: ~10^8
- Time step: 10 sec (~10^4 steps per day)
- 1-day prediction: 10 x 10^8 x 10^4 = 10^13
- 1-year prediction: 10^13 x ~365 ≈ 10^16
- 100-year prediction: 10^16 x 10^2 = 10^18
- Total number of calculations: ~10^21
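The estimate above can be reproduced with a short sketch. The constants are the slide's round numbers, not measured values; the remaining factor of ~10^3 between the 10^18 gridpoint updates and the quoted ~10^21 calculations presumably covers the floating-point operations per update.

```python
# Rough operation count for a global NWP model, using the slide's
# round numbers (assumed, not measured).
N_EQUATIONS = 10      # ~10 prognostic differential equations
N_MESH = 10**8        # ~10^8 grid cells
DT_SEC = 10           # 10 s time step

steps_per_day = 24 * 3600 // DT_SEC                 # 8,640, i.e. ~10^4
ops_per_day = N_EQUATIONS * N_MESH * steps_per_day  # ~10^13
ops_per_year = ops_per_day * 365                    # ~10^16
ops_per_century = ops_per_year * 100                # ~10^18

for label, n in [("day", ops_per_day), ("year", ops_per_year),
                 ("century", ops_per_century)]:
    print(f"grid-point updates per {label}: {n:.1e}")
```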
Future available computer resources:
- 2005: ~20 Teraflops
- 2009: YAP (Year After Petaflops)
- 2014: ~100 Petaflops
- 2018: YOE (Year Of Exaflops)
Heterogeneous parallel computing: multi-core CPUs (e.g., 4 cores) alongside many-core GPUs (e.g., 240 cores).
Grid generation of the global model GME

ni    Δmin (km)  Δav (km)  Δmax (km)      n_GP
8       881.7      950.4    1,052.2        642
12      587.8      636.2      693.7      1,442
16      440.9      477.6      526.4      2,562
24      293.9      319.0      347.2      5,762
32      220.4      239.3      263.4     10,242
48      147.0      159.7      173.6     23,042
64      110.2      119.8      131.7     40,962
96       73.5       79.9       86.8     92,162
128      55.1       59.9       65.9    163,842
192      36.7       40.0       43.4    368,642
256      27.6       30.0       32.9    655,362
384      18.4       20.0       21.7  1,474,562
512      13.8       15.0       16.5  2,621,442
768       9.2       10.0       10.9  5,898,242

Computing requirement:
- 2009 (IBM p595, 64 nodes): a 10-day simulation takes 6 days
- 2018 (exaflop computer): a 10-day simulation takes (6 d x 24 h x 3600 s) / 10^4 ≈ 52 sec
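The gridpoint counts in the table follow from the grid construction: each of the 10 diamonds contributes ni x ni points, plus the two poles. A small sketch (the square-root-of-cell-area spacing below is a rough proxy and comes out slightly smaller than the tabulated average gridpoint distance):

```python
import math

R_EARTH_KM = 6371.0  # mean Earth radius (assumed value)

def gme_gridpoints(ni):
    """Gridpoints per layer on the GME icosahedral-hexagonal grid:
    10 diamonds of ni*ni points each, plus the two poles."""
    return 10 * ni * ni + 2

def mean_spacing_km(ni):
    """Rough mesh size: side of a square with the mean cell area."""
    cell_area = 4.0 * math.pi * R_EARTH_KM**2 / gme_gridpoints(ni)
    return math.sqrt(cell_area)

for ni in (32, 192, 768):
    print(f"ni={ni:4d}  n_GP={gme_gridpoints(ni):9,d}  ~{mean_spacing_km(ni):6.1f} km")
```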
Pipeline (continued): Observations → Data Assimilation → Numerical Models → Information utilization / Networks
Fujitsu Develops the World's Fastest Supercomputer. The performance of the K Computer equals roughly 1 million linked desktop PCs. The K Computer comprises 672 cabinets and consumes enough electricity to power around 10,000 homes, at an estimated cost of $10 million annually.
"There was a time when weather forecasting... in terms of the use of the world's top computers, was high up on that list. It's actually slid further and further down."
Moore's Law vs. Model Development
Common Computational Environment for Sustainable Further Development
Partnership & Leadership for the Nationwide Supercomputing Infrastructure
- Partnership among the nationwide supercomputing centers; leadership for the national cyberinfrastructure construction
- PLSI: a partnership of 12 regional supercomputing centers
- Tier 0 center: KISTI (project management, resource integration, technology development)
- Tier 1 and application centers (sharing their resources): Pusan TP, Chonnam Univ., KMA, UNIST, PNU, PKNU, GIST, KOBIC
- Tier 2 centers (sharing a resource): KIST, UOS, TU, KU
Goals: evolution of a regional HPC ecosystem through mutual collaboration, extending the regional user groups of each application domain, and advising regional manufacturing on adopting new HPC technology.
Computing Resources in PKNU
Internal (blue) and external (red) users by institution:

Institution        System     Internal CPU hours  Share   External CPU hours  Share
GIST               Kigi            67,000          17.7%       311,621         82.3%
KOBIC              kobic           58,895          99.7%           156          0.3%
Tongmyong Univ.    tusmp                -             -          14,906        100.0%
PKNU               Hamel          261,538          61.4%       164,422         38.6%
Pusan Nat'l Univ.  pdaisy          38,495          72.8%        14,400         27.2%
UNIST              Unist_smp       16,232          34.8%        30,420         65.2%
UNIST              cheetah             78           2.7%         2,787         97.3%
Computational Performance of the NWP Model
Data Structure: logical data layout of the icosahedral-hexagonal grid of GME, consisting of 10 rhombuses (diamonds), 5 containing the North Pole and 5 the South Pole.
Advantages of the Icosahedral-Hexagonal Model
- The grid is quasi-isotropic and avoids the pole problem.
- CFL for advection is not an issue.
- All cells are nearly the same size (within about 5% in terms of area).
- Avoids the large amount of global communication required by spectral transform techniques.
- The data structure is extremely well suited to high efficiency on distributed-memory parallel computers, and thus to high-performance computing.
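One concrete payoff of the quasi-uniform mesh is the CFL condition: a single global time step is stable everywhere, whereas on a lat/lon grid the zonal spacing collapses toward the poles and drags the stable step down with it (hence polar filtering). A sketch, assuming an illustrative fastest signal speed of 300 m/s (roughly external gravity waves); the 40 km result is close to the 133 s step GME uses at that resolution:

```python
import math

def cfl_dt(dx_m, c_max=300.0, courant=1.0):
    """Largest stable time step for signal speed c_max (CFL condition).
    c_max = 300 m/s is an assumed, illustrative value."""
    return courant * dx_m / c_max

# Quasi-uniform icosahedral grid at 40 km: one global step of ~133 s.
print(round(cfl_dt(40_000), 1))

# Lat/lon grid with 40 km spacing at the equator: dt collapses near the pole.
for lat_deg in (0, 60, 89):
    dx = 40_000 * math.cos(math.radians(lat_deg))
    print(lat_deg, round(cfl_dt(dx), 1))
```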
Schematic representation of the processes included in GME: adiabatic processes (pressure, winds, temperature, water vapour, cloud water, cloud ice, ozone) coupled to diffusion, radiation, cumulus convection, and grid-scale precipitation; surface exchange via momentum, sensible-heat, and latent-heat fluxes, surface roughness, surface and snow temperature, interception storage, snow, ground humidity, and snow melt. (Source: personal communication with Dr. Majewski)
Overview of GME, the Operational NWP Model at DWD
- Icosahedral-hexagonal grid; operators of second-order accuracy
- 40 km mesh size: 368,642 gridpoints per layer; 20 km mesh size: 1,474,562 gridpoints per layer
- 40 layers (hybrid sigma/pressure)
- Prognostic variables: p_s, u, v, T, q_v, q_c, q_i, O_3
- Programming: Fortran 90, MPI for message passing
- Intermittent data assimilation (OI); digital filtering initialization
Computational Performance of GME on the PKNU HAMEL Cluster
Model performance (32-bit REALs) was tested using 126-512 processors on 63-256 nodes.
PKNU HAMEL cluster: Rpeak 2.86 Tflops (Rmax 1.762 Tflops); Intel Xeon 2.8 GHz CPUs; 256 nodes, 512 CPUs; 3 GB memory and 36 GB disk per node; 10 TB shared storage; Myrinet 2000 interconnect.
Performance of GME on the PKNU HAMEL Cluster: speedup for a 24-h forecast (minimum, maximum, and total times, with and without post-processing).
GME 40 km/40L with 7-layer soil model: Δx = 40 km, 368,642 x 40 gridpoints, Δt = 133 sec.
- 24-h forecast using 360 processors: about 10 minutes
- Linear speedup from 126 to 208 processors; best performance at 360 processors
- 360-512 processors: the expected speedup is not achieved; increasing the processor count no longer yields efficient parallelization, because the system's communication is too slow to support 360 or more processors working simultaneously.
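The saturation beyond about 360 processors is the classic compute-versus-communication trade-off: per-processor compute time shrinks as 1/p while communication overhead grows with p, so the total step time has a minimum. A toy model with illustrative coefficients (not measured on HAMEL, chosen only so the optimum lands near the observed ~360 processors):

```python
# Toy scaling model: step time = compute/p + communication that grows with p.
# work and comm_per_proc are illustrative, not measured values.
def step_time(p, work=1000.0, comm_per_proc=0.008):
    return work / p + comm_per_proc * p

best_p = min(range(64, 1025), key=step_time)
print("best processor count:", best_p)   # near sqrt(work/comm) ~ 354
for p in (128, 256, 360, 512):
    print(p, "speedup:", round(step_time(1) / step_time(p), 1))
```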
Climate Big Data Management
GSDC roadmap (2 PB / 2,000 CPUs growing to 10 PB / 10,000 CPUs), serving Alice/CERN, CDF/FNAL, STAR/BNL, Belle/KEK, and LIGO/LLO:
- Phase 1 (2009-2011): provide a global science data analysis environment; national data center; Asia-Pacific hub center.
- Phase 2 (2012-2014): expand supported fields (earth environment, biometrics, nanotech, etc.); global computing resource assignment and information system; cyber research and training environment.
Need for this research: a large-capacity data farm to improve the efficiency of detailed-climate and climate-prediction research.
- Climate change: to prepare proactive responses to climate change, the need for detailed weather/climate prediction information is growing. Numerical models are the essential tool for weather/climate prediction (short- and medium-range forecasts, long-range seasonal prediction, future climate projection, etc.).
- Growing research on model refinement: alongside advances in high-performance computing, large-capacity storage is required to build and analyze a weather/climate simulation system with fine grids at global scale.
Large-data research at Pukyong National University
Building a large-capacity data farm:
- Build a GSDC-based large-capacity data farm for climate research
- Set up the global atmospheric model and manage required libraries in the KISTI environment
- Store climate experiment data and build a large-volume data supply system
Building the weather/climate data system:
- Produce 100-year future climate prediction data based on RCP climate scenarios
- Analyze high-resolution global climate simulation results
- Transfer climate data to the data farm and regenerate analysis-ready data
Activating the data-exchange community:
- Promote domestic and international collaborative research by sharing climate prediction data
- Provide extreme-climate analyses and large-volume data for Southeast Asia and developing countries
Building a data farm suited to the GSDC-based standard environment for weather/climate research
KISTI GSDC: large-capacity computing resources.
- Install the global weather prediction model and required libraries in the KISTI computing environment
- Model setup and experiment preparation: collect climate scenarios and model initial data
- Install required libraries: programming languages and meteorological analysis/graphics tools
High-resolution global climate simulation and analysis using the data farm
- Model: Icosahedral-Hexagonal Gridpoint Model (GME)
- Resolution: icosahedral-hexagonal ni192 (40 km), 40 layers; regular 900 x 451 (40 km)
- Initial data: ECMWF data (T512, 1024 x 512)
- Boundary data: SST and sea-ice concentration
  - Present climate (1979-2010, 32 years): AMIP observations
  - Future climate (2011-2100, 90 years): CMIP5 model results under RCP 8.5 and RCP 4.5
- RCP (Representative Concentration Pathways) climate scenarios (IPCC AR5): RCP8.5, RCP6.0, RCP4.5, RCP2.6
Analysis of the high-resolution global climate simulation results using the data farm: detailed future climate change over the globe and East Asia, plus planned analysis of extreme events, e.g., changes in the frequency and intensity of tropical nights and heavy rainfall, and changes in typhoon genesis.
Research achievements
- Transfer and use of climate experiment data on the GSDC-based large-capacity data farm
- Established storage of climate experiment data and a large-volume data supply system

Data set (model period / status / volume):
- Raw data (icosahedral grid): 1979-2009 (30 years), computation complete; 276 TB; transfer currently suspended due to storage limits
- PROCESS 1 (regular grid): 1979-93 (15 years), transferred to GSDC; 191 TB; transfer to GSDC in progress
- PROCESS 2 (GRIB binary format): 1979-86 (8 years), extraction complete; 95 TB
- PROCESS 3 (3-hourly to daily): 7.3 TB
- PROCESS 4 (daily to monthly): 384 GB
Total data volume: 569.7 TB
Storage requirement for the climate experiments: the total data volume is 569.7 TB, and an additional 530 TB of large-capacity storage is required to carry out the climate experiments.
Turning Observations into Knowledge Products
Data Management for Utilization of the Climate Experiment: processing pipeline
- GME raw data: icosahedral-hexagonal grid model output; ni192 (40 km)/40L; 3-hourly; 200 variables; GRIB format; 439 MB per time step; 1.3 TB per year; kept in the GME raw-data store.
- 1st PROCESS: interpolate to a regular grid (40 km, 900 x 451) and standard pressure levels (21L); 3-hourly; 200 variables; GRIB format; 87 MB (surface) + 224 MB (pressure) per time step; 0.9 TB per year; for universities and research institutes (climatologists, hydrologists): extreme events, typhoon analysis, etc.
- 2nd PROCESS: convert to binary format; 3-hourly; optional variables; 1.6 MB per variable per time step; 4.7 GB per variable per year; for universities and research institutes (climatologists): monsoon, MJO, etc.
- 3rd PROCESS: calculate daily data from 3-hourly; optional variables; binary; 600 MB per variable per year; for universities and research institutes (climatologists): seasonal analysis.
- 4th PROCESS: calculate monthly data from daily; optional variables; binary; 20 MB per variable per year.
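The yearly volumes in this pipeline follow directly from the per-time-step sizes; a sketch with all numbers taken from the table (the small gaps versus the tabulated values come from rounding):

```python
# Yearly archive volume from per-time-step output size (table values).
STEPS_3HOURLY = 8 * 365   # 3-hourly output over one (non-leap) year
MB_PER_TB = 1024**2
MB_PER_GB = 1024

raw_tb = 439 * STEPS_3HOURLY / MB_PER_TB             # table: 1.3 TB/year
regular_tb = (87 + 224) * STEPS_3HOURLY / MB_PER_TB  # table: 0.9 TB/year
binary_gb = 1.6 * STEPS_3HOURLY / MB_PER_GB          # table: 4.7 GB/variable
daily_mb = 1.6 * 365                                 # table: 600 MB/variable
monthly_mb = 1.6 * 12                                # table: 20 MB/variable

print(round(raw_tb, 2), round(regular_tb, 2), round(binary_gb, 2),
      round(daily_mb), round(monthly_mb))
```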
PKNU's Future Research on the NWP Model
Experiment 1: benchmark tests with ICON (ICOsahedral Non-hydrostatic model), the next-generation grid system
- Global 5 km/L90 experiment with ICON
- Benchmark experiments at 20 km/L90 globally and 5 km/L65 over East Asia
- Requires up to 4,000 CPUs of computing resources and 50 TB of data storage
Conclusions
- The desired resolution for the next NWP model is at least 10 km (or finer) globally and 1 km regionally.
- The next NWP model must be built on a model-development ecosystem that sustains further development.
- Further development of an NWP model should proceed faster than hardware enhancement.