NNInput NNInputs_180.root
Options for steering
Constraint : lep1_E<400&&lep2_E<400&&
HiLoSbString : SB
SbString : Target
WeightString : TrainWeight
EqualizeSB : 0
EvaluateVariables : 0
SetNBProcessingDefault : 1
UseNeuroBayes : 1
WeightEvents : 1
NBTreePrepEvPrint : 1
NBTreePrepReportInterval : 10000
NB_Iter : 250
NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 15965 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 3134 nbkg = 12831
Bkg Entries: 12831
Sig Entries: 3134
Chosen entries: 3134
Warning: entries low (below 6000)
Signal fraction: 1
Background fraction: 0.244252
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 12831
Actual Signal Entries: 3134
Entries to split: 3134
Test with : 1567
Train with : 1567
*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag.
This is only permitted with a developers licence.
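The NNAna::CopyTree step above applies the lep1_E<400&&lep2_E<400 constraint, splits the 15965 input entries into a signal tree (Target==1, 3134 entries) and a background tree (Target==0, 12831 entries), and then divides the chosen signal entries evenly into test and training halves (1567 each). A minimal PyROOT sketch of the same selection step; the tree name ("NNInput") and the output file name are assumptions, only the selection strings are taken from the log above:

```python
# Minimal PyROOT sketch of the NNAna::CopyTree / splitting step above.
# The input tree name ("NNInput") and the output file are assumptions;
# the selection strings come from the log.
import ROOT

fin = ROOT.TFile.Open("NNInputs_180.root")
tree = fin.Get("NNInput")                      # assumed tree name

fout = ROOT.TFile("nn_copies.root", "RECREATE")  # hypothetical scratch file
sig = tree.CopyTree("lep1_E<400&&lep2_E<400&&Target==1")
bkg = tree.CopyTree("lep1_E<400&&lep2_E<400&&Target==0")
print("nsig =", sig.GetEntries(), "nbkg =", bkg.GetEntries())  # 3134 / 12831 in the log

# split the chosen signal entries in half for testing and training
n = sig.GetEntries()
print("Test with :", n // 2, "Train with :", n - n // 2)        # 1567 / 1567

fout.Write()
fout.Close()
```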
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 3134 for Signal Prepared event 0 for Signal with 3134 events ====Entry 0 Variable Ht : 136.562 Variable LepAPt : 27.4272 Variable LepBPt : 12.8801 Variable MetSigLeptonsJets : 8.38415 Variable MetSpec : 68.9431 Variable SumEtLeptonsJets : 67.6181 Variable VSumJetLeptonsPt : 63.1844 Variable addEt : 109.251 Variable dPhiLepSumMet : 2.88047 Variable dPhiLeptons : 0.840861 Variable dRLeptons : 0.843729 Variable lep1_E : 27.4326 Variable lep2_E : 12.8967 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2180 Ht = 136.562 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 27.4272 LepAPt = 27.4272 LepBEt = 12.8808 LepBPt = 12.8801 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 68.9431 MetDelPhi = 2.30006 MetSig = 7.08441 MetSigLeptonsJets = 8.38415 MetSpec = 68.9431 Mjj = 0 MostCentralJetEta = 1.96847 MtllMet = 109.272 Njets = 1 SB = 0 SumEt = 94.7052 SumEtJets = 0 SumEtLeptonsJets = 67.6181 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 63.1844 addEt = 109.251 dPhiLepSumMet = 2.88047 dPhiLeptons = 0.840861 dRLeptons = 0.843729 diltype = 54 dimass = 15.4003 event = 216 jet1_Et = 27.3108 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 27.4326 lep2_E = 12.8967 rand = 0.999742 run = 167299 weight = 2.58703e-06 ===Show End Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable 
dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 12831 for Background Prepared event 0 for Background with 12831 events ====Entry 0 Variable Ht : 61.1273 Variable LepAPt : 22.7899 Variable LepBPt : 11.7315 Variable MetSigLeptonsJets : 4.52817 Variable MetSpec : 26.6052 Variable SumEtLeptonsJets : 34.5213 Variable VSumJetLeptonsPt : 31.9684 Variable addEt : 61.1273 Variable dPhiLepSumMet : 2.92616 Variable dPhiLeptons : 0.819534 Variable dRLeptons : 0.934366 Variable lep1_E : 23.1166 Variable lep2_E : 12.194 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 61.1273 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 22.7899 LepAPt = 22.7899 LepBEt = 11.7322 LepBPt = 11.7315 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 26.6052 MetDelPhi = 2.37814 MetSig = 2.46258 MetSigLeptonsJets = 4.52817 MetSpec = 26.6052 Mjj = 0 MostCentralJetEta = 0 MtllMet = 61.9133 Njets = 0 SB = 0 SumEt = 116.722 SumEtJets = 0 SumEtLeptonsJets = 34.5213 Target = 0 TrainWeight = 1.71617 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 31.9684 addEt = 61.1273 dPhiLepSumMet = 2.92616 dPhiLeptons = 0.819534 dRLeptons = 0.934366 diltype = 54 dimass = 14.9851 event = 1421455 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 23.1166 lep2_E = 12.194 rand = 0.999742 run = 271216 weight = 0.0400324 ===Show End Prepared event 10000 for Background with 12831 events Warning: found 414 negative weights. << hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 15965 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 414 negative weights. Signal fraction: 74.1015167 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 
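The preprocessing banner above ("called with option 12 ... input variable equalisation to Gaussian distribution with mean=0 and sigma=1 ... then variables are decorrelated") describes the transform behind global flag 812, and the Transdef tables printed here and below look like per-variable percentile tables used for the equalisation (101 edges, i.e. 100 quantile bins). Below is a minimal numpy/scipy sketch of that kind of transform; it is an illustrative reimplementation under those assumptions, not the NeuroBayes preprocessing code:

```python
# Sketch of an "option 12" style preprocessing: rank-equalise each input,
# map it to a unit Gaussian, then decorrelate.  Illustrative only.
import numpy as np
from scipy.stats import norm
from scipy.interpolate import interp1d

def fit_equalisation(x, n_bins=100):
    """Quantile table playing the role of a Transdef tab (101 percentile edges)."""
    return np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))

def to_gaussian(x, tab):
    """Map x through its quantile table to a flat [0,1], then to N(0,1)."""
    u = interp1d(tab, np.linspace(0.0, 1.0, len(tab)),
                 bounds_error=False, fill_value=(0.0, 1.0))(x)
    u = np.clip(u, 1e-6, 1 - 1e-6)          # avoid +-inf at the edges
    return norm.ppf(u)

def decorrelate(X):
    """Whiten the Gaussianised inputs with the inverse square root of their covariance."""
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    W = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    return (X - X.mean(axis=0)) @ W

# toy usage: X is an (n_events, n_vars) array standing in for Ht, LepAPt, ...
rng = np.random.default_rng(0)
X = rng.exponential(size=(1000, 3))
tabs = [fit_equalisation(X[:, i]) for i in range(X.shape[1])]
G = np.column_stack([to_gaussian(X[:, i], tabs[i]) for i in range(X.shape[1])])
D = decorrelate(G)
print(np.round(np.corrcoef(D, rowvar=False), 3))   # ~ identity after decorrelation
```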
------------------------------ Transdef: Tab for variable 2 55.8076553 62.7101555 64.7835541 68.4783173 69.5354919 70.9990082 72.8844452 74.8332291 76.979599 77.0398788 78.3596344 79.3425446 79.947937 81.4217148 83.3727646 84.6506577 86.0250473 86.5702972 88.5322952 90.0015259 91.4266663 92.6347504 93.3113937 94.0743408 95.4200592 96.2729797 96.8638153 97.8256683 98.9571533 99.4737549 99.7253876 100.760475 102.946716 103.934387 104.78302 105.624275 106.185074 107.125679 108.037254 108.468018 110.012314 111.029037 112.086945 113.551956 113.934219 114.924881 115.94931 117.348465 118.81411 119.740005 121.039238 122.68428 123.7379 125.341316 126.76503 127.930389 129.259277 130.887939 132.851776 133.371902 134.391052 135.905609 137.066681 138.384857 139.680115 140.869293 142.171082 143.128616 144.562775 145.573334 146.594452 148.424133 149.98317 151.860916 154.012436 155.61882 157.717102 159.031693 161.105316 163.410568 165.607162 168.193604 170.904724 173.352692 174.674744 177.031952 179.654205 182.068604 184.418671 188.835388 193.222839 197.911469 203.430969 208.592834 216.158844 224.993195 234.655136 250.024216 274.097534 287.18634 519.386597 ------------------------------ Transdef: Tab for variable 3 20.0008526 20.5603561 20.9274178 21.226017 21.5384102 21.9233742 22.2047577 22.3860455 22.5677319 22.789854 22.841156 23.1000118 23.4438057 23.7336082 24.0946178 24.3271942 24.6648979 24.8776569 25.0993233 25.2391853 25.3228703 25.5479393 25.6859722 25.9472828 26.2094154 26.33284 26.5616722 26.8182907 27.0770245 27.1987324 27.4056435 27.5871964 27.7944889 28.0449142 28.4208965 28.6632938 28.7643433 29.0510941 29.3194695 29.4646149 29.665081 29.7474804 29.8532295 30.0890541 30.2278557 30.4965782 30.6351776 30.7873993 31.1278725 31.3946629 31.6726475 31.984827 32.1812859 32.3360176 32.5086441 32.7162247 32.771492 33.0439682 33.1606293 33.5139465 33.8789635 34.1444702 34.3168068 34.6626205 34.8311844 35.1874771 35.4319305 35.597702 35.8171806 36.117775 36.3861885 36.6206741 36.7866631 37.2107086 37.5568924 37.9120102 38.3191528 38.705162 39.0662308 39.449501 39.9064026 40.3759766 40.8106804 41.4816513 42.3027382 42.8716469 43.3580551 44.2639847 44.8759918 45.6403809 46.2778816 47.0104294 47.8586655 49.0553284 50.1955032 51.347023 53.8757706 56.255497 60.0565796 66.8197784 109.76886 ------------------------------ Transdef: Tab for variable 4 10.0000381 10.1400528 10.3402405 10.4314346 10.4924774 10.502861 10.7481461 10.894021 10.9245605 11.1202364 11.245285 11.4339037 11.5584335 11.613308 11.7314577 11.9548378 12.108799 12.2339554 12.3063831 12.3603134 12.5146637 12.6800137 12.7972956 12.9468889 13.2114487 13.4686346 13.592845 13.6853905 13.7555513 13.8058643 13.8362007 14.1225719 14.3204575 14.5341043 14.8141232 15.0916815 15.3000231 15.6281033 15.6900921 16.04599 16.2425003 16.5359879 16.7620945 16.8846416 16.9959412 17.158287 17.3622837 17.7263527 18.0405827 18.3619633 18.6711636 19.0871067 19.3997746 19.587738 19.9103928 20.0652905 20.2676182 20.4218407 20.5870667 20.8371906 20.9957829 21.1862221 21.2958603 21.454361 21.6064072 21.7462864 21.916256 22.1064606 22.3049583 22.5164928 22.7080803 22.7735596 22.9612045 23.1401901 23.409441 23.6056709 23.7232208 23.9180069 24.0994377 24.3358593 24.5075626 24.6975784 24.9521084 25.2047882 25.3824806 25.5833263 25.917717 26.1490803 26.5156307 26.8181057 27.4299698 27.7898407 28.0947609 28.2852669 28.9017315 29.3358231 30.0654411 30.8256226 31.739933 33.4533882 38.2633057 ------------------------------ Transdef: Tab for variable 5 2.10590816 3.46516967 
3.74484205 4.01928854 4.19624805 4.37200451 4.52769852 4.64662266 4.73682451 4.82207012 4.93864155 4.99881411 5.02468491 5.19426727 5.28199387 5.33019733 5.36683702 5.42969799 5.52389717 5.53959846 5.58315182 5.64479828 5.74667835 5.84385777 5.97395992 6.01634216 6.07542896 6.20207024 6.28262615 6.34458923 6.43435812 6.51676321 6.57035637 6.62749863 6.70484495 6.74353981 6.76540184 6.81719828 6.87796974 6.89349365 6.94331455 6.977314 7.00819159 7.05602407 7.07839012 7.11374903 7.16158867 7.18734407 7.2336607 7.27088928 7.33586407 7.37986851 7.40675068 7.44994926 7.50135279 7.5188818 7.54750538 7.58100033 7.62628269 7.65932655 7.70571232 7.75962973 7.82551527 7.88812065 7.92261934 7.95743752 8.03795052 8.06570816 8.10222721 8.14234352 8.17155647 8.23101807 8.25375175 8.30117702 8.34758472 8.41455078 8.51125813 8.57885551 8.64817429 8.73640633 8.76642036 8.83088684 8.89001083 8.94655991 8.98998451 9.08385944 9.17475986 9.26914024 9.38078499 9.44547272 9.57610226 9.73723221 9.95735645 10.1979179 10.386097 10.668869 10.9165497 11.3558369 12.2728004 12.6962681 17.5554657 ------------------------------ Transdef: Tab for variable 6 25.0010986 26.6051693 28.4936333 29.4356766 30.7085991 31.2495575 31.6497726 32.2874336 32.8033905 33.6018448 34.5043869 35.0621185 35.7226791 36.1718636 36.8845215 37.7195053 38.4536438 39.3083229 40.4136734 41.0422134 41.5236511 42.2369232 42.6858749 43.3044624 43.8853798 44.0274773 44.4518547 44.8965416 45.3660889 45.974617 46.6760178 47.0365791 47.3881683 47.9177132 48.3482437 48.9511337 49.4971695 50.064373 50.3284531 50.4983444 50.8898773 51.746273 52.0991669 53.026825 53.5608215 54.168663 54.508255 55.1080284 55.7932434 56.0985031 56.4232635 56.6968994 57.2828293 57.4937286 57.9089584 58.3899422 58.7571793 59.5971222 59.9839172 60.5906982 61.0607796 61.5324936 62.0579834 62.6770287 63.1883087 63.7805252 64.5145035 65.1523895 65.9503326 66.5864105 67.2691345 67.9805984 68.6868134 69.6192474 70.5295563 71.3491364 72.1144562 73.0048981 73.7190323 74.7186813 75.4099579 76.6155472 77.6406937 78.544075 79.2197418 80.5565796 82.1665115 83.0893707 84.6125641 86.223587 88.115654 90.1107712 92.5102844 94.4396057 97.004715 99.9676819 104.222244 113.778366 128.007324 144.524124 218.683594 ------------------------------ Transdef: Tab for variable 7 30.2366562 31.6788006 33.6402969 34.5213127 34.9346085 35.7740936 36.568573 37.2159729 37.7942963 38.6215668 39.0530167 39.3913345 39.8719101 40.0770187 40.3926239 40.9738464 41.3810043 41.6970215 42.354393 42.8888245 43.7648163 44.3339577 44.8805923 45.3286667 46.1746979 46.524971 46.9176483 47.3909683 48.1137924 48.3388672 48.6984596 49.3930473 49.7375488 50.3825951 51.1831093 51.6467285 52.2662239 53.3872223 54.0212021 54.6857147 55.7291489 56.5997353 57.3095055 57.9862671 58.8076019 59.2131882 59.6867447 60.3869629 61.2273788 62.3113174 62.7057114 63.091114 63.868103 64.8756485 65.5426178 66.5927582 67.3784027 68.0378876 68.6608887 69.674942 70.8295822 71.8594666 72.6478119 73.1041946 74.047493 75.0097809 76.5198135 76.77108 77.675415 78.6826477 79.8171158 80.7405853 82.0435791 83.6882782 84.6557007 85.8813324 87.3775024 88.7637482 90.0461273 91.5321579 92.9947739 94.5535126 96.2555161 98.0163269 100.205162 102.225899 104.153671 106.64447 108.938911 111.683105 115.824013 119.012177 122.319336 125.765244 128.935699 134.371765 137.841339 145.090561 155.40799 172.937897 322.997467 ------------------------------ Transdef: Tab for variable 8 6.35964584 28.511673 30.7400379 31.5159225 31.9667816 33.4348679 33.731266 34.1262436 
34.828228 35.407814 35.5526733 36.1046867 36.8379707 37.441864 37.8764915 38.5407944 38.8432236 38.9037628 39.4593544 39.5907745 39.7720108 39.8293304 40.1907005 41.0560303 41.3806305 41.8583145 42.5370827 42.8017731 43.314827 43.7618904 44.3466339 44.5794678 44.9016647 45.3519592 45.8892708 46.2355614 46.3988571 46.7991829 47.3759079 47.8930664 48.2234192 48.6942444 48.7935944 48.9327316 49.3695831 49.7132874 50.1091843 50.4526634 50.9829826 51.3113976 51.9956932 52.8147087 53.2538414 53.6465912 54.1503754 54.9676628 55.6991501 56.2756348 56.7958679 57.4495277 57.8746262 58.2679138 58.8352585 59.2989426 59.9225807 60.3932343 61.2085037 61.759079 62.3382416 62.901825 63.7375031 64.3255997 65.1773682 65.9924469 66.5825958 67.4385986 68.0531845 68.788475 69.4163971 70.5142212 71.4655151 72.1239166 73.3176346 74.2106934 75.1552582 76.2412262 77.5427017 79.2710419 81.1970215 83.3265533 84.8002014 86.9016876 88.7155457 91.2640381 94.7366791 98.0402222 101.612236 108.28598 120.200554 137.472794 204.13501 ------------------------------ Transdef: Tab for variable 9 55.8076553 62.7101555 64.3673553 67.048584 69.5354919 70.2676239 71.666687 73.03582 75.3129425 76.4216385 77.0398788 77.5463715 78.8097382 79.7652817 79.8385315 80.982193 81.9037018 83.3727646 84.2138214 85.312149 86.2577133 86.8775635 88.0978165 88.9259262 90.0930786 90.8349304 91.9452591 92.9242554 93.3113937 94.0053558 94.9749527 95.5104141 96.3213501 96.8638153 97.5434418 98.3635788 99.0011597 99.5592499 100.204758 101.016159 102.527992 103.541542 103.934387 104.782387 105.109604 105.64769 106.77916 107.672043 108.31768 109.382553 110.447861 111.019135 111.796753 112.599197 113.391357 113.935944 114.709457 115.235641 115.865242 116.715401 117.303757 118.063324 118.852478 119.606644 120.431496 121.434975 122.520981 123.325851 124.277603 125.391296 126.092285 127.221954 128.194794 129.145599 130.512756 131.42099 132.947723 133.943283 135.570831 136.931503 137.630051 139.004547 140.380463 141.965393 143.110901 144.613098 146.015442 146.976959 148.136353 149.889862 151.442383 152.898651 154.745392 156.930389 160.313324 164.097473 169.650726 175.761536 183.303619 204.740128 282.530121 ------------------------------ Transdef: Tab for variable 10 0.498700708 1.16598082 1.49228191 1.70101738 1.86172748 1.93985415 1.99498224 2.05274534 2.11680937 2.17456937 2.2462039 2.3010726 2.35077143 2.40561318 2.42389107 2.4603672 2.48920631 2.51954269 2.55984974 2.58827209 2.60617948 2.633039 2.64662886 2.67431927 2.70115614 2.71497965 2.72831845 2.74727082 2.76828337 2.77586126 2.79832387 2.80275297 2.81549549 2.82901263 2.84304476 2.84978724 2.85716414 2.86366129 2.87437725 2.88052535 2.89365673 2.89795756 2.91026115 2.92114067 2.92616296 2.92989135 2.94115877 2.94956255 2.95614433 2.95865107 2.96325493 2.96900749 2.97697616 2.98018599 2.98404551 2.98819304 2.99126482 2.99429631 3.00271273 3.00466204 3.00816154 3.01365662 3.01604509 3.02115965 3.02621222 3.03226733 3.03671122 3.04091001 3.04678965 3.05133295 3.05512381 3.05840087 3.06262183 3.06593418 3.06753922 3.07116747 3.07306862 3.07561874 3.07752085 3.08097744 3.0834794 3.08763647 3.08980298 3.0928359 3.09863424 3.10205126 3.10495806 3.10881424 3.11266994 3.11320925 3.11529112 3.11855888 3.121984 3.12412858 3.12727976 3.1288991 3.13073969 3.13251019 3.13530827 3.13783836 3.1415844 ------------------------------ Transdef: Tab for variable 11 0.00017022471 0.00723695755 0.0112676546 0.0235606413 0.0273525603 0.0399125516 0.0476160049 0.0593596548 0.0634592697 0.0708264038 0.0770739615 
0.0878789425 0.0982999802 0.111269489 0.124926448 0.134030044 0.1450212 0.153678536 0.157978207 0.167318821 0.175091743 0.182775617 0.193042159 0.201967686 0.210310698 0.220725477 0.230319723 0.234402716 0.239570737 0.249040902 0.25726679 0.261038423 0.26586175 0.270047665 0.278426588 0.287550092 0.293155164 0.300233603 0.309210777 0.316371441 0.321610689 0.327846885 0.335075617 0.33987093 0.345934212 0.352173924 0.358271599 0.362557888 0.367641687 0.37217468 0.373906851 0.377302527 0.380828857 0.386113524 0.390124559 0.39136076 0.39606601 0.399818063 0.403323293 0.408040851 0.410127401 0.413483143 0.417114317 0.421759546 0.425439477 0.42900461 0.433722496 0.437964082 0.443928719 0.44675529 0.450620055 0.456074685 0.459079742 0.464891791 0.467939317 0.474292457 0.478264809 0.489656448 0.49883163 0.509246588 0.517154217 0.525939107 0.534811735 0.542201817 0.548867702 0.554581881 0.564766884 0.580120087 0.587723374 0.597531438 0.608550549 0.631877184 0.639819145 0.650555253 0.662105441 0.687364817 0.715029478 0.760467529 0.818600893 0.86060977 1.06340075 ------------------------------ Transdef: Tab for variable 12 0.2193266 0.374239773 0.394458413 0.403738797 0.406588793 0.409181833 0.411967516 0.412947476 0.415074408 0.417250752 0.419243395 0.421536088 0.424450517 0.426856071 0.428281009 0.431318969 0.433391631 0.436837435 0.438745618 0.441466212 0.4439421 0.445423424 0.447143108 0.450419068 0.452995777 0.455082238 0.457731217 0.459948719 0.462466061 0.465503991 0.468149513 0.470865041 0.472160399 0.475207061 0.477399349 0.479552805 0.481846362 0.484809041 0.487842023 0.490144998 0.490797937 0.493476957 0.497123867 0.499558955 0.502154946 0.504929662 0.508236885 0.511789382 0.514075816 0.516755998 0.521203399 0.523057699 0.524515748 0.528603792 0.531398892 0.535214901 0.537661791 0.53988409 0.544006348 0.547478974 0.551613688 0.553702712 0.556952715 0.561010838 0.566948414 0.570469797 0.576014876 0.579078794 0.58421284 0.593460798 0.599589527 0.603428185 0.609445214 0.612646401 0.621388435 0.626247406 0.632318497 0.63692677 0.640238941 0.646140814 0.650766969 0.653800428 0.663305521 0.667284966 0.674305975 0.678620815 0.688013434 0.695569277 0.703060389 0.713153362 0.727047086 0.73268503 0.742380857 0.763370872 0.780170023 0.796287894 0.820766926 0.855823636 0.90160042 0.934366345 1.12655604 ------------------------------ Transdef: Tab for variable 13 20.1081009 22.1069183 23.1052208 23.6878452 24.2840958 25.0034103 25.3407879 25.7376442 26.1825447 26.8040771 27.0507278 27.1392021 27.7317963 28.1280117 28.8530464 29.0647049 29.4970398 29.580265 29.666563 29.977459 30.3743248 30.6449413 30.983242 31.0473862 31.391243 31.7550659 32.2280121 32.7377968 33.1061935 33.4070358 33.8142776 34.4171715 34.5903778 35.0722885 35.3335419 35.4208679 35.5952644 35.910881 36.3005905 36.5976486 36.9353409 37.2127342 37.588623 37.9713058 38.4030724 38.6852341 39.1520882 39.4775658 39.7440414 40.2251472 40.6447716 41.1796799 41.7341652 42.3800583 42.9011765 43.5014763 43.8199348 44.3562279 44.6691284 45.22892 45.7440071 46.4079895 46.9142532 47.4082718 48.0119438 48.7475357 49.3192978 49.9419937 50.399025 50.8012695 51.1317825 51.7248459 52.1803665 53.1092949 53.8443184 54.7502899 55.4510117 56.1376038 56.752346 57.8223076 58.6557617 59.1948547 60.4698334 62.0245323 63.5581818 64.3083344 65.1716309 66.4578476 67.6326599 69.2451553 71.4673157 73.9973068 77.6642456 79.5552673 82.9823685 85.3608856 88.2280273 89.9060516 96.5829391 116.427765 192.369995 ------------------------------ Transdef: Tab for variable 14 
10.0093269 10.4510403 10.9167776 11.4974775 11.7327147 11.9825802 12.1933384 12.3175669 12.3507595 12.476078 12.8136339 13.3683987 13.8091602 13.8525257 13.9105339 14.4647388 15.085062 15.5734921 16.2217484 16.4593353 16.7433529 16.9072552 17.3262711 17.7291374 17.9586754 18.1141357 18.183197 18.3860397 18.6072063 18.7512474 19.210022 19.4558468 19.7151661 20.012188 20.1359005 20.4706917 20.6365833 20.8941479 21.114109 21.411087 21.6186066 21.8042488 22.1298065 22.3520241 22.4772625 22.6736794 22.9539356 23.1436443 23.3465538 23.5843277 23.8393898 24.0979366 24.4158535 24.5516281 24.9151573 25.0706253 25.3301849 25.5151291 25.7336349 26.052824 26.3135433 26.4726219 26.6490135 26.8707695 27.248497 27.3761215 27.6599274 27.952919 28.3206024 28.7221222 29.1011486 29.5275459 29.7680435 30.0329742 30.3056145 30.5937271 31.0292187 31.4613609 31.9147606 32.3148346 32.8356018 33.3223495 33.9031982 34.4886475 35.1834717 35.9088402 36.5803528 37.4921608 38.3499069 39.149498 40.1723862 41.105957 42.5129166 44.6526337 46.1491661 48.2720947 50.3930283 52.7572365 56.6471901 62.2474289 89.625824 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 0.9 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 51.4 25.6 43.1 16.1 37.4 50.7 43.3 46.4 -23.1 -9.4 -18.0 11.9 35.4 2 51.4 100.0 51.2 48.6 43.1 79.7 93.7 79.9 89.8 -37.8 -20.4 -36.5 28.1 37.1 3 25.6 51.2 100.0 26.0 14.9 36.1 52.7 48.3 63.5 3.5 -26.2 -47.9 64.9 19.8 4 43.1 48.6 26.0 100.0 19.6 39.3 48.1 50.0 55.9 -0.7 -25.0 -51.6 14.5 78.8 5 16.1 43.1 14.9 19.6 100.0 84.4 11.4 53.4 68.0 25.1 -8.5 -12.6 3.1 14.9 6 37.4 79.7 36.1 39.3 84.4 100.0 57.1 80.9 90.1 -4.5 -16.1 -28.1 17.0 30.3 7 50.7 93.7 52.7 48.1 11.4 57.1 100.0 72.0 74.8 -48.9 -20.5 -36.6 31.5 36.8 8 43.3 79.9 48.3 50.0 53.4 80.9 72.0 100.0 84.8 -6.9 -22.6 -38.4 30.3 40.7 9 46.4 89.8 63.5 55.9 68.0 90.1 74.8 84.8 100.0 -7.9 -24.8 -45.8 36.2 43.8 10 -23.1 -37.8 3.5 -0.7 25.1 -4.5 -48.9 -6.9 -7.9 100.0 -5.2 -4.2 4.6 -1.2 11 -9.4 -20.4 -26.2 -25.0 -8.5 -16.1 -20.5 -22.6 -24.8 -5.2 100.0 48.9 -26.6 -28.5 12 -18.0 -36.5 -47.9 -51.6 -12.6 -28.1 -36.6 -38.4 -45.8 -4.2 48.9 100.0 -32.0 -44.2 13 11.9 28.1 64.9 14.5 3.1 17.0 31.5 30.3 36.2 4.6 -26.6 -32.0 100.0 40.0 14 35.4 37.1 19.8 78.8 14.9 30.3 36.8 40.7 43.8 -1.2 -28.5 -44.2 40.0 100.0 TOTAL CORRELATION TO TARGET (diagonal) 125.498748 TOTAL CORRELATION OF ALL VARIABLES 57.8917692 ROUND 1: MAX CORR ( 57.8916488) AFTER KILLING INPUT VARIABLE 5 CONTR 0.118083278 ROUND 2: MAX CORR ( 57.8909769) AFTER KILLING INPUT VARIABLE 11 CONTR 0.278916863 ROUND 3: MAX CORR ( 57.8573776) AFTER KILLING INPUT VARIABLE 9 CONTR 1.97207024 ROUND 4: MAX CORR ( 57.8083718) AFTER KILLING INPUT VARIABLE 7 CONTR 2.38081876 ROUND 5: MAX CORR ( 57.6513718) AFTER KILLING INPUT VARIABLE 6 CONTR 4.25760196 ROUND 6: MAX CORR ( 57.5729753) AFTER KILLING INPUT VARIABLE 8 CONTR 3.00552671 ROUND 7: MAX CORR ( 57.2380506) AFTER KILLING INPUT VARIABLE 13 CONTR 6.20105211 ROUND 8: MAX CORR ( 57.1087052) AFTER KILLING INPUT VARIABLE 14 CONTR 3.84580598 ROUND 9: MAX CORR ( 56.8594998) AFTER KILLING INPUT VARIABLE 3 CONTR 5.32930519 ROUND 10: MAX CORR ( 56.3861122) AFTER KILLING INPUT VARIABLE 10 CONTR 7.3218215 ROUND 11: MAX CORR ( 55.4162893) AFTER KILLING INPUT VARIABLE 12 CONTR 10.4129019 ROUND 12: MAX CORR ( 51.407646) AFTER KILLING INPUT VARIABLE 4 CONTR 20.6934545 LAST REMAINING VARIABLE: 2 total correlation to target: 57.8917692 % total significance: 26.5282185 sigma correlations of single variables to target: variable 2: 51.407646 % 
, in sigma: 23.5569457 variable 3: 25.5916145 % , in sigma: 11.7270547 variable 4: 43.0687249 % , in sigma: 19.7357338 variable 5: 16.0964098 % , in sigma: 7.3759894 variable 6: 37.3633748 % , in sigma: 17.1213245 variable 7: 50.7474731 % , in sigma: 23.2544293 variable 8: 43.2754368 % , in sigma: 19.830457 variable 9: 46.433728 % , in sigma: 21.2777066 variable 10: -23.0501219 % , in sigma: 10.5624457 variable 11: -9.35017266 % , in sigma: 4.28460602 variable 12: -17.9618156 % , in sigma: 8.23078952 variable 13: 11.8836 % , in sigma: 5.44551916 variable 14: 35.4328469 % , in sigma: 16.236683 variables sorted by significance: 1 most relevant variable 2 corr 51.4076462 , in sigma: 23.5569458 2 most relevant variable 4 corr 20.6934547 , in sigma: 9.48253087 3 most relevant variable 12 corr 10.4129019 , in sigma: 4.77158912 4 most relevant variable 10 corr 7.32182169 , in sigma: 3.35513818 5 most relevant variable 3 corr 5.32930517 , in sigma: 2.44209106 6 most relevant variable 14 corr 3.84580588 , in sigma: 1.76229506 7 most relevant variable 13 corr 6.20105219 , in sigma: 2.84155882 8 most relevant variable 8 corr 3.00552678 , in sigma: 1.3772471 9 most relevant variable 6 corr 4.25760174 , in sigma: 1.95099564 10 most relevant variable 7 corr 2.38081884 , in sigma: 1.09098207 11 most relevant variable 9 corr 1.97207022 , in sigma: 0.903677851 12 most relevant variable 11 corr 0.278916866 , in sigma: 0.127810355 13 most relevant variable 5 corr 0.118083276 , in sigma: 0.0541102644 global correlations between input variables: variable 2: 99.5511716 % variable 3: 94.5815238 % variable 4: 91.5952179 % variable 5: 98.3263031 % variable 6: 97.7129728 % variable 7: 99.3656172 % variable 8: 91.6059107 % variable 9: 99.4590556 % variable 10: 72.8102821 % variable 11: 51.7629918 % variable 12: 70.4443728 % variable 13: 80.9068046 % variable 14: 87.9568462 % significance loss when removing single variables: variable 2: corr = 6.30608717 % , sigma = 2.88968985 variable 3: corr = 5.60422753 % , sigma = 2.56807098 variable 4: corr = 8.77856857 % , sigma = 4.02267521 variable 5: corr = 0.118083278 % , sigma = 0.0541102656 variable 6: corr = 2.89803633 % , sigma = 1.32799087 variable 7: corr = 2.66231549 % , sigma = 1.21997458 variable 8: corr = 5.3729278 % , sigma = 2.46208061 variable 9: corr = 1.44403723 % , sigma = 0.66171298 variable 10: corr = 6.32157125 % , sigma = 2.89678525 variable 11: corr = 0.26723276 % , sigma = 0.122456252 variable 12: corr = 10.2104683 % , sigma = 4.67882631 variable 13: corr = 6.49167513 % , sigma = 2.97473334 variable 14: corr = 6.8327217 % , sigma = 3.1310139 Keep only 3 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 4 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 10.1232128 sigma out 15 active outputs RANK 2 NODE 2 --> 7.63069248 sigma out 15 active outputs RANK 3 NODE 4 --> 7.56813002 sigma out 15 active outputs RANK 4 NODE 1 --> 6.56818581 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 9.59298801 sigma in 4act. ( 12.9748354 sig out 1act.) RANK 2 NODE 1 --> 7.59722662 sigma in 4act. ( 6.76687288 sig out 1act.) RANK 3 NODE 13 --> 5.83936214 sigma in 4act. ( 8.40149784 sig out 1act.) RANK 4 NODE 8 --> 4.99483013 sigma in 4act. ( 6.38386536 sig out 1act.) RANK 5 NODE 4 --> 3.65159273 sigma in 4act. 
( 4.72576475 sig out 1act.) RANK 6 NODE 2 --> 3.53826523 sigma in 4act. ( 3.62825847 sig out 1act.) RANK 7 NODE 5 --> 2.83153915 sigma in 4act. ( 4.05196333 sig out 1act.) RANK 8 NODE 3 --> 2.82124925 sigma in 4act. ( 2.808285 sig out 1act.) RANK 9 NODE 14 --> 1.83445048 sigma in 4act. ( 2.38192415 sig out 1act.) RANK 10 NODE 11 --> 1.78944647 sigma in 4act. ( 1.81246638 sig out 1act.) RANK 11 NODE 6 --> 1.21511149 sigma in 4act. ( 1.13442993 sig out 1act.) RANK 12 NODE 10 --> 1.09450531 sigma in 4act. ( 1.18157578 sig out 1act.) RANK 13 NODE 9 --> 0.849133432 sigma in 4act. ( 0.989138782 sig out 1act.) RANK 14 NODE 15 --> 0.683720231 sigma in 4act. ( 0.380891383 sig out 1act.) RANK 15 NODE 12 --> 0.27210331 sigma in 4act. ( 0.158087865 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 12.9748354 sigma out 1act.( 9.59298801 sig in 4act.) RANK 2 NODE 13 --> 8.40149784 sigma out 1act.( 5.83936214 sig in 4act.) RANK 3 NODE 1 --> 6.76687288 sigma out 1act.( 7.59722662 sig in 4act.) RANK 4 NODE 8 --> 6.38386536 sigma out 1act.( 4.99483013 sig in 4act.) RANK 5 NODE 4 --> 4.72576475 sigma out 1act.( 3.65159273 sig in 4act.) RANK 6 NODE 5 --> 4.05196333 sigma out 1act.( 2.83153915 sig in 4act.) RANK 7 NODE 2 --> 3.62825847 sigma out 1act.( 3.53826523 sig in 4act.) RANK 8 NODE 3 --> 2.808285 sigma out 1act.( 2.82124925 sig in 4act.) RANK 9 NODE 14 --> 2.38192415 sigma out 1act.( 1.83445048 sig in 4act.) RANK 10 NODE 11 --> 1.81246638 sigma out 1act.( 1.78944647 sig in 4act.) RANK 11 NODE 10 --> 1.18157578 sigma out 1act.( 1.09450531 sig in 4act.) RANK 12 NODE 6 --> 1.13442993 sigma out 1act.( 1.21511149 sig in 4act.) RANK 13 NODE 9 --> 0.989138782 sigma out 1act.( 0.849133432 sig in 4act.) RANK 14 NODE 15 --> 0.380891383 sigma out 1act.( 0.683720231 sig in 4act.) RANK 15 NODE 12 --> 0.158087865 sigma out 1act.( 0.27210331 sig in 4act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 19.9516392 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 16.2052059 sigma out 15 active outputs RANK 2 NODE 1 --> 11.9818878 sigma out 15 active outputs RANK 3 NODE 2 --> 8.57025146 sigma out 15 active outputs RANK 4 NODE 4 --> 7.52437353 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 1 --> 8.75432014 sigma in 4act. ( 7.3570447 sig out 1act.) RANK 2 NODE 12 --> 8.38808441 sigma in 4act. ( 5.25524998 sig out 1act.) RANK 3 NODE 10 --> 7.16192102 sigma in 4act. ( 3.6654253 sig out 1act.) RANK 4 NODE 7 --> 6.98311663 sigma in 4act. ( 8.24646568 sig out 1act.) RANK 5 NODE 9 --> 6.63298273 sigma in 4act. ( 3.3637135 sig out 1act.) RANK 6 NODE 6 --> 6.49134111 sigma in 4act. ( 3.34380174 sig out 1act.) RANK 7 NODE 3 --> 6.2369628 sigma in 4act. ( 5.20330191 sig out 1act.) RANK 8 NODE 4 --> 5.53182411 sigma in 4act. ( 4.57972288 sig out 1act.) RANK 9 NODE 11 --> 5.38734198 sigma in 4act. ( 4.16741562 sig out 1act.) RANK 10 NODE 13 --> 5.27441883 sigma in 4act. ( 5.27577353 sig out 1act.) RANK 11 NODE 8 --> 5.14010429 sigma in 4act. ( 4.69139576 sig out 1act.) RANK 12 NODE 2 --> 4.80492401 sigma in 4act. ( 3.7751689 sig out 1act.) RANK 13 NODE 14 --> 4.32528162 sigma in 4act. ( 2.53532457 sig out 1act.) RANK 14 NODE 5 --> 2.63697553 sigma in 4act. ( 2.27177548 sig out 1act.) RANK 15 NODE 15 --> 1.23985195 sigma in 4act. ( 0.214659631 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 8.24646568 sigma out 1act.( 6.98311663 sig in 4act.) RANK 2 NODE 1 --> 7.3570447 sigma out 1act.( 8.75432014 sig in 4act.) 
RANK 3 NODE 13 --> 5.27577353 sigma out 1act.( 5.27441883 sig in 4act.) RANK 4 NODE 12 --> 5.25524998 sigma out 1act.( 8.38808441 sig in 4act.) RANK 5 NODE 3 --> 5.20330191 sigma out 1act.( 6.2369628 sig in 4act.) RANK 6 NODE 8 --> 4.69139576 sigma out 1act.( 5.14010429 sig in 4act.) RANK 7 NODE 4 --> 4.57972288 sigma out 1act.( 5.53182411 sig in 4act.) RANK 8 NODE 11 --> 4.16741562 sigma out 1act.( 5.38734198 sig in 4act.) RANK 9 NODE 2 --> 3.7751689 sigma out 1act.( 4.80492401 sig in 4act.) RANK 10 NODE 10 --> 3.6654253 sigma out 1act.( 7.16192102 sig in 4act.) RANK 11 NODE 9 --> 3.3637135 sigma out 1act.( 6.63298273 sig in 4act.) RANK 12 NODE 6 --> 3.34380174 sigma out 1act.( 6.49134111 sig in 4act.) RANK 13 NODE 14 --> 2.53532457 sigma out 1act.( 4.32528162 sig in 4act.) RANK 14 NODE 5 --> 2.27177548 sigma out 1act.( 2.63697553 sig in 4act.) RANK 15 NODE 15 --> 0.214659631 sigma out 1act.( 1.23985195 sig in 4act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 18.0782795 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.48087433 *** contribution from regularisation: 0.0250236113 *** contribution from error: -0.505897939 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.569096625 *** contribution from regularisation: 0.00837655831 *** contribution from error: -0.577473164 *********************************************** -----------------> Test sample ENTER BFGS code START -4543.32759 0.415076464 -0.369631827 EXIT FROM BFGS code FG_START 0. 0.415076464 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.57883811 *** contribution from regularisation: 0.00615235558 *** contribution from error: -0.584990442 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -4620.28597 0.415076464 3.93614268 EXIT FROM BFGS code FG_LNSRCH 0. 0.461012155 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.577664196 *** contribution from regularisation: 0.00893981475 *** contribution from error: -0.586603999 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4610.91547 0.461012155 0.792099416 EXIT FROM BFGS code FG_LNSRCH 0. 0.434646189 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.582383394 *** contribution from regularisation: 0.00815686304 *** contribution from error: -0.59054023 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4648.58406 0.434646189 1.89310658 EXIT FROM BFGS code NEW_X -4648.58406 0.434646189 1.89310658 ENTER BFGS code NEW_X -4648.58406 0.434646189 1.89310658 EXIT FROM BFGS code FG_LNSRCH 0. 0.444390029 0. 
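Each "Learn Path" block prints the loss function together with its two pieces; the loss is simply the sum of the regularisation and error contributions, up to single-precision rounding. Taking Learn Path 5 above as an example:

```python
# Numbers copied from Learn Path 5 above: loss = regularisation + error contribution.
reg, err = 0.00815686304, -0.59054023
print(reg + err)   # -0.5823834, matching the reported -0.582383394 to float precision
```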
--------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.583223581 *** contribution from regularisation: 0.00815983862 *** contribution from error: -0.591383398 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4655.29046 0.444390029 0.614864111 EXIT FROM BFGS code NEW_X -4655.29046 0.444390029 0.614864111 ENTER BFGS code NEW_X -4655.29046 0.444390029 0.614864111 EXIT FROM BFGS code FG_LNSRCH 0. 0.459411591 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.583400726 *** contribution from regularisation: 0.00817645621 *** contribution from error: -0.591577172 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4656.70447 0.459411591 -1.71871459 EXIT FROM BFGS code NEW_X -4656.70447 0.459411591 -1.71871459 ENTER BFGS code NEW_X -4656.70447 0.459411591 -1.71871459 EXIT FROM BFGS code FG_LNSRCH 0. 0.461663574 0. --------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.583387136 *** contribution from regularisation: 0.00788226817 *** contribution from error: -0.591269433 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4656.59603 0.461663574 -1.95401144 EXIT FROM BFGS code FG_LNSRCH 0. 0.459925801 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.583074629 *** contribution from regularisation: 0.00844318792 *** contribution from error: -0.591517806 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4654.10172 0.459925801 -1.97907591 EXIT FROM BFGS code FG_LNSRCH 0. 0.459442854 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 8.75219345 sigma out 15 active outputs RANK 2 NODE 1 --> 7.65566587 sigma out 15 active outputs RANK 3 NODE 2 --> 5.48072958 sigma out 15 active outputs RANK 4 NODE 4 --> 3.75043917 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 12 --> 4.98256588 sigma in 4act. ( 2.99335623 sig out 1act.) RANK 2 NODE 1 --> 4.71292782 sigma in 4act. ( 3.59815454 sig out 1act.) RANK 3 NODE 10 --> 4.66150093 sigma in 4act. ( 2.93459582 sig out 1act.) RANK 4 NODE 6 --> 4.55646753 sigma in 4act. ( 3.0625689 sig out 1act.) RANK 5 NODE 4 --> 4.12251759 sigma in 4act. ( 3.89507008 sig out 1act.) RANK 6 NODE 9 --> 4.07698536 sigma in 4act. ( 2.3592124 sig out 1act.) RANK 7 NODE 3 --> 3.84178257 sigma in 4act. ( 2.78920078 sig out 1act.) RANK 8 NODE 11 --> 3.10527492 sigma in 4act. ( 1.84985948 sig out 1act.) RANK 9 NODE 8 --> 3.00427437 sigma in 4act. ( 2.72121263 sig out 1act.) RANK 10 NODE 2 --> 2.48807907 sigma in 4act. ( 1.3752557 sig out 1act.) RANK 11 NODE 14 --> 2.38107944 sigma in 4act. ( 1.47970819 sig out 1act.) RANK 12 NODE 13 --> 2.36027908 sigma in 4act. ( 1.99410772 sig out 1act.) RANK 13 NODE 7 --> 1.59558654 sigma in 4act. ( 1.4483788 sig out 1act.) RANK 14 NODE 5 --> 1.49460876 sigma in 4act. ( 1.06488979 sig out 1act.) RANK 15 NODE 15 --> 0.728187144 sigma in 4act. ( 0.236501262 sig out 1act.) 
sorted by output significance RANK 1 NODE 4 --> 3.89507008 sigma out 1act.( 4.12251759 sig in 4act.) RANK 2 NODE 1 --> 3.59815454 sigma out 1act.( 4.71292782 sig in 4act.) RANK 3 NODE 6 --> 3.0625689 sigma out 1act.( 4.55646753 sig in 4act.) RANK 4 NODE 12 --> 2.99335623 sigma out 1act.( 4.98256588 sig in 4act.) RANK 5 NODE 10 --> 2.93459582 sigma out 1act.( 4.66150093 sig in 4act.) RANK 6 NODE 3 --> 2.78920078 sigma out 1act.( 3.84178257 sig in 4act.) RANK 7 NODE 8 --> 2.72121263 sigma out 1act.( 3.00427437 sig in 4act.) RANK 8 NODE 9 --> 2.3592124 sigma out 1act.( 4.07698536 sig in 4act.) RANK 9 NODE 13 --> 1.99410772 sigma out 1act.( 2.36027908 sig in 4act.) RANK 10 NODE 11 --> 1.84985948 sigma out 1act.( 3.10527492 sig in 4act.) RANK 11 NODE 14 --> 1.47970819 sigma out 1act.( 2.38107944 sig in 4act.) RANK 12 NODE 7 --> 1.4483788 sigma out 1act.( 1.59558654 sig in 4act.) RANK 13 NODE 2 --> 1.3752557 sigma out 1act.( 2.48807907 sig in 4act.) RANK 14 NODE 5 --> 1.06488979 sigma out 1act.( 1.49460876 sig in 4act.) RANK 15 NODE 15 --> 0.236501262 sigma out 1act.( 0.728187144 sig in 4act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 9.51773071 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.582596123 *** contribution from regularisation: 0.00897842273 *** contribution from error: -0.59157455 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -4650.28221 0.459442854 -1.56890106 EXIT FROM BFGS code FG_LNSRCH 0. 0.459411651 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.58429563 *** contribution from regularisation: 0.00728151808 *** contribution from error: -0.591577172 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4663.8478 0.459411651 -1.61137915 EXIT FROM BFGS code NEW_X -4663.8478 0.459411651 -1.61137915 ENTER BFGS code NEW_X -4663.8478 0.459411651 -1.61137915 EXIT FROM BFGS code FG_LNSRCH 0. 0.455330342 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.583851457 *** contribution from regularisation: 0.00786169432 *** contribution from error: -0.59171313 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4660.30231 0.455330342 -1.07612789 EXIT FROM BFGS code FG_LNSRCH 0. 0.459281772 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.583684742 *** contribution from regularisation: 0.00789814256 *** contribution from error: -0.591582894 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4658.97173 0.459281772 -1.68185687 EXIT FROM BFGS code FG_LNSRCH 0. 0.459411532 0. 
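The iteration blocks show BFGS line searches (FG_LNSRCH / NEW_X) driving down an entropy loss with a small regularisation contribution, for the pruned network reported above (3 kept inputs plus bias, 15 hidden nodes, 1 output). The following is a toy scipy/numpy sketch of that kind of fit on synthetic data; sign and normalisation conventions differ from the numbers printed in the log, and this is not the NeuroBayes teacher itself:

```python
# Toy stand-in for the fit above: a small MLP (3 inputs -> 15 hidden -> 1 output)
# trained by minimising a cross-entropy ("entropy") loss plus a quadratic
# regularisation term with BFGS.  Illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_in, n_hid = 3, 15

# toy preprocessed inputs and 0/1 targets standing in for the real training sample
X = rng.normal(size=(2000, n_in))
y = (X @ np.array([1.0, -0.5, 0.8]) + 0.3 * rng.normal(size=2000) > 0).astype(float)

def unpack(w):
    W1 = w[: n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid : n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid : -1]
    b2 = w[-1]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output node

def loss(w, lam=1e-3):
    p = np.clip(forward(w, X), 1e-9, 1 - 1e-9)
    err = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))  # entropy loss
    reg = lam * np.sum(w ** 2)                               # regularisation term
    return err + reg

w0 = 0.1 * rng.normal(size=n_in * n_hid + n_hid + n_hid + 1)
res = minimize(loss, w0, method="BFGS", options={"maxiter": 250})
print("final loss:", res.fun, "converged:", res.success)
```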
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.583370209 *** contribution from regularisation: 0.00820693932 *** contribution from error: -0.591577172 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4656.46113 0.459411532 -1.62268317 EXIT FROM BFGS code FG_LNSRCH 0. 0.459411651 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.583802581 *** contribution from regularisation: 0.00777458306 *** contribution from error: -0.591577172 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4659.91216 0.459411651 -1.56130135 EXIT FROM BFGS code FG_LNSRCH 0. 0.459411651 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.583524227 *** contribution from regularisation: 0.00805294886 *** contribution from error: -0.591577172 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4657.69024 0.459411651 -1.74684381 EXIT FROM BFGS code FG_LNSRCH 0. 0.459411651 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.583505988 *** contribution from regularisation: 0.00807115249 *** contribution from error: -0.591577113 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -4657.54494 0.459411651 -1.58450305 EXIT FROM BFGS code NEW_X -4657.54494 0.459411651 -1.58450305 ENTER BFGS code NEW_X -4657.54494 0.459411651 -1.58450305 EXIT FROM BFGS code CONVERGENC -4657.54494 0.459411651 -1.58450305 --------------------------------------------------- Iteration : 250 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 12.4515753 sigma out 15 active outputs RANK 2 NODE 1 --> 12.3074894 sigma out 15 active outputs RANK 3 NODE 2 --> 7.69888067 sigma out 15 active outputs RANK 4 NODE 4 --> 5.54043865 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 1 --> 8.26541996 sigma in 4act. ( 5.75400686 sig out 1act.) RANK 2 NODE 6 --> 7.37759304 sigma in 4act. ( 4.94587231 sig out 1act.) RANK 3 NODE 10 --> 6.90290451 sigma in 4act. ( 4.9894557 sig out 1act.) RANK 4 NODE 12 --> 6.7687149 sigma in 4act. ( 5.21870327 sig out 1act.) RANK 5 NODE 3 --> 6.42776394 sigma in 4act. ( 4.12526608 sig out 1act.) RANK 6 NODE 4 --> 6.28694868 sigma in 4act. ( 6.57353735 sig out 1act.) RANK 7 NODE 9 --> 5.49105167 sigma in 4act. ( 4.03912258 sig out 1act.) RANK 8 NODE 11 --> 4.01385689 sigma in 4act. ( 3.06272244 sig out 1act.) RANK 9 NODE 8 --> 3.80168176 sigma in 4act. ( 4.37500572 sig out 1act.) RANK 10 NODE 14 --> 3.23987818 sigma in 4act. ( 2.1194365 sig out 1act.) RANK 11 NODE 2 --> 3.16049004 sigma in 4act. ( 2.06451035 sig out 1act.) RANK 12 NODE 13 --> 3.12666464 sigma in 4act. ( 3.54754066 sig out 1act.) RANK 13 NODE 7 --> 2.0510745 sigma in 4act. ( 2.26008606 sig out 1act.) RANK 14 NODE 5 --> 1.93174243 sigma in 4act. ( 1.43775451 sig out 1act.) RANK 15 NODE 15 --> 0.718442917 sigma in 4act. ( 0.284886986 sig out 1act.) sorted by output significance RANK 1 NODE 4 --> 6.57353735 sigma out 1act.( 6.28694868 sig in 4act.) 
RANK 2 NODE 1 --> 5.75400686 sigma out 1act.( 8.26541996 sig in 4act.) RANK 3 NODE 12 --> 5.21870327 sigma out 1act.( 6.7687149 sig in 4act.) RANK 4 NODE 10 --> 4.9894557 sigma out 1act.( 6.90290451 sig in 4act.) RANK 5 NODE 6 --> 4.94587231 sigma out 1act.( 7.37759304 sig in 4act.) RANK 6 NODE 8 --> 4.37500572 sigma out 1act.( 3.80168176 sig in 4act.) RANK 7 NODE 3 --> 4.12526608 sigma out 1act.( 6.42776394 sig in 4act.) RANK 8 NODE 9 --> 4.03912258 sigma out 1act.( 5.49105167 sig in 4act.) RANK 9 NODE 13 --> 3.54754066 sigma out 1act.( 3.12666464 sig in 4act.) RANK 10 NODE 11 --> 3.06272244 sigma out 1act.( 4.01385689 sig in 4act.) RANK 11 NODE 7 --> 2.26008606 sigma out 1act.( 2.0510745 sig in 4act.) RANK 12 NODE 14 --> 2.1194365 sigma out 1act.( 3.23987818 sig in 4act.) RANK 13 NODE 2 --> 2.06451035 sigma out 1act.( 3.16049004 sig in 4act.) RANK 14 NODE 5 --> 1.43775451 sigma out 1act.( 1.93174243 sig in 4act.) RANK 15 NODE 15 --> 0.284886986 sigma out 1act.( 0.718442917 sig in 4act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 15.5999994 sigma in 15 active inputs *********************************************** *** Learn Path 250 *** loss function: -0.583209038 *** contribution from regularisation: 0.00836809631 *** contribution from error: -0.591577113 *********************************************** -----------------> Test sample END OF LEARNING , export EXPERTISE SAVING EXPERTISE TO expert.nb NB_AHISTOUT: storage space 22610 Closing output file done
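Looking back at the variable-ranking stage printed before the first iteration, the "ROUND n: MAX CORR (...) AFTER KILLING INPUT VARIABLE k CONTR ..." lines follow a greedy backward-elimination pattern: at each round the input whose removal costs the least total correlation to the target is dropped, until a single variable remains. A numpy sketch of that idea follows; the criterion used here (multiple correlation of the remaining inputs with the target) is an assumption about what such a ranking computes, not the NeuroBayes code:

```python
# Illustrative greedy backward elimination in the spirit of the
# "ROUND n: ... AFTER KILLING INPUT VARIABLE k" lines above.
import numpy as np

def total_correlation(C, c, keep):
    """Multiple correlation of the kept inputs with the target.
    C: input-input correlation matrix, c: input-target correlations."""
    idx = np.array(sorted(keep))
    sub = C[np.ix_(idx, idx)]
    return float(np.sqrt(c[idx] @ np.linalg.solve(sub, c[idx])))

def backward_eliminate(C, c):
    keep = set(range(len(c)))
    while len(keep) > 1:
        # drop the variable whose removal hurts the remaining correlation least
        best = max(keep, key=lambda v: total_correlation(C, c, keep - {v}))
        contr = total_correlation(C, c, keep) - total_correlation(C, c, keep - {best})
        keep.discard(best)
        print(f"kill variable {best}: remaining corr "
              f"{100 * total_correlation(C, c, keep):.4f} %  contribution {100 * contr:.4f} %")
    print("last remaining variable:", keep.pop())

# toy usage with a random correlation structure
rng = np.random.default_rng(1)
A = rng.normal(size=(500, 5))
y = A @ rng.normal(size=5) + rng.normal(size=500)
M = np.corrcoef(np.column_stack([A, y]), rowvar=False)
backward_eliminate(M[:5, :5], M[:5, 5])
```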