NNInput NNInputs_135.root
Options for steering
Constraint : lep1_E<400&&lep2_E<400&&
HiLoSbString : SB
SbString : Target
WeightString : TrainWeight
EqualizeSB : 0
EvaluateVariables : 0
SetNBProcessingDefault : 1
UseNeuroBayes : 1
WeightEvents : 1
NBTreePrepEvPrint : 1
NBTreePrepReportInterval : 10000
NB_Iter : 250
NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 182617 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 46281 nbkg = 136336
Bkg Entries: 136336
Sig Entries: 46281
Chosen entries: 46281
Signal fraction: 1
Background fraction: 0.339463
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 136336
Actual Signal Entries: 46281
Entries to split: 46281
Test with : 23140
Train with : 23140
*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
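[Editor's note] The sample bookkeeping above is plain arithmetic: the smaller of the two samples (46281 signal vs. 136336 background entries) fixes how many entries are "chosen", the quoted fractions are the ratio of chosen to available entries per class, and the chosen entries are split evenly into test and training events. A minimal sketch of that bookkeeping, not the NNAna code itself:

```python
# Sketch of the sample bookkeeping printed above (illustration, not NNAna).
n_sig, n_bkg = 46281, 136336

n_chosen = min(n_sig, n_bkg)            # 46281: the smaller sample sets the size
sig_fraction = n_chosen / n_sig         # 1.0
bkg_fraction = n_chosen / n_bkg         # ~0.339463

n_test = n_chosen // 2                  # 23140
n_train = n_chosen // 2                 # 23140

print(f"Chosen entries: {n_chosen}")
print(f"Signal fraction: {sig_fraction:g}  Background fraction: {bkg_fraction:.6f}")
print(f"Test with : {n_test}  Train with : {n_train}")
```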
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 46281 for Signal Prepared event 0 for Signal with 46281 events ====Entry 0 Variable Ht : 232.766 Variable LepAPt : 74.3556 Variable LepBPt : 30.6959 Variable MetSigLeptonsJets : 3.09436 Variable MetSpec : 41.4986 Variable SumEtLeptonsJets : 190.102 Variable VSumJetLeptonsPt : 35.9078 Variable addEt : 147.716 Variable dPhiLepSumMet : 2.12661 Variable dPhiLeptons : 0.248817 Variable dRLeptons : 0.333981 Variable lep1_E : 86.3275 Variable lep2_E : 32.4592 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2135 Ht = 232.766 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 74.3557 LepAPt = 74.3556 LepBEt = 30.6962 LepBPt = 30.6959 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 42.6642 MetDelPhi = 1.33651 MetSig = 2.91556 MetSigLeptonsJets = 3.09436 MetSpec = 41.4986 Mjj = 0 MostCentralJetEta = -1.56143 MtllMet = 152.006 Njets = 1 SB = 0 SumEt = 214.133 SumEtJets = 0 SumEtLeptonsJets = 190.102 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 35.9078 addEt = 147.716 dPhiLepSumMet = 2.12661 dPhiLeptons = 0.248817 dRLeptons = 0.333981 diltype = 17 dimass = 15.9508 event = 435 jet1_Et = 85.0503 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 86.3275 lep2_E = 32.4592 rand = 0.999742 run = 230778 weight = 2.42631e-06 ===Show End Prepared event 10000 for Signal with 46281 events Prepared event 20000 for Signal with 46281 events Prepared event 30000 for Signal with 46281 events Prepared event 40000 for Signal with 46281 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To 
Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 136336 for Background Prepared event 0 for Background with 136336 events ====Entry 0 Variable Ht : 85.3408 Variable LepAPt : 31.1294 Variable LepBPt : 10.0664 Variable MetSigLeptonsJets : 6.87768 Variable MetSpec : 44.1437 Variable SumEtLeptonsJets : 41.1959 Variable VSumJetLeptonsPt : 41.0132 Variable addEt : 85.3408 Variable dPhiLepSumMet : 3.10536 Variable dPhiLeptons : 0.219342 Variable dRLeptons : 0.424454 Variable lep1_E : 32.3548 Variable lep2_E : 10.1027 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 0.340615 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168 ===Show End Prepared event 10000 for Background with 136336 events Prepared event 20000 for Background with 136336 events Prepared event 30000 for Background with 136336 events Prepared event 40000 for Background with 136336 events Prepared event 50000 for Background with 136336 events Prepared event 60000 for Background with 136336 events Prepared event 70000 for Background with 136336 events Prepared event 80000 for Background with 136336 events Prepared event 90000 for Background with 136336 events Prepared event 100000 for Background with 136336 events Prepared event 110000 for Background with 136336 events Prepared event 120000 for Background with 136336 events Prepared event 130000 for Background with 136336 events Warning: found 4421 negative weights. << hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 182617 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4421 negative weights. 
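[Editor's note] Preprocessing option 12, as described just above, first equalises each input variable (maps it onto a Gaussian with mean 0 and sigma 1) and then decorrelates the transformed variables; the "Transdef: Tab for variable N" dumps that follow are presumably the per-variable quantile tables backing that flattening. A rough numpy sketch of the two steps, under those assumptions and not the NeuroBayes implementation:

```python
import numpy as np
from scipy.special import erfinv
from scipy.stats import rankdata

def to_gaussian(x):
    """Rank-transform one variable to a uniform distribution in (0,1),
    then map it onto a standard Gaussian (mean 0, sigma 1)."""
    u = rankdata(x) / (len(x) + 1.0)
    return np.sqrt(2.0) * erfinv(2.0 * u - 1.0)

def decorrelate(X):
    """Whiten the Gaussianised inputs via the covariance eigen-decomposition."""
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ eigvec / np.sqrt(eigval)

# X: (n_events, n_variables) array of raw inputs (Ht, LepAPt, ...);
# stand-in data here just to make the sketch runnable.
X = np.random.lognormal(size=(1000, 13))
X_gauss = np.column_stack([to_gaussian(X[:, i]) for i in range(X.shape[1])])
X_prep = decorrelate(X_gauss)
```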
Signal fraction: 62.583374 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 60.0046387 63.1605568 65.489769 67.0583725 68.3219299 69.4478149 70.4465408 71.3224335 72.1702499 73.0569 73.8070831 74.432785 75.144104 75.805954 76.4837036 77.1691742 77.8834686 78.518158 79.1094513 79.7322845 80.411972 81.0086212 81.7208862 82.5701752 83.3773041 84.1730347 85.1305008 86.0232697 86.9060669 87.8904343 88.8898773 90.0353088 91.1164856 92.1782913 93.1035614 94.0769501 95.2103424 96.3232269 97.4534607 98.6275558 99.6593933 100.750702 101.864105 103.057625 104.11908 105.026382 106.150841 107.01609 107.998566 109.144409 110.20546 111.323082 112.388733 113.419769 114.468079 115.507751 116.540367 117.650116 118.779205 119.968819 121.11837 122.261566 123.328156 124.306885 125.407913 126.535431 127.681976 128.720184 129.811584 131.112106 132.402695 133.669861 135.10173 136.711594 138.480774 140.432953 142.300018 144.379547 146.521393 148.777039 150.986511 153.610016 155.942841 158.288971 160.787369 163.657715 167.13736 170.298706 173.796555 177.620758 181.979431 186.792236 192.863892 199.220261 206.618744 216.022491 228.665466 244.12204 269.492615 318.281647 782.079102 ------------------------------ Transdef: Tab for variable 3 20.0008526 20.2994957 20.6222458 20.9314308 21.1509476 21.4078712 21.6363716 21.8704643 22.0723038 22.2611465 22.4851093 22.6812401 22.8724041 23.0949783 23.2759628 23.4327431 23.6257191 23.8154449 24.018671 24.2134361 24.4254417 24.5955086 24.7845497 24.9951744 25.1958237 25.3960838 25.5822258 25.7497253 25.9355965 26.1476135 26.3367577 26.5295868 26.7020035 26.9018059 27.1068268 27.2886982 27.508461 27.6935425 27.8847942 28.0885353 28.2697868 28.5067348 28.7097244 28.9066925 29.1315956 29.350914 29.5779495 29.7544022 29.9749184 30.1700058 30.4379082 30.6465759 30.8908272 31.1215553 31.3859138 31.6292553 31.853344 32.0918579 32.325798 32.5892258 32.7961388 33.0869064 33.3238068 33.6349678 33.934288 34.2032928 34.5106659 34.8284607 35.1277313 35.4321671 35.7498398 36.093998 36.4518967 36.7838516 37.1238174 37.5026245 37.892807 38.3122406 38.7620201 39.2266083 39.662529 40.1396141 40.6462631 40.9990768 41.5371628 42.0569115 42.7379723 43.4006081 44.1364288 44.9819374 45.9588737 46.9662781 47.9906769 49.2978554 50.684494 52.3898048 54.5028267 57.4267197 62.3227081 70.6404037 228.385986 ------------------------------ Transdef: Tab for variable 4 10.0005026 10.1132498 10.2454109 10.3727074 10.4546909 10.5653381 10.6818848 10.8119526 10.9255657 11.0440178 11.1783218 11.3037605 11.4513216 11.5686131 11.688427 11.8091984 11.9432688 12.0761538 12.1926432 12.316081 12.447629 12.5909319 12.7216988 12.8680725 13.0264463 13.193882 13.3370781 13.4957876 13.6273479 13.797411 13.9502125 14.0818367 14.2230625 14.3844948 14.5480471 14.7159271 14.8966885 15.0797052 15.238677 15.3946514 15.5605783 15.728754 15.9328003 16.1242199 16.3239861 16.5037117 16.688736 16.8386383 17.039402 17.2282753 17.4336739 17.6343727 17.8176079 17.9862099 18.1569481 18.3710136 18.5745811 18.8048096 19.0298195 19.2487526 19.4806175 19.7384987 19.9647179 20.1825714 20.2983112 20.4789925 20.6675987 20.8937263 
21.1215401 21.350975 21.5970993 21.8183174 22.0315952 22.2975998 22.5219612 22.7712116 23.0337315 23.311409 23.5927238 23.8887596 24.1781082 24.4668007 24.7446651 25.1070671 25.5081978 25.8719273 26.2489967 26.6507969 27.093956 27.5553646 28.0134392 28.4544601 29.0113792 29.6485023 30.405674 31.2269306 32.301918 33.6273193 35.9641724 39.9884949 71.4863358 ------------------------------ Transdef: Tab for variable 5 1.07960415 2.5016191 2.91821098 3.22046161 3.4528532 3.67850971 3.85254812 4.02116394 4.13311291 4.23899364 4.34421682 4.44350958 4.53953648 4.62910366 4.70110607 4.78535843 4.86011648 4.9369812 5.00733423 5.0677042 5.12691021 5.18826389 5.25102901 5.30525208 5.36469078 5.41789913 5.46900558 5.51719236 5.54541683 5.58734035 5.63629532 5.69217396 5.73882008 5.78298807 5.82764673 5.87380075 5.92230654 5.96235514 6.00555897 6.05106354 6.0945797 6.13824272 6.18328047 6.22007847 6.25822878 6.29009151 6.33151245 6.3758235 6.41860485 6.46174335 6.50634909 6.55051327 6.59488106 6.63665915 6.67402124 6.72036219 6.75820017 6.79457664 6.83507824 6.87683487 6.91282797 6.96296406 6.9967823 7.04142046 7.08165026 7.13059998 7.17220974 7.22218513 7.26986456 7.31232357 7.36099625 7.40485144 7.44978428 7.50139761 7.54606533 7.59528255 7.64169121 7.69541836 7.75513744 7.81214857 7.86277485 7.92387867 7.98532486 8.0485878 8.11431599 8.18091774 8.24525356 8.31570339 8.40111923 8.49152374 8.58493996 8.69274616 8.81053448 8.93764782 9.08921051 9.26966476 9.48702145 9.78413582 10.2494373 11.2697334 19.048111 ------------------------------ Transdef: Tab for variable 6 15.000617 19.8895378 24.7883167 25.9472198 27.0155735 27.7233467 28.6694183 29.3895798 30.0848942 30.7176876 31.3012943 31.9037685 32.3396912 32.8657913 33.3645439 33.7839737 34.2309875 34.6655884 34.9652977 35.3898849 35.7721443 36.1323242 36.5197983 36.8840332 37.2254066 37.562603 37.9484482 38.2994843 38.6672592 39.0060997 39.3600922 39.7111816 40.0616226 40.3591995 40.7802811 41.1408005 41.5368118 41.9290314 42.3429451 42.7228928 43.1153374 43.3710327 43.8134155 44.1215439 44.5804291 45.0778351 45.5381355 46.0209351 46.5216599 47.0025177 47.4697113 47.921524 48.3451614 48.8589363 49.3240051 49.8618584 50.3173523 50.7089386 51.1698227 51.7421227 52.2870674 52.7988434 53.3064499 53.8968811 54.4042053 55.0087357 55.607872 56.1544113 56.7454605 57.2630692 57.7543373 58.2720108 58.8445892 59.4704742 59.9952774 60.5960312 61.2361832 61.8441734 62.5261841 63.1857758 63.8825455 64.5944214 65.3005066 66.045929 66.8737183 67.6958466 68.6527863 69.5844421 70.6511383 71.9358063 73.3592529 74.9476624 76.659729 77.7560806 80.0563889 82.5276184 85.8595963 90.1207123 96.3152008 108.230904 236.562576 ------------------------------ Transdef: Tab for variable 7 30.1476021 32.8939667 33.7959061 34.5315094 35.0740738 35.5087662 35.9888916 36.3966942 36.8417206 37.204937 37.5733948 37.9117737 38.2146072 38.5708504 38.871666 39.2003326 39.5523834 39.8115273 40.1064911 40.3878326 40.7003708 40.9990692 41.3245049 41.6866455 42.0805931 42.5374985 42.9392319 43.3831177 44.0053482 44.5438652 44.9398804 45.4612122 45.9914246 46.5751266 47.3156815 47.9881363 48.6183167 49.1932144 49.7415237 50.4861679 51.1773376 51.7891922 52.5021667 53.0965729 53.7068481 54.3907776 55.1110001 55.8122139 56.4450302 57.0621948 57.7424469 58.388916 59.0092201 59.7031479 60.3182983 60.9636993 61.436554 62.1115608 62.794487 63.3654785 64.1554642 64.9968338 65.8425598 66.7680206 67.6701965 68.6650772 69.7498322 70.8358307 72.0703125 73.3245544 74.470047 75.8127899 77.0974579 78.5232849 
79.741272 81.3005371 82.8778687 84.5427399 86.2344894 88.0121613 89.8820267 91.7558441 93.8924561 96.0735626 98.4428101 101.094421 103.919029 106.826172 109.684715 112.9814 116.910431 121.10627 126.000168 131.962082 138.492645 146.423172 155.43277 168.34024 187.696228 225.027252 477.637085 ------------------------------ Transdef: Tab for variable 8 0.802010298 22.7607288 27.6515121 30.1236305 31.3686562 32.2432938 32.8545227 33.3739243 33.8914413 34.2860336 34.6451721 35.0047913 35.3801956 35.7092247 36.0461884 36.3631058 36.6592789 36.8999786 37.1969986 37.5171394 37.7852173 38.067421 38.2900925 38.5792274 38.8232651 39.0749054 39.3057556 39.5581131 39.7678833 40.0253181 40.3271027 40.6185379 40.941124 41.2941208 41.6144867 41.9466591 42.294075 42.6590576 43.0328217 43.4174194 43.8124046 44.2088547 44.5743332 44.9327354 45.3982315 45.8889885 46.3066635 46.7745667 47.2280197 47.6649055 48.0360031 48.4496918 48.8302765 49.2991791 49.6926956 50.1715775 50.6679192 51.1166687 51.632061 52.1736069 52.6690598 53.076622 53.5193253 54.0318375 54.557991 55.1412506 55.65065 56.1458054 56.6236343 57.1274109 57.6534615 58.1918068 58.766777 59.2747231 59.8085098 60.3748169 60.8965797 61.5064621 62.0706177 62.6945343 63.3044662 63.9790344 64.7478333 65.5211258 66.3627014 67.2440948 68.249321 69.3401489 70.505722 71.8819122 73.3351288 74.9829712 76.8619995 78.6356049 81.1493683 84.3984833 88.5686188 94.3462601 103.441803 122.669159 326.868286 ------------------------------ Transdef: Tab for variable 9 49.2546883 62.6598473 64.6544266 66.3546753 67.5787811 68.5529633 69.6457214 70.4529572 71.2506485 71.9537048 72.7442169 73.4704819 74.0413818 74.6421967 75.2404785 75.746666 76.3247375 77.0055847 77.5250092 78.1009064 78.6179504 79.1395111 79.6850586 80.1954803 80.7790375 81.3068314 81.94104 82.5896759 83.2040405 83.8414536 84.5800934 85.2564621 85.9627533 86.6010056 87.1852417 87.9637909 88.6635437 89.5243683 90.3264618 91.1164856 91.9809875 92.8846436 93.5640717 94.3884125 95.2222443 96.0624847 96.8665695 97.8569641 98.7539825 99.6275635 100.489883 101.337997 102.161118 103.059738 103.93074 104.799332 105.624275 106.349747 107.188202 107.95182 108.785599 109.532303 110.375412 111.266632 112.121254 112.939339 113.792099 114.682709 115.444977 116.244858 117.039337 117.947464 118.79702 119.666763 120.531242 121.386345 122.265488 123.100403 123.888382 124.782242 125.713257 126.535431 127.509766 128.497437 129.47345 130.454361 131.543259 132.703369 133.887268 135.152344 136.61969 138.281723 140.207275 142.57782 145.344635 148.496475 152.891159 159.446106 171.471558 203.601807 424.390259 ------------------------------ Transdef: Tab for variable 10 0.0050833188 0.805752635 1.19014502 1.40003705 1.57496929 1.70115638 1.82131851 1.91491127 2.00710797 2.07088995 2.13029909 2.18831825 2.24365592 2.29355812 2.34151196 2.3860302 2.42370415 2.45872021 2.49071026 2.52142906 2.54991102 2.57829404 2.60562563 2.63253307 2.65566683 2.67752624 2.69930458 2.71932006 2.73793554 2.7559166 2.76921892 2.78438187 2.80046988 2.8136611 2.82767725 2.83864069 2.84962177 2.85872841 2.86956573 2.87926435 2.88857508 2.89794636 2.90654445 2.91494489 2.92204857 2.93146992 2.93911839 2.94661045 2.95287943 2.95865107 2.96530914 2.9714241 2.97811127 2.98317432 2.98820615 2.99352932 2.9975071 3.00283051 3.00763035 3.0124321 3.01688576 3.02117658 3.02553773 3.02996731 3.03389764 3.03789949 3.04150319 3.04509544 3.04914498 3.05293751 3.05689001 3.06093073 3.06427693 3.06783962 3.07116556 3.07432914 3.07786274 3.0809803 3.08394003 3.08719206 
3.09002113 3.09310389 3.09567499 3.09858179 3.10132217 3.10416889 3.10686636 3.10989237 3.11238194 3.11434698 3.11694145 3.11965537 3.1222415 3.1248219 3.12730861 3.12926555 3.13198996 3.13386011 3.13618946 3.13877058 3.14159226 ------------------------------ Transdef: Tab for variable 11 1.19226625E-05 0.00630240748 0.0105965156 0.0186235309 0.0249165297 0.0319757462 0.03976053 0.0481117889 0.0569097996 0.0654656887 0.0729372501 0.0808531642 0.0886897445 0.0973302349 0.105083048 0.112657428 0.120618246 0.128480434 0.136720479 0.144973606 0.153583542 0.161200017 0.168416649 0.175638065 0.18294096 0.189626515 0.196652293 0.20328176 0.208381176 0.214579031 0.220736623 0.226904869 0.233005181 0.23896122 0.24527663 0.2509377 0.257148385 0.262125254 0.267488599 0.273562074 0.279040337 0.285103261 0.291464984 0.296717912 0.30122292 0.307044625 0.312719584 0.319120169 0.325021923 0.330625147 0.336485744 0.342564821 0.347563684 0.353212833 0.358320475 0.363227606 0.368529052 0.3749156 0.380608916 0.386174977 0.391076207 0.397207797 0.402451813 0.40853551 0.413241863 0.418721944 0.425143778 0.431429386 0.437250137 0.44415313 0.449441344 0.455138505 0.460459828 0.46720016 0.473818541 0.480154872 0.487521708 0.494900912 0.502875805 0.510262966 0.517783165 0.524337769 0.53339529 0.539795578 0.547008038 0.556613207 0.566766381 0.577104688 0.587728024 0.598435283 0.611946702 0.626989603 0.639928579 0.652807236 0.670769691 0.692760468 0.713360667 0.744863629 0.78319788 0.840545774 1.1246134 ------------------------------ Transdef: Tab for variable 12 0.20003137 0.220507503 0.237397686 0.249793947 0.260485679 0.272301018 0.281308442 0.290687233 0.299029768 0.307319164 0.31459564 0.322418123 0.329396307 0.336342871 0.343713403 0.349655956 0.355770648 0.360957265 0.367488295 0.374084115 0.379885912 0.386487365 0.392344534 0.397761554 0.40176183 0.4063164 0.41056785 0.4144243 0.41855967 0.422072351 0.426626593 0.430821955 0.434990942 0.43875742 0.442759693 0.446754217 0.45077461 0.454856336 0.458063513 0.461986899 0.465917856 0.470044196 0.473483503 0.476857424 0.480395406 0.483769298 0.48752296 0.491858065 0.49574697 0.499410212 0.503214836 0.507341385 0.511344016 0.515204191 0.519486666 0.523668528 0.527931094 0.532208562 0.536985993 0.541069746 0.545783401 0.548749626 0.553838074 0.558312774 0.56249392 0.567077398 0.571491063 0.576860189 0.58194989 0.587538242 0.592473865 0.598201275 0.603425384 0.608363628 0.614321351 0.620009899 0.62621659 0.633014917 0.63871485 0.646136165 0.652125597 0.659069121 0.667044699 0.673792362 0.679592133 0.687273145 0.695759654 0.703020811 0.71243906 0.723445475 0.733443499 0.744187772 0.754551351 0.76800549 0.784019947 0.799485445 0.819433153 0.840740383 0.867098331 0.912776649 1.13469779 ------------------------------ Transdef: Tab for variable 13 20.0082169 21.4542618 22.2193413 22.8510513 23.3923321 23.8927898 24.3984528 24.7758789 25.1604786 25.5667496 25.9434586 26.348753 26.7709694 27.0997467 27.4504509 27.8026199 28.1621838 28.4803734 28.8228455 29.1563301 29.4696999 29.7076569 30.0391541 30.3213711 30.590744 30.8803062 31.1072979 31.3982601 31.6989822 31.983654 32.241291 32.4821968 32.7730026 33.0778961 33.3677139 33.6746445 34.0074539 34.34478 34.5295258 34.8400955 35.1091347 35.4224701 35.733902 36.0461617 36.3265533 36.67416 37.0185699 37.3353348 37.6871948 38.0306015 38.3755455 38.6938934 39.0258255 39.4199905 39.8165588 40.2466736 40.6811714 41.118618 41.5495758 41.9850197 42.4499588 42.8846359 43.2651749 43.7135048 44.2080727 44.7236252 45.2435913 45.8182793 
46.3991127 46.9392929 47.4441452 47.9580002 48.6035385 49.2779999 49.9582748 50.6650009 51.3655396 52.2056084 52.9954071 53.8064499 54.6714783 55.5263596 56.52314 57.4783096 58.5133057 59.4790764 60.7757645 62.1498909 63.5980301 65.1862488 67.119812 68.3752441 70.3995819 72.8720245 75.6225891 78.8709412 82.9932098 87.5256805 93.6559601 104.64151 232.717926 ------------------------------ Transdef: Tab for variable 14 10.0116768 10.6114464 11.0188112 11.3428936 11.652092 11.9910412 12.273675 12.4767857 12.7433147 13.0266008 13.3107252 13.5702734 13.821455 14.054533 14.3142767 14.5397816 14.7947807 15.0481844 15.2934399 15.5406685 15.7748146 16.0380402 16.3095779 16.5291214 16.7653427 16.9656029 17.1911545 17.4473915 17.6948586 17.9249363 18.1612415 18.3909035 18.635519 18.8277473 19.0595188 19.2930565 19.5419884 19.7971573 20.0152588 20.2488022 20.4982719 20.6870689 20.9367638 21.1937103 21.4250183 21.6928024 21.9340172 22.1470146 22.3938618 22.6381397 22.9000587 23.165062 23.4054852 23.6742783 23.9223251 24.1867943 24.4662857 24.714695 24.9785271 25.272583 25.5340157 25.8104095 26.0909462 26.3801975 26.6559372 26.9543114 27.2485962 27.5443268 27.8329906 28.0346222 28.3348045 28.6798096 29.0146217 29.3488693 29.7144699 30.1012497 30.4935875 30.8943272 31.3279915 31.7589264 32.2540817 32.7496948 33.2745132 33.8194656 34.3912048 34.9747162 35.6020355 36.3962059 37.200943 38.1308899 39.1280136 39.9527016 41.0626144 42.5572586 43.9628677 45.5744629 47.7579193 50.57481 54.7422791 62.0308228 122.221924 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 54.8 36.6 40.4 13.7 35.9 52.7 40.2 51.0 -26.1 -12.2 -25.9 3.9 11.1 2 54.8 100.0 62.0 51.7 25.0 62.3 94.2 68.2 90.4 -50.5 -20.8 -42.6 28.7 24.3 3 36.6 62.0 100.0 30.3 -0.3 26.4 65.0 47.0 68.0 -17.3 -22.5 -47.6 63.0 10.9 4 40.4 51.7 30.3 100.0 4.6 27.0 52.8 41.2 57.2 -13.2 -26.9 -55.2 10.2 69.0 5 13.7 25.0 -0.3 4.6 100.0 81.8 -3.1 50.6 50.6 24.3 -2.8 -1.8 -8.5 -3.1 6 35.9 62.3 26.4 27.0 81.8 100.0 40.0 74.3 76.9 -5.7 -11.9 -20.8 5.8 9.3 7 52.7 94.2 65.0 52.8 -3.1 40.0 100.0 60.2 77.9 -57.7 -19.8 -43.1 33.0 26.9 8 40.2 68.2 47.0 41.2 50.6 74.3 60.2 100.0 79.3 -11.7 -19.3 -35.2 22.1 21.1 9 51.0 90.4 68.0 57.2 50.6 76.9 77.9 79.3 100.0 -25.0 -24.1 -48.8 33.5 28.9 10 -26.1 -50.5 -17.3 -13.2 24.3 -5.7 -57.7 -11.7 -25.0 100.0 2.6 9.2 -6.4 -4.5 11 -12.2 -20.8 -22.5 -26.9 -2.8 -11.9 -19.8 -19.3 -24.1 2.6 100.0 53.6 -9.3 -17.7 12 -25.9 -42.6 -47.6 -55.2 -1.8 -20.8 -43.1 -35.2 -48.8 9.2 53.6 100.0 -21.9 -30.2 13 3.9 28.7 63.0 10.2 -8.5 5.8 33.0 22.1 33.5 -6.4 -9.3 -21.9 100.0 39.8 14 11.1 24.3 10.9 69.0 -3.1 9.3 26.9 21.1 28.9 -4.5 -17.7 -30.2 39.8 100.0 TOTAL CORRELATION TO TARGET (diagonal) 126.850027 TOTAL CORRELATION OF ALL VARIABLES 60.873657 ROUND 1: MAX CORR ( 60.8736493) AFTER KILLING INPUT VARIABLE 11 CONTR 0.030694073 ROUND 2: MAX CORR ( 60.8704605) AFTER KILLING INPUT VARIABLE 7 CONTR 0.623073956 ROUND 3: MAX CORR ( 60.8145645) AFTER KILLING INPUT VARIABLE 10 CONTR 2.60800779 ROUND 4: MAX CORR ( 60.7489717) AFTER KILLING INPUT VARIABLE 8 CONTR 2.82377299 ROUND 5: MAX CORR ( 60.6355015) AFTER KILLING INPUT VARIABLE 5 CONTR 3.71126951 ROUND 6: MAX CORR ( 60.3907048) AFTER KILLING INPUT VARIABLE 14 CONTR 5.44305121 ROUND 7: MAX CORR ( 60.0863639) AFTER KILLING INPUT VARIABLE 12 CONTR 6.05525414 ROUND 8: MAX CORR ( 59.3968866) AFTER KILLING INPUT VARIABLE 9 CONTR 9.07639764 ROUND 9: MAX CORR ( 59.3113819) AFTER KILLING INPUT VARIABLE 6 CONTR 
3.18592468 ROUND 10: MAX CORR ( 57.8032925) AFTER KILLING INPUT VARIABLE 4 CONTR 13.2898228 ROUND 11: MAX CORR ( 56.1788983) AFTER KILLING INPUT VARIABLE 3 CONTR 13.6070574 ROUND 12: MAX CORR ( 54.8036127) AFTER KILLING INPUT VARIABLE 13 CONTR 12.3544587 LAST REMAINING VARIABLE: 2 total correlation to target: 60.873657 % total significance: 117.3609 sigma correlations of single variables to target: variable 2: 54.8036127 % , in sigma: 105.658205 variable 3: 36.6476719 % , in sigma: 70.654598 variable 4: 40.3999396 % , in sigma: 77.8887537 variable 5: 13.6671265 % , in sigma: 26.3494317 variable 6: 35.8812481 % , in sigma: 69.1769772 variable 7: 52.6670469 % , in sigma: 101.539029 variable 8: 40.2422686 % , in sigma: 77.5847732 variable 9: 51.0447747 % , in sigma: 98.4113821 variable 10: -26.0531629 % , in sigma: 50.2289956 variable 11: -12.2350104 % , in sigma: 23.5883946 variable 12: -25.9122508 % , in sigma: 49.9573252 variable 13: 3.88616828 % , in sigma: 7.49230835 variable 14: 11.1442887 % , in sigma: 21.4855459 variables sorted by significance: 1 most relevant variable 2 corr 54.8036118 , in sigma: 105.658203 2 most relevant variable 13 corr 12.3544588 , in sigma: 23.8186841 3 most relevant variable 3 corr 13.6070576 , in sigma: 26.2336223 4 most relevant variable 4 corr 13.2898226 , in sigma: 25.6220115 5 most relevant variable 6 corr 3.18592477 , in sigma: 6.14227923 6 most relevant variable 9 corr 9.0763979 , in sigma: 17.4987717 7 most relevant variable 12 corr 6.05525398 , in sigma: 11.6741805 8 most relevant variable 14 corr 5.44305134 , in sigma: 10.4938891 9 most relevant variable 5 corr 3.71126962 , in sigma: 7.1551138 10 most relevant variable 8 corr 2.82377291 , in sigma: 5.44407133 11 most relevant variable 10 corr 2.60800767 , in sigma: 5.02808839 12 most relevant variable 7 corr 0.623073936 , in sigma: 1.20125062 13 most relevant variable 11 corr 0.0306940731 , in sigma: 0.0591764028 global correlations between input variables: variable 2: 99.1386436 % variable 3: 93.2430803 % variable 4: 91.2007558 % variable 5: 95.2468675 % variable 6: 93.6160525 % variable 7: 98.6766239 % variable 8: 87.2486785 % variable 9: 98.904169 % variable 10: 72.1714369 % variable 11: 54.7935728 % variable 12: 73.9258507 % variable 13: 84.3824424 % variable 14: 86.6876443 % significance loss when removing single variables: variable 2: corr = 7.27771905 % , sigma = 14.0310226 variable 3: corr = 13.2085628 % , sigma = 25.4653473 variable 4: corr = 16.198068 % , sigma = 31.2289409 variable 5: corr = 4.50101764 % , sigma = 8.67770244 variable 6: corr = 4.60903962 % , sigma = 8.8859626 variable 7: corr = 0.623698596 % , sigma = 1.20245493 variable 8: corr = 2.33929005 % , sigma = 4.51001631 variable 9: corr = 8.83448942 % , sigma = 17.0323861 variable 10: corr = 2.52560387 % , sigma = 4.86921861 variable 11: corr = 0.030694073 % , sigma = 0.0591764027 variable 12: corr = 5.27887471 % , sigma = 10.177366 variable 13: corr = 8.11244 % , sigma = 15.6403165 variable 14: corr = 5.43368971 % , sigma = 10.4758404 Keep only 11 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 12 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 4 --> 12.0367928 sigma out 15 active outputs RANK 2 NODE 11 --> 11.174283 sigma out 15 active outputs RANK 3 NODE 8 --> 10.3726263 sigma out 15 active outputs RANK 4 NODE 7 --> 9.71386528 
sigma out 15 active outputs RANK 5 NODE 9 --> 9.31745148 sigma out 15 active outputs RANK 6 NODE 10 --> 9.07838726 sigma out 15 active outputs RANK 7 NODE 2 --> 8.73185349 sigma out 15 active outputs RANK 8 NODE 6 --> 7.97973251 sigma out 15 active outputs RANK 9 NODE 1 --> 7.65449858 sigma out 15 active outputs RANK 10 NODE 3 --> 6.07497311 sigma out 15 active outputs RANK 11 NODE 5 --> 5.74853706 sigma out 15 active outputs RANK 12 NODE 12 --> 5.43053865 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 10 --> 15.6231422 sigma in 12act. ( 16.3237953 sig out 1act.) RANK 2 NODE 9 --> 13.6851482 sigma in 12act. ( 11.2885466 sig out 1act.) RANK 3 NODE 1 --> 10.5305281 sigma in 12act. ( 10.5284538 sig out 1act.) RANK 4 NODE 6 --> 9.08061028 sigma in 12act. ( 6.83749723 sig out 1act.) RANK 5 NODE 5 --> 8.82573891 sigma in 12act. ( 7.09617376 sig out 1act.) RANK 6 NODE 13 --> 8.3557148 sigma in 12act. ( 6.91404104 sig out 1act.) RANK 7 NODE 15 --> 8.05378437 sigma in 12act. ( 4.79847288 sig out 1act.) RANK 8 NODE 3 --> 6.96076536 sigma in 12act. ( 8.57647133 sig out 1act.) RANK 9 NODE 8 --> 3.60385442 sigma in 12act. ( 2.57916594 sig out 1act.) RANK 10 NODE 2 --> 3.45513296 sigma in 12act. ( 2.39985871 sig out 1act.) RANK 11 NODE 4 --> 2.78081226 sigma in 12act. ( 4.26337862 sig out 1act.) RANK 12 NODE 7 --> 2.7367487 sigma in 12act. ( 2.83320689 sig out 1act.) RANK 13 NODE 12 --> 2.31068182 sigma in 12act. ( 2.54064083 sig out 1act.) RANK 14 NODE 11 --> 2.18854952 sigma in 12act. ( 2.64332199 sig out 1act.) RANK 15 NODE 14 --> 1.69288683 sigma in 12act. ( 2.4701066 sig out 1act.) sorted by output significance RANK 1 NODE 10 --> 16.3237953 sigma out 1act.( 15.6231422 sig in 12act.) RANK 2 NODE 9 --> 11.2885466 sigma out 1act.( 13.6851482 sig in 12act.) RANK 3 NODE 1 --> 10.5284538 sigma out 1act.( 10.5305281 sig in 12act.) RANK 4 NODE 3 --> 8.57647133 sigma out 1act.( 6.96076536 sig in 12act.) RANK 5 NODE 5 --> 7.09617376 sigma out 1act.( 8.82573891 sig in 12act.) RANK 6 NODE 13 --> 6.91404104 sigma out 1act.( 8.3557148 sig in 12act.) RANK 7 NODE 6 --> 6.83749723 sigma out 1act.( 9.08061028 sig in 12act.) RANK 8 NODE 15 --> 4.79847288 sigma out 1act.( 8.05378437 sig in 12act.) RANK 9 NODE 4 --> 4.26337862 sigma out 1act.( 2.78081226 sig in 12act.) RANK 10 NODE 7 --> 2.83320689 sigma out 1act.( 2.7367487 sig in 12act.) RANK 11 NODE 11 --> 2.64332199 sigma out 1act.( 2.18854952 sig in 12act.) RANK 12 NODE 8 --> 2.57916594 sigma out 1act.( 3.60385442 sig in 12act.) RANK 13 NODE 12 --> 2.54064083 sigma out 1act.( 2.31068182 sig in 12act.) RANK 14 NODE 14 --> 2.4701066 sigma out 1act.( 1.69288683 sig in 12act.) RANK 15 NODE 2 --> 2.39985871 sigma out 1act.( 3.45513296 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 28.3618813 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 17.5195465 sigma out 15 active outputs RANK 2 NODE 7 --> 13.2342243 sigma out 15 active outputs RANK 3 NODE 4 --> 12.9457865 sigma out 15 active outputs RANK 4 NODE 11 --> 11.478776 sigma out 15 active outputs RANK 5 NODE 10 --> 11.4617901 sigma out 15 active outputs RANK 6 NODE 8 --> 11.1504965 sigma out 15 active outputs RANK 7 NODE 9 --> 10.9709558 sigma out 15 active outputs RANK 8 NODE 6 --> 9.94712162 sigma out 15 active outputs RANK 9 NODE 1 --> 9.44495869 sigma out 15 active outputs RANK 10 NODE 12 --> 9.11039734 sigma out 15 active outputs RANK 11 NODE 3 --> 7.64950609 sigma out 15 active outputs RANK 12 NODE 5 --> 6.23927021 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 10 --> 16.3638992 sigma in 12act. ( 16.2820282 sig out 1act.) RANK 2 NODE 9 --> 12.3213339 sigma in 12act. ( 10.0613527 sig out 1act.) RANK 3 NODE 14 --> 11.5189352 sigma in 12act. ( 4.36009026 sig out 1act.) RANK 4 NODE 1 --> 11.0552578 sigma in 12act. ( 10.5864763 sig out 1act.) RANK 5 NODE 11 --> 10.5177917 sigma in 12act. ( 4.2022233 sig out 1act.) RANK 6 NODE 12 --> 10.3969517 sigma in 12act. ( 4.31550503 sig out 1act.) RANK 7 NODE 3 --> 10.0151281 sigma in 12act. ( 8.98133087 sig out 1act.) RANK 8 NODE 5 --> 9.88375282 sigma in 12act. ( 6.62099743 sig out 1act.) RANK 9 NODE 4 --> 9.04511452 sigma in 12act. ( 6.13503408 sig out 1act.) RANK 10 NODE 15 --> 9.01848507 sigma in 12act. ( 4.20581007 sig out 1act.) RANK 11 NODE 13 --> 8.82342148 sigma in 12act. ( 6.29586697 sig out 1act.) RANK 12 NODE 6 --> 8.62157726 sigma in 12act. ( 6.08374071 sig out 1act.) RANK 13 NODE 8 --> 7.41978312 sigma in 12act. ( 2.45417166 sig out 1act.) RANK 14 NODE 7 --> 6.65748167 sigma in 12act. ( 3.78475976 sig out 1act.) RANK 15 NODE 2 --> 4.49249458 sigma in 12act. ( 2.16543293 sig out 1act.) sorted by output significance RANK 1 NODE 10 --> 16.2820282 sigma out 1act.( 16.3638992 sig in 12act.) RANK 2 NODE 1 --> 10.5864763 sigma out 1act.( 11.0552578 sig in 12act.) RANK 3 NODE 9 --> 10.0613527 sigma out 1act.( 12.3213339 sig in 12act.) RANK 4 NODE 3 --> 8.98133087 sigma out 1act.( 10.0151281 sig in 12act.) RANK 5 NODE 5 --> 6.62099743 sigma out 1act.( 9.88375282 sig in 12act.) RANK 6 NODE 13 --> 6.29586697 sigma out 1act.( 8.82342148 sig in 12act.) RANK 7 NODE 4 --> 6.13503408 sigma out 1act.( 9.04511452 sig in 12act.) RANK 8 NODE 6 --> 6.08374071 sigma out 1act.( 8.62157726 sig in 12act.) RANK 9 NODE 14 --> 4.36009026 sigma out 1act.( 11.5189352 sig in 12act.) RANK 10 NODE 12 --> 4.31550503 sigma out 1act.( 10.3969517 sig in 12act.) RANK 11 NODE 15 --> 4.20581007 sigma out 1act.( 9.01848507 sig in 12act.) RANK 12 NODE 11 --> 4.2022233 sigma out 1act.( 10.5177917 sig in 12act.) RANK 13 NODE 7 --> 3.78475976 sigma out 1act.( 6.65748167 sig in 12act.) RANK 14 NODE 8 --> 2.45417166 sigma out 1act.( 7.41978312 sig in 12act.) RANK 15 NODE 2 --> 2.16543293 sigma out 1act.( 4.49249458 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 28.5520077 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.406308323 *** contribution from regularisation: 0.00478268228 *** contribution from error: -0.411091 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.448987484 *** contribution from regularisation: 0.0019035507 *** contribution from error: -0.450891048 *********************************************** -----------------> Test sample ENTER BFGS code START -41005.279 0.222001418 -0.0126494523 EXIT FROM BFGS code FG_START 0. 0.222001418 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.475741178 *** contribution from regularisation: 0.00217164238 *** contribution from error: -0.477912813 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -43438.9752 0.222001418 71.2891312 EXIT FROM BFGS code FG_LNSRCH 0. 0.233075902 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.507166743 *** contribution from regularisation: 0.00383110647 *** contribution from error: -0.510997832 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46308.3817 0.233075902 50.2583275 EXIT FROM BFGS code NEW_X -46308.3817 0.233075902 50.2583275 ENTER BFGS code NEW_X -46308.3817 0.233075902 50.2583275 EXIT FROM BFGS code FG_LNSRCH 0. 0.238783419 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.509918749 *** contribution from regularisation: 0.00349428598 *** contribution from error: -0.513413012 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46559.6637 0.238783419 30.3997803 EXIT FROM BFGS code NEW_X -46559.6637 0.238783419 30.3997803 ENTER BFGS code NEW_X -46559.6637 0.238783419 30.3997803 EXIT FROM BFGS code FG_LNSRCH 0. 0.246842369 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.511608481 *** contribution from regularisation: 0.00319139776 *** contribution from error: -0.514799893 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46713.9468 0.246842369 14.0682516 EXIT FROM BFGS code NEW_X -46713.9468 0.246842369 14.0682516 ENTER BFGS code NEW_X -46713.9468 0.246842369 14.0682516 EXIT FROM BFGS code FG_LNSRCH 0. 0.25966078 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.51275748 *** contribution from regularisation: 0.00309806271 *** contribution from error: -0.515855551 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46818.862 0.25966078 -10.080081 EXIT FROM BFGS code NEW_X -46818.862 0.25966078 -10.080081 ENTER BFGS code NEW_X -46818.862 0.25966078 -10.080081 EXIT FROM BFGS code FG_LNSRCH 0. 0.260756433 0. 
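[Editor's note] The variable-ranking output earlier (the "ROUND n: MAX CORR ... AFTER KILLING INPUT VARIABLE ..." lines, ending in "Keep only 11 most significant input variables") is a greedy backward elimination: in each round the input whose removal costs the least total correlation to the target is dropped, and that cost is quoted as the variable's contribution. A small sketch of the idea; the exact NeuroBayes criterion and correlation definition may differ:

```python
import numpy as np

def total_correlation(X, y):
    """Multiple correlation of the set of columns in X with the target y."""
    if X.shape[1] == 1:
        return abs(np.corrcoef(X[:, 0], y)[0, 1])
    R = np.corrcoef(X, rowvar=False)
    c = np.array([np.corrcoef(X[:, i], y)[0, 1] for i in range(X.shape[1])])
    return float(np.sqrt(c @ np.linalg.solve(R, c)))

def backward_eliminate(X, y, names):
    """Greedily drop the variable whose removal costs the least total correlation."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        base = total_correlation(X[:, keep], y)
        losses = {i: base - total_correlation(X[:, [j for j in keep if j != i]], y)
                  for i in keep}
        worst = min(losses, key=losses.get)      # cheapest variable to remove
        print(f"killing {names[worst]}: contribution {losses[worst]:.4f}")
        keep.remove(worst)
    print(f"last remaining variable: {names[keep[0]]}")

# Stand-in data to make the sketch runnable.
rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 4))
y = (X @ np.array([1.0, 0.5, 0.1, 0.0]) + rng.normal(size=5000) > 0).astype(float)
backward_eliminate(X, y, names=["v1", "v2", "v3", "v4"])
```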
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.513557434 *** contribution from regularisation: 0.00293893926 *** contribution from error: -0.51649636 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46891.9022 0.260756433 -8.57259941 EXIT FROM BFGS code NEW_X -46891.9022 0.260756433 -8.57259941 ENTER BFGS code NEW_X -46891.9022 0.260756433 -8.57259941 EXIT FROM BFGS code FG_LNSRCH 0. 0.262271583 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.514675498 *** contribution from regularisation: 0.00297662592 *** contribution from error: -0.517652094 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46993.9898 0.262271583 -4.9049964 EXIT FROM BFGS code NEW_X -46993.9898 0.262271583 -4.9049964 ENTER BFGS code NEW_X -46993.9898 0.262271583 -4.9049964 EXIT FROM BFGS code FG_LNSRCH 0. 0.261285216 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 38.6067238 sigma out 15 active outputs RANK 2 NODE 1 --> 23.1344261 sigma out 15 active outputs RANK 3 NODE 12 --> 18.0077591 sigma out 15 active outputs RANK 4 NODE 10 --> 15.0437355 sigma out 15 active outputs RANK 5 NODE 6 --> 12.1194401 sigma out 15 active outputs RANK 6 NODE 7 --> 11.9416838 sigma out 15 active outputs RANK 7 NODE 9 --> 11.8988314 sigma out 15 active outputs RANK 8 NODE 8 --> 11.6443081 sigma out 15 active outputs RANK 9 NODE 4 --> 9.21136665 sigma out 15 active outputs RANK 10 NODE 11 --> 8.46520615 sigma out 15 active outputs RANK 11 NODE 3 --> 7.97056293 sigma out 15 active outputs RANK 12 NODE 5 --> 4.4651804 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 12 --> 34.0287933 sigma in 12act. ( 28.3807755 sig out 1act.) RANK 2 NODE 4 --> 27.3948975 sigma in 12act. ( 26.0095348 sig out 1act.) RANK 3 NODE 7 --> 18.6627045 sigma in 12act. ( 16.386549 sig out 1act.) RANK 4 NODE 11 --> 17.7051201 sigma in 12act. ( 18.1018543 sig out 1act.) RANK 5 NODE 14 --> 16.7686501 sigma in 12act. ( 15.1839237 sig out 1act.) RANK 6 NODE 1 --> 9.45489597 sigma in 12act. ( 6.88200808 sig out 1act.) RANK 7 NODE 9 --> 9.0628624 sigma in 12act. ( 9.48392487 sig out 1act.) RANK 8 NODE 5 --> 8.66062737 sigma in 12act. ( 7.56951427 sig out 1act.) RANK 9 NODE 15 --> 8.20324612 sigma in 12act. ( 7.13051081 sig out 1act.) RANK 10 NODE 13 --> 8.18791008 sigma in 12act. ( 7.30488777 sig out 1act.) RANK 11 NODE 10 --> 6.692276 sigma in 12act. ( 4.85872793 sig out 1act.) RANK 12 NODE 8 --> 6.34800053 sigma in 12act. ( 5.31408978 sig out 1act.) RANK 13 NODE 3 --> 6.2195611 sigma in 12act. ( 3.90946865 sig out 1act.) RANK 14 NODE 6 --> 4.29950285 sigma in 12act. ( 1.56450009 sig out 1act.) RANK 15 NODE 2 --> 2.89631224 sigma in 12act. ( 0.902129233 sig out 1act.) sorted by output significance RANK 1 NODE 12 --> 28.3807755 sigma out 1act.( 34.0287933 sig in 12act.) RANK 2 NODE 4 --> 26.0095348 sigma out 1act.( 27.3948975 sig in 12act.) RANK 3 NODE 11 --> 18.1018543 sigma out 1act.( 17.7051201 sig in 12act.) RANK 4 NODE 7 --> 16.386549 sigma out 1act.( 18.6627045 sig in 12act.) RANK 5 NODE 14 --> 15.1839237 sigma out 1act.( 16.7686501 sig in 12act.) RANK 6 NODE 9 --> 9.48392487 sigma out 1act.( 9.0628624 sig in 12act.) 
RANK 7 NODE 5 --> 7.56951427 sigma out 1act.( 8.66062737 sig in 12act.) RANK 8 NODE 13 --> 7.30488777 sigma out 1act.( 8.18791008 sig in 12act.) RANK 9 NODE 15 --> 7.13051081 sigma out 1act.( 8.20324612 sig in 12act.) RANK 10 NODE 1 --> 6.88200808 sigma out 1act.( 9.45489597 sig in 12act.) RANK 11 NODE 8 --> 5.31408978 sigma out 1act.( 6.34800053 sig in 12act.) RANK 12 NODE 10 --> 4.85872793 sigma out 1act.( 6.692276 sig in 12act.) RANK 13 NODE 3 --> 3.90946865 sigma out 1act.( 6.2195611 sig in 12act.) RANK 14 NODE 6 --> 1.56450009 sigma out 1act.( 4.29950285 sig in 12act.) RANK 15 NODE 2 --> 0.902129233 sigma out 1act.( 2.89631224 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 51.7485657 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.515595198 *** contribution from regularisation: 0.00295569981 *** contribution from error: -0.518550873 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -47077.9682 0.261285216 -5.38039637 EXIT FROM BFGS code NEW_X -47077.9682 0.261285216 -5.38039637 ENTER BFGS code NEW_X -47077.9682 0.261285216 -5.38039637 EXIT FROM BFGS code FG_LNSRCH 0. 0.246848121 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.51677072 *** contribution from regularisation: 0.00331465458 *** contribution from error: -0.520085394 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47185.3031 0.246848121 11.5128727 EXIT FROM BFGS code NEW_X -47185.3031 0.246848121 11.5128727 ENTER BFGS code NEW_X -47185.3031 0.246848121 11.5128727 EXIT FROM BFGS code FG_LNSRCH 0. 0.246464789 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.518521607 *** contribution from regularisation: 0.0029844495 *** contribution from error: -0.521506071 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47345.1711 0.246464789 -18.0576115 EXIT FROM BFGS code NEW_X -47345.1711 0.246464789 -18.0576115 ENTER BFGS code NEW_X -47345.1711 0.246464789 -18.0576115 EXIT FROM BFGS code FG_LNSRCH 0. 0.243542686 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.518745661 *** contribution from regularisation: 0.00297028339 *** contribution from error: -0.521715939 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47365.6272 0.243542686 -1.21812558 EXIT FROM BFGS code NEW_X -47365.6272 0.243542686 -1.21812558 ENTER BFGS code NEW_X -47365.6272 0.243542686 -1.21812558 EXIT FROM BFGS code FG_LNSRCH 0. 0.239813685 0. 
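[Editor's note] Each "Learn Path" block above reports the same decomposition: the quoted loss function is the error contribution plus the regularisation contribution. Checking it for Learn Path 13 just above (the tiny mismatch is single-precision rounding in the printout):

```python
# Learn Path 13 from the log above: loss = error + regularisation contribution.
error = -0.521715939
regularisation = 0.00297028339
print(error + regularisation)   # -0.51874566..., printed above as -0.518745661
```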
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.519093215 *** contribution from regularisation: 0.00302240346 *** contribution from error: -0.522115648 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47397.3659 0.239813685 -0.233315334 EXIT FROM BFGS code NEW_X -47397.3659 0.239813685 -0.233315334 ENTER BFGS code NEW_X -47397.3659 0.239813685 -0.233315334 EXIT FROM BFGS code FG_LNSRCH 0. 0.210483521 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.520590425 *** contribution from regularisation: 0.00352389994 *** contribution from error: -0.524114311 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47534.0718 0.210483521 1.42394078 EXIT FROM BFGS code NEW_X -47534.0718 0.210483521 1.42394078 ENTER BFGS code NEW_X -47534.0718 0.210483521 1.42394078 EXIT FROM BFGS code FG_LNSRCH 0. 0.203444466 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.521636844 *** contribution from regularisation: 0.00356567302 *** contribution from error: -0.525202513 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47629.6171 0.203444466 -8.70059204 EXIT FROM BFGS code NEW_X -47629.6171 0.203444466 -8.70059204 ENTER BFGS code NEW_X -47629.6171 0.203444466 -8.70059204 EXIT FROM BFGS code FG_LNSRCH 0. 0.19686015 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.522706628 *** contribution from regularisation: 0.0035638914 *** contribution from error: -0.526270509 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47727.2993 0.19686015 -9.96531963 EXIT FROM BFGS code NEW_X -47727.2993 0.19686015 -9.96531963 ENTER BFGS code NEW_X -47727.2993 0.19686015 -9.96531963 EXIT FROM BFGS code FG_LNSRCH 0. 0.189159706 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.523151994 *** contribution from regularisation: 0.00361932092 *** contribution from error: -0.526771307 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47767.9606 0.189159706 -5.268466 EXIT FROM BFGS code NEW_X -47767.9606 0.189159706 -5.268466 ENTER BFGS code NEW_X -47767.9606 0.189159706 -5.268466 EXIT FROM BFGS code FG_LNSRCH 0. 0.183562711 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.523509741 *** contribution from regularisation: 0.00359659619 *** contribution from error: -0.527106345 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47800.6262 0.183562711 -5.1353178 EXIT FROM BFGS code NEW_X -47800.6262 0.183562711 -5.1353178 ENTER BFGS code NEW_X -47800.6262 0.183562711 -5.1353178 EXIT FROM BFGS code FG_LNSRCH 0. 0.156938061 0. 
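[Editor's note] The setup messages at the top ("You use entropy as a loss function", "You want to use the BFGS algorithm", 250 iterations, a 12-15-1 topology after pruning) describe minimising a regularised cross-entropy with a quasi-Newton optimiser. A self-contained sketch of that kind of fit using scipy's BFGS on stand-in data; this is an illustration only, not the NeuroBayes Teacher, and the weight-decay term here is a simple stand-in for its regularisation:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_in, n_hidden = 12, 15                          # 12-15-1 topology as printed above
X = rng.normal(size=(2000, n_in))                # stand-in preprocessed inputs
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)  # stand-in 0/1 targets

def unpack(p):
    """Split the flat parameter vector into layer weights and biases."""
    w1 = p[:n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = p[n_in * n_hidden:n_in * n_hidden + n_hidden]
    w2 = p[-(n_hidden + 1):-1]
    b2 = p[-1]
    return w1, b1, w2, b2

def loss(p, lam=1e-3):
    """Cross-entropy ('entropy') error plus a weight-decay regularisation term."""
    w1, b1, w2, b2 = unpack(p)
    hidden = np.tanh(X @ w1 + b1)
    out = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))
    entropy = -np.mean(y * np.log(out + 1e-12) + (1 - y) * np.log(1 - out + 1e-12))
    return entropy + lam * np.sum(p ** 2)

n_par = n_in * n_hidden + n_hidden + n_hidden + 1
p0 = rng.normal(scale=0.1, size=n_par)
fit = minimize(loss, p0, method="BFGS", options={"maxiter": 250})
print(fit.fun, fit.nit)
```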
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 64.1976166 sigma out 15 active outputs RANK 2 NODE 1 --> 38.1521034 sigma out 15 active outputs RANK 3 NODE 12 --> 36.2005119 sigma out 15 active outputs RANK 4 NODE 10 --> 21.2186451 sigma out 15 active outputs RANK 5 NODE 9 --> 20.3449116 sigma out 15 active outputs RANK 6 NODE 6 --> 19.3749409 sigma out 15 active outputs RANK 7 NODE 3 --> 18.7560043 sigma out 15 active outputs RANK 8 NODE 4 --> 18.0045872 sigma out 15 active outputs RANK 9 NODE 7 --> 15.7450399 sigma out 15 active outputs RANK 10 NODE 11 --> 14.1165905 sigma out 15 active outputs RANK 11 NODE 8 --> 13.2156744 sigma out 15 active outputs RANK 12 NODE 5 --> 13.1678295 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 43.6135025 sigma in 12act. ( 38.6618462 sig out 1act.) RANK 2 NODE 12 --> 39.4189873 sigma in 12act. ( 43.1618729 sig out 1act.) RANK 3 NODE 11 --> 38.2672005 sigma in 12act. ( 54.2857628 sig out 1act.) RANK 4 NODE 1 --> 35.4333344 sigma in 12act. ( 38.4504166 sig out 1act.) RANK 5 NODE 15 --> 30.2675781 sigma in 12act. ( 31.0404377 sig out 1act.) RANK 6 NODE 14 --> 23.5304737 sigma in 12act. ( 24.7876453 sig out 1act.) RANK 7 NODE 8 --> 21.7599392 sigma in 12act. ( 25.3642273 sig out 1act.) RANK 8 NODE 4 --> 21.45858 sigma in 12act. ( 19.5304089 sig out 1act.) RANK 9 NODE 9 --> 20.2408791 sigma in 12act. ( 23.9637394 sig out 1act.) RANK 10 NODE 10 --> 13.2983313 sigma in 12act. ( 11.4211493 sig out 1act.) RANK 11 NODE 5 --> 12.5313311 sigma in 12act. ( 13.3574076 sig out 1act.) RANK 12 NODE 13 --> 12.0659533 sigma in 12act. ( 10.7809286 sig out 1act.) RANK 13 NODE 6 --> 10.2133579 sigma in 12act. ( 9.84576988 sig out 1act.) RANK 14 NODE 3 --> 4.20040512 sigma in 12act. ( 2.02114725 sig out 1act.) RANK 15 NODE 2 --> 2.70682073 sigma in 12act. ( 1.60622108 sig out 1act.) sorted by output significance RANK 1 NODE 11 --> 54.2857628 sigma out 1act.( 38.2672005 sig in 12act.) RANK 2 NODE 12 --> 43.1618729 sigma out 1act.( 39.4189873 sig in 12act.) RANK 3 NODE 7 --> 38.6618462 sigma out 1act.( 43.6135025 sig in 12act.) RANK 4 NODE 1 --> 38.4504166 sigma out 1act.( 35.4333344 sig in 12act.) RANK 5 NODE 15 --> 31.0404377 sigma out 1act.( 30.2675781 sig in 12act.) RANK 6 NODE 8 --> 25.3642273 sigma out 1act.( 21.7599392 sig in 12act.) RANK 7 NODE 14 --> 24.7876453 sigma out 1act.( 23.5304737 sig in 12act.) RANK 8 NODE 9 --> 23.9637394 sigma out 1act.( 20.2408791 sig in 12act.) RANK 9 NODE 4 --> 19.5304089 sigma out 1act.( 21.45858 sig in 12act.) RANK 10 NODE 5 --> 13.3574076 sigma out 1act.( 12.5313311 sig in 12act.) RANK 11 NODE 10 --> 11.4211493 sigma out 1act.( 13.2983313 sig in 12act.) RANK 12 NODE 13 --> 10.7809286 sigma out 1act.( 12.0659533 sig in 12act.) RANK 13 NODE 6 --> 9.84576988 sigma out 1act.( 10.2133579 sig in 12act.) RANK 14 NODE 3 --> 2.02114725 sigma out 1act.( 4.20040512 sig in 12act.) RANK 15 NODE 2 --> 1.60622108 sigma out 1act.( 2.70682073 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 107.185524 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.524356127 *** contribution from regularisation: 0.00374804903 *** contribution from error: -0.528104186 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -47877.9108 0.156938061 4.51011038 EXIT FROM BFGS code NEW_X -47877.9108 0.156938061 4.51011038 ENTER BFGS code NEW_X -47877.9108 0.156938061 4.51011038 EXIT FROM BFGS code FG_LNSRCH 0. 0.142838985 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.524408758 *** contribution from regularisation: 0.00374286179 *** contribution from error: -0.528151631 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47882.7159 0.142838985 -26.0913181 EXIT FROM BFGS code FG_LNSRCH 0. 0.150353804 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.524778128 *** contribution from regularisation: 0.00369334361 *** contribution from error: -0.52847147 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47916.4389 0.150353804 -9.02674866 EXIT FROM BFGS code NEW_X -47916.4389 0.150353804 -9.02674866 ENTER BFGS code NEW_X -47916.4389 0.150353804 -9.02674866 EXIT FROM BFGS code FG_LNSRCH 0. 0.147105485 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.525198936 *** contribution from regularisation: 0.00365630351 *** contribution from error: -0.528855264 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47954.8658 0.147105485 -10.6516171 EXIT FROM BFGS code NEW_X -47954.8658 0.147105485 -10.6516171 ENTER BFGS code NEW_X -47954.8658 0.147105485 -10.6516171 EXIT FROM BFGS code FG_LNSRCH 0. 0.129796013 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.525881827 *** contribution from regularisation: 0.00365482364 *** contribution from error: -0.529536664 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48017.2168 0.129796013 -5.97090387 EXIT FROM BFGS code NEW_X -48017.2168 0.129796013 -5.97090387 ENTER BFGS code NEW_X -48017.2168 0.129796013 -5.97090387 EXIT FROM BFGS code FG_LNSRCH 0. 0.118788205 0. --------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.52585113 *** contribution from regularisation: 0.00379050337 *** contribution from error: -0.529641628 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48014.4169 0.118788205 0.437792778 EXIT FROM BFGS code FG_LNSRCH 0. 0.125085488 0. 
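[Editor's note] Every tenth iteration the teacher writes the current network out to "rescue.nb" (iterations 10 and 20 above, 30 below), so an interrupted training can be resumed from the last snapshot. A trivial sketch of that periodic-checkpoint pattern; do_one_learn_path and save_expertise are hypothetical stand-ins, not NeuroBayes calls:

```python
def do_one_learn_path(iteration):
    """Hypothetical stand-in for one BFGS learn path over the training sample."""
    pass

def save_expertise(path):
    """Hypothetical stand-in for writing the current network ('expertise') to disk."""
    print(f"SAVING EXPERTISE TO {path}")

def train(n_iterations=250, rescue_every=10):
    for iteration in range(1, n_iterations + 1):
        do_one_learn_path(iteration)
        if iteration % rescue_every == 0:        # iterations 10, 20, 30, ... as in the log
            save_expertise("rescue.nb")

train(n_iterations=37)
```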
--------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.525885403 *** contribution from regularisation: 0.00389087829 *** contribution from error: -0.529776275 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48017.5422 0.125085488 -3.63082933 EXIT FROM BFGS code NEW_X -48017.5422 0.125085488 -3.63082933 ENTER BFGS code NEW_X -48017.5422 0.125085488 -3.63082933 EXIT FROM BFGS code FG_LNSRCH 0. 0.119414531 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.526212335 *** contribution from regularisation: 0.00380564341 *** contribution from error: -0.530017972 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48047.3951 0.119414531 -1.33976471 EXIT FROM BFGS code NEW_X -48047.3951 0.119414531 -1.33976471 ENTER BFGS code NEW_X -48047.3951 0.119414531 -1.33976471 EXIT FROM BFGS code FG_LNSRCH 0. 0.112060368 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.526364684 *** contribution from regularisation: 0.00384925632 *** contribution from error: -0.530213952 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48061.3061 0.112060368 2.70020747 EXIT FROM BFGS code NEW_X -48061.3061 0.112060368 2.70020747 ENTER BFGS code NEW_X -48061.3061 0.112060368 2.70020747 EXIT FROM BFGS code FG_LNSRCH 0. 0.10446503 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.526541591 *** contribution from regularisation: 0.00390242646 *** contribution from error: -0.530444026 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -48077.4599 0.10446503 3.6504972 EXIT FROM BFGS code NEW_X -48077.4599 0.10446503 3.6504972 ENTER BFGS code NEW_X -48077.4599 0.10446503 3.6504972 EXIT FROM BFGS code FG_LNSRCH 0. 0.0965975523 0. --------------------------------------------------- Iteration : 30 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 72.0739365 sigma out 15 active outputs RANK 2 NODE 1 --> 44.7180634 sigma out 15 active outputs RANK 3 NODE 10 --> 28.7759438 sigma out 15 active outputs RANK 4 NODE 12 --> 27.7992306 sigma out 15 active outputs RANK 5 NODE 3 --> 27.3674698 sigma out 15 active outputs RANK 6 NODE 9 --> 25.755167 sigma out 15 active outputs RANK 7 NODE 4 --> 25.5785084 sigma out 15 active outputs RANK 8 NODE 8 --> 24.4763527 sigma out 15 active outputs RANK 9 NODE 6 --> 22.930542 sigma out 15 active outputs RANK 10 NODE 7 --> 21.1329575 sigma out 15 active outputs RANK 11 NODE 11 --> 18.4534187 sigma out 15 active outputs RANK 12 NODE 5 --> 9.72365475 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 55.7654152 sigma in 12act. ( 50.9891777 sig out 1act.) RANK 2 NODE 12 --> 41.2543907 sigma in 12act. ( 46.1111069 sig out 1act.) RANK 3 NODE 15 --> 40.1781158 sigma in 12act. ( 37.4612885 sig out 1act.) RANK 4 NODE 1 --> 38.8806992 sigma in 12act. ( 43.2677536 sig out 1act.) RANK 5 NODE 11 --> 37.2817612 sigma in 12act. ( 65.1831741 sig out 1act.) RANK 6 NODE 4 --> 27.3859749 sigma in 12act. ( 27.357336 sig out 1act.) 
RANK 7 NODE 14 --> 26.7714367 sigma in 12act. ( 27.2345085 sig out 1act.)
RANK 8 NODE 8 --> 26.0241661 sigma in 12act. ( 29.3791981 sig out 1act.)
RANK 9 NODE 9 --> 23.0173759 sigma in 12act. ( 25.8003407 sig out 1act.)
RANK 10 NODE 5 --> 15.9093418 sigma in 12act. ( 18.0674477 sig out 1act.)
RANK 11 NODE 6 --> 14.9471645 sigma in 12act. ( 16.4058685 sig out 1act.)
RANK 12 NODE 10 --> 13.6975698 sigma in 12act. ( 12.7841778 sig out 1act.)
RANK 13 NODE 13 --> 12.5078602 sigma in 12act. ( 11.3385563 sig out 1act.)
RANK 14 NODE 3 --> 4.83452606 sigma in 12act. ( 3.92503595 sig out 1act.)
RANK 15 NODE 2 --> 4.79656506 sigma in 12act. ( 4.7701087 sig out 1act.)
sorted by output significance
RANK 1 NODE 11 --> 65.1831741 sigma out 1act.( 37.2817612 sig in 12act.)
RANK 2 NODE 7 --> 50.9891777 sigma out 1act.( 55.7654152 sig in 12act.)
RANK 3 NODE 12 --> 46.1111069 sigma out 1act.( 41.2543907 sig in 12act.)
RANK 4 NODE 1 --> 43.2677536 sigma out 1act.( 38.8806992 sig in 12act.)
RANK 5 NODE 15 --> 37.4612885 sigma out 1act.( 40.1781158 sig in 12act.)
RANK 6 NODE 8 --> 29.3791981 sigma out 1act.( 26.0241661 sig in 12act.)
RANK 7 NODE 4 --> 27.357336 sigma out 1act.( 27.3859749 sig in 12act.)
RANK 8 NODE 14 --> 27.2345085 sigma out 1act.( 26.7714367 sig in 12act.)
RANK 9 NODE 9 --> 25.8003407 sigma out 1act.( 23.0173759 sig in 12act.)
RANK 10 NODE 5 --> 18.0674477 sigma out 1act.( 15.9093418 sig in 12act.)
RANK 11 NODE 6 --> 16.4058685 sigma out 1act.( 14.9471645 sig in 12act.)
RANK 12 NODE 10 --> 12.7841778 sigma out 1act.( 13.6975698 sig in 12act.)
RANK 13 NODE 13 --> 11.3385563 sigma out 1act.( 12.5078602 sig in 12act.)
RANK 14 NODE 2 --> 4.7701087 sigma out 1act.( 4.79656506 sig in 12act.)
RANK 15 NODE 3 --> 3.92503595 sigma out 1act.( 4.83452606 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 127.259987 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.526838243
*** contribution from regularisation: 0.00394698279
*** contribution from error: -0.530785203
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -48104.5463 0.0965975523 5.35046053
EXIT FROM BFGS code NEW_X -48104.5463 0.0965975523 5.35046053
ENTER BFGS code NEW_X -48104.5463 0.0965975523 5.35046053
EXIT FROM BFGS code FG_LNSRCH 0. 0.0814619884 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.526100039
*** contribution from regularisation: 0.00405197032
*** contribution from error: -0.530152023
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48037.1398 0.0814619884 -104.716255
EXIT FROM BFGS code FG_LNSRCH 0. 0.0923872292 0.
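[Editor's note] The significance tables printed at every tenth iteration rank the nodes of each layer by how strongly they feed the next layer, in units of sigma. To compare the iteration-30, iteration-40 and final rankings without reading them by eye, a small parser along the following lines would do; this is a hypothetical helper written for this log format, not a NeuroBayes utility:

    import re

    # Extract (rank, node, significance) triples from lines like
    #   "RANK 1 NODE 2 --> 72.0739365 sigma out 15 active outputs"
    RANK_RE = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+([0-9.]+)\s+sigma")

    def parse_ranks(lines):
        """Return a list of (rank, node, sigma) tuples found in an iterable of log lines."""
        ranks = []
        for line in lines:
            m = RANK_RE.search(line)
            if m:
                ranks.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))
        return ranks

    # Example on two lines copied from the iteration-30 printout:
    sample = [
        "RANK 1 NODE 2 --> 72.0739365 sigma out 15 active outputs",
        "RANK 2 NODE 1 --> 44.7180634 sigma out 15 active outputs",
    ]
    print(parse_ranks(sample))  # [(1, 2, 72.0739365), (2, 1, 44.7180634)]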
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.52694416
*** contribution from regularisation: 0.00405810215
*** contribution from error: -0.531002283
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48114.2174 0.0923872292 -14.8965607
EXIT FROM BFGS code NEW_X -48114.2174 0.0923872292 -14.8965607
ENTER BFGS code NEW_X -48114.2174 0.0923872292 -14.8965607
EXIT FROM BFGS code FG_LNSRCH 0. 0.0837501884 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.527264059
*** contribution from regularisation: 0.00402883859
*** contribution from error: -0.531292915
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48143.4261 0.0837501884 -6.59327316
EXIT FROM BFGS code NEW_X -48143.4261 0.0837501884 -6.59327316
ENTER BFGS code NEW_X -48143.4261 0.0837501884 -6.59327316
EXIT FROM BFGS code FG_LNSRCH 0. 0.0815069675 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.527407229
*** contribution from regularisation: 0.0040446315
*** contribution from error: -0.531451881
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48156.5002 0.0815069675 -12.8510723
EXIT FROM BFGS code NEW_X -48156.5002 0.0815069675 -12.8510723
ENTER BFGS code NEW_X -48156.5002 0.0815069675 -12.8510723
EXIT FROM BFGS code FG_LNSRCH 0. 0.0713825673 0.
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.527834177
*** contribution from regularisation: 0.00400244445
*** contribution from error: -0.531836629
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48195.4805 0.0713825673 6.98768139
EXIT FROM BFGS code NEW_X -48195.4805 0.0713825673 6.98768139
ENTER BFGS code NEW_X -48195.4805 0.0713825673 6.98768139
EXIT FROM BFGS code FG_LNSRCH 0. 0.072362788 0.
---------------------------------------------------
Iteration : 36
***********************************************
*** Learn Path 36
*** loss function: -0.527601719
*** contribution from regularisation: 0.00399295287
*** contribution from error: -0.531594694
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48174.2571 0.072362788 -118.344696
EXIT FROM BFGS code FG_LNSRCH 0. 0.0716568008 0.
---------------------------------------------------
Iteration : 37
***********************************************
*** Learn Path 37
*** loss function: -0.527744353
*** contribution from regularisation: 0.0041634962
*** contribution from error: -0.531907856
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48187.2841 0.0716568008 -25.8298569
EXIT FROM BFGS code FG_LNSRCH 0. 0.0714176297 0.
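[Editor's note] From roughly iteration 35 onwards the reported loss settles around -0.5277 to -0.5278 and the remaining iterations only nudge the last decimal places. A rough way to see this from a saved copy of the output is to tabulate the loss per Learn Path; the sketch below assumes the teacher output has been captured line by line in a file named teacher.log (an assumed name):

    import re

    # Sketch only: collect (Learn Path number, reported loss) pairs from the log.
    path_re = re.compile(r"\*\*\*\s*Learn Path\s+(\d+)")
    loss_re = re.compile(r"\*\*\*\s*loss function:\s*(-?[0-9.]+)")

    history = []
    current = None
    with open("teacher.log") as log:          # assumed file name
        for line in log:
            m = path_re.search(line)
            if m:
                current = int(m.group(1))
                continue
            m = loss_re.search(line)
            if m and current is not None:
                history.append((current, float(m.group(1))))
                current = None

    for iteration, loss in history:
        print(f"Learn Path {iteration:4d}   loss {loss:.9f}")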
---------------------------------------------------
Iteration : 38
***********************************************
*** Learn Path 38
*** loss function: -0.527748346
*** contribution from regularisation: 0.0041038841
*** contribution from error: -0.531852245
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48187.6472 0.0714176297 2.85409403
EXIT FROM BFGS code FG_LNSRCH 0. 0.0713835359 0.
---------------------------------------------------
Iteration : 39
***********************************************
*** Learn Path 39
*** loss function: -0.527750909
*** contribution from regularisation: 0.00408618012
*** contribution from error: -0.531837106
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48187.8807 0.0713835359 6.86046028
EXIT FROM BFGS code FG_LNSRCH 0. 0.0713825673 0.
---------------------------------------------------
Iteration : 40
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 81.2271957 sigma out 15 active outputs
RANK 2 NODE 1 --> 60.3924828 sigma out 15 active outputs
RANK 3 NODE 10 --> 35.8054047 sigma out 15 active outputs
RANK 4 NODE 3 --> 33.4508553 sigma out 15 active outputs
RANK 5 NODE 4 --> 31.8036976 sigma out 15 active outputs
RANK 6 NODE 12 --> 29.7405891 sigma out 15 active outputs
RANK 7 NODE 9 --> 29.5023632 sigma out 15 active outputs
RANK 8 NODE 6 --> 26.2490883 sigma out 15 active outputs
RANK 9 NODE 7 --> 25.8312988 sigma out 15 active outputs
RANK 10 NODE 11 --> 24.6055202 sigma out 15 active outputs
RANK 11 NODE 8 --> 24.5711613 sigma out 15 active outputs
RANK 12 NODE 5 --> 12.7583799 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 7 --> 65.556839 sigma in 12act. ( 58.9684105 sig out 1act.)
RANK 2 NODE 12 --> 50.8662758 sigma in 12act. ( 52.5745316 sig out 1act.)
RANK 3 NODE 15 --> 49.3712463 sigma in 12act. ( 45.6847649 sig out 1act.)
RANK 4 NODE 1 --> 47.0153618 sigma in 12act. ( 49.5857735 sig out 1act.)
RANK 5 NODE 11 --> 40.6667976 sigma in 12act. ( 66.5876236 sig out 1act.)
RANK 6 NODE 4 --> 35.9660568 sigma in 12act. ( 34.7485771 sig out 1act.)
RANK 7 NODE 14 --> 31.3762226 sigma in 12act. ( 28.5643539 sig out 1act.)
RANK 8 NODE 9 --> 26.1990204 sigma in 12act. ( 27.525116 sig out 1act.)
RANK 9 NODE 8 --> 25.8047085 sigma in 12act. ( 30.3424587 sig out 1act.)
RANK 10 NODE 5 --> 21.7998581 sigma in 12act. ( 24.0440063 sig out 1act.)
RANK 11 NODE 6 --> 19.5250187 sigma in 12act. ( 23.0389519 sig out 1act.)
RANK 12 NODE 10 --> 14.8062201 sigma in 12act. ( 15.7499399 sig out 1act.)
RANK 13 NODE 13 --> 13.5968132 sigma in 12act. ( 12.7058439 sig out 1act.)
RANK 14 NODE 2 --> 6.49681902 sigma in 12act. ( 6.60395718 sig out 1act.)
RANK 15 NODE 3 --> 4.55976772 sigma in 12act. ( 3.81070423 sig out 1act.)
sorted by output significance
RANK 1 NODE 11 --> 66.5876236 sigma out 1act.( 40.6667976 sig in 12act.)
RANK 2 NODE 7 --> 58.9684105 sigma out 1act.( 65.556839 sig in 12act.)
RANK 3 NODE 12 --> 52.5745316 sigma out 1act.( 50.8662758 sig in 12act.)
RANK 4 NODE 1 --> 49.5857735 sigma out 1act.( 47.0153618 sig in 12act.)
RANK 5 NODE 15 --> 45.6847649 sigma out 1act.( 49.3712463 sig in 12act.)
RANK 6 NODE 4 --> 34.7485771 sigma out 1act.( 35.9660568 sig in 12act.)
RANK 7 NODE 8 --> 30.3424587 sigma out 1act.( 25.8047085 sig in 12act.)
RANK 8 NODE 14 --> 28.5643539 sigma out 1act.( 31.3762226 sig in 12act.)
RANK 9 NODE 9 --> 27.525116 sigma out 1act.( 26.1990204 sig in 12act.)
RANK 10 NODE 5 --> 24.0440063 sigma out 1act.( 21.7998581 sig in 12act.)
RANK 11 NODE 6 --> 23.0389519 sigma out 1act.( 19.5250187 sig in 12act.)
RANK 12 NODE 10 --> 15.7499399 sigma out 1act.( 14.8062201 sig in 12act.)
RANK 13 NODE 13 --> 12.7058439 sigma out 1act.( 13.5968132 sig in 12act.)
RANK 14 NODE 2 --> 6.60395718 sigma out 1act.( 6.49681902 sig in 12act.)
RANK 15 NODE 3 --> 3.81070423 sigma out 1act.( 4.55976772 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 143.175888 sigma in 15 active inputs
***********************************************
*** Learn Path 40
*** loss function: -0.527752042
*** contribution from regularisation: 0.00408455497
*** contribution from error: -0.531836569
***********************************************
-----------------> Test sample
Iteration No: 40
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -48187.9837 0.0713825673 6.97400665
EXIT FROM BFGS code FG_LNSRCH 0. 0.0713825673 0.
---------------------------------------------------
Iteration : 41
***********************************************
*** Learn Path 41
*** loss function: -0.52775526
*** contribution from regularisation: 0.00408134237
*** contribution from error: -0.531836629
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48188.2765 0.0713825673 6.97155809
EXIT FROM BFGS code FG_LNSRCH 0. 0.0713825673 0.
---------------------------------------------------
Iteration : 42
***********************************************
*** Learn Path 42
*** loss function: -0.527754247
*** contribution from regularisation: 0.00408236077
*** contribution from error: -0.531836629
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48188.1835 0.0713825673 6.96835423
EXIT FROM BFGS code FG_LNSRCH 0. 0.0713825673 0.
---------------------------------------------------
Iteration : 43
***********************************************
*** Learn Path 43
*** loss function: -0.527751207
*** contribution from regularisation: 0.00408538384
*** contribution from error: -0.531836569
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -48187.9075 0.0713825673 6.96731138
EXIT FROM BFGS code NEW_X -48187.9075 0.0713825673 6.96731138
ENTER BFGS code NEW_X -48187.9075 0.0713825673 6.96731138
EXIT FROM BFGS code CONVERGENC -48187.9075 0.0713825673 6.96731138
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 128.048508 sigma out 15 active outputs
RANK 2 NODE 1 --> 95.1224823 sigma out 15 active outputs
RANK 3 NODE 10 --> 57.3258896 sigma out 15 active outputs
RANK 4 NODE 3 --> 51.7748337 sigma out 15 active outputs
RANK 5 NODE 4 --> 50.1026917 sigma out 15 active outputs
RANK 6 NODE 12 --> 47.1836472 sigma out 15 active outputs
RANK 7 NODE 9 --> 45.6462631 sigma out 15 active outputs
RANK 8 NODE 7 --> 39.9115829 sigma out 15 active outputs
RANK 9 NODE 6 --> 39.6127014 sigma out 15 active outputs
RANK 10 NODE 8 --> 38.2971916 sigma out 15 active outputs
RANK 11 NODE 11 --> 38.2919197 sigma out 15 active outputs
RANK 12 NODE 5 --> 20.192667 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 7 --> 104.544449 sigma in 12act. ( 91.6763077 sig out 1act.)
RANK 2 NODE 12 --> 79.7768173 sigma in 12act. ( 82.8741837 sig out 1act.)
RANK 3 NODE 15 --> 77.1765366 sigma in 12act. ( 72.1389542 sig out 1act.)
RANK 4 NODE 1 --> 74.0385284 sigma in 12act. ( 78.4822845 sig out 1act.)
RANK 5 NODE 11 --> 62.8053284 sigma in 12act. ( 105.402313 sig out 1act.)
RANK 6 NODE 4 --> 55.6545486 sigma in 12act. ( 54.8760452 sig out 1act.)
RANK 7 NODE 14 --> 48.6977921 sigma in 12act. ( 45.0014915 sig out 1act.)
RANK 8 NODE 9 --> 41.4072304 sigma in 12act. ( 42.5830231 sig out 1act.)
RANK 9 NODE 8 --> 40.4169388 sigma in 12act. ( 47.2874451 sig out 1act.)
RANK 10 NODE 5 --> 34.1675415 sigma in 12act. ( 37.6030807 sig out 1act.)
RANK 11 NODE 6 --> 30.7213364 sigma in 12act. ( 35.7433205 sig out 1act.)
RANK 12 NODE 10 --> 23.2636967 sigma in 12act. ( 24.4146767 sig out 1act.)
RANK 13 NODE 13 --> 21.0602322 sigma in 12act. ( 19.7942066 sig out 1act.)
RANK 14 NODE 2 --> 9.91431618 sigma in 12act. ( 10.3245583 sig out 1act.)
RANK 15 NODE 3 --> 6.85509109 sigma in 12act. ( 5.84330988 sig out 1act.)
sorted by output significance
RANK 1 NODE 11 --> 105.402313 sigma out 1act.( 62.8053284 sig in 12act.)
RANK 2 NODE 7 --> 91.6763077 sigma out 1act.( 104.544449 sig in 12act.)
RANK 3 NODE 12 --> 82.8741837 sigma out 1act.( 79.7768173 sig in 12act.)
RANK 4 NODE 1 --> 78.4822845 sigma out 1act.( 74.0385284 sig in 12act.)
RANK 5 NODE 15 --> 72.1389542 sigma out 1act.( 77.1765366 sig in 12act.)
RANK 6 NODE 4 --> 54.8760452 sigma out 1act.( 55.6545486 sig in 12act.)
RANK 7 NODE 8 --> 47.2874451 sigma out 1act.( 40.4169388 sig in 12act.)
RANK 8 NODE 14 --> 45.0014915 sigma out 1act.( 48.6977921 sig in 12act.)
RANK 9 NODE 9 --> 42.5830231 sigma out 1act.( 41.4072304 sig in 12act.)
RANK 10 NODE 5 --> 37.6030807 sigma out 1act.( 34.1675415 sig in 12act.)
RANK 11 NODE 6 --> 35.7433205 sigma out 1act.( 30.7213364 sig in 12act.)
RANK 12 NODE 10 --> 24.4146767 sigma out 1act.( 23.2636967 sig in 12act.)
RANK 13 NODE 13 --> 19.7942066 sigma out 1act.( 21.0602322 sig in 12act.)
RANK 14 NODE 2 --> 10.3245583 sigma out 1act.( 9.91431618 sig in 12act.)
RANK 15 NODE 3 --> 5.84330988 sigma out 1act.( 6.85509109 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 225.059891 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.527754724
*** contribution from regularisation: 0.00408187928
*** contribution from error: -0.531836629
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 31601
Closing output file
done
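[Editor's note] Across the periodic printouts the single input significance to layer 3 (the output node) grows steadily: about 107 sigma at iteration 20, 127 at 30, 143 at 40 and 225 at the final checkpoint. The short summary below only collects those numbers from the log above; whether this growth is a useful convergence diagnostic is a judgment call, so treat it as an illustration:

    # Numbers copied from the "SIGNIFICANCE OF INPUTS TO LAYER 3" lines printed
    # at iterations 20, 30, 40 and at the final (post-convergence) checkpoint.
    layer3_sigma = {20: 107.185524, 30: 127.259987, 40: 143.175888, 250: 225.059891}
    for it in sorted(layer3_sigma):
        print(f"iteration {it:3d}: layer-3 input significance = {layer3_sigma[it]:.2f} sigma")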