NNInput NNInputs_135.root

Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001

**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N

Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
Error in : error reading all requested bytes from file TMVAAna_out.root, got 228 of 300
Warning in : file TMVAAna_out.root has no keys
Error in : Cannot find tree with name NNInput in file TMVAAna_out.root

NNAna::CopyTree: entries= 188782 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1  BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 47077 nbkg = 141705
Bkg Entries: 141705   Sig Entries: 47077   Chosen entries: 47077
Signal fraction: 1   Background fraction: 0.332218
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 141705
Actual Signal Entries: 47077
Entries to split: 47077   Test with : 23538   Train with : 23538

*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************

Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
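Note: the selection and train/test bookkeeping printed above is plain counting. The sketch below is illustrative Python, not the actual NNAna::CopyTree C++ code, and the EqualizeSB/SBRatio handling is simplified; it only shows how the reported numbers follow from the steering cut and the Target flag.

```python
import numpy as np

def copytree_bookkeeping(target, lep1_E, lep2_E):
    """Reproduce the counting printed by NNAna::CopyTree (illustration only).

    target, lep1_E, lep2_E : 1-D numpy arrays, one entry per event.
    """
    presel = (lep1_E < 400) & (lep2_E < 400)               # steering Constraint
    n_sig = int(np.count_nonzero(presel & (target == 1)))  # 47077 in this run
    n_bkg = int(np.count_nonzero(presel & (target == 0)))  # 141705 in this run

    chosen = min(n_sig, n_bkg)          # "Chosen entries: 47077"
    sig_fraction = chosen / n_sig       # "Signal fraction: 1"
    bkg_fraction = chosen / n_bkg       # "Background fraction: 0.332218"

    # The smaller sample is split half/half into test and training events:
    # "Entries to split: 47077  Test with : 23538  Train with : 23538"
    n_test = chosen // 2
    n_train = chosen // 2
    return n_sig, n_bkg, chosen, sig_fraction, bkg_fraction, n_test, n_train
```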
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 47077 for Signal Prepared event 0 for Signal with 47077 events ====Entry 0 Variable Ht : 232.766 Variable LepAPt : 74.3556 Variable LepBPt : 30.6959 Variable MetSigLeptonsJets : 3.09436 Variable MetSpec : 41.4986 Variable SumEtLeptonsJets : 190.102 Variable VSumJetLeptonsPt : 35.9078 Variable addEt : 147.716 Variable dPhiLepSumMet : 2.12661 Variable dPhiLeptons : 0.248817 Variable dRLeptons : 0.333981 Variable lep1_E : 86.3275 Variable lep2_E : 32.4592 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2135 Ht = 232.766 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 74.3557 LepAPt = 74.3556 LepBEt = 30.6962 LepBPt = 30.6959 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 42.6642 MetDelPhi = 1.33651 MetSig = 2.91556 MetSigLeptonsJets = 3.09436 MetSpec = 41.4986 Mjj = 0 MostCentralJetEta = -1.56143 MtllMet = 152.006 Njets = 1 SB = 0 SumEt = 214.133 SumEtJets = 0 SumEtLeptonsJets = 190.102 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 35.9078 addEt = 147.716 dPhiLepSumMet = 2.12661 dPhiLeptons = 0.248817 dRLeptons = 0.333981 diltype = 17 dimass = 15.9508 event = 435 jet1_Et = 85.0503 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 86.3275 lep2_E = 32.4592 rand = 0.999742 run = 230778 weight = 2.42631e-06 ===Show End Prepared event 10000 for Signal with 47077 events Prepared event 20000 for Signal with 47077 events Prepared event 30000 for Signal with 47077 events Prepared event 40000 for Signal with 47077 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To 
Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 141705 for Background Prepared event 0 for Background with 141705 events ====Entry 0 Variable Ht : 85.3408 Variable LepAPt : 31.1294 Variable LepBPt : 10.0664 Variable MetSigLeptonsJets : 6.87768 Variable MetSpec : 44.1437 Variable SumEtLeptonsJets : 41.1959 Variable VSumJetLeptonsPt : 41.0132 Variable addEt : 85.3408 Variable dPhiLepSumMet : 3.10536 Variable dPhiLeptons : 0.219342 Variable dRLeptons : 0.424454 Variable lep1_E : 32.3548 Variable lep2_E : 10.1027 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 0.271108 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168 ===Show End Prepared event 10000 for Background with 141705 events Prepared event 20000 for Background with 141705 events Prepared event 30000 for Background with 141705 events Prepared event 40000 for Background with 141705 events Prepared event 50000 for Background with 141705 events Prepared event 60000 for Background with 141705 events Prepared event 70000 for Background with 141705 events Prepared event 80000 for Background with 141705 events Prepared event 90000 for Background with 141705 events Prepared event 100000 for Background with 141705 events Prepared event 110000 for Background with 141705 events Prepared event 120000 for Background with 141705 events Prepared event 130000 for Background with 141705 events Prepared event 140000 for Background with 141705 events Warning: found 4392 negative weights. 
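Note: the Teacher output that follows describes global preprocessing flag 812 as "input variable equalisation to Gaussian distribution with mean=0 and sigma=1", followed by decorrelation; the 100-entry "Transdef" tables printed for each variable are the quantile boundaries used for the equalisation step. Below is a minimal numpy/scipy sketch of that idea; NeuroBayes' actual implementation is proprietary, and PCA whitening is only one standard way to decorrelate.

```python
import numpy as np
from scipy.stats import norm

def equalise_to_gaussian(x, n_quantiles=100):
    """Flatten one variable with its quantile table, then map to a standard Gaussian.

    The quantile boundaries play the role of the per-variable 'Transdef' tables.
    """
    quantiles = np.quantile(x, np.linspace(0.0, 1.0, n_quantiles + 1))
    # bin index of each value inside the table -> approximately flat in (0, 1)
    u = (np.searchsorted(quantiles, x, side="right") - 0.5) / len(quantiles)
    u = np.clip(u, 1e-6, 1.0 - 1e-6)
    return norm.ppf(u)                      # flat -> Gaussian(mean 0, sigma 1)

def decorrelate(X):
    """Rotate and rescale so the covariance of the columns becomes the identity."""
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ eigvec / np.sqrt(eigval)

def preprocess_flag_812(X):
    """Sketch of 'option 12': equalise every column to a Gaussian, then decorrelate."""
    gauss = np.column_stack([equalise_to_gaussian(X[:, j]) for j in range(X.shape[1])])
    return decorrelate(gauss)
```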
<< hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 188782 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4392 negative weights. Signal fraction: 62.0343552 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 47.1748543 61.7397003 64.2002258 66.1391602 67.4029694 68.5073242 69.6145554 70.4647064 71.3087158 72.0952148 73.0454865 73.755127 74.3350449 75.0042419 75.6182709 76.2914581 76.9723816 77.4581146 78.0231934 78.6397858 79.2813644 79.8660507 80.5231705 81.0945663 81.8197021 82.4884644 83.1710663 84.0450287 85.0336456 85.9190826 86.8053589 87.7672729 88.716774 89.670578 90.8839417 92.0369263 93.0809326 94.3285522 95.4459 96.5662613 97.7723083 98.9803772 100.193939 101.211975 102.341637 103.582466 104.585663 105.642677 106.667847 107.730118 108.807411 109.948433 111.028671 112.064034 113.08432 114.243317 115.325134 116.369453 117.500549 118.599838 119.717636 120.657806 121.844177 122.964279 124.0793 125.217247 126.329697 127.605713 128.81456 130.132736 131.504181 132.864059 134.296906 135.785065 137.483154 139.374115 141.367157 143.232315 145.334335 147.574829 149.87088 152.186768 154.608917 156.833359 159.486694 162.375854 165.320557 168.896027 172.163818 176.308136 180.599945 185.875092 191.664932 198.465851 205.641922 214.328674 227.094009 241.630936 265.631378 310.817261 782.079102 ------------------------------ Transdef: Tab for variable 3 20.0002613 20.2939873 20.5743523 20.8458748 21.1195679 21.4095726 21.5862255 21.8053646 21.9774437 22.2024231 22.3837738 22.5716591 22.7964172 23.0214901 23.1793671 23.3230286 23.5389042 23.7401047 23.932333 24.1377907 24.3719673 24.5334625 24.7067261 24.9169426 25.1099949 25.2825012 25.4542618 25.6232948 25.8015594 25.9859581 26.1911278 26.3489418 26.5387344 26.7364693 26.9276257 27.1262169 27.32061 27.5366821 27.7211838 27.9225731 28.1044159 28.2934914 28.5180416 28.6994953 28.9025116 29.105751 29.2925701 29.4892025 29.7217827 29.9482498 30.1640968 30.4274883 30.6780663 30.9194107 31.1477356 31.3810635 31.6191082 31.8297615 32.0534821 32.280014 32.5320702 32.7859039 33.0844536 33.3744965 33.6644249 33.9519958 34.2378006 34.5577927 34.8861351 35.2109222 35.5185471 35.8873215 36.2570114 36.6528549 37.004631 37.4121246 37.8046875 38.2023239 38.6303978 39.0873871 39.4110031 39.8673935 40.4266891 40.8138657 41.3488159 41.9284515 42.6147461 43.2663803 43.9671555 44.8080292 45.6666641 46.6549225 47.7915421 49.0851898 50.6177673 52.395565 
54.5392303 57.4451904 62.3504028 70.5046692 228.385986 ------------------------------ Transdef: Tab for variable 4 10.0001202 10.1117859 10.217309 10.3027697 10.405261 10.5042801 10.6396427 10.7567806 10.8558617 10.977005 11.0720682 11.2113743 11.3302193 11.4413013 11.5605364 11.698103 11.8291931 11.9640722 12.0747595 12.197794 12.3160763 12.4496555 12.588809 12.7176027 12.8443737 12.9759693 13.1316137 13.2907009 13.4455547 13.5928459 13.734086 13.8680077 14.0349846 14.1797752 14.3301201 14.4981527 14.6516161 14.8237801 15.013854 15.1938944 15.3561401 15.4954414 15.6779509 15.8566437 16.011219 16.2101231 16.4174728 16.612339 16.7929306 16.96208 17.171669 17.3964424 17.5997353 17.7794037 17.9862099 18.1364288 18.3598709 18.5322666 18.7728291 18.9820175 19.1998291 19.4108124 19.6792297 19.9323483 20.157402 20.3208199 20.5290031 20.7378445 20.9757328 21.1937904 21.4655457 21.7033958 21.9337769 22.098814 22.3554134 22.601326 22.8652344 23.1576385 23.4373817 23.7276077 24.0522594 24.3808517 24.6992493 25.0816002 25.4570408 25.8274899 26.1958389 26.6032391 27.004158 27.4357948 27.8819695 28.3499413 28.9445572 29.5737114 30.3229427 31.1401711 32.2985229 33.8244247 35.9687576 39.8987579 71.9776382 ------------------------------ Transdef: Tab for variable 5 1.17167664 2.43591213 2.83182478 3.08376646 3.27856445 3.46319628 3.63480091 3.78777337 3.94079733 4.06834412 4.17258453 4.27979565 4.37447691 4.47832966 4.56199074 4.63041973 4.70115185 4.78341246 4.8598671 4.94215679 5.0114069 5.07124805 5.12965202 5.18932581 5.25952435 5.31247807 5.36859131 5.41401958 5.46058464 5.51598263 5.54345942 5.58951855 5.63364124 5.67517948 5.72340775 5.76862144 5.81221581 5.85691547 5.90952921 5.95187902 5.99678516 6.03602219 6.07626772 6.11624861 6.15842772 6.20625973 6.24953651 6.28082323 6.31974316 6.35623407 6.39678812 6.43444252 6.47706795 6.51776695 6.56274605 6.60608625 6.64894199 6.69665051 6.73929119 6.78389549 6.82898712 6.86711645 6.89670944 6.94470119 6.99110794 7.04269981 7.08029604 7.12763929 7.17551231 7.22080898 7.26937675 7.31150675 7.36200619 7.41100693 7.46672106 7.50680494 7.55751896 7.61160231 7.66433239 7.71786499 7.78026962 7.83545113 7.89269495 7.94797993 8.01481915 8.09540653 8.16316795 8.23126221 8.30404663 8.39330292 8.48456383 8.59179688 8.71124935 8.84185696 8.98955631 9.16707611 9.37489319 9.66391945 10.1340504 11.0821247 19.048111 ------------------------------ Transdef: Tab for variable 6 15.000617 19.2638226 23.1626854 25.4351425 26.2339706 26.9588299 27.549778 28.3271751 29.069622 29.7453442 30.3990517 30.9849567 31.5482178 32.1030884 32.595417 32.9895859 33.4161682 33.8355255 34.280098 34.622673 34.8749924 35.2281303 35.646431 36.0193329 36.4407043 36.7870827 37.1489563 37.4984436 37.8192406 38.10392 38.4888763 38.8286285 39.1662598 39.4823608 39.7873993 40.1309547 40.4249344 40.7637787 41.1340332 41.5064659 41.8864746 42.2965279 42.6859131 43.0571976 43.3954887 43.7693901 44.3300018 44.6524429 45.0804253 45.674057 46.1954536 46.6834145 47.1632843 47.6276932 48.1207428 48.6165924 49.129097 49.6727753 50.1650162 50.6606369 51.1771965 51.7318115 52.315239 52.7987022 53.3407135 53.9336739 54.508606 55.1016998 55.6642838 56.2018509 56.7909546 57.3605423 57.9380074 58.5359344 59.016758 59.6000748 60.1827965 60.847435 61.5378571 62.2245369 62.9014931 63.5904846 64.2735901 65.025116 65.7774506 66.6611633 67.5693283 68.5286865 69.5334015 70.6985321 72.0784149 73.6711884 75.4443893 76.8249664 79.0072327 81.7676849 85.0590363 89.4123077 95.5785675 107.194672 236.562576 
------------------------------ Transdef: Tab for variable 7 30.1484737 32.350872 33.3227005 33.947979 34.6584167 35.2266998 35.7320557 36.1717606 36.5943909 36.9232292 37.2620697 37.5874176 37.9325562 38.2388535 38.5660744 38.8502922 39.1533432 39.3912201 39.6160927 39.972477 40.2751312 40.535759 40.8599091 41.1849289 41.4644318 41.8502274 42.2870483 42.7043839 43.1583252 43.5783844 44.1515999 44.6421127 45.1339531 45.6143341 46.174675 46.8813553 47.414814 48.0678635 48.8166313 49.5621872 50.3894806 51.1811981 51.8915329 52.6261597 53.2678299 53.8801193 54.6142349 55.33424 55.9695435 56.5748291 57.2561646 57.896019 58.6426697 59.2788353 59.9335213 60.6634865 61.1717224 61.7316589 62.3591919 63.0770111 63.8841629 64.7594147 65.6088028 66.5457306 67.557373 68.5814362 69.6665344 70.7732849 72.070076 73.2941589 74.4820328 75.7846527 77.0679626 78.4707184 79.7608032 81.3126373 82.7805481 84.4414749 86.2346191 88.0993042 90.1174011 92.0748672 93.9748688 96.0862656 98.472847 101.004036 103.72699 106.47081 109.775772 113.120384 116.983444 121.025848 126.185715 131.557526 137.997437 146.484497 156.204468 168.956451 185.835449 220.920425 477.637085 ------------------------------ Transdef: Tab for variable 8 0.678276479 21.9954987 26.1030273 28.8144455 30.4865284 31.3611336 31.9849968 32.6383896 33.1167488 33.5748711 33.9657669 34.4218903 34.8258438 35.1875839 35.5060997 35.9326706 36.1929741 36.4879379 36.7587891 37.02108 37.2778397 37.5276909 37.7976227 38.0207825 38.2168999 38.4969749 38.7067757 38.9458427 39.2212296 39.4414978 39.7211609 39.9816208 40.2197914 40.5182762 40.8404007 41.1765633 41.4454422 41.751564 42.0891495 42.4544144 42.8125 43.1286087 43.4693527 43.8340492 44.2627869 44.6992455 45.1572495 45.5198898 45.9419556 46.3633919 46.8709869 47.3170319 47.8273849 48.2722244 48.7719955 49.3111801 49.8472176 50.271019 50.7946701 51.2329559 51.7641373 52.3170395 52.8680534 53.2597046 53.7434158 54.2426224 54.8114357 55.3411026 55.80336 56.3174667 56.8295746 57.3882523 57.9705811 58.5877304 59.1415939 59.7140427 60.2791748 60.9067307 61.3761292 62.0590439 62.707222 63.4047241 64.1757965 64.9670944 65.7543793 66.6287689 67.5536804 68.5811615 69.8100739 71.0021591 72.5113373 74.1861496 75.95578 77.8163528 80.1772919 83.2442856 87.4665756 92.9769058 101.637589 118.952728 326.868286 ------------------------------ Transdef: Tab for variable 9 47.1748543 61.2664185 63.5758057 65.1707153 66.5766449 67.6192322 68.5824661 69.5405273 70.2850266 70.8768768 71.6664429 72.5011597 73.2513809 73.8352509 74.3200455 74.8829956 75.5035553 76.0191803 76.6042786 77.0332336 77.4966583 78.0086975 78.4621277 79.0121155 79.51548 80.0003662 80.5498047 81.051239 81.6122437 82.1879883 82.6252213 83.2929916 83.9905014 84.6962357 85.3616867 86.0669556 86.689682 87.4556808 88.1853409 88.8804855 89.6159058 90.3752899 91.3715057 92.1786804 92.9793854 93.7969131 94.6730957 95.6053543 96.4577484 97.3984985 98.4012146 99.2941589 100.32019 101.148911 101.990707 102.924896 103.940475 104.928864 105.770844 106.594292 107.421738 108.281815 109.170174 110.009148 110.865471 111.718994 112.506409 113.328659 114.360336 115.15979 115.966705 116.811249 117.766159 118.642319 119.498123 120.182892 121.002052 121.975693 122.867218 123.764305 124.700165 125.67627 126.581413 127.61718 128.708191 129.808197 130.84021 131.91362 133.100098 134.353882 135.668823 137.26001 139.312103 141.498505 144.140228 147.68544 151.611908 158.178833 170.330627 196.655426 424.390259 ------------------------------ Transdef: Tab for variable 10 0.0050833188 
0.875878811 1.19541359 1.40606213 1.57420266 1.7027452 1.81130123 1.89938927 1.99031413 2.05630064 2.11757183 2.17809582 2.23239708 2.28148532 2.325459 2.37224293 2.41424179 2.45021009 2.48227692 2.51388931 2.54064989 2.56829715 2.59511209 2.62060237 2.64387059 2.66767812 2.68976521 2.71184015 2.72923326 2.74843359 2.76600099 2.78025436 2.79561901 2.81061506 2.82347894 2.83593845 2.8486433 2.86001825 2.871732 2.88144684 2.89111662 2.90078449 2.90977788 2.91659427 2.92639184 2.93350124 2.94090223 2.94851637 2.95537853 2.96154118 2.96830845 2.97536755 2.98022366 2.98537588 2.98974562 2.99381757 2.99898839 3.00434661 3.00866747 3.01329422 3.01808643 3.02254629 3.02704072 3.0310328 3.0347681 3.03861165 3.04308605 3.0472765 3.05105686 3.05415154 3.05769491 3.06141186 3.06506205 3.06831121 3.07222557 3.07540369 3.07841396 3.08186293 3.08477855 3.0879488 3.09095478 3.09377909 3.09620047 3.09917974 3.10161114 3.104388 3.10736179 3.10981321 3.11302638 3.11518383 3.11767673 3.12006664 3.12222338 3.12472582 3.12717724 3.12951088 3.13156605 3.13349628 3.13591814 3.1388824 3.14158678 ------------------------------ Transdef: Tab for variable 11 1.19226625E-05 0.0058504343 0.0108092595 0.0179294348 0.0245916471 0.0316685438 0.0401714444 0.0490593016 0.0577444956 0.0647816658 0.0724906921 0.0802475214 0.0884793997 0.096390605 0.103570133 0.110309564 0.117600203 0.125514388 0.133666635 0.141551763 0.150351375 0.157716393 0.165135264 0.172000021 0.179526329 0.186262533 0.193107009 0.199425578 0.206474364 0.212555826 0.218857527 0.225240231 0.231185079 0.237018585 0.2434268 0.249225348 0.255243778 0.260339081 0.266100824 0.271678209 0.277519226 0.282803059 0.288480163 0.29417026 0.299619675 0.305002153 0.311029434 0.317551851 0.322769016 0.328760147 0.334349692 0.339972734 0.34532693 0.350515872 0.356282353 0.361574829 0.367329359 0.37337321 0.379344761 0.385690987 0.391753674 0.397605896 0.403298557 0.409218311 0.414931893 0.421936214 0.427673221 0.433015823 0.439281285 0.445752025 0.45186159 0.458992004 0.465349227 0.471170306 0.476689339 0.485059053 0.492626131 0.500359535 0.508022904 0.514578223 0.521976173 0.530208528 0.53888011 0.545415938 0.55363667 0.56363523 0.573563278 0.58389008 0.594533682 0.607222915 0.618942261 0.632530212 0.646159053 0.664159298 0.685329378 0.705265403 0.72709167 0.756496787 0.798122764 0.855696678 1.1301049 ------------------------------ Transdef: Tab for variable 12 0.200001657 0.221388489 0.237702399 0.249781966 0.260388494 0.271983266 0.280983508 0.290277779 0.29903397 0.306736946 0.315024316 0.32242915 0.329819739 0.337025702 0.343968213 0.350584686 0.356298238 0.36287874 0.369543791 0.375839055 0.382321417 0.389003038 0.394408882 0.399896026 0.404238462 0.408624232 0.412320435 0.416552991 0.420283735 0.424182385 0.428539157 0.432788193 0.436615437 0.440541089 0.444752872 0.448128521 0.451673716 0.45566985 0.459830463 0.463965952 0.467726767 0.471473455 0.475111395 0.478047132 0.481945395 0.485619187 0.489403337 0.492735505 0.496951163 0.500858903 0.504209638 0.508298635 0.512419045 0.516503453 0.520452023 0.523741245 0.528394222 0.532363296 0.536830962 0.540730596 0.545316458 0.548764348 0.553324878 0.557956159 0.562840879 0.567304909 0.571727514 0.577196836 0.582438707 0.588048577 0.593457818 0.598579824 0.604470551 0.610441446 0.616497993 0.622210622 0.629068732 0.634940982 0.641957104 0.648213506 0.656118393 0.663585067 0.670303822 0.677963138 0.685724795 0.695180178 0.70362103 0.712798178 0.720605314 0.729478955 0.741676509 0.750423908 0.761598825 0.774998307 
0.789582491 0.8065449 0.824241281 0.845957518 0.876530886 0.914980531 1.15372479 ------------------------------ Transdef: Tab for variable 13 20.0025482 21.434082 22.1673889 22.7679062 23.3653984 23.8201218 24.3485565 24.759819 25.1604137 25.5462723 25.8668861 26.275692 26.6602936 27.0141926 27.3263969 27.6235123 27.9620609 28.2814217 28.6061649 28.9120064 29.2792053 29.5738335 29.8883228 30.2180061 30.4883461 30.81464 31.0668526 31.3954296 31.6524849 31.9119835 32.152401 32.428936 32.7485466 33.0759315 33.3697357 33.6933899 34.0260925 34.3449249 34.5442009 34.8494263 35.1643295 35.4835281 35.8004265 36.0965195 36.3516464 36.7316284 37.0534592 37.4215622 37.7957764 38.1221008 38.4645958 38.8598061 39.2297516 39.6171112 40.0362396 40.4241943 40.8357468 41.2671738 41.7090797 42.1452789 42.6394463 43.0480042 43.5043106 43.9950562 44.496521 45.0559464 45.5842667 46.1458969 46.7168579 47.3149452 47.9563522 48.593132 49.2615204 49.9130554 50.7045135 51.3934555 52.1080551 52.8999023 53.7828636 54.5782242 55.4389992 56.4156265 57.3263168 58.3724289 59.2960358 60.526062 61.9275627 63.0568848 64.6285095 66.4765167 68.1410217 69.4962616 71.7231369 74.286972 77.212326 80.3548126 84.1970367 88.4031067 94.7449875 105.213379 232.717926 ------------------------------ Transdef: Tab for variable 14 10.0028534 10.5415077 11.017848 11.3190851 11.5893154 11.8900242 12.2194462 12.4280376 12.66745 12.916934 13.188076 13.441268 13.677125 13.9257545 14.192234 14.4568691 14.6760197 14.9035854 15.1831217 15.4336681 15.6475563 15.9192944 16.2011719 16.3805923 16.5990562 16.8458614 17.0548019 17.3003178 17.549736 17.7491379 17.9960556 18.208477 18.4843044 18.7343292 18.9674473 19.1708794 19.4218597 19.6907921 19.9549675 20.2451057 20.497879 20.7679214 21.0075512 21.2563 21.4966507 21.7346497 21.9482727 22.1834183 22.4487629 22.7042274 22.9749069 23.2169247 23.4761009 23.7391052 24.0185032 24.3056679 24.5771942 24.8455925 25.1345024 25.4014015 25.6306419 25.878046 26.1526566 26.439333 26.7736549 27.0533562 27.35289 27.6451073 27.9610271 28.1955185 28.5470848 28.8754463 29.1568279 29.5068779 29.8500404 30.2702675 30.6919613 31.1161232 31.4911842 31.9964714 32.5428772 33.0167389 33.4473953 33.943428 34.5578842 35.1329346 35.812542 36.4929428 37.3108749 38.2019577 39.0836525 39.9744186 41.1327553 42.5608139 44.0166397 45.5195923 47.7590599 50.8126068 54.9508438 61.8104019 122.221924 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 57.3 38.9 42.7 15.5 38.4 54.6 42.3 53.9 -27.6 -12.5 -28.2 2.7 10.7 2 57.3 100.0 63.4 54.3 23.8 62.0 94.2 66.5 90.3 -50.1 -22.5 -44.9 27.9 25.6 3 38.9 63.4 100.0 32.6 -0.7 26.9 66.6 46.8 68.5 -18.6 -23.4 -49.0 61.7 12.7 4 42.7 54.3 32.6 100.0 5.2 28.7 55.3 42.3 59.0 -14.8 -27.5 -55.8 10.3 67.8 5 15.5 23.8 -0.7 5.2 100.0 82.2 -4.5 51.4 50.4 26.6 -1.1 -0.4 -8.8 -2.3 6 38.4 62.0 26.9 28.7 82.2 100.0 39.1 74.4 77.4 -4.2 -11.0 -20.5 5.2 10.3 7 54.6 94.2 66.6 55.3 -4.5 39.1 100.0 57.9 77.5 -57.5 -22.1 -45.8 32.6 28.2 8 42.3 66.5 46.8 42.3 51.4 74.4 57.9 100.0 78.5 -9.1 -19.4 -34.9 20.7 21.2 9 53.9 90.3 68.5 59.0 50.4 77.4 77.5 78.5 100.0 -24.0 -24.5 -49.4 32.2 29.9 10 -27.6 -50.1 -18.6 -14.8 26.6 -4.2 -57.5 -9.1 -24.0 100.0 4.7 11.5 -5.6 -4.4 11 -12.5 -22.5 -23.4 -27.5 -1.1 -11.0 -22.1 -19.4 -24.5 4.7 100.0 54.9 -8.7 -17.2 12 -28.2 -44.9 -49.0 -55.8 -0.4 -20.5 -45.8 -34.9 -49.4 11.5 54.9 100.0 -21.9 -30.3 13 2.7 27.9 61.7 10.3 -8.8 5.2 32.6 20.7 32.2 -5.6 -8.7 -21.9 100.0 42.2 14 10.7 25.6 
12.7 67.8 -2.3 10.3 28.2 21.2 29.9 -4.4 -17.2 -30.3 42.2 100.0 TOTAL CORRELATION TO TARGET (diagonal) 133.541665 TOTAL CORRELATION OF ALL VARIABLES 63.5613104 ROUND 1: MAX CORR ( 63.5593561) AFTER KILLING INPUT VARIABLE 7 CONTR 0.498424793 ROUND 2: MAX CORR ( 63.5408372) AFTER KILLING INPUT VARIABLE 11 CONTR 1.53419615 ROUND 3: MAX CORR ( 63.4361485) AFTER KILLING INPUT VARIABLE 6 CONTR 3.64596383 ROUND 4: MAX CORR ( 63.2915392) AFTER KILLING INPUT VARIABLE 10 CONTR 4.2808881 ROUND 5: MAX CORR ( 63.1080543) AFTER KILLING INPUT VARIABLE 8 CONTR 4.81585096 ROUND 6: MAX CORR ( 62.8367489) AFTER KILLING INPUT VARIABLE 12 CONTR 5.84546827 ROUND 7: MAX CORR ( 62.569856) AFTER KILLING INPUT VARIABLE 14 CONTR 5.7853381 ROUND 8: MAX CORR ( 62.0648762) AFTER KILLING INPUT VARIABLE 9 CONTR 7.93334866 ROUND 9: MAX CORR ( 61.9892937) AFTER KILLING INPUT VARIABLE 5 CONTR 3.06207858 ROUND 10: MAX CORR ( 60.6423017) AFTER KILLING INPUT VARIABLE 4 CONTR 12.852384 ROUND 11: MAX CORR ( 58.9398838) AFTER KILLING INPUT VARIABLE 3 CONTR 14.2681063 ROUND 12: MAX CORR ( 57.2963749) AFTER KILLING INPUT VARIABLE 13 CONTR 13.821553 LAST REMAINING VARIABLE: 2 total correlation to target: 63.5613104 % total significance: 124.160943 sigma correlations of single variables to target: variable 2: 57.2963749 % , in sigma: 111.922991 variable 3: 38.8733632 % , in sigma: 75.9353986 variable 4: 42.7003354 % , in sigma: 83.4110228 variable 5: 15.5489981 % , in sigma: 30.3734812 variable 6: 38.3507739 % , in sigma: 74.9145703 variable 7: 54.5892504 % , in sigma: 106.634882 variable 8: 42.2823586 % , in sigma: 82.5945452 variable 9: 53.8764907 % , in sigma: 105.242574 variable 10: -27.6133495 % , in sigma: 53.9400383 variable 11: -12.4514455 % , in sigma: 24.3227083 variable 12: -28.1902454 % , in sigma: 55.0669493 variable 13: 2.69883375 % , in sigma: 5.27191371 variable 14: 10.6546623 % , in sigma: 20.8128642 variables sorted by significance: 1 most relevant variable 2 corr 57.2963753 , in sigma: 111.922991 2 most relevant variable 13 corr 13.8215532 , in sigma: 26.9990828 3 most relevant variable 3 corr 14.2681065 , in sigma: 27.8713819 4 most relevant variable 4 corr 12.8523836 , in sigma: 25.1059026 5 most relevant variable 5 corr 3.06207848 , in sigma: 5.98147755 6 most relevant variable 9 corr 7.93334866 , in sigma: 15.4970381 7 most relevant variable 14 corr 5.78533792 , in sigma: 11.3011045 8 most relevant variable 12 corr 5.84546804 , in sigma: 11.418563 9 most relevant variable 8 corr 4.81585073 , in sigma: 9.40730399 10 most relevant variable 10 corr 4.28088808 , in sigma: 8.36230559 11 most relevant variable 6 corr 3.64596391 , in sigma: 7.12204192 12 most relevant variable 11 corr 1.53419614 , in sigma: 2.99690548 13 most relevant variable 7 corr 0.498424798 , in sigma: 0.973625192 global correlations between input variables: variable 2: 99.1806349 % variable 3: 93.227516 % variable 4: 91.2671072 % variable 5: 95.274358 % variable 6: 94.148841 % variable 7: 98.7142034 % variable 8: 86.8502442 % variable 9: 98.888411 % variable 10: 72.4555278 % variable 11: 56.0006276 % variable 12: 74.6783452 % variable 13: 84.6133224 % variable 14: 86.686012 % significance loss when removing single variables: variable 2: corr = 6.70859893 % , sigma = 13.1046066 variable 3: corr = 13.1142261 % , sigma = 25.6173868 variable 4: corr = 16.02427 % , sigma = 31.3018794 variable 5: corr = 5.09488585 % , sigma = 9.95237241 variable 6: corr = 3.66593736 % , sigma = 7.16105816 variable 7: corr = 0.498424793 % , sigma = 0.973625181 variable 
8: corr = 3.01290523 % , sigma = 5.88542232 variable 9: corr = 8.12456229 % , sigma = 15.8705557 variable 10: corr = 4.00347442 % , sigma = 7.82040453 variable 11: corr = 1.50125586 % , sigma = 2.93255979 variable 12: corr = 4.29186651 % , sigma = 8.38375091 variable 13: corr = 8.43671757 % , sigma = 16.4803212 variable 14: corr = 5.92737213 % , sigma = 11.5785548 Keep only 11 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 12 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 28.9729633 sigma out 15 active outputs RANK 2 NODE 6 --> 22.0603886 sigma out 15 active outputs RANK 3 NODE 11 --> 17.0073967 sigma out 15 active outputs RANK 4 NODE 10 --> 16.634079 sigma out 15 active outputs RANK 5 NODE 8 --> 14.9673281 sigma out 15 active outputs RANK 6 NODE 2 --> 13.9249134 sigma out 15 active outputs RANK 7 NODE 4 --> 13.315877 sigma out 15 active outputs RANK 8 NODE 9 --> 12.6026525 sigma out 15 active outputs RANK 9 NODE 3 --> 11.1547136 sigma out 15 active outputs RANK 10 NODE 5 --> 10.8535509 sigma out 15 active outputs RANK 11 NODE 7 --> 10.732934 sigma out 15 active outputs RANK 12 NODE 12 --> 9.31064129 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 28.6638031 sigma in 12act. ( 29.9048958 sig out 1act.) RANK 2 NODE 3 --> 24.4096489 sigma in 12act. ( 23.7534027 sig out 1act.) RANK 3 NODE 13 --> 22.6197071 sigma in 12act. ( 22.3987694 sig out 1act.) RANK 4 NODE 9 --> 20.1125889 sigma in 12act. ( 22.0420914 sig out 1act.) RANK 5 NODE 4 --> 14.5623684 sigma in 12act. ( 14.6436615 sig out 1act.) RANK 6 NODE 7 --> 13.8065052 sigma in 12act. ( 15.2411346 sig out 1act.) RANK 7 NODE 6 --> 11.2683725 sigma in 12act. ( 11.2365503 sig out 1act.) RANK 8 NODE 5 --> 10.2969522 sigma in 12act. ( 14.2534876 sig out 1act.) RANK 9 NODE 12 --> 7.10884523 sigma in 12act. ( 3.3615818 sig out 1act.) RANK 10 NODE 10 --> 5.05209446 sigma in 12act. ( 0.247707069 sig out 1act.) RANK 11 NODE 2 --> 3.50450087 sigma in 12act. ( 3.26804709 sig out 1act.) RANK 12 NODE 14 --> 3.37557578 sigma in 12act. ( 3.95904279 sig out 1act.) RANK 13 NODE 15 --> 3.34473634 sigma in 12act. ( 3.82279778 sig out 1act.) RANK 14 NODE 1 --> 2.71248698 sigma in 12act. ( 3.68884015 sig out 1act.) RANK 15 NODE 8 --> 1.6230284 sigma in 12act. ( 2.38506126 sig out 1act.) sorted by output significance RANK 1 NODE 11 --> 29.9048958 sigma out 1act.( 28.6638031 sig in 12act.) RANK 2 NODE 3 --> 23.7534027 sigma out 1act.( 24.4096489 sig in 12act.) RANK 3 NODE 13 --> 22.3987694 sigma out 1act.( 22.6197071 sig in 12act.) RANK 4 NODE 9 --> 22.0420914 sigma out 1act.( 20.1125889 sig in 12act.) RANK 5 NODE 7 --> 15.2411346 sigma out 1act.( 13.8065052 sig in 12act.) RANK 6 NODE 4 --> 14.6436615 sigma out 1act.( 14.5623684 sig in 12act.) RANK 7 NODE 5 --> 14.2534876 sigma out 1act.( 10.2969522 sig in 12act.) RANK 8 NODE 6 --> 11.2365503 sigma out 1act.( 11.2683725 sig in 12act.) RANK 9 NODE 14 --> 3.95904279 sigma out 1act.( 3.37557578 sig in 12act.) RANK 10 NODE 15 --> 3.82279778 sigma out 1act.( 3.34473634 sig in 12act.) RANK 11 NODE 1 --> 3.68884015 sigma out 1act.( 2.71248698 sig in 12act.) RANK 12 NODE 12 --> 3.3615818 sigma out 1act.( 7.10884523 sig in 12act.) RANK 13 NODE 2 --> 3.26804709 sigma out 1act.( 3.50450087 sig in 12act.) 
RANK 14 NODE 8 --> 2.38506126 sigma out 1act.( 1.6230284 sig in 12act.) RANK 15 NODE 10 --> 0.247707069 sigma out 1act.( 5.05209446 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 57.3921776 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 28.8359432 sigma out 15 active outputs RANK 2 NODE 2 --> 23.2471275 sigma out 15 active outputs RANK 3 NODE 6 --> 20.4812012 sigma out 15 active outputs RANK 4 NODE 11 --> 17.9184837 sigma out 15 active outputs RANK 5 NODE 10 --> 16.6287956 sigma out 15 active outputs RANK 6 NODE 8 --> 15.5580397 sigma out 15 active outputs RANK 7 NODE 4 --> 14.2705965 sigma out 15 active outputs RANK 8 NODE 5 --> 12.6567287 sigma out 15 active outputs RANK 9 NODE 9 --> 12.6389694 sigma out 15 active outputs RANK 10 NODE 7 --> 12.5291224 sigma out 15 active outputs RANK 11 NODE 3 --> 11.6368732 sigma out 15 active outputs RANK 12 NODE 12 --> 9.1878643 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 11 --> 29.8873558 sigma in 12act. ( 28.6480122 sig out 1act.) RANK 2 NODE 3 --> 20.4721069 sigma in 12act. ( 20.3597012 sig out 1act.) RANK 3 NODE 13 --> 20.4104481 sigma in 12act. ( 20.2018528 sig out 1act.) RANK 4 NODE 9 --> 19.0303993 sigma in 12act. ( 19.8938274 sig out 1act.) RANK 5 NODE 5 --> 16.7063217 sigma in 12act. ( 15.2756634 sig out 1act.) RANK 6 NODE 7 --> 15.884531 sigma in 12act. ( 14.8623066 sig out 1act.) RANK 7 NODE 4 --> 15.1666079 sigma in 12act. ( 13.5322695 sig out 1act.) RANK 8 NODE 6 --> 13.2333498 sigma in 12act. ( 10.4128284 sig out 1act.) RANK 9 NODE 8 --> 10.5866117 sigma in 12act. ( 5.69664574 sig out 1act.) RANK 10 NODE 12 --> 9.42188835 sigma in 12act. ( 2.81488967 sig out 1act.) RANK 11 NODE 1 --> 8.68697166 sigma in 12act. ( 4.55448008 sig out 1act.) RANK 12 NODE 15 --> 8.0446949 sigma in 12act. ( 4.53158283 sig out 1act.) RANK 13 NODE 14 --> 7.72946453 sigma in 12act. ( 4.23812008 sig out 1act.) RANK 14 NODE 10 --> 7.53451729 sigma in 12act. ( 1.1818589 sig out 1act.) RANK 15 NODE 2 --> 6.80054808 sigma in 12act. ( 3.59610248 sig out 1act.) sorted by output significance RANK 1 NODE 11 --> 28.6480122 sigma out 1act.( 29.8873558 sig in 12act.) RANK 2 NODE 3 --> 20.3597012 sigma out 1act.( 20.4721069 sig in 12act.) RANK 3 NODE 13 --> 20.2018528 sigma out 1act.( 20.4104481 sig in 12act.) RANK 4 NODE 9 --> 19.8938274 sigma out 1act.( 19.0303993 sig in 12act.) RANK 5 NODE 5 --> 15.2756634 sigma out 1act.( 16.7063217 sig in 12act.) RANK 6 NODE 7 --> 14.8623066 sigma out 1act.( 15.884531 sig in 12act.) RANK 7 NODE 4 --> 13.5322695 sigma out 1act.( 15.1666079 sig in 12act.) RANK 8 NODE 6 --> 10.4128284 sigma out 1act.( 13.2333498 sig in 12act.) RANK 9 NODE 8 --> 5.69664574 sigma out 1act.( 10.5866117 sig in 12act.) RANK 10 NODE 1 --> 4.55448008 sigma out 1act.( 8.68697166 sig in 12act.) RANK 11 NODE 15 --> 4.53158283 sigma out 1act.( 8.0446949 sig in 12act.) RANK 12 NODE 14 --> 4.23812008 sigma out 1act.( 7.72946453 sig in 12act.) RANK 13 NODE 2 --> 3.59610248 sigma out 1act.( 6.80054808 sig in 12act.) RANK 14 NODE 12 --> 2.81488967 sigma out 1act.( 9.42188835 sig in 12act.) RANK 15 NODE 10 --> 1.1818589 sigma out 1act.( 7.53451729 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 53.8408089 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.430905521 *** contribution from regularisation: 0.00445064669 *** contribution from error: -0.43535617 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.47505638 *** contribution from regularisation: 0.00242656283 *** contribution from error: -0.477482945 *********************************************** -----------------> Test sample ENTER BFGS code START -44850.487 -0.459198862 -0.189398363 EXIT FROM BFGS code FG_START 0. -0.459198862 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.493716925 *** contribution from regularisation: 0.00242956658 *** contribution from error: -0.4961465 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -46602.4348 -0.459198862 -38.279438 EXIT FROM BFGS code FG_LNSRCH 0. -0.466318876 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0. EXIT FROM BFGS code NEW_X -49208.9212 -0.466318876 20.3925819 ENTER BFGS code NEW_X -49208.9212 -0.466318876 20.3925819 EXIT FROM BFGS code FG_LNSRCH 0. -0.463400722 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.522819102 *** contribution from regularisation: 0.00310367974 *** contribution from error: -0.525922775 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49349.4165 -0.463400722 4.49771023 EXIT FROM BFGS code NEW_X -49349.4165 -0.463400722 4.49771023 ENTER BFGS code NEW_X -49349.4165 -0.463400722 4.49771023 EXIT FROM BFGS code FG_LNSRCH 0. -0.462248892 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.52359283 *** contribution from regularisation: 0.00310742785 *** contribution from error: -0.526700258 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49422.453 -0.462248892 0.19529371 EXIT FROM BFGS code NEW_X -49422.453 -0.462248892 0.19529371 ENTER BFGS code NEW_X -49422.453 -0.462248892 0.19529371 EXIT FROM BFGS code FG_LNSRCH 0. -0.460255414 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.524263918 *** contribution from regularisation: 0.00317332451 *** contribution from error: -0.52743727 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49485.7934 -0.460255414 -0.564636648 EXIT FROM BFGS code NEW_X -49485.7934 -0.460255414 -0.564636648 ENTER BFGS code NEW_X -49485.7934 -0.460255414 -0.564636648 EXIT FROM BFGS code FG_LNSRCH 0. -0.457269937 0. 
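Note: the "ROUND k: MAX CORR (...) AFTER KILLING INPUT VARIABLE j CONTR ..." lines in the pre-training summary above are a greedy backward elimination: in every round the Teacher drops the input whose removal costs the least total correlation to the target, and only the 11 most significant inputs are kept. The sketch below assumes "total correlation to target" is the usual multiple correlation coefficient sqrt(c^T C^-1 c) built from the printed correlation matrix; that definition is an assumption, as the exact NeuroBayes formula is not documented in the log.

```python
import numpy as np

def total_correlation_to_target(c, C, keep):
    """Multiple correlation of the kept inputs with the target.

    c    : numpy array, c[i] = corr(input_i, target)
    C    : numpy array, C[i, j] = corr(input_i, input_j)
    keep : indices of the inputs still in the game
    """
    c_k = c[keep]
    C_k = C[np.ix_(keep, keep)]
    return float(np.sqrt(c_k @ np.linalg.solve(C_k, c_k)))

def backward_elimination(c, C, n_keep=1):
    """Greedy pruning as in the 'ROUND k: ... AFTER KILLING INPUT VARIABLE j' report."""
    keep = list(range(len(c)))
    rounds = []
    while len(keep) > n_keep:
        # drop the variable whose removal leaves the largest total correlation
        candidates = [([v for v in keep if v != j], j) for j in keep]
        remaining, killed = max(
            candidates, key=lambda cand: total_correlation_to_target(c, C, cand[0])
        )
        rounds.append((killed, total_correlation_to_target(c, C, remaining)))
        keep = remaining
    return rounds   # [(killed_variable, max_corr_after_killing), ...]
```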
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.525106728 *** contribution from regularisation: 0.00313710514 *** contribution from error: -0.52824384 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49565.3488 -0.457269937 3.13228011 EXIT FROM BFGS code NEW_X -49565.3488 -0.457269937 3.13228011 ENTER BFGS code NEW_X -49565.3488 -0.457269937 3.13228011 EXIT FROM BFGS code FG_LNSRCH 0. -0.404714584 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.526654243 *** contribution from regularisation: 0.00367052201 *** contribution from error: -0.530324757 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49711.4209 -0.404714584 22.0917454 EXIT FROM BFGS code FG_LNSRCH 0. -0.433803648 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 40.112999 sigma out 15 active outputs RANK 2 NODE 1 --> 24.6791306 sigma out 15 active outputs RANK 3 NODE 8 --> 18.6728039 sigma out 15 active outputs RANK 4 NODE 11 --> 15.6581888 sigma out 15 active outputs RANK 5 NODE 12 --> 14.428936 sigma out 15 active outputs RANK 6 NODE 7 --> 12.6920605 sigma out 15 active outputs RANK 7 NODE 6 --> 11.6801691 sigma out 15 active outputs RANK 8 NODE 9 --> 11.5111513 sigma out 15 active outputs RANK 9 NODE 5 --> 8.88146114 sigma out 15 active outputs RANK 10 NODE 4 --> 8.05906391 sigma out 15 active outputs RANK 11 NODE 3 --> 8.05733299 sigma out 15 active outputs RANK 12 NODE 10 --> 6.79763699 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 36.4591866 sigma in 12act. ( 41.57341 sig out 1act.) RANK 2 NODE 8 --> 30.3006516 sigma in 12act. ( 24.9009476 sig out 1act.) RANK 3 NODE 11 --> 19.542923 sigma in 12act. ( 20.5070133 sig out 1act.) RANK 4 NODE 1 --> 16.6332474 sigma in 12act. ( 13.6254044 sig out 1act.) RANK 5 NODE 7 --> 11.7315454 sigma in 12act. ( 10.9136915 sig out 1act.) RANK 6 NODE 9 --> 11.0359735 sigma in 12act. ( 11.1908388 sig out 1act.) RANK 7 NODE 15 --> 10.1895351 sigma in 12act. ( 9.13413906 sig out 1act.) RANK 8 NODE 13 --> 9.32632923 sigma in 12act. ( 8.49873543 sig out 1act.) RANK 9 NODE 14 --> 8.91150856 sigma in 12act. ( 6.85291386 sig out 1act.) RANK 10 NODE 10 --> 8.25992012 sigma in 12act. ( 6.51328754 sig out 1act.) RANK 11 NODE 2 --> 7.43515825 sigma in 12act. ( 5.89408255 sig out 1act.) RANK 12 NODE 4 --> 7.34366703 sigma in 12act. ( 5.34611797 sig out 1act.) RANK 13 NODE 6 --> 5.71932936 sigma in 12act. ( 2.3431313 sig out 1act.) RANK 14 NODE 3 --> 5.66556835 sigma in 12act. ( 4.43733549 sig out 1act.) RANK 15 NODE 12 --> 5.22059584 sigma in 12act. ( 0.633476973 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 41.57341 sigma out 1act.( 36.4591866 sig in 12act.) RANK 2 NODE 8 --> 24.9009476 sigma out 1act.( 30.3006516 sig in 12act.) RANK 3 NODE 11 --> 20.5070133 sigma out 1act.( 19.542923 sig in 12act.) RANK 4 NODE 1 --> 13.6254044 sigma out 1act.( 16.6332474 sig in 12act.) RANK 5 NODE 9 --> 11.1908388 sigma out 1act.( 11.0359735 sig in 12act.) RANK 6 NODE 7 --> 10.9136915 sigma out 1act.( 11.7315454 sig in 12act.) RANK 7 NODE 15 --> 9.13413906 sigma out 1act.( 10.1895351 sig in 12act.) 
RANK 8 NODE 13 --> 8.49873543 sigma out 1act.( 9.32632923 sig in 12act.) RANK 9 NODE 14 --> 6.85291386 sigma out 1act.( 8.91150856 sig in 12act.) RANK 10 NODE 10 --> 6.51328754 sigma out 1act.( 8.25992012 sig in 12act.) RANK 11 NODE 2 --> 5.89408255 sigma out 1act.( 7.43515825 sig in 12act.) RANK 12 NODE 4 --> 5.34611797 sigma out 1act.( 7.34366703 sig in 12act.) RANK 13 NODE 3 --> 4.43733549 sigma out 1act.( 5.66556835 sig in 12act.) RANK 14 NODE 6 --> 2.3431313 sigma out 1act.( 5.71932936 sig in 12act.) RANK 15 NODE 12 --> 0.633476973 sigma out 1act.( 5.22059584 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 59.4380989 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.528909743 *** contribution from regularisation: 0.00234733662 *** contribution from error: -0.531257093 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -49924.3185 -0.433803648 11.013648 EXIT FROM BFGS code NEW_X -49924.3185 -0.433803648 11.013648 ENTER BFGS code NEW_X -49924.3185 -0.433803648 11.013648 EXIT FROM BFGS code FG_LNSRCH 0. -0.381614268 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.529059589 *** contribution from regularisation: 0.00364047172 *** contribution from error: -0.532700062 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49938.4623 -0.381614268 -15.2376575 EXIT FROM BFGS code NEW_X -49938.4623 -0.381614268 -15.2376575 ENTER BFGS code NEW_X -49938.4623 -0.381614268 -15.2376575 EXIT FROM BFGS code FG_LNSRCH 0. -0.388099968 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.531149447 *** contribution from regularisation: 0.00267556729 *** contribution from error: -0.53382504 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50135.7265 -0.388099968 9.06446362 EXIT FROM BFGS code NEW_X -50135.7265 -0.388099968 9.06446362 ENTER BFGS code NEW_X -50135.7265 -0.388099968 9.06446362 EXIT FROM BFGS code FG_LNSRCH 0. -0.383248448 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.531256974 *** contribution from regularisation: 0.00300913397 *** contribution from error: -0.534266114 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50145.879 -0.383248448 9.76865864 EXIT FROM BFGS code NEW_X -50145.879 -0.383248448 9.76865864 ENTER BFGS code NEW_X -50145.879 -0.383248448 9.76865864 EXIT FROM BFGS code FG_LNSRCH 0. -0.374867946 0. 
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.531761944 *** contribution from regularisation: 0.00296296785 *** contribution from error: -0.534724891 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50193.5418 -0.374867946 1.80944955 EXIT FROM BFGS code NEW_X -50193.5418 -0.374867946 1.80944955 ENTER BFGS code NEW_X -50193.5418 -0.374867946 1.80944955 EXIT FROM BFGS code FG_LNSRCH 0. -0.345512956 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.532794058 *** contribution from regularisation: 0.00340467738 *** contribution from error: -0.536198735 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50290.9659 -0.345512956 -5.7869525 EXIT FROM BFGS code NEW_X -50290.9659 -0.345512956 -5.7869525 ENTER BFGS code NEW_X -50290.9659 -0.345512956 -5.7869525 EXIT FROM BFGS code FG_LNSRCH 0. -0.336518139 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.533047438 *** contribution from regularisation: 0.0031596839 *** contribution from error: -0.536207139 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50314.8799 -0.336518139 -64.6254578 EXIT FROM BFGS code FG_LNSRCH 0. -0.341162056 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.533377707 *** contribution from regularisation: 0.00322572584 *** contribution from error: -0.536603451 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50346.0544 -0.341162056 -33.4756126 EXIT FROM BFGS code NEW_X -50346.0544 -0.341162056 -33.4756126 ENTER BFGS code NEW_X -50346.0544 -0.341162056 -33.4756126 EXIT FROM BFGS code FG_LNSRCH 0. -0.345467359 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.534028351 *** contribution from regularisation: 0.00313715218 *** contribution from error: -0.537165523 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50407.4699 -0.345467359 -26.480196 EXIT FROM BFGS code NEW_X -50407.4699 -0.345467359 -26.480196 ENTER BFGS code NEW_X -50407.4699 -0.345467359 -26.480196 EXIT FROM BFGS code FG_LNSRCH 0. -0.359683573 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.53468436 *** contribution from regularisation: 0.00314922445 *** contribution from error: -0.537833571 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50469.3941 -0.359683573 8.9403944 EXIT FROM BFGS code NEW_X -50469.3941 -0.359683573 8.9403944 ENTER BFGS code NEW_X -50469.3941 -0.359683573 8.9403944 EXIT FROM BFGS code FG_LNSRCH 0. -0.360924095 0. 
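Note: every "Learn Path" block above and below prints the same decomposition, loss function = contribution from error + contribution from regularisation. The sketch below shows that split generically, with an event-weighted cross-entropy standing in for the "entropy" loss and a quadratic weight-decay term for "standard regularisation"; the exact normalisation and sign convention NeuroBayes uses are not documented here (its printed values are negative).

```python
import numpy as np

def entropy_error(y_true, y_pred, event_weights):
    """Event-weighted cross-entropy: the 'contribution from error'."""
    y_pred = np.clip(y_pred, 1e-12, 1.0 - 1e-12)
    per_event = -(y_true * np.log(y_pred) + (1.0 - y_true) * np.log(1.0 - y_pred))
    return float(np.average(per_event, weights=event_weights))

def regularisation(network_weights, strength):
    """Quadratic weight decay standing in for 'standard regularisation'."""
    return 0.5 * strength * float(sum(np.sum(w ** 2) for w in network_weights))

def loss(y_true, y_pred, event_weights, network_weights, strength=1e-3):
    err = entropy_error(y_true, y_pred, event_weights)    # contribution from error
    reg = regularisation(network_weights, strength)       # contribution from regularisation
    return err + reg                                      # loss function
```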
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 71.0514374 sigma out 15 active outputs RANK 2 NODE 1 --> 30.4659824 sigma out 15 active outputs RANK 3 NODE 11 --> 22.9959431 sigma out 15 active outputs RANK 4 NODE 8 --> 21.1692524 sigma out 15 active outputs RANK 5 NODE 12 --> 20.2947502 sigma out 15 active outputs RANK 6 NODE 7 --> 19.7231922 sigma out 15 active outputs RANK 7 NODE 3 --> 18.8259678 sigma out 15 active outputs RANK 8 NODE 6 --> 16.7135067 sigma out 15 active outputs RANK 9 NODE 4 --> 16.5100193 sigma out 15 active outputs RANK 10 NODE 10 --> 14.6039381 sigma out 15 active outputs RANK 11 NODE 9 --> 13.7024984 sigma out 15 active outputs RANK 12 NODE 5 --> 9.8316431 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 77.767952 sigma in 12act. ( 108.193108 sig out 1act.) RANK 2 NODE 8 --> 36.0408401 sigma in 12act. ( 30.097023 sig out 1act.) RANK 3 NODE 6 --> 18.1018085 sigma in 12act. ( 17.4342613 sig out 1act.) RANK 4 NODE 11 --> 17.4601498 sigma in 12act. ( 18.9718552 sig out 1act.) RANK 5 NODE 9 --> 15.9508743 sigma in 12act. ( 17.1360893 sig out 1act.) RANK 6 NODE 1 --> 15.3885384 sigma in 12act. ( 12.2687111 sig out 1act.) RANK 7 NODE 10 --> 11.9574213 sigma in 12act. ( 13.4539318 sig out 1act.) RANK 8 NODE 13 --> 11.3376722 sigma in 12act. ( 13.0175953 sig out 1act.) RANK 9 NODE 7 --> 9.20251179 sigma in 12act. ( 7.59007359 sig out 1act.) RANK 10 NODE 14 --> 8.38834763 sigma in 12act. ( 8.00249004 sig out 1act.) RANK 11 NODE 2 --> 8.30178833 sigma in 12act. ( 8.61778831 sig out 1act.) RANK 12 NODE 15 --> 8.14365673 sigma in 12act. ( 6.15794992 sig out 1act.) RANK 13 NODE 12 --> 7.5223999 sigma in 12act. ( 7.13431883 sig out 1act.) RANK 14 NODE 4 --> 5.89934826 sigma in 12act. ( 3.93658113 sig out 1act.) RANK 15 NODE 3 --> 5.88050795 sigma in 12act. ( 5.99056625 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 108.193108 sigma out 1act.( 77.767952 sig in 12act.) RANK 2 NODE 8 --> 30.097023 sigma out 1act.( 36.0408401 sig in 12act.) RANK 3 NODE 11 --> 18.9718552 sigma out 1act.( 17.4601498 sig in 12act.) RANK 4 NODE 6 --> 17.4342613 sigma out 1act.( 18.1018085 sig in 12act.) RANK 5 NODE 9 --> 17.1360893 sigma out 1act.( 15.9508743 sig in 12act.) RANK 6 NODE 10 --> 13.4539318 sigma out 1act.( 11.9574213 sig in 12act.) RANK 7 NODE 13 --> 13.0175953 sigma out 1act.( 11.3376722 sig in 12act.) RANK 8 NODE 1 --> 12.2687111 sigma out 1act.( 15.3885384 sig in 12act.) RANK 9 NODE 2 --> 8.61778831 sigma out 1act.( 8.30178833 sig in 12act.) RANK 10 NODE 14 --> 8.00249004 sigma out 1act.( 8.38834763 sig in 12act.) RANK 11 NODE 7 --> 7.59007359 sigma out 1act.( 9.20251179 sig in 12act.) RANK 12 NODE 12 --> 7.13431883 sigma out 1act.( 7.5223999 sig in 12act.) RANK 13 NODE 15 --> 6.15794992 sigma out 1act.( 8.14365673 sig in 12act.) RANK 14 NODE 3 --> 5.99056625 sigma out 1act.( 5.88050795 sig in 12act.) RANK 15 NODE 4 --> 3.93658113 sigma out 1act.( 5.89934826 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 120.025894 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.534886837 *** contribution from regularisation: 0.00330176647 *** contribution from error: -0.538188577 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -50488.501 -0.360924095 6.04765034 EXIT FROM BFGS code NEW_X -50488.501 -0.360924095 6.04765034 ENTER BFGS code NEW_X -50488.501 -0.360924095 6.04765034 EXIT FROM BFGS code FG_LNSRCH 0. -0.359784544 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.535153806 *** contribution from regularisation: 0.00332364743 *** contribution from error: -0.53847748 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50513.7003 -0.359784544 19.5867577 EXIT FROM BFGS code NEW_X -50513.7003 -0.359784544 19.5867577 ENTER BFGS code NEW_X -50513.7003 -0.359784544 19.5867577 EXIT FROM BFGS code FG_LNSRCH 0. -0.353269666 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.535304308 *** contribution from regularisation: 0.00337967114 *** contribution from error: -0.538683951 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50527.9094 -0.353269666 12.0238409 EXIT FROM BFGS code NEW_X -50527.9094 -0.353269666 12.0238409 ENTER BFGS code NEW_X -50527.9094 -0.353269666 12.0238409 EXIT FROM BFGS code FG_LNSRCH 0. -0.325702995 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.535946012 *** contribution from regularisation: 0.00324751204 *** contribution from error: -0.539193511 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50588.4794 -0.325702995 -9.84265518 EXIT FROM BFGS code NEW_X -50588.4794 -0.325702995 -9.84265518 ENTER BFGS code NEW_X -50588.4794 -0.325702995 -9.84265518 EXIT FROM BFGS code FG_LNSRCH 0. -0.318508029 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.535309851 *** contribution from regularisation: 0.00337531907 *** contribution from error: -0.538685143 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50528.4339 -0.318508029 64.3289795 EXIT FROM BFGS code FG_LNSRCH 0. -0.324267626 0. --------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.535504282 *** contribution from regularisation: 0.00378023181 *** contribution from error: -0.539284527 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50546.7868 -0.324267626 2.45328569 EXIT FROM BFGS code FG_LNSRCH 0. -0.325635135 0. 
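Note: the ENTER/EXIT BFGS lines trace a quasi-Newton line-search loop, and every 10th iteration the current network is dumped to rescue.nb before the final expertise is exported at the end of learning. Below is a generic sketch of that pattern using SciPy's BFGS, not the Phi-T optimiser; the .npy snapshots are only stand-ins for the proprietary rescue.nb/expert.nb files, and `loss` is assumed to be a callable over the flattened network weights.

```python
import numpy as np
from scipy.optimize import minimize

def train_with_rescue(loss, w0, every=10, max_iter=250):
    """Minimise `loss` with BFGS, snapshotting the parameters periodically."""
    state = {"iteration": 0}

    def rescue(w):
        state["iteration"] += 1
        if state["iteration"] % every == 0:
            np.save("rescue.npy", w)        # analogue of SAVING EXPERTISE TO rescue.nb

    result = minimize(loss, w0, method="BFGS", callback=rescue,
                      options={"maxiter": max_iter})
    np.save("expert.npy", result.x)         # analogue of the final expert.nb export
    return result
```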
--------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.535657704 *** contribution from regularisation: 0.00354211917 *** contribution from error: -0.539199829 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50561.2671 -0.325635135 -9.46640396 EXIT FROM BFGS code FG_LNSRCH 0. -0.325702727 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.535675526 *** contribution from regularisation: 0.00351801747 *** contribution from error: -0.539193571 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50562.9489 -0.325702727 -10.0120602 EXIT FROM BFGS code FG_LNSRCH 0. -0.325702995 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.535670996 *** contribution from regularisation: 0.0035225118 *** contribution from error: -0.539193511 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50562.5219 -0.325702995 -10.0012579 EXIT FROM BFGS code FG_LNSRCH 0. -0.325702995 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.535663724 *** contribution from regularisation: 0.00352979056 *** contribution from error: -0.539193511 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50561.8349 -0.325702995 -10.0380325 EXIT FROM BFGS code FG_LNSRCH 0. -0.325702995 0. --------------------------------------------------- Iteration : 30 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 80.3874741 sigma out 15 active outputs RANK 2 NODE 1 --> 34.2916832 sigma out 15 active outputs RANK 3 NODE 3 --> 25.4557285 sigma out 15 active outputs RANK 4 NODE 12 --> 20.6931648 sigma out 15 active outputs RANK 5 NODE 7 --> 20.6386337 sigma out 15 active outputs RANK 6 NODE 8 --> 20.569603 sigma out 15 active outputs RANK 7 NODE 11 --> 20.3325977 sigma out 15 active outputs RANK 8 NODE 10 --> 18.7573872 sigma out 15 active outputs RANK 9 NODE 6 --> 16.4949493 sigma out 15 active outputs RANK 10 NODE 4 --> 16.2427444 sigma out 15 active outputs RANK 11 NODE 9 --> 12.9309082 sigma out 15 active outputs RANK 12 NODE 5 --> 10.3346357 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 90.4379883 sigma in 12act. ( 114.71328 sig out 1act.) RANK 2 NODE 8 --> 28.9904575 sigma in 12act. ( 24.9754829 sig out 1act.) RANK 3 NODE 6 --> 20.4173698 sigma in 12act. ( 20.1660404 sig out 1act.) RANK 4 NODE 11 --> 17.655098 sigma in 12act. ( 19.2570972 sig out 1act.) RANK 5 NODE 9 --> 16.3408871 sigma in 12act. ( 16.8293076 sig out 1act.) RANK 6 NODE 10 --> 14.7593832 sigma in 12act. ( 16.2904873 sig out 1act.) RANK 7 NODE 13 --> 12.8732615 sigma in 12act. ( 14.1432276 sig out 1act.) RANK 8 NODE 1 --> 12.7858286 sigma in 12act. ( 9.6604023 sig out 1act.) RANK 9 NODE 7 --> 11.6417942 sigma in 12act. ( 10.7344713 sig out 1act.) RANK 10 NODE 14 --> 10.3895359 sigma in 12act. ( 9.82567596 sig out 1act.) RANK 11 NODE 2 --> 9.38997746 sigma in 12act. ( 9.70748901 sig out 1act.) RANK 12 NODE 12 --> 8.80906963 sigma in 12act. ( 9.14695644 sig out 1act.) 
RANK 13 NODE 3 --> 8.28816986 sigma in 12act. ( 8.85288715 sig out 1act.)
RANK 14 NODE 15 --> 8.02287579 sigma in 12act. ( 5.90081453 sig out 1act.)
RANK 15 NODE 4 --> 5.8386178 sigma in 12act. ( 4.04087067 sig out 1act.)
sorted by output significance
RANK 1 NODE 5 --> 114.71328 sigma out 1act.( 90.4379883 sig in 12act.)
RANK 2 NODE 8 --> 24.9754829 sigma out 1act.( 28.9904575 sig in 12act.)
RANK 3 NODE 6 --> 20.1660404 sigma out 1act.( 20.4173698 sig in 12act.)
RANK 4 NODE 11 --> 19.2570972 sigma out 1act.( 17.655098 sig in 12act.)
RANK 5 NODE 9 --> 16.8293076 sigma out 1act.( 16.3408871 sig in 12act.)
RANK 6 NODE 10 --> 16.2904873 sigma out 1act.( 14.7593832 sig in 12act.)
RANK 7 NODE 13 --> 14.1432276 sigma out 1act.( 12.8732615 sig in 12act.)
RANK 8 NODE 7 --> 10.7344713 sigma out 1act.( 11.6417942 sig in 12act.)
RANK 9 NODE 14 --> 9.82567596 sigma out 1act.( 10.3895359 sig in 12act.)
RANK 10 NODE 2 --> 9.70748901 sigma out 1act.( 9.38997746 sig in 12act.)
RANK 11 NODE 1 --> 9.6604023 sigma out 1act.( 12.7858286 sig in 12act.)
RANK 12 NODE 12 --> 9.14695644 sigma out 1act.( 8.80906963 sig in 12act.)
RANK 13 NODE 3 --> 8.85288715 sigma out 1act.( 8.28816986 sig in 12act.)
RANK 14 NODE 15 --> 5.90081453 sigma out 1act.( 8.02287579 sig in 12act.)
RANK 15 NODE 4 --> 4.04087067 sigma out 1act.( 5.8386178 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 126.18071 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.535663784
*** contribution from regularisation: 0.00352971745
*** contribution from error: -0.539193511
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network           ****
***** to "rescue.nb"                      ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -50561.8418 -0.325702995 -10.0530405
EXIT FROM BFGS code FG_LNSRCH 0. -0.325702995 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.535664082
*** contribution from regularisation: 0.00352940732
*** contribution from error: -0.539193511
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -50561.871 -0.325702995 -10.0597324
EXIT FROM BFGS code NEW_X -50561.871 -0.325702995 -10.0597324
ENTER BFGS code NEW_X -50561.871 -0.325702995 -10.0597324
EXIT FROM BFGS code CONVERGENC -50561.871 -0.325702995 -10.0597324
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 128.274033 sigma out 15 active outputs
RANK 2 NODE 1 --> 54.7164917 sigma out 15 active outputs
RANK 3 NODE 3 --> 40.1880722 sigma out 15 active outputs
RANK 4 NODE 7 --> 32.7439461 sigma out 15 active outputs
RANK 5 NODE 8 --> 32.2642403 sigma out 15 active outputs
RANK 6 NODE 11 --> 31.7140026 sigma out 15 active outputs
RANK 7 NODE 12 --> 31.4925041 sigma out 15 active outputs
RANK 8 NODE 10 --> 29.3793316 sigma out 15 active outputs
RANK 9 NODE 6 --> 26.0990963 sigma out 15 active outputs
RANK 10 NODE 4 --> 24.3071404 sigma out 15 active outputs
RANK 11 NODE 9 --> 20.0878735 sigma out 15 active outputs
RANK 12 NODE 5 --> 15.8360443 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 5 --> 145.289566 sigma in 12act. ( 176.214676 sig out 1act.)
RANK 2 NODE 8 --> 45.6763191 sigma in 12act. ( 37.5977249 sig out 1act.)
RANK 3 NODE 6 --> 31.4952431 sigma in 12act. ( 30.1785145 sig out 1act.)
RANK 4 NODE 11 --> 26.4006557 sigma in 12act. ( 30.6621056 sig out 1act.)
RANK 5 NODE 9 --> 24.86059 sigma in 12act. ( 26.3011112 sig out 1act.)
RANK 6 NODE 10 --> 22.638176 sigma in 12act. ( 25.4188709 sig out 1act.)
RANK 7 NODE 13 --> 19.1768589 sigma in 12act. ( 22.0735416 sig out 1act.)
RANK 8 NODE 1 --> 18.937933 sigma in 12act. ( 14.5087051 sig out 1act.)
RANK 9 NODE 7 --> 16.9103355 sigma in 12act. ( 15.9100523 sig out 1act.)
RANK 10 NODE 14 --> 15.3259439 sigma in 12act. ( 15.5723486 sig out 1act.)
RANK 11 NODE 2 --> 14.0602655 sigma in 12act. ( 15.3065119 sig out 1act.)
RANK 12 NODE 12 --> 13.0508127 sigma in 12act. ( 14.2625942 sig out 1act.)
RANK 13 NODE 3 --> 12.3918018 sigma in 12act. ( 13.5883093 sig out 1act.)
RANK 14 NODE 15 --> 10.7406263 sigma in 12act. ( 8.94949627 sig out 1act.)
RANK 15 NODE 4 --> 8.01982594 sigma in 12act. ( 5.83284092 sig out 1act.)
sorted by output significance
RANK 1 NODE 5 --> 176.214676 sigma out 1act.( 145.289566 sig in 12act.)
RANK 2 NODE 8 --> 37.5977249 sigma out 1act.( 45.6763191 sig in 12act.)
RANK 3 NODE 11 --> 30.6621056 sigma out 1act.( 26.4006557 sig in 12act.)
RANK 4 NODE 6 --> 30.1785145 sigma out 1act.( 31.4952431 sig in 12act.)
RANK 5 NODE 9 --> 26.3011112 sigma out 1act.( 24.86059 sig in 12act.)
RANK 6 NODE 10 --> 25.4188709 sigma out 1act.( 22.638176 sig in 12act.)
RANK 7 NODE 13 --> 22.0735416 sigma out 1act.( 19.1768589 sig in 12act.)
RANK 8 NODE 7 --> 15.9100523 sigma out 1act.( 16.9103355 sig in 12act.)
RANK 9 NODE 14 --> 15.5723486 sigma out 1act.( 15.3259439 sig in 12act.)
RANK 10 NODE 2 --> 15.3065119 sigma out 1act.( 14.0602655 sig in 12act.)
RANK 11 NODE 1 --> 14.5087051 sigma out 1act.( 18.937933 sig in 12act.)
RANK 12 NODE 12 --> 14.2625942 sigma out 1act.( 13.0508127 sig in 12act.)
RANK 13 NODE 3 --> 13.5883093 sigma out 1act.( 12.3918018 sig in 12act.)
RANK 14 NODE 15 --> 8.94949627 sigma out 1act.( 10.7406263 sig in 12act.)
RANK 15 NODE 4 --> 5.83284092 sigma out 1act.( 8.01982594 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 193.865448 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.535672247
*** contribution from regularisation: 0.0035212771
*** contribution from error: -0.539193511
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 29884
Closing output file
done
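The per-iteration losses and the node-significance tables appear only in this console stream, so recovering them for plotting the convergence or for comparing the rankings at iteration 30 and 250 takes a small parser. A minimal Python sketch, assuming the output above has been captured to a file; the name teacher.log is a placeholder, not something the teacher writes itself:

import re

path_re   = re.compile(r"\*\*\* Learn Path (\d+)")
loss_re   = re.compile(r"\*\*\* loss function: (-?\d+\.\d+)")
header_re = re.compile(r"SIGNIFICANCE OF \w+ (?:TO|IN) LAYER \d+")
rank_re   = re.compile(r"RANK (\d+) NODE (\d+) --> (-?\d+(?:\.\d+)?) sigma")

losses = {}   # Learn Path number -> loss function value
tables = []   # (table header, [(rank, node, sigma), ...]); in this simple version
              # the two "sorted by" views of a layer land in the same entry

current_path = None
with open("teacher.log") as log:          # assumed capture of the output above
    for line in log:
        if (m := path_re.search(line)):
            current_path = int(m.group(1))
        elif (m := loss_re.search(line)) and current_path is not None:
            losses[current_path] = float(m.group(1))
        elif (m := header_re.search(line)):
            tables.append((m.group(0), []))
        elif (m := rank_re.search(line)) and tables:
            tables[-1][1].append((int(m.group(1)), int(m.group(2)), float(m.group(3))))

print("loss per Learn Path:", dict(sorted(losses.items())))
if tables:
    print("last significance table:", tables[-1])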