NNInput NNInputs_165.root
Options for steering
Constraint : lep1_E<400&&lep2_E<400&&
HiLoSbString : SB
SbString : Target
WeightString : TrainWeight
EqualizeSB : 0
EvaluateVariables : 0
SetNBProcessingDefault : 1
UseNeuroBayes : 1
WeightEvents : 1
NBTreePrepEvPrint : 1
NBTreePrepReportInterval : 10000
NB_Iter : 250
NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 204065 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 57130 nbkg = 146935
Bkg Entries: 146935
Sig Entries: 57130
Chosen entries: 57130
Signal fraction: 1
Background fraction: 0.388811
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 146935
Actual Signal Entries: 57130
Entries to split: 57130
Test with : 28565
Train with : 28565
*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
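For orientation, the selection and bookkeeping reported above can be sketched in a few lines of Python. This is a minimal illustration only, assuming the ntuple branches named in the log (lep1_E, lep2_E, Target, TrainWeight, ...) have been loaded into a pandas DataFrame `df`; it is not the NNAna::CopyTree implementation.

```python
import pandas as pd

def copy_trees(df: pd.DataFrame):
    """Apply the steering constraint and split into signal/background,
    roughly mirroring the counts printed by NNAna::CopyTree (sketch only)."""
    # Constraint from the steering options: lep1_E<400 && lep2_E<400
    passed = df[(df["lep1_E"] < 400) & (df["lep2_E"] < 400)]

    sig = passed[passed["Target"] == 1]   # "SigChoice"  -> 57130 entries in this run
    bkg = passed[passed["Target"] == 0]   # "BkgChoice"  -> 146935 entries in this run
    print(f"nsig = {len(sig)}  nbkg = {len(bkg)}")

    # The chosen entries (57130 here) are then split 50/50:
    # "Test with : 28565", "Train with : 28565".
    half = len(sig) // 2
    train, test = sig.iloc[:half], sig.iloc[half:]
    return sig, bkg, train, test
```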
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 57130 for Signal Prepared event 0 for Signal with 57130 events ====Entry 0 Variable Ht : 267.874 Variable LepAPt : 45.292 Variable LepBPt : 35.604 Variable MetSigLeptonsJets : 7.24556 Variable MetSpec : 95.2083 Variable SumEtLeptonsJets : 172.666 Variable VSumJetLeptonsPt : 107.325 Variable addEt : 176.104 Variable dPhiLepSumMet : 2.28406 Variable dPhiLeptons : 0.0254087 Variable dRLeptons : 0.325152 Variable lep1_E : 46.0978 Variable lep2_E : 40.383 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2165 Ht = 267.874 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 45.2922 LepAPt = 45.292 LepBEt = 35.604 LepBPt = 35.604 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 95.2083 MetDelPhi = 2.19471 MetSig = 6.53886 MetSigLeptonsJets = 7.24556 MetSpec = 95.2083 Mjj = 0 MostCentralJetEta = -1.47546 MtllMet = 179.575 Njets = 1 SB = 0 SumEt = 212.005 SumEtJets = 0 SumEtLeptonsJets = 172.666 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 107.325 addEt = 176.104 dPhiLepSumMet = 2.28406 dPhiLeptons = 0.0254087 dRLeptons = 0.325152 diltype = 52 dimass = 13.1154 event = 77 jet1_Et = 91.7697 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 46.0978 lep2_E = 40.383 rand = 0.999742 run = 237144 weight = 4.01724e-06 ===Show End Prepared event 10000 for Signal with 57130 events Prepared event 20000 for Signal with 57130 events Prepared event 30000 for Signal with 57130 events Prepared event 40000 for Signal with 57130 events Prepared event 50000 for Signal with 57130 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable 
MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 146935 for Background Prepared event 0 for Background with 146935 events ====Entry 0 Variable Ht : 85.3408 Variable LepAPt : 31.1294 Variable LepBPt : 10.0664 Variable MetSigLeptonsJets : 6.87768 Variable MetSpec : 44.1437 Variable SumEtLeptonsJets : 41.1959 Variable VSumJetLeptonsPt : 41.0132 Variable addEt : 85.3408 Variable dPhiLepSumMet : 3.10536 Variable dPhiLeptons : 0.219342 Variable dRLeptons : 0.424454 Variable lep1_E : 32.3548 Variable lep2_E : 10.1027 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 0.300896 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168 ===Show End Prepared event 10000 for Background with 146935 events Prepared event 20000 for Background with 146935 events Prepared event 30000 for Background with 146935 events Prepared event 40000 for Background with 146935 events Prepared event 50000 for Background with 146935 events Prepared event 60000 for Background with 146935 events Prepared event 70000 for Background with 146935 events Prepared event 80000 for Background with 146935 events Prepared event 90000 for Background with 146935 events Prepared event 100000 for Background with 146935 events Prepared event 110000 for Background with 146935 events Prepared event 120000 for Background with 146935 events Prepared event 130000 for Background with 146935 events Prepared event 140000 for Background with 146935 events Warning: found 4563 negative weights. 
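The preparation step above loops over the signal and background samples and hands the teacher 13 input variables per event together with the target and the training weight, warning about any negative event weights. A rough numpy equivalent, assuming `sig` and `bkg` are DataFrames as in the earlier sketch (illustrative only, not PrepareNBTraining_ itself), is:

```python
import numpy as np

INPUT_VARS = ["Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec",
              "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt",
              "dPhiLepSumMet", "dPhiLeptons", "dRLeptons",
              "lep1_E", "lep2_E"]

def prepare_training_arrays(sig, bkg):
    """Stack signal and background events into (X, y, w) arrays."""
    X = np.vstack([sig[INPUT_VARS].to_numpy(), bkg[INPUT_VARS].to_numpy()])
    y = np.concatenate([np.ones(len(sig)), np.zeros(len(bkg))])        # Target
    w = np.concatenate([sig["TrainWeight"].to_numpy(),
                        bkg["TrainWeight"].to_numpy()])

    # The teacher flags negative event weights
    # ("Warning: found 4563 negative weights." in this run).
    n_neg = int((w < 0).sum())
    if n_neg:
        print(f"Warning: found {n_neg} negative weights.")
    return X, y, w
```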
<< hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 204065 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4563 negative weights. Signal fraction: 62.7009163 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 47.1748543 62.5186996 64.8108978 66.7955475 68.1502151 69.1065979 70.1853943 71.0219955 71.9138718 72.8178558 73.6585999 74.3200684 74.9489136 75.5635834 76.1887131 76.8600006 77.3458862 78.0088043 78.5791168 79.1452332 79.7741089 80.5433502 81.2583771 82.0239792 82.6033096 83.4151764 84.4853973 85.4777832 86.5677032 87.9124756 89.2122803 90.791626 92.295166 94.0003357 95.8953552 97.5753326 99.3025818 100.864143 102.571518 104.278564 105.951233 107.486282 109.228973 110.923546 112.880806 114.659615 116.601593 118.337074 120.049805 121.803696 123.628479 125.331169 127.43399 129.196411 130.937897 132.646698 134.196167 135.975296 137.513092 139.101898 140.591125 142.170013 143.732178 145.417145 146.849258 148.220612 149.72023 151.160156 152.623566 153.868057 155.274841 156.799072 158.251282 159.730682 161.304443 162.988037 164.723877 166.661133 168.599243 170.696381 173.187485 175.625687 178.435181 180.905029 183.837952 187.048462 190.102905 193.302475 196.528473 200.342041 204.489502 209.107895 214.271729 220.608948 227.566528 236.081665 245.927353 258.27887 277.537659 308.93277 928.238586 ------------------------------ Transdef: Tab for variable 3 20.0001564 20.3358307 20.6131573 20.9066563 21.1918411 21.4609261 21.7025166 21.943367 22.1625309 22.3740616 22.5900917 22.8122253 23.0655479 23.2863293 23.5011082 23.720108 23.8920746 24.1085587 24.3462505 24.5481529 24.8019218 25.0561924 25.2944355 25.508522 25.6956787 25.9228897 26.1815834 26.4069462 26.6391602 26.9029579 27.1147213 27.3439445 27.6215153 27.8555698 28.057415 28.2547836 28.5025234 28.7520714 29.0421104 29.2310257 29.5085487 29.7993584 30.0431919 30.3353081 30.6269798 30.9421806 31.2217216 31.5465965 31.8410053 32.1244431 32.4313011 32.7429123 33.0601349 33.3432922 33.6285095 33.9250793 34.2637939 34.5945358 34.9566803 35.3033218 35.6327515 35.9931717 36.3279343 36.6376724 36.9453697 37.2937393 37.6312103 38.0290451 38.3684692 38.7066269 39.0439568 39.3718224 39.7357712 40.0851288 40.4535141 40.8011856 41.1620865 41.5540009 41.9208717 42.3327789 42.7834244 43.2313004 43.7057571 44.1961899 44.6823196 45.2536087 45.7843513 46.3358536 46.968811 47.6425323 48.3710213 49.187355 50.1049118 51.1740723 52.4024963 53.9592896 
55.7657051 58.4141312 62.0683899 68.9004517 151.296204 ------------------------------ Transdef: Tab for variable 4 10.0002251 10.1140003 10.233469 10.3668661 10.482769 10.5929985 10.7062769 10.8542347 10.9772339 11.1007175 11.2628155 11.391983 11.5328569 11.6865578 11.840579 12.0168514 12.1876183 12.3278503 12.4942408 12.6544771 12.8028727 12.9606342 13.1423254 13.3549681 13.5139637 13.6952705 13.8670292 14.0393028 14.2436314 14.4111748 14.6342907 14.8409605 15.0663328 15.2984924 15.4864702 15.7036591 15.9068527 16.1519909 16.437809 16.7160149 16.9583511 17.2457008 17.5698318 17.86236 18.1444168 18.4432163 18.7121506 18.9995384 19.3173561 19.650856 20.0107632 20.3097305 20.6217194 20.9380302 21.2259541 21.5118484 21.8171082 22.1265602 22.4322586 22.7772732 23.1521149 23.4948921 23.8583775 24.1844559 24.5410309 24.9213085 25.2655849 25.6186981 25.9739437 26.32304 26.7100601 27.1022072 27.4513855 27.8263054 28.1731663 28.5274162 28.9053345 29.309164 29.6766624 30.0845833 30.4772415 30.8854752 31.3071365 31.6862755 32.0953827 32.5236359 32.9774094 33.4110718 33.8900948 34.3879967 34.9232025 35.5044746 36.1375351 36.8477249 37.664505 38.5567093 39.7122574 41.2188034 43.6774902 47.8606148 69.318573 ------------------------------ Transdef: Tab for variable 5 1.14728987 2.62197137 3.02709866 3.3319838 3.62447524 3.80580831 3.9878397 4.13341999 4.28921461 4.4226346 4.54700947 4.64569759 4.75970888 4.85560226 4.96272755 5.0383625 5.12020683 5.18403912 5.25900459 5.32005978 5.38138247 5.43946648 5.49095011 5.54011822 5.5907774 5.63893509 5.68967628 5.7451458 5.79333782 5.84114838 5.88670063 5.93217468 5.98333836 6.02588558 6.0694294 6.1160965 6.16148806 6.20725727 6.25651217 6.29230118 6.33833027 6.38373852 6.4275794 6.47168922 6.51751328 6.5566473 6.60616016 6.6522789 6.69800711 6.74834919 6.78965092 6.83717871 6.87985611 6.92813778 6.97552872 7.02751541 7.07597065 7.13128948 7.18166351 7.23601151 7.2918644 7.34694242 7.40694046 7.4618969 7.52510643 7.58809614 7.6514287 7.70860052 7.77167177 7.83237743 7.89663172 7.94875813 8.00668812 8.07018661 8.13850212 8.19770622 8.25863647 8.32291126 8.38955307 8.45116138 8.5210619 8.59123611 8.65917587 8.72982597 8.80588436 8.87338257 8.95594025 9.03695202 9.12445641 9.21598053 9.30912685 9.41392708 9.53728294 9.66899109 9.82696629 10.0016422 10.2473545 10.5559578 10.9582109 11.7515335 19.6634274 ------------------------------ Transdef: Tab for variable 6 15.0261002 20.2058926 24.9171753 26.0305786 27.0679855 28.0337448 29.0080757 29.7848053 30.5792809 31.1438065 31.8037453 32.4138489 32.9169846 33.406189 33.8542709 34.3273735 34.7277679 35.0988693 35.52351 35.9801331 36.3287811 36.6843643 37.0388641 37.4539337 37.8072433 38.178978 38.5505486 38.9931335 39.3855209 39.7122459 40.1525192 40.4873238 40.8386726 41.2452927 41.7054329 42.1866684 42.5750084 42.9860153 43.4933586 43.9817886 44.5365067 45.1969604 45.9251785 46.5986862 47.293663 47.9862862 48.6770935 49.3884239 50.0941238 50.8152237 51.6273575 52.42313 53.283905 54.1333847 54.8622093 55.7770042 56.5505333 57.3739777 58.1702461 58.9636765 59.7845154 60.6946716 61.5688591 62.3870468 63.2420959 64.146019 64.9758606 65.7505264 66.5821304 67.3850555 68.236084 69.0541229 69.8785553 70.6178131 71.3991852 72.1689072 73.0202866 73.8268204 74.6302643 75.3759766 76.2697601 77.0969696 77.9231567 78.7742462 79.676712 80.6169739 81.6054535 82.6485062 83.6769104 85.0457306 86.4259949 88.2522736 90.2616196 92.4053955 94.8572006 98.0255814 101.834167 106.649918 114.27771 127.550018 289.142151 
------------------------------ Transdef: Tab for variable 7 30.1229668 32.4685516 33.3507004 34.0717697 34.7524757 35.3041115 35.8043289 36.2310944 36.6045761 37.0204391 37.4032555 37.7519226 38.1133194 38.386528 38.7240295 38.9603882 39.2274971 39.4760704 39.7733994 40.1115189 40.4787636 40.8869247 41.2253456 41.5862579 41.9845505 42.3357773 42.8254395 43.4701195 44.1528397 44.8598251 45.5382614 46.4940109 47.3744659 48.294487 49.236412 50.4187393 51.4748955 52.6074066 53.5385704 54.6688004 55.6898041 56.5996399 57.6441116 58.6138992 59.5242882 60.5552139 61.4541168 62.3998146 63.462265 64.3361511 65.342041 66.2702713 67.1864166 68.1382446 69.0845032 69.9903641 70.8727875 71.6997375 72.6582642 73.399826 74.2790375 75.0863342 76.0043793 76.8441925 77.7232666 78.611557 79.5679626 80.5653381 81.6773376 82.8279037 83.9437714 85.337738 86.7964554 88.2346725 89.8982391 91.4802856 93.0694351 94.7881622 96.5238266 98.4807358 100.362747 102.174095 104.264542 106.329605 108.508499 110.808044 113.339233 116.145851 119.104736 122.564682 126.119377 129.868103 134.260406 139.135925 145.07724 151.627625 160.40033 171.406036 187.276474 212.580551 582.217407 ------------------------------ Transdef: Tab for variable 8 0.172416463 24.716114 28.815136 30.7734146 31.7074223 32.4931755 33.076931 33.6698532 34.0698547 34.5523376 34.9887772 35.3666687 35.7202606 36.0611115 36.4056015 36.7012482 37.0207748 37.3301926 37.5909119 37.8333015 38.0967941 38.356781 38.6351471 38.8852997 39.1286545 39.3838463 39.7307205 40.0356445 40.3741722 40.6963844 40.9299164 41.3249817 41.7759018 42.0961266 42.528389 43.0403519 43.6034927 44.1487999 44.6918182 45.2421112 45.9020233 46.544117 47.1485634 47.873703 48.6289482 49.357666 50.1259003 50.8374443 51.5228386 52.3492699 53.0613403 53.8111801 54.5738754 55.3318176 56.0379448 56.7839279 57.5098152 58.2611275 58.9829826 59.7071877 60.4550972 61.2153397 62.027359 62.8277588 63.5127716 64.2937622 65.0250778 65.709549 66.4335022 67.1541214 67.8652344 68.631012 69.3079681 70.0514908 70.7884445 71.5296021 72.2529755 72.9892197 73.7158508 74.4067383 75.1837921 75.9786682 76.734024 77.584137 78.4568329 79.3395538 80.2474976 81.2976837 82.4342499 83.733139 85.1579437 86.7836914 88.8045731 91.0409546 93.8578491 97.2450409 101.250458 106.520058 114.060951 129.303635 347.479309 ------------------------------ Transdef: Tab for variable 9 47.1748543 61.9902344 64.168045 65.9818954 67.4324951 68.4282379 69.302597 70.1627197 70.7940903 71.7069092 72.4696655 73.2745285 73.9323273 74.4828796 75.1377258 75.6627655 76.2090454 76.844635 77.2690277 77.8580399 78.3283463 78.8117523 79.3705978 79.8316116 80.4948425 81.1561584 81.7792358 82.4125977 83.0641098 83.6497192 84.5680771 85.3674011 86.3505173 87.3288879 88.5266266 89.4547577 90.640976 91.8923416 93.1973572 94.6124268 96.1756897 97.6289215 99.0561676 100.496391 101.855232 103.24781 104.822739 106.218201 107.688286 109.247818 110.570236 112.125389 113.527374 114.858223 116.315163 117.714134 119.106888 120.418076 121.775154 123.222214 124.527237 125.945984 127.394714 128.639771 130.010925 131.378876 132.735458 133.999573 135.294998 136.687622 137.907867 139.185822 140.383163 141.661346 142.940491 144.182556 145.317886 146.460983 147.677231 148.808563 149.975861 151.160156 152.331284 153.308319 154.417572 155.520111 156.74118 157.84848 159.057053 160.305664 161.568573 163.127533 164.744217 166.565399 168.619049 170.818848 173.930038 177.983643 184.606262 198.802368 459.46109 ------------------------------ Transdef: Tab for variable 10 
0.0050833188 1.1358583 1.4324832 1.65648878 1.81546438 1.92615139 2.02633476 2.10852242 2.18225241 2.23836851 2.29353094 2.33567667 2.37906456 2.41904998 2.45554614 2.49256945 2.52183199 2.55052757 2.57967353 2.6046114 2.62364531 2.64748621 2.66800117 2.68795443 2.70870852 2.72783232 2.74471569 2.76246929 2.77674055 2.79110146 2.80427623 2.81809807 2.82969284 2.84137845 2.85018086 2.85955858 2.86981177 2.87987947 2.88930845 2.89798212 2.90638208 2.91346645 2.92109799 2.92857552 2.93703794 2.94420815 2.95056796 2.95778823 2.96370959 2.96988511 2.97615862 2.98192263 2.98642683 2.99098539 2.99553561 3.00084496 3.00511742 3.00947452 3.01378202 3.01860142 3.02295208 3.02756548 3.03148603 3.03517556 3.03948307 3.04324865 3.04675245 3.05060673 3.05407333 3.05783105 3.06081009 3.06450772 3.06779313 3.07109928 3.0741024 3.07668781 3.07991958 3.08292127 3.08605909 3.08877659 3.09183693 3.09466648 3.09710407 3.09982634 3.10244679 3.1050148 3.10798264 3.11062098 3.11339617 3.11611986 3.1185019 3.1208849 3.12303925 3.1254034 3.12763 3.12984467 3.1322825 3.13438153 3.13678837 3.13912821 3.14159226 ------------------------------ Transdef: Tab for variable 11 4.53725397E-06 0.00583792105 0.0112952292 0.0180200338 0.0238500834 0.0298349913 0.0374715328 0.0456841476 0.0526632071 0.0596122742 0.0659826994 0.0726743937 0.0801063776 0.0864605904 0.093272984 0.0985423028 0.1056301 0.112029374 0.117664978 0.124581695 0.131623 0.137966633 0.144303799 0.150673315 0.15690279 0.162731737 0.168566346 0.174219012 0.180333138 0.186132312 0.191765338 0.196801841 0.20130527 0.205320701 0.209292844 0.21319747 0.21781987 0.222749531 0.227296054 0.231849611 0.236435533 0.241625786 0.246384859 0.250979304 0.255251348 0.25985229 0.264547706 0.269398153 0.274564028 0.279278636 0.283940971 0.288348913 0.2939502 0.299139261 0.304006219 0.309592068 0.315270901 0.320477486 0.325610876 0.331661224 0.336936712 0.34282136 0.348339081 0.353576124 0.359538645 0.365718156 0.37186563 0.378083944 0.384636641 0.390406549 0.397267222 0.403697312 0.410483778 0.41726017 0.424614906 0.430871248 0.437484264 0.445027351 0.452347904 0.459760904 0.467888176 0.475281835 0.483310759 0.492889762 0.50478828 0.515476584 0.526290119 0.535455108 0.547368288 0.562754631 0.576544642 0.589954615 0.60658586 0.622554779 0.642084241 0.666078091 0.692136168 0.72230041 0.76401937 0.821855664 1.12033129 ------------------------------ Transdef: Tab for variable 12 0.200007483 0.206688762 0.213292986 0.219902664 0.225919425 0.231836915 0.237451971 0.242736429 0.248772562 0.254167676 0.25900209 0.264323413 0.269951314 0.275219172 0.280053616 0.28497529 0.289994061 0.294919938 0.30022186 0.305129766 0.310044229 0.314730763 0.319905967 0.324374914 0.329239905 0.334388614 0.339736044 0.345290303 0.34995538 0.354756147 0.359938771 0.364976645 0.369766474 0.374940932 0.380361021 0.385748088 0.390890956 0.395335555 0.400152743 0.404495627 0.408373237 0.412238061 0.415361762 0.419317335 0.422912031 0.426819742 0.43067494 0.434483677 0.438031226 0.441999733 0.445944726 0.450401664 0.454839677 0.45896709 0.46354869 0.467941284 0.4723351 0.476939559 0.481205463 0.485794902 0.490172267 0.49436149 0.499047935 0.504000306 0.5095523 0.514968395 0.520245552 0.524411678 0.530127525 0.536122322 0.542769432 0.548742771 0.554828644 0.561885357 0.568910599 0.57515502 0.581700683 0.588992953 0.59640944 0.603698552 0.611514568 0.618919671 0.628139615 0.6368348 0.646415472 0.655900121 0.664460897 0.674093723 0.684028506 0.697117507 0.710755885 0.72412312 0.737578273 0.750742793 
0.766172409 0.781545162 0.80020988 0.826235056 0.85746187 0.901634216 1.13498271 ------------------------------ Transdef: Tab for variable 13 20.0277634 21.561903 22.3646717 22.9992313 23.5682526 24.177002 24.7486687 25.2937813 25.7392712 26.1203766 26.6488876 27.1073761 27.5225735 27.9238625 28.3349152 28.7649727 29.1663017 29.4418583 29.8084068 30.176918 30.5146484 30.8342133 31.1339989 31.5032082 31.8620682 32.1814957 32.5237579 32.8983459 33.2499695 33.611599 34.0083084 34.3944321 34.7077293 35.0607605 35.4114685 35.7491341 36.0888443 36.3750305 36.7451401 37.1545486 37.5169678 37.8936462 38.2511749 38.616188 38.9808578 39.3556366 39.7252197 40.078598 40.4453697 40.7874908 41.183609 41.5238838 41.9036446 42.3418427 42.7403831 43.1335564 43.5340576 43.9401016 44.3431168 44.7483749 45.1524582 45.5469093 45.979332 46.4255066 46.8474197 47.2497368 47.7556992 48.2163773 48.7419662 49.2273674 49.7334518 50.2760162 50.7716599 51.3159599 51.852272 52.401825 53.0159988 53.735527 54.3883133 55.1271439 55.8279076 56.5851517 57.505127 58.4107742 59.2457275 60.3214035 61.3934021 62.4663887 63.7960434 65.3092041 66.8897705 68.5945053 70.3793335 72.6941299 75.1103821 77.5554352 80.7907867 85.6490326 91.6823349 101.822563 232.066116 ------------------------------ Transdef: Tab for variable 14 10.0028534 10.6761379 11.087388 11.4412775 11.7950211 12.2355251 12.5085602 12.8379002 13.1871557 13.5351677 13.8239813 14.0842218 14.4047947 14.7100124 14.9859467 15.2735577 15.6077518 15.9753361 16.2765293 16.6472359 16.9413528 17.267395 17.6170273 17.8883076 18.1944122 18.5568466 18.8835449 19.1649361 19.4316921 19.8087978 20.1442757 20.4748154 20.81604 21.1439991 21.4567719 21.7687111 22.1235046 22.4882965 22.8248596 23.1446152 23.4877625 23.8011475 24.1449852 24.4885578 24.8424282 25.1973457 25.5319138 25.857708 26.1946259 26.5426254 26.9138107 27.2373581 27.5962296 27.8811607 28.2006264 28.5426846 28.8841896 29.2050476 29.5534782 29.8578873 30.2060928 30.5702286 30.9239769 31.265789 31.5810509 31.9238033 32.2991066 32.6580276 33.0453835 33.3879318 33.7980003 34.1642303 34.5452423 34.9064178 35.318161 35.7330627 36.1749001 36.6199875 37.0681992 37.5133743 37.9846954 38.4718704 39.010994 39.5840073 40.142662 40.7404861 41.3964043 42.0754852 42.8380547 43.5558167 44.3765297 45.3696518 46.5007706 47.7300072 49.06633 50.8017731 52.8928604 55.641243 59.5452766 65.6243439 122.221924 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 68.8 54.6 58.1 34.6 53.1 66.0 59.0 67.5 -24.8 -18.9 -35.8 19.0 34.6 2 68.8 100.0 68.6 65.1 39.2 70.8 94.6 77.1 91.9 -42.2 -24.9 -45.9 33.9 42.7 3 54.6 68.6 100.0 54.6 13.2 37.5 71.8 56.1 73.3 -15.7 -26.7 -50.6 63.2 35.7 4 58.1 65.1 54.6 100.0 22.7 43.0 65.2 58.1 72.0 -10.3 -30.2 -57.8 26.8 75.5 5 34.6 39.2 13.2 22.7 100.0 85.4 12.2 60.4 59.9 24.0 -8.3 -13.6 -1.7 12.4 6 53.1 70.8 37.5 43.0 85.4 100.0 50.1 80.7 81.3 -5.2 -16.0 -28.4 12.8 26.7 7 66.0 94.6 71.8 65.2 12.2 50.1 100.0 68.2 80.7 -51.1 -24.2 -45.8 38.5 43.9 8 59.0 77.1 56.1 58.1 60.4 80.7 68.2 100.0 84.8 -9.0 -24.3 -41.0 26.7 39.2 9 67.5 91.9 73.3 72.0 59.9 81.3 80.7 84.8 100.0 -17.8 -27.4 -51.2 37.7 48.8 10 -24.8 -42.2 -15.7 -10.3 24.0 -5.2 -51.1 -9.0 -17.8 100.0 3.4 6.8 -5.2 -3.7 11 -18.9 -24.9 -26.7 -30.2 -8.3 -16.0 -24.2 -24.3 -27.4 3.4 100.0 54.4 -8.5 -19.0 12 -35.8 -45.9 -50.6 -57.8 -13.6 -28.4 -45.8 -41.0 -51.2 6.8 54.4 100.0 -20.8 -34.9 13 19.0 33.9 63.2 26.8 -1.7 12.8 38.5 26.7 37.7 -5.2 -8.5 -20.8 100.0 52.5 14 
34.6 42.7 35.7 75.5 12.4 26.7 43.9 39.2 48.8 -3.7 -19.0 -34.9 52.5 100.0 TOTAL CORRELATION TO TARGET (diagonal) 176.916416 TOTAL CORRELATION OF ALL VARIABLES 75.4863509 ROUND 1: MAX CORR ( 75.486226) AFTER KILLING INPUT VARIABLE 8 CONTR 0.137347751 ROUND 2: MAX CORR ( 75.4856436) AFTER KILLING INPUT VARIABLE 14 CONTR 0.296518198 ROUND 3: MAX CORR ( 75.4841829) AFTER KILLING INPUT VARIABLE 6 CONTR 0.469587222 ROUND 4: MAX CORR ( 75.4800347) AFTER KILLING INPUT VARIABLE 11 CONTR 0.791351044 ROUND 5: MAX CORR ( 75.4754136) AFTER KILLING INPUT VARIABLE 2 CONTR 0.835211896 ROUND 6: MAX CORR ( 75.3233076) AFTER KILLING INPUT VARIABLE 12 CONTR 4.78929997 ROUND 7: MAX CORR ( 75.0853164) AFTER KILLING INPUT VARIABLE 10 CONTR 5.98297012 ROUND 8: MAX CORR ( 74.316281) AFTER KILLING INPUT VARIABLE 13 CONTR 10.7189134 ROUND 9: MAX CORR ( 73.0462209) AFTER KILLING INPUT VARIABLE 9 CONTR 13.6806156 ROUND 10: MAX CORR ( 72.7451258) AFTER KILLING INPUT VARIABLE 3 CONTR 6.62548581 ROUND 11: MAX CORR ( 71.1964073) AFTER KILLING INPUT VARIABLE 4 CONTR 14.9306701 ROUND 12: MAX CORR ( 65.9685065) AFTER KILLING INPUT VARIABLE 5 CONTR 26.7784347 LAST REMAINING VARIABLE: 7 total correlation to target: 75.4863509 % total significance: 161.708038 sigma correlations of single variables to target: variable 2: 68.776219 % , in sigma: 147.333489 variable 3: 54.5709618 % , in sigma: 116.902765 variable 4: 58.0823312 % , in sigma: 124.424876 variable 5: 34.6287492 % , in sigma: 74.1822467 variable 6: 53.1222687 % , in sigma: 113.799352 variable 7: 65.9685065 % , in sigma: 141.318763 variable 8: 59.0413648 % , in sigma: 126.479332 variable 9: 67.4659187 % , in sigma: 144.526543 variable 10: -24.8375364 % , in sigma: 53.2073578 variable 11: -18.8682906 % , in sigma: 40.4199463 variable 12: -35.769327 % , in sigma: 76.6256102 variable 13: 19.000291 % , in sigma: 40.7027198 variable 14: 34.5518298 % , in sigma: 74.0174687 variables sorted by significance: 1 most relevant variable 7 corr 65.9685059 , in sigma: 141.318762 2 most relevant variable 5 corr 26.7784348 , in sigma: 57.3651806 3 most relevant variable 4 corr 14.9306698 , in sigma: 31.9847137 4 most relevant variable 3 corr 6.6254859 , in sigma: 14.1932192 5 most relevant variable 9 corr 13.6806154 , in sigma: 29.3068277 6 most relevant variable 13 corr 10.7189131 , in sigma: 22.9622228 7 most relevant variable 10 corr 5.98297024 , in sigma: 12.8168122 8 most relevant variable 12 corr 4.78929996 , in sigma: 10.2597131 9 most relevant variable 2 corr 0.835211873 , in sigma: 1.7892039 10 most relevant variable 11 corr 0.79135102 , in sigma: 1.6952445 11 most relevant variable 6 corr 0.469587237 , in sigma: 1.0059571 12 most relevant variable 14 corr 0.296518207 , in sigma: 0.635205927 13 most relevant variable 8 corr 0.137347758 , in sigma: 0.294228509 global correlations between input variables: variable 2: 99.2143741 % variable 3: 92.3338705 % variable 4: 92.8189609 % variable 5: 95.4477941 % variable 6: 95.4585481 % variable 7: 98.8699041 % variable 8: 90.344784 % variable 9: 98.6758098 % variable 10: 71.8671015 % variable 11: 55.1408133 % variable 12: 72.7214958 % variable 13: 84.7673365 % variable 14: 89.1914196 % significance loss when removing single variables: variable 2: corr = 0.876183412 % , sigma = 1.87697377 variable 3: corr = 15.862754 % , sigma = 33.9814389 variable 4: corr = 13.3908163 % , sigma = 28.6860155 variable 5: corr = 16.3278207 % , sigma = 34.9777121 variable 6: corr = 0.485923043 % , sigma = 1.04095192 variable 7: corr = 8.72176771 % , sigma = 
18.6839068 variable 8: corr = 0.137347751 % , sigma = 0.294228495 variable 9: corr = 10.7687563 % , sigma = 23.0689976 variable 10: corr = 5.62971413 % , sigma = 12.0600614 variable 11: corr = 0.778676984 % , sigma = 1.66809398 variable 12: corr = 3.94542546 % , sigma = 8.45195195 variable 13: corr = 7.53152319 % , sigma = 16.1341464 variable 14: corr = 0.297716459 % , sigma = 0.637772841 Keep only 8 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 9 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 4 --> 21.2922974 sigma out 15 active outputs RANK 2 NODE 1 --> 19.7546806 sigma out 15 active outputs RANK 3 NODE 7 --> 17.4194412 sigma out 15 active outputs RANK 4 NODE 3 --> 16.9121742 sigma out 15 active outputs RANK 5 NODE 8 --> 16.7653866 sigma out 15 active outputs RANK 6 NODE 6 --> 16.120182 sigma out 15 active outputs RANK 7 NODE 9 --> 15.5890017 sigma out 15 active outputs RANK 8 NODE 2 --> 12.5477381 sigma out 15 active outputs RANK 9 NODE 5 --> 12.4770508 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 9 --> 25.6585789 sigma in 9act. ( 26.2175236 sig out 1act.) RANK 2 NODE 4 --> 23.3533707 sigma in 9act. ( 21.937233 sig out 1act.) RANK 3 NODE 7 --> 21.3030891 sigma in 9act. ( 22.3590145 sig out 1act.) RANK 4 NODE 6 --> 20.0111904 sigma in 9act. ( 23.9371872 sig out 1act.) RANK 5 NODE 13 --> 13.9328976 sigma in 9act. ( 5.57618427 sig out 1act.) RANK 6 NODE 11 --> 11.1233921 sigma in 9act. ( 9.6035099 sig out 1act.) RANK 7 NODE 12 --> 7.28796148 sigma in 9act. ( 13.4491816 sig out 1act.) RANK 8 NODE 14 --> 4.62367725 sigma in 9act. ( 0.645812571 sig out 1act.) RANK 9 NODE 5 --> 4.16976118 sigma in 9act. ( 1.3065666 sig out 1act.) RANK 10 NODE 2 --> 3.75834775 sigma in 9act. ( 0.728206933 sig out 1act.) RANK 11 NODE 15 --> 3.66697383 sigma in 9act. ( 2.67170787 sig out 1act.) RANK 12 NODE 3 --> 3.63083339 sigma in 9act. ( 5.75285435 sig out 1act.) RANK 13 NODE 1 --> 3.18578076 sigma in 9act. ( 5.09894133 sig out 1act.) RANK 14 NODE 8 --> 2.54924417 sigma in 9act. ( 1.13342774 sig out 1act.) RANK 15 NODE 10 --> 2.23523498 sigma in 9act. ( 2.27396584 sig out 1act.) sorted by output significance RANK 1 NODE 9 --> 26.2175236 sigma out 1act.( 25.6585789 sig in 9act.) RANK 2 NODE 6 --> 23.9371872 sigma out 1act.( 20.0111904 sig in 9act.) RANK 3 NODE 7 --> 22.3590145 sigma out 1act.( 21.3030891 sig in 9act.) RANK 4 NODE 4 --> 21.937233 sigma out 1act.( 23.3533707 sig in 9act.) RANK 5 NODE 12 --> 13.4491816 sigma out 1act.( 7.28796148 sig in 9act.) RANK 6 NODE 11 --> 9.6035099 sigma out 1act.( 11.1233921 sig in 9act.) RANK 7 NODE 3 --> 5.75285435 sigma out 1act.( 3.63083339 sig in 9act.) RANK 8 NODE 13 --> 5.57618427 sigma out 1act.( 13.9328976 sig in 9act.) RANK 9 NODE 1 --> 5.09894133 sigma out 1act.( 3.18578076 sig in 9act.) RANK 10 NODE 15 --> 2.67170787 sigma out 1act.( 3.66697383 sig in 9act.) RANK 11 NODE 10 --> 2.27396584 sigma out 1act.( 2.23523498 sig in 9act.) RANK 12 NODE 5 --> 1.3065666 sigma out 1act.( 4.16976118 sig in 9act.) RANK 13 NODE 8 --> 1.13342774 sigma out 1act.( 2.54924417 sig in 9act.) RANK 14 NODE 2 --> 0.728206933 sigma out 1act.( 3.75834775 sig in 9act.) RANK 15 NODE 14 --> 0.645812571 sigma out 1act.( 4.62367725 sig in 9act.) 
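Stepping back to the preprocessing block above: global flag 812 (individual option 12) equalises each input to a flat distribution, maps it to a Gaussian with mean 0 and sigma 1, and then decorrelates the inputs; the Transdef tables printed above are the per-variable percentile boundaries stored for the flattening step. A compact numpy sketch of that kind of transformation (my own illustration, not the NeuroBayes code, which may differ in detail) is:

```python
import numpy as np
from scipy.special import ndtri  # inverse standard-normal CDF

def equalise_to_gaussian(x):
    """Map a 1-D sample to an approximately standard-normal distribution
    via its empirical CDF (rank transform): 'input variable equalisation
    ... to Gaussian distribution with mean=0 and sigma=1'."""
    ranks = np.argsort(np.argsort(x))
    u = (ranks + 0.5) / len(x)           # flat on (0, 1)
    return ndtri(u)                      # Gaussian, mean 0, sigma 1

def decorrelate(X):
    """Whiten the columns of X so their covariance is the identity
    (a simple stand-in for the NeuroBayes decorrelation step)."""
    Xc = X - X.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    return Xc @ evecs / np.sqrt(evals)
```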
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 51.1962814 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 4 --> 21.232769 sigma out 15 active outputs RANK 2 NODE 1 --> 18.3634949 sigma out 15 active outputs RANK 3 NODE 2 --> 18.2536469 sigma out 15 active outputs RANK 4 NODE 7 --> 18.2085476 sigma out 15 active outputs RANK 5 NODE 8 --> 17.6701927 sigma out 15 active outputs RANK 6 NODE 3 --> 16.3717728 sigma out 15 active outputs RANK 7 NODE 6 --> 15.9089069 sigma out 15 active outputs RANK 8 NODE 9 --> 14.9082489 sigma out 15 active outputs RANK 9 NODE 5 --> 13.3374634 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 9 --> 24.4507523 sigma in 9act. ( 24.2297134 sig out 1act.) RANK 2 NODE 6 --> 21.4713116 sigma in 9act. ( 22.4239655 sig out 1act.) RANK 3 NODE 7 --> 20.0360737 sigma in 9act. ( 20.3848572 sig out 1act.) RANK 4 NODE 4 --> 19.7345619 sigma in 9act. ( 20.1161747 sig out 1act.) RANK 5 NODE 12 --> 16.5872803 sigma in 9act. ( 14.5329056 sig out 1act.) RANK 6 NODE 11 --> 10.2869844 sigma in 9act. ( 8.95228958 sig out 1act.) RANK 7 NODE 3 --> 9.38355255 sigma in 9act. ( 6.35195017 sig out 1act.) RANK 8 NODE 13 --> 8.59664822 sigma in 9act. ( 4.44992733 sig out 1act.) RANK 9 NODE 1 --> 8.15327644 sigma in 9act. ( 6.65273523 sig out 1act.) RANK 10 NODE 2 --> 7.98510599 sigma in 9act. ( 1.02245891 sig out 1act.) RANK 11 NODE 10 --> 7.31138468 sigma in 9act. ( 2.62808728 sig out 1act.) RANK 12 NODE 15 --> 5.94021654 sigma in 9act. ( 3.83534741 sig out 1act.) RANK 13 NODE 8 --> 5.35815859 sigma in 9act. ( 1.69037366 sig out 1act.) RANK 14 NODE 5 --> 5.04207659 sigma in 9act. ( 0.908732951 sig out 1act.) RANK 15 NODE 14 --> 4.53025484 sigma in 9act. ( 0.0451621786 sig out 1act.) sorted by output significance RANK 1 NODE 9 --> 24.2297134 sigma out 1act.( 24.4507523 sig in 9act.) RANK 2 NODE 6 --> 22.4239655 sigma out 1act.( 21.4713116 sig in 9act.) RANK 3 NODE 7 --> 20.3848572 sigma out 1act.( 20.0360737 sig in 9act.) RANK 4 NODE 4 --> 20.1161747 sigma out 1act.( 19.7345619 sig in 9act.) RANK 5 NODE 12 --> 14.5329056 sigma out 1act.( 16.5872803 sig in 9act.) RANK 6 NODE 11 --> 8.95228958 sigma out 1act.( 10.2869844 sig in 9act.) RANK 7 NODE 1 --> 6.65273523 sigma out 1act.( 8.15327644 sig in 9act.) RANK 8 NODE 3 --> 6.35195017 sigma out 1act.( 9.38355255 sig in 9act.) RANK 9 NODE 13 --> 4.44992733 sigma out 1act.( 8.59664822 sig in 9act.) RANK 10 NODE 15 --> 3.83534741 sigma out 1act.( 5.94021654 sig in 9act.) RANK 11 NODE 10 --> 2.62808728 sigma out 1act.( 7.31138468 sig in 9act.) RANK 12 NODE 8 --> 1.69037366 sigma out 1act.( 5.35815859 sig in 9act.) RANK 13 NODE 2 --> 1.02245891 sigma out 1act.( 7.98510599 sig in 9act.) RANK 14 NODE 5 --> 0.908732951 sigma out 1act.( 5.04207659 sig in 9act.) RANK 15 NODE 14 --> 0.0451621786 sigma out 1act.( 4.53025484 sig in 9act.) 
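The "ROUND n: MAX CORR ... AFTER KILLING INPUT VARIABLE k CONTR ..." lines further above are a greedy backward elimination: at each round the input whose removal costs the least total correlation to the target is dropped, and only the 8 most significant inputs are kept. The sketch below is a simplified stand-in for that procedure, using a linear multiple correlation as the figure of merit (an assumption; the NeuroBayes significance measure is not spelled out in the log):

```python
import numpy as np

def kill_order(X, y):
    """Greedy backward elimination by loss of total correlation to target."""
    def total_corr(cols):
        Z = np.c_[np.ones(len(y)), X[:, cols]]          # linear fit on the kept inputs
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        return abs(np.corrcoef(Z @ beta, y)[0, 1])      # multiple correlation R

    remaining = list(range(X.shape[1]))
    rounds = []
    while len(remaining) > 1:
        base = total_corr(remaining)
        contr, victim = min((base - total_corr([c for c in remaining if c != j]), j)
                            for j in remaining)
        rounds.append((victim, contr))                  # variable killed, its contribution
        remaining.remove(victim)
    return rounds, remaining[0]                         # kill order and last survivor
```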
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 48.2930641 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.390305638 *** contribution from regularisation: 0.00307058566 *** contribution from error: -0.393376231 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.445481926 *** contribution from regularisation: 0.00159225345 *** contribution from error: -0.447074175 *********************************************** -----------------> Test sample ENTER BFGS code START -45463.6161 0.128291532 0.0326302424 EXIT FROM BFGS code FG_START 0. 0.128291532 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.480172157 *** contribution from regularisation: 0.00177323434 *** contribution from error: -0.481945395 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -48992.9247 0.128291532 22.7432899 EXIT FROM BFGS code FG_LNSRCH 0. 0.130291402 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.548129976 *** contribution from regularisation: 0.00263876375 *** contribution from error: -0.550768733 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -55926.7985 0.130291402 -2.62981057 EXIT FROM BFGS code NEW_X -55926.7985 0.130291402 -2.62981057 ENTER BFGS code NEW_X -55926.7985 0.130291402 -2.62981057 EXIT FROM BFGS code FG_LNSRCH 0. 0.129840568 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.555171192 *** contribution from regularisation: 0.00285409321 *** contribution from error: -0.558025301 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -56645.2287 0.129840568 -0.64436692 EXIT FROM BFGS code NEW_X -56645.2287 0.129840568 -0.64436692 ENTER BFGS code NEW_X -56645.2287 0.129840568 -0.64436692 EXIT FROM BFGS code FG_LNSRCH 0. 0.129739538 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.558610737 *** contribution from regularisation: 0.00263789762 *** contribution from error: -0.56124866 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -56996.1716 0.129739538 -8.78298473 EXIT FROM BFGS code NEW_X -56996.1716 0.129739538 -8.78298473 ENTER BFGS code NEW_X -56996.1716 0.129739538 -8.78298473 EXIT FROM BFGS code FG_LNSRCH 0. 0.122951329 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.560106933 *** contribution from regularisation: 0.00239842338 *** contribution from error: -0.562505364 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57148.8288 0.122951329 -4.28718233 EXIT FROM BFGS code NEW_X -57148.8288 0.122951329 -4.28718233 ENTER BFGS code NEW_X -57148.8288 0.122951329 -4.28718233 EXIT FROM BFGS code FG_LNSRCH 0. 0.120639011 0. 
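Each "Learn Path" block above reports the value of the training criterion on the test sample split into a "contribution from error" (the entropy loss) and a small "contribution from regularisation". The toy decomposition below echoes that printout with a weighted log-likelihood plus a quadratic weight penalty; the exact NeuroBayes definitions, signs, and normalisation are not given in the log, so this is only indicative:

```python
import numpy as np

def learn_path_report(p, target, event_w, net_w, lam=1e-3):
    """Return (loss, error, regularisation) for network outputs p in (0,1),
    targets 0/1, event weights event_w and flattened network weights net_w."""
    eps = 1e-12
    error = np.average(target * np.log(p + eps) + (1 - target) * np.log(1 - p + eps),
                       weights=event_w)                 # weighted log-likelihood per event
    regularisation = lam * np.sum(net_w ** 2)           # quadratic weight penalty
    loss = error + regularisation                       # cf. Learn Path 1:
    return loss, error, regularisation                  # -0.3903 = -0.3934 + 0.0031
```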
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.560395122 *** contribution from regularisation: 0.00235158531 *** contribution from error: -0.562746704 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57178.2354 0.120639011 -12.3885803 EXIT FROM BFGS code NEW_X -57178.2354 0.120639011 -12.3885803 ENTER BFGS code NEW_X -57178.2354 0.120639011 -12.3885803 EXIT FROM BFGS code FG_LNSRCH 0. 0.106165834 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.561163485 *** contribution from regularisation: 0.0022876279 *** contribution from error: -0.563451111 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57256.6319 0.106165834 -18.2140541 EXIT FROM BFGS code NEW_X -57256.6319 0.106165834 -18.2140541 ENTER BFGS code NEW_X -57256.6319 0.106165834 -18.2140541 EXIT FROM BFGS code FG_LNSRCH 0. 0.0682720467 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 50.0618782 sigma out 15 active outputs RANK 2 NODE 1 --> 32.3471336 sigma out 15 active outputs RANK 3 NODE 7 --> 25.3150902 sigma out 15 active outputs RANK 4 NODE 4 --> 24.9392452 sigma out 15 active outputs RANK 5 NODE 8 --> 21.9147644 sigma out 15 active outputs RANK 6 NODE 9 --> 14.5111523 sigma out 15 active outputs RANK 7 NODE 3 --> 14.2318478 sigma out 15 active outputs RANK 8 NODE 6 --> 11.7665339 sigma out 15 active outputs RANK 9 NODE 5 --> 10.2597227 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 12 --> 58.6375465 sigma in 9act. ( 55.503109 sig out 1act.) RANK 2 NODE 15 --> 23.4923763 sigma in 9act. ( 18.8559361 sig out 1act.) RANK 3 NODE 3 --> 22.4284382 sigma in 9act. ( 21.1306095 sig out 1act.) RANK 4 NODE 1 --> 18.4703732 sigma in 9act. ( 17.8190975 sig out 1act.) RANK 5 NODE 8 --> 15.0009661 sigma in 9act. ( 11.0409698 sig out 1act.) RANK 6 NODE 9 --> 14.9816885 sigma in 9act. ( 14.0814486 sig out 1act.) RANK 7 NODE 10 --> 13.0444899 sigma in 9act. ( 10.9116716 sig out 1act.) RANK 8 NODE 7 --> 11.9488001 sigma in 9act. ( 11.5290575 sig out 1act.) RANK 9 NODE 6 --> 11.6646032 sigma in 9act. ( 9.24900913 sig out 1act.) RANK 10 NODE 14 --> 8.55689907 sigma in 9act. ( 9.02451134 sig out 1act.) RANK 11 NODE 4 --> 7.35986328 sigma in 9act. ( 6.75414801 sig out 1act.) RANK 12 NODE 2 --> 5.7898984 sigma in 9act. ( 2.80923772 sig out 1act.) RANK 13 NODE 11 --> 4.89407587 sigma in 9act. ( 2.66109991 sig out 1act.) RANK 14 NODE 13 --> 4.81094027 sigma in 9act. ( 1.69347119 sig out 1act.) RANK 15 NODE 5 --> 3.39276862 sigma in 9act. ( 1.32725143 sig out 1act.) sorted by output significance RANK 1 NODE 12 --> 55.503109 sigma out 1act.( 58.6375465 sig in 9act.) RANK 2 NODE 3 --> 21.1306095 sigma out 1act.( 22.4284382 sig in 9act.) RANK 3 NODE 15 --> 18.8559361 sigma out 1act.( 23.4923763 sig in 9act.) RANK 4 NODE 1 --> 17.8190975 sigma out 1act.( 18.4703732 sig in 9act.) RANK 5 NODE 9 --> 14.0814486 sigma out 1act.( 14.9816885 sig in 9act.) RANK 6 NODE 7 --> 11.5290575 sigma out 1act.( 11.9488001 sig in 9act.) RANK 7 NODE 8 --> 11.0409698 sigma out 1act.( 15.0009661 sig in 9act.) RANK 8 NODE 10 --> 10.9116716 sigma out 1act.( 13.0444899 sig in 9act.) 
RANK 9 NODE 6 --> 9.24900913 sigma out 1act.( 11.6646032 sig in 9act.) RANK 10 NODE 14 --> 9.02451134 sigma out 1act.( 8.55689907 sig in 9act.) RANK 11 NODE 4 --> 6.75414801 sigma out 1act.( 7.35986328 sig in 9act.) RANK 12 NODE 2 --> 2.80923772 sigma out 1act.( 5.7898984 sig in 9act.) RANK 13 NODE 11 --> 2.66109991 sigma out 1act.( 4.89407587 sig in 9act.) RANK 14 NODE 13 --> 1.69347119 sigma out 1act.( 4.81094027 sig in 9act.) RANK 15 NODE 5 --> 1.32725143 sigma out 1act.( 3.39276862 sig in 9act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 70.7428131 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.562197804 *** contribution from regularisation: 0.00231670169 *** contribution from error: -0.564514518 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -57362.1685 0.0682720467 -14.0283766 EXIT FROM BFGS code NEW_X -57362.1685 0.0682720467 -14.0283766 ENTER BFGS code NEW_X -57362.1685 0.0682720467 -14.0283766 EXIT FROM BFGS code FG_LNSRCH 0. 0.0525807291 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.562924027 *** contribution from regularisation: 0.0022865925 *** contribution from error: -0.56521064 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57436.2661 0.0525807291 -1.35785937 EXIT FROM BFGS code NEW_X -57436.2661 0.0525807291 -1.35785937 ENTER BFGS code NEW_X -57436.2661 0.0525807291 -1.35785937 EXIT FROM BFGS code FG_LNSRCH 0. 0.049658414 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.563601911 *** contribution from regularisation: 0.00212546391 *** contribution from error: -0.565727353 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57505.433 0.049658414 -3.0703342 EXIT FROM BFGS code NEW_X -57505.433 0.049658414 -3.0703342 ENTER BFGS code NEW_X -57505.433 0.049658414 -3.0703342 EXIT FROM BFGS code FG_LNSRCH 0. 0.0503709093 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.563800931 *** contribution from regularisation: 0.00216588052 *** contribution from error: -0.565966785 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57525.7387 0.0503709093 -3.21294069 EXIT FROM BFGS code NEW_X -57525.7387 0.0503709093 -3.21294069 ENTER BFGS code NEW_X -57525.7387 0.0503709093 -3.21294069 EXIT FROM BFGS code FG_LNSRCH 0. 0.0457535237 0. 
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.563922644 *** contribution from regularisation: 0.00221723155 *** contribution from error: -0.566139877 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57538.1573 0.0457535237 -2.05757284 EXIT FROM BFGS code NEW_X -57538.1573 0.0457535237 -2.05757284 ENTER BFGS code NEW_X -57538.1573 0.0457535237 -2.05757284 EXIT FROM BFGS code FG_LNSRCH 0. 0.0251607243 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.564264178 *** contribution from regularisation: 0.00231537502 *** contribution from error: -0.56657958 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57573.0012 0.0251607243 -0.222952574 EXIT FROM BFGS code NEW_X -57573.0012 0.0251607243 -0.222952574 ENTER BFGS code NEW_X -57573.0012 0.0251607243 -0.222952574 EXIT FROM BFGS code FG_LNSRCH 0. 0.0184424352 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.564357638 *** contribution from regularisation: 0.00227750419 *** contribution from error: -0.566635132 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57582.5392 0.0184424352 1.30699992 EXIT FROM BFGS code FG_LNSRCH 0. 0.0216646455 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.564418972 *** contribution from regularisation: 0.00229112292 *** contribution from error: -0.566710114 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57588.7994 0.0216646455 0.561828971 EXIT FROM BFGS code NEW_X -57588.7994 0.0216646455 0.561828971 ENTER BFGS code NEW_X -57588.7994 0.0216646455 0.561828971 EXIT FROM BFGS code FG_LNSRCH 0. 0.0234938525 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.564520299 *** contribution from regularisation: 0.00224191695 *** contribution from error: -0.566762209 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57599.1348 0.0234938525 0.19619599 EXIT FROM BFGS code NEW_X -57599.1348 0.0234938525 0.19619599 ENTER BFGS code NEW_X -57599.1348 0.0234938525 0.19619599 EXIT FROM BFGS code FG_LNSRCH 0. 0.023308618 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.564619243 *** contribution from regularisation: 0.00220588199 *** contribution from error: -0.566825151 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57609.2319 0.023308618 -0.140044644 EXIT FROM BFGS code NEW_X -57609.2319 0.023308618 -0.140044644 ENTER BFGS code NEW_X -57609.2319 0.023308618 -0.140044644 EXIT FROM BFGS code FG_LNSRCH 0. 0.0215438027 0. 
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 67.2703934 sigma out 15 active outputs RANK 2 NODE 1 --> 36.6421585 sigma out 15 active outputs RANK 3 NODE 4 --> 28.8745899 sigma out 15 active outputs RANK 4 NODE 7 --> 24.5059242 sigma out 15 active outputs RANK 5 NODE 3 --> 23.388979 sigma out 15 active outputs RANK 6 NODE 8 --> 21.6255131 sigma out 15 active outputs RANK 7 NODE 9 --> 20.95541 sigma out 15 active outputs RANK 8 NODE 6 --> 12.5712862 sigma out 15 active outputs RANK 9 NODE 5 --> 12.2153454 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 12 --> 78.8778839 sigma in 9act. ( 77.5797958 sig out 1act.) RANK 2 NODE 15 --> 28.0920639 sigma in 9act. ( 23.1340103 sig out 1act.) RANK 3 NODE 3 --> 23.1915016 sigma in 9act. ( 26.4778423 sig out 1act.) RANK 4 NODE 1 --> 18.827816 sigma in 9act. ( 17.2984467 sig out 1act.) RANK 5 NODE 8 --> 17.8863029 sigma in 9act. ( 14.0862446 sig out 1act.) RANK 6 NODE 9 --> 13.6062908 sigma in 9act. ( 11.8856697 sig out 1act.) RANK 7 NODE 10 --> 12.8413849 sigma in 9act. ( 13.4182005 sig out 1act.) RANK 8 NODE 6 --> 12.4123755 sigma in 9act. ( 12.7961226 sig out 1act.) RANK 9 NODE 14 --> 10.5526381 sigma in 9act. ( 11.8108673 sig out 1act.) RANK 10 NODE 7 --> 10.4101734 sigma in 9act. ( 10.4587946 sig out 1act.) RANK 11 NODE 4 --> 7.70792961 sigma in 9act. ( 8.61171818 sig out 1act.) RANK 12 NODE 13 --> 4.82659483 sigma in 9act. ( 3.45084786 sig out 1act.) RANK 13 NODE 11 --> 3.8436799 sigma in 9act. ( 2.19938374 sig out 1act.) RANK 14 NODE 2 --> 3.62485552 sigma in 9act. ( 0.676701128 sig out 1act.) RANK 15 NODE 5 --> 2.39514828 sigma in 9act. ( 0.996787667 sig out 1act.) sorted by output significance RANK 1 NODE 12 --> 77.5797958 sigma out 1act.( 78.8778839 sig in 9act.) RANK 2 NODE 3 --> 26.4778423 sigma out 1act.( 23.1915016 sig in 9act.) RANK 3 NODE 15 --> 23.1340103 sigma out 1act.( 28.0920639 sig in 9act.) RANK 4 NODE 1 --> 17.2984467 sigma out 1act.( 18.827816 sig in 9act.) RANK 5 NODE 8 --> 14.0862446 sigma out 1act.( 17.8863029 sig in 9act.) RANK 6 NODE 10 --> 13.4182005 sigma out 1act.( 12.8413849 sig in 9act.) RANK 7 NODE 6 --> 12.7961226 sigma out 1act.( 12.4123755 sig in 9act.) RANK 8 NODE 9 --> 11.8856697 sigma out 1act.( 13.6062908 sig in 9act.) RANK 9 NODE 14 --> 11.8108673 sigma out 1act.( 10.5526381 sig in 9act.) RANK 10 NODE 7 --> 10.4587946 sigma out 1act.( 10.4101734 sig in 9act.) RANK 11 NODE 4 --> 8.61171818 sigma out 1act.( 7.70792961 sig in 9act.) RANK 12 NODE 13 --> 3.45084786 sigma out 1act.( 4.82659483 sig in 9act.) RANK 13 NODE 11 --> 2.19938374 sigma out 1act.( 3.8436799 sig in 9act.) RANK 14 NODE 5 --> 0.996787667 sigma out 1act.( 2.39514828 sig in 9act.) RANK 15 NODE 2 --> 0.676701128 sigma out 1act.( 3.62485552 sig in 9act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 92.6220398 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.564724743 *** contribution from regularisation: 0.00216083392 *** contribution from error: -0.566885591 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -57619.995 0.0215438027 1.12679827 EXIT FROM BFGS code NEW_X -57619.995 0.0215438027 1.12679827 ENTER BFGS code NEW_X -57619.995 0.0215438027 1.12679827 EXIT FROM BFGS code FG_LNSRCH 0. 0.0186955053 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.564760983 *** contribution from regularisation: 0.00217876164 *** contribution from error: -0.566939771 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57623.6956 0.0186955053 -1.41288221 EXIT FROM BFGS code NEW_X -57623.6956 0.0186955053 -1.41288221 ENTER BFGS code NEW_X -57623.6956 0.0186955053 -1.41288221 EXIT FROM BFGS code FG_LNSRCH 0. 0.0179278888 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.564796627 *** contribution from regularisation: 0.00224915729 *** contribution from error: -0.567045808 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57627.3268 0.0179278888 -0.0756695047 EXIT FROM BFGS code NEW_X -57627.3268 0.0179278888 -0.0756695047 ENTER BFGS code NEW_X -57627.3268 0.0179278888 -0.0756695047 EXIT FROM BFGS code FG_LNSRCH 0. 0.017334247 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.564858913 *** contribution from regularisation: 0.00226217252 *** contribution from error: -0.567121089 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57633.686 0.017334247 0.708586216 EXIT FROM BFGS code NEW_X -57633.686 0.017334247 0.708586216 ENTER BFGS code NEW_X -57633.686 0.017334247 0.708586216 EXIT FROM BFGS code FG_LNSRCH 0. 0.016217865 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.564925432 *** contribution from regularisation: 0.00227211718 *** contribution from error: -0.567197561 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57640.4735 0.016217865 0.891379356 EXIT FROM BFGS code NEW_X -57640.4735 0.016217865 0.891379356 ENTER BFGS code NEW_X -57640.4735 0.016217865 0.891379356 EXIT FROM BFGS code FG_LNSRCH 0. 0.0120265549 0. 
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.565155208 *** contribution from regularisation: 0.0022917958 *** contribution from error: -0.567447007 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57663.9184 0.0120265549 2.60776186 EXIT FROM BFGS code NEW_X -57663.9184 0.0120265549 2.60776186 ENTER BFGS code NEW_X -57663.9184 0.0120265549 2.60776186 EXIT FROM BFGS code FG_LNSRCH 0. 0.00724858185 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.565259993 *** contribution from regularisation: 0.00242279144 *** contribution from error: -0.567682803 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57674.6051 0.00724858185 -2.60021257 EXIT FROM BFGS code NEW_X -57674.6051 0.00724858185 -2.60021257 ENTER BFGS code NEW_X -57674.6051 0.00724858185 -2.60021257 EXIT FROM BFGS code FG_LNSRCH 0. 0.00735506695 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.565381229 *** contribution from regularisation: 0.00242434023 *** contribution from error: -0.567805588 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57686.976 0.00735506695 -0.322948515 EXIT FROM BFGS code NEW_X -57686.976 0.00735506695 -0.322948515 ENTER BFGS code NEW_X -57686.976 0.00735506695 -0.322948515 EXIT FROM BFGS code FG_LNSRCH 0. 0.00767611386 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.565419912 *** contribution from regularisation: 0.00240778341 *** contribution from error: -0.567827702 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57690.9229 0.00767611386 0.459822953 EXIT FROM BFGS code NEW_X -57690.9229 0.00767611386 0.459822953 ENTER BFGS code NEW_X -57690.9229 0.00767611386 0.459822953 EXIT FROM BFGS code FG_LNSRCH 0. 0.00774535909 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.565527737 *** contribution from regularisation: 0.00238222629 *** contribution from error: -0.567909956 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57701.9234 0.00774535909 0.0445609987 EXIT FROM BFGS code NEW_X -57701.9234 0.00774535909 0.0445609987 ENTER BFGS code NEW_X -57701.9234 0.00774535909 0.0445609987 EXIT FROM BFGS code FG_LNSRCH 0. 0.00730892876 0. 
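Every tenth iteration the teacher writes the current network to rescue.nb ("SAVING EXPERTISE TO rescue.nb" at iterations 10 and 20 above, and again at iteration 30 just below), so a crashed training can be resumed from the last snapshot. A comparable checkpoint-every-N-iterations pattern around SciPy's L-BFGS-B optimiser, offered purely as an analogy (NeuroBayes ships its own BFGS implementation), looks like:

```python
import numpy as np
from scipy.optimize import minimize

def train_with_rescue(loss_and_grad, x0, rescue_path="rescue.npy", every=10):
    """Run (L-)BFGS and snapshot the parameter vector every `every` iterations.
    loss_and_grad(x) must return (loss, gradient)."""
    state = {"it": 0}

    def callback(xk):
        state["it"] += 1
        if state["it"] % every == 0:
            np.save(rescue_path, xk)     # "write out current network"

    return minimize(loss_and_grad, x0, jac=True,
                    method="L-BFGS-B", callback=callback,
                    options={"maxiter": 250})   # NB_Iter : 250 in the steering options
```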
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 80.8048935 sigma out 15 active outputs
RANK 2 NODE 1 --> 42.917057 sigma out 15 active outputs
RANK 3 NODE 3 --> 36.755867 sigma out 15 active outputs
RANK 4 NODE 7 --> 33.5483589 sigma out 15 active outputs
RANK 5 NODE 4 --> 28.4589329 sigma out 15 active outputs
RANK 6 NODE 9 --> 23.6983509 sigma out 15 active outputs
RANK 7 NODE 8 --> 22.5846767 sigma out 15 active outputs
RANK 8 NODE 5 --> 19.2436905 sigma out 15 active outputs
RANK 9 NODE 6 --> 17.7180004 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 12 --> 96.4181747 sigma in 9 act. ( 93.3630066 sig out 1 act.)
RANK 2 NODE 15 --> 37.7735939 sigma in 9 act. ( 34.7292366 sig out 1 act.)
RANK 3 NODE 1 --> 26.7829971 sigma in 9 act. ( 27.6621151 sig out 1 act.)
RANK 4 NODE 8 --> 20.1073284 sigma in 9 act. ( 17.3811035 sig out 1 act.)
RANK 5 NODE 3 --> 19.0236912 sigma in 9 act. ( 20.6112671 sig out 1 act.)
RANK 6 NODE 6 --> 17.4025726 sigma in 9 act. ( 20.7148838 sig out 1 act.)
RANK 7 NODE 14 --> 15.8688183 sigma in 9 act. ( 17.0750828 sig out 1 act.)
RANK 8 NODE 10 --> 13.928133 sigma in 9 act. ( 17.1994934 sig out 1 act.)
RANK 9 NODE 9 --> 13.3517056 sigma in 9 act. ( 12.3592062 sig out 1 act.)
RANK 10 NODE 4 --> 12.7864742 sigma in 9 act. ( 16.5412388 sig out 1 act.)
RANK 11 NODE 13 --> 7.35337496 sigma in 9 act. ( 7.82241917 sig out 1 act.)
RANK 12 NODE 7 --> 6.90926313 sigma in 9 act. ( 6.54020452 sig out 1 act.)
RANK 13 NODE 11 --> 2.90187645 sigma in 9 act. ( 2.84253216 sig out 1 act.)
RANK 14 NODE 2 --> 2.17087221 sigma in 9 act. ( 1.42692041 sig out 1 act.)
RANK 15 NODE 5 --> 1.18964362 sigma in 9 act. ( 0.376557052 sig out 1 act.)
sorted by output significance
RANK 1 NODE 12 --> 93.3630066 sigma out 1 act. ( 96.4181747 sig in 9 act.)
RANK 2 NODE 15 --> 34.7292366 sigma out 1 act. ( 37.7735939 sig in 9 act.)
RANK 3 NODE 1 --> 27.6621151 sigma out 1 act. ( 26.7829971 sig in 9 act.)
RANK 4 NODE 6 --> 20.7148838 sigma out 1 act. ( 17.4025726 sig in 9 act.)
RANK 5 NODE 3 --> 20.6112671 sigma out 1 act. ( 19.0236912 sig in 9 act.)
RANK 6 NODE 8 --> 17.3811035 sigma out 1 act. ( 20.1073284 sig in 9 act.)
RANK 7 NODE 10 --> 17.1994934 sigma out 1 act. ( 13.928133 sig in 9 act.)
RANK 8 NODE 14 --> 17.0750828 sigma out 1 act. ( 15.8688183 sig in 9 act.)
RANK 9 NODE 4 --> 16.5412388 sigma out 1 act. ( 12.7864742 sig in 9 act.)
RANK 10 NODE 9 --> 12.3592062 sigma out 1 act. ( 13.3517056 sig in 9 act.)
RANK 11 NODE 13 --> 7.82241917 sigma out 1 act. ( 7.35337496 sig in 9 act.)
RANK 12 NODE 7 --> 6.54020452 sigma out 1 act. ( 6.90926313 sig in 9 act.)
RANK 13 NODE 11 --> 2.84253216 sigma out 1 act. ( 2.90187645 sig in 9 act.)
RANK 14 NODE 2 --> 1.42692041 sigma out 1 act. ( 2.17087221 sig in 9 act.)
RANK 15 NODE 5 --> 0.376557052 sigma out 1 act. ( 1.18964362 sig in 9 act.)
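The significance tables above rank each node by how many standard deviations its connection weights stand out from zero, once on the input side and once on the output side of the layer. The sketch below reproduces only the ranking step for the layer-1 numbers just printed; it is an illustration of how such a RANK table is ordered, not the NeuroBayes code.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

struct NodeSignificance {
    int node;
    double sigma;   // significance of the node's connections, in standard deviations
};

int main() {
    // Layer-1 output significances quoted from the iteration-30 block above.
    std::vector<NodeSignificance> nodes = {
        {1, 42.917057}, {2, 80.8048935}, {3, 36.755867}, {4, 28.4589329},
        {5, 19.2436905}, {6, 17.7180004}, {7, 33.5483589}, {8, 22.5846767},
        {9, 23.6983509}};

    // Rank by decreasing significance, exactly as the log's RANK column does.
    std::sort(nodes.begin(), nodes.end(),
              [](const NodeSignificance& a, const NodeSignificance& b) {
                  return a.sigma > b.sigma;
              });

    for (std::size_t rank = 0; rank < nodes.size(); ++rank)
        std::printf("RANK %2zu NODE %2d --> %10.7g sigma\n",
                    rank + 1, nodes[rank].node, nodes[rank].sigma);
    return 0;
}
```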
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 113.894531 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.565575361
*** contribution from regularisation: 0.00238906988
*** contribution from error: -0.567964435
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
*****   write out current network        ****
*****   to "rescue.nb"                   ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -57706.7868 0.00730892876 2.88431668
EXIT FROM BFGS code NEW_X -57706.7868 0.00730892876 2.88431668
ENTER BFGS code NEW_X -57706.7868 0.00730892876 2.88431668
EXIT FROM BFGS code FG_LNSRCH 0. 0.00673182355 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.565618455
*** contribution from regularisation: 0.00246919133
*** contribution from error: -0.568087637
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57711.1794 0.00673182355 -4.8827343
EXIT FROM BFGS code NEW_X -57711.1794 0.00673182355 -4.8827343
ENTER BFGS code NEW_X -57711.1794 0.00673182355 -4.8827343
EXIT FROM BFGS code FG_LNSRCH 0. 0.0053793448 0.
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.565685272
*** contribution from regularisation: 0.00249952264
*** contribution from error: -0.568184793
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57718. 0.0053793448 -0.916584432
EXIT FROM BFGS code NEW_X -57718. 0.0053793448 -0.916584432
ENTER BFGS code NEW_X -57718. 0.0053793448 -0.916584432
EXIT FROM BFGS code FG_LNSRCH 0. 0.00431329897 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.565762162
*** contribution from regularisation: 0.00250090542
*** contribution from error: -0.568263054
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57725.8439 0.00431329897 -1.9841392
EXIT FROM BFGS code NEW_X -57725.8439 0.00431329897 -1.9841392
ENTER BFGS code NEW_X -57725.8439 0.00431329897 -1.9841392
EXIT FROM BFGS code FG_LNSRCH 0. 0.00207744306 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.565882921
*** contribution from regularisation: 0.00251672883
*** contribution from error: -0.568399668
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57738.1637 0.00207744306 3.21091104
EXIT FROM BFGS code NEW_X -57738.1637 0.00207744306 3.21091104
ENTER BFGS code NEW_X -57738.1637 0.00207744306 3.21091104
EXIT FROM BFGS code FG_LNSRCH 0. -0.00207426376 0.
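Every tenth iteration (Iteration No: 20, 30, 40 above) the current network is written to rescue.nb, so an interrupted training still leaves a usable snapshot behind before the final expert.nb is exported. A minimal sketch of that checkpointing pattern, with a hypothetical saveExpertise helper standing in for the actual NeuroBayes write-out:

```cpp
#include <iostream>
#include <string>

// Hypothetical helper standing in for the actual NeuroBayes expertise writer.
void saveExpertise(const std::string& filename) {
    std::cout << "SAVING EXPERTISE TO " << filename << '\n';
}

void trainingLoop(int nIterations, int checkpointInterval = 10) {
    for (int iter = 1; iter <= nIterations; ++iter) {
        // ... one BFGS learn path would run here ...
        if (iter % checkpointInterval == 0)
            saveExpertise("rescue.nb");   // periodic rescue snapshot, as in the log
    }
    saveExpertise("expert.nb");           // final expertise at the end of learning
}

int main() { trainingLoop(50); return 0; }
```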
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.566088378
*** contribution from regularisation: 0.00260133971
*** contribution from error: -0.568689704
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57759.129 -0.00207426376 4.16771936
EXIT FROM BFGS code NEW_X -57759.129 -0.00207426376 4.16771936
ENTER BFGS code NEW_X -57759.129 -0.00207426376 4.16771936
EXIT FROM BFGS code FG_LNSRCH 0. -0.0044295392 0.
---------------------------------------------------
Iteration : 36
***********************************************
*** Learn Path 36
*** loss function: -0.565245092
*** contribution from regularisation: 0.00264581223
*** contribution from error: -0.567890882
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57673.0884 -0.0044295392 10.6377277
EXIT FROM BFGS code FG_LNSRCH 0. -0.00246139779 0.
---------------------------------------------------
Iteration : 37
***********************************************
*** Learn Path 37
*** loss function: -0.566174805
*** contribution from regularisation: 0.00256313314
*** contribution from error: -0.568737924
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57767.9503 -0.00246139779 5.0171771
EXIT FROM BFGS code NEW_X -57767.9503 -0.00246139779 5.0171771
ENTER BFGS code NEW_X -57767.9503 -0.00246139779 5.0171771
EXIT FROM BFGS code FG_LNSRCH 0. -0.00188075786 0.
---------------------------------------------------
Iteration : 38
***********************************************
*** Learn Path 38
*** loss function: -0.566191435
*** contribution from regularisation: 0.00257345918
*** contribution from error: -0.568764865
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57769.6418 -0.00188075786 -1.39216614
EXIT FROM BFGS code NEW_X -57769.6418 -0.00188075786 -1.39216614
ENTER BFGS code NEW_X -57769.6418 -0.00188075786 -1.39216614
EXIT FROM BFGS code FG_LNSRCH 0. -0.00147700531 0.
---------------------------------------------------
Iteration : 39
***********************************************
*** Learn Path 39
*** loss function: -0.566212773
*** contribution from regularisation: 0.00256466633
*** contribution from error: -0.568777442
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57771.8214 -0.00147700531 -2.78954673
EXIT FROM BFGS code NEW_X -57771.8214 -0.00147700531 -2.78954673
ENTER BFGS code NEW_X -57771.8214 -0.00147700531 -2.78954673
EXIT FROM BFGS code FG_LNSRCH 0. -0.00080117106 0.
---------------------------------------------------
Iteration : 40
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 92.6615677 sigma out 15 active outputs
RANK 2 NODE 1 --> 53.7818947 sigma out 15 active outputs
RANK 3 NODE 3 --> 43.051712 sigma out 15 active outputs
RANK 4 NODE 7 --> 42.2888069 sigma out 15 active outputs
RANK 5 NODE 4 --> 39.3308678 sigma out 15 active outputs
RANK 6 NODE 9 --> 29.0919628 sigma out 15 active outputs
RANK 7 NODE 8 --> 28.4403172 sigma out 15 active outputs
RANK 8 NODE 6 --> 25.1506805 sigma out 15 active outputs
RANK 9 NODE 5 --> 23.1096783 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 12 --> 108.491058 sigma in 9 act. ( 104.626869 sig out 1 act.)
RANK 2 NODE 15 --> 53.7018433 sigma in 9 act. ( 50.9718285 sig out 1 act.)
RANK 3 NODE 1 --> 40.6661491 sigma in 9 act. ( 44.4672356 sig out 1 act.)
RANK 4 NODE 14 --> 27.4572258 sigma in 9 act. ( 29.4347763 sig out 1 act.)
RANK 5 NODE 6 --> 22.177784 sigma in 9 act. ( 27.5585251 sig out 1 act.)
RANK 6 NODE 8 --> 21.487011 sigma in 9 act. ( 20.2806911 sig out 1 act.)
RANK 7 NODE 3 --> 20.2594395 sigma in 9 act. ( 22.7664566 sig out 1 act.)
RANK 8 NODE 10 --> 17.9251308 sigma in 9 act. ( 21.7250423 sig out 1 act.)
RANK 9 NODE 9 --> 17.8082199 sigma in 9 act. ( 18.2321854 sig out 1 act.)
RANK 10 NODE 4 --> 17.2813129 sigma in 9 act. ( 21.2588463 sig out 1 act.)
RANK 11 NODE 13 --> 9.60346794 sigma in 9 act. ( 10.4994316 sig out 1 act.)
RANK 12 NODE 7 --> 4.99969769 sigma in 9 act. ( 4.58804655 sig out 1 act.)
RANK 13 NODE 11 --> 2.87373137 sigma in 9 act. ( 2.73504663 sig out 1 act.)
RANK 14 NODE 2 --> 0.676227152 sigma in 9 act. ( 0.680383146 sig out 1 act.)
RANK 15 NODE 5 --> 0.57011348 sigma in 9 act. ( 0.252560645 sig out 1 act.)
sorted by output significance
RANK 1 NODE 12 --> 104.626869 sigma out 1 act. ( 108.491058 sig in 9 act.)
RANK 2 NODE 15 --> 50.9718285 sigma out 1 act. ( 53.7018433 sig in 9 act.)
RANK 3 NODE 1 --> 44.4672356 sigma out 1 act. ( 40.6661491 sig in 9 act.)
RANK 4 NODE 14 --> 29.4347763 sigma out 1 act. ( 27.4572258 sig in 9 act.)
RANK 5 NODE 6 --> 27.5585251 sigma out 1 act. ( 22.177784 sig in 9 act.)
RANK 6 NODE 3 --> 22.7664566 sigma out 1 act. ( 20.2594395 sig in 9 act.)
RANK 7 NODE 10 --> 21.7250423 sigma out 1 act. ( 17.9251308 sig in 9 act.)
RANK 8 NODE 4 --> 21.2588463 sigma out 1 act. ( 17.2813129 sig in 9 act.)
RANK 9 NODE 8 --> 20.2806911 sigma out 1 act. ( 21.487011 sig in 9 act.)
RANK 10 NODE 9 --> 18.2321854 sigma out 1 act. ( 17.8082199 sig in 9 act.)
RANK 11 NODE 13 --> 10.4994316 sigma out 1 act. ( 9.60346794 sig in 9 act.)
RANK 12 NODE 7 --> 4.58804655 sigma out 1 act. ( 4.99969769 sig in 9 act.)
RANK 13 NODE 11 --> 2.73504663 sigma out 1 act. ( 2.87373137 sig in 9 act.)
RANK 14 NODE 2 --> 0.680383146 sigma out 1 act. ( 0.676227152 sig in 9 act.)
RANK 15 NODE 5 --> 0.252560645 sigma out 1 act. ( 0.57011348 sig in 9 act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 139.546997 sigma in 15 active inputs
***********************************************
*** Learn Path 40
*** loss function: -0.566264153
*** contribution from regularisation: 0.002530789
*** contribution from error: -0.568794966
***********************************************
-----------------> Test sample
Iteration No: 40
**********************************************
*****   write out current network        ****
*****   to "rescue.nb"                   ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -57777.0641 -0.00080117106 -9.63596249
EXIT FROM BFGS code NEW_X -57777.0641 -0.00080117106 -9.63596249
ENTER BFGS code NEW_X -57777.0641 -0.00080117106 -9.63596249
EXIT FROM BFGS code FG_LNSRCH 0. -0.00129668228 0.
---------------------------------------------------
Iteration : 41
***********************************************
*** Learn Path 41
*** loss function: -0.566279292
*** contribution from regularisation: 0.00251026498
*** contribution from error: -0.568789542
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57778.6066 -0.00129668228 -1.88437736
EXIT FROM BFGS code NEW_X -57778.6066 -0.00129668228 -1.88437736
ENTER BFGS code NEW_X -57778.6066 -0.00129668228 -1.88437736
EXIT FROM BFGS code FG_LNSRCH 0. -0.00189405191 0.
---------------------------------------------------
Iteration : 42
***********************************************
*** Learn Path 42
*** loss function: -0.566271663
*** contribution from regularisation: 0.00255138613
*** contribution from error: -0.56882304
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57777.8318 -0.00189405191 -2.95811772
EXIT FROM BFGS code FG_LNSRCH 0. -0.00143488287 0.
---------------------------------------------------
Iteration : 43
***********************************************
*** Learn Path 43
*** loss function: -0.566219211
*** contribution from regularisation: 0.00258240569
*** contribution from error: -0.568801641
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57772.4798 -0.00143488287 -2.15194416
EXIT FROM BFGS code FG_LNSRCH 0. -0.00130174507 0.
---------------------------------------------------
Iteration : 44
***********************************************
*** Learn Path 44
*** loss function: -0.566219687
*** contribution from regularisation: 0.00257034157
*** contribution from error: -0.568790019
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57772.5279 -0.00130174507 -1.90283
EXIT FROM BFGS code FG_LNSRCH 0. -0.00129669043 0.
---------------------------------------------------
Iteration : 45
***********************************************
*** Learn Path 45
*** loss function: -0.566220582
*** contribution from regularisation: 0.00256898394
*** contribution from error: -0.568789542
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57772.6154 -0.00129669043 -1.89577866
EXIT FROM BFGS code FG_LNSRCH 0. -0.00129668228 0.
---------------------------------------------------
Iteration : 46
***********************************************
*** Learn Path 46
*** loss function: -0.566224575
*** contribution from regularisation: 0.00256497948
*** contribution from error: -0.568789542
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57773.0239 -0.00129668228 -1.89838684
EXIT FROM BFGS code FG_LNSRCH 0. -0.00129668228 0.
---------------------------------------------------
Iteration : 47
***********************************************
*** Learn Path 47
*** loss function: -0.56621623
*** contribution from regularisation: 0.00257330015
*** contribution from error: -0.568789542
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57772.1749 -0.00129668228 -1.90119505
EXIT FROM BFGS code FG_LNSRCH 0. -0.00129668228 0.
---------------------------------------------------
Iteration : 48
***********************************************
*** Learn Path 48
*** loss function: -0.566218793
*** contribution from regularisation: 0.00257072458
*** contribution from error: -0.568789542
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -57772.4377 -0.00129668228 -1.90398633
EXIT FROM BFGS code NEW_X -57772.4377 -0.00129668228 -1.90398633
ENTER BFGS code NEW_X -57772.4377 -0.00129668228 -1.90398633
EXIT FROM BFGS code CONVERGENC -57772.4377 -0.00129668228 -1.90398633
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 135.470261 sigma out 15 active outputs
RANK 2 NODE 1 --> 78.5601654 sigma out 15 active outputs
RANK 3 NODE 7 --> 59.4712524 sigma out 15 active outputs
RANK 4 NODE 3 --> 59.4548416 sigma out 15 active outputs
RANK 5 NODE 4 --> 57.5851593 sigma out 15 active outputs
RANK 6 NODE 9 --> 43.6495323 sigma out 15 active outputs
RANK 7 NODE 8 --> 40.1085052 sigma out 15 active outputs
RANK 8 NODE 6 --> 35.9660072 sigma out 15 active outputs
RANK 9 NODE 5 --> 31.6321182 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 12 --> 157.48613 sigma in 9 act. ( 154.509201 sig out 1 act.)
RANK 2 NODE 15 --> 77.5516815 sigma in 9 act. ( 71.4011078 sig out 1 act.)
RANK 3 NODE 1 --> 56.3154678 sigma in 9 act. ( 62.7624779 sig out 1 act.)
RANK 4 NODE 14 --> 40.179882 sigma in 9 act. ( 43.7789803 sig out 1 act.)
RANK 5 NODE 6 --> 32.3162308 sigma in 9 act. ( 40.4280815 sig out 1 act.)
RANK 6 NODE 8 --> 30.922411 sigma in 9 act. ( 29.8485222 sig out 1 act.)
RANK 7 NODE 3 --> 29.4082832 sigma in 9 act. ( 33.4174232 sig out 1 act.)
RANK 8 NODE 10 --> 26.2061806 sigma in 9 act. ( 32.6403389 sig out 1 act.)
RANK 9 NODE 9 --> 25.9958553 sigma in 9 act. ( 26.6372166 sig out 1 act.)
RANK 10 NODE 4 --> 25.3600655 sigma in 9 act. ( 31.9261436 sig out 1 act.)
RANK 11 NODE 13 --> 13.4455061 sigma in 9 act. ( 14.9753933 sig out 1 act.)
RANK 12 NODE 7 --> 7.1119175 sigma in 9 act. ( 6.80021572 sig out 1 act.)
RANK 13 NODE 11 --> 3.76218224 sigma in 9 act. ( 3.74094319 sig out 1 act.)
RANK 14 NODE 2 --> 0.909772336 sigma in 9 act. ( 0.979646266 sig out 1 act.)
RANK 15 NODE 5 --> 0.605353594 sigma in 9 act. ( 0.366434455 sig out 1 act.)
sorted by output significance
RANK 1 NODE 12 --> 154.509201 sigma out 1 act. ( 157.48613 sig in 9 act.)
RANK 2 NODE 15 --> 71.4011078 sigma out 1 act. ( 77.5516815 sig in 9 act.)
RANK 3 NODE 1 --> 62.7624779 sigma out 1 act. ( 56.3154678 sig in 9 act.)
RANK 4 NODE 14 --> 43.7789803 sigma out 1 act. ( 40.179882 sig in 9 act.)
RANK 5 NODE 6 --> 40.4280815 sigma out 1 act. ( 32.3162308 sig in 9 act.)
RANK 6 NODE 3 --> 33.4174232 sigma out 1 act. ( 29.4082832 sig in 9 act.)
RANK 7 NODE 10 --> 32.6403389 sigma out 1 act. ( 26.2061806 sig in 9 act.)
RANK 8 NODE 4 --> 31.9261436 sigma out 1 act. ( 25.3600655 sig in 9 act.)
RANK 9 NODE 8 --> 29.8485222 sigma out 1 act. ( 30.922411 sig in 9 act.)
RANK 10 NODE 9 --> 26.6372166 sigma out 1 act. ( 25.9958553 sig in 9 act.)
RANK 11 NODE 13 --> 14.9753933 sigma out 1 act. ( 13.4455061 sig in 9 act.)
RANK 12 NODE 7 --> 6.80021572 sigma out 1 act. ( 7.1119175 sig in 9 act.)
RANK 13 NODE 11 --> 3.74094319 sigma out 1 act. ( 3.76218224 sig in 9 act.)
RANK 14 NODE 2 --> 0.979646266 sigma out 1 act. ( 0.909772336 sig in 9 act.)
RANK 15 NODE 5 --> 0.366434455 sigma out 1 act. ( 0.605353594 sig in 9 act.)
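Over iterations 45-48 the error contribution is pinned at -0.568789542 and the line-search parameter at -0.00129668228; at iteration 48 the minimiser reports CONVERGENC, and the Learn Path 250 summary below quotes the same converged error, so none of the remaining scheduled iterations changes the network further. A sketch of the kind of relative-change stopping test that produces this behaviour; the window and tolerance are illustrative assumptions, not the NeuroBayes settings.

```cpp
#include <cmath>
#include <iostream>
#include <vector>

// Illustrative stopping rule: stop when the loss has improved by less than a
// relative tolerance over the last few iterations.  Both the tolerance and the
// window are assumptions made for this sketch only.
bool hasConverged(const std::vector<double>& lossHistory,
                  double relTol = 1e-6, int window = 5) {
    if (static_cast<int>(lossHistory.size()) <= window) return false;
    const double now    = lossHistory.back();
    const double before = lossHistory[lossHistory.size() - 1 - window];
    return std::fabs(now - before) < relTol * std::fabs(before);
}

int main() {
    // Loss-function values quoted from Learn Paths 42-48 above.
    std::vector<double> history = {-0.566271663, -0.566219211, -0.566219687,
                                   -0.566220582, -0.566224575, -0.56621623,
                                   -0.566218793};
    std::cout << (hasConverged(history) ? "converged" : "keep training") << '\n';
    return 0;
}
```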
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 203.835312 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.566217005
*** contribution from regularisation: 0.00257251202
*** contribution from error: -0.568789542
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 29081
Closing output file
done
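The Learn Path blocks are the only record of how the loss evolved during training, so it can be handy to scrape them back out of a saved copy of this printout to plot a training curve or compare two trainings. A small standalone sketch; the log filename is a placeholder, not a file produced by the job itself.

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main() {
    // Placeholder filename: a text copy of the teacher printout shown above.
    std::ifstream log("teacher.log");
    std::vector<double> losses;

    std::string line;
    while (std::getline(log, line)) {
        const std::string key = "loss function:";
        const std::size_t pos = line.find(key);
        if (pos == std::string::npos) continue;
        std::istringstream value(line.substr(pos + key.size()));
        double loss = 0.0;
        if (value >> loss) losses.push_back(loss);   // one entry per Learn Path block
    }

    for (std::size_t i = 0; i < losses.size(); ++i)
        std::cout << "entry " << i << "  loss " << losses[i] << '\n';
    return 0;
}
```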