NNInput NNInputs_135.root
Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 40744 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1
                 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 10300 nbkg = 30444
Bkg Entries: 30444
Sig Entries: 10300
Chosen entries: 10300
Signal fraction: 1
Background fraction: 0.338326
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 30444
Actual Signal Entries: 10300
Entries to split: 10300
Test with : 5150
Train with : 5150
*********************************************
*  This product is licenced for educational *
*  and scientific use only. Commercial use  *
*  is prohibited !                          *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
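For reference, the selection and splitting summarised above (constraint lep1_E<400&&lep2_E<400&&, 10300 signal and 30444 background entries, 5150 test / 5150 train) can be sketched outside NNAna. This is a minimal illustration assuming the ntuple has been loaded into a NumPy structured array; the array name `events`, the function name and the random shuffle are hypothetical, and the exact bookkeeping NNAna uses for "Chosen entries" is not spelled out in the log.

    import numpy as np

    def select_and_split(events, seed=0):
        # Steering constraint from the options block: lep1_E<400 && lep2_E<400
        in_range = (events["lep1_E"] < 400) & (events["lep2_E"] < 400)

        signal     = events[in_range & (events["Target"] == 1)]  # 10300 entries in the log
        background = events[in_range & (events["Target"] == 0)]  # 30444 entries in the log

        # The log then splits the chosen entries evenly:
        # "Test with : 5150   Train with : 5150"
        order = np.random.default_rng(seed).permutation(len(signal))
        half = len(signal) // 2
        test, train = signal[order[:half]], signal[order[half:]]
        return train, test, background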
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 10300 for Signal Prepared event 0 for Signal with 10300 events ====Entry 0 Variable Ht : 232.766 Variable LepAPt : 74.3556 Variable LepBPt : 30.6959 Variable MetSigLeptonsJets : 3.09436 Variable MetSpec : 41.4986 Variable SumEtLeptonsJets : 190.102 Variable VSumJetLeptonsPt : 35.9078 Variable addEt : 147.716 Variable dPhiLepSumMet : 2.12661 Variable dPhiLeptons : 0.248817 Variable dRLeptons : 0.333981 Variable lep1_E : 86.3275 Variable lep2_E : 32.4592 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2135 Ht = 232.766 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 74.3557 LepAPt = 74.3556 LepBEt = 30.6962 LepBPt = 30.6959 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 42.6642 MetDelPhi = 1.33651 MetSig = 2.91556 MetSigLeptonsJets = 3.09436 MetSpec = 41.4986 Mjj = 0 MostCentralJetEta = -1.56143 MtllMet = 152.006 Njets = 1 SB = 0 SumEt = 214.133 SumEtJets = 0 SumEtLeptonsJets = 190.102 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 35.9078 addEt = 147.716 dPhiLepSumMet = 2.12661 dPhiLeptons = 0.248817 dRLeptons = 0.333981 diltype = 17 dimass = 15.9508 event = 435 jet1_Et = 85.0503 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 86.3275 lep2_E = 32.4592 rand = 0.999742 run = 230778 weight = 2.42631e-06 ===Show End Prepared event 10000 for Signal with 10300 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding 
variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 30444 for Background Prepared event 0 for Background with 30444 events ====Entry 0 Variable Ht : 85.3408 Variable LepAPt : 31.1294 Variable LepBPt : 10.0664 Variable MetSigLeptonsJets : 6.87768 Variable MetSpec : 44.1437 Variable SumEtLeptonsJets : 41.1959 Variable VSumJetLeptonsPt : 41.0132 Variable addEt : 85.3408 Variable dPhiLepSumMet : 3.10536 Variable dPhiLeptons : 0.219342 Variable dRLeptons : 0.424454 Variable lep1_E : 32.3548 Variable lep2_E : 10.1027 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 1.28045 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168 ===Show End Prepared event 10000 for Background with 30444 events Prepared event 20000 for Background with 30444 events Prepared event 30000 for Background with 30444 events Warning: found 994 negative weights. << hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 40744 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 994 negative weights. Signal fraction: 62.0445557 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 
------------------------------ Transdef: Tab for variable 2 55.9230614 62.1883278 63.9844284 65.8319778 67.9470596 69.4522629 70.8289337 72.1472168 73.7552338 75.1184692 76.7157669 78.3483276 79.611557 80.9518127 82.3116608 83.7594757 84.9990845 86.0416718 87.4409332 88.5504303 89.9413986 91.0549088 92.2181549 93.3908691 94.3297882 95.4960632 96.5339966 97.5376282 98.5500336 99.5657654 100.568871 101.524963 102.591431 103.69873 104.75322 105.698265 106.546761 107.331055 108.196228 108.953583 109.916275 110.718651 111.537056 112.325745 113.184204 114.164597 115.092216 115.815262 116.886459 117.624237 118.556503 119.639 120.641998 121.539459 122.368118 123.323021 124.230698 125.307877 126.236359 127.190735 128.064362 128.97403 129.989288 130.922073 132.262543 133.419479 134.472778 136.032684 137.556458 139.288696 140.749207 141.884964 143.54332 145.178909 147.374115 149.228607 150.989548 153.066132 154.955261 156.913361 158.956726 161.116211 163.71344 166.342026 168.7332 171.262299 174.355194 177.865189 181.54718 185.576752 189.431488 194.032837 199.836853 205.689224 212.946075 221.421097 231.84198 242.535706 263.414978 309.362 679.506348 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.4206123 20.8053398 21.2121086 21.5685158 21.9648991 22.3479347 22.6068516 22.9063549 23.19561 23.473135 23.7968464 24.0741482 24.3286686 24.5998802 24.8714104 25.0738258 25.3574276 25.626749 25.8880692 26.1403961 26.3448696 26.5600891 26.7683907 26.9865379 27.2060165 27.465683 27.7046585 27.972187 28.2112198 28.4584312 28.6668739 28.8479347 29.0974998 29.3159142 29.5307331 29.7437592 29.9577332 30.1802692 30.3858719 30.628891 30.8457127 31.0704308 31.287075 31.519474 31.7562733 31.9878578 32.2152786 32.4842682 32.7202187 32.9601974 33.2031555 33.412178 33.6846313 33.9334869 34.1819839 34.4107208 34.6192551 34.8389244 35.0876541 35.3416748 35.5713806 35.8938904 36.1939964 36.4606514 36.7344284 36.9687653 37.2850952 37.6104889 37.9188652 38.2385406 38.5654831 38.9309082 39.3058472 39.6821747 40.0431137 40.4721527 40.8252792 41.2439384 41.587944 42.0288086 42.4438591 42.9905548 43.4733391 44.0120583 44.6950378 45.3402176 46.0728912 46.8706207 47.7793503 48.7839127 49.7983246 50.9859505 52.1792831 53.7027702 55.7357521 57.9219131 61.1893387 65.4218979 74.3394623 162.429443 ------------------------------ Transdef: Tab for variable 4 10.0010262 10.1700001 10.3121796 10.4945488 10.6670876 10.8778248 11.0800495 11.2400227 11.4481144 11.6239052 11.8213387 12.0215893 12.2455311 12.4466314 12.658803 12.835 13.0468025 13.2316389 13.4107599 13.5803604 13.792263 13.9960976 14.2018604 14.4029226 14.5720634 14.746048 14.9719772 15.1609058 15.3587694 15.5562153 15.7750778 15.9797068 16.1833649 16.4247723 16.6298027 16.792469 17.0450668 17.2497978 17.4862099 17.7097702 17.9222984 18.1626892 18.3686867 18.5889053 18.8135662 19.0257225 19.2492027 19.4547691 19.6443481 19.867506 20.0536022 20.2116508 20.3966751 20.5550003 20.7201576 20.9171906 21.0894737 21.2604828 21.449028 21.6445999 21.8202877 21.9942379 22.1621113 22.3605614 22.5120754 22.6906738 22.8759174 23.0523205 23.2463417 23.4709034 23.6732426 23.9096107 24.0865288 24.3122044 24.552639 24.79673 25.0403748 25.3111877 25.6094055 25.8786469 26.137043 26.4061642 26.7069511 27.0040493 27.361557 27.6938057 28.1199837 28.4816399 28.8652382 29.2939987 29.692791 30.2717896 30.7812996 31.4072189 32.0567398 32.7630157 33.8504868 35.5011063 38.1628113 42.0190735 71.4863358 ------------------------------ Transdef: Tab for variable 5 1.55223811 2.60762548 
2.89171982 3.09252739 3.25341916 3.38687134 3.55178356 3.66793108 3.78427696 3.9049077 3.98997307 4.06855679 4.16044426 4.23104715 4.29503059 4.35583782 4.43504 4.50222015 4.56839514 4.63961744 4.70745087 4.77301884 4.84700298 4.89502525 4.98270655 5.04352093 5.12406349 5.20374155 5.28650284 5.34664917 5.40289164 5.46935558 5.54014397 5.61240864 5.68020248 5.74110842 5.81217241 5.86636448 5.91728783 5.9791584 6.04029226 6.0972538 6.14358521 6.19635296 6.24632168 6.29606056 6.35926819 6.41496181 6.47002029 6.51634502 6.56211758 6.61169291 6.6592288 6.6998601 6.75246239 6.79765034 6.83801651 6.87866497 6.9220686 6.97076988 7.01545238 7.05951023 7.10092735 7.14589787 7.19540501 7.23768091 7.2903204 7.32309914 7.3646841 7.40571022 7.44736004 7.49351358 7.54330111 7.58697414 7.63027096 7.67950535 7.72465801 7.77943468 7.8298111 7.87861824 7.92348433 7.98119354 8.02963161 8.09266567 8.15842056 8.22603226 8.28167915 8.3572731 8.43877029 8.52478123 8.61553574 8.70404053 8.81441402 8.92282295 9.07105255 9.24151611 9.45344543 9.69572449 9.99455357 10.6057167 15.1699209 ------------------------------ Transdef: Tab for variable 6 25.0001774 25.3285542 25.6959038 26.0709152 26.5364647 26.9539909 27.3412952 27.7388248 28.2795086 28.7520676 29.4054127 29.9109192 30.4768848 31.2261829 31.8459473 32.4568443 32.9999046 33.6581154 34.2392426 34.9638443 35.625412 36.1908836 36.7976608 37.3109741 37.8095627 38.3572159 38.965004 39.556324 40.1902428 40.7230492 41.2429352 41.7798004 42.223484 42.7590408 43.2733994 43.823307 44.2644653 44.7252731 45.2106514 45.7945404 46.2114601 46.7496834 47.3251266 47.837059 48.2851143 48.7036514 49.2060776 49.6344872 50.0118408 50.4477806 50.901001 51.3378448 51.7907715 52.2712021 52.6914635 53.1458015 53.689209 54.1801529 54.5552521 55.0047607 55.5017281 55.9723778 56.4822617 56.8836899 57.3241196 57.7955856 58.3229103 58.8134499 59.3349648 59.8075562 60.1974373 60.7687073 61.3275185 61.8351974 62.2721329 62.825592 63.2737923 63.8680153 64.4619675 64.9928131 65.6515961 66.3105621 67.1125488 67.8344879 68.6424408 69.3934937 70.1881256 71.3880463 72.4799576 73.5233307 74.7188416 76.087677 77.7432861 79.8543091 82.1676636 84.7496033 88.4624329 92.8366699 99.3822784 109.376213 218.598831 ------------------------------ Transdef: Tab for variable 7 30.1229668 33.2577667 34.6219063 35.6408577 36.593544 37.3248024 37.9601517 38.9538193 39.6912003 40.3184509 40.9935646 41.7067566 42.2334824 42.8137283 43.3948097 44.1314735 44.6406097 45.3551216 45.8209 46.4727325 47.0205765 47.627758 48.2887039 48.913475 49.4546204 50.0668297 50.6402512 51.068264 51.574646 52.1523323 52.680809 53.2036171 53.6971207 54.1657639 54.6743469 55.1520004 55.5682259 56.0975571 56.5557022 57.0571823 57.5604706 58.0494232 58.6905708 59.2679214 59.8763275 60.3290405 60.8781319 61.3887787 62.0021744 62.4890594 63.1169662 63.6050339 64.1661224 64.734787 65.2561646 65.9068909 66.6867218 67.3327789 68.0622025 68.9839478 69.9131927 70.9466934 71.7733231 72.7568359 73.8623352 74.9313354 75.9009018 77.0866241 78.1827621 79.2151413 80.4529266 81.8108368 83.2953644 84.7077103 86.2346191 87.7288818 89.3939667 91.2751846 93.2788849 94.8839722 96.9227448 98.7937469 100.869804 102.574829 105.310089 108.001556 110.738098 113.18837 116.615616 119.785339 123.998726 127.653595 133.36084 139.05307 145.730103 151.810867 161.090973 171.132233 187.142456 215.651886 421.977814 ------------------------------ Transdef: Tab for variable 8 1.15510297 22.2872543 27.9732895 30.3313599 31.8643227 32.8550415 33.7437897 34.5133247 35.0796432 
35.8399887 36.4200745 36.9977036 37.4464264 38.0145454 38.6169319 39.1078415 39.5663147 39.9368973 40.4552841 40.8938675 41.2977142 41.7488251 42.1881943 42.6123276 43.0213089 43.4931107 43.8860321 44.3236847 44.7437668 45.1349297 45.4691925 45.8572006 46.2975082 46.831871 47.2811546 47.6480789 48.1095123 48.5248375 48.8695984 49.2789383 49.618782 50.0613022 50.433876 50.709549 51.042923 51.4334412 51.8415146 52.2502861 52.5771294 52.9625626 53.2263031 53.5934525 53.9862976 54.3851128 54.6955261 55.05896 55.4946518 55.8702469 56.2207642 56.5545654 56.965889 57.3408546 57.7259903 58.2511826 58.7013245 59.0677299 59.5929642 60.0258217 60.4423141 60.837326 61.2341309 61.7473526 62.1874542 62.6539497 63.1617813 63.6701202 64.1712494 64.6421661 65.283638 65.8379364 66.4526978 67.0726624 67.8038483 68.5554047 69.6219482 70.4969406 71.2838364 72.2231445 73.3922882 74.5363235 76.1130981 77.59758 79.3866043 81.0714188 83.0658951 85.6604919 89.3667526 94.9627991 102.786163 115.46434 289.322327 ------------------------------ Transdef: Tab for variable 9 55.9230614 61.7263184 63.6360703 64.8863449 66.3647308 67.8544312 69.0609131 70.2311859 71.2692642 72.3725433 73.5279999 74.5858078 75.7801132 76.9379578 78.207428 79.357132 80.2365723 81.3477783 82.3893433 83.4823761 84.3756409 85.3349228 86.0537109 86.9847565 87.9066315 88.8816147 89.8652191 90.8287354 91.8171692 92.7223053 93.4627533 94.1445923 95.0221405 95.9183426 96.6047058 97.4256744 98.2991028 99.1245422 100.069687 100.8349 101.464828 102.426544 103.138405 104.011002 104.697769 105.412384 106.037292 106.649178 107.40538 108.133057 108.7939 109.458374 110.107666 110.674332 111.3284 111.872574 112.658478 113.211487 113.999451 114.654282 115.334015 115.898315 116.676208 117.363869 118.064758 118.79882 119.461121 120.192711 120.941208 121.564545 122.251541 122.967667 123.746933 124.487144 125.259857 125.959778 126.604095 127.379929 128.180588 128.958801 129.821472 130.574463 131.613525 132.504883 133.332092 134.275406 135.218262 136.478455 137.722778 139.279419 140.627106 142.16507 144.258545 146.784729 149.517273 153.293671 158.677643 165.788452 176.166718 196.504364 382.729065 ------------------------------ Transdef: Tab for variable 10 0.0050833188 0.975244522 1.21388865 1.37755525 1.51302314 1.64229858 1.72631454 1.80126274 1.88595963 1.95779324 2.02585459 2.08396173 2.13920522 2.18949056 2.24813318 2.28785634 2.32475996 2.35857534 2.39537621 2.42828703 2.45497537 2.48473597 2.51729584 2.54585814 2.57296705 2.5948019 2.61771822 2.6402545 2.6605773 2.67770386 2.70009995 2.71387005 2.73029566 2.74879241 2.76397705 2.77838612 2.79385567 2.80849743 2.82018185 2.83128119 2.84438539 2.85587692 2.87008166 2.8795898 2.89006782 2.89911342 2.90810537 2.91732764 2.92586517 2.93416786 2.94221306 2.94946098 2.95806551 2.96488738 2.97182989 2.97788811 2.98349071 2.98883724 2.99395084 2.99926853 3.00495052 3.00882769 3.01326895 3.01810789 3.02257156 3.0274682 3.03223324 3.03621387 3.04078722 3.04487085 3.04906416 3.0525198 3.05667281 3.06064129 3.0647006 3.06771731 3.07187462 3.07534623 3.07876778 3.08229303 3.08588719 3.08905268 3.09218359 3.09524679 3.09847903 3.10122323 3.10397768 3.10640001 3.10910177 3.11177206 3.11434889 3.11739278 3.12036443 3.12292576 3.12559938 3.12806749 3.13087988 3.13358688 3.13679743 3.13939381 3.14156938 ------------------------------ Transdef: Tab for variable 11 1.71661377E-05 0.00677073002 0.0174747705 0.0272805095 0.0366871059 0.0461843573 0.0541509986 0.0606644154 0.0679016113 0.0730605721 0.0793822408 0.0869750977 
0.0944001675 0.100295901 0.106723785 0.113552861 0.120510578 0.126893997 0.133621931 0.139707148 0.14480865 0.150639057 0.156279683 0.162695527 0.169041514 0.175363302 0.181773305 0.188295513 0.193466932 0.197461128 0.203067541 0.207595348 0.21190536 0.215586156 0.220600843 0.224679589 0.22913897 0.234037787 0.238146305 0.242664516 0.247137785 0.251351595 0.255367756 0.259400368 0.263409734 0.268118858 0.272598386 0.276558012 0.28094852 0.285937786 0.290353298 0.295017958 0.299813986 0.303317547 0.307963222 0.31321907 0.319010079 0.324117899 0.328469753 0.333419681 0.338832617 0.344217062 0.350034237 0.355125129 0.360860348 0.366955429 0.372951746 0.379471064 0.386373162 0.39231813 0.398217082 0.404469252 0.409873456 0.416245878 0.422514856 0.42800191 0.43437016 0.439650834 0.44606173 0.45240593 0.459526777 0.466110468 0.474269509 0.482026696 0.490422189 0.498164624 0.50744319 0.516085505 0.526788712 0.535721242 0.547193587 0.559738874 0.575618446 0.592810929 0.613603115 0.638017535 0.66499114 0.69224906 0.739474654 0.808783531 1.12525487 ------------------------------ Transdef: Tab for variable 12 0.200089261 0.207959458 0.216022313 0.222592369 0.228457958 0.235419601 0.241574198 0.247969478 0.253776431 0.259747475 0.26506418 0.271776408 0.277254701 0.28220892 0.287181526 0.291976273 0.296382368 0.300655335 0.305237651 0.310018837 0.314292669 0.319385916 0.324285209 0.329186767 0.333998919 0.337865263 0.342607737 0.346729696 0.350955725 0.355142415 0.360392779 0.365131646 0.369326413 0.374187887 0.379253298 0.383807957 0.388168454 0.392678291 0.396699488 0.400922298 0.40489924 0.408761501 0.412929863 0.41740939 0.420948476 0.424572796 0.427983463 0.43172884 0.435940862 0.439731658 0.444226623 0.448210061 0.452502191 0.456309021 0.460445344 0.464857161 0.469054878 0.4735578 0.477474332 0.482275695 0.487136066 0.491519749 0.495108306 0.499729306 0.504078984 0.508616626 0.513225675 0.516952515 0.521826148 0.526458561 0.532678843 0.53663969 0.541291356 0.546680689 0.552104354 0.556511641 0.561350882 0.56744802 0.573817372 0.580299795 0.586530685 0.592460394 0.600364089 0.608323932 0.615330637 0.623555779 0.630208969 0.641171992 0.650144219 0.660265267 0.672012925 0.684398234 0.696849227 0.713172734 0.732093573 0.749318182 0.773298562 0.800777078 0.838828921 0.894057095 1.12845647 ------------------------------ Transdef: Tab for variable 13 20.1508102 21.3639889 22.0420189 22.6510868 23.1574802 23.6752968 24.1676636 24.6832867 24.987896 25.3736725 25.7603779 26.0939941 26.4304619 26.714241 27.0665741 27.3695068 27.6909027 28.0332661 28.3176003 28.6009216 28.9070988 29.2471752 29.5154991 29.7716923 30.0790405 30.3600998 30.6280937 30.8861427 31.1641502 31.4182777 31.6376572 31.890995 32.1672859 32.4095764 32.6649399 32.9089279 33.1735497 33.4231377 33.7141266 33.9694519 34.1934509 34.5179062 34.7657928 34.9961395 35.268364 35.5060883 35.7890778 36.0405121 36.3028183 36.5631485 36.8839951 37.195076 37.4937668 37.7779045 38.0824127 38.362175 38.6710129 38.9757767 39.2743874 39.571022 39.9689026 40.3655434 40.6882019 41.041275 41.393692 41.7927933 42.111618 42.4761047 42.8079376 43.1536179 43.5761681 44.0265045 44.4237404 44.9094963 45.3128357 45.7676392 46.2731285 46.7702179 47.2933807 47.8456497 48.5850563 49.1113815 49.8691521 50.6374512 51.4497223 52.1255569 52.9200058 53.7824631 54.7080803 55.7824821 56.9579353 58.1871414 59.5987053 61.1708755 63.2944527 65.4207458 68.0936127 72.3913574 77.5007477 86.706604 199.637161 ------------------------------ Transdef: Tab for variable 14 10.0034533 
10.6268673 11.0735703 11.4425011 11.7110577 12.0167561 12.2688446 12.4984875 12.8072023 13.0274773 13.3347015 13.6019344 13.8950214 14.188633 14.3956776 14.6277695 14.9037247 15.1327562 15.3437004 15.5777321 15.8091221 15.9934311 16.2751732 16.5232983 16.7299805 16.9452171 17.1912651 17.4071121 17.6597977 17.9231033 18.1663666 18.4143867 18.6300011 18.8729458 19.1527767 19.3612976 19.5696831 19.8087044 19.9953842 20.2352905 20.4664192 20.6769447 20.9168167 21.1289291 21.3559456 21.6005516 21.8119659 22.0225372 22.2243824 22.4693375 22.7068787 22.912384 23.0951042 23.3251972 23.5493774 23.7584648 23.9738464 24.2043533 24.4147224 24.6953316 24.8942928 25.175354 25.4098053 25.6481037 25.8759193 26.1157074 26.3367805 26.5942917 26.8919544 27.1444931 27.4072304 27.7122574 28.0077 28.3316879 28.6436939 28.9048195 29.2158165 29.5255547 29.9051895 30.2263165 30.514225 30.8613777 31.2600613 31.6937237 32.124115 32.5574455 33.0305634 33.4836197 33.9499207 34.4673615 35.133873 35.8846817 36.5999908 37.5663338 38.4981079 39.7651062 41.3851395 43.4618034 45.8930588 51.1303253 87.1024323 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 38.7 19.1 21.6 13.7 28.9 34.1 21.7 33.0 -20.7 -7.7 -12.2 16.5 18.2 2 38.7 100.0 59.1 48.5 23.7 64.3 94.0 62.9 87.3 -45.8 -21.8 -37.5 53.6 42.2 3 19.1 59.1 100.0 29.8 0.9 27.9 61.1 44.5 67.6 -8.2 -23.8 -44.2 90.9 24.3 4 21.6 48.5 29.8 100.0 5.2 26.3 48.7 37.0 55.6 -5.0 -26.3 -48.4 28.4 90.2 5 13.7 23.7 0.9 5.2 100.0 83.2 -6.7 52.7 53.4 32.4 -2.4 0.4 1.0 4.8 6 28.9 64.3 27.9 26.3 83.2 100.0 39.5 73.7 80.2 1.8 -11.9 -17.1 25.4 23.0 7 34.1 94.0 61.1 48.7 -6.7 39.5 100.0 52.1 72.4 -55.5 -21.6 -39.5 55.5 42.4 8 21.7 62.9 44.5 37.0 52.7 73.7 52.1 100.0 77.0 1.8 -18.2 -30.8 41.0 33.2 9 33.0 87.3 67.6 55.6 53.4 80.2 72.4 77.0 100.0 -12.3 -24.7 -42.6 61.9 48.8 10 -20.7 -45.8 -8.2 -5.0 32.4 1.8 -55.5 1.8 -12.3 100.0 3.6 5.1 -6.5 -4.2 11 -7.7 -21.8 -23.8 -26.3 -2.4 -11.9 -21.6 -18.2 -24.7 3.6 100.0 54.5 -20.7 -25.5 12 -12.2 -37.5 -44.2 -48.4 0.4 -17.1 -39.5 -30.8 -42.6 5.1 54.5 100.0 -40.0 -39.9 13 16.5 53.6 90.9 28.4 1.0 25.4 55.5 41.0 61.9 -6.5 -20.7 -40.0 100.0 32.3 14 18.2 42.2 24.3 90.2 4.8 23.0 42.4 33.2 48.8 -4.2 -25.5 -39.9 32.3 100.0 TOTAL CORRELATION TO TARGET (diagonal) 85.4854121 TOTAL CORRELATION OF ALL VARIABLES 42.4824549 ROUND 1: MAX CORR ( 42.4823832) AFTER KILLING INPUT VARIABLE 11 CONTR 0.0780949106 ROUND 2: MAX CORR ( 42.4820937) AFTER KILLING INPUT VARIABLE 8 CONTR 0.15682404 ROUND 3: MAX CORR ( 42.4741931) AFTER KILLING INPUT VARIABLE 13 CONTR 0.81927193 ROUND 4: MAX CORR ( 42.4417912) AFTER KILLING INPUT VARIABLE 14 CONTR 1.65874523 ROUND 5: MAX CORR ( 42.4010396) AFTER KILLING INPUT VARIABLE 12 CONTR 1.85943151 ROUND 6: MAX CORR ( 42.1852441) AFTER KILLING INPUT VARIABLE 6 CONTR 4.27239226 ROUND 7: MAX CORR ( 41.6950205) AFTER KILLING INPUT VARIABLE 10 CONTR 6.4124945 ROUND 8: MAX CORR ( 41.1238982) AFTER KILLING INPUT VARIABLE 7 CONTR 6.87748003 ROUND 9: MAX CORR ( 40.3296533) AFTER KILLING INPUT VARIABLE 3 CONTR 8.04326235 ROUND 10: MAX CORR ( 39.5149687) AFTER KILLING INPUT VARIABLE 4 CONTR 8.06524519 ROUND 11: MAX CORR ( 38.9558701) AFTER KILLING INPUT VARIABLE 9 CONTR 6.62366489 ROUND 12: MAX CORR ( 38.6713628) AFTER KILLING INPUT VARIABLE 5 CONTR 4.69952316 LAST REMAINING VARIABLE: 2 total correlation to target: 42.4824549 % total significance: 38.8133147 sigma correlations of single variables to target: variable 2: 38.6713628 % , in sigma: 
35.3313803 variable 3: 19.0565376 % , in sigma: 17.4106556 variable 4: 21.6274708 % , in sigma: 19.7595415 variable 5: 13.7315715 % , in sigma: 12.5455981 variable 6: 28.8571275 % , in sigma: 26.3647845 variable 7: 34.0744987 % , in sigma: 31.1315399 variable 8: 21.7212676 % , in sigma: 19.8452372 variable 9: 33.0057479 % , in sigma: 30.1550954 variable 10: -20.6651684 % , in sigma: 18.8803515 variable 11: -7.69480781 % , in sigma: 7.03021982 variable 12: -12.2483028 % , in sigma: 11.1904369 variable 13: 16.4698719 % , in sigma: 15.0473959 variable 14: 18.1739162 % , in sigma: 16.6042647 variables sorted by significance: 1 most relevant variable 2 corr 38.6713638 , in sigma: 35.3313813 2 most relevant variable 5 corr 4.69952297 , in sigma: 4.29363285 3 most relevant variable 9 corr 6.62366486 , in sigma: 6.05158974 4 most relevant variable 4 corr 8.06524563 , in sigma: 7.36866353 5 most relevant variable 3 corr 8.04326248 , in sigma: 7.34857904 6 most relevant variable 7 corr 6.87748003 , in sigma: 6.28348331 7 most relevant variable 10 corr 6.41249466 , in sigma: 5.85865797 8 most relevant variable 6 corr 4.27239227 , in sigma: 3.90339273 9 most relevant variable 12 corr 1.85943151 , in sigma: 1.69883544 10 most relevant variable 14 corr 1.65874529 , in sigma: 1.51548217 11 most relevant variable 13 corr 0.819271922 , in sigma: 0.748512744 12 most relevant variable 8 corr 0.156824037 , in sigma: 0.143279401 13 most relevant variable 11 corr 0.0780949071 , in sigma: 0.0713499775 global correlations between input variables: variable 2: 99.3296432 % variable 3: 96.01692 % variable 4: 94.6332905 % variable 5: 96.3672652 % variable 6: 95.651873 % variable 7: 99.0012533 % variable 8: 85.7202481 % variable 9: 98.8888937 % variable 10: 75.4186038 % variable 11: 55.9317854 % variable 12: 70.2282507 % variable 13: 93.3287053 % variable 14: 92.9776189 % significance loss when removing single variables: variable 2: corr = 11.9270131 % , sigma = 10.8968964 variable 3: corr = 8.07338709 % , sigma = 7.37610183 variable 4: corr = 8.81411535 % , sigma = 8.0528546 variable 5: corr = 5.26993035 % , sigma = 4.81477507 variable 6: corr = 4.08755089 % , sigma = 3.7345158 variable 7: corr = 5.85243984 % , sigma = 5.34697417 variable 8: corr = 0.158289752 % , sigma = 0.144618524 variable 9: corr = 11.9948388 % , sigma = 10.9588642 variable 10: corr = 6.36419005 % , sigma = 5.81452535 variable 11: corr = 0.0780949106 % , sigma = 0.0713499807 variable 12: corr = 1.75419877 % , sigma = 1.60269149 variable 13: corr = 0.81164955 % , sigma = 0.741548704 variable 14: corr = 1.01944183 % , sigma = 0.931394305 Keep only 7 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 8 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 7 --> 16.9871445 sigma out 15 active outputs RANK 2 NODE 3 --> 11.886611 sigma out 15 active outputs RANK 3 NODE 8 --> 11.4070454 sigma out 15 active outputs RANK 4 NODE 6 --> 11.1078701 sigma out 15 active outputs RANK 5 NODE 2 --> 11.0567703 sigma out 15 active outputs RANK 6 NODE 1 --> 10.463088 sigma out 15 active outputs RANK 7 NODE 5 --> 9.31711483 sigma out 15 active outputs RANK 8 NODE 4 --> 9.03980637 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 14.9369564 sigma in 8act. ( 15.9988489 sig out 1act.) RANK 2 NODE 1 --> 14.7878227 sigma in 8act. 
( 15.670063 sig out 1act.) RANK 3 NODE 7 --> 13.5956964 sigma in 8act. ( 16.6163235 sig out 1act.) RANK 4 NODE 14 --> 11.8125706 sigma in 8act. ( 12.5148249 sig out 1act.) RANK 5 NODE 11 --> 9.64385891 sigma in 8act. ( 11.9595032 sig out 1act.) RANK 6 NODE 4 --> 6.46213341 sigma in 8act. ( 7.18056536 sig out 1act.) RANK 7 NODE 9 --> 6.3761487 sigma in 8act. ( 6.52816439 sig out 1act.) RANK 8 NODE 3 --> 5.88392544 sigma in 8act. ( 6.22346592 sig out 1act.) RANK 9 NODE 15 --> 5.64352703 sigma in 8act. ( 5.15143013 sig out 1act.) RANK 10 NODE 2 --> 5.08119965 sigma in 8act. ( 4.99296427 sig out 1act.) RANK 11 NODE 13 --> 4.54816294 sigma in 8act. ( 4.12393808 sig out 1act.) RANK 12 NODE 10 --> 4.0865202 sigma in 8act. ( 4.32196665 sig out 1act.) RANK 13 NODE 12 --> 2.7345674 sigma in 8act. ( 2.74155307 sig out 1act.) RANK 14 NODE 6 --> 2.02099705 sigma in 8act. ( 2.07813144 sig out 1act.) RANK 15 NODE 8 --> 0.887833297 sigma in 8act. ( 0.0354132578 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 16.6163235 sigma out 1act.( 13.5956964 sig in 8act.) RANK 2 NODE 5 --> 15.9988489 sigma out 1act.( 14.9369564 sig in 8act.) RANK 3 NODE 1 --> 15.670063 sigma out 1act.( 14.7878227 sig in 8act.) RANK 4 NODE 14 --> 12.5148249 sigma out 1act.( 11.8125706 sig in 8act.) RANK 5 NODE 11 --> 11.9595032 sigma out 1act.( 9.64385891 sig in 8act.) RANK 6 NODE 4 --> 7.18056536 sigma out 1act.( 6.46213341 sig in 8act.) RANK 7 NODE 9 --> 6.52816439 sigma out 1act.( 6.3761487 sig in 8act.) RANK 8 NODE 3 --> 6.22346592 sigma out 1act.( 5.88392544 sig in 8act.) RANK 9 NODE 15 --> 5.15143013 sigma out 1act.( 5.64352703 sig in 8act.) RANK 10 NODE 2 --> 4.99296427 sigma out 1act.( 5.08119965 sig in 8act.) RANK 11 NODE 10 --> 4.32196665 sigma out 1act.( 4.0865202 sig in 8act.) RANK 12 NODE 13 --> 4.12393808 sigma out 1act.( 4.54816294 sig in 8act.) RANK 13 NODE 12 --> 2.74155307 sigma out 1act.( 2.7345674 sig in 8act.) RANK 14 NODE 6 --> 2.07813144 sigma out 1act.( 2.02099705 sig in 8act.) RANK 15 NODE 8 --> 0.0354132578 sigma out 1act.( 0.887833297 sig in 8act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 36.1823845 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 7 --> 19.1144638 sigma out 15 active outputs RANK 2 NODE 2 --> 12.5503883 sigma out 15 active outputs RANK 3 NODE 8 --> 11.9144983 sigma out 15 active outputs RANK 4 NODE 3 --> 11.7998152 sigma out 15 active outputs RANK 5 NODE 1 --> 11.5595036 sigma out 15 active outputs RANK 6 NODE 6 --> 11.5554247 sigma out 15 active outputs RANK 7 NODE 4 --> 11.225584 sigma out 15 active outputs RANK 8 NODE 5 --> 10.8043385 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 14.6875877 sigma in 8act. ( 14.853322 sig out 1act.) RANK 2 NODE 1 --> 14.4752474 sigma in 8act. ( 14.0033035 sig out 1act.) RANK 3 NODE 7 --> 13.4594936 sigma in 8act. ( 14.8302507 sig out 1act.) RANK 4 NODE 11 --> 12.176959 sigma in 8act. ( 12.6623983 sig out 1act.) RANK 5 NODE 14 --> 11.2237558 sigma in 8act. ( 11.3169918 sig out 1act.) RANK 6 NODE 3 --> 9.10477161 sigma in 8act. ( 5.99776936 sig out 1act.) RANK 7 NODE 6 --> 8.30750179 sigma in 8act. ( 3.84033775 sig out 1act.) RANK 8 NODE 4 --> 7.52615643 sigma in 8act. ( 7.10806513 sig out 1act.) RANK 9 NODE 9 --> 6.94304943 sigma in 8act. ( 6.91338539 sig out 1act.) RANK 10 NODE 10 --> 6.27785778 sigma in 8act. ( 4.45231247 sig out 1act.) RANK 11 NODE 13 --> 6.20576954 sigma in 8act. ( 3.40951109 sig out 1act.) 
RANK 12 NODE 12 --> 5.50480986 sigma in 8act. ( 3.020823 sig out 1act.) RANK 13 NODE 2 --> 5.29090929 sigma in 8act. ( 4.47347069 sig out 1act.) RANK 14 NODE 15 --> 5.24735928 sigma in 8act. ( 3.74580503 sig out 1act.) RANK 15 NODE 8 --> 3.5751071 sigma in 8act. ( 0.633259118 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 14.853322 sigma out 1act.( 14.6875877 sig in 8act.) RANK 2 NODE 7 --> 14.8302507 sigma out 1act.( 13.4594936 sig in 8act.) RANK 3 NODE 1 --> 14.0033035 sigma out 1act.( 14.4752474 sig in 8act.) RANK 4 NODE 11 --> 12.6623983 sigma out 1act.( 12.176959 sig in 8act.) RANK 5 NODE 14 --> 11.3169918 sigma out 1act.( 11.2237558 sig in 8act.) RANK 6 NODE 4 --> 7.10806513 sigma out 1act.( 7.52615643 sig in 8act.) RANK 7 NODE 9 --> 6.91338539 sigma out 1act.( 6.94304943 sig in 8act.) RANK 8 NODE 3 --> 5.99776936 sigma out 1act.( 9.10477161 sig in 8act.) RANK 9 NODE 2 --> 4.47347069 sigma out 1act.( 5.29090929 sig in 8act.) RANK 10 NODE 10 --> 4.45231247 sigma out 1act.( 6.27785778 sig in 8act.) RANK 11 NODE 6 --> 3.84033775 sigma out 1act.( 8.30750179 sig in 8act.) RANK 12 NODE 15 --> 3.74580503 sigma out 1act.( 5.24735928 sig in 8act.) RANK 13 NODE 13 --> 3.40951109 sigma out 1act.( 6.20576954 sig in 8act.) RANK 14 NODE 12 --> 3.020823 sigma out 1act.( 5.50480986 sig in 8act.) RANK 15 NODE 8 --> 0.633259118 sigma out 1act.( 3.5751071 sig in 8act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 33.8986549 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.376645416 *** contribution from regularisation: 0.0118308309 *** contribution from error: -0.388476253 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.418369919 *** contribution from regularisation: 0.00779072801 *** contribution from error: -0.426160634 *********************************************** -----------------> Test sample ENTER BFGS code START -8525.06907 -0.317374527 0.0654473603 EXIT FROM BFGS code FG_START 0. -0.317374527 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.431791037 *** contribution from regularisation: 0.00581039628 *** contribution from error: -0.437601447 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -8796.44689 -0.317374527 45.1334114 EXIT FROM BFGS code FG_LNSRCH 0. -0.276926398 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.449563712 *** contribution from regularisation: 0.00748553267 *** contribution from error: -0.457049251 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9158.51198 -0.276926398 -3.51487017 EXIT FROM BFGS code NEW_X -9158.51198 -0.276926398 -3.51487017 ENTER BFGS code NEW_X -9158.51198 -0.276926398 -3.51487017 EXIT FROM BFGS code FG_LNSRCH 0. -0.280990899 0. 
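The preprocessing banner printed before the first iteration (global flag 812, option 12) states that each input is first equalised to a Gaussian distribution with mean 0 and sigma 1 and that the variables are then decorrelated; the "Transdef" tables above are the per-variable percentile boundaries used for the flattening. The sketch below shows the generic rank-to-Gaussian plus whitening technique that description corresponds to; it is not the Phi-T implementation, and the function name is made up.

    import numpy as np
    from scipy.stats import norm

    def equalise_and_decorrelate(X):
        """X: (n_events, n_vars) array of raw inputs."""
        n = len(X)

        # 1) Flatten each column with its empirical CDF
        #    (the role played by the Transdef percentile tables).
        ranks = X.argsort(axis=0).argsort(axis=0)
        u = (ranks + 0.5) / n                      # uniform in (0, 1)

        # 2) Map the flat distribution onto a Gaussian with mean 0, sigma 1.
        g = norm.ppf(u)

        # 3) Decorrelate: symmetric whitening with the inverse
        #    square root of the covariance matrix.
        cov = np.cov(g, rowvar=False)
        w, v = np.linalg.eigh(cov)
        return g @ (v @ np.diag(w ** -0.5) @ v.T)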
--------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.453790814 *** contribution from regularisation: 0.00721592363 *** contribution from error: -0.461006731 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9244.62659 -0.280990899 5.73318148 EXIT FROM BFGS code NEW_X -9244.62659 -0.280990899 5.73318148 ENTER BFGS code NEW_X -9244.62659 -0.280990899 5.73318148 EXIT FROM BFGS code FG_LNSRCH 0. -0.276905477 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.454067498 *** contribution from regularisation: 0.00727306353 *** contribution from error: -0.461340576 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9250.26307 -0.276905477 5.05039454 EXIT FROM BFGS code NEW_X -9250.26307 -0.276905477 5.05039454 ENTER BFGS code NEW_X -9250.26307 -0.276905477 5.05039454 EXIT FROM BFGS code FG_LNSRCH 0. -0.240984127 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.454448164 *** contribution from regularisation: 0.00683345925 *** contribution from error: -0.461281627 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9258.01779 -0.240984127 -5.88878059 EXIT FROM BFGS code NEW_X -9258.01779 -0.240984127 -5.88878059 ENTER BFGS code NEW_X -9258.01779 -0.240984127 -5.88878059 EXIT FROM BFGS code FG_LNSRCH 0. -0.24567166 0. --------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.455014169 *** contribution from regularisation: 0.00736854877 *** contribution from error: -0.462382704 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9269.54865 -0.24567166 -0.853115499 EXIT FROM BFGS code NEW_X -9269.54865 -0.24567166 -0.853115499 ENTER BFGS code NEW_X -9269.54865 -0.24567166 -0.853115499 EXIT FROM BFGS code FG_LNSRCH 0. -0.232295603 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.457733661 *** contribution from regularisation: 0.00629695877 *** contribution from error: -0.464030623 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9324.95004 -0.232295603 3.63648868 EXIT FROM BFGS code NEW_X -9324.95004 -0.232295603 3.63648868 ENTER BFGS code NEW_X -9324.95004 -0.232295603 3.63648868 EXIT FROM BFGS code FG_LNSRCH 0. -0.192397788 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 7 --> 16.109581 sigma out 15 active outputs RANK 2 NODE 1 --> 13.7367525 sigma out 15 active outputs RANK 3 NODE 3 --> 12.0250139 sigma out 15 active outputs RANK 4 NODE 2 --> 9.03290749 sigma out 15 active outputs RANK 5 NODE 5 --> 6.9050765 sigma out 15 active outputs RANK 6 NODE 4 --> 5.63100147 sigma out 15 active outputs RANK 7 NODE 8 --> 5.03037167 sigma out 15 active outputs RANK 8 NODE 6 --> 5.02689552 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 6 --> 21.1703491 sigma in 8act. ( 21.2073822 sig out 1act.) 
RANK 2 NODE 14 --> 9.04335117 sigma in 8act. ( 7.22460747 sig out 1act.) RANK 3 NODE 12 --> 8.83154488 sigma in 8act. ( 8.45478058 sig out 1act.) RANK 4 NODE 10 --> 7.08140469 sigma in 8act. ( 6.6084547 sig out 1act.) RANK 5 NODE 5 --> 5.8120594 sigma in 8act. ( 4.3000741 sig out 1act.) RANK 6 NODE 1 --> 4.09022617 sigma in 8act. ( 2.48259139 sig out 1act.) RANK 7 NODE 8 --> 4.03314781 sigma in 8act. ( 3.58666229 sig out 1act.) RANK 8 NODE 11 --> 3.97451687 sigma in 8act. ( 2.20215106 sig out 1act.) RANK 9 NODE 4 --> 3.81905603 sigma in 8act. ( 3.03244185 sig out 1act.) RANK 10 NODE 3 --> 3.6610055 sigma in 8act. ( 0.0201506689 sig out 1act.) RANK 11 NODE 7 --> 3.22892523 sigma in 8act. ( 0.698455095 sig out 1act.) RANK 12 NODE 2 --> 3.11821818 sigma in 8act. ( 2.77940345 sig out 1act.) RANK 13 NODE 13 --> 2.62467861 sigma in 8act. ( 0.454755753 sig out 1act.) RANK 14 NODE 9 --> 2.38748097 sigma in 8act. ( 0.529818892 sig out 1act.) RANK 15 NODE 15 --> 1.7222718 sigma in 8act. ( 0.826192856 sig out 1act.) sorted by output significance RANK 1 NODE 6 --> 21.2073822 sigma out 1act.( 21.1703491 sig in 8act.) RANK 2 NODE 12 --> 8.45478058 sigma out 1act.( 8.83154488 sig in 8act.) RANK 3 NODE 14 --> 7.22460747 sigma out 1act.( 9.04335117 sig in 8act.) RANK 4 NODE 10 --> 6.6084547 sigma out 1act.( 7.08140469 sig in 8act.) RANK 5 NODE 5 --> 4.3000741 sigma out 1act.( 5.8120594 sig in 8act.) RANK 6 NODE 8 --> 3.58666229 sigma out 1act.( 4.03314781 sig in 8act.) RANK 7 NODE 4 --> 3.03244185 sigma out 1act.( 3.81905603 sig in 8act.) RANK 8 NODE 2 --> 2.77940345 sigma out 1act.( 3.11821818 sig in 8act.) RANK 9 NODE 1 --> 2.48259139 sigma out 1act.( 4.09022617 sig in 8act.) RANK 10 NODE 11 --> 2.20215106 sigma out 1act.( 3.97451687 sig in 8act.) RANK 11 NODE 15 --> 0.826192856 sigma out 1act.( 1.7222718 sig in 8act.) RANK 12 NODE 7 --> 0.698455095 sigma out 1act.( 3.22892523 sig in 8act.) RANK 13 NODE 9 --> 0.529818892 sigma out 1act.( 2.38748097 sig in 8act.) RANK 14 NODE 13 --> 0.454755753 sigma out 1act.( 2.62467861 sig in 8act.) RANK 15 NODE 3 --> 0.0201506689 sigma out 1act.( 3.6610055 sig in 8act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 26.0393867 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.458289385 *** contribution from regularisation: 0.00615534792 *** contribution from error: -0.464444727 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -9336.27152 -0.192397788 7.92405987 EXIT FROM BFGS code NEW_X -9336.27152 -0.192397788 7.92405987 ENTER BFGS code NEW_X -9336.27152 -0.192397788 7.92405987 EXIT FROM BFGS code FG_LNSRCH 0. -0.185393602 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.459753722 *** contribution from regularisation: 0.00627568644 *** contribution from error: -0.466029406 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9366.10258 -0.185393602 -0.0113260681 EXIT FROM BFGS code NEW_X -9366.10258 -0.185393602 -0.0113260681 ENTER BFGS code NEW_X -9366.10258 -0.185393602 -0.0113260681 EXIT FROM BFGS code FG_LNSRCH 0. -0.169713631 0. 
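Each "Learn Path" block reports the total loss as the sum of a "contribution from error" (the entropy loss selected in the setup) and a "contribution from regularisation" (the "standard regularisation" option). The exact NeuroBayes definitions, signs and normalisations are internal to the Teacher and are not given in the log; the sketch below only illustrates that decomposition, using a generic cross-entropy error and a weight-decay term, with all names as placeholders.

    import numpy as np

    def learn_path_report(targets, outputs, weights, lam=1e-3):
        """targets in {0,1}, outputs in (0,1), weights = flattened network weights."""
        eps = 1e-12
        p = np.clip(outputs, eps, 1 - eps)
        error = -np.mean(targets * np.log(p) + (1 - targets) * np.log(1 - p))
        regularisation = lam * np.sum(weights ** 2)      # generic weight decay
        return {"contribution from error": error,
                "contribution from regularisation": regularisation,
                "loss function": error + regularisation}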
--------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.460693508 *** contribution from regularisation: 0.00626525981 *** contribution from error: -0.466958761 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9385.248 -0.169713631 -2.26222754 EXIT FROM BFGS code NEW_X -9385.248 -0.169713631 -2.26222754 ENTER BFGS code NEW_X -9385.248 -0.169713631 -2.26222754 EXIT FROM BFGS code FG_LNSRCH 0. -0.144082889 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.461525619 *** contribution from regularisation: 0.00647745188 *** contribution from error: -0.468003064 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9402.19971 -0.144082889 -10.6498938 EXIT FROM BFGS code NEW_X -9402.19971 -0.144082889 -10.6498938 ENTER BFGS code NEW_X -9402.19971 -0.144082889 -10.6498938 EXIT FROM BFGS code FG_LNSRCH 0. -0.155866012 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.462171733 *** contribution from regularisation: 0.00676176231 *** contribution from error: -0.468933493 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9415.3626 -0.155866012 4.39100361 EXIT FROM BFGS code NEW_X -9415.3626 -0.155866012 4.39100361 ENTER BFGS code NEW_X -9415.3626 -0.155866012 4.39100361 EXIT FROM BFGS code FG_LNSRCH 0. -0.160446495 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.462987006 *** contribution from regularisation: 0.00708968658 *** contribution from error: -0.47007668 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9431.97117 -0.160446495 0.290390849 EXIT FROM BFGS code NEW_X -9431.97117 -0.160446495 0.290390849 ENTER BFGS code NEW_X -9431.97117 -0.160446495 0.290390849 EXIT FROM BFGS code FG_LNSRCH 0. -0.180229262 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.463908881 *** contribution from regularisation: 0.00671442831 *** contribution from error: -0.470623314 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9450.75165 -0.180229262 1.80988431 EXIT FROM BFGS code NEW_X -9450.75165 -0.180229262 1.80988431 ENTER BFGS code NEW_X -9450.75165 -0.180229262 1.80988431 EXIT FROM BFGS code FG_LNSRCH 0. -0.241393596 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.464114219 *** contribution from regularisation: 0.00615593046 *** contribution from error: -0.470270157 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9454.93493 -0.241393596 -11.4045095 EXIT FROM BFGS code NEW_X -9454.93493 -0.241393596 -11.4045095 ENTER BFGS code NEW_X -9454.93493 -0.241393596 -11.4045095 EXIT FROM BFGS code FG_LNSRCH 0. -0.25565213 0. 
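The ENTER/EXIT BFGS lines appear to be the task strings of a reverse-communication quasi-Newton driver: START/FG_START begin the minimisation, FG_LNSRCH requests the loss and gradient at a trial point of the current line search, NEW_X marks an accepted step (one printed iteration), and CONVERGENC ends the minimisation, consistent with the unchanged error contribution reported at iteration 250. A rough stand-in for that loop, using SciPy's BFGS on a toy objective rather than the Phi-T network:

    import numpy as np
    from scipy.optimize import minimize

    # Toy objective standing in for the network loss: value and gradient together.
    def loss_and_grad(w):
        value = 0.5 * np.dot(w, w) - np.sum(np.sin(w))
        grad = w - np.cos(w)
        return value, grad

    accepted = []                                # one entry per accepted step ("NEW_X")
    result = minimize(loss_and_grad, x0=np.ones(8), jac=True, method="BFGS",
                      callback=lambda wk: accepted.append(loss_and_grad(wk)[0]),
                      options={"maxiter": 250})
    print(result.nit, "accepted steps, final loss", result.fun)

The line-search evaluations between accepted steps play the role of the FG_LNSRCH requests in the log.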
--------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.463846952 *** contribution from regularisation: 0.00679388316 *** contribution from error: -0.470640838 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9449.4903 -0.25565213 3.1105237 EXIT FROM BFGS code FG_LNSRCH 0. -0.244076401 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.46354112 *** contribution from regularisation: 0.00685472041 *** contribution from error: -0.470395833 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9443.25992 -0.244076401 -8.82460976 EXIT FROM BFGS code FG_LNSRCH 0. -0.241493523 0. --------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 28.1607323 sigma out 15 active outputs RANK 2 NODE 3 --> 26.6603546 sigma out 15 active outputs RANK 3 NODE 7 --> 24.9090633 sigma out 15 active outputs RANK 4 NODE 2 --> 10.1378708 sigma out 15 active outputs RANK 5 NODE 4 --> 6.95397377 sigma out 15 active outputs RANK 6 NODE 8 --> 4.78863621 sigma out 15 active outputs RANK 7 NODE 6 --> 4.36550045 sigma out 15 active outputs RANK 8 NODE 5 --> 4.07864952 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 6 --> 42.2631989 sigma in 8act. ( 39.2397804 sig out 1act.) RANK 2 NODE 14 --> 18.7642174 sigma in 8act. ( 17.3499126 sig out 1act.) RANK 3 NODE 5 --> 7.6050806 sigma in 8act. ( 6.73137999 sig out 1act.) RANK 4 NODE 10 --> 7.27048016 sigma in 8act. ( 7.26571798 sig out 1act.) RANK 5 NODE 12 --> 6.43948698 sigma in 8act. ( 5.41317606 sig out 1act.) RANK 6 NODE 2 --> 3.35997176 sigma in 8act. ( 3.06151056 sig out 1act.) RANK 7 NODE 11 --> 3.31622243 sigma in 8act. ( 1.44064212 sig out 1act.) RANK 8 NODE 9 --> 2.70964313 sigma in 8act. ( 2.36586261 sig out 1act.) RANK 9 NODE 8 --> 2.66659069 sigma in 8act. ( 2.04808474 sig out 1act.) RANK 10 NODE 1 --> 1.41883683 sigma in 8act. ( 1.26198733 sig out 1act.) RANK 11 NODE 4 --> 1.21209824 sigma in 8act. ( 0.421028286 sig out 1act.) RANK 12 NODE 15 --> 1.19692576 sigma in 8act. ( 1.26215613 sig out 1act.) RANK 13 NODE 7 --> 1.13528919 sigma in 8act. ( 0.915876806 sig out 1act.) RANK 14 NODE 13 --> 0.575793087 sigma in 8act. ( 0.134199649 sig out 1act.) RANK 15 NODE 3 --> 0.555036247 sigma in 8act. ( 0.341195226 sig out 1act.) sorted by output significance RANK 1 NODE 6 --> 39.2397804 sigma out 1act.( 42.2631989 sig in 8act.) RANK 2 NODE 14 --> 17.3499126 sigma out 1act.( 18.7642174 sig in 8act.) RANK 3 NODE 10 --> 7.26571798 sigma out 1act.( 7.27048016 sig in 8act.) RANK 4 NODE 5 --> 6.73137999 sigma out 1act.( 7.6050806 sig in 8act.) RANK 5 NODE 12 --> 5.41317606 sigma out 1act.( 6.43948698 sig in 8act.) RANK 6 NODE 2 --> 3.06151056 sigma out 1act.( 3.35997176 sig in 8act.) RANK 7 NODE 9 --> 2.36586261 sigma out 1act.( 2.70964313 sig in 8act.) RANK 8 NODE 8 --> 2.04808474 sigma out 1act.( 2.66659069 sig in 8act.) RANK 9 NODE 11 --> 1.44064212 sigma out 1act.( 3.31622243 sig in 8act.) RANK 10 NODE 15 --> 1.26215613 sigma out 1act.( 1.19692576 sig in 8act.) RANK 11 NODE 1 --> 1.26198733 sigma out 1act.( 1.41883683 sig in 8act.) RANK 12 NODE 7 --> 0.915876806 sigma out 1act.( 1.13528919 sig in 8act.) 
RANK 13 NODE 4 --> 0.421028286 sigma out 1act.( 1.21209824 sig in 8act.) RANK 14 NODE 3 --> 0.341195226 sigma out 1act.( 0.555036247 sig in 8act.) RANK 15 NODE 13 --> 0.134199649 sigma out 1act.( 0.575793087 sig in 8act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 44.6515465 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.463536918 *** contribution from regularisation: 0.00673819799 *** contribution from error: -0.470275104 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -9443.17392 -0.241493523 -11.3028221 EXIT FROM BFGS code FG_LNSRCH 0. -0.24139376 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.463520646 *** contribution from regularisation: 0.00674951868 *** contribution from error: -0.470270157 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9442.84258 -0.24139376 -11.4068575 EXIT FROM BFGS code FG_LNSRCH 0. -0.241393596 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.463549852 *** contribution from regularisation: 0.00672030775 *** contribution from error: -0.470270157 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9443.43743 -0.241393596 -11.4110785 EXIT FROM BFGS code FG_LNSRCH 0. -0.241393596 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.46359399 *** contribution from regularisation: 0.00667616818 *** contribution from error: -0.470270157 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9444.33664 -0.241393596 -11.3864994 EXIT FROM BFGS code FG_LNSRCH 0. -0.241393596 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.463507533 *** contribution from regularisation: 0.00676260982 *** contribution from error: -0.470270157 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -9442.57566 -0.241393596 -11.3605375 EXIT FROM BFGS code NEW_X -9442.57566 -0.241393596 -11.3605375 ENTER BFGS code NEW_X -9442.57566 -0.241393596 -11.3605375 EXIT FROM BFGS code CONVERGENC -9442.57566 -0.241393596 -11.3605375 --------------------------------------------------- Iteration : 250 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 42.5944939 sigma out 15 active outputs RANK 2 NODE 3 --> 39.3245773 sigma out 15 active outputs RANK 3 NODE 7 --> 36.5653877 sigma out 15 active outputs RANK 4 NODE 2 --> 14.8580456 sigma out 15 active outputs RANK 5 NODE 4 --> 10.6219883 sigma out 15 active outputs RANK 6 NODE 8 --> 6.91666412 sigma out 15 active outputs RANK 7 NODE 6 --> 6.22976732 sigma out 15 active outputs RANK 8 NODE 5 --> 5.66087246 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 6 --> 62.908741 sigma in 8act. ( 61.6301994 sig out 1act.) 
RANK 2 NODE 14 --> 28.4738064 sigma in 8act. ( 26.0177689 sig out 1act.) RANK 3 NODE 5 --> 11.0032911 sigma in 8act. ( 9.89676571 sig out 1act.) RANK 4 NODE 10 --> 10.5630579 sigma in 8act. ( 10.912591 sig out 1act.) RANK 5 NODE 12 --> 8.82208157 sigma in 8act. ( 8.4359045 sig out 1act.) RANK 6 NODE 2 --> 4.64552975 sigma in 8act. ( 4.53136015 sig out 1act.) RANK 7 NODE 9 --> 3.76977873 sigma in 8act. ( 3.39363718 sig out 1act.) RANK 8 NODE 11 --> 3.69321442 sigma in 8act. ( 2.1435492 sig out 1act.) RANK 9 NODE 8 --> 3.40022874 sigma in 8act. ( 3.02158856 sig out 1act.) RANK 10 NODE 1 --> 1.83543527 sigma in 8act. ( 1.69364977 sig out 1act.) RANK 11 NODE 15 --> 1.60136521 sigma in 8act. ( 1.66069317 sig out 1act.) RANK 12 NODE 7 --> 1.41459978 sigma in 8act. ( 1.23269737 sig out 1act.) RANK 13 NODE 4 --> 1.28519404 sigma in 8act. ( 0.57844311 sig out 1act.) RANK 14 NODE 3 --> 0.595305741 sigma in 8act. ( 0.395141631 sig out 1act.) RANK 15 NODE 13 --> 0.586080968 sigma in 8act. ( 0.165847734 sig out 1act.) sorted by output significance RANK 1 NODE 6 --> 61.6301994 sigma out 1act.( 62.908741 sig in 8act.) RANK 2 NODE 14 --> 26.0177689 sigma out 1act.( 28.4738064 sig in 8act.) RANK 3 NODE 10 --> 10.912591 sigma out 1act.( 10.5630579 sig in 8act.) RANK 4 NODE 5 --> 9.89676571 sigma out 1act.( 11.0032911 sig in 8act.) RANK 5 NODE 12 --> 8.4359045 sigma out 1act.( 8.82208157 sig in 8act.) RANK 6 NODE 2 --> 4.53136015 sigma out 1act.( 4.64552975 sig in 8act.) RANK 7 NODE 9 --> 3.39363718 sigma out 1act.( 3.76977873 sig in 8act.) RANK 8 NODE 8 --> 3.02158856 sigma out 1act.( 3.40022874 sig in 8act.) RANK 9 NODE 11 --> 2.1435492 sigma out 1act.( 3.69321442 sig in 8act.) RANK 10 NODE 1 --> 1.69364977 sigma out 1act.( 1.83543527 sig in 8act.) RANK 11 NODE 15 --> 1.66069317 sigma out 1act.( 1.60136521 sig in 8act.) RANK 12 NODE 7 --> 1.23269737 sigma out 1act.( 1.41459978 sig in 8act.) RANK 13 NODE 4 --> 0.57844311 sigma out 1act.( 1.28519404 sig in 8act.) RANK 14 NODE 3 --> 0.395141631 sigma out 1act.( 0.595305741 sig in 8act.) RANK 15 NODE 13 --> 0.165847734 sigma out 1act.( 0.586080968 sig in 8act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 69.40345 sigma in 15 active inputs *********************************************** *** Learn Path 250 *** loss function: -0.463460535 *** contribution from regularisation: 0.00680962251 *** contribution from error: -0.470270157 *********************************************** -----------------> Test sample END OF LEARNING , export EXPERTISE SAVING EXPERTISE TO expert.nb NB_AHISTOUT: storage space 25767 Closing output file done
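Before training started, the Teacher ranked the 13 inputs and kept only the 7 most significant ones, which is consistent with the 8-15-1 topology (7 inputs plus one bias node, just as the original 13 inputs gave a 14-node input layer). The "ROUND n: MAX CORR ... AFTER KILLING INPUT VARIABLE k" lines describe a greedy backward elimination: in each round the variable whose removal costs the least total correlation to the target is dropped. NeuroBayes' exact significance measure is not given in the log; the sketch below uses the multiple correlation coefficient R = sqrt(c^T C^-1 c) as an illustrative stand-in, with hypothetical function names.

    import numpy as np

    def total_correlation(corr_xx, corr_xt, keep):
        """Multiple correlation of the kept inputs with the target: R = sqrt(c^T C^-1 c)."""
        idx = np.asarray(sorted(keep))
        C = corr_xx[np.ix_(idx, idx)]
        c = corr_xt[idx]
        return float(np.sqrt(c @ np.linalg.solve(C, c)))

    def backward_eliminate(corr_xx, corr_xt, n_keep):
        """Greedy pruning in the spirit of the 'AFTER KILLING INPUT VARIABLE' rounds:
        drop, in each round, the variable whose removal costs the least
        total correlation to the target."""
        keep = set(range(len(corr_xt)))
        while len(keep) > n_keep:
            best = max(keep, key=lambda k: total_correlation(corr_xx, corr_xt, keep - {k}))
            cost = (total_correlation(corr_xx, corr_xt, keep)
                    - total_correlation(corr_xx, corr_xt, keep - {best}))
            print(f"kill variable {best}: contribution {cost:.4f}")
            keep.remove(best)
        return sorted(keep)

In terms of the correlation matrix printed above, corr_xt would be row 1 (variable 1 is the target) and corr_xx the remaining 13x13 block; this reproduces the shape of the pruning rounds, though not necessarily the exact numbers, since the real NeuroBayes criterion may differ.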