NNInput NNInputs_130.root
Options for steering
  Constraint : lep1_E<400&&lep2_E<400&&
  HiLoSbString : SB
  SbString : Target
  WeightString : TrainWeight
  EqualizeSB : 0
  EvaluateVariables : 0
  SetNBProcessingDefault : 1
  UseNeuroBayes : 1
  WeightEvents : 1
  NBTreePrepEvPrint : 1
  NBTreePrepReportInterval : 10000
  NB_Iter : 250
  NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 199144 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 50802 nbkg = 148342
Bkg Entries: 148342
Sig Entries: 50802
Chosen entries: 50802
Signal fraction: 1
Background fraction: 0.342465
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 148342
Actual Signal Entries: 50802
Entries to split: 50802
Test with : 25401
Train with : 25401
*********************************************
* This product is licenced for educational  *
* and scientific use only. Commercial use   *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
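For orientation, the printed fractions and the even test/train split follow directly from the two selection counts above. A minimal sketch of that bookkeeping (the counts are copied from the log; how NNAna arrives at "Chosen entries" is not shown in this excerpt, so it is simply set to the signal yield):

```python
# Bookkeeping behind the CopyTree summary above; counts are copied from the log,
# the ROOT tree handling itself is not reproduced.
n_sig = 50802      # entries passing lep1_E<400&&lep2_E<400&&Target==1
n_bkg = 148342     # entries passing lep1_E<400&&lep2_E<400&&Target==0

chosen = n_sig                        # "Chosen entries: 50802" (the signal yield here)
sig_fraction = chosen / n_sig         # "Signal fraction: 1"
bkg_fraction = chosen / n_bkg         # "Background fraction: 0.342465"

n_test  = chosen // 2                 # "Test with : 25401"
n_train = chosen - n_test             # "Train with : 25401"

print(sig_fraction, round(bkg_fraction, 6), n_test, n_train)
```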
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 50802 for Signal Prepared event 0 for Signal with 50802 events ====Entry 0 Variable Ht : 151.435 Variable LepAPt : 41.3573 Variable LepBPt : 14.9662 Variable MetSigLeptonsJets : 5.83672 Variable MetSpec : 56.7846 Variable SumEtLeptonsJets : 94.6508 Variable VSumJetLeptonsPt : 42.3463 Variable addEt : 113.108 Variable dPhiLepSumMet : 2.38762 Variable dPhiLeptons : 0.421366 Variable dRLeptons : 0.581604 Variable lep1_E : 44.5356 Variable lep2_E : 14.9672 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2130 Ht = 151.435 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 41.3573 LepAPt = 41.3573 LepBEt = 14.9662 LepBPt = 14.9662 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 56.7846 MetDelPhi = 1.62286 MetSig = 5.01275 MetSigLeptonsJets = 5.83672 MetSpec = 56.7846 Mjj = 0 MostCentralJetEta = 2.41678 MtllMet = 115.132 Njets = 1 SB = 0 SumEt = 128.324 SumEtJets = 0 SumEtLeptonsJets = 94.6508 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 42.3463 addEt = 113.108 dPhiLepSumMet = 2.38762 dPhiLeptons = 0.421366 dRLeptons = 0.581604 diltype = 55 dimass = 14.4601 event = 2096 jet1_Et = 38.3273 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 44.5356 lep2_E = 14.9672 rand = 0.999742 run = 195245 weight = 2.27324e-06 ===Show End Prepared event 10000 for Signal with 50802 events Prepared event 20000 for Signal with 50802 events Prepared event 30000 for Signal with 50802 events Prepared event 40000 for Signal with 50802 events Prepared event 50000 for Signal with 50802 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable 
MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 148342 for Background Prepared event 0 for Background with 148342 events ====Entry 0 Variable Ht : 85.3408 Variable LepAPt : 31.1294 Variable LepBPt : 10.0664 Variable MetSigLeptonsJets : 6.87768 Variable MetSpec : 44.1437 Variable SumEtLeptonsJets : 41.1959 Variable VSumJetLeptonsPt : 41.0132 Variable addEt : 85.3408 Variable dPhiLepSumMet : 3.10536 Variable dPhiLeptons : 0.219342 Variable dRLeptons : 0.424454 Variable lep1_E : 32.3548 Variable lep2_E : 10.1027 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 0.311237 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168 ===Show End Prepared event 10000 for Background with 148342 events Prepared event 20000 for Background with 148342 events Prepared event 30000 for Background with 148342 events Prepared event 40000 for Background with 148342 events Prepared event 50000 for Background with 148342 events Prepared event 60000 for Background with 148342 events Prepared event 70000 for Background with 148342 events Prepared event 80000 for Background with 148342 events Prepared event 90000 for Background with 148342 events Prepared event 100000 for Background with 148342 events Prepared event 110000 for Background with 148342 events Prepared event 120000 for Background with 148342 events Prepared event 130000 for Background with 148342 events Prepared event 140000 for Background with 148342 events Warning: found 4738 negative weights. 
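The teacher output that follows first applies the global preprocessing (flag 812, reported as "option 12"): each input variable is equalised, i.e. rank-flattened and mapped onto a Gaussian with mean 0 and sigma 1, and the inputs are then linearly decorrelated. The long "Transdef" tables below are the per-variable percentile boundaries used by the flattening step. A minimal numpy sketch of that style of transformation (illustrative only, not the NeuroBayes implementation):

```python
import numpy as np
from scipy.special import erfinv

def equalise_to_gauss(x, ref):
    """Map values x to ~N(0,1) using the empirical CDF of a reference sample.

    'ref' plays the role of a Transdef table: its sorted values define the
    percentile boundaries of the flattening step.
    """
    ref_sorted = np.sort(ref)
    # empirical CDF in (0, 1); keep away from exactly 0 or 1 so erfinv stays finite
    u = (np.searchsorted(ref_sorted, x, side="right") + 0.5) / (len(ref_sorted) + 1.0)
    u = np.clip(u, 1e-6, 1.0 - 1e-6)
    return np.sqrt(2.0) * erfinv(2.0 * u - 1.0)   # flat -> standard Gaussian

def decorrelate(X):
    """Linearly decorrelate (whiten) the columns of X (rows = events)."""
    cov = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ evecs / np.sqrt(evals)

# toy usage: two correlated, non-Gaussian inputs
rng = np.random.default_rng(0)
raw = rng.exponential(size=(10000, 2))
raw[:, 1] += 0.5 * raw[:, 0]
gauss = np.column_stack([equalise_to_gauss(raw[:, i], raw[:, i]) for i in range(2)])
white = decorrelate(gauss)
print(np.round(np.cov(white, rowvar=False), 3))   # ~ identity matrix
```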
<< hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 199144 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4738 negative weights. Signal fraction: 62.9086685 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 60.0019798 63.3349876 65.4249954 66.8744965 67.9516907 68.8907776 70.0139008 70.8790054 71.6104126 72.4629669 73.2438965 73.921814 74.5458069 75.2874908 75.9094238 76.5512543 77.1406403 77.804657 78.3817596 78.8987579 79.4770966 80.124855 80.7112808 81.3287048 82.0888367 82.5921402 83.2410583 83.9653244 84.7609024 85.7569275 86.5706329 87.5555725 88.5926208 89.4794312 90.6679459 91.6540146 92.8270493 93.6948624 95.0587158 96.0405655 97.2216949 98.3003235 99.2032013 100.45414 101.591629 102.52317 103.491165 104.491959 105.580765 106.611526 107.686905 108.683891 109.715858 110.75914 111.860352 113.010178 114.018356 115.058899 116.102066 117.239983 118.294037 119.430984 120.256439 121.384865 122.462601 123.647293 124.836426 126.055344 127.425735 128.777161 130.350967 131.949371 133.598038 135.242432 136.965897 138.836792 140.703033 142.684845 144.651962 146.789642 149.168518 151.583847 153.877914 156.166458 158.825226 161.857971 165.001221 168.273102 171.747849 175.898911 180.27182 185.56778 191.34259 198.256439 205.783691 215.755569 227.342255 242.373138 266.094208 302.876343 742.657471 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.2841053 20.5896606 20.8636017 21.1219749 21.3403034 21.5691719 21.7955647 22.007164 22.1937389 22.3913956 22.5826416 22.7813301 22.9498825 23.1268005 23.312603 23.4981709 23.7005348 23.904911 24.1218758 24.3156414 24.4879913 24.6761971 24.8558254 25.0692062 25.2728386 25.4669647 25.6316662 25.7962837 25.9795952 26.1905937 26.382576 26.5709381 26.7748909 26.9654293 27.1401978 27.3187675 27.5376663 27.7230053 27.9346657 28.1687737 28.3557968 28.5697994 28.7562637 28.9282265 29.1377583 29.3525105 29.5984917 29.7743206 29.9763069 30.2119064 30.4648685 30.6640244 30.9261818 31.1691113 31.4212952 31.661335 31.8325348 32.0904007 32.3500443 32.6226883 32.9001694 33.1621094 33.4442139 33.7515068 34.0506287 34.3289108 34.5979538 34.8973007 35.2083435 35.5088425 35.8536224 36.2145844 36.6092758 36.9410629 37.3521194 37.7784767 38.1940613 38.6487732 39.135025 39.5028992 39.9842606 40.4964676 41.0970764 41.7408371 42.3749657 43.0246849 43.7818756 44.5561028 45.4191666 46.4160728 47.4552536 48.7007828 50.1237068 51.828907 53.9784775 
56.7627983 60.2506256 65.2943344 75.5177765 228.385986 ------------------------------ Transdef: Tab for variable 4 10.0010567 10.1084051 10.2411289 10.3605461 10.4553261 10.5482044 10.6593571 10.7608967 10.8826265 10.9924908 11.1181564 11.2436419 11.3704681 11.4936647 11.6161289 11.7526226 11.8762836 11.9888315 12.1099005 12.2219658 12.3501711 12.4743233 12.5964718 12.7308064 12.8504009 12.9796524 13.1317215 13.2651215 13.4035463 13.5290327 13.6807518 13.8081284 13.9608955 14.1039543 14.2588291 14.4020691 14.5461693 14.7113419 14.8724098 15.0481462 15.2016029 15.3605366 15.4887772 15.6426096 15.8000889 15.9523258 16.1289406 16.3306465 16.5099792 16.6936226 16.8491669 17.0294151 17.2250423 17.4248562 17.6166954 17.8157463 18.0217857 18.2355137 18.4506912 18.6641197 18.8873138 19.1048031 19.3250465 19.5445633 19.7838879 20.0265236 20.2013302 20.4062805 20.6310043 20.8593521 21.0801048 21.2904167 21.5247097 21.7663574 22.0105915 22.2118263 22.4954433 22.7650471 23.0541077 23.3305759 23.6465397 23.9489765 24.2682343 24.579649 24.9395599 25.3282871 25.7348576 26.1298676 26.5338268 26.9916 27.5213394 28.0765667 28.6135406 29.3289547 30.1930885 31.165247 32.274971 33.8824387 36.8583374 41.8891029 95.7538834 ------------------------------ Transdef: Tab for variable 5 0.978936255 2.43014097 2.85164428 3.12596083 3.36400843 3.56794524 3.75835371 3.9359951 4.07847309 4.17966509 4.2966032 4.40418434 4.50343513 4.58592176 4.6622901 4.74774027 4.81582308 4.89284706 4.95946789 5.02585602 5.09096861 5.15133572 5.20970058 5.27002335 5.3279829 5.37145138 5.42275429 5.46811485 5.51782799 5.56322861 5.6117239 5.65152645 5.6987915 5.74508953 5.78951502 5.82525444 5.86308289 5.90429449 5.94574785 5.98237658 6.01833057 6.06228399 6.10426044 6.13924313 6.17900753 6.21802425 6.25589561 6.28149462 6.3194418 6.36289501 6.40261269 6.44127655 6.48232985 6.5181942 6.55539942 6.5953207 6.63314152 6.66810989 6.70819473 6.74737024 6.78649282 6.82858372 6.87319756 6.91231489 6.96024036 6.99917889 7.04395771 7.08471346 7.1310358 7.17626381 7.22281551 7.26389313 7.3078475 7.35724783 7.40694046 7.44997168 7.48769379 7.53718185 7.58734894 7.64277363 7.70215225 7.76083851 7.818573 7.8787241 7.94344521 8.00708199 8.07544708 8.14328861 8.21084881 8.28936672 8.37945843 8.47087479 8.56860352 8.7003231 8.82748508 8.97991943 9.19649506 9.46485519 9.86724281 10.5919065 18.5847588 ------------------------------ Transdef: Tab for variable 6 15.0048428 19.7500572 24.3549118 25.8699303 26.832201 27.619091 28.394762 29.1971703 29.9463081 30.6179199 31.1572723 31.665659 32.1737595 32.6683807 33.1166153 33.5866127 34.0643959 34.4534149 34.7656746 35.1362152 35.4763489 35.8612518 36.1715012 36.4965935 36.8344116 37.1807022 37.5168991 37.8493462 38.2025909 38.4964371 38.8370743 39.1957169 39.535881 39.8959274 40.2278976 40.4986115 40.810463 41.1507339 41.4875183 41.8424377 42.231102 42.5752792 42.9200592 43.2903442 43.7201958 44.0286522 44.4522324 44.8770676 45.248558 45.6569672 46.1424599 46.6376648 47.1527519 47.5848198 48.1037064 48.5372505 48.9989548 49.4552994 49.950119 50.4270172 50.8793793 51.395134 51.8987656 52.4089928 52.9137115 53.4491539 53.9686852 54.4648132 55.0042191 55.508461 56.0003967 56.5094833 56.9745483 57.4504623 57.9960709 58.5361176 59.0472794 59.638504 60.2261887 60.8574295 61.5072365 62.1159668 62.8261185 63.5420532 64.2660675 65.0584259 65.8460541 66.7900543 67.9156342 69.0884781 70.3236542 71.7902679 73.3541107 75.4092407 77.6249237 80.542923 83.6114044 87.7497864 94.4335327 106.838287 280.646332 
------------------------------ Transdef: Tab for variable 7 30.1476021 32.6088486 33.5450668 34.2376595 34.8029556 35.281601 35.7198181 36.108284 36.5285835 36.9185486 37.2694473 37.5932922 37.8930893 38.1853485 38.4729271 38.7726402 39.0557022 39.3472862 39.6659012 39.949913 40.2398338 40.5281601 40.8995132 41.2151718 41.516449 41.8884659 42.2510834 42.6754112 43.0091629 43.4230995 43.8286972 44.3966904 44.8692474 45.3811417 45.9090958 46.4788513 47.0896835 47.7787018 48.3388672 49.0207062 49.5267334 50.2044373 50.9097862 51.5367851 52.187233 52.8673286 53.466713 54.1524277 54.8083496 55.4373016 56.0477905 56.6917992 57.3096199 57.9137802 58.5851898 59.1740646 59.8346062 60.5616226 61.1815758 61.9118614 62.7404709 63.6121712 64.4204254 65.4429398 66.3373871 67.4704666 68.6599884 69.8356781 70.9965057 72.3191299 73.6564178 75.0171509 76.5754395 77.9988556 79.4149933 81.0314941 82.6101074 84.3443909 86.1083832 88.0243683 90.0429001 92.0651703 94.0479355 96.3216705 98.7460327 101.284042 104.013214 107.091568 110.588089 114.13372 118.084015 122.687378 127.665131 133.25589 138.781464 146.266785 156.357315 168.484344 187.768082 217.666153 580.960083 ------------------------------ Transdef: Tab for variable 8 2.13245893 23.0376396 27.8864059 30.1730881 31.3739433 32.2204132 32.885334 33.3378563 33.8354111 34.2063522 34.5871506 34.9702225 35.2831345 35.6143112 35.9770126 36.2917366 36.6022873 36.8717041 37.1090355 37.3388519 37.6007309 37.8410568 38.1165352 38.3441391 38.5897293 38.8503265 39.1215706 39.3600044 39.6029816 39.8626022 40.1220856 40.4431229 40.7243767 41.0202217 41.2836304 41.6030426 41.8848648 42.2486115 42.5811615 42.8849297 43.1978111 43.5234375 43.8907013 44.2806664 44.6399231 45.020256 45.4294815 45.8907394 46.3493118 46.7957382 47.2531967 47.6759415 48.1279144 48.5005951 48.9880104 49.4301147 49.9488754 50.4369812 50.8640289 51.3396912 51.8079376 52.3014336 52.7753906 53.2361526 53.6626129 54.1107178 54.5800514 55.0300598 55.5620499 56.1003571 56.5997849 57.0363159 57.5509987 58.025959 58.5551949 59.0950928 59.6626511 60.2297707 60.8687668 61.3054428 61.9582062 62.6671066 63.3971863 64.1487732 64.946991 65.7975998 66.774353 67.7630615 68.9486847 70.2529373 71.6733704 73.4574585 75.3813782 77.5681305 80.0211639 83.1252594 86.6659698 91.385437 99.3100204 114.45195 323.660004 ------------------------------ Transdef: Tab for variable 9 48.699276 62.8490982 64.7328568 66.2652283 67.3769531 68.2750092 69.1035843 70.0565491 70.8613434 71.5558472 72.3306427 73.0356598 73.609787 74.2086487 74.8865662 75.4333038 75.9354248 76.5161819 77.0454712 77.5916595 78.1375427 78.6026917 78.9947128 79.5121613 80.0397263 80.6372375 81.0783081 81.6790161 82.187088 82.693634 83.2512283 83.8343735 84.4954376 85.2008667 85.9417343 86.6053925 87.3995285 88.1442261 88.8059769 89.602829 90.4693527 91.2963715 91.9984741 92.8398819 93.5635071 94.481636 95.4007111 96.2158966 97.1416016 97.9985962 98.662796 99.4356766 100.361626 101.199593 102.010071 102.763573 103.547928 104.353531 105.129028 105.908203 106.756241 107.587143 108.395493 109.174477 109.942596 110.74276 111.520615 112.342072 113.171814 113.930084 114.708618 115.495003 116.220566 117.031708 117.850372 118.58651 119.422302 120.033691 120.821945 121.673828 122.464966 123.369926 124.258041 125.186455 126.116165 127.208267 128.283707 129.592621 130.9711 132.188416 133.685059 135.421631 137.561371 140.251282 143.176926 146.877197 152.247772 158.63382 169.939392 193.264526 582.80835 ------------------------------ Transdef: Tab for variable 10 
0.0050833188 0.937811553 1.19996178 1.40118432 1.56647992 1.69893026 1.81071591 1.90685511 1.99162197 2.061759 2.12803268 2.18901849 2.24617243 2.29577255 2.33861446 2.3830049 2.42007041 2.45587373 2.48903561 2.51994228 2.54826307 2.57668114 2.60455132 2.62822151 2.65262079 2.67470932 2.69745588 2.7174964 2.73718095 2.75638199 2.7726059 2.78851128 2.80315638 2.81760979 2.83073759 2.84322834 2.85495353 2.8657527 2.87610722 2.88587594 2.89453602 2.90323806 2.91230583 2.91929579 2.92629266 2.93343306 2.9417963 2.94951916 2.95655775 2.96300602 2.96910691 2.97487903 2.98045063 2.98581839 2.99163151 2.99641919 3.00148392 3.00607872 3.01105452 3.0157125 3.0205512 3.02469444 3.02853966 3.03255582 3.03621149 3.04061937 3.04388857 3.0480144 3.05180717 3.05551291 3.05898213 3.06208253 3.06541491 3.06820726 3.07142687 3.07460976 3.0774498 3.08050442 3.08328366 3.0863483 3.08953714 3.09240055 3.09520411 3.09861517 3.10154104 3.1043458 3.10691547 3.10984039 3.11265326 3.11534691 3.11765695 3.11982751 3.1224649 3.12499952 3.1276207 3.12964129 3.13169765 3.13386059 3.13648415 3.13914728 3.1415906 ------------------------------ Transdef: Tab for variable 11 2.86871432E-06 0.00581956329 0.0112768337 0.0178001225 0.0238749981 0.0307887793 0.0385220051 0.0458397865 0.0523679256 0.0599458218 0.0657962561 0.0726214647 0.0790892541 0.0854943991 0.0921019316 0.0976393074 0.104291677 0.108772039 0.115011707 0.121500947 0.127302498 0.133302987 0.139168322 0.145061478 0.151421368 0.156633005 0.162103891 0.167865872 0.173692524 0.179436445 0.184423313 0.190300167 0.196344018 0.201585412 0.207971692 0.213722527 0.219708323 0.225494146 0.231624857 0.237556458 0.243892193 0.249745429 0.255498648 0.261685491 0.267038345 0.27326256 0.27922225 0.285502195 0.292014837 0.297796428 0.303708732 0.310037076 0.316187769 0.323095322 0.329354048 0.336076319 0.34210211 0.348244905 0.354334235 0.359945476 0.366196841 0.373157382 0.379155755 0.3857207 0.391548634 0.398786724 0.404690981 0.410819888 0.417030573 0.423354506 0.429695964 0.436225891 0.442897975 0.449888349 0.456986457 0.463544309 0.470987022 0.476993203 0.484243155 0.492792726 0.500696778 0.50934279 0.517318726 0.526144505 0.536072314 0.545644283 0.556331873 0.566966414 0.577879071 0.589246154 0.602113187 0.614085317 0.630734921 0.6482023 0.66478318 0.685085058 0.710279107 0.740219295 0.781865001 0.844407678 1.14136529 ------------------------------ Transdef: Tab for variable 12 0.100060925 0.125538856 0.144699529 0.159535468 0.17145142 0.185256928 0.198086247 0.206965715 0.217053443 0.227396309 0.237558424 0.246988654 0.25547725 0.264489979 0.272748351 0.280899316 0.288257122 0.296524704 0.305768609 0.313357323 0.321150273 0.328435123 0.336194515 0.343693972 0.351298153 0.35915938 0.366596639 0.373250067 0.379596204 0.385930777 0.391893864 0.398008168 0.403245509 0.408508539 0.413157046 0.417692065 0.422050357 0.425877333 0.430168748 0.4349958 0.43898952 0.443637848 0.447915435 0.452513337 0.456887722 0.460849643 0.465283573 0.46963191 0.473451793 0.477343589 0.481446028 0.485584348 0.489849806 0.493746221 0.497891247 0.502459884 0.507448971 0.511577129 0.516174018 0.521139681 0.525344968 0.52978456 0.534495473 0.539691806 0.544393897 0.549920261 0.555325627 0.559958458 0.565266132 0.570376396 0.576063454 0.580486596 0.586469769 0.591154873 0.596596599 0.6024611 0.608478844 0.61584276 0.622646689 0.629563391 0.636632681 0.644097745 0.651933372 0.660391212 0.667590678 0.676534235 0.684080958 0.693040848 0.702447891 0.713225901 0.723963141 0.736229599 0.748130441 
0.759485066 0.775308609 0.791949272 0.809129834 0.833853364 0.862804174 0.90791893 1.14755547 ------------------------------ Transdef: Tab for variable 13 20.0080433 21.3642197 22.1225815 22.6668701 23.1577988 23.6394081 24.0345688 24.4819412 24.8917179 25.2680969 25.619976 25.9804745 26.3476467 26.6962662 27.0652237 27.3585472 27.647644 27.9545383 28.2707062 28.5719109 28.8594017 29.1692486 29.4766579 29.7535896 30.0768814 30.3592491 30.6177139 30.9057846 31.1179008 31.3954601 31.68787 31.9739246 32.2717667 32.5115051 32.7805786 33.0584793 33.3537483 33.6363449 33.949707 34.2907753 34.5759697 34.8625221 35.1678314 35.4751244 35.8005066 36.0985107 36.3842773 36.7451172 37.0898895 37.4142456 37.7331429 38.0666122 38.4201393 38.7952118 39.1738815 39.5734711 39.9691391 40.3816147 40.8134117 41.2706795 41.650238 42.1373215 42.6097679 42.9844551 43.3942757 43.8692818 44.348774 44.8150368 45.3603821 45.98349 46.5477905 47.1110458 47.721756 48.3339615 49.0409393 49.717411 50.4023933 51.1740646 51.9880753 52.8366852 53.8011703 54.8171997 55.8308258 56.8857994 58.0074005 59.1496811 60.3124046 61.6704102 63.0825768 64.6068573 66.309166 68.2897186 70.3955765 73.0594482 76.2575226 79.5503235 84.0924835 87.9705429 93.6493835 105.384872 381.504089 ------------------------------ Transdef: Tab for variable 14 10.0086575 10.6444969 10.9704418 11.3137445 11.5768852 11.8428802 12.1402359 12.3589478 12.591114 12.8468237 13.0948753 13.3100834 13.5376177 13.7143879 13.9327602 14.1522865 14.373518 14.5850792 14.8001499 15.0340366 15.2415733 15.4536762 15.6647873 15.8791866 16.0804577 16.2838554 16.5079613 16.7178535 16.9375706 17.1407681 17.380703 17.6159267 17.8140602 18.0136757 18.2275581 18.4672165 18.6603241 18.8805885 19.1189613 19.3194599 19.5505161 19.7963257 20.025959 20.2529716 20.4848785 20.6977615 20.945816 21.1854782 21.4171314 21.6596451 21.8582306 22.072876 22.3139839 22.5604668 22.7832127 23.0533218 23.2726555 23.5487022 23.8011436 24.0684776 24.3377743 24.6044044 24.8913574 25.1518402 25.4434185 25.7145767 25.9900436 26.3072014 26.6022739 26.9537544 27.2947083 27.6418381 27.9809742 28.3236542 28.6628952 29.0616379 29.4396591 29.8355904 30.2913704 30.7403584 31.2137108 31.7039146 32.2342987 32.8145523 33.4332809 33.927124 34.6226959 35.3580933 36.2323303 37.0984344 38.0923462 39.0555611 40.2561493 41.538166 43.0907211 44.5778923 46.8839035 49.8620262 54.1430359 61.452137 127.598434 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 57.2 39.0 38.8 10.5 34.6 56.1 41.8 52.8 -30.4 -4.6 -12.1 9.6 13.0 2 57.2 100.0 63.8 53.4 18.0 59.4 94.2 67.0 90.5 -49.2 -18.1 -30.5 34.4 28.6 3 39.0 63.8 100.0 31.6 -4.4 25.2 67.4 48.4 69.8 -18.1 -18.8 -33.8 67.7 14.9 4 38.8 53.4 31.6 100.0 1.1 24.9 55.1 41.4 58.3 -14.6 -22.1 -40.1 15.0 72.1 5 10.5 18.0 -4.4 1.1 100.0 80.3 -9.5 44.7 43.8 30.1 -1.2 0.0 -10.1 -4.2 6 34.6 59.4 25.2 24.9 80.3 100.0 37.5 72.3 74.0 -3.6 -9.0 -14.0 8.0 9.9 7 56.1 94.2 67.4 55.1 -9.5 37.5 100.0 60.2 78.6 -56.0 -17.1 -30.3 38.9 31.3 8 41.8 67.0 48.4 41.4 44.7 72.3 60.2 100.0 77.7 -10.4 -17.7 -26.4 26.0 22.5 9 52.8 90.5 69.8 58.3 43.8 74.0 78.6 77.7 100.0 -24.0 -20.7 -36.0 39.5 32.8 10 -30.4 -49.2 -18.1 -14.6 30.1 -3.6 -56.0 -10.4 -24.0 100.0 2.2 4.7 -6.9 -5.5 11 -4.6 -18.1 -18.8 -22.1 -1.2 -9.0 -17.1 -17.7 -20.7 2.2 100.0 59.5 -7.4 -12.7 12 -12.1 -30.5 -33.8 -40.1 0.0 -14.0 -30.3 -26.4 -36.0 4.7 59.5 100.0 -12.9 -18.4 13 9.6 34.4 67.7 15.0 -10.1 8.0 38.9 26.0 39.5 -6.9 -7.4 -12.9 100.0 
41.6 14 13.0 28.6 14.9 72.1 -4.2 9.9 31.3 22.5 32.8 -5.5 -12.7 -18.4 41.6 100.0 TOTAL CORRELATION TO TARGET (diagonal) 128.969605 TOTAL CORRELATION OF ALL VARIABLES 62.8962716 ROUND 1: MAX CORR ( 62.8918991) AFTER KILLING INPUT VARIABLE 6 CONTR 0.74162432 ROUND 2: MAX CORR ( 62.8511487) AFTER KILLING INPUT VARIABLE 11 CONTR 2.26364194 ROUND 3: MAX CORR ( 62.800674) AFTER KILLING INPUT VARIABLE 8 CONTR 2.51838085 ROUND 4: MAX CORR ( 62.7402184) AFTER KILLING INPUT VARIABLE 2 CONTR 2.75493141 ROUND 5: MAX CORR ( 62.5896739) AFTER KILLING INPUT VARIABLE 9 CONTR 4.34369986 ROUND 6: MAX CORR ( 62.4022427) AFTER KILLING INPUT VARIABLE 14 CONTR 4.840184 ROUND 7: MAX CORR ( 62.1193815) AFTER KILLING INPUT VARIABLE 10 CONTR 5.93484094 ROUND 8: MAX CORR ( 61.1253777) AFTER KILLING INPUT VARIABLE 4 CONTR 11.0682318 ROUND 9: MAX CORR ( 60.5755421) AFTER KILLING INPUT VARIABLE 12 CONTR 8.18018969 ROUND 10: MAX CORR ( 59.5772227) AFTER KILLING INPUT VARIABLE 3 CONTR 10.9522071 ROUND 11: MAX CORR ( 58.3166546) AFTER KILLING INPUT VARIABLE 13 CONTR 12.1907035 ROUND 12: MAX CORR ( 56.0933485) AFTER KILLING INPUT VARIABLE 5 CONTR 15.948933 LAST REMAINING VARIABLE: 7 total correlation to target: 62.8962716 % total significance: 126.809565 sigma correlations of single variables to target: variable 2: 57.2301058 % , in sigma: 115.385613 variable 3: 38.9606027 % , in sigma: 78.5511919 variable 4: 38.7914722 % , in sigma: 78.210196 variable 5: 10.5394268 % , in sigma: 21.2492743 variable 6: 34.6064166 % , in sigma: 69.7724131 variable 7: 56.0933485 % , in sigma: 113.093717 variable 8: 41.7521317 % , in sigma: 84.1793884 variable 9: 52.7645647 % , in sigma: 106.382324 variable 10: -30.3914014 % , in sigma: 61.2742267 variable 11: -4.59575874 % , in sigma: 9.26583014 variable 12: -12.0916524 % , in sigma: 24.3788247 variable 13: 9.57552591 % , in sigma: 19.3058865 variable 14: 13.0392851 % , in sigma: 26.2894133 variables sorted by significance: 1 most relevant variable 7 corr 56.0933495 , in sigma: 113.093719 2 most relevant variable 5 corr 15.9489326 , in sigma: 32.1557569 3 most relevant variable 13 corr 12.1907034 , in sigma: 24.5785284 4 most relevant variable 3 corr 10.9522066 , in sigma: 22.0815086 5 most relevant variable 12 corr 8.18019009 , in sigma: 16.4926525 6 most relevant variable 4 corr 11.0682316 , in sigma: 22.3154346 7 most relevant variable 10 corr 5.93484116 , in sigma: 11.9656477 8 most relevant variable 14 corr 4.84018421 , in sigma: 9.75863339 9 most relevant variable 9 corr 4.34369993 , in sigma: 8.75763677 10 most relevant variable 2 corr 2.75493145 , in sigma: 5.55440968 11 most relevant variable 8 corr 2.51838088 , in sigma: 5.07748356 12 most relevant variable 11 corr 2.26364183 , in sigma: 4.56388638 13 most relevant variable 6 corr 0.741624296 , in sigma: 1.49524053 global correlations between input variables: variable 2: 99.0707412 % variable 3: 93.5251381 % variable 4: 91.6085581 % variable 5: 94.8653462 % variable 6: 93.7163464 % variable 7: 98.6013042 % variable 8: 86.515814 % variable 9: 98.7836263 % variable 10: 72.1789812 % variable 11: 60.2719314 % variable 12: 68.9520274 % variable 13: 86.3232602 % variable 14: 88.1004055 % significance loss when removing single variables: variable 2: corr = 3.53882995 % , sigma = 7.13488219 variable 3: corr = 11.6270025 % , sigma = 23.4420117 variable 4: corr = 12.4874065 % , sigma = 25.1767322 variable 5: corr = 7.25288561 % , sigma = 14.6230492 variable 6: corr = 0.74162432 % , sigma = 1.49524058 variable 7: corr = 4.86559374 % , sigma = 
9.80986331 variable 8: corr = 2.22753787 % , sigma = 4.49109465 variable 9: corr = 5.83381716 % , sigma = 11.7619662 variable 10: corr = 5.8195541 % , sigma = 11.7332094 variable 11: corr = 2.24367879 % , sigma = 4.52363749 variable 12: corr = 7.94753056 % , sigma = 16.0235714 variable 13: corr = 8.40328538 % , sigma = 16.9424505 variable 14: corr = 4.71885739 % , sigma = 9.51401791 Keep only 12 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 13 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 18.6744766 sigma out 15 active outputs RANK 2 NODE 13 --> 15.3480234 sigma out 15 active outputs RANK 3 NODE 2 --> 14.1069937 sigma out 15 active outputs RANK 4 NODE 5 --> 13.8387337 sigma out 15 active outputs RANK 5 NODE 4 --> 13.1593189 sigma out 15 active outputs RANK 6 NODE 12 --> 12.1143923 sigma out 15 active outputs RANK 7 NODE 9 --> 11.5967703 sigma out 15 active outputs RANK 8 NODE 6 --> 11.545331 sigma out 15 active outputs RANK 9 NODE 7 --> 10.7287788 sigma out 15 active outputs RANK 10 NODE 11 --> 9.44182205 sigma out 15 active outputs RANK 11 NODE 3 --> 8.69742012 sigma out 15 active outputs RANK 12 NODE 10 --> 6.60035801 sigma out 15 active outputs RANK 13 NODE 8 --> 6.13576984 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 24.9487438 sigma in 13act. ( 27.5225544 sig out 1act.) RANK 2 NODE 2 --> 20.0683327 sigma in 13act. ( 20.3034782 sig out 1act.) RANK 3 NODE 14 --> 15.1203527 sigma in 13act. ( 16.5007915 sig out 1act.) RANK 4 NODE 6 --> 14.4046965 sigma in 13act. ( 15.312211 sig out 1act.) RANK 5 NODE 9 --> 12.6766081 sigma in 13act. ( 11.6631184 sig out 1act.) RANK 6 NODE 12 --> 8.93308353 sigma in 13act. ( 8.49181366 sig out 1act.) RANK 7 NODE 1 --> 8.52784538 sigma in 13act. ( 7.14741564 sig out 1act.) RANK 8 NODE 3 --> 6.0821805 sigma in 13act. ( 1.1863246 sig out 1act.) RANK 9 NODE 10 --> 5.96078491 sigma in 13act. ( 5.93815231 sig out 1act.) RANK 10 NODE 15 --> 5.07527113 sigma in 13act. ( 5.89420128 sig out 1act.) RANK 11 NODE 8 --> 4.04228067 sigma in 13act. ( 5.55922651 sig out 1act.) RANK 12 NODE 5 --> 3.9147284 sigma in 13act. ( 5.09941244 sig out 1act.) RANK 13 NODE 4 --> 3.44080496 sigma in 13act. ( 1.6549269 sig out 1act.) RANK 14 NODE 11 --> 2.45116615 sigma in 13act. ( 3.98342919 sig out 1act.) RANK 15 NODE 13 --> 1.14911795 sigma in 13act. ( 0.455384403 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 27.5225544 sigma out 1act.( 24.9487438 sig in 13act.) RANK 2 NODE 2 --> 20.3034782 sigma out 1act.( 20.0683327 sig in 13act.) RANK 3 NODE 14 --> 16.5007915 sigma out 1act.( 15.1203527 sig in 13act.) RANK 4 NODE 6 --> 15.312211 sigma out 1act.( 14.4046965 sig in 13act.) RANK 5 NODE 9 --> 11.6631184 sigma out 1act.( 12.6766081 sig in 13act.) RANK 6 NODE 12 --> 8.49181366 sigma out 1act.( 8.93308353 sig in 13act.) RANK 7 NODE 1 --> 7.14741564 sigma out 1act.( 8.52784538 sig in 13act.) RANK 8 NODE 10 --> 5.93815231 sigma out 1act.( 5.96078491 sig in 13act.) RANK 9 NODE 15 --> 5.89420128 sigma out 1act.( 5.07527113 sig in 13act.) RANK 10 NODE 8 --> 5.55922651 sigma out 1act.( 4.04228067 sig in 13act.) RANK 11 NODE 5 --> 5.09941244 sigma out 1act.( 3.9147284 sig in 13act.) RANK 12 NODE 11 --> 3.98342919 sigma out 1act.( 2.45116615 sig in 13act.) 
RANK 13 NODE 4 --> 1.6549269 sigma out 1act.( 3.44080496 sig in 13act.) RANK 14 NODE 3 --> 1.1863246 sigma out 1act.( 6.0821805 sig in 13act.) RANK 15 NODE 13 --> 0.455384403 sigma out 1act.( 1.14911795 sig in 13act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 45.637928 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 20.5728149 sigma out 15 active outputs RANK 2 NODE 13 --> 17.9039803 sigma out 15 active outputs RANK 3 NODE 11 --> 16.4348259 sigma out 15 active outputs RANK 4 NODE 2 --> 16.0246582 sigma out 15 active outputs RANK 5 NODE 12 --> 15.9734144 sigma out 15 active outputs RANK 6 NODE 5 --> 14.83811 sigma out 15 active outputs RANK 7 NODE 4 --> 14.6519556 sigma out 15 active outputs RANK 8 NODE 6 --> 13.447258 sigma out 15 active outputs RANK 9 NODE 9 --> 12.9127092 sigma out 15 active outputs RANK 10 NODE 7 --> 12.5429935 sigma out 15 active outputs RANK 11 NODE 3 --> 10.1789103 sigma out 15 active outputs RANK 12 NODE 8 --> 7.78420734 sigma out 15 active outputs RANK 13 NODE 10 --> 7.31943226 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 28.5942993 sigma in 13act. ( 26.8573132 sig out 1act.) RANK 2 NODE 2 --> 19.3316917 sigma in 13act. ( 19.0214291 sig out 1act.) RANK 3 NODE 14 --> 17.0833473 sigma in 13act. ( 16.0496273 sig out 1act.) RANK 4 NODE 6 --> 15.4125681 sigma in 13act. ( 15.0742455 sig out 1act.) RANK 5 NODE 9 --> 12.5871572 sigma in 13act. ( 10.8001022 sig out 1act.) RANK 6 NODE 8 --> 11.4491501 sigma in 13act. ( 5.98864985 sig out 1act.) RANK 7 NODE 12 --> 10.293787 sigma in 13act. ( 8.11124992 sig out 1act.) RANK 8 NODE 11 --> 9.84312725 sigma in 13act. ( 6.32839155 sig out 1act.) RANK 9 NODE 5 --> 9.70567894 sigma in 13act. ( 5.92406559 sig out 1act.) RANK 10 NODE 15 --> 9.56915092 sigma in 13act. ( 6.15589571 sig out 1act.) RANK 11 NODE 1 --> 9.38841248 sigma in 13act. ( 6.55419493 sig out 1act.) RANK 12 NODE 4 --> 8.30699825 sigma in 13act. ( 1.35357189 sig out 1act.) RANK 13 NODE 10 --> 7.37838316 sigma in 13act. ( 5.98749495 sig out 1act.) RANK 14 NODE 3 --> 6.35616493 sigma in 13act. ( 0.541119099 sig out 1act.) RANK 15 NODE 13 --> 5.70030785 sigma in 13act. ( 0.691839755 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 26.8573132 sigma out 1act.( 28.5942993 sig in 13act.) RANK 2 NODE 2 --> 19.0214291 sigma out 1act.( 19.3316917 sig in 13act.) RANK 3 NODE 14 --> 16.0496273 sigma out 1act.( 17.0833473 sig in 13act.) RANK 4 NODE 6 --> 15.0742455 sigma out 1act.( 15.4125681 sig in 13act.) RANK 5 NODE 9 --> 10.8001022 sigma out 1act.( 12.5871572 sig in 13act.) RANK 6 NODE 12 --> 8.11124992 sigma out 1act.( 10.293787 sig in 13act.) RANK 7 NODE 1 --> 6.55419493 sigma out 1act.( 9.38841248 sig in 13act.) RANK 8 NODE 11 --> 6.32839155 sigma out 1act.( 9.84312725 sig in 13act.) RANK 9 NODE 15 --> 6.15589571 sigma out 1act.( 9.56915092 sig in 13act.) RANK 10 NODE 8 --> 5.98864985 sigma out 1act.( 11.4491501 sig in 13act.) RANK 11 NODE 10 --> 5.98749495 sigma out 1act.( 7.37838316 sig in 13act.) RANK 12 NODE 5 --> 5.92406559 sigma out 1act.( 9.70567894 sig in 13act.) RANK 13 NODE 4 --> 1.35357189 sigma out 1act.( 8.30699825 sig in 13act.) RANK 14 NODE 13 --> 0.691839755 sigma out 1act.( 5.70030785 sig in 13act.) RANK 15 NODE 3 --> 0.541119099 sigma out 1act.( 6.35616493 sig in 13act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 44.5050697 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.423729479 *** contribution from regularisation: 0.00418571476 *** contribution from error: -0.427915186 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.457306236 *** contribution from regularisation: 0.00225287559 *** contribution from error: -0.459559113 *********************************************** -----------------> Test sample ENTER BFGS code START -45544.8529 -0.0468212031 -1.07911444 EXIT FROM BFGS code FG_START 0. -0.0468212031 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.475607574 *** contribution from regularisation: 0.0021941259 *** contribution from error: -0.47780171 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -47357.1969 -0.0468212031 -240.208893 EXIT FROM BFGS code FG_LNSRCH 0. -0.088398546 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.511402786 *** contribution from regularisation: 0.00310674449 *** contribution from error: -0.514509559 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50921.398 -0.088398546 137.312302 EXIT FROM BFGS code NEW_X -50921.398 -0.088398546 137.312302 ENTER BFGS code NEW_X -50921.398 -0.088398546 137.312302 EXIT FROM BFGS code FG_LNSRCH 0. -0.0689320043 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.516970038 *** contribution from regularisation: 0.00333393575 *** contribution from error: -0.520303965 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51475.7414 -0.0689320043 41.9956665 EXIT FROM BFGS code NEW_X -51475.7414 -0.0689320043 41.9956665 ENTER BFGS code NEW_X -51475.7414 -0.0689320043 41.9956665 EXIT FROM BFGS code FG_LNSRCH 0. -0.0613955408 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.518380344 *** contribution from regularisation: 0.00322326762 *** contribution from error: -0.521603584 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51616.1686 -0.0613955408 14.4999447 EXIT FROM BFGS code NEW_X -51616.1686 -0.0613955408 14.4999447 ENTER BFGS code NEW_X -51616.1686 -0.0613955408 14.4999447 EXIT FROM BFGS code FG_LNSRCH 0. -0.0549865402 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.518782556 *** contribution from regularisation: 0.00307343178 *** contribution from error: -0.52185601 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51656.218 -0.0549865402 -2.24848509 EXIT FROM BFGS code NEW_X -51656.218 -0.0549865402 -2.24848509 ENTER BFGS code NEW_X -51656.218 -0.0549865402 -2.24848509 EXIT FROM BFGS code FG_LNSRCH 0. -0.0497082286 0. 
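Each "Learn Path" block above quotes the total loss together with its two pieces, the entropy error term on the training sample and the standard regularisation term, and the total is simply their sum. A quick check against the printed values (they agree up to the single-precision rounding of the printout):

```python
# Verify loss = error + regularisation for the Learn Path values printed above.
learn_paths = {
    1: (-0.423729479, 0.00418571476, -0.427915186),   # (total, regularisation, error)
    2: (-0.457306236, 0.00225287559, -0.459559113),
    7: (-0.518782556, 0.00307343178, -0.52185601),
}
for path, (total, reg, err) in learn_paths.items():
    print(path, total, err + reg, abs(total - (err + reg)) < 1e-6)
```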
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.519085824 *** contribution from regularisation: 0.00298602483 *** contribution from error: -0.522071838 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51686.4124 -0.0497082286 1.06900978 EXIT FROM BFGS code NEW_X -51686.4124 -0.0497082286 1.06900978 ENTER BFGS code NEW_X -51686.4124 -0.0497082286 1.06900978 EXIT FROM BFGS code FG_LNSRCH 0. 0.00257670553 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.520552456 *** contribution from regularisation: 0.0026587476 *** contribution from error: -0.523211181 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51832.4506 0.00257670553 -79.4810486 EXIT FROM BFGS code NEW_X -51832.4506 0.00257670553 -79.4810486 ENTER BFGS code NEW_X -51832.4506 0.00257670553 -79.4810486 EXIT FROM BFGS code FG_LNSRCH 0. 0.00833595637 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 11 --> 43.1118965 sigma out 15 active outputs RANK 2 NODE 1 --> 29.0461464 sigma out 15 active outputs RANK 3 NODE 3 --> 26.9496384 sigma out 15 active outputs RANK 4 NODE 13 --> 20.6108303 sigma out 15 active outputs RANK 5 NODE 8 --> 16.7561455 sigma out 15 active outputs RANK 6 NODE 9 --> 14.7061796 sigma out 15 active outputs RANK 7 NODE 12 --> 13.6241608 sigma out 15 active outputs RANK 8 NODE 2 --> 13.6215448 sigma out 15 active outputs RANK 9 NODE 6 --> 11.9067383 sigma out 15 active outputs RANK 10 NODE 4 --> 11.5751209 sigma out 15 active outputs RANK 11 NODE 5 --> 11.0220032 sigma out 15 active outputs RANK 12 NODE 10 --> 10.9559526 sigma out 15 active outputs RANK 13 NODE 7 --> 5.5446744 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 42.1497879 sigma in 13act. ( 40.3191719 sig out 1act.) RANK 2 NODE 7 --> 28.6356354 sigma in 13act. ( 28.4897404 sig out 1act.) RANK 3 NODE 2 --> 28.2808571 sigma in 13act. ( 32.2875404 sig out 1act.) RANK 4 NODE 11 --> 28.0363846 sigma in 13act. ( 27.0325375 sig out 1act.) RANK 5 NODE 14 --> 18.5282879 sigma in 13act. ( 17.7690239 sig out 1act.) RANK 6 NODE 8 --> 16.99967 sigma in 13act. ( 14.145565 sig out 1act.) RANK 7 NODE 6 --> 11.8516064 sigma in 13act. ( 9.83033276 sig out 1act.) RANK 8 NODE 1 --> 11.0606003 sigma in 13act. ( 10.4770823 sig out 1act.) RANK 9 NODE 9 --> 7.91678238 sigma in 13act. ( 6.66193295 sig out 1act.) RANK 10 NODE 15 --> 6.57089329 sigma in 13act. ( 4.74098301 sig out 1act.) RANK 11 NODE 13 --> 5.97198629 sigma in 13act. ( 5.2962718 sig out 1act.) RANK 12 NODE 3 --> 4.36046505 sigma in 13act. ( 3.02104187 sig out 1act.) RANK 13 NODE 4 --> 3.70948887 sigma in 13act. ( 0.855430663 sig out 1act.) RANK 14 NODE 10 --> 2.99068761 sigma in 13act. ( 1.29851186 sig out 1act.) RANK 15 NODE 12 --> 2.97556806 sigma in 13act. ( 1.24127567 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 40.3191719 sigma out 1act.( 42.1497879 sig in 13act.) RANK 2 NODE 2 --> 32.2875404 sigma out 1act.( 28.2808571 sig in 13act.) RANK 3 NODE 7 --> 28.4897404 sigma out 1act.( 28.6356354 sig in 13act.) RANK 4 NODE 11 --> 27.0325375 sigma out 1act.( 28.0363846 sig in 13act.) RANK 5 NODE 14 --> 17.7690239 sigma out 1act.( 18.5282879 sig in 13act.) 
RANK 6 NODE 8 --> 14.145565 sigma out 1act.( 16.99967 sig in 13act.) RANK 7 NODE 1 --> 10.4770823 sigma out 1act.( 11.0606003 sig in 13act.) RANK 8 NODE 6 --> 9.83033276 sigma out 1act.( 11.8516064 sig in 13act.) RANK 9 NODE 9 --> 6.66193295 sigma out 1act.( 7.91678238 sig in 13act.) RANK 10 NODE 13 --> 5.2962718 sigma out 1act.( 5.97198629 sig in 13act.) RANK 11 NODE 15 --> 4.74098301 sigma out 1act.( 6.57089329 sig in 13act.) RANK 12 NODE 3 --> 3.02104187 sigma out 1act.( 4.36046505 sig in 13act.) RANK 13 NODE 10 --> 1.29851186 sigma out 1act.( 2.99068761 sig in 13act.) RANK 14 NODE 12 --> 1.24127567 sigma out 1act.( 2.97556806 sig in 13act.) RANK 15 NODE 4 --> 0.855430663 sigma out 1act.( 3.70948887 sig in 13act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 70.998436 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.52159363 *** contribution from regularisation: 0.00321638351 *** contribution from error: -0.524810016 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -51936.1219 0.00833595637 51.6291428 EXIT FROM BFGS code NEW_X -51936.1219 0.00833595637 51.6291428 ENTER BFGS code NEW_X -51936.1219 0.00833595637 51.6291428 EXIT FROM BFGS code FG_LNSRCH 0. 0.0237424299 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.522983074 *** contribution from regularisation: 0.0031141073 *** contribution from error: -0.526097178 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52074.4705 0.0237424299 -122.595245 EXIT FROM BFGS code NEW_X -52074.4705 0.0237424299 -122.595245 ENTER BFGS code NEW_X -52074.4705 0.0237424299 -122.595245 EXIT FROM BFGS code FG_LNSRCH 0. -0.0128976405 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.523602903 *** contribution from regularisation: 0.00263051456 *** contribution from error: -0.526233435 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52136.188 -0.0128976405 -19.3051262 EXIT FROM BFGS code NEW_X -52136.188 -0.0128976405 -19.3051262 ENTER BFGS code NEW_X -52136.188 -0.0128976405 -19.3051262 EXIT FROM BFGS code FG_LNSRCH 0. -0.04153933 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.52376014 *** contribution from regularisation: 0.00294737797 *** contribution from error: -0.52670753 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52151.8446 -0.04153933 -46.2234116 EXIT FROM BFGS code NEW_X -52151.8446 -0.04153933 -46.2234116 ENTER BFGS code NEW_X -52151.8446 -0.04153933 -46.2234116 EXIT FROM BFGS code FG_LNSRCH 0. -0.0822657272 0. 
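The "SAVING EXPERTISE TO rescue.nb" message at iteration 10 above (repeated every 10 iterations further down) is a periodic checkpoint: the current network is written out so a long teacher run can be recovered if it is interrupted. A generic sketch of the same pattern; train_one_iteration and the dictionary "network" are runnable stand-ins, not part of the NeuroBayes API:

```python
import pickle

CHECKPOINT_EVERY = 10   # matches the "Iteration No: 10/20/30 ... rescue.nb" cadence
N_ITER = 34             # only the iterations shown in this excerpt

def train_one_iteration(net, it):
    # Hypothetical stand-in for one NeuroBayes learn path: it just records
    # the iteration number so the example runs end to end.
    net["iterations_done"] = it
    return net

def save_network(net, path):
    # Stand-in for "SAVING EXPERTISE TO rescue.nb": dump the current state.
    with open(path, "wb") as f:
        pickle.dump(net, f)

network = {"iterations_done": 0}
for it in range(1, N_ITER + 1):
    network = train_one_iteration(network, it)
    if it % CHECKPOINT_EVERY == 0:
        save_network(network, "rescue.nb")   # periodic checkpoint, overwritten each time
```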
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.524350941 *** contribution from regularisation: 0.0028998435 *** contribution from error: -0.527250767 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52210.6698 -0.0822657272 -21.686018 EXIT FROM BFGS code NEW_X -52210.6698 -0.0822657272 -21.686018 ENTER BFGS code NEW_X -52210.6698 -0.0822657272 -21.686018 EXIT FROM BFGS code FG_LNSRCH 0. -0.301917315 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.52603972 *** contribution from regularisation: 0.0034639996 *** contribution from error: -0.529503703 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52378.8281 -0.301917315 58.8953323 EXIT FROM BFGS code NEW_X -52378.8281 -0.301917315 58.8953323 ENTER BFGS code NEW_X -52378.8281 -0.301917315 58.8953323 EXIT FROM BFGS code FG_LNSRCH 0. -0.371885777 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.52676481 *** contribution from regularisation: 0.00330789061 *** contribution from error: -0.530072689 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52451.0241 -0.371885777 -24.8654041 EXIT FROM BFGS code FG_LNSRCH 0. -0.337713242 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.527479053 *** contribution from regularisation: 0.0028958011 *** contribution from error: -0.530374825 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52522.144 -0.337713242 23.8590069 EXIT FROM BFGS code NEW_X -52522.144 -0.337713242 23.8590069 ENTER BFGS code NEW_X -52522.144 -0.337713242 23.8590069 EXIT FROM BFGS code FG_LNSRCH 0. -0.345207214 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.527871251 *** contribution from regularisation: 0.00309769949 *** contribution from error: -0.530968964 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52561.1975 -0.345207214 25.2678509 EXIT FROM BFGS code NEW_X -52561.1975 -0.345207214 25.2678509 ENTER BFGS code NEW_X -52561.1975 -0.345207214 25.2678509 EXIT FROM BFGS code FG_LNSRCH 0. -0.341843575 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.528113842 *** contribution from regularisation: 0.00306669576 *** contribution from error: -0.531180561 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52585.3506 -0.341843575 24.2248554 EXIT FROM BFGS code NEW_X -52585.3506 -0.341843575 24.2248554 ENTER BFGS code NEW_X -52585.3506 -0.341843575 24.2248554 EXIT FROM BFGS code FG_LNSRCH 0. -0.325377613 0. 
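The ENTER/EXIT BFGS lines wrap the quasi-Newton minimiser that updates the weights: each iteration evaluates the loss and its gradient, performs a line search (FG_LNSRCH) and accepts a new point (NEW_X). The same optimisation pattern looks like this with scipy on a toy quadratic standing in for the network loss (illustrative only; NeuroBayes ships its own BFGS implementation):

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for the network loss: f(w) = sum((w - target)^2).
target = np.array([1.0, -2.0, 0.5])

def loss(w):
    return np.sum((w - target) ** 2)

def grad(w):
    return 2.0 * (w - target)

w0 = np.zeros_like(target)
res = minimize(loss, w0, jac=grad, method="BFGS",
               options={"maxiter": 250})   # cf. "You want to run 250 iterations"
print(res.x, res.nit, res.success)
```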
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 11 --> 60.0506401 sigma out 15 active outputs RANK 2 NODE 3 --> 43.7103271 sigma out 15 active outputs RANK 3 NODE 1 --> 35.7457161 sigma out 15 active outputs RANK 4 NODE 8 --> 22.9452667 sigma out 15 active outputs RANK 5 NODE 9 --> 22.8273983 sigma out 15 active outputs RANK 6 NODE 13 --> 21.0020447 sigma out 15 active outputs RANK 7 NODE 12 --> 20.3152523 sigma out 15 active outputs RANK 8 NODE 6 --> 20.0118275 sigma out 15 active outputs RANK 9 NODE 4 --> 17.1867752 sigma out 15 active outputs RANK 10 NODE 10 --> 16.9277344 sigma out 15 active outputs RANK 11 NODE 2 --> 16.2909412 sigma out 15 active outputs RANK 12 NODE 5 --> 15.8777409 sigma out 15 active outputs RANK 13 NODE 7 --> 12.3220463 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 65.9046783 sigma in 13act. ( 75.1153488 sig out 1act.) RANK 2 NODE 2 --> 44.5994301 sigma in 13act. ( 46.1775589 sig out 1act.) RANK 3 NODE 14 --> 35.4282722 sigma in 13act. ( 37.3962097 sig out 1act.) RANK 4 NODE 7 --> 34.1514359 sigma in 13act. ( 37.4612541 sig out 1act.) RANK 5 NODE 1 --> 23.5174465 sigma in 13act. ( 24.2597141 sig out 1act.) RANK 6 NODE 11 --> 21.7493649 sigma in 13act. ( 21.6033077 sig out 1act.) RANK 7 NODE 9 --> 17.5427361 sigma in 13act. ( 18.7192936 sig out 1act.) RANK 8 NODE 13 --> 8.37812424 sigma in 13act. ( 9.15371227 sig out 1act.) RANK 9 NODE 8 --> 8.29886341 sigma in 13act. ( 5.70492601 sig out 1act.) RANK 10 NODE 3 --> 8.28376675 sigma in 13act. ( 8.94915009 sig out 1act.) RANK 11 NODE 6 --> 5.32214117 sigma in 13act. ( 3.14948249 sig out 1act.) RANK 12 NODE 15 --> 3.39557147 sigma in 13act. ( 1.97182274 sig out 1act.) RANK 13 NODE 12 --> 2.52853465 sigma in 13act. ( 2.3088088 sig out 1act.) RANK 14 NODE 10 --> 1.71582401 sigma in 13act. ( 0.403485715 sig out 1act.) RANK 15 NODE 4 --> 1.66969085 sigma in 13act. ( 0.659744143 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 75.1153488 sigma out 1act.( 65.9046783 sig in 13act.) RANK 2 NODE 2 --> 46.1775589 sigma out 1act.( 44.5994301 sig in 13act.) RANK 3 NODE 7 --> 37.4612541 sigma out 1act.( 34.1514359 sig in 13act.) RANK 4 NODE 14 --> 37.3962097 sigma out 1act.( 35.4282722 sig in 13act.) RANK 5 NODE 1 --> 24.2597141 sigma out 1act.( 23.5174465 sig in 13act.) RANK 6 NODE 11 --> 21.6033077 sigma out 1act.( 21.7493649 sig in 13act.) RANK 7 NODE 9 --> 18.7192936 sigma out 1act.( 17.5427361 sig in 13act.) RANK 8 NODE 13 --> 9.15371227 sigma out 1act.( 8.37812424 sig in 13act.) RANK 9 NODE 3 --> 8.94915009 sigma out 1act.( 8.28376675 sig in 13act.) RANK 10 NODE 8 --> 5.70492601 sigma out 1act.( 8.29886341 sig in 13act.) RANK 11 NODE 6 --> 3.14948249 sigma out 1act.( 5.32214117 sig in 13act.) RANK 12 NODE 12 --> 2.3088088 sigma out 1act.( 2.52853465 sig in 13act.) RANK 13 NODE 15 --> 1.97182274 sigma out 1act.( 3.39557147 sig in 13act.) RANK 14 NODE 4 --> 0.659744143 sigma out 1act.( 1.66969085 sig in 13act.) RANK 15 NODE 10 --> 0.403485715 sigma out 1act.( 1.71582401 sig in 13act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 110.445969 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.528489828 *** contribution from regularisation: 0.00307253422 *** contribution from error: -0.531562388 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -52622.7886 -0.325377613 48.8288498 EXIT FROM BFGS code NEW_X -52622.7886 -0.325377613 48.8288498 ENTER BFGS code NEW_X -52622.7886 -0.325377613 48.8288498 EXIT FROM BFGS code FG_LNSRCH 0. -0.264949501 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.528835416 *** contribution from regularisation: 0.00310754683 *** contribution from error: -0.531942964 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52657.1989 -0.264949501 -45.0394135 EXIT FROM BFGS code NEW_X -52657.1989 -0.264949501 -45.0394135 ENTER BFGS code NEW_X -52657.1989 -0.264949501 -45.0394135 EXIT FROM BFGS code FG_LNSRCH 0. -0.247578338 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.529275954 *** contribution from regularisation: 0.00306747179 *** contribution from error: -0.532343447 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52701.066 -0.247578338 17.4639225 EXIT FROM BFGS code NEW_X -52701.066 -0.247578338 17.4639225 ENTER BFGS code NEW_X -52701.066 -0.247578338 17.4639225 EXIT FROM BFGS code FG_LNSRCH 0. -0.238906562 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.529408216 *** contribution from regularisation: 0.00304359244 *** contribution from error: -0.532451808 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52714.2344 -0.238906562 2.8049407 EXIT FROM BFGS code NEW_X -52714.2344 -0.238906562 2.8049407 ENTER BFGS code NEW_X -52714.2344 -0.238906562 2.8049407 EXIT FROM BFGS code FG_LNSRCH 0. -0.230701834 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.52948308 *** contribution from regularisation: 0.00303517585 *** contribution from error: -0.532518268 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52721.6901 -0.230701834 2.55590868 EXIT FROM BFGS code NEW_X -52721.6901 -0.230701834 2.55590868 ENTER BFGS code NEW_X -52721.6901 -0.230701834 2.55590868 EXIT FROM BFGS code FG_LNSRCH 0. -0.212173551 0. 
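After every learn path the teacher also evaluates the held-out sample (the "-----------------> Test sample" markers); the 25401/25401 train/test split set up at the start exists precisely so that the loss can be watched on events the fit never sees. The test-sample figures are not printed in this excerpt, but the monitoring idea is simply to track when the test loss stops improving, as in this sketch (training values are the first Learn Path losses from above, rounded; the test values are invented for illustration):

```python
# Sketch of overtraining monitoring on a held-out test sample.
# More negative = better, following the sign convention of the log above.
train_loss = [-0.424, -0.457, -0.476, -0.511, -0.517, -0.518, -0.519, -0.519]
test_loss  = [-0.420, -0.450, -0.468, -0.500, -0.505, -0.506, -0.506, -0.506]

best, best_it = float("inf"), None
for it, lt in enumerate(test_loss, start=1):
    if lt < best:
        best, best_it = lt, it

print(f"test loss last improved at iteration {best_it}")
if best_it < len(test_loss):
    print("later iterations improve only the training loss -> possible overtraining")
```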
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.529648423 *** contribution from regularisation: 0.00302838301 *** contribution from error: -0.532676816 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52738.1516 -0.212173551 -31.9264011 EXIT FROM BFGS code NEW_X -52738.1516 -0.212173551 -31.9264011 ENTER BFGS code NEW_X -52738.1516 -0.212173551 -31.9264011 EXIT FROM BFGS code FG_LNSRCH 0. -0.187058076 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.529944718 *** contribution from regularisation: 0.00304207066 *** contribution from error: -0.53298676 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52767.6565 -0.187058076 46.9128876 EXIT FROM BFGS code NEW_X -52767.6565 -0.187058076 46.9128876 ENTER BFGS code NEW_X -52767.6565 -0.187058076 46.9128876 EXIT FROM BFGS code FG_LNSRCH 0. -0.156692952 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.530140102 *** contribution from regularisation: 0.00306066032 *** contribution from error: -0.533200741 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52787.111 -0.156692952 -118.500305 EXIT FROM BFGS code NEW_X -52787.111 -0.156692952 -118.500305 ENTER BFGS code NEW_X -52787.111 -0.156692952 -118.500305 EXIT FROM BFGS code FG_LNSRCH 0. -0.173427284 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.530438244 *** contribution from regularisation: 0.00302056177 *** contribution from error: -0.533458829 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52816.7948 -0.173427284 -10.1888123 EXIT FROM BFGS code NEW_X -52816.7948 -0.173427284 -10.1888123 ENTER BFGS code NEW_X -52816.7948 -0.173427284 -10.1888123 EXIT FROM BFGS code FG_LNSRCH 0. -0.170700341 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.530499876 *** contribution from regularisation: 0.00305410894 *** contribution from error: -0.533553958 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52822.9359 -0.170700341 21.6134186 EXIT FROM BFGS code NEW_X -52822.9359 -0.170700341 21.6134186 ENTER BFGS code NEW_X -52822.9359 -0.170700341 21.6134186 EXIT FROM BFGS code FG_LNSRCH 0. -0.164104104 0. 
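Returning to the variable-selection report earlier in the log (the "ROUND n: MAX CORR ... AFTER KILLING INPUT VARIABLE ..." lines and the decision to keep only the 12 most significant inputs): that procedure is a greedy backward elimination, where each round drops the input whose removal costs the least total correlation to the target and records that cost as the variable's significance. A small sketch of the same idea, using the R^2 of a linear least-squares fit as a stand-in figure of merit rather than the NeuroBayes correlation measure:

```python
import numpy as np

def r2(X, y):
    """R^2 of a linear least-squares fit of y on the columns of X (plus intercept)."""
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

def backward_eliminate(X, y, keep):
    """Greedily drop the column whose removal costs the least R^2, until `keep` remain."""
    cols = list(range(X.shape[1]))
    while len(cols) > keep:
        base = r2(X[:, cols], y)
        losses = {c: base - r2(X[:, [k for k in cols if k != c]], y) for c in cols}
        victim = min(losses, key=losses.get)        # cheapest variable to kill
        print(f"killing column {victim}, significance ~ {losses[victim]:.4f}")
        cols.remove(victim)
    return cols

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 5))
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=5000)   # columns 1, 3, 4 are noise
print("kept:", backward_eliminate(X, y, keep=2))
```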
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 11 --> 75.4943237 sigma out 15 active outputs
RANK 2 NODE 3 --> 56.3032417 sigma out 15 active outputs
RANK 3 NODE 1 --> 47.2093925 sigma out 15 active outputs
RANK 4 NODE 13 --> 26.92626 sigma out 15 active outputs
RANK 5 NODE 4 --> 26.5520344 sigma out 15 active outputs
RANK 6 NODE 8 --> 25.6900692 sigma out 15 active outputs
RANK 7 NODE 10 --> 24.093153 sigma out 15 active outputs
RANK 8 NODE 9 --> 21.6146908 sigma out 15 active outputs
RANK 9 NODE 7 --> 18.479229 sigma out 15 active outputs
RANK 10 NODE 6 --> 18.36936 sigma out 15 active outputs
RANK 11 NODE 5 --> 16.9274178 sigma out 15 active outputs
RANK 12 NODE 2 --> 16.8615761 sigma out 15 active outputs
RANK 13 NODE 12 --> 14.7879896 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 5 --> 81.1098557 sigma in 13act. ( 104.061996 sig out 1act.)
RANK 2 NODE 14 --> 54.4888687 sigma in 13act. ( 61.771225 sig out 1act.)
RANK 3 NODE 2 --> 42.6452408 sigma in 13act. ( 48.7724991 sig out 1act.)
RANK 4 NODE 7 --> 38.8062553 sigma in 13act. ( 41.4256706 sig out 1act.)
RANK 5 NODE 1 --> 34.2618294 sigma in 13act. ( 36.3472176 sig out 1act.)
RANK 6 NODE 11 --> 26.3008862 sigma in 13act. ( 24.4775639 sig out 1act.)
RANK 7 NODE 9 --> 24.7914562 sigma in 13act. ( 24.997757 sig out 1act.)
RANK 8 NODE 3 --> 13.0204496 sigma in 13act. ( 14.8377142 sig out 1act.)
RANK 9 NODE 13 --> 10.9766407 sigma in 13act. ( 12.5115013 sig out 1act.)
RANK 10 NODE 8 --> 6.39604712 sigma in 13act. ( 4.7130022 sig out 1act.)
RANK 11 NODE 6 --> 2.93567133 sigma in 13act. ( 1.49293721 sig out 1act.)
RANK 12 NODE 12 --> 2.79780102 sigma in 13act. ( 2.80365491 sig out 1act.)
RANK 13 NODE 15 --> 1.55505192 sigma in 13act. ( 0.134236157 sig out 1act.)
RANK 14 NODE 4 --> 1.04425502 sigma in 13act. ( 0.582138717 sig out 1act.)
RANK 15 NODE 10 --> 0.904259026 sigma in 13act. ( 0.216207296 sig out 1act.)
sorted by output significance
RANK 1 NODE 5 --> 104.061996 sigma out 1act.( 81.1098557 sig in 13act.)
RANK 2 NODE 14 --> 61.771225 sigma out 1act.( 54.4888687 sig in 13act.)
RANK 3 NODE 2 --> 48.7724991 sigma out 1act.( 42.6452408 sig in 13act.)
RANK 4 NODE 7 --> 41.4256706 sigma out 1act.( 38.8062553 sig in 13act.)
RANK 5 NODE 1 --> 36.3472176 sigma out 1act.( 34.2618294 sig in 13act.)
RANK 6 NODE 9 --> 24.997757 sigma out 1act.( 24.7914562 sig in 13act.)
RANK 7 NODE 11 --> 24.4775639 sigma out 1act.( 26.3008862 sig in 13act.)
RANK 8 NODE 3 --> 14.8377142 sigma out 1act.( 13.0204496 sig in 13act.)
RANK 9 NODE 13 --> 12.5115013 sigma out 1act.( 10.9766407 sig in 13act.)
RANK 10 NODE 8 --> 4.7130022 sigma out 1act.( 6.39604712 sig in 13act.)
RANK 11 NODE 12 --> 2.80365491 sigma out 1act.( 2.79780102 sig in 13act.)
RANK 12 NODE 6 --> 1.49293721 sigma out 1act.( 2.93567133 sig in 13act.)
RANK 13 NODE 4 --> 0.582138717 sigma out 1act.( 1.04425502 sig in 13act.)
RANK 14 NODE 10 --> 0.216207296 sigma out 1act.( 0.904259026 sig in 13act.)
RANK 15 NODE 15 --> 0.134236157 sigma out 1act.( 1.55505192 sig in 13act.)
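
Every tenth iteration the teacher also prints these significance tables, ranking each node by how many standard deviations its input or output contribution stands above noise. The exact definition used by NeuroBayes is internal to the package; purely as an illustration of how such a sigma-ranked table can be produced, one could rank input variables by the significance of their linear correlation with the training target, roughly |corr|*sqrt(N):

# Rough analogue of a "significance in sigma" ranking: for each input,
# approximate the significance of its (linear) correlation with the target
# as |corr| * sqrt(N). This is NOT the NeuroBayes-internal definition,
# only an illustration of producing a ranked table like the one above.
import numpy as np

rng = np.random.default_rng(0)
N = 25000                                    # arbitrary example sample size
X = rng.normal(size=(N, 13))                 # 13 input variables, as in the tables
target = (X[:, 0] + 0.3 * X[:, 4] + rng.normal(size=N) > 0).astype(float)

sig = np.abs([np.corrcoef(X[:, i], target)[0, 1] for i in range(13)]) * np.sqrt(N)
for rank, i in enumerate(np.argsort(sig)[::-1], start=1):
    print(f"RANK {rank:2d} NODE {i + 1:2d} --> {sig[i]:10.4f} sigma")
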
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 147.288788 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.530593932
*** contribution from regularisation: 0.00305578881
*** contribution from error: -0.533649743
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -52832.2997 -0.164104104 48.5456543
EXIT FROM BFGS code NEW_X -52832.2997 -0.164104104 48.5456543
ENTER BFGS code NEW_X -52832.2997 -0.164104104 48.5456543
EXIT FROM BFGS code FG_LNSRCH 0. -0.120553665 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.530852795
*** contribution from regularisation: 0.00312018767
*** contribution from error: -0.533972979
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52858.0743 -0.120553665 27.4748116
EXIT FROM BFGS code NEW_X -52858.0743 -0.120553665 27.4748116
ENTER BFGS code NEW_X -52858.0743 -0.120553665 27.4748116
EXIT FROM BFGS code FG_LNSRCH 0. -0.0754545853 0.
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.530985773
*** contribution from regularisation: 0.00319535425
*** contribution from error: -0.534181118
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52871.3168 -0.0754545853 178.020233
EXIT FROM BFGS code NEW_X -52871.3168 -0.0754545853 178.020233
ENTER BFGS code NEW_X -52871.3168 -0.0754545853 178.020233
EXIT FROM BFGS code FG_LNSRCH 0. -0.0731333718 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.53123188
*** contribution from regularisation: 0.00313212187
*** contribution from error: -0.534363985
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52895.822 -0.0731333718 34.0802383
EXIT FROM BFGS code NEW_X -52895.822 -0.0731333718 34.0802383
ENTER BFGS code NEW_X -52895.822 -0.0731333718 34.0802383
EXIT FROM BFGS code FG_LNSRCH 0. -0.065994449 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.531286955
*** contribution from regularisation: 0.00312682684
*** contribution from error: -0.534413755
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52901.3033 -0.065994449 -4.05712557
EXIT FROM BFGS code NEW_X -52901.3033 -0.065994449 -4.05712557
ENTER BFGS code NEW_X -52901.3033 -0.065994449 -4.05712557
EXIT FROM BFGS code FG_LNSRCH 0. -0.0476388708 0.
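
At every tenth iteration the current network is also dumped to rescue.nb, so a long training job can be inspected or recovered if it stops before the end. A generic sketch of the same checkpointing pattern; save_network, the pickle file name and the commented-out training call are illustrative placeholders, not part of NeuroBayes:

# Minimal sketch of the periodic "rescue" checkpoint pattern: every
# CHECKPOINT_EVERY iterations the current network state is written out,
# so a long training run can be recovered or inspected mid-job.
import pickle

CHECKPOINT_EVERY = 10
RESCUE_FILE = "rescue.pkl"          # plays the role of rescue.nb

def save_network(state, path):
    with open(path, "wb") as f:
        pickle.dump(state, f)

state = {"weights": [0.0] * 15, "iteration": 0}
for it in range(1, 251):
    # train_one_iteration(state)    # one BFGS step would go here
    state["iteration"] = it
    if it % CHECKPOINT_EVERY == 0:
        save_network(state, RESCUE_FILE)
        print(f"Iteration No: {it}  -> wrote {RESCUE_FILE}")
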
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.531365335
*** contribution from regularisation: 0.00311934226
*** contribution from error: -0.534484684
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52909.1064 -0.0476388708 -37.5500107
EXIT FROM BFGS code NEW_X -52909.1064 -0.0476388708 -37.5500107
ENTER BFGS code NEW_X -52909.1064 -0.0476388708 -37.5500107
EXIT FROM BFGS code FG_LNSRCH 0. -0.0191491172 0.
---------------------------------------------------
Iteration : 36
***********************************************
*** Learn Path 36
*** loss function: -0.53147006
*** contribution from regularisation: 0.0030858526
*** contribution from error: -0.534555912
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52919.5344 -0.0191491172 -62.4432335
EXIT FROM BFGS code NEW_X -52919.5344 -0.0191491172 -62.4432335
ENTER BFGS code NEW_X -52919.5344 -0.0191491172 -62.4432335
EXIT FROM BFGS code FG_LNSRCH 0. 0.0111030443 0.
---------------------------------------------------
Iteration : 37
***********************************************
*** Learn Path 37
*** loss function: -0.531519234
*** contribution from regularisation: 0.00309648993
*** contribution from error: -0.534615695
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52924.4304 0.0111030443 12.579792
EXIT FROM BFGS code NEW_X -52924.4304 0.0111030443 12.579792
ENTER BFGS code NEW_X -52924.4304 0.0111030443 12.579792
EXIT FROM BFGS code FG_LNSRCH 0. 0.0182442702 0.
---------------------------------------------------
Iteration : 38
***********************************************
*** Learn Path 38
*** loss function: -0.531533241
*** contribution from regularisation: 0.00316774775
*** contribution from error: -0.53470099
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52925.8282 0.0182442702 -1.25680864
EXIT FROM BFGS code NEW_X -52925.8282 0.0182442702 -1.25680864
ENTER BFGS code NEW_X -52925.8282 0.0182442702 -1.25680864
EXIT FROM BFGS code FG_LNSRCH 0. 0.0255589243 0.
---------------------------------------------------
Iteration : 39
***********************************************
*** Learn Path 39
*** loss function: -0.531638205
*** contribution from regularisation: 0.00313492934
*** contribution from error: -0.534773111
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52936.2789 0.0255589243 21.084549
EXIT FROM BFGS code NEW_X -52936.2789 0.0255589243 21.084549
ENTER BFGS code NEW_X -52936.2789 0.0255589243 21.084549
EXIT FROM BFGS code FG_LNSRCH 0. 0.0545787998 0.
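
With one Learn Path block per iteration, the convergence behaviour is easiest to judge by extracting the loss values from a saved copy of this output and tabulating or plotting them. A small sketch, assuming the log has been captured to a text file; the filename training.log is arbitrary:

# Sketch: extract (Learn Path, loss, regularisation, error) from a saved
# copy of this training log, so the loss curve can be tabulated or plotted.
import re

pattern = re.compile(
    r"Learn Path\s+(\d+).*?loss function:\s+(-?[\d.]+)"
    r".*?regularisation:\s+(-?[\d.]+).*?error:\s+(-?[\d.]+)",
    re.S,
)

with open("training.log") as f:
    text = f.read()

for path, loss, reg, err in pattern.findall(text):
    print(f"Learn Path {path:>4}: loss={loss}  reg={reg}  err={err}")
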
---------------------------------------------------
Iteration : 40
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 11 --> 90.0335236 sigma out 15 active outputs
RANK 2 NODE 1 --> 68.0700989 sigma out 15 active outputs
RANK 3 NODE 3 --> 67.5209351 sigma out 15 active outputs
RANK 4 NODE 10 --> 31.0105305 sigma out 15 active outputs
RANK 5 NODE 8 --> 29.9537945 sigma out 15 active outputs
RANK 6 NODE 4 --> 29.4357262 sigma out 15 active outputs
RANK 7 NODE 13 --> 28.6370392 sigma out 15 active outputs
RANK 8 NODE 6 --> 27.0235977 sigma out 15 active outputs
RANK 9 NODE 7 --> 26.2703037 sigma out 15 active outputs
RANK 10 NODE 9 --> 24.8321323 sigma out 15 active outputs
RANK 11 NODE 2 --> 21.7802982 sigma out 15 active outputs
RANK 12 NODE 5 --> 20.9050617 sigma out 15 active outputs
RANK 13 NODE 12 --> 18.5963764 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 5 --> 98.1782379 sigma in 13act. ( 120.323982 sig out 1act.)
RANK 2 NODE 14 --> 69.9925308 sigma in 13act. ( 83.2802505 sig out 1act.)
RANK 3 NODE 2 --> 49.4407501 sigma in 13act. ( 55.8674431 sig out 1act.)
RANK 4 NODE 1 --> 47.9278145 sigma in 13act. ( 50.8654671 sig out 1act.)
RANK 5 NODE 7 --> 43.0185776 sigma in 13act. ( 44.5545044 sig out 1act.)
RANK 6 NODE 9 --> 36.3848076 sigma in 13act. ( 35.8475761 sig out 1act.)
RANK 7 NODE 11 --> 34.5184326 sigma in 13act. ( 33.2169495 sig out 1act.)
RANK 8 NODE 3 --> 15.2190304 sigma in 13act. ( 16.9843483 sig out 1act.)
RANK 9 NODE 13 --> 14.9021454 sigma in 13act. ( 16.4932003 sig out 1act.)
RANK 10 NODE 8 --> 4.90261745 sigma in 13act. ( 4.34014034 sig out 1act.)
RANK 11 NODE 12 --> 3.53969765 sigma in 13act. ( 3.6597662 sig out 1act.)
RANK 12 NODE 6 --> 1.2814765 sigma in 13act. ( 0.689236879 sig out 1act.)
RANK 13 NODE 15 --> 0.698224962 sigma in 13act. ( 0.253541648 sig out 1act.)
RANK 14 NODE 4 --> 0.695495963 sigma in 13act. ( 0.461634248 sig out 1act.)
RANK 15 NODE 10 --> 0.333877712 sigma in 13act. ( 0.246905997 sig out 1act.)
sorted by output significance
RANK 1 NODE 5 --> 120.323982 sigma out 1act.( 98.1782379 sig in 13act.)
RANK 2 NODE 14 --> 83.2802505 sigma out 1act.( 69.9925308 sig in 13act.)
RANK 3 NODE 2 --> 55.8674431 sigma out 1act.( 49.4407501 sig in 13act.)
RANK 4 NODE 1 --> 50.8654671 sigma out 1act.( 47.9278145 sig in 13act.)
RANK 5 NODE 7 --> 44.5545044 sigma out 1act.( 43.0185776 sig in 13act.)
RANK 6 NODE 9 --> 35.8475761 sigma out 1act.( 36.3848076 sig in 13act.)
RANK 7 NODE 11 --> 33.2169495 sigma out 1act.( 34.5184326 sig in 13act.)
RANK 8 NODE 3 --> 16.9843483 sigma out 1act.( 15.2190304 sig in 13act.)
RANK 9 NODE 13 --> 16.4932003 sigma out 1act.( 14.9021454 sig in 13act.)
RANK 10 NODE 8 --> 4.34014034 sigma out 1act.( 4.90261745 sig in 13act.)
RANK 11 NODE 12 --> 3.6597662 sigma out 1act.( 3.53969765 sig in 13act.)
RANK 12 NODE 6 --> 0.689236879 sigma out 1act.( 1.2814765 sig in 13act.)
RANK 13 NODE 4 --> 0.461634248 sigma out 1act.( 0.695495963 sig in 13act.)
RANK 14 NODE 15 --> 0.253541648 sigma out 1act.( 0.698224962 sig in 13act.)
RANK 15 NODE 10 --> 0.246905997 sigma out 1act.( 0.333877712 sig in 13act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 179.133972 sigma in 15 active inputs
***********************************************
*** Learn Path 40
*** loss function: -0.531736791
*** contribution from regularisation: 0.00310032326
*** contribution from error: -0.534837127
***********************************************
-----------------> Test sample
Iteration No: 40
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -52946.0939 0.0545787998 -22.4492779
EXIT FROM BFGS code NEW_X -52946.0939 0.0545787998 -22.4492779
ENTER BFGS code NEW_X -52946.0939 0.0545787998 -22.4492779
EXIT FROM BFGS code FG_LNSRCH 0. 0.100036748 0.
---------------------------------------------------
Iteration : 41
***********************************************
*** Learn Path 41
*** loss function: -0.53172797
*** contribution from regularisation: 0.00305516296
*** contribution from error: -0.534783125
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52945.2171 0.100036748 119.131996
EXIT FROM BFGS code FG_LNSRCH 0. 0.0728101507 0.
---------------------------------------------------
Iteration : 42
***********************************************
*** Learn Path 42
*** loss function: -0.531597257
*** contribution from regularisation: 0.00329714501
*** contribution from error: -0.534894407
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52932.2011 0.0728101507 35.4509239
EXIT FROM BFGS code FG_LNSRCH 0. 0.0565206222 0.
---------------------------------------------------
Iteration : 43
***********************************************
*** Learn Path 43
*** loss function: -0.531613946
*** contribution from regularisation: 0.003233633
*** contribution from error: -0.534847558
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52933.8629 0.0565206222 -16.3215809
EXIT FROM BFGS code FG_LNSRCH 0. 0.0546168201 0.
---------------------------------------------------
Iteration : 44
***********************************************
*** Learn Path 44
*** loss function: -0.531650305
*** contribution from regularisation: 0.00318672787
*** contribution from error: -0.534837008
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52937.4869 0.0546168201 -22.472702
EXIT FROM BFGS code FG_LNSRCH 0. 0.0545788221 0.
---------------------------------------------------
Iteration : 45
***********************************************
*** Learn Path 45
*** loss function: -0.531645954
*** contribution from regularisation: 0.00319112884
*** contribution from error: -0.534837067
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52937.0524 0.0545788221 -22.6301212
EXIT FROM BFGS code FG_LNSRCH 0. 0.0545787998 0.
---------------------------------------------------
Iteration : 46
***********************************************
*** Learn Path 46
*** loss function: -0.531652868
*** contribution from regularisation: 0.00318423868
*** contribution from error: -0.534837127
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52937.7383 0.0545787998 -22.6577644
EXIT FROM BFGS code FG_LNSRCH 0. 0.0545787998 0.
---------------------------------------------------
Iteration : 47
***********************************************
*** Learn Path 47
*** loss function: -0.531652033
*** contribution from regularisation: 0.00318508409
*** contribution from error: -0.534837127
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52937.6541 0.0545787998 -22.6853886
EXIT FROM BFGS code FG_LNSRCH 0. 0.0545787998 0.
---------------------------------------------------
Iteration : 48
***********************************************
*** Learn Path 48
*** loss function: -0.531654
*** contribution from regularisation: 0.00318309595
*** contribution from error: -0.534837067
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -52937.8521 0.0545787998 -22.7106171
EXIT FROM BFGS code NEW_X -52937.8521 0.0545787998 -22.7106171
ENTER BFGS code NEW_X -52937.8521 0.0545787998 -22.7106171
EXIT FROM BFGS code CONVERGENC -52937.8521 0.0545787998 -22.7106171
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 11 --> 137.199539 sigma out 15 active outputs
RANK 2 NODE 1 --> 105.536079 sigma out 15 active outputs
RANK 3 NODE 3 --> 102.706108 sigma out 15 active outputs
RANK 4 NODE 10 --> 48.1586571 sigma out 15 active outputs
RANK 5 NODE 4 --> 46.7158852 sigma out 15 active outputs
RANK 6 NODE 8 --> 46.1623955 sigma out 15 active outputs
RANK 7 NODE 13 --> 45.1055222 sigma out 15 active outputs
RANK 8 NODE 6 --> 41.9303932 sigma out 15 active outputs
RANK 9 NODE 7 --> 40.7673759 sigma out 15 active outputs
RANK 10 NODE 9 --> 38.4951591 sigma out 15 active outputs
RANK 11 NODE 2 --> 33.5750999 sigma out 15 active outputs
RANK 12 NODE 5 --> 32.0630074 sigma out 15 active outputs
RANK 13 NODE 12 --> 29.1363373 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 5 --> 149.477402 sigma in 13act. ( 187.44397 sig out 1act.)
RANK 2 NODE 14 --> 109.062088 sigma in 13act. ( 129.46701 sig out 1act.)
RANK 3 NODE 2 --> 76.4411392 sigma in 13act. ( 86.1890717 sig out 1act.)
RANK 4 NODE 1 --> 73.9976959 sigma in 13act. ( 79.566597 sig out 1act.)
RANK 5 NODE 7 --> 65.8593445 sigma in 13act. ( 69.5542831 sig out 1act.)
RANK 6 NODE 9 --> 56.5807381 sigma in 13act. ( 54.5622749 sig out 1act.)
RANK 7 NODE 11 --> 53.3123817 sigma in 13act. ( 51.4767494 sig out 1act.)
RANK 8 NODE 3 --> 23.3720951 sigma in 13act. ( 26.4391079 sig out 1act.)
RANK 9 NODE 13 --> 22.7935333 sigma in 13act. ( 25.7322807 sig out 1act.)
RANK 10 NODE 8 --> 7.06166124 sigma in 13act. ( 6.70787382 sig out 1act.)
RANK 11 NODE 12 --> 5.10217524 sigma in 13act. ( 5.69330692 sig out 1act.)
RANK 12 NODE 6 --> 1.54277039 sigma in 13act. ( 1.04615819 sig out 1act.)
RANK 13 NODE 4 --> 0.832330585 sigma in 13act. ( 0.694279492 sig out 1act.)
RANK 14 NODE 15 --> 0.759106398 sigma in 13act. ( 0.380057156 sig out 1act.)
RANK 15 NODE 10 --> 0.398345143 sigma in 13act. ( 0.315368533 sig out 1act.)
sorted by output significance
RANK 1 NODE 5 --> 187.44397 sigma out 1act.( 149.477402 sig in 13act.)
RANK 2 NODE 14 --> 129.46701 sigma out 1act.( 109.062088 sig in 13act.)
RANK 3 NODE 2 --> 86.1890717 sigma out 1act.( 76.4411392 sig in 13act.)
RANK 4 NODE 1 --> 79.566597 sigma out 1act.( 73.9976959 sig in 13act.)
RANK 5 NODE 7 --> 69.5542831 sigma out 1act.( 65.8593445 sig in 13act.)
RANK 6 NODE 9 --> 54.5622749 sigma out 1act.( 56.5807381 sig in 13act.)
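
Over Learn Paths 44-48 the line search keeps returning essentially the same point (the step parameter settles at 0.0545787998) and the loss moves only in the sixth decimal place, so the minimiser exits with CONVERGENC well before the requested 250 iterations; the final Learn Path 250 block then reports essentially the same loss, i.e. the converged network evaluated once more. A quick numerical check of that plateau, using loss values copied from the blocks above; the tolerance is illustrative, not the optimiser's internal criterion:

# Sketch: the relative change of the loss over the last few Learn Paths is
# tiny, which is consistent with the BFGS driver stopping with CONVERGENC
# long before the requested 250 iterations. Values copied from the blocks
# above; the tolerance is for illustration only.
losses = {
    44: -0.531650305,
    45: -0.531645954,
    46: -0.531652868,
    47: -0.531652033,
    48: -0.531654,
}
tol = 1e-4
vals = list(losses.values())
rel_change = max(abs(b - a) / abs(a) for a, b in zip(vals, vals[1:]))
print(f"max relative change over Learn Paths 44-48: {rel_change:.2e}")
print("below tolerance:", rel_change < tol)
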
RANK 7 NODE 11 --> 51.4767494 sigma out 1act.( 53.3123817 sig in 13act.)
RANK 8 NODE 3 --> 26.4391079 sigma out 1act.( 23.3720951 sig in 13act.)
RANK 9 NODE 13 --> 25.7322807 sigma out 1act.( 22.7935333 sig in 13act.)
RANK 10 NODE 8 --> 6.70787382 sigma out 1act.( 7.06166124 sig in 13act.)
RANK 11 NODE 12 --> 5.69330692 sigma out 1act.( 5.10217524 sig in 13act.)
RANK 12 NODE 6 --> 1.04615819 sigma out 1act.( 1.54277039 sig in 13act.)
RANK 13 NODE 4 --> 0.694279492 sigma out 1act.( 0.832330585 sig in 13act.)
RANK 14 NODE 15 --> 0.380057156 sigma out 1act.( 0.759106398 sig in 13act.)
RANK 15 NODE 10 --> 0.315368533 sigma out 1act.( 0.398345143 sig in 13act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 278.498535 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.531653523
*** contribution from regularisation: 0.00318356114
*** contribution from error: -0.534837067
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 32441
Closing output file
done
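
The exported expert.nb holds the trained preprocessing and network weights, and the NeuroBayes expert stage reads it back to classify new events. Neither the file format nor the expert API appears in this log, so purely as an illustration of what the stored network computes, here is a generic tanh feed-forward pass with the topology reported in the significance tables above (13 active inputs, 15 hidden nodes, one output node); the weights are random placeholders, not the trained ones, and the output convention is not necessarily the one NeuroBayes uses:

# Illustration only: a generic tanh feed-forward pass with the topology
# reported in the significance tables (13 inputs -> 15 hidden -> 1 output).
# The real expert.nb also stores the NeuroBayes preprocessing; the weights
# below are random placeholders, not the trained ones.
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(15, 13)), rng.normal(size=15)   # input  -> hidden
W2, b2 = rng.normal(size=(1, 15)),  rng.normal(size=1)    # hidden -> output

def network_output(x):
    """Map one preprocessed event (13 values) to a score in (0, 1)."""
    hidden = np.tanh(W1 @ x + b1)
    z = W2 @ hidden + b2                  # shape (1,)
    return float(1.0 / (1.0 + np.exp(-z[0])))

print(network_output(rng.normal(size=13)))
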