NNInput NNInputs_140.root
Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 199591 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1
                 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 51249 nbkg = 148342
Bkg Entries: 148342   Sig Entries: 51249   Chosen entries: 51249
Signal fraction: 1   Background fraction: 0.345479
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 148342
Actual Signal Entries: 51249
Entries to split: 51249
Test with  : 25624
Train with : 25624
*********************************************
*  This product is licenced for educational *
*  and scientific use only. Commercial use  *
*  is prohibited !                          *
*********************************************
Your number of nodes in the input layer is:  14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
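The CopyTree step above only applies ROOT cut strings to the flat ntuple and books the signal and background trees, so the printed counts and fractions follow directly from the selection. Below is a minimal sketch of that bookkeeping, assuming the ntuple has been loaded into a pandas DataFrame (for example with uproot); the column names Target, lep1_E and lep2_E come from the log, while the function itself and the even/odd train/test assignment are illustrative only, not the NNAna code.

    import pandas as pd

    def select_and_split(df: pd.DataFrame):
        """Sketch of the NNAna::CopyTree bookkeeping above (not the original code)."""
        # ROOT cut "lep1_E<400&&lep2_E<400" expressed as a pandas query
        selected = df.query("lep1_E < 400 and lep2_E < 400")
        sig = selected[selected["Target"] == 1]          # SigChoice: ...&&Target==1
        bkg = selected[selected["Target"] == 0]          # BkgChoice: ...&&Target==0

        # "Chosen entries" is the size of the smaller class; the printed background
        # fraction is chosen/background, e.g. 51249 / 148342 = 0.345479 in this run.
        chosen = min(len(sig), len(bkg))
        bkg_fraction = chosen / len(bkg)

        # Half of the chosen entries go into the training sample and half into the
        # test sample (25624 / 25624 above).  The actual assignment is internal to
        # NNAna; a simple even/odd split stands in for it here.
        smaller = sig if len(sig) <= len(bkg) else bkg
        train, test = smaller.iloc[0::2], smaller.iloc[1::2]
        return sig, bkg, train, test, bkg_fraction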
NBTreeReportInterval = 10000   NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
  Not touching individual preprocessing for Ht                ( 0 )  in Neurobayes
  Not touching individual preprocessing for LepAPt            ( 1 )  in Neurobayes
  Not touching individual preprocessing for LepBPt            ( 2 )  in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets ( 3 )  in Neurobayes
  Not touching individual preprocessing for MetSpec           ( 4 )  in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets  ( 5 )  in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt  ( 6 )  in Neurobayes
  Not touching individual preprocessing for addEt             ( 7 )  in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet     ( 8 )  in Neurobayes
  Not touching individual preprocessing for dPhiLeptons       ( 9 )  in Neurobayes
  Not touching individual preprocessing for dRLeptons         ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E            ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E            ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 51249 for Signal
Prepared event 0 for Signal with 51249 events
====Entry 0
  Variable Ht                : 135.044
  Variable LepAPt            : 36.7733
  Variable LepBPt            : 15.1786
  Variable MetSigLeptonsJets : 6.335
  Variable MetSpec           : 56.2375
  Variable SumEtLeptonsJets  : 78.8061
  Variable VSumJetLeptonsPt  : 65.4215
  Variable addEt             : 108.19
  Variable dPhiLepSumMet     : 2.59573
  Variable dPhiLeptons       : 0.533909
  Variable dRLeptons         : 0.613899
  Variable lep1_E            : 40.034
  Variable lep2_E            : 15.2798
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0        DEtaJ1Lep1 = 0      DEtaJ1Lep2 = 0      DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0      DPhiJ1J2 = 0        DPhiJ1Lep1 = 0      DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0      DPhiJ2Lep2 = 0      DRJ1J2 = 0          DRJ1Lep1 = 0
  DRJ1Lep2 = 0        DRJ2Lep1 = 0        DRJ2Lep2 = 0        DeltaRJet12 = 0
  File = 2140         Ht = 135.044        IsMEBase = 0        LRHWW = 0
  LRWW = 0            LRWg = 0            LRWj = 0            LRZZ = 0
  LepAEt = 36.7735    LepAPt = 36.7733    LepBEt = 15.1793    LepBPt = 15.1786
  LessCentralJetEta = 0   MJ1Lep1 = 0     MJ1Lep2 = 0         MJ2Lep1 = 0
  MJ2Lep2 = 0         NN = 0              Met = 56.2375       MetDelPhi = 2.21558
  MetSig = 4.21501    MetSigLeptonsJets = 6.335   MetSpec = 56.2375   Mjj = 0
  MostCentralJetEta = -0.266727   MtllMet = 110.158   Njets = 1   SB = 0
  SumEt = 178.015     SumEtJets = 0       SumEtLeptonsJets = 78.8061   Target = 1
  TrainWeight = 1     VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 65.4215
  addEt = 108.19      dPhiLepSumMet = 2.59573   dPhiLeptons = 0.533909   dRLeptons = 0.613899
  diltype = 17        dimass = 14.3911    event = 161         jet1_Et = 26.8541
  jet1_eta = 0        jet2_Et = 0         jet2_eta = 0        lep1_E = 40.034
  lep2_E = 15.2798    rand = 0.999742     run = 234313        weight = 2.40583e-06
===Show End
Prepared event 10000 for Signal with 51249 events
Prepared event 20000 for Signal with 51249 events
Prepared event 30000 for Signal with 51249 events
Prepared event 40000 for Signal with 51249 events
Prepared event 50000 for Signal with 51249 events
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 148342 for Background
Prepared event 0 for Background with 148342 events
====Entry 0
  Variable Ht                : 85.3408
  Variable LepAPt            : 31.1294
  Variable LepBPt            : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec           : 44.1437
  Variable SumEtLeptonsJets  : 41.1959
  Variable VSumJetLeptonsPt  : 41.0132
  Variable addEt             : 85.3408
  Variable dPhiLepSumMet     : 3.10536
  Variable dPhiLeptons       : 0.219342
  Variable dRLeptons         : 0.424454
  Variable lep1_E            : 32.3548
  Variable lep2_E            : 10.1027
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0        DEtaJ1Lep1 = 0      DEtaJ1Lep2 = 0      DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0      DPhiJ1J2 = 0        DPhiJ1Lep1 = 0      DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0      DPhiJ2Lep2 = 0      DRJ1J2 = 0          DRJ1Lep1 = 0
  DRJ1Lep2 = 0        DRJ2Lep1 = 0        DRJ2Lep2 = 0        DeltaRJet12 = 0
  File = 1            Ht = 85.3408        IsMEBase = 0        LRHWW = 0
  LRWW = 0            LRWg = 0            LRWj = 0            LRZZ = 0
  LepAEt = 31.1297    LepAPt = 31.1294    LepBEt = 10.0674    LepBPt = 10.0664
  LessCentralJetEta = 0   MJ1Lep1 = 0     MJ1Lep2 = 0         MJ2Lep1 = 0
  MJ2Lep2 = 0         NN = 0              Met = 44.1437       MetDelPhi = 3.01191
  MetSig = 3.74558    MetSigLeptonsJets = 6.87768   MetSpec = 44.1437   Mjj = 0
  MostCentralJetEta = 0   MtllMet = 86.2332   Njets = 0        SB = 0
  SumEt = 138.899     SumEtJets = 0       SumEtLeptonsJets = 41.1959   Target = 0
  TrainWeight = 0.315882   VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 41.0132
  addEt = 85.3408     dPhiLepSumMet = 3.10536   dPhiLeptons = 0.219342   dRLeptons = 0.424454
  diltype = 17        dimass = 7.54723    event = 6717520     jet1_Et = 0
  jet1_eta = 0        jet2_Et = 0         jet2_eta = 0        lep1_E = 32.3548
  lep2_E = 10.1027    rand = 0.999742     run = 271566        weight = 0.00296168
===Show End
Prepared event 10000 for Background with 148342 events
Prepared event 20000 for Background with 148342 events
Prepared event 30000 for Background with 148342 events
Prepared event 40000 for Background with 148342 events
Prepared event 50000 for Background with 148342 events
Prepared event 60000 for Background with 148342 events
Prepared event 70000 for Background with 148342 events
Prepared event 80000 for Background with 148342 events
Prepared event 90000 for Background with 148342 events
Prepared event 100000 for Background with 148342 events
Prepared event 110000 for Background with 148342 events
Prepared event 120000 for Background with 148342 events
Prepared event 130000 for Background with 148342 events
Prepared event 140000 for Background with 148342 events
Warning: found 4692 negative weights.
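The teacher block that follows reports a weighted signal fraction, repeats the warning about negative event weights, and then prints one Transdef quantile table per input variable; with preprocessing option 12 each input is flattened with such a table, mapped onto a unit Gaussian, and the Gaussian-transformed inputs are then decorrelated. The numpy sketch below illustrates those ideas only: the binning, the exact Gaussian mapping and the decorrelation convention inside NeuroBayes are not taken from the source, and X, y, w stand for the input matrix, the Target column and the TrainWeight column.

    import numpy as np
    from scipy.stats import norm

    def weighted_signal_fraction(w, y):
        """Weighted signal fraction and negative-weight count, as reported below."""
        return w[y == 1].sum() / w.sum(), int(np.sum(w < 0))

    def transdef_table(x, n_bins=100):
        """Quantile boundaries of one input variable (the 'Transdef: Tab' lines)."""
        return np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))

    def equalise_to_gaussian(x, table):
        """Flatten x with its quantile table, then map the flat distribution onto a
        Gaussian with mean 0 and sigma 1 via the inverse normal CDF."""
        flat = np.interp(x, table, np.linspace(0.0, 1.0, len(table)))
        return norm.ppf(np.clip(flat, 1e-6, 1.0 - 1e-6))   # clip to avoid +-inf

    def decorrelate(X_gauss):
        """Rotate and rescale the Gaussian-transformed inputs so that their
        covariance matrix becomes the identity (one possible decorrelation)."""
        centred = X_gauss - X_gauss.mean(axis=0)
        eigval, eigvec = np.linalg.eigh(np.cov(centred, rowvar=False))
        return centred @ eigvec / np.sqrt(eigval)           # assumes eigval > 0

With per-variable quantile tables like the ones printed below, the same transformation can be applied unchanged to the test sample.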
[Phi-T(R) NeuroBayes(R) ASCII-art banner]
Phi-T(R) NeuroBayes(R) Teacher
Algorithms by Michael Feindt
Implementation by Phi-T Project 2001-2003
Copyright Phi-T GmbH
Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000   NB_MAXNODE = 100
-----------------------------------
found 199591 samples to learn from
preprocessing flags/parameters:
global preprocessing flag: 812
individual preprocessing:
now perform preprocessing
*** called with option 12
*** This will do for you:
***   input variable equalisation
***   to Gaussian distribution with mean=0 and sigma=1
***   Then variables are decorrelated
************************************
Warning: found 4692 negative weights.
Signal fraction: 62.3254128 %
------------------------------
Transdef: Tab for variable 1
 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.
 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.
 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1.
 -1. -1. -1. -1. -1. -1. -1. -1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.  1.  1.  1.  1.  1.  1.  1.
  1.  1.  1.
------------------------------
Transdef: Tab for variable 2
 60.0048981 63.2802086 65.4925842 67.0829163 68.1954041 69.2154541 70.2376251 71.104805 71.9186478 72.7597504
 73.5344391 74.2348862 74.9921799 75.5583496 76.2530212 76.8579559 77.4508362 78.1363373 78.7564163 79.4147186
 79.990387 80.6478271 81.2105865 81.8797989 82.4854584 83.171196 83.7924805 84.6905823 85.664032 86.5042725
 87.4658661 88.533905 89.3599014 90.5552292 91.6786804 92.9340591 94.076973 95.3855438 96.628891 97.8033905
 99.0302887 100.225662 101.541077 102.641792 103.835205 104.770988 105.987183 107.204239 108.397469 109.605217
 110.792313 112.095001 113.321289 114.623428 115.812309 117.051682 118.277328 119.50383 120.481949 121.694626
 122.917198 124.142899 125.306816 126.484978 127.634613 128.739136 129.942154 131.189514 132.514343 133.932465
 135.278458 136.77034 138.37413 139.975067 141.788315 143.456467 145.357147 147.298065 149.257858 151.380798
 153.640869 156.
158.238541 160.55365 163.535919 166.398102 169.295563 172.693268 176.25058 180.094696 184.690994 189.456268 195.34906 200.982162 208.136078 217.054306 227.396759 242.031738 263.651642 301.741577 877.85614 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.3297901 20.6090794 20.861702 21.1336555 21.3430595 21.5405922 21.7946873 22.02672 22.2166405 22.422966 22.6033783 22.8004589 22.9851036 23.1781731 23.3507614 23.5437908 23.7602863 23.9483528 24.162199 24.3784561 24.5713921 24.7509117 24.9457741 25.1334743 25.2953262 25.4870224 25.6593018 25.8452797 26.0556221 26.2485657 26.4318619 26.63311 26.85392 27.0730286 27.227253 27.4527321 27.6525345 27.8518181 28.0615005 28.2665482 28.5006447 28.7215195 28.935257 29.1401653 29.3583794 29.5871658 29.8264427 30.0476227 30.2658081 30.5216866 30.7439957 30.9980888 31.2434845 31.4937973 31.7413902 31.927742 32.1887207 32.4496307 32.7394714 33.0135231 33.2507095 33.5642624 33.8575897 34.1535797 34.4525261 34.7709198 35.0841522 35.3869324 35.6769638 36.0079956 36.3545532 36.6674843 37.02248 37.3881493 37.7512207 38.1414299 38.5672913 38.9876556 39.3305588 39.7902527 40.2568779 40.7797127 41.3209152 41.9282227 42.5369415 43.1651688 43.8577347 44.6496162 45.494072 46.396904 47.4180183 48.5311737 49.7140541 51.2042084 53.0906296 55.5680771 58.9005508 64.0507355 73.8680573 215.283478 ------------------------------ Transdef: Tab for variable 4 10.0008354 10.1204901 10.2453117 10.3612175 10.4647732 10.5822525 10.6910048 10.7993374 10.9231167 11.0349588 11.1719885 11.3105907 11.4335089 11.5379829 11.6751604 11.8005238 11.9445801 12.053318 12.2039127 12.3287449 12.4831066 12.6416349 12.7766037 12.8945971 13.0493832 13.2192211 13.3814945 13.5208683 13.6685934 13.8037281 13.9392433 14.0861864 14.2430992 14.4273701 14.5874481 14.7859077 14.9796352 15.1575747 15.334362 15.4853249 15.6900406 15.8633385 16.0698566 16.2788849 16.4785995 16.6737595 16.8389759 17.0370178 17.2618847 17.4765587 17.679287 17.9079742 18.14505 18.3630314 18.5903091 18.8318291 19.0628052 19.2816277 19.515625 19.7487526 19.9764042 20.1986542 20.3667927 20.582653 20.8155289 21.0230331 21.2692947 21.5260792 21.7558479 22.0102768 22.2399368 22.4939766 22.7893257 23.0546989 23.3516254 23.6475258 23.9523582 24.2858772 24.5908241 24.9102898 25.2294998 25.5862961 25.9235668 26.2378998 26.60005 27.0122395 27.3918209 27.8101082 28.2475414 28.7489243 29.2411671 29.7705879 30.3654938 31.0306282 31.8582573 32.7656708 33.9404335 35.6546783 37.9854927 43.1664162 92.8595734 ------------------------------ Transdef: Tab for variable 5 0.845276415 2.53954911 2.97551012 3.27904797 3.52721786 3.75931787 3.96137476 4.09976149 4.20626163 4.31683636 4.42679596 4.52536726 4.61459827 4.70100737 4.78691387 4.85600758 4.93364763 5.01025677 5.0701313 5.14215708 5.20577431 5.2648077 5.31870365 5.38244534 5.44194651 5.4928484 5.53087187 5.57809162 5.62670326 5.67085552 5.71672153 5.76063442 5.80490541 5.8408165 5.8869257 5.92739677 5.97235966 6.01030159 6.04883003 6.09451723 6.13297081 6.17905712 6.21676636 6.25465965 6.29026985 6.32844353 6.37224627 6.41373444 6.45205975 6.48687458 6.52504635 6.56932545 6.60960674 6.64611578 6.68772221 6.72718668 6.76810837 6.81021976 6.85350895 6.89564896 6.93963432 6.98579359 7.02678967 7.07300043 7.12087154 7.16662788 7.21346378 7.25353241 7.29929733 7.34830952 7.39786148 7.44220543 7.48362446 7.53174734 7.58058643 7.63256168 7.68544483 7.73758221 7.79567432 7.84966469 7.9116497 7.9723587 8.0358963 8.09404945 8.15372658 8.21669769 8.28722763 8.35926056 8.44231796 
8.52609062 8.62513351 8.72599602 8.82748604 8.95427895 9.09564114 9.26757812 9.47037029 9.73721981 10.1700668 10.8795061 20.215023 ------------------------------ Transdef: Tab for variable 6 15.0004978 20.3210144 25.1262341 26.1964817 27.1379318 27.9721184 28.8828812 29.6029091 30.378788 31.0032139 31.4838409 32.1058731 32.7058525 33.2042465 33.5869102 34.0197525 34.4775009 34.8487549 35.2408752 35.6556549 36.0395889 36.4612274 36.759552 37.0945129 37.4856796 37.8482971 38.1744881 38.472271 38.8410797 39.1691971 39.5164833 39.8341026 40.1850166 40.5079803 40.8291016 41.1819763 41.5588112 41.9517365 42.3793182 42.8505325 43.1153946 43.5156555 43.8359299 44.328804 44.745575 45.2301178 45.6584778 46.1281204 46.6317596 47.123455 47.5459633 48.1015549 48.5787659 49.082386 49.6160736 50.188797 50.6827011 51.1776581 51.7694054 52.3616104 53.0027924 53.6340179 54.2650986 54.7967834 55.3558006 55.9415283 56.5626984 57.0594521 57.6439896 58.2491913 58.716156 59.3394241 59.8732529 60.48629 61.1101303 61.6879044 62.3036194 62.9730453 63.5854492 64.2548676 64.9271774 65.5900803 66.3737717 67.1983643 68.0230713 68.9250412 69.8027039 70.7961349 71.9209671 72.9914246 74.2531433 75.8361359 77.5300446 79.559494 81.9909973 84.5995941 88.0876617 92.3554459 99.6117859 111.921021 280.646332 ------------------------------ Transdef: Tab for variable 7 30.1229668 32.4671707 33.4855423 34.0715828 34.835434 35.3348007 35.8010101 36.2693939 36.6695518 37.0222321 37.3719025 37.7027664 38.032589 38.3169403 38.6149712 38.8875504 39.1902466 39.4901581 39.7908745 40.0786743 40.3812332 40.7171249 41.0115585 41.3786926 41.7071228 42.0270462 42.4184647 42.8699226 43.2459488 43.6912079 44.2882462 44.7706146 45.2729721 45.799614 46.4886017 47.1847115 47.9683266 48.6841774 49.394516 50.1429596 50.8993988 51.6102333 52.3556595 53.0269318 53.8201065 54.6253777 55.415741 56.0775833 56.8105469 57.4676971 58.1669121 58.8329849 59.5441742 60.2074966 60.9555931 61.3154602 62.0549011 62.7140999 63.4550781 64.2027588 64.9008484 65.6463623 66.499939 67.3681183 68.3472748 69.33078 70.4094467 71.5283661 72.6214142 73.7830505 75.130249 76.5910568 77.8382874 79.0931244 80.7755127 82.3600311 83.9478455 85.6227951 87.3218231 88.9431 90.7915802 92.8349152 94.8050995 96.8451767 99.2191696 101.675377 104.604179 107.299667 110.180389 113.567551 117.322746 121.366898 125.813675 131.216141 137.864197 144.759674 154.142395 166.075745 182.720795 214.496841 652.407166 ------------------------------ Transdef: Tab for variable 8 3.10723281 23.4581699 28.2338486 30.3719025 31.5075035 32.3271027 32.9497757 33.4853058 34.0007706 34.4382706 34.8882675 35.2237129 35.5259438 35.8758583 36.1927757 36.5134201 36.7949104 37.0937119 37.3716469 37.6244087 37.8596268 38.0814362 38.3253479 38.5968094 38.8615875 39.1374397 39.3603745 39.6041183 39.8487358 40.1353226 40.4464111 40.7460632 41.0571785 41.3311729 41.6303482 42.0064621 42.3551636 42.7320709 43.1229324 43.488327 43.8173676 44.2210922 44.6376572 45.1061363 45.4798508 46.0185661 46.4994278 47.0063629 47.4819946 47.9963074 48.3779297 48.8769722 49.3846092 49.9284515 50.4372902 50.9690475 51.4629745 51.9956894 52.5306244 53.0619659 53.6192169 54.1905365 54.7327042 55.2667999 55.7766991 56.3306656 56.8603897 57.4197197 57.9940147 58.4902115 59.009079 59.5398331 60.128891 60.7675858 61.1695251 61.7225723 62.3397865 62.9699669 63.60149 64.2116089 64.867691 65.4559021 66.1525116 66.9289093 67.7661285 68.6430817 69.5917511 70.6805649 71.8384018 73.1472321 74.4268799 76.0852432 77.9102783 79.8648148 82.3278046 
85.5602722 89.3400421 94.0835876 101.411118 116.124557 326.868286 ------------------------------ Transdef: Tab for variable 9 48.1481171 62.8013687 64.8903046 66.4541931 67.5507812 68.6114044 69.5234528 70.2849045 71.0761108 71.7904053 72.5652008 73.2565689 73.9382858 74.5369873 75.138504 75.6506958 76.2052155 76.739418 77.220871 77.8010483 78.3617859 78.8618011 79.4134979 79.8664703 80.480896 80.9729843 81.501709 82.1158829 82.5562439 83.1717682 83.7114487 84.3640442 85.070755 85.8557663 86.5388489 87.1687927 87.9719315 88.7823486 89.3795624 90.4060669 91.2806015 92.1168671 93.1019592 94.07724 95.1585083 96.0212936 97.0838776 98.0667191 99.0199585 100.036438 101.123474 102.067665 102.962997 103.933929 104.891815 105.916389 106.837112 107.841614 108.84938 109.820473 110.736404 111.797424 112.839691 113.73774 114.648132 115.628021 116.461365 117.358856 118.333702 119.27037 119.923828 120.834053 121.68824 122.603439 123.514763 124.486862 125.371506 126.292137 127.272842 128.107819 129.062378 129.94519 130.844269 131.844131 132.823181 133.942902 134.925507 136.042877 137.209473 138.519501 139.96106 141.571228 143.248764 145.396881 147.913177 151.161087 155.25563 161.367477 171.67749 193.801682 582.80835 ------------------------------ Transdef: Tab for variable 10 0.0050833188 1.01107037 1.32311487 1.53282571 1.68369412 1.80349398 1.91176963 1.99410927 2.06656456 2.13691473 2.19436121 2.24505901 2.29562235 2.33648968 2.38201261 2.4271822 2.46282077 2.4941721 2.5281837 2.5578208 2.58346391 2.60991383 2.63381004 2.65793514 2.68076324 2.70195413 2.71958303 2.73980522 2.75778747 2.7743001 2.79046249 2.80351257 2.81965303 2.83231068 2.84392977 2.85524535 2.86487961 2.87515926 2.88501024 2.89512205 2.9030652 2.91227961 2.92086411 2.92950964 2.93677926 2.94420433 2.95110154 2.95797682 2.96481204 2.97141552 2.97668195 2.98223281 2.98785281 2.99317694 2.99710369 3.00239134 3.00699902 3.01192522 3.01614809 3.02050567 3.02515554 3.02896428 3.0327878 3.03642082 3.03981018 3.04338598 3.04715514 3.05113411 3.05459619 3.05806017 3.06170177 3.06515169 3.06815767 3.0712738 3.07437134 3.07738805 3.0805409 3.08354568 3.08653831 3.08975267 3.09262466 3.09535503 3.09834123 3.10125494 3.10397863 3.10633755 3.10861254 3.11095381 3.11350346 3.11582708 3.11820769 3.12065887 3.12332368 3.12575269 3.1281693 3.13026714 3.13224363 3.13454318 3.1368413 3.13909316 3.1415906 ------------------------------ Transdef: Tab for variable 11 2.86871432E-06 0.00533723878 0.0101478063 0.0166356545 0.0228303671 0.0288709868 0.0358139277 0.0428996757 0.049166441 0.0562889576 0.0619060397 0.0687494278 0.0758194923 0.0819466263 0.0891276598 0.095849365 0.101662159 0.108079851 0.112575531 0.117618084 0.123697758 0.129680634 0.13530618 0.140692383 0.146677732 0.152890205 0.158385217 0.163731515 0.169355929 0.175197333 0.181156665 0.186199307 0.191794753 0.196635842 0.201823205 0.207425267 0.213153094 0.218885273 0.224855095 0.230664968 0.236236498 0.241928339 0.248115957 0.253273368 0.259509385 0.265465617 0.27073431 0.276837468 0.28218019 0.288300395 0.29434371 0.300557733 0.307026744 0.313463807 0.319812238 0.326161027 0.332394272 0.338490248 0.344873756 0.350340009 0.355811536 0.362099469 0.367837548 0.374278367 0.380633354 0.386695743 0.393221736 0.399949908 0.406138301 0.412310004 0.41855371 0.425451934 0.432334542 0.438796043 0.445868075 0.452339053 0.459603071 0.466728687 0.473180532 0.480576098 0.489868015 0.497953892 0.506437421 0.514548302 0.523603559 0.533320189 0.544818282 0.556092441 0.568016231 0.579461575 0.593999982 
0.608093083 0.624709189 0.643803239 0.660258174 0.681576014 0.705096185 0.738601685 0.774921775 0.836971641 1.1301049 ------------------------------ Transdef: Tab for variable 12 0.100055709 0.123443052 0.142580986 0.158094525 0.170665711 0.183594853 0.19554463 0.204154223 0.214043289 0.223818853 0.2335729 0.241885155 0.250599533 0.258321434 0.266121924 0.274274528 0.281567305 0.289906681 0.297313154 0.304731548 0.311520398 0.319263875 0.326542974 0.333515108 0.341071308 0.348231077 0.355271935 0.360448658 0.367257714 0.3738904 0.380896956 0.387556016 0.393295467 0.399564773 0.403987706 0.408942699 0.413050741 0.417844057 0.422358751 0.426777899 0.430642247 0.435087979 0.438879192 0.443401754 0.447431207 0.45189473 0.456100851 0.45982945 0.463946402 0.468104452 0.472174913 0.476173341 0.479312301 0.483257055 0.487471759 0.491941005 0.496436179 0.50059104 0.50502038 0.509287953 0.513966918 0.518891871 0.523637533 0.528247893 0.533091903 0.538491607 0.543604612 0.549788237 0.555058837 0.560001969 0.565299153 0.570352912 0.576208413 0.581629992 0.588551283 0.594377995 0.600853324 0.607253492 0.614304304 0.621569514 0.629398227 0.63705647 0.646247029 0.654756427 0.662835002 0.670528173 0.678288996 0.686127722 0.696658373 0.705723166 0.717085719 0.728439271 0.741286874 0.754012346 0.767970741 0.783793092 0.804484725 0.829334259 0.86000967 0.904274523 1.13453126 ------------------------------ Transdef: Tab for variable 13 20.0262547 21.3898754 22.1203003 22.6960049 23.2092361 23.7333908 24.1489029 24.6493912 25.0603886 25.4511166 25.8125553 26.1896477 26.5627975 26.8877373 27.1956348 27.5071678 27.842144 28.1542091 28.4727325 28.7791367 29.1185493 29.4266567 29.6899033 30.0100899 30.3237686 30.5690517 30.8492737 31.1109962 31.4262886 31.7459068 32.0596008 32.3418083 32.5872688 32.8806 33.1862831 33.4902306 33.8000107 34.1142502 34.3448029 34.5905533 34.9025192 35.1858368 35.4697189 35.8271484 36.1175003 36.3436203 36.6864052 37.0334511 37.3372765 37.6871948 38.0125046 38.3479233 38.6757431 39.0487976 39.4524994 39.8048515 40.2150154 40.6566429 41.0790787 41.5120544 41.918869 42.3632736 42.7322006 43.170578 43.6008606 44.0725174 44.5535736 45.1194 45.6698341 46.2420692 46.7786636 47.3504639 47.928833 48.4995728 49.1680222 49.859108 50.5672302 51.2142639 51.9411774 52.5855255 53.4022369 54.269619 55.2013893 56.2105331 57.2865524 58.4524536 59.6041107 60.8076477 62.3367767 63.8682938 65.2881775 67.0159531 69.3398895 71.5588226 74.4310226 77.8347168 82.3519669 87.4472656 92.8744507 105.434601 394.478912 ------------------------------ Transdef: Tab for variable 14 10.0028534 10.6160946 10.9670734 11.3371544 11.6268082 11.8889475 12.198472 12.451479 12.7192402 12.9879169 13.2536907 13.5156469 13.7469988 13.9371376 14.1941986 14.4513159 14.6773806 14.9292431 15.173418 15.4054203 15.6316977 15.8312721 16.0811653 16.3136234 16.5871391 16.8075085 17.0280991 17.2779388 17.5392685 17.7581234 17.9834366 18.2389107 18.4820538 18.7233162 18.9609985 19.1849937 19.4316406 19.7151318 19.9687195 20.2413063 20.492054 20.7068977 20.9627953 21.2108879 21.4634857 21.7118225 21.9585495 22.223156 22.4876728 22.7681923 23.0215912 23.3131523 23.57864 23.8363571 24.1215096 24.4125404 24.690712 24.9750443 25.2467613 25.5217705 25.7396584 26.0132637 26.2853508 26.5764771 26.8971119 27.20364 27.504076 27.8288536 28.1409645 28.4584312 28.8066883 29.1420097 29.5352745 29.9287949 30.28582 30.6985092 31.0925903 31.4895706 31.9369678 32.446907 32.9107437 33.4342079 33.8302231 34.3611221 34.872139 35.5414619 36.1762428 36.9533157 
 37.7108307 38.5491142 39.4889908 40.2721863 41.2567444 42.6269684 44.2541161 46.0146332 48.0243073 51.1389236
 55.7551689 62.9453125 122.221924
COVARIANCE MATRIX (IN PERCENT)
      0    1.0    2.0    3.0    4.0    5.0    6.0    7.0    8.0    9.0   10.0   11.0   12.0   13.0   14.0
      0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
      1  100.0   59.8   41.0   45.5   17.8   40.3   57.7   45.9   56.5  -28.6   -7.1  -15.9   11.3   20.3
      2   59.8  100.0   64.2   57.2   24.6   63.6   94.2   69.9   91.0  -46.5  -18.3  -32.0   34.1   32.9
      3   41.0   64.2  100.0   37.0    0.9   29.2   67.5   50.4   70.2  -16.1  -19.2  -35.9   67.1   19.6
      4   45.5   57.2   37.0  100.0    6.9   31.3   58.3   46.9   62.6  -13.7  -23.4  -42.6   18.0   73.5
      5   17.8   24.6    0.9    6.9  100.0   81.3   -3.3   49.2   48.8   28.6   -2.7   -2.4   -7.5   -0.5
      6   40.3   63.6   29.2   31.3   81.3  100.0   41.9   75.3   76.7   -4.3  -10.4  -16.8    9.8   15.0
      7   57.7   94.2   67.5   58.3   -3.3   41.9  100.0   62.4   79.1  -54.1  -17.3  -31.9   38.6   35.3
      8   45.9   69.9   50.4   46.9   49.2   75.3   62.4  100.0   79.8   -9.3  -18.6  -29.0   26.5   27.7
      9   56.5   91.0   70.2   62.6   48.8   76.7   79.1   79.8  100.0  -21.5  -21.1  -37.6   38.9   37.5
     10  -28.6  -46.5  -16.1  -13.7   28.6   -4.3  -54.1   -9.3  -21.5  100.0    1.9    4.7   -6.1   -5.6
     11   -7.1  -18.3  -19.2  -23.4   -2.7  -10.4  -17.3  -18.6  -21.1    1.9  100.0   59.1   -7.4  -14.8
     12  -15.9  -32.0  -35.9  -42.6   -2.4  -16.8  -31.9  -29.0  -37.6    4.7   59.1  100.0  -13.8  -21.3
     13   11.3   34.1   67.1   18.0   -7.5    9.8   38.6   26.5   38.9   -6.1   -7.4  -13.8  100.0   43.7
     14   20.3   32.9   19.6   73.5   -0.5   15.0   35.3   27.7   37.5   -5.6  -14.8  -21.3   43.7  100.0
TOTAL CORRELATION TO TARGET (diagonal)  139.869011
TOTAL CORRELATION OF ALL VARIABLES      65.8623047
ROUND  1: MAX CORR ( 65.8614189) AFTER KILLING INPUT VARIABLE  6  CONTR 0.341579059
ROUND  2: MAX CORR ( 65.8437455) AFTER KILLING INPUT VARIABLE 11  CONTR 1.52567611
ROUND  3: MAX CORR ( 65.8093865) AFTER KILLING INPUT VARIABLE  8  CONTR 2.12684348
ROUND  4: MAX CORR ( 65.7039387) AFTER KILLING INPUT VARIABLE 14  CONTR 3.72394782
ROUND  5: MAX CORR ( 65.5741619) AFTER KILLING INPUT VARIABLE  2  CONTR 4.12757252
ROUND  6: MAX CORR ( 65.1892942) AFTER KILLING INPUT VARIABLE  9  CONTR 7.09412677
ROUND  7: MAX CORR ( 64.7849957) AFTER KILLING INPUT VARIABLE 10  CONTR 7.249028
ROUND  8: MAX CORR ( 63.9417471) AFTER KILLING INPUT VARIABLE 12  CONTR 10.418668
ROUND  9: MAX CORR ( 63.0392178) AFTER KILLING INPUT VARIABLE  3  CONTR 10.705328
ROUND 10: MAX CORR ( 62.2525376) AFTER KILLING INPUT VARIABLE 13  CONTR 9.92796742
ROUND 11: MAX CORR ( 60.9691156) AFTER KILLING INPUT VARIABLE  4  CONTR 12.5755868
ROUND 12: MAX CORR ( 57.6796936) AFTER KILLING INPUT VARIABLE  5  CONTR 19.7556576
LAST REMAINING VARIABLE: 7
[A numpy sketch of this greedy elimination is given below, after the iteration-5 block.]
total correlation to target: 65.8623047 %
total significance: 134.021077 sigma
correlations of single variables to target:
  variable  2:  59.8480651 % , in sigma: 121.782895
  variable  3:  40.9790537 % , in sigma: 83.386953
  variable  4:  45.5101165 % , in sigma: 92.6070664
  variable  5:  17.8130161 % , in sigma: 36.2471312
  variable  6:  40.309071 % , in sigma: 82.0236268
  variable  7:  57.6796936 % , in sigma: 117.370546
  variable  8:  45.9195021 % , in sigma: 93.4401119
  variable  9:  56.5321405 % , in sigma: 115.035427
  variable 10: -28.551095 % , in sigma: 58.0977012
  variable 11:  -7.0519983 % , in sigma: 14.3498836
  variable 12: -15.8807879 % , in sigma: 32.3153025
  variable 13:  11.2848304 % , in sigma: 22.963137
  variable 14:  20.3248098 % , in sigma: 41.358299
variables sorted by significance:
   1 most relevant variable  7  corr 57.6796951 , in sigma: 117.370549
   2 most relevant variable  5  corr 19.7556572 , in sigma: 40.2001488
   3 most relevant variable  4  corr 12.5755873 , in sigma: 25.5896564
   4 most relevant variable 13  corr 9.92796707 , in sigma: 20.2020996
   5 most relevant variable  3  corr 10.705328 , in sigma: 21.7839262
   6 most relevant variable 12  corr 10.4186678 , in sigma: 21.2006106
   7 most relevant variable 10  corr 7.24902821 , in sigma: 14.7508134
   8 most relevant variable  9  corr 7.0941267 , in sigma: 14.4356093
   9 most relevant variable  2  corr 4.12757254 , in sigma: 8.39906405
  10 most relevant variable 14  corr 3.72394776 , in sigma: 7.57774104
  11 most relevant variable  8  corr 2.12684345 , in sigma: 4.32784505
  12 most relevant variable 11  corr 1.52567613 , in sigma: 3.10454909
  13 most relevant variable  6  corr 0.34157905 , in sigma: 0.695068177
global correlations between input variables:
  variable  2: 99.1229623 %
  variable  3: 93.1629323 %
  variable  4: 91.9904761 %
  variable  5: 95.1219682 %
  variable  6: 94.2309572 %
  variable  7: 98.6781114 %
  variable  8: 87.8659511 %
  variable  9: 98.7841286 %
  variable 10: 71.6481316 %
  variable 11: 59.9651339 %
  variable 12: 69.4749393 %
  variable 13: 86.1606852 %
  variable 14: 88.6219921 %
significance loss when removing single variables:
  variable  2: corr = 4.65784811 % , sigma = 9.47810468
  variable  3: corr = 13.3710591 % , sigma = 27.2083363
  variable  4: corr = 15.1042936 % , sigma = 30.7352392
  variable  5: corr = 10.5268435 % , sigma = 21.4207338
  variable  6: corr = 0.341579059 % , sigma = 0.695068196
  variable  7: corr = 4.18710909 % , sigma = 8.52021306
  variable  8: corr = 2.18112876 % , sigma = 4.43830848
  variable  9: corr = 8.451199 % , sigma = 17.1970719
  variable 10: corr = 6.94145291 % , sigma = 14.1249383
  variable 11: corr = 1.53370041 % , sigma = 3.12087743
  variable 12: corr = 7.44377903 % , sigma = 15.1471056
  variable 13: corr = 7.80499492 % , sigma = 15.8821321
  variable 14: corr = 3.61424256 % , sigma = 7.35450546
Keep only 11 most significant input variables
-------------------------------------
Teacher: actual network topology:
  Nodes(1) = 12
  Nodes(2) = 15
  Nodes(3) = 1
-------------------------------------
[A sketch of the corresponding 12-15-1 forward pass is given below, after the iteration-24 block.]
---------------------------------------------------
Iteration : 1
SIGNIFICANCE OF OUTPUTS IN LAYER 1
  RANK  1  NODE  9 --> 32.2745857 sigma out  15 active outputs
  RANK  2  NODE  5 --> 24.3955936 sigma out  15 active outputs
  RANK  3  NODE  6 --> 22.2980289 sigma out  15 active outputs
  RANK  4  NODE  1 --> 20.7026577 sigma out  15 active outputs
  RANK  5  NODE 12 --> 20.1049347 sigma out  15 active outputs
  RANK  6  NODE 11 --> 19.1765003 sigma out  15 active outputs
  RANK  7  NODE 10 --> 18.5203266 sigma out  15 active outputs
  RANK  8  NODE  4 --> 17.4716969 sigma out  15 active outputs
  RANK  9  NODE  8 --> 16.5768013 sigma out  15 active outputs
  RANK 10  NODE  7 --> 14.5511856 sigma out  15 active outputs
  RANK 11  NODE  3 --> 14.5111713 sigma out  15 active outputs
  RANK 12  NODE  2 --> 12.4977293 sigma out  15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
  RANK  1  NODE 14 --> 36.7786522 sigma in 12act. ( 38.4093933 sig out 1act.)
  RANK  2  NODE  5 --> 28.1491051 sigma in 12act. ( 26.7088623 sig out 1act.)
  RANK  3  NODE  6 --> 24.3851891 sigma in 12act. ( 25.7115536 sig out 1act.)
  RANK  4  NODE 15 --> 23.8629303 sigma in 12act. ( 20.9409847 sig out 1act.)
  RANK  5  NODE  3 --> 20.3464394 sigma in 12act. ( 21.8769817 sig out 1act.)
  RANK  6  NODE  8 --> 18.8043709 sigma in 12act. ( 18.2359905 sig out 1act.)
  RANK  7  NODE 12 --> 16.2766018 sigma in 12act. ( 14.1534338 sig out 1act.)
  RANK  8  NODE  1 --> 14.1958675 sigma in 12act. ( 14.8649025 sig out 1act.)
  RANK  9  NODE 13 --> 11.31528 sigma in 12act. ( 12.4277525 sig out 1act.)
  RANK 10  NODE 10 --> 9.45255375 sigma in 12act. ( 10.9816294 sig out 1act.)
  RANK 11  NODE  9 --> 6.67452526 sigma in 12act. ( 7.0031848 sig out 1act.)
RANK 12 NODE 4 --> 4.2524581 sigma in 12act. ( 5.26434374 sig out 1act.) RANK 13 NODE 2 --> 2.73199248 sigma in 12act. ( 3.6704967 sig out 1act.) RANK 14 NODE 11 --> 1.96491802 sigma in 12act. ( 2.2733686 sig out 1act.) RANK 15 NODE 7 --> 1.26711082 sigma in 12act. ( 2.25382018 sig out 1act.) sorted by output significance RANK 1 NODE 14 --> 38.4093933 sigma out 1act.( 36.7786522 sig in 12act.) RANK 2 NODE 5 --> 26.7088623 sigma out 1act.( 28.1491051 sig in 12act.) RANK 3 NODE 6 --> 25.7115536 sigma out 1act.( 24.3851891 sig in 12act.) RANK 4 NODE 3 --> 21.8769817 sigma out 1act.( 20.3464394 sig in 12act.) RANK 5 NODE 15 --> 20.9409847 sigma out 1act.( 23.8629303 sig in 12act.) RANK 6 NODE 8 --> 18.2359905 sigma out 1act.( 18.8043709 sig in 12act.) RANK 7 NODE 1 --> 14.8649025 sigma out 1act.( 14.1958675 sig in 12act.) RANK 8 NODE 12 --> 14.1534338 sigma out 1act.( 16.2766018 sig in 12act.) RANK 9 NODE 13 --> 12.4277525 sigma out 1act.( 11.31528 sig in 12act.) RANK 10 NODE 10 --> 10.9816294 sigma out 1act.( 9.45255375 sig in 12act.) RANK 11 NODE 9 --> 7.0031848 sigma out 1act.( 6.67452526 sig in 12act.) RANK 12 NODE 4 --> 5.26434374 sigma out 1act.( 4.2524581 sig in 12act.) RANK 13 NODE 2 --> 3.6704967 sigma out 1act.( 2.73199248 sig in 12act.) RANK 14 NODE 11 --> 2.2733686 sigma out 1act.( 1.96491802 sig in 12act.) RANK 15 NODE 7 --> 2.25382018 sigma out 1act.( 1.26711082 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 69.9729614 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 9 --> 30.9169216 sigma out 15 active outputs RANK 2 NODE 10 --> 27.7214394 sigma out 15 active outputs RANK 3 NODE 5 --> 23.8148193 sigma out 15 active outputs RANK 4 NODE 6 --> 21.4435635 sigma out 15 active outputs RANK 5 NODE 12 --> 20.554369 sigma out 15 active outputs RANK 6 NODE 11 --> 19.3491287 sigma out 15 active outputs RANK 7 NODE 1 --> 19.2688789 sigma out 15 active outputs RANK 8 NODE 8 --> 17.6285095 sigma out 15 active outputs RANK 9 NODE 4 --> 16.3455448 sigma out 15 active outputs RANK 10 NODE 3 --> 13.5730524 sigma out 15 active outputs RANK 11 NODE 7 --> 13.4193716 sigma out 15 active outputs RANK 12 NODE 2 --> 11.3533192 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 14 --> 37.0042381 sigma in 12act. ( 36.8709183 sig out 1act.) RANK 2 NODE 5 --> 24.4895306 sigma in 12act. ( 23.6323433 sig out 1act.) RANK 3 NODE 3 --> 23.5801849 sigma in 12act. ( 20.8797188 sig out 1act.) RANK 4 NODE 6 --> 22.1676483 sigma in 12act. ( 23.3961277 sig out 1act.) RANK 5 NODE 15 --> 19.2143784 sigma in 12act. ( 17.749073 sig out 1act.) RANK 6 NODE 8 --> 17.5549049 sigma in 12act. ( 16.5775204 sig out 1act.) RANK 7 NODE 7 --> 13.9266119 sigma in 12act. ( 8.20166397 sig out 1act.) RANK 8 NODE 12 --> 13.6000576 sigma in 12act. ( 12.1427259 sig out 1act.) RANK 9 NODE 1 --> 13.2479181 sigma in 12act. ( 13.2124968 sig out 1act.) RANK 10 NODE 13 --> 12.7335167 sigma in 12act. ( 11.9921579 sig out 1act.) RANK 11 NODE 10 --> 12.6480255 sigma in 12act. ( 10.8971024 sig out 1act.) RANK 12 NODE 11 --> 12.4404716 sigma in 12act. ( 3.79501772 sig out 1act.) RANK 13 NODE 2 --> 11.8042755 sigma in 12act. ( 5.37724209 sig out 1act.) RANK 14 NODE 4 --> 7.97957897 sigma in 12act. ( 5.87618113 sig out 1act.) RANK 15 NODE 9 --> 7.90260887 sigma in 12act. ( 6.65353251 sig out 1act.) sorted by output significance RANK 1 NODE 14 --> 36.8709183 sigma out 1act.( 37.0042381 sig in 12act.) 
RANK 2 NODE 5 --> 23.6323433 sigma out 1act.( 24.4895306 sig in 12act.) RANK 3 NODE 6 --> 23.3961277 sigma out 1act.( 22.1676483 sig in 12act.) RANK 4 NODE 3 --> 20.8797188 sigma out 1act.( 23.5801849 sig in 12act.) RANK 5 NODE 15 --> 17.749073 sigma out 1act.( 19.2143784 sig in 12act.) RANK 6 NODE 8 --> 16.5775204 sigma out 1act.( 17.5549049 sig in 12act.) RANK 7 NODE 1 --> 13.2124968 sigma out 1act.( 13.2479181 sig in 12act.) RANK 8 NODE 12 --> 12.1427259 sigma out 1act.( 13.6000576 sig in 12act.) RANK 9 NODE 13 --> 11.9921579 sigma out 1act.( 12.7335167 sig in 12act.) RANK 10 NODE 10 --> 10.8971024 sigma out 1act.( 12.6480255 sig in 12act.) RANK 11 NODE 7 --> 8.20166397 sigma out 1act.( 13.9266119 sig in 12act.) RANK 12 NODE 9 --> 6.65353251 sigma out 1act.( 7.90260887 sig in 12act.) RANK 13 NODE 4 --> 5.87618113 sigma out 1act.( 7.97957897 sig in 12act.) RANK 14 NODE 2 --> 5.37724209 sigma out 1act.( 11.8042755 sig in 12act.) RANK 15 NODE 11 --> 3.79501772 sigma out 1act.( 12.4404716 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 65.3076935 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.427014619 *** contribution from regularisation: 0.00506685907 *** contribution from error: -0.432081491 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.475176722 *** contribution from regularisation: 0.00281380839 *** contribution from error: -0.477990538 *********************************************** -----------------> Test sample ENTER BFGS code START -47430.2414 0.543118 0.0470698215 EXIT FROM BFGS code FG_START 0. 0.543118 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.492138147 *** contribution from regularisation: 0.00270359218 *** contribution from error: -0.494841754 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -49112.927 0.543118 -10.6824932 EXIT FROM BFGS code FG_LNSRCH 0. 0.541478097 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.522678554 *** contribution from regularisation: 0.00313545414 *** contribution from error: -0.525813997 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52160.7064 0.541478097 161.983322 EXIT FROM BFGS code NEW_X -52160.7064 0.541478097 161.983322 ENTER BFGS code NEW_X -52160.7064 0.541478097 161.983322 EXIT FROM BFGS code FG_LNSRCH 0. 0.560471296 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.525149465 *** contribution from regularisation: 0.00312909251 *** contribution from error: -0.52827853 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52407.2897 0.560471296 63.1420441 EXIT FROM BFGS code NEW_X -52407.2897 0.560471296 63.1420441 ENTER BFGS code NEW_X -52407.2897 0.560471296 63.1420441 EXIT FROM BFGS code FG_LNSRCH 0. 0.575858176 0. 
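The ROUND 1-12 lines before the iteration blocks record a greedy backward elimination: in every round the teacher removes the input whose removal costs the least total correlation to the target, and only the 11 most significant inputs are kept in the end. The sketch below illustrates that loop; it assumes the correlation matrix printed above (row/column 0 being the target, percentages converted to fractions) and uses the standard multiple-correlation formula R = sqrt(c^T C^-1 c) as a stand-in for whatever NeuroBayes computes internally.

    import numpy as np

    def total_correlation(corr, keep):
        """Multiple correlation of the target (index 0) with the inputs in `keep`:
        R = sqrt(c^T C^-1 c), with c the target-input correlations and C the
        input-input block.  Assumed form, not taken from NeuroBayes."""
        c = corr[0, keep]
        C = corr[np.ix_(keep, keep)]
        return float(np.sqrt(c @ np.linalg.solve(C, c)))

    def greedy_backward_elimination(corr, n_keep):
        """Per round, drop the input whose removal hurts the total correlation to
        the target the least -- the behaviour logged as ROUND 1..12 above."""
        keep = list(range(1, corr.shape[0]))            # all inputs, target excluded
        while len(keep) > n_keep:
            victim = min(keep,
                         key=lambda v: total_correlation(corr, keep)
                                       - total_correlation(corr, [k for k in keep if k != v]))
            keep.remove(victim)
            print(f"killed variable {victim}, remaining total correlation "
                  f"{100.0 * total_correlation(corr, keep):.4f} %")
        return keep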
--------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.526308179 *** contribution from regularisation: 0.00315592205 *** contribution from error: -0.529464126 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52522.9226 0.575858176 7.77176142 EXIT FROM BFGS code NEW_X -52522.9226 0.575858176 7.77176142 ENTER BFGS code NEW_X -52522.9226 0.575858176 7.77176142 EXIT FROM BFGS code FG_LNSRCH 0. 0.584083557 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.526732862 *** contribution from regularisation: 0.00320192124 *** contribution from error: -0.529934764 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52565.3051 0.584083557 -31.1739902 EXIT FROM BFGS code NEW_X -52565.3051 0.584083557 -31.1739902 ENTER BFGS code NEW_X -52565.3051 0.584083557 -31.1739902 EXIT FROM BFGS code FG_LNSRCH 0. 0.581129372 0. --------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.527085543 *** contribution from regularisation: 0.00322760595 *** contribution from error: -0.530313134 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52600.5019 0.581129372 -33.6492577 EXIT FROM BFGS code NEW_X -52600.5019 0.581129372 -33.6492577 ENTER BFGS code NEW_X -52600.5019 0.581129372 -33.6492577 EXIT FROM BFGS code FG_LNSRCH 0. 0.534143746 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.528848588 *** contribution from regularisation: 0.00316255493 *** contribution from error: -0.532011151 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52776.444 0.534143746 -141.588333 EXIT FROM BFGS code NEW_X -52776.444 0.534143746 -141.588333 ENTER BFGS code NEW_X -52776.444 0.534143746 -141.588333 EXIT FROM BFGS code FG_LNSRCH 0. 0.48030284 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 45.7517662 sigma out 15 active outputs RANK 2 NODE 8 --> 25.0852566 sigma out 15 active outputs RANK 3 NODE 1 --> 21.0205517 sigma out 15 active outputs RANK 4 NODE 3 --> 19.5496616 sigma out 15 active outputs RANK 5 NODE 11 --> 16.0970116 sigma out 15 active outputs RANK 6 NODE 5 --> 14.117012 sigma out 15 active outputs RANK 7 NODE 9 --> 12.4707975 sigma out 15 active outputs RANK 8 NODE 2 --> 11.8469667 sigma out 15 active outputs RANK 9 NODE 6 --> 11.5250177 sigma out 15 active outputs RANK 10 NODE 7 --> 9.93325806 sigma out 15 active outputs RANK 11 NODE 12 --> 9.20501804 sigma out 15 active outputs RANK 12 NODE 4 --> 8.23206711 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 39.1001472 sigma in 12act. ( 48.3478088 sig out 1act.) RANK 2 NODE 2 --> 31.308157 sigma in 12act. ( 34.6398811 sig out 1act.) RANK 3 NODE 14 --> 25.3739147 sigma in 12act. ( 21.5416393 sig out 1act.) RANK 4 NODE 11 --> 19.4325733 sigma in 12act. ( 22.8838844 sig out 1act.) RANK 5 NODE 6 --> 17.2554359 sigma in 12act. ( 17.5352955 sig out 1act.) RANK 6 NODE 3 --> 14.4295187 sigma in 12act. ( 12.2824945 sig out 1act.) 
RANK 7 NODE 15 --> 10.8203564 sigma in 12act. ( 8.92232895 sig out 1act.) RANK 8 NODE 8 --> 9.81214237 sigma in 12act. ( 8.68823338 sig out 1act.) RANK 9 NODE 5 --> 9.76234341 sigma in 12act. ( 9.50092411 sig out 1act.) RANK 10 NODE 1 --> 9.30190468 sigma in 12act. ( 8.80843639 sig out 1act.) RANK 11 NODE 9 --> 9.12123775 sigma in 12act. ( 7.60819435 sig out 1act.) RANK 12 NODE 4 --> 7.79307652 sigma in 12act. ( 6.05545998 sig out 1act.) RANK 13 NODE 12 --> 6.16126013 sigma in 12act. ( 5.31407213 sig out 1act.) RANK 14 NODE 13 --> 6.03750181 sigma in 12act. ( 4.29306078 sig out 1act.) RANK 15 NODE 10 --> 5.90298891 sigma in 12act. ( 3.61835051 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 48.3478088 sigma out 1act.( 39.1001472 sig in 12act.) RANK 2 NODE 2 --> 34.6398811 sigma out 1act.( 31.308157 sig in 12act.) RANK 3 NODE 11 --> 22.8838844 sigma out 1act.( 19.4325733 sig in 12act.) RANK 4 NODE 14 --> 21.5416393 sigma out 1act.( 25.3739147 sig in 12act.) RANK 5 NODE 6 --> 17.5352955 sigma out 1act.( 17.2554359 sig in 12act.) RANK 6 NODE 3 --> 12.2824945 sigma out 1act.( 14.4295187 sig in 12act.) RANK 7 NODE 5 --> 9.50092411 sigma out 1act.( 9.76234341 sig in 12act.) RANK 8 NODE 15 --> 8.92232895 sigma out 1act.( 10.8203564 sig in 12act.) RANK 9 NODE 1 --> 8.80843639 sigma out 1act.( 9.30190468 sig in 12act.) RANK 10 NODE 8 --> 8.68823338 sigma out 1act.( 9.81214237 sig in 12act.) RANK 11 NODE 9 --> 7.60819435 sigma out 1act.( 9.12123775 sig in 12act.) RANK 12 NODE 4 --> 6.05545998 sigma out 1act.( 7.79307652 sig in 12act.) RANK 13 NODE 12 --> 5.31407213 sigma out 1act.( 6.16126013 sig in 12act.) RANK 14 NODE 13 --> 4.29306078 sigma out 1act.( 6.03750181 sig in 12act.) RANK 15 NODE 10 --> 3.61835051 sigma out 1act.( 5.90298891 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 73.8969345 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.530219078 *** contribution from regularisation: 0.00313439663 *** contribution from error: -0.533353448 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -52913.2101 0.48030284 13.4226303 EXIT FROM BFGS code NEW_X -52913.2101 0.48030284 13.4226303 ENTER BFGS code NEW_X -52913.2101 0.48030284 13.4226303 EXIT FROM BFGS code FG_LNSRCH 0. 0.467218429 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.53081423 *** contribution from regularisation: 0.00316927629 *** contribution from error: -0.533983529 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -52972.6043 0.467218429 18.2644215 EXIT FROM BFGS code NEW_X -52972.6043 0.467218429 18.2644215 ENTER BFGS code NEW_X -52972.6043 0.467218429 18.2644215 EXIT FROM BFGS code FG_LNSRCH 0. 0.455575794 0. 
--------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.531175375 *** contribution from regularisation: 0.00308142579 *** contribution from error: -0.534256816 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53008.6448 0.455575794 2.35602427 EXIT FROM BFGS code NEW_X -53008.6448 0.455575794 2.35602427 ENTER BFGS code NEW_X -53008.6448 0.455575794 2.35602427 EXIT FROM BFGS code FG_LNSRCH 0. 0.447743505 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.531459153 *** contribution from regularisation: 0.00301258778 *** contribution from error: -0.53447175 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53036.9677 0.447743505 -3.29765534 EXIT FROM BFGS code NEW_X -53036.9677 0.447743505 -3.29765534 ENTER BFGS code NEW_X -53036.9677 0.447743505 -3.29765534 EXIT FROM BFGS code FG_LNSRCH 0. 0.424517244 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.531985283 *** contribution from regularisation: 0.00296301721 *** contribution from error: -0.534948289 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53089.4701 0.424517244 -5.56195402 EXIT FROM BFGS code NEW_X -53089.4701 0.424517244 -5.56195402 ENTER BFGS code NEW_X -53089.4701 0.424517244 -5.56195402 EXIT FROM BFGS code FG_LNSRCH 0. 0.411804616 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.532481194 *** contribution from regularisation: 0.00287999306 *** contribution from error: -0.535361171 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53138.9626 0.411804616 14.0720062 EXIT FROM BFGS code NEW_X -53138.9626 0.411804616 14.0720062 ENTER BFGS code NEW_X -53138.9626 0.411804616 14.0720062 EXIT FROM BFGS code FG_LNSRCH 0. 0.418057978 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.531901598 *** contribution from regularisation: 0.00293322536 *** contribution from error: -0.534834802 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53081.1214 0.418057978 -59.8235931 EXIT FROM BFGS code FG_LNSRCH 0. 0.413753957 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.532612622 *** contribution from regularisation: 0.00296512875 *** contribution from error: -0.535577774 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53152.0749 0.413753957 -8.70681572 EXIT FROM BFGS code NEW_X -53152.0749 0.413753957 -8.70681572 ENTER BFGS code NEW_X -53152.0749 0.413753957 -8.70681572 EXIT FROM BFGS code FG_LNSRCH 0. 0.419709086 0. 
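Each "Learn Path" block prints a loss that is exactly the sum of its two quoted contributions: an error term from the entropy loss selected in the setup plus a regularisation term from the network weights (e.g. -0.427014619 = -0.432081491 + 0.00506685907 in path 1). The sketch below shows one way to reproduce such a decomposition; the quadratic weight penalty, its strength and the exact sign/normalisation conventions are assumptions, not the NeuroBayes definitions.

    import numpy as np

    def learn_path_terms(out, y, w, net_weights, lam=1e-3):
        """Return (loss, regularisation, error) for network outputs `out` in (0,1),
        0/1 targets `y`, event weights `w` and the flattened network weights."""
        out = np.clip(out, 1e-12, 1.0 - 1e-12)
        # weighted mean log-likelihood of the targets under the network output
        error = float(np.average(y * np.log(out) + (1.0 - y) * np.log(1.0 - out),
                                 weights=np.abs(w)))
        regularisation = lam * float(np.sum(net_weights ** 2))   # illustrative penalty
        return error + regularisation, regularisation, error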
--------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.532856822 *** contribution from regularisation: 0.00289192912 *** contribution from error: -0.53574878 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53176.4478 0.419709086 23.2477665 EXIT FROM BFGS code NEW_X -53176.4478 0.419709086 23.2477665 ENTER BFGS code NEW_X -53176.4478 0.419709086 23.2477665 EXIT FROM BFGS code FG_LNSRCH 0. 0.427318454 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.532869577 *** contribution from regularisation: 0.0029373276 *** contribution from error: -0.535806894 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53177.7214 0.427318454 11.0419235 EXIT FROM BFGS code NEW_X -53177.7214 0.427318454 11.0419235 ENTER BFGS code NEW_X -53177.7214 0.427318454 11.0419235 EXIT FROM BFGS code FG_LNSRCH 0. 0.4859972 0. --------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 60.9488907 sigma out 15 active outputs RANK 2 NODE 3 --> 31.9764709 sigma out 15 active outputs RANK 3 NODE 8 --> 24.3631821 sigma out 15 active outputs RANK 4 NODE 1 --> 23.0337048 sigma out 15 active outputs RANK 5 NODE 2 --> 17.8685188 sigma out 15 active outputs RANK 6 NODE 11 --> 16.1525497 sigma out 15 active outputs RANK 7 NODE 5 --> 13.6322699 sigma out 15 active outputs RANK 8 NODE 6 --> 11.8441448 sigma out 15 active outputs RANK 9 NODE 7 --> 10.64639 sigma out 15 active outputs RANK 10 NODE 12 --> 9.18881989 sigma out 15 active outputs RANK 11 NODE 4 --> 9.13571262 sigma out 15 active outputs RANK 12 NODE 9 --> 6.8002553 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 54.1189766 sigma in 12act. ( 72.1270752 sig out 1act.) RANK 2 NODE 2 --> 45.6194649 sigma in 12act. ( 45.3016014 sig out 1act.) RANK 3 NODE 11 --> 24.4957771 sigma in 12act. ( 27.7664089 sig out 1act.) RANK 4 NODE 14 --> 19.7768002 sigma in 12act. ( 19.3612041 sig out 1act.) RANK 5 NODE 6 --> 18.0815239 sigma in 12act. ( 20.9256992 sig out 1act.) RANK 6 NODE 3 --> 13.7951918 sigma in 12act. ( 13.6038542 sig out 1act.) RANK 7 NODE 15 --> 13.6795168 sigma in 12act. ( 14.0051193 sig out 1act.) RANK 8 NODE 8 --> 11.2898598 sigma in 12act. ( 11.9917336 sig out 1act.) RANK 9 NODE 1 --> 10.3131113 sigma in 12act. ( 11.1613388 sig out 1act.) RANK 10 NODE 9 --> 6.83795071 sigma in 12act. ( 6.61267376 sig out 1act.) RANK 11 NODE 4 --> 5.91599417 sigma in 12act. ( 5.25018263 sig out 1act.) RANK 12 NODE 12 --> 5.46516275 sigma in 12act. ( 5.39685774 sig out 1act.) RANK 13 NODE 5 --> 5.41669226 sigma in 12act. ( 4.87910604 sig out 1act.) RANK 14 NODE 10 --> 3.10064793 sigma in 12act. ( 0.910223901 sig out 1act.) RANK 15 NODE 13 --> 2.28725386 sigma in 12act. ( 0.111768492 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 72.1270752 sigma out 1act.( 54.1189766 sig in 12act.) RANK 2 NODE 2 --> 45.3016014 sigma out 1act.( 45.6194649 sig in 12act.) RANK 3 NODE 11 --> 27.7664089 sigma out 1act.( 24.4957771 sig in 12act.) RANK 4 NODE 6 --> 20.9256992 sigma out 1act.( 18.0815239 sig in 12act.) RANK 5 NODE 14 --> 19.3612041 sigma out 1act.( 19.7768002 sig in 12act.) RANK 6 NODE 15 --> 14.0051193 sigma out 1act.( 13.6795168 sig in 12act.) 
RANK 7 NODE 3 --> 13.6038542 sigma out 1act.( 13.7951918 sig in 12act.) RANK 8 NODE 8 --> 11.9917336 sigma out 1act.( 11.2898598 sig in 12act.) RANK 9 NODE 1 --> 11.1613388 sigma out 1act.( 10.3131113 sig in 12act.) RANK 10 NODE 9 --> 6.61267376 sigma out 1act.( 6.83795071 sig in 12act.) RANK 11 NODE 12 --> 5.39685774 sigma out 1act.( 5.46516275 sig in 12act.) RANK 12 NODE 4 --> 5.25018263 sigma out 1act.( 5.91599417 sig in 12act.) RANK 13 NODE 5 --> 4.87910604 sigma out 1act.( 5.41669226 sig in 12act.) RANK 14 NODE 10 --> 0.910223901 sigma out 1act.( 3.10064793 sig in 12act.) RANK 15 NODE 13 --> 0.111768492 sigma out 1act.( 2.28725386 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 98.0454712 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.53333056 *** contribution from regularisation: 0.00284292642 *** contribution from error: -0.536173463 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -53223.722 0.4859972 -11.6376257 EXIT FROM BFGS code NEW_X -53223.722 0.4859972 -11.6376257 ENTER BFGS code NEW_X -53223.722 0.4859972 -11.6376257 EXIT FROM BFGS code FG_LNSRCH 0. 0.545274198 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.532922804 *** contribution from regularisation: 0.00298541877 *** contribution from error: -0.535908222 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53183.0286 0.545274198 60.7877464 EXIT FROM BFGS code FG_LNSRCH 0. 0.498947233 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.533040822 *** contribution from regularisation: 0.00320961094 *** contribution from error: -0.536250412 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53194.8086 0.498947233 0.162416428 EXIT FROM BFGS code FG_LNSRCH 0. 0.486699462 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.533132255 *** contribution from regularisation: 0.0030476104 *** contribution from error: -0.536179841 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53203.9328 0.486699462 -10.1254549 EXIT FROM BFGS code FG_LNSRCH 0. 0.486000925 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.533151567 *** contribution from regularisation: 0.0030219357 *** contribution from error: -0.536173522 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -53205.8603 0.486000925 -10.7160845 EXIT FROM BFGS code FG_LNSRCH 0. 0.4859972 0. 
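The topology block earlier in the log reports Nodes(1) = 12, Nodes(2) = 15, Nodes(3) = 1, i.e. the 11 kept inputs plus one extra (bias-like) input node feeding 15 hidden nodes and a single output, trained with a BFGS-type optimiser. As a rough, self-contained illustration of what such a network computes, here is a 12-15-1 forward pass and a fit of its flattened weight vector with SciPy's L-BFGS-B on toy data; the activation functions and the use of scipy.optimize are assumptions for the sketch, not the NeuroBayes implementation.

    import numpy as np
    from scipy.optimize import minimize

    N_IN, N_HID = 12, 15                                  # Nodes(1), Nodes(2) from the log

    def unpack(theta):
        """Split the flat parameter vector into the two weight matrices."""
        w1 = theta[:N_IN * N_HID].reshape(N_IN, N_HID)
        w2 = theta[N_IN * N_HID:].reshape(N_HID, 1)
        return w1, w2

    def forward(theta, X):
        """12 -> 15 -> 1 multilayer perceptron with sigmoid output."""
        w1, w2 = unpack(theta)
        hidden = np.tanh(X @ w1)
        return (1.0 / (1.0 + np.exp(-(hidden @ w2)))).ravel()

    def objective(theta, X, y, w, lam=1e-3):
        """Entropy (cross-entropy) error plus a quadratic regularisation term."""
        out = np.clip(forward(theta, X), 1e-12, 1.0 - 1e-12)
        error = -np.average(y * np.log(out) + (1 - y) * np.log(1 - out), weights=np.abs(w))
        return error + lam * np.sum(theta ** 2)

    # Toy data in place of the preprocessed training sample:
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, N_IN))
    y = rng.integers(0, 2, size=1000)
    w = np.ones(1000)
    theta0 = rng.normal(scale=0.1, size=N_IN * N_HID + N_HID)
    fit = minimize(objective, theta0, args=(X, y, w), method="L-BFGS-B",
                   options={"maxiter": 250})              # 250 iterations, as configured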
---------------------------------------------------
Iteration : 25
***********************************************
***  Learn Path 25
***  loss function:                    -0.533136725
***  contribution from regularisation:  0.00303674443
***  contribution from error:          -0.536173463
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -53204.38 0.4859972 -10.725605
EXIT FROM BFGS code FG_LNSRCH 0. 0.4859972 0.
---------------------------------------------------
Iteration : 26
***********************************************
***  Learn Path 26
***  loss function:                    -0.53313303
***  contribution from regularisation:  0.00304047065
***  contribution from error:          -0.536173522
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -53204.0081 0.4859972 -10.5351152
EXIT FROM BFGS code FG_LNSRCH 0. 0.4859972 0.
---------------------------------------------------
Iteration : 27
***********************************************
***  Learn Path 27
***  loss function:                    -0.533147871
***  contribution from regularisation:  0.00302561116
***  contribution from error:          -0.536173463
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -53205.491 0.4859972 -10.5587053
EXIT FROM BFGS code FG_LNSRCH 0. 0.4859972 0.
---------------------------------------------------
Iteration : 28
***********************************************
***  Learn Path 28
***  loss function:                    -0.533137441
***  contribution from regularisation:  0.00303602428
***  contribution from error:          -0.536173463
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH   -53204.4518 0.4859972 -10.4653931
EXIT FROM BFGS code NEW_X    -53204.4518 0.4859972 -10.4653931
ENTER BFGS code NEW_X        -53204.4518 0.4859972 -10.4653931
EXIT FROM BFGS code CONVERGENC -53204.4518 0.4859972 -10.4653931
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
  RANK  1  NODE 10 --> 93.3391571 sigma out  15 active outputs
  RANK  2  NODE  3 --> 48.4921074 sigma out  15 active outputs
  RANK  3  NODE  8 --> 37.2882385 sigma out  15 active outputs
  RANK  4  NODE  1 --> 35.2936859 sigma out  15 active outputs
  RANK  5  NODE  2 --> 26.8817196 sigma out  15 active outputs
  RANK  6  NODE 11 --> 23.8665943 sigma out  15 active outputs
  RANK  7  NODE  5 --> 21.226284 sigma out  15 active outputs
  RANK  8  NODE  6 --> 17.9516335 sigma out  15 active outputs
  RANK  9  NODE  7 --> 16.0822506 sigma out  15 active outputs
  RANK 10  NODE 12 --> 13.9388561 sigma out  15 active outputs
  RANK 11  NODE  4 --> 13.7463436 sigma out  15 active outputs
  RANK 12  NODE  9 --> 10.4626598 sigma out  15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
  RANK  1  NODE  7 --> 83.3756256 sigma in 12act. ( 111.190109 sig out 1act.)
  RANK  2  NODE  2 --> 69.4717026 sigma in 12act. ( 70.3759003 sig out 1act.)
  RANK  3  NODE 11 --> 36.9556847 sigma in 12act. ( 43.3940277 sig out 1act.)
  RANK  4  NODE 14 --> 30.0175018 sigma in 12act. ( 30.4717751 sig out 1act.)
  RANK  5  NODE  6 --> 27.6037369 sigma in 12act. ( 32.71241 sig out 1act.)
  RANK  6  NODE  3 --> 20.917448 sigma in 12act. ( 21.5817528 sig out 1act.)
  RANK  7  NODE 15 --> 20.6556492 sigma in 12act. ( 21.7428837 sig out 1act.)
  RANK  8  NODE  8 --> 16.9999447 sigma in 12act. ( 18.5945759 sig out 1act.)
  RANK  9  NODE  1 --> 15.5939951 sigma in 12act. ( 17.3268929 sig out 1act.)
  RANK 10  NODE  9 --> 10.0505228 sigma in 12act. ( 10.3465261 sig out 1act.)
  RANK 11  NODE  4 --> 8.52047348 sigma in 12act. ( 7.96236753 sig out 1act.)
  RANK 12  NODE 12 --> 8.02144623 sigma in 12act. ( 8.28851414 sig out 1act.)
  RANK 13  NODE  5 --> 7.59774351 sigma in 12act. ( 7.49062204 sig out 1act.)
  RANK 14  NODE 10 --> 3.42309308 sigma in 12act. ( 1.42796433 sig out 1act.)
  RANK 15  NODE 13 --> 2.41647387 sigma in 12act. ( 0.172823608 sig out 1act.)
sorted by output significance
  RANK  1  NODE  7 --> 111.190109 sigma out 1act. ( 83.3756256 sig in 12act.)
  RANK  2  NODE  2 --> 70.3759003 sigma out 1act. ( 69.4717026 sig in 12act.)
  RANK  3  NODE 11 --> 43.3940277 sigma out 1act. ( 36.9556847 sig in 12act.)
  RANK  4  NODE  6 --> 32.71241 sigma out 1act. ( 27.6037369 sig in 12act.)
  RANK  5  NODE 14 --> 30.4717751 sigma out 1act. ( 30.0175018 sig in 12act.)
  RANK  6  NODE 15 --> 21.7428837 sigma out 1act. ( 20.6556492 sig in 12act.)
  RANK  7  NODE  3 --> 21.5817528 sigma out 1act. ( 20.917448 sig in 12act.)
  RANK  8  NODE  8 --> 18.5945759 sigma out 1act. ( 16.9999447 sig in 12act.)
  RANK  9  NODE  1 --> 17.3268929 sigma out 1act. ( 15.5939951 sig in 12act.)
  RANK 10  NODE  9 --> 10.3465261 sigma out 1act. ( 10.0505228 sig in 12act.)
  RANK 11  NODE 12 --> 8.28851414 sigma out 1act. ( 8.02144623 sig in 12act.)
  RANK 12  NODE  4 --> 7.96236753 sigma out 1act. ( 8.52047348 sig in 12act.)
  RANK 13  NODE  5 --> 7.49062204 sigma out 1act. ( 7.59774351 sig in 12act.)
  RANK 14  NODE 10 --> 1.42796433 sigma out 1act. ( 3.42309308 sig in 12act.)
  RANK 15  NODE 13 --> 0.172823608 sigma out 1act. ( 2.41647387 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
  RANK  1  NODE  1 --> 151.919418 sigma in 15 active inputs
***********************************************
***  Learn Path 250
***  loss function:                    -0.533142745
***  contribution from regularisation:  0.00303070061
***  contribution from error:          -0.536173463
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 28167
Closing output file
done
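Every iteration above is also evaluated on the held-out test sample ("-----------------> Test sample"), and the final expertise is written to expert.nb. A typical follow-up step, sketched here rather than taken from the log, is to compare the network output for test-sample signal and background events, for instance through the area under the ROC curve computed from the rank statistic.

    import numpy as np

    def roc_auc(out_signal, out_background):
        """Probability that a random signal event scores above a random background
        event (area under the ROC curve), via the Mann-Whitney rank statistic."""
        scores = np.concatenate([out_signal, out_background])
        labels = np.concatenate([np.ones(len(out_signal)), np.zeros(len(out_background))])
        order = np.argsort(scores)
        ranks = np.empty_like(order, dtype=float)
        ranks[order] = np.arange(1, len(scores) + 1)
        n_s, n_b = len(out_signal), len(out_background)
        return (ranks[labels == 1].sum() - n_s * (n_s + 1) / 2) / (n_s * n_b)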