NNInput NNInputs_120.root
Options for steering
  Constraint : lep1_E<400&&lep2_E<400&&
  HiLoSbString : SB
  SbString : Target
  WeightString : TrainWeight
  EqualizeSB : 0
  EvaluateVariables : 0
  SetNBProcessingDefault : 1
  UseNeuroBayes : 1
  WeightEvents : 1
  NBTreePrepEvPrint : 1
  NBTreePrepReportInterval : 10000
  NB_Iter : 250
  NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 198595 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 50253 nbkg = 148342
Bkg Entries: 148342  Sig Entries: 50253  Chosen entries: 50253
Signal fraction: 1  Background fraction: 0.338764
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 148342
Actual Signal Entries: 50253
Entries to split: 50253  Test with : 25126  Train with : 25126
*********************************************
*  This product is licenced for educational *
*  and scientific use only. Commercial use  *
*  is prohibited !                          *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
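The block above is the event selection and bookkeeping that NNAna::CopyTree performs before training: the steering Constraint is applied, signal and background trees are copied out with Target==1 / Target==0 appended to the cut, and the chosen entries are split evenly into a training and a test half. A minimal ROOT sketch of the same steps (the input tree name "nninput" and the first-half/second-half split rule are assumptions; the file name, selection strings and expected counts are taken from the log):

// Sketch only, not the NNAna implementation: signal/background copies and a
// 50/50 test/train split with plain ROOT TTree::CopyTree.
#include "TFile.h"
#include "TTree.h"
#include <string>
#include <iostream>

void copy_trees() {
  TFile* in   = TFile::Open("NNInputs_120.root");
  TTree* tree = (TTree*)in->Get("nninput");            // assumed tree name

  TFile out("nn_trees.root", "RECREATE");
  const std::string cut = "lep1_E<400&&lep2_E<400&&";   // steering Constraint

  TTree* sig = tree->CopyTree((cut + "Target==1").c_str());  // expect 50253
  TTree* bkg = tree->CopyTree((cut + "Target==0").c_str());  // expect 148342
  std::cout << "nsig = " << sig->GetEntries()
            << "  nbkg = " << bkg->GetEntries() << "\n";

  // Even split of the chosen entries; the actual split rule used by NNAna is
  // not shown in the log, here simply first half vs. second half.
  Long64_t n = sig->GetEntries();
  TTree* train = sig->CopyTree("", "", n / 2, 0);
  TTree* test  = sig->CopyTree("", "", n - n / 2, n / 2);
  std::cout << "Train with : " << train->GetEntries()
            << "  Test with : " << test->GetEntries() << "\n";
  out.Write();
}

Run as an ordinary ROOT macro, e.g. root -l -q copy_trees.C.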
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 50253 for Signal
Prepared event 0 for Signal with 50253 events
====Entry 0
Variable Ht : 156.497
Variable LepAPt : 24.4462
Variable LepBPt : 15.2546
Variable MetSigLeptonsJets : 4.03888
Variable MetSpec : 26.0087
Variable SumEtLeptonsJets : 113.473
Variable VSumJetLeptonsPt : 37.0367
Variable addEt : 82.7245
Variable dPhiLepSumMet : 0.85328
Variable dPhiLeptons : 0.330941
Variable dRLeptons : 0.538617
Variable lep1_E : 30.9903
Variable lep2_E : 15.9065
===Show Start ======> EVENT:0
DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2120 Ht = 156.497 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 24.4462 LepAPt = 24.4462 LepBEt = 15.2546 LepBPt = 15.2546 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 43.0236 MetDelPhi = 0.649165 MetSig = 3.00203 MetSigLeptonsJets = 4.03888 MetSpec = 26.0087 Mjj = 0 MostCentralJetEta = -1.64867 MtllMet = 86.7807 Njets = 1 SB = 0 SumEt = 205.392 SumEtJets = 0 SumEtLeptonsJets = 113.473 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 37.0367 addEt = 82.7245 dPhiLepSumMet = 0.85328 dPhiLeptons = 0.330941 dRLeptons = 0.538617 diltype = 45 dimass = 10.4324 event = 71 jet1_Et = 73.7722 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 30.9903 lep2_E = 15.9065 rand = 0.999742 run = 232428 weight = 1.10299e-06
===Show End
Prepared event 10000 for Signal with 50253 events
Prepared event 20000 for Signal with 50253 events
Prepared event 30000 for Signal with 50253 events
Prepared event 40000 for Signal with 50253 events
Prepared event 50000 for Signal with 50253 events
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 148342 for Background
Prepared event 0 for Background with 148342 events
====Entry 0
Variable Ht : 85.3408
Variable LepAPt : 31.1294
Variable LepBPt : 10.0664
Variable MetSigLeptonsJets : 6.87768
Variable MetSpec : 44.1437
Variable SumEtLeptonsJets : 41.1959
Variable VSumJetLeptonsPt : 41.0132
Variable addEt : 85.3408
Variable dPhiLepSumMet : 3.10536
Variable dPhiLeptons : 0.219342
Variable dRLeptons : 0.424454
Variable lep1_E : 32.3548
Variable lep2_E : 10.1027
===Show Start ======> EVENT:0
DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 0.307239 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168
===Show End
Prepared event 10000 for Background with 148342 events
Prepared event 20000 for Background with 148342 events
Prepared event 30000 for Background with 148342 events
Prepared event 40000 for Background with 148342 events
Prepared event 50000 for Background with 148342 events
Prepared event 60000 for Background with 148342 events
Prepared event 70000 for Background with 148342 events
Prepared event 80000 for Background with 148342 events
Prepared event 90000 for Background with 148342 events
Prepared event 100000 for Background with 148342 events
Prepared event 110000 for Background with 148342 events
Prepared event 120000 for Background with 148342 events
Prepared event 130000 for Background with 148342 events
Prepared event 140000 for Background with 148342 events
Warning: found 4705 negative weights.
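The warning above counts entries whose TrainWeight is negative; the preprocessing banner that follows also quotes a signal fraction near 62 %, which is not the raw 50253/198595 entry ratio and so apparently reflects the event weighting. A small sketch of how such tallies can be made from the input tree (branch names as in the event dump above; the tree/file names and the float branch types are assumptions, and no attempt is made to reproduce the exact numbers):

// Sketch: count negative training weights and form a weighted class fraction
// from the TrainWeight and Target branches.
#include "TFile.h"
#include "TTree.h"
#include <iostream>

void weight_summary() {
  TFile* f = TFile::Open("NNInputs_120.root");
  TTree* t = (TTree*)f->Get("nninput");                // assumed tree name

  Long64_t nNeg = t->GetEntries("TrainWeight<0");       // cf. "found 4705 negative weights"

  float w = 0.f, target = 0.f;                          // branch types assumed float
  double wSig = 0., wTot = 0.;
  t->SetBranchAddress("TrainWeight", &w);
  t->SetBranchAddress("Target", &target);
  for (Long64_t i = 0; i < t->GetEntries(); ++i) {
    t->GetEntry(i);
    wTot += w;
    if (target > 0.5) wSig += w;
  }
  std::cout << "negative weights: " << nNeg << "\n"
            << "weighted signal fraction: " << 100. * wSig / wTot << " %\n";
}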
<< hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 198595 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4705 negative weights. Signal fraction: 62.4027367 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 60.0046387 63.4059753 65.442749 66.9286652 68.0126038 68.8894424 69.9832306 70.8530884 71.6016693 72.3446045 73.070076 73.8315659 74.4161301 75.1040878 75.6832581 76.3920822 76.9620514 77.4252167 78.0480728 78.6468811 79.2773514 79.7212067 80.3677902 80.908165 81.4263 82.1029205 82.641098 83.2085876 83.8011017 84.6486206 85.3290405 86.1929016 86.9804535 87.8430939 88.6494293 89.4719849 90.5290451 91.5601044 92.2830963 93.1322021 94.1556396 95.217453 96.035202 96.9411163 97.9290466 98.8156738 99.7720337 100.789062 101.632202 102.513214 103.498093 104.212822 105.045868 105.872482 106.774376 107.810806 108.723541 109.590263 110.593521 111.494186 112.401955 113.392319 114.305183 115.347198 116.379501 117.441223 118.68161 119.950729 121.340698 122.701126 124.280777 126.063904 127.90715 129.632278 131.615875 133.43988 135.335358 137.327866 139.300171 141.256378 143.524612 145.88031 148.198151 150.725006 153.422104 156.189026 159.158966 162.605804 166.247208 170.26416 175.203705 180.082458 186.304642 193.490128 200.797089 209.3367 222.340698 237.603119 262.858032 298.336639 782.079102 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.3212433 20.5738525 20.8374786 21.0639839 21.2826118 21.507761 21.7331696 21.9607754 22.140728 22.3520451 22.5269928 22.7025909 22.8831024 23.0880508 23.2645874 23.4222755 23.591198 23.7902908 24.0058823 24.2159462 24.4199924 24.6055412 24.7820168 24.9816628 25.1728325 25.3084984 25.4981651 25.6659279 25.8317223 26.0305805 26.2077866 26.3996048 26.5919819 26.7689056 26.9884453 27.1669426 27.3434639 27.586256 27.7640762 27.9480629 28.1154976 28.3191261 28.5394058 28.7360783 28.9248619 29.1260567 29.3463097 29.5672798 29.7743225 29.9761925 30.1948452 30.4516869 30.6833134 30.9212608 31.1776352 31.4407959 31.6688271 31.8740349 32.1123123 32.3644714 32.6582413 32.9139252 33.1819305 33.4648399 33.7631989 34.0292892 34.2958603 34.6187897 34.9228134 35.2770691 35.5989418 35.9279404 36.2747765 36.6914101 37.0817871 37.4652405 37.8189926 38.2723618 38.7214584 39.1651764 39.6671638 40.1580544 40.6730881 41.1841354 41.8592682 42.5354996 43.288269 44.0486679 44.8633041 45.8103027 46.9510117 48.2441597 49.7795715 51.6281662 53.9206924 56.624115 
60.1660614 65.2915649 75.5832825 228.385986 ------------------------------ Transdef: Tab for variable 4 10.0000496 10.1023731 10.1991959 10.2991447 10.3941765 10.4826565 10.5704403 10.6632614 10.7577744 10.835825 10.9231949 11.0015059 11.1140909 11.2110291 11.3006783 11.4127426 11.5166912 11.6238289 11.7269287 11.8257942 11.9338322 12.0514698 12.1680088 12.2769299 12.3910007 12.5053635 12.6188946 12.741087 12.8444414 12.9568405 13.073822 13.2041245 13.3383923 13.4626617 13.5898943 13.6921272 13.8144398 13.9361401 14.0514164 14.1844006 14.3345108 14.4663277 14.5934181 14.7307348 14.8731327 15.0189667 15.1733227 15.314703 15.4502993 15.5910807 15.7487736 15.8936548 16.0352612 16.2125511 16.3764191 16.5358639 16.690937 16.8506165 17.0160046 17.2013645 17.3762703 17.5463333 17.7115612 17.9093781 18.1074696 18.2981377 18.4951172 18.6852798 18.8986664 19.1338768 19.3633633 19.6060295 19.8508358 20.0834026 20.2435684 20.4092712 20.6272774 20.8687897 21.1104183 21.3677597 21.6286278 21.8928185 22.1857433 22.4909344 22.7908249 23.1235218 23.4848614 23.8430481 24.2457829 24.6593876 25.1606941 25.6768188 26.2247086 26.8089085 27.5587006 28.4068718 29.6586189 31.2020149 33.5773048 38.1040726 79.8302307 ------------------------------ Transdef: Tab for variable 5 1.0228225 2.3702364 2.78213239 3.0662775 3.29826069 3.50195551 3.6969285 3.84621096 4.00752306 4.12146902 4.20398712 4.29812527 4.39636612 4.50008106 4.58105755 4.65712452 4.7356596 4.80381823 4.87706089 4.94812679 5.01141262 5.07142258 5.13081121 5.18765402 5.24756241 5.2966671 5.34139729 5.39026165 5.43690681 5.48202705 5.52016544 5.55691385 5.59681368 5.63707304 5.67902851 5.72172356 5.76146173 5.80541277 5.84319496 5.88667107 5.92915392 5.96401978 6.00078869 6.03419209 6.07476282 6.10555458 6.14406729 6.18185043 6.21835327 6.25318766 6.27909422 6.3162055 6.35072517 6.39066696 6.42913342 6.47150993 6.50873613 6.54136562 6.57914257 6.61568117 6.64828205 6.68465328 6.72950554 6.76845312 6.80270672 6.83579826 6.87491703 6.9185133 6.96408749 6.99939537 7.04183102 7.08627892 7.13102436 7.17442608 7.21601677 7.25698757 7.3059988 7.35425854 7.39934731 7.4461689 7.49620724 7.54498005 7.60302925 7.65965939 7.72209835 7.79443073 7.86505699 7.9386096 8.01885986 8.09671783 8.18137169 8.26936245 8.37937546 8.49308014 8.63149261 8.79296112 8.99295902 9.25888824 9.59879303 10.2838144 17.8463287 ------------------------------ Transdef: Tab for variable 6 15.0119476 19.72826 24.256712 25.8436794 26.7607803 27.6365013 28.4946594 29.2415104 29.9449272 30.571476 31.1516323 31.6390324 32.1172409 32.6422043 33.0347519 33.4720116 33.8443985 34.24506 34.6274185 34.9065475 35.2642136 35.655365 35.980114 36.3233032 36.6009827 36.9100647 37.2015076 37.5493927 37.8611908 38.1702271 38.4511566 38.7421188 39.0725937 39.3623657 39.6564369 39.9081726 40.2424469 40.4865875 40.7431641 41.0511589 41.3614426 41.6738129 42.0257835 42.3828926 42.6818542 42.9763374 43.2515106 43.6275253 43.931488 44.2461586 44.5951118 44.972229 45.3377266 45.7284088 46.1206894 46.5555649 47.0022049 47.4419403 47.8540154 48.2369156 48.6620674 49.0502548 49.4482803 49.9258499 50.3452911 50.763916 51.1451492 51.5750923 52.0054092 52.4594727 52.9992561 53.5037766 54.0029716 54.5237427 55.0045166 55.4562607 55.9560928 56.4433899 56.9886856 57.5058517 58.0643959 58.6843109 59.347168 60.1062088 60.8610725 61.592598 62.4372253 63.3789215 64.4573975 65.6328888 67.0356903 68.4250565 70.0262909 71.9673004 74.1259003 76.7113113 79.9509583 83.6095428 89.7167511 99.9298935 236.562576 
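The Transdef tables (one per input variable; the remaining ones continue below) are the per-variable bin boundaries used by preprocessing option 12 quoted earlier: each input is equalised with its quantiles to a flat distribution and then mapped onto a Gaussian with mean 0 and sigma 1, before the decorrelation step. A sketch of that flat-to-Gaussian mapping, under the assumption that the tabulated values delimit equal-population bins; the linear interpolation and the use of TMath::NormQuantile are illustrative, not NeuroBayes' actual code:

// One Transdef table goes into 'edges'; the function returns the Gaussianised
// value of x.
#include "TMath.h"
#include <vector>
#include <algorithm>

double flatToGauss(double x, const std::vector<double>& edges) {
  const int n = edges.size();                       // e.g. 100 boundaries
  int bin = std::lower_bound(edges.begin(), edges.end(), x) - edges.begin();
  double u;                                         // fraction of sample below x
  if (bin <= 0)       u = 0.0;
  else if (bin >= n)  u = 1.0;
  else {                                            // interpolate inside the bin
    double lo = edges[bin - 1], hi = edges[bin];
    u = (bin - 1 + (x - lo) / (hi - lo)) / (n - 1);
  }
  u = std::min(std::max(u, 1e-6), 1.0 - 1e-6);      // keep away from 0 and 1
  return TMath::NormQuantile(u);                    // flat -> Gaussian, mean 0 sigma 1
}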
------------------------------ Transdef: Tab for variable 7 30.1476021 32.5601349 33.3725624 34.0371857 34.6814804 35.1388359 35.5553665 36.0082855 36.359539 36.8033714 37.178627 37.5232086 37.8447495 38.0876007 38.3866882 38.7001114 38.9012833 39.2140198 39.471817 39.7409363 40.0002594 40.2227631 40.4718094 40.7784691 41.0630264 41.4290466 41.7407455 42.0544052 42.3852921 42.7894592 43.1601105 43.561554 44.0404587 44.5370712 44.9521217 45.4465408 45.8841438 46.4550858 46.9193344 47.4897766 48.0969086 48.617897 49.0679855 49.6572456 50.2291527 50.781044 51.3164291 51.8804398 52.4117355 52.9269791 53.504921 54.1528549 54.8128853 55.3893433 56.0050049 56.6449203 57.2583771 57.9128456 58.6225204 59.3513527 60.1808853 61.0031662 61.7276154 62.8027954 63.8387146 64.9945908 66.293602 67.5433655 68.7873611 70.1828461 71.4012909 72.9334412 74.1731186 75.4922867 76.77108 78.3523026 79.9940491 81.6693573 83.2757111 85.2186203 87.2151489 89.0927124 91.3982849 93.6865845 96.3610153 98.8122864 101.475845 104.386879 107.565041 110.914429 114.625717 119.381302 124.240768 130.415741 137.359924 145.641235 156.180145 167.90683 188.865616 219.429565 499.137787 ------------------------------ Transdef: Tab for variable 8 2.18757391 21.3801956 26.6571884 29.0389748 30.5405312 31.5202065 32.2562141 32.8365173 33.3195114 33.8068237 34.1292572 34.5226555 34.8982544 35.2198334 35.5312042 35.8844604 36.1284332 36.4528351 36.7488861 37.0413589 37.2844734 37.5246124 37.7626114 38.0168991 38.2091599 38.4485779 38.6422348 38.8501968 39.0797119 39.281189 39.5134773 39.7367477 39.9541016 40.2074661 40.453804 40.7168198 40.980011 41.2865143 41.5665169 41.824398 42.1570663 42.5057068 42.7799187 43.0870132 43.4307938 43.761116 44.1370125 44.4531708 44.8367271 45.1940575 45.5400581 45.9192352 46.3200836 46.7017822 47.0916367 47.5116806 47.9486542 48.254818 48.5952721 48.9536819 49.3472137 49.7593536 50.1294174 50.540657 50.9548187 51.3435822 51.7537079 52.1606064 52.6008415 53.0207977 53.4968643 53.9692001 54.4637833 54.9007797 55.3464966 55.8599625 56.3812141 56.9199104 57.4842339 58.0181885 58.6194687 59.2202988 59.9401474 60.718689 61.5404739 62.5551338 63.4628983 64.5643845 65.6780014 67.0196686 68.4912491 70.0061188 71.7408295 73.8227081 76.1002274 78.9369583 82.7409363 87.5239944 94.4975128 108.414154 326.868286 ------------------------------ Transdef: Tab for variable 9 49.6214828 62.6421394 64.7922211 66.3352661 67.3526001 68.3083878 69.0579834 70.0311127 70.831665 71.5223465 72.1317596 72.861908 73.4781647 74.0618134 74.6456146 75.1936798 75.6545715 76.2347336 76.7143097 77.1515045 77.5923157 78.1380005 78.664032 79.123764 79.5382462 79.9894257 80.5328674 80.9729843 81.4608841 82.0067749 82.5368347 82.9862823 83.3888702 83.9330139 84.5807114 85.0903625 85.6980743 86.2964325 86.8801575 87.5261841 88.2107468 88.7733612 89.3656921 90.1443863 90.8922119 91.5997162 92.1630859 92.9269867 93.5516281 94.325943 95.1204987 95.7677841 96.412735 97.0937958 97.8598022 98.4832764 99.2122955 99.9682007 100.740044 101.420944 102.039963 102.731461 103.467728 104.145714 104.793427 105.434067 106.068237 106.787323 107.478172 108.131668 108.831848 109.51001 110.251671 110.910454 111.629547 112.304054 113.053726 113.739761 114.487297 115.265335 116.141396 116.900475 117.729454 118.662285 119.633766 120.571274 121.74498 122.896942 124.184113 125.746399 127.604156 129.49408 131.784119 134.508514 138.006622 142.269379 147.640137 155.149323 166.087097 187.69603 424.390259 ------------------------------ Transdef: Tab for variable 10 
0.0397654139 0.917725503 1.16517019 1.34039235 1.4954021 1.6193161 1.73297393 1.8350997 1.92057657 1.9996717 2.06522036 2.13119221 2.18711138 2.24006414 2.28926277 2.33081961 2.37274957 2.41193533 2.44514513 2.47859383 2.5114255 2.54091883 2.56877995 2.59801221 2.62071371 2.64479828 2.66986132 2.69348788 2.71613359 2.73524094 2.75468516 2.77308083 2.78798723 2.8043499 2.82105923 2.83385348 2.84519458 2.85433459 2.86535883 2.87638092 2.88741469 2.89674997 2.90584707 2.91505432 2.92220902 2.92934227 2.9371686 2.94522858 2.95227671 2.95992589 2.96581006 2.97192717 2.97728968 2.98327088 2.98864126 2.99339294 2.99719357 3.00278473 3.00735307 3.01242828 3.0166297 3.02161288 3.02592254 3.02996969 3.03415394 3.03786945 3.04173326 3.0451045 3.04910898 3.05329895 3.05696821 3.06023741 3.06389284 3.06710172 3.07015514 3.07306767 3.07595229 3.07889175 3.08207893 3.08463407 3.08798933 3.09145999 3.0946126 3.09749389 3.10043693 3.10308886 3.1054554 3.10835457 3.11044025 3.11312342 3.11557102 3.11834478 3.1211648 3.12375021 3.12651443 3.12871265 3.13135505 3.13369131 3.13615012 3.13883734 3.14159226 ------------------------------ Transdef: Tab for variable 11 6.91413879E-06 0.00604724884 0.0103163142 0.0174391866 0.0238204077 0.0308023058 0.0383609533 0.0452208556 0.0531400107 0.0603961907 0.067287147 0.0748654604 0.0812567472 0.0883833319 0.0955073833 0.101594359 0.108553529 0.11482048 0.121083111 0.127407193 0.134298027 0.140253603 0.146365166 0.152460396 0.157376766 0.162478566 0.168310523 0.174524486 0.180882633 0.185886741 0.191815495 0.197751135 0.203690737 0.209442735 0.215710878 0.222154617 0.228881195 0.234760642 0.240877509 0.247577488 0.254083157 0.261084437 0.266486645 0.272390664 0.278572023 0.28412199 0.290629923 0.296892345 0.303429425 0.31036818 0.315953374 0.323165596 0.330114782 0.336508125 0.343268394 0.349819481 0.356296331 0.361490101 0.367825508 0.375069648 0.381782055 0.387101918 0.392866611 0.399740338 0.406615853 0.413140297 0.418894827 0.425199926 0.431915343 0.438399553 0.445111275 0.451268435 0.457775861 0.464451909 0.472514182 0.477840543 0.484490335 0.492345095 0.500757754 0.508355141 0.516351104 0.524177372 0.532491446 0.54177922 0.550861478 0.561347842 0.571614623 0.580734134 0.591790795 0.603238583 0.615073562 0.630282044 0.643178463 0.657145143 0.673451483 0.696816206 0.721948683 0.752637148 0.789145231 0.84937191 1.1301049 ------------------------------ Transdef: Tab for variable 12 0.100055709 0.128441393 0.148763657 0.164430648 0.177486181 0.192011848 0.203620374 0.214584529 0.226200163 0.237013072 0.247909814 0.258152723 0.268029988 0.276601315 0.285288095 0.294096589 0.301930189 0.31025663 0.318979323 0.327809811 0.336510539 0.344457179 0.351403952 0.357982248 0.366175592 0.37357384 0.38081181 0.388455957 0.394378185 0.400561422 0.405553639 0.411094248 0.416178405 0.420922816 0.425672114 0.43030411 0.435419619 0.440027833 0.444430709 0.448711276 0.453898787 0.458330691 0.462108433 0.467013389 0.47157383 0.47582221 0.480240643 0.483979493 0.488960028 0.493099809 0.49704361 0.501559317 0.506014466 0.510422885 0.514828563 0.51922524 0.52402401 0.529053211 0.533959866 0.538878679 0.544537067 0.549277782 0.553611875 0.558316588 0.562471747 0.567166328 0.571640253 0.577159226 0.582418203 0.58754015 0.592943907 0.598525524 0.60387826 0.609536767 0.615561366 0.621463776 0.627737939 0.634227037 0.640356779 0.647031248 0.655020833 0.662344098 0.668555498 0.675216913 0.68175298 0.689958692 0.698429167 0.70722425 0.716932178 0.727541566 0.73790127 0.748335242 0.758236647 
0.771327198 0.785974205 0.799647629 0.821196079 0.8454898 0.875678182 0.915784597 1.13539624 ------------------------------ Transdef: Tab for variable 13 20.0277634 21.2990532 21.9994774 22.5313091 23.1040344 23.6301193 24.0853004 24.4841518 24.8808784 25.2661171 25.6157761 25.9810448 26.3548164 26.7265739 27.0510979 27.3616676 27.6628456 27.9660759 28.2873611 28.5701904 28.8529854 29.1175938 29.4397125 29.7055435 29.9884109 30.2889595 30.5750751 30.827858 31.0792961 31.3654442 31.6544762 31.9224968 32.2176857 32.4425659 32.702507 33.0038986 33.3129044 33.6117668 33.9014511 34.2388535 34.4380112 34.670414 34.9703217 35.2751236 35.5820618 35.9127769 36.1811485 36.4927826 36.8342667 37.2071114 37.5688591 37.91679 38.2526855 38.5753021 38.9188805 39.3226852 39.6715317 40.0351562 40.4846344 40.835083 41.245018 41.7125893 42.1754684 42.6916008 43.1248398 43.5607567 44.0571671 44.5982475 45.1574326 45.7457886 46.3595657 46.9131203 47.5091629 48.1418686 48.9004211 49.6096382 50.3287659 51.1096153 51.9259644 52.7428551 53.7367554 54.676281 55.6352692 56.7874374 57.8285065 58.9932404 60.2492142 61.6990204 63.4323578 65.0886078 66.8596191 68.5310059 70.6426086 73.3292236 76.3482895 79.589859 83.4536972 88.2318649 94.7685165 107.816254 313.966675 ------------------------------ Transdef: Tab for variable 14 10.0084028 10.5338364 10.8465748 11.0759697 11.3401146 11.5720482 11.8138332 12.0432711 12.2649612 12.4310894 12.6424446 12.8378983 13.0442543 13.2114582 13.4187126 13.627141 13.8149872 13.9804707 14.1499901 14.3454838 14.5085974 14.6887417 14.8828945 15.0709429 15.2385578 15.4397526 15.6189003 15.790102 15.9820709 16.1678696 16.3447876 16.5305786 16.7170639 16.8894691 17.0794773 17.2956352 17.5124664 17.7202892 17.8977928 18.0977592 18.2769299 18.4864597 18.6805096 18.8883591 19.091629 19.3081551 19.5190468 19.7451115 19.9547997 20.1881943 20.4062805 20.6148987 20.8277283 21.0699253 21.2926064 21.5564003 21.7486534 21.9774551 22.2109375 22.4422684 22.6806717 22.9253616 23.186018 23.4465599 23.706131 23.9888649 24.2703724 24.5502262 24.8642139 25.1608391 25.4818649 25.7220612 26.0225449 26.3432846 26.6829567 27.0786476 27.4224987 27.819561 28.2045612 28.6289558 29.0742683 29.5410271 30.0339317 30.5350266 31.0990257 31.6671715 32.3886147 33.1375351 33.9674301 34.7408371 35.5944138 36.7074509 37.8205566 39.2188339 40.4025192 42.1206284 44.2675934 47.2150078 51.3874435 57.9738007 128.659836 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 54.6 39.1 29.6 5.0 30.7 53.8 37.6 49.7 -31.1 -3.0 -8.1 10.9 4.2 2 54.6 100.0 63.4 47.3 12.0 57.1 93.9 61.8 89.9 -51.3 -16.1 -27.8 33.9 21.6 3 39.1 63.4 100.0 24.1 -9.1 23.4 67.1 45.2 69.3 -20.6 -15.9 -31.0 67.3 8.1 4 29.6 47.3 24.1 100.0 -4.1 19.6 49.7 35.1 51.6 -15.2 -21.0 -37.7 11.0 70.6 5 5.0 12.0 -9.1 -4.1 100.0 78.6 -16.1 41.2 39.4 32.7 0.6 3.2 -12.9 -8.2 6 30.7 57.1 23.4 19.6 78.6 100.0 34.4 70.3 73.3 -2.9 -7.1 -11.2 6.7 4.5 7 53.8 93.9 67.1 49.7 -16.1 34.4 100.0 55.6 77.6 -57.7 -15.3 -28.0 38.7 25.2 8 37.6 61.8 45.2 35.1 41.2 70.3 55.6 100.0 74.1 -9.2 -14.9 -23.1 23.8 16.6 9 49.7 89.9 69.3 51.6 39.4 73.3 77.6 74.1 100.0 -26.2 -18.5 -32.9 39.0 25.4 10 -31.1 -51.3 -20.6 -15.2 32.7 -2.9 -57.7 -9.2 -26.2 100.0 3.8 6.2 -8.9 -5.1 11 -3.0 -16.1 -15.9 -21.0 0.6 -7.1 -15.3 -14.9 -18.5 3.8 100.0 59.4 -6.4 -12.4 12 -8.1 -27.8 -31.0 -37.7 3.2 -11.2 -28.0 -23.1 -32.9 6.2 59.4 100.0 -11.5 -16.0 13 10.9 33.9 67.3 11.0 -12.9 6.7 38.7 23.8 39.0 -8.9 -6.4 -11.5 100.0 
38.3 14 4.2 21.6 8.1 70.6 -8.2 4.5 25.2 16.6 25.4 -5.1 -12.4 -16.0 38.3 100.0 TOTAL CORRELATION TO TARGET (diagonal) 119.6106 TOTAL CORRELATION OF ALL VARIABLES 59.8650954 ROUND 1: MAX CORR ( 59.8617189) AFTER KILLING INPUT VARIABLE 11 CONTR 0.635819732 ROUND 2: MAX CORR ( 59.8580323) AFTER KILLING INPUT VARIABLE 6 CONTR 0.664341174 ROUND 3: MAX CORR ( 59.7686348) AFTER KILLING INPUT VARIABLE 2 CONTR 3.27021771 ROUND 4: MAX CORR ( 59.7307746) AFTER KILLING INPUT VARIABLE 8 CONTR 2.12703366 ROUND 5: MAX CORR ( 59.6931889) AFTER KILLING INPUT VARIABLE 9 CONTR 2.1186409 ROUND 6: MAX CORR ( 59.3851962) AFTER KILLING INPUT VARIABLE 10 CONTR 6.05601144 ROUND 7: MAX CORR ( 59.006129) AFTER KILLING INPUT VARIABLE 13 CONTR 6.69912441 ROUND 8: MAX CORR ( 58.6704826) AFTER KILLING INPUT VARIABLE 3 CONTR 6.28472156 ROUND 9: MAX CORR ( 57.6442973) AFTER KILLING INPUT VARIABLE 12 CONTR 10.9252243 ROUND 10: MAX CORR ( 56.3751167) AFTER KILLING INPUT VARIABLE 5 CONTR 12.0295979 ROUND 11: MAX CORR ( 54.6218925) AFTER KILLING INPUT VARIABLE 4 CONTR 13.9500051 ROUND 12: MAX CORR ( 53.7604373) AFTER KILLING INPUT VARIABLE 14 CONTR 9.66263562 LAST REMAINING VARIABLE: 7 total correlation to target: 59.8650954 % total significance: 120.573605 sigma correlations of single variables to target: variable 2: 54.571787 % , in sigma: 109.912412 variable 3: 39.0856416 % , in sigma: 78.721944 variable 4: 29.6457893 % , in sigma: 59.7092455 variable 5: 4.97724169 % , in sigma: 10.0246056 variable 6: 30.7436896 % , in sigma: 61.9205139 variable 7: 53.7604373 % , in sigma: 108.278282 variable 8: 37.5780288 % , in sigma: 75.6854783 variable 9: 49.7175958 % , in sigma: 100.135642 variable 10: -31.0983699 % , in sigma: 62.6348714 variable 11: -2.96729092 % , in sigma: 5.97638673 variable 12: -8.14335916 % , in sigma: 16.4014466 variable 13: 10.8687335 % , in sigma: 21.890592 variable 14: 4.19747917 % , in sigma: 8.45409483 variables sorted by significance: 1 most relevant variable 7 corr 53.760437 , in sigma: 108.278282 2 most relevant variable 14 corr 9.6626358 , in sigma: 19.4614043 3 most relevant variable 4 corr 13.9500055 , in sigma: 28.0965467 4 most relevant variable 5 corr 12.0295982 , in sigma: 24.2286763 5 most relevant variable 12 corr 10.9252243 , in sigma: 22.0043694 6 most relevant variable 3 corr 6.28472137 , in sigma: 12.6579855 7 most relevant variable 13 corr 6.69912434 , in sigma: 13.4926298 8 most relevant variable 10 corr 6.05601168 , in sigma: 12.1973439 9 most relevant variable 9 corr 2.1186409 , in sigma: 4.26713042 10 most relevant variable 8 corr 2.12703371 , in sigma: 4.28403429 11 most relevant variable 2 corr 3.27021766 , in sigma: 6.58650801 12 most relevant variable 6 corr 0.664341152 , in sigma: 1.33804192 13 most relevant variable 11 corr 0.635819733 , in sigma: 1.28059726 global correlations between input variables: variable 2: 98.9758992 % variable 3: 93.6292779 % variable 4: 90.4317879 % variable 5: 94.8468601 % variable 6: 93.6363206 % variable 7: 98.4919752 % variable 8: 84.7662061 % variable 9: 98.7062214 % variable 10: 71.9329858 % variable 11: 60.0864468 % variable 12: 68.337455 % variable 13: 85.4320305 % variable 14: 87.0388188 % significance loss when removing single variables: variable 2: corr = 3.14856082 % , sigma = 6.34148037 variable 3: corr = 10.1096546 % , sigma = 20.3617399 variable 4: corr = 11.1284574 % , sigma = 22.413699 variable 5: corr = 5.00477632 % , sigma = 10.0800628 variable 6: corr = 0.64871495 % , sigma = 1.30656937 variable 7: corr = 4.51271152 % , sigma = 
9.0890007 variable 8: corr = 2.91274021 % , sigma = 5.86651677 variable 9: corr = 3.94490791 % , sigma = 7.94539395 variable 10: corr = 5.96312543 % , sigma = 12.0102628 variable 11: corr = 0.635819732 % , sigma = 1.28059726 variable 12: corr = 9.63527292 % , sigma = 19.4062931 variable 13: corr = 6.36901987 % , sigma = 12.8277701 variable 14: corr = 7.32630619 % , sigma = 14.7558296 Keep only 11 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 12 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 5 --> 18.9625854 sigma out 15 active outputs RANK 2 NODE 1 --> 16.0482941 sigma out 15 active outputs RANK 3 NODE 3 --> 15.6674986 sigma out 15 active outputs RANK 4 NODE 8 --> 15.315218 sigma out 15 active outputs RANK 5 NODE 4 --> 15.1252251 sigma out 15 active outputs RANK 6 NODE 9 --> 14.8074512 sigma out 15 active outputs RANK 7 NODE 6 --> 14.4894352 sigma out 15 active outputs RANK 8 NODE 10 --> 13.5256338 sigma out 15 active outputs RANK 9 NODE 2 --> 13.3142395 sigma out 15 active outputs RANK 10 NODE 7 --> 13.2260056 sigma out 15 active outputs RANK 11 NODE 12 --> 11.8350134 sigma out 15 active outputs RANK 12 NODE 11 --> 7.23115349 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 4 --> 26.8048401 sigma in 12act. ( 24.9462833 sig out 1act.) RANK 2 NODE 12 --> 24.6717529 sigma in 12act. ( 25.3698082 sig out 1act.) RANK 3 NODE 14 --> 15.0084515 sigma in 12act. ( 14.2727356 sig out 1act.) RANK 4 NODE 5 --> 14.8004656 sigma in 12act. ( 15.456748 sig out 1act.) RANK 5 NODE 13 --> 14.2689295 sigma in 12act. ( 16.0053539 sig out 1act.) RANK 6 NODE 15 --> 12.7269611 sigma in 12act. ( 12.8204899 sig out 1act.) RANK 7 NODE 8 --> 12.6355324 sigma in 12act. ( 14.680397 sig out 1act.) RANK 8 NODE 7 --> 10.1237049 sigma in 12act. ( 11.2974644 sig out 1act.) RANK 9 NODE 10 --> 6.78285599 sigma in 12act. ( 6.66308022 sig out 1act.) RANK 10 NODE 6 --> 3.41290069 sigma in 12act. ( 4.63877726 sig out 1act.) RANK 11 NODE 2 --> 2.84601951 sigma in 12act. ( 3.51130462 sig out 1act.) RANK 12 NODE 3 --> 2.40714765 sigma in 12act. ( 2.14314127 sig out 1act.) RANK 13 NODE 11 --> 2.35234666 sigma in 12act. ( 3.85809088 sig out 1act.) RANK 14 NODE 9 --> 1.86665785 sigma in 12act. ( 1.82309127 sig out 1act.) RANK 15 NODE 1 --> 1.68328702 sigma in 12act. ( 0.614494264 sig out 1act.) sorted by output significance RANK 1 NODE 12 --> 25.3698082 sigma out 1act.( 24.6717529 sig in 12act.) RANK 2 NODE 4 --> 24.9462833 sigma out 1act.( 26.8048401 sig in 12act.) RANK 3 NODE 13 --> 16.0053539 sigma out 1act.( 14.2689295 sig in 12act.) RANK 4 NODE 5 --> 15.456748 sigma out 1act.( 14.8004656 sig in 12act.) RANK 5 NODE 8 --> 14.680397 sigma out 1act.( 12.6355324 sig in 12act.) RANK 6 NODE 14 --> 14.2727356 sigma out 1act.( 15.0084515 sig in 12act.) RANK 7 NODE 15 --> 12.8204899 sigma out 1act.( 12.7269611 sig in 12act.) RANK 8 NODE 7 --> 11.2974644 sigma out 1act.( 10.1237049 sig in 12act.) RANK 9 NODE 10 --> 6.66308022 sigma out 1act.( 6.78285599 sig in 12act.) RANK 10 NODE 6 --> 4.63877726 sigma out 1act.( 3.41290069 sig in 12act.) RANK 11 NODE 11 --> 3.85809088 sigma out 1act.( 2.35234666 sig in 12act.) RANK 12 NODE 2 --> 3.51130462 sigma out 1act.( 2.84601951 sig in 12act.) RANK 13 NODE 3 --> 2.14314127 sigma out 1act.( 2.40714765 sig in 12act.) 
RANK 14 NODE 9 --> 1.82309127 sigma out 1act.( 1.86665785 sig in 12act.) RANK 15 NODE 1 --> 0.614494264 sigma out 1act.( 1.68328702 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 50.7313766 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 12 --> 25.9275742 sigma out 15 active outputs RANK 2 NODE 5 --> 19.8186302 sigma out 15 active outputs RANK 3 NODE 1 --> 18.1736622 sigma out 15 active outputs RANK 4 NODE 8 --> 16.6923447 sigma out 15 active outputs RANK 5 NODE 9 --> 16.0785408 sigma out 15 active outputs RANK 6 NODE 4 --> 15.9501247 sigma out 15 active outputs RANK 7 NODE 3 --> 15.1895409 sigma out 15 active outputs RANK 8 NODE 2 --> 15.1051426 sigma out 15 active outputs RANK 9 NODE 6 --> 14.381074 sigma out 15 active outputs RANK 10 NODE 7 --> 13.6659451 sigma out 15 active outputs RANK 11 NODE 10 --> 13.6421051 sigma out 15 active outputs RANK 12 NODE 11 --> 8.04285812 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 12 --> 25.6581593 sigma in 12act. ( 23.9858398 sig out 1act.) RANK 2 NODE 4 --> 21.8502407 sigma in 12act. ( 21.8651104 sig out 1act.) RANK 3 NODE 5 --> 17.0953465 sigma in 12act. ( 14.9907312 sig out 1act.) RANK 4 NODE 13 --> 16.4662933 sigma in 12act. ( 16.1958408 sig out 1act.) RANK 5 NODE 8 --> 15.5186033 sigma in 12act. ( 14.2619686 sig out 1act.) RANK 6 NODE 11 --> 13.7641611 sigma in 12act. ( 6.04661608 sig out 1act.) RANK 7 NODE 9 --> 13.6311998 sigma in 12act. ( 5.05000639 sig out 1act.) RANK 8 NODE 14 --> 12.810401 sigma in 12act. ( 12.2599478 sig out 1act.) RANK 9 NODE 15 --> 12.6487961 sigma in 12act. ( 11.6508446 sig out 1act.) RANK 10 NODE 7 --> 12.5523243 sigma in 12act. ( 10.9691572 sig out 1act.) RANK 11 NODE 6 --> 11.0795879 sigma in 12act. ( 6.81912422 sig out 1act.) RANK 12 NODE 1 --> 11.0115957 sigma in 12act. ( 2.8594439 sig out 1act.) RANK 13 NODE 2 --> 10.2902594 sigma in 12act. ( 4.98592281 sig out 1act.) RANK 14 NODE 10 --> 9.62836266 sigma in 12act. ( 6.53214598 sig out 1act.) RANK 15 NODE 3 --> 7.11306906 sigma in 12act. ( 2.93620038 sig out 1act.) sorted by output significance RANK 1 NODE 12 --> 23.9858398 sigma out 1act.( 25.6581593 sig in 12act.) RANK 2 NODE 4 --> 21.8651104 sigma out 1act.( 21.8502407 sig in 12act.) RANK 3 NODE 13 --> 16.1958408 sigma out 1act.( 16.4662933 sig in 12act.) RANK 4 NODE 5 --> 14.9907312 sigma out 1act.( 17.0953465 sig in 12act.) RANK 5 NODE 8 --> 14.2619686 sigma out 1act.( 15.5186033 sig in 12act.) RANK 6 NODE 14 --> 12.2599478 sigma out 1act.( 12.810401 sig in 12act.) RANK 7 NODE 15 --> 11.6508446 sigma out 1act.( 12.6487961 sig in 12act.) RANK 8 NODE 7 --> 10.9691572 sigma out 1act.( 12.5523243 sig in 12act.) RANK 9 NODE 6 --> 6.81912422 sigma out 1act.( 11.0795879 sig in 12act.) RANK 10 NODE 10 --> 6.53214598 sigma out 1act.( 9.62836266 sig in 12act.) RANK 11 NODE 11 --> 6.04661608 sigma out 1act.( 13.7641611 sig in 12act.) RANK 12 NODE 9 --> 5.05000639 sigma out 1act.( 13.6311998 sig in 12act.) RANK 13 NODE 2 --> 4.98592281 sigma out 1act.( 10.2902594 sig in 12act.) RANK 14 NODE 3 --> 2.93620038 sigma out 1act.( 7.11306906 sig in 12act.) RANK 15 NODE 1 --> 2.8594439 sigma out 1act.( 11.0115957 sig in 12act.) 
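The ROUND 1-12 messages earlier in the log are a greedy backward elimination: in each round the teacher drops the input whose removal costs the least total correlation to the target, which is how the "Keep only 11 most significant input variables" decision and the significance ordering are obtained. A schematic of that loop; totalCorrelation() is a placeholder for the teacher's internal measure, not its actual implementation:

#include <vector>
#include <iostream>

double totalCorrelation(const std::vector<int>& vars) {
  // Placeholder: in the real teacher this is the multivariate correlation of
  // the remaining preprocessed inputs to the training target, in percent.
  return 0.0;
}

void backwardElimination(std::vector<int> vars) {
  while (vars.size() > 1) {
    double best = -1.0;
    std::size_t kill = 0;
    for (std::size_t i = 0; i < vars.size(); ++i) {
      std::vector<int> reduced = vars;
      reduced.erase(reduced.begin() + i);
      double corr = totalCorrelation(reduced);   // correlation without variable i
      if (corr > best) { best = corr; kill = i; }
    }
    std::cout << "AFTER KILLING INPUT VARIABLE " << vars[kill]
              << "  MAX CORR " << best << "\n";
    vars.erase(vars.begin() + kill);
  }
}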
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 48.4055405 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.421337366 *** contribution from regularisation: 0.00545657426 *** contribution from error: -0.426793933 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.470880061 *** contribution from regularisation: 0.0023627691 *** contribution from error: -0.473242819 *********************************************** -----------------> Test sample ENTER BFGS code START -46766.9077 -0.529938757 -0.189656526 EXIT FROM BFGS code FG_START 0. -0.529938757 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.48203367 *** contribution from regularisation: 0.00240163039 *** contribution from error: -0.48443529 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -47864.4978 -0.529938757 -42.1567345 EXIT FROM BFGS code FG_LNSRCH 0. -0.536510944 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.503339648 *** contribution from regularisation: 0.00297292485 *** contribution from error: -0.506312549 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49980.1183 -0.536510944 83.598999 EXIT FROM BFGS code NEW_X -49980.1183 -0.536510944 83.598999 ENTER BFGS code NEW_X -49980.1183 -0.536510944 83.598999 EXIT FROM BFGS code FG_LNSRCH 0. -0.529209971 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.50826484 *** contribution from regularisation: 0.0029395062 *** contribution from error: -0.511204362 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50469.1755 -0.529209971 12.8836908 EXIT FROM BFGS code NEW_X -50469.1755 -0.529209971 12.8836908 ENTER BFGS code NEW_X -50469.1755 -0.529209971 12.8836908 EXIT FROM BFGS code FG_LNSRCH 0. -0.527806282 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.508831561 *** contribution from regularisation: 0.00302899443 *** contribution from error: -0.511860549 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50525.4463 -0.527806282 -1.72847259 EXIT FROM BFGS code NEW_X -50525.4463 -0.527806282 -1.72847259 ENTER BFGS code NEW_X -50525.4463 -0.527806282 -1.72847259 EXIT FROM BFGS code FG_LNSRCH 0. -0.527197003 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.50922972 *** contribution from regularisation: 0.00309466105 *** contribution from error: -0.512324393 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50564.9822 -0.527197003 -14.940485 EXIT FROM BFGS code NEW_X -50564.9822 -0.527197003 -14.940485 ENTER BFGS code NEW_X -50564.9822 -0.527197003 -14.940485 EXIT FROM BFGS code FG_LNSRCH 0. -0.528276205 0. 
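In each "Learn Path" block the quoted loss function is the sum of its two contributions; for Learn Path 1, -0.426793933 + 0.00545657426 is approximately -0.421337366. The error part is the entropy loss selected in the setup, the other part comes from the regularisation. A sketch of that bookkeeping, with the generic binary-entropy form and a quadratic weight penalty standing in for NeuroBayes' "standard regularisation" (both exact definitions are assumptions; only the sign convention follows the printout):

#include <cmath>
#include <vector>

struct LossReport { double total, reg, error; };

LossReport learnPath(const std::vector<double>& out,     // network outputs in (0,1)
                     const std::vector<double>& target,  // targets 0 or 1
                     const std::vector<double>& weights,  // network weights
                     double lambda) {
  double err = 0.0;
  for (std::size_t i = 0; i < out.size(); ++i)
    err += target[i] * std::log(out[i]) + (1.0 - target[i]) * std::log(1.0 - out[i]);
  err /= out.size();                       // entropy ("error") contribution, negative

  double reg = 0.0;
  for (double w : weights) reg += w * w;
  reg *= lambda;                           // regularisation contribution, positive

  return { err + reg, reg, err };          // e.g. -0.4268 + 0.0055 = -0.4213
}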
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.509704649 *** contribution from regularisation: 0.00300521171 *** contribution from error: -0.512709856 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50612.1449 -0.528276205 -10.0977144 EXIT FROM BFGS code NEW_X -50612.1449 -0.528276205 -10.0977144 ENTER BFGS code NEW_X -50612.1449 -0.528276205 -10.0977144 EXIT FROM BFGS code FG_LNSRCH 0. -0.532352328 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.511223555 *** contribution from regularisation: 0.00300874515 *** contribution from error: -0.514232278 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50762.9627 -0.532352328 15.872468 EXIT FROM BFGS code NEW_X -50762.9627 -0.532352328 15.872468 ENTER BFGS code NEW_X -50762.9627 -0.532352328 15.872468 EXIT FROM BFGS code FG_LNSRCH 0. -0.529828191 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 12 --> 42.0605392 sigma out 15 active outputs RANK 2 NODE 1 --> 24.4864769 sigma out 15 active outputs RANK 3 NODE 3 --> 20.5078621 sigma out 15 active outputs RANK 4 NODE 4 --> 17.0076389 sigma out 15 active outputs RANK 5 NODE 5 --> 15.6566639 sigma out 15 active outputs RANK 6 NODE 8 --> 15.3963289 sigma out 15 active outputs RANK 7 NODE 9 --> 10.4979391 sigma out 15 active outputs RANK 8 NODE 6 --> 8.78296566 sigma out 15 active outputs RANK 9 NODE 10 --> 8.16789341 sigma out 15 active outputs RANK 10 NODE 11 --> 7.81725454 sigma out 15 active outputs RANK 11 NODE 2 --> 7.66839743 sigma out 15 active outputs RANK 12 NODE 7 --> 7.4818387 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 9 --> 35.2448807 sigma in 12act. ( 37.6198387 sig out 1act.) RANK 2 NODE 11 --> 30.4715118 sigma in 12act. ( 36.251255 sig out 1act.) RANK 3 NODE 12 --> 22.1579056 sigma in 12act. ( 19.6429787 sig out 1act.) RANK 4 NODE 1 --> 21.2622757 sigma in 12act. ( 20.237606 sig out 1act.) RANK 5 NODE 5 --> 12.2004128 sigma in 12act. ( 11.0433989 sig out 1act.) RANK 6 NODE 4 --> 12.1545563 sigma in 12act. ( 12.6287212 sig out 1act.) RANK 7 NODE 8 --> 12.1096182 sigma in 12act. ( 11.052207 sig out 1act.) RANK 8 NODE 2 --> 11.5992432 sigma in 12act. ( 9.73311234 sig out 1act.) RANK 9 NODE 6 --> 9.61901951 sigma in 12act. ( 6.72864151 sig out 1act.) RANK 10 NODE 13 --> 7.61765766 sigma in 12act. ( 6.08621693 sig out 1act.) RANK 11 NODE 14 --> 7.44737864 sigma in 12act. ( 6.89778662 sig out 1act.) RANK 12 NODE 3 --> 5.19826365 sigma in 12act. ( 2.75133729 sig out 1act.) RANK 13 NODE 10 --> 5.10608625 sigma in 12act. ( 2.02710962 sig out 1act.) RANK 14 NODE 7 --> 4.60172462 sigma in 12act. ( 2.26421595 sig out 1act.) RANK 15 NODE 15 --> 4.33730507 sigma in 12act. ( 2.45218873 sig out 1act.) sorted by output significance RANK 1 NODE 9 --> 37.6198387 sigma out 1act.( 35.2448807 sig in 12act.) RANK 2 NODE 11 --> 36.251255 sigma out 1act.( 30.4715118 sig in 12act.) RANK 3 NODE 1 --> 20.237606 sigma out 1act.( 21.2622757 sig in 12act.) RANK 4 NODE 12 --> 19.6429787 sigma out 1act.( 22.1579056 sig in 12act.) RANK 5 NODE 4 --> 12.6287212 sigma out 1act.( 12.1545563 sig in 12act.) RANK 6 NODE 8 --> 11.052207 sigma out 1act.( 12.1096182 sig in 12act.) 
RANK 7 NODE 5 --> 11.0433989 sigma out 1act.( 12.2004128 sig in 12act.) RANK 8 NODE 2 --> 9.73311234 sigma out 1act.( 11.5992432 sig in 12act.) RANK 9 NODE 14 --> 6.89778662 sigma out 1act.( 7.44737864 sig in 12act.) RANK 10 NODE 6 --> 6.72864151 sigma out 1act.( 9.61901951 sig in 12act.) RANK 11 NODE 13 --> 6.08621693 sigma out 1act.( 7.61765766 sig in 12act.) RANK 12 NODE 3 --> 2.75133729 sigma out 1act.( 5.19826365 sig in 12act.) RANK 13 NODE 15 --> 2.45218873 sigma out 1act.( 4.33730507 sig in 12act.) RANK 14 NODE 7 --> 2.26421595 sigma out 1act.( 4.60172462 sig in 12act.) RANK 15 NODE 10 --> 2.02710962 sigma out 1act.( 5.10608625 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 64.6208496 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.512766242 *** contribution from regularisation: 0.00298252027 *** contribution from error: -0.515748739 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -50916.1493 -0.529828191 27.2017498 EXIT FROM BFGS code NEW_X -50916.1493 -0.529828191 27.2017498 ENTER BFGS code NEW_X -50916.1493 -0.529828191 27.2017498 EXIT FROM BFGS code FG_LNSRCH 0. -0.518689454 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.514904559 *** contribution from regularisation: 0.00329596968 *** contribution from error: -0.518200517 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51128.4795 -0.518689454 28.6875381 EXIT FROM BFGS code NEW_X -51128.4795 -0.518689454 28.6875381 ENTER BFGS code NEW_X -51128.4795 -0.518689454 28.6875381 EXIT FROM BFGS code FG_LNSRCH 0. -0.51186192 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.517290831 *** contribution from regularisation: 0.00278098322 *** contribution from error: -0.520071805 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51365.4297 -0.51186192 12.2382059 EXIT FROM BFGS code NEW_X -51365.4297 -0.51186192 12.2382059 ENTER BFGS code NEW_X -51365.4297 -0.51186192 12.2382059 EXIT FROM BFGS code FG_LNSRCH 0. -0.510407746 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.517705142 *** contribution from regularisation: 0.00286627701 *** contribution from error: -0.520571411 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51406.5664 -0.510407746 -0.719941795 EXIT FROM BFGS code NEW_X -51406.5664 -0.510407746 -0.719941795 ENTER BFGS code NEW_X -51406.5664 -0.510407746 -0.719941795 EXIT FROM BFGS code FG_LNSRCH 0. -0.509091437 0. 
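The ENTER/EXIT lines around each iteration come from the BFGS optimiser requested in the setup: after an accepted step (NEW_X) a quasi-Newton method updates its inverse-Hessian approximation from the parameter change s and the gradient change y. The textbook form of that update is sketched below; it illustrates what such an optimiser does between the logged line searches, not the actual NeuroBayes code:

// Textbook BFGS update of the inverse Hessian approximation H:
// H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T,  rho = 1 / (y^T s),
// with s = x_{k+1} - x_k and y = g_{k+1} - g_k.
#include <vector>
using Vec = std::vector<double>;
using Mat = std::vector<Vec>;

void bfgsUpdate(Mat& H, const Vec& s, const Vec& y) {
  const std::size_t n = s.size();
  double ys = 0.0;
  for (std::size_t i = 0; i < n; ++i) ys += y[i] * s[i];
  if (ys <= 0.0) return;                  // skip if the curvature condition fails
  const double rho = 1.0 / ys;

  Vec Hy(n, 0.0);                         // Hy = H * y, yHy = y^T H y
  double yHy = 0.0;
  for (std::size_t i = 0; i < n; ++i) {
    for (std::size_t j = 0; j < n; ++j) Hy[i] += H[i][j] * y[j];
    yHy += y[i] * Hy[i];
  }
  for (std::size_t i = 0; i < n; ++i)
    for (std::size_t j = 0; j < n; ++j)
      H[i][j] += -rho * (s[i] * Hy[j] + Hy[i] * s[j])
                 + rho * rho * yHy * s[i] * s[j]
                 + rho * s[i] * s[j];
}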
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.51786989 *** contribution from regularisation: 0.00288915378 *** contribution from error: -0.520759046 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51422.9237 -0.509091437 -10.609067 EXIT FROM BFGS code NEW_X -51422.9237 -0.509091437 -10.609067 ENTER BFGS code NEW_X -51422.9237 -0.509091437 -10.609067 EXIT FROM BFGS code FG_LNSRCH 0. -0.510640621 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.518092692 *** contribution from regularisation: 0.0029507461 *** contribution from error: -0.52104342 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51445.0495 -0.510640621 -15.1211843 EXIT FROM BFGS code NEW_X -51445.0495 -0.510640621 -15.1211843 ENTER BFGS code NEW_X -51445.0495 -0.510640621 -15.1211843 EXIT FROM BFGS code FG_LNSRCH 0. -0.519303679 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.518740892 *** contribution from regularisation: 0.00311448635 *** contribution from error: -0.521855354 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51509.4154 -0.519303679 -15.9795895 EXIT FROM BFGS code NEW_X -51509.4154 -0.519303679 -15.9795895 ENTER BFGS code NEW_X -51509.4154 -0.519303679 -15.9795895 EXIT FROM BFGS code FG_LNSRCH 0. -0.530449986 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.519545555 *** contribution from regularisation: 0.00332473498 *** contribution from error: -0.522870302 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51589.3122 -0.530449986 -15.6496897 EXIT FROM BFGS code NEW_X -51589.3122 -0.530449986 -15.6496897 ENTER BFGS code NEW_X -51589.3122 -0.530449986 -15.6496897 EXIT FROM BFGS code FG_LNSRCH 0. -0.551412642 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.519218624 *** contribution from regularisation: 0.00354666775 *** contribution from error: -0.522765279 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51556.8503 -0.551412642 43.4724808 EXIT FROM BFGS code FG_LNSRCH 0. -0.539202094 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.520235419 *** contribution from regularisation: 0.00312620122 *** contribution from error: -0.523361623 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51657.8187 -0.539202094 5.13008213 EXIT FROM BFGS code NEW_X -51657.8187 -0.539202094 5.13008213 ENTER BFGS code NEW_X -51657.8187 -0.539202094 5.13008213 EXIT FROM BFGS code FG_LNSRCH 0. -0.541157007 0. 
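The status words FG_START, FG_LNSRCH and NEW_X suggest a reverse-communication interface: the optimiser returns control asking the caller either for a function value and gradient at a trial point (FG_*) or reporting a newly accepted iterate (NEW_X), and the caller keeps re-entering it until convergence or the iteration limit. A schematic driver loop under that assumption; both routines below are dummies, only the control flow mirrors the log:

#include <vector>

enum class Task { FG_START, FG_LNSRCH, NEW_X, CONVERGED };

// Dummies standing in for the network loss/gradient and for the BFGS routine
// with its internal line search.
void evaluate(const std::vector<double>& x, double& f, std::vector<double>& g) {
  f = 0.0;
  for (std::size_t i = 0; i < x.size(); ++i) { f += x[i] * x[i]; g[i] = 2.0 * x[i]; }
}
Task bfgsStep(std::vector<double>& x, double f, const std::vector<double>& g) {
  static int call = 0;
  if (call++ % 2 == 0) {                  // propose a trial point, ask for f and g
    for (std::size_t i = 0; i < x.size(); ++i) x[i] -= 0.1 * g[i];
    return Task::FG_LNSRCH;
  }
  return (f < 1e-8) ? Task::CONVERGED : Task::NEW_X;   // accept the iterate
}

void train(std::vector<double>& x, int maxIter) {
  double f = 0.0;
  std::vector<double> grad(x.size(), 0.0);
  int iter = 0;
  while (iter < maxIter) {
    Task task = bfgsStep(x, f, grad);     // ENTER/EXIT BFGS code
    if (task == Task::FG_START || task == Task::FG_LNSRCH)
      evaluate(x, f, grad);               // optimiser asked for f and gradient
    else if (task == Task::NEW_X)
      ++iter;                             // accepted iterate -> next "Iteration"
    else
      break;                              // converged
  }
}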
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 12 --> 79.9312744 sigma out 15 active outputs RANK 2 NODE 3 --> 59.9957123 sigma out 15 active outputs RANK 3 NODE 8 --> 29.8007317 sigma out 15 active outputs RANK 4 NODE 1 --> 29.42663 sigma out 15 active outputs RANK 5 NODE 5 --> 23.8033066 sigma out 15 active outputs RANK 6 NODE 4 --> 23.7266788 sigma out 15 active outputs RANK 7 NODE 9 --> 16.6717682 sigma out 15 active outputs RANK 8 NODE 11 --> 16.6011314 sigma out 15 active outputs RANK 9 NODE 10 --> 13.5390358 sigma out 15 active outputs RANK 10 NODE 6 --> 10.5922422 sigma out 15 active outputs RANK 11 NODE 7 --> 10.2102289 sigma out 15 active outputs RANK 12 NODE 2 --> 7.65674782 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 9 --> 79.7089157 sigma in 12act. ( 88.0984268 sig out 1act.) RANK 2 NODE 11 --> 54.1766853 sigma in 12act. ( 75.0793915 sig out 1act.) RANK 3 NODE 1 --> 32.6459694 sigma in 12act. ( 30.8480244 sig out 1act.) RANK 4 NODE 12 --> 30.1597309 sigma in 12act. ( 34.9568062 sig out 1act.) RANK 5 NODE 4 --> 27.1210308 sigma in 12act. ( 32.5619316 sig out 1act.) RANK 6 NODE 8 --> 26.0547047 sigma in 12act. ( 28.8980236 sig out 1act.) RANK 7 NODE 5 --> 19.8272419 sigma in 12act. ( 21.5506287 sig out 1act.) RANK 8 NODE 10 --> 16.0534668 sigma in 12act. ( 19.6657505 sig out 1act.) RANK 9 NODE 14 --> 16.0517826 sigma in 12act. ( 19.233551 sig out 1act.) RANK 10 NODE 13 --> 11.2164221 sigma in 12act. ( 8.50784206 sig out 1act.) RANK 11 NODE 2 --> 9.63502121 sigma in 12act. ( 8.79746342 sig out 1act.) RANK 12 NODE 6 --> 5.73340797 sigma in 12act. ( 3.8590672 sig out 1act.) RANK 13 NODE 7 --> 4.59247637 sigma in 12act. ( 3.657372 sig out 1act.) RANK 14 NODE 15 --> 4.25027657 sigma in 12act. ( 3.28638959 sig out 1act.) RANK 15 NODE 3 --> 2.53467464 sigma in 12act. ( 0.270320028 sig out 1act.) sorted by output significance RANK 1 NODE 9 --> 88.0984268 sigma out 1act.( 79.7089157 sig in 12act.) RANK 2 NODE 11 --> 75.0793915 sigma out 1act.( 54.1766853 sig in 12act.) RANK 3 NODE 12 --> 34.9568062 sigma out 1act.( 30.1597309 sig in 12act.) RANK 4 NODE 4 --> 32.5619316 sigma out 1act.( 27.1210308 sig in 12act.) RANK 5 NODE 1 --> 30.8480244 sigma out 1act.( 32.6459694 sig in 12act.) RANK 6 NODE 8 --> 28.8980236 sigma out 1act.( 26.0547047 sig in 12act.) RANK 7 NODE 5 --> 21.5506287 sigma out 1act.( 19.8272419 sig in 12act.) RANK 8 NODE 10 --> 19.6657505 sigma out 1act.( 16.0534668 sig in 12act.) RANK 9 NODE 14 --> 19.233551 sigma out 1act.( 16.0517826 sig in 12act.) RANK 10 NODE 2 --> 8.79746342 sigma out 1act.( 9.63502121 sig in 12act.) RANK 11 NODE 13 --> 8.50784206 sigma out 1act.( 11.2164221 sig in 12act.) RANK 12 NODE 6 --> 3.8590672 sigma out 1act.( 5.73340797 sig in 12act.) RANK 13 NODE 7 --> 3.657372 sigma out 1act.( 4.59247637 sig in 12act.) RANK 14 NODE 15 --> 3.28638959 sigma out 1act.( 4.25027657 sig in 12act.) RANK 15 NODE 3 --> 0.270320028 sigma out 1act.( 2.53467464 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 137.394409 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.520541668 *** contribution from regularisation: 0.00325543666 *** contribution from error: -0.523797095 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -51688.2239 -0.541157007 15.9093409 EXIT FROM BFGS code NEW_X -51688.2239 -0.541157007 15.9093409 ENTER BFGS code NEW_X -51688.2239 -0.541157007 15.9093409 EXIT FROM BFGS code FG_LNSRCH 0. -0.537861228 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.520696223 *** contribution from regularisation: 0.00318727759 *** contribution from error: -0.523883522 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51703.5709 -0.537861228 8.82234955 EXIT FROM BFGS code NEW_X -51703.5709 -0.537861228 8.82234955 ENTER BFGS code NEW_X -51703.5709 -0.537861228 8.82234955 EXIT FROM BFGS code FG_LNSRCH 0. -0.53385365 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.521243036 *** contribution from regularisation: 0.00308408495 *** contribution from error: -0.524327099 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51757.8686 -0.53385365 35.9098816 EXIT FROM BFGS code NEW_X -51757.8686 -0.53385365 35.9098816 ENTER BFGS code NEW_X -51757.8686 -0.53385365 35.9098816 EXIT FROM BFGS code FG_LNSRCH 0. -0.514831722 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.521547258 *** contribution from regularisation: 0.0031065389 *** contribution from error: -0.524653792 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51788.0761 -0.514831722 -10.0619516 EXIT FROM BFGS code NEW_X -51788.0761 -0.514831722 -10.0619516 ENTER BFGS code NEW_X -51788.0761 -0.514831722 -10.0619516 EXIT FROM BFGS code FG_LNSRCH 0. -0.51230818 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.52163738 *** contribution from regularisation: 0.00326013332 *** contribution from error: -0.524897516 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51797.025 -0.51230818 17.3573475 EXIT FROM BFGS code NEW_X -51797.025 -0.51230818 17.3573475 ENTER BFGS code NEW_X -51797.025 -0.51230818 17.3573475 EXIT FROM BFGS code FG_LNSRCH 0. -0.50898385 0. 
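As at iterations 10 and 20 above, the teacher writes its current network to "rescue.nb" every ten iterations so that an interrupted training can be resumed from the last checkpoint. The pattern is a periodic save inside the iteration loop; a minimal sketch with a placeholder serialisation (saveExpertise is not the teacher's own format):

#include <string>
#include <fstream>
#include <vector>

void saveExpertise(const std::vector<double>& weights, const std::string& file) {
  std::ofstream out(file, std::ios::binary);          // placeholder serialisation
  out.write(reinterpret_cast<const char*>(weights.data()),
            weights.size() * sizeof(double));
}

void trainingLoop(std::vector<double>& weights, int nIter) {
  for (int it = 1; it <= nIter; ++it) {
    // ... one BFGS iteration over the training sample ...
    if (it % 10 == 0) saveExpertise(weights, "rescue.nb");  // rescue checkpoint
  }
}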
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.521771848 *** contribution from regularisation: 0.00324574299 *** contribution from error: -0.525017619 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51810.3807 -0.50898385 14.6495934 EXIT FROM BFGS code NEW_X -51810.3807 -0.50898385 14.6495934 ENTER BFGS code NEW_X -51810.3807 -0.50898385 14.6495934 EXIT FROM BFGS code FG_LNSRCH 0. -0.492055893 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.522105098 *** contribution from regularisation: 0.00328304432 *** contribution from error: -0.525388122 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51843.471 -0.492055893 3.9207468 EXIT FROM BFGS code NEW_X -51843.471 -0.492055893 3.9207468 ENTER BFGS code NEW_X -51843.471 -0.492055893 3.9207468 EXIT FROM BFGS code FG_LNSRCH 0. -0.466933012 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.522595763 *** contribution from regularisation: 0.00333794579 *** contribution from error: -0.525933683 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51892.1908 -0.466933012 7.71538782 EXIT FROM BFGS code NEW_X -51892.1908 -0.466933012 7.71538782 ENTER BFGS code NEW_X -51892.1908 -0.466933012 7.71538782 EXIT FROM BFGS code FG_LNSRCH 0. -0.432081878 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.523037672 *** contribution from regularisation: 0.00338604418 *** contribution from error: -0.526423693 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51936.0724 -0.432081878 -20.1157684 EXIT FROM BFGS code NEW_X -51936.0724 -0.432081878 -20.1157684 ENTER BFGS code NEW_X -51936.0724 -0.432081878 -20.1157684 EXIT FROM BFGS code FG_LNSRCH 0. -0.437040865 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.523393869 *** contribution from regularisation: 0.00320664188 *** contribution from error: -0.52660054 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51971.4413 -0.437040865 20.9424076 EXIT FROM BFGS code NEW_X -51971.4413 -0.437040865 20.9424076 ENTER BFGS code NEW_X -51971.4413 -0.437040865 20.9424076 EXIT FROM BFGS code FG_LNSRCH 0. -0.437792033 0. 
Iteration 30 -- significance checkpoint

SIGNIFICANCE OF OUTPUTS IN LAYER 1 (sigma, 15 active outputs):
  RANK  1  NODE 12  -->  84.3030624
  RANK  2  NODE  3  -->  63.512661
  RANK  3  NODE  1  -->  49.8227158
  RANK  4  NODE  9  -->  26.046072
  RANK  5  NODE  5  -->  25.843544
  RANK  6  NODE  4  -->  25.484539
  RANK  7  NODE  8  -->  25.4802494
  RANK  8  NODE  6  -->  22.5680447
  RANK  9  NODE  7  -->  20.4721756
  RANK 10  NODE 11  -->  16.4854107
  RANK 11  NODE 10  -->  16.1080246
  RANK 12  NODE  2  -->  14.691164

SIGNIFICANCE OF INPUTS TO LAYER 2, sorted by input significance (sigma in, 12 act. / sig out, 1 act.):
  RANK  1  NODE  9  -->  84.8778687 in  /  96.0896378 out
  RANK  2  NODE  1  -->  53.3178635 in  /  48.438858 out
  RANK  3  NODE 11  -->  48.3660812 in  /  71.0559464 out
  RANK  4  NODE  5  -->  36.1073456 in  /  39.8585587 out
  RANK  5  NODE  8  -->  31.3885155 in  /  33.0777283 out
  RANK  6  NODE 10  -->  31.2296047 in  /  31.8485622 out
  RANK  7  NODE  4  -->  31.1713657 in  /  37.2555885 out
  RANK  8  NODE 14  -->  24.5903625 in  /  26.1421471 out
  RANK  9  NODE 12  -->  23.111002 in  /  25.9545918 out
  RANK 10  NODE 13  -->  9.32006645 in  /  8.58300877 out
  RANK 11  NODE  2  -->  6.62469244 in  /  5.45788717 out
  RANK 12  NODE  7  -->  4.87939215 in  /  4.52251816 out
  RANK 13  NODE  6  -->  4.51309538 in  /  3.38765502 out
  RANK 14  NODE 15  -->  2.17901373 in  /  1.16149735 out
  RANK 15  NODE  3  -->  1.71078598 in  /  0.386572123 out
  (re-ranked by output significance the order is: NODE 9, 11, 1, 5, 4, 8, 10, 14, 12, 13, 2, 7, 6, 15, 3; the in/out values are the same as above)

SIGNIFICANCE OF INPUTS TO LAYER 3:
  RANK  1  NODE  1  -->  152.340332 sigma in 15 active inputs

Learn Path 30 (test sample): loss function -0.523406386, regularisation contribution 0.00323579181, error contribution -0.526642203
  Iteration No: 30 -- current network written out, SAVING EXPERTISE TO rescue.nb
  BFGS: ENTER FG_LNSRCH -51972.6841 -0.437792033 12.1476784 -> NEW_X -> EXIT FG_LNSRCH 0. -0.432713956 0.
Iteration 31 / Learn Path 31 (test sample): loss function -0.523485422, regularisation contribution 0.00323700975, error contribution -0.526722431
  BFGS: ENTER FG_LNSRCH -51980.53 -0.432713956 3.33246136 -> NEW_X -> EXIT FG_LNSRCH 0. -0.418153077 0.
Iteration 32 / Learn Path 32 (test sample): loss function -0.523659408, regularisation contribution 0.00324225938, error contribution -0.526901662
  BFGS: ENTER FG_LNSRCH -51997.8063 -0.418153077 -7.91532993 -> NEW_X -> EXIT FG_LNSRCH 0. -0.38866809 0.
Iteration 33 / Learn Path 33 (test sample): loss function -0.523969293, regularisation contribution 0.00324641401, error contribution -0.527215719
  BFGS: ENTER FG_LNSRCH -52028.5808 -0.38866809 -15.5441427 -> NEW_X -> EXIT FG_LNSRCH 0. -0.3456195 0.
Iteration 34 / Learn Path 34 (test sample): loss function -0.523809612, regularisation contribution 0.00327651878, error contribution -0.527086139
  BFGS: ENTER FG_LNSRCH -52012.7207 -0.3456195 14.2483282 -> EXIT FG_LNSRCH 0. -0.372477829 0.
Iteration 35 / Learn Path 35 (test sample): loss function -0.524056077, regularisation contribution 0.00332021038, error contribution -0.527376294
  BFGS: ENTER FG_LNSRCH -52037.1992 -0.372477829 -4.20648193 -> NEW_X -> EXIT FG_LNSRCH 0. -0.350614727 0.
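The ranking blocks printed at each checkpoint follow the fixed pattern "RANK n NODE m --> value", which makes them easy to extract from the raw teacher output. A minimal Python sketch, assuming the output above has been captured in a file named training.log (the file name and the regular expression are illustrative assumptions, not part of the NeuroBayes printout):

    import re

    # Matches lines such as "RANK  1 NODE 12 -->  84.3030624 sigma ..."
    rank_re = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+(-?[0-9.]+)")

    rows = []
    with open("training.log") as f:        # assumed file holding the teacher output
        for line in f:
            m = rank_re.search(line)
            if m:
                rows.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))

    # Print every rank-1 entry, i.e. the most significant node of each block
    for rank, node, sig in rows:
        if rank == 1:
            print(f"rank-1 node {node}: {sig:.3f} sigma")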
Iteration 36 / Learn Path 36 (test sample): loss function -0.524255276, regularisation contribution 0.00328373769, error contribution -0.527539015
  BFGS: ENTER FG_LNSRCH -52056.9784 -0.350614727 4.57195425 -> NEW_X -> EXIT FG_LNSRCH 0. -0.350858897 0.
Iteration 37 / Learn Path 37 (test sample): loss function -0.524298787, regularisation contribution 0.00329733524, error contribution -0.527596116
  BFGS: ENTER FG_LNSRCH -52061.2952 -0.350858897 5.1492815 -> NEW_X -> EXIT FG_LNSRCH 0. -0.342864335 0.
Iteration 38 / Learn Path 38 (test sample): loss function -0.524447322, regularisation contribution 0.00327155273, error contribution -0.527718902
  BFGS: ENTER FG_LNSRCH -52076.0482 -0.342864335 2.99862862 -> NEW_X -> EXIT FG_LNSRCH 0. -0.332682729 0.
Iteration 39 / Learn Path 39 (test sample): loss function -0.524576843, regularisation contribution 0.00326273567, error contribution -0.527839601
  BFGS: ENTER FG_LNSRCH -52088.9059 -0.332682729 5.49662876 -> NEW_X -> EXIT FG_LNSRCH 0. -0.321807116 0.

Iteration 40 -- significance checkpoint

SIGNIFICANCE OF OUTPUTS IN LAYER 1 (sigma, 15 active outputs):
  RANK  1  NODE 12  -->  88.5756149
  RANK  2  NODE  1  -->  71.8783264
  RANK  3  NODE  3  -->  66.8304901
  RANK  4  NODE  9  -->  39.6894379
  RANK  5  NODE  5  -->  34.8088074
  RANK  6  NODE  6  -->  28.6633778
  RANK  7  NODE  4  -->  28.5150146
  RANK  8  NODE  8  -->  27.6445637
  RANK  9  NODE 10  -->  22.9035625
  RANK 10  NODE 11  -->  20.8326435
  RANK 11  NODE  7  -->  20.3335819
  RANK 12  NODE  2  -->  19.2260418

SIGNIFICANCE OF INPUTS TO LAYER 2, sorted by input significance (sigma in, 12 act. / sig out, 1 act.):
  RANK  1  NODE  9  -->  82.1890945 in  /  95.7135086 out
  RANK  2  NODE  1  -->  65.6103363 in  /  61.8555984 out
  RANK  3  NODE 11  -->  51.539978 in  /  74.4247589 out
  RANK  4  NODE  5  -->  51.3914642 in  /  56.8325539 out
  RANK  5  NODE  8  -->  49.9592056 in  /  55.0716591 out
  RANK  6  NODE  4  -->  42.9365463 in  /  45.4463463 out
  RANK  7  NODE 10  -->  39.6894264 in  /  40.9841805 out
  RANK  8  NODE 14  -->  33.2871437 in  /  34.1044998 out
  RANK  9  NODE 12  -->  29.344574 in  /  32.3282127 out
  RANK 10  NODE 13  -->  9.46535206 in  /  9.2374649 out
  RANK 11  NODE  7  -->  6.12746 in  /  6.08488703 out
  RANK 12  NODE  2  -->  4.77951908 in  /  3.85553408 out
  RANK 13  NODE  6  -->  3.09933424 in  /  2.39219356 out
  RANK 14  NODE 15  -->  1.9593358 in  /  1.31355298 out
  RANK 15  NODE  3  -->  1.19815397 in  /  0.385397702 out
  (re-ranked by output significance the order is: NODE 9, 11, 1, 5, 8, 4, 10, 14, 12, 13, 7, 2, 6, 15, 3; the in/out values are the same as above)

SIGNIFICANCE OF INPUTS TO LAYER 3:
  RANK  1  NODE  1  -->  175.747498 sigma in 15 active inputs

Learn Path 40 (test sample): loss function -0.524615943, regularisation contribution 0.0032828385, error contribution -0.527898788
  Iteration No: 40 -- current network written out, SAVING EXPERTISE TO rescue.nb
  BFGS: ENTER FG_LNSRCH -52092.7877 -0.321807116 5.24835157 -> NEW_X -> EXIT FG_LNSRCH 0. -0.323175311 0.
Iteration 41 / Learn Path 41 (test sample): loss function -0.524668038, regularisation contribution 0.0033046403, error contribution -0.527972698
  BFGS: ENTER FG_LNSRCH -52097.9636 -0.323175311 1.41702247 -> NEW_X -> EXIT FG_LNSRCH 0. -0.323159635 0.
Iteration 42 / Learn Path 42 (test sample): loss function -0.524685562, regularisation contribution 0.00331018795, error contribution -0.527995765
  BFGS: ENTER FG_LNSRCH -52099.7014 -0.323159635 5.70530224 -> NEW_X -> EXIT FG_LNSRCH 0. -0.319678217 0.
Iteration 43 / Learn Path 43 (test sample): loss function -0.524757922, regularisation contribution 0.00329419761, error contribution -0.528052092
  BFGS: ENTER FG_LNSRCH -52106.8873 -0.319678217 8.49697685 -> NEW_X -> EXIT FG_LNSRCH 0. -0.311691761 0.
Iteration 44 / Learn Path 44 (test sample): loss function -0.524833918, regularisation contribution 0.00329442136, error contribution -0.528128326
  BFGS: ENTER FG_LNSRCH -52114.4352 -0.311691761 13.5391245 -> NEW_X -> EXIT FG_LNSRCH 0. -0.297508627 0.
Iteration 45 / Learn Path 45 (test sample): loss function -0.524866819, regularisation contribution 0.00329576712, error contribution -0.528162599
  BFGS: ENTER FG_LNSRCH -52117.699 -0.297508627 -3.83048916 -> NEW_X -> EXIT FG_LNSRCH 0. -0.295364082 0.
Iteration 46 / Learn Path 46 (test sample): loss function -0.524895668, regularisation contribution 0.0033275974, error contribution -0.528223276
  BFGS: ENTER FG_LNSRCH -52120.5644 -0.295364082 0.529296041 -> NEW_X -> EXIT FG_LNSRCH 0. -0.294706106 0.
Iteration 47 / Learn Path 47 (test sample): loss function -0.524915755, regularisation contribution 0.00333398045, error contribution -0.528249741
  BFGS: ENTER FG_LNSRCH -52122.56 -0.294706106 0.623799682 -> NEW_X -> EXIT FG_LNSRCH 0. -0.287038296 0.
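Because every Learn Path block prints the test-sample loss for its iteration, the simplest way to judge convergence at a glance is to plot loss against iteration number. A minimal sketch, again assuming the raw teacher output has been saved to training.log and using matplotlib purely for illustration:

    import re
    import matplotlib.pyplot as plt

    with open("training.log") as f:        # assumed file holding the teacher output
        text = f.read()

    # "Learn Path N" is followed a few tokens later by "loss function: X"
    pattern = re.compile(r"Learn Path\s+(\d+).*?loss function:\s*(-?[0-9.]+)", re.S)
    points = [(int(m.group(1)), float(m.group(2))) for m in pattern.finditer(text)]

    iters, losses = zip(*points)
    plt.plot(iters, losses, marker="o")
    plt.xlabel("Learn Path (iteration)")
    plt.ylabel("test-sample loss")
    plt.savefig("loss_curve.png")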
Iteration 48 / Learn Path 48 (test sample): loss function -0.525019228, regularisation contribution 0.00329556991, error contribution -0.528314769
  BFGS: ENTER FG_LNSRCH -52132.8317 -0.287038296 4.05511999 -> NEW_X -> EXIT FG_LNSRCH 0. -0.272328019 0.
Iteration 49 / Learn Path 49 (test sample): loss function -0.52505374, regularisation contribution 0.00330131734, error contribution -0.528355062
  BFGS: ENTER FG_LNSRCH -52136.2611 -0.272328019 -9.20238781 -> NEW_X -> EXIT FG_LNSRCH 0. -0.262057006 0.

Iteration 50 -- significance checkpoint

SIGNIFICANCE OF OUTPUTS IN LAYER 1 (sigma, 15 active outputs):
  RANK  1  NODE 12  -->  96.610939
  RANK  2  NODE  1  -->  91.0128784
  RANK  3  NODE  3  -->  71.2878418
  RANK  4  NODE  9  -->  47.1226883
  RANK  5  NODE  5  -->  38.2879219
  RANK  6  NODE  8  -->  30.2794685
  RANK  7  NODE  4  -->  29.2546825
  RANK  8  NODE  6  -->  28.4168663
  RANK  9  NODE  2  -->  23.8187408
  RANK 10  NODE  7  -->  23.194109
  RANK 11  NODE 11  -->  23.0051956
  RANK 12  NODE 10  -->  20.0453548

SIGNIFICANCE OF INPUTS TO LAYER 2, sorted by input significance (sigma in, 12 act. / sig out, 1 act.):
  RANK  1  NODE  9  -->  87.3776855 in  /  99.2438507 out
  RANK  2  NODE  1  -->  71.8253555 in  /  65.7894516 out
  RANK  3  NODE  8  -->  65.6618958 in  /  69.8540421 out
  RANK  4  NODE  5  -->  57.1931572 in  /  64.6500778 out
  RANK  5  NODE 11  -->  56.9537354 in  /  83.2626419 out
  RANK  6  NODE  4  -->  51.5060844 in  /  52.8694344 out
  RANK  7  NODE 10  -->  43.5513573 in  /  46.9541016 out
  RANK  8  NODE 12  -->  36.4884109 in  /  39.7699547 out
  RANK  9  NODE 14  -->  36.3866997 in  /  37.6060753 out
  RANK 10  NODE 13  -->  9.16337109 in  /  9.07179451 out
  RANK 11  NODE  7  -->  7.81614971 in  /  8.06735229 out
  RANK 12  NODE  2  -->  4.33222103 in  /  3.73531365 out
  RANK 13  NODE  6  -->  2.91102743 in  /  2.52048707 out
  RANK 14  NODE 15  -->  1.99219787 in  /  1.63358915 out
  RANK 15  NODE  3  -->  0.874765337 in  /  0.38179177 out
  (re-ranked by output significance the order is: NODE 9, 11, 8, 1, 5, 4, 10, 12, 14, 13, 7, 2, 6, 15, 3; the in/out values are the same as above)

SIGNIFICANCE OF INPUTS TO LAYER 3:
  RANK  1  NODE  1  -->  195.795059 sigma in 15 active inputs

Learn Path 50 (test sample): loss function -0.525024354, regularisation contribution 0.00335718808, error contribution -0.528381526
  Iteration No: 50 -- current network written out, SAVING EXPERTISE TO rescue.nb
  BFGS: ENTER FG_LNSRCH -52133.3424 -0.262057006 18.0210094 -> EXIT FG_LNSRCH 0. -0.269388795 0.
Iteration 51 / Learn Path 51 (test sample): loss function -0.524934113, regularisation contribution 0.00345650362, error contribution -0.528390646
  BFGS: ENTER FG_LNSRCH -52124.3852 -0.269388795 -1.8851701 -> EXIT FG_LNSRCH 0. -0.272163063 0.
Iteration 52 / Learn Path 52 (test sample): loss function -0.524946988, regularisation contribution 0.00341067114, error contribution -0.528357685
  BFGS: ENTER FG_LNSRCH -52125.6621 -0.272163063 -8.79864693 -> EXIT FG_LNSRCH 0. -0.272327274 0.
Iteration 53 / Learn Path 53 (test sample): loss function -0.524959862, regularisation contribution 0.00339518138, error contribution -0.528355062
  BFGS: ENTER FG_LNSRCH -52126.942 -0.272327274 -9.19602871 -> EXIT FG_LNSRCH 0. -0.272328019 0.
Iteration 54 / Learn Path 54 (test sample): loss function -0.5249511, regularisation contribution 0.00340393675, error contribution -0.528355062
  BFGS: ENTER FG_LNSRCH -52126.0713 -0.272328019 -9.20618629 -> EXIT FG_LNSRCH 0. -0.272328019 0.
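Between the two rescue checkpoints the test-sample loss barely moves (Learn Path 40: -0.524615943, Learn Path 50: -0.525024354), and from Learn Path 53 onwards the error contribution stays at -0.528355062, so the fit has effectively plateaued well before the convergence message that follows. A one-line check on the logged numbers:

    # Change in test-sample loss between the iteration-40 and iteration-50 checkpoints
    lp40, lp50 = -0.524615943, -0.525024354
    print(lp50 - lp40)   # ~ -4.1e-4: only a marginal improvement over ten iterations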
Iteration 55 / Learn Path 55 (test sample): loss function -0.524969697, regularisation contribution 0.00338536897, error contribution -0.528355062
  BFGS: ENTER FG_LNSRCH -52127.915 -0.272328019 -9.20021343 -> EXIT FG_LNSRCH 0. -0.272328019 0.
Iteration 56 / Learn Path 56 (test sample): loss function -0.524960995, regularisation contribution 0.00339406054, error contribution -0.528355062
  BFGS: ENTER FG_LNSRCH -52127.052 -0.272328019 -9.19608498 -> EXIT FG_LNSRCH 0. -0.272328019 0.
Iteration 57 / Learn Path 57 (test sample): loss function -0.52495867, regularisation contribution 0.00339638302, error contribution -0.528355062
  BFGS: ENTER FG_LNSRCH -52126.8213 -0.272328019 -9.20424938 -> NEW_X -> CONVERGENC -52126.8213 -0.272328019 -9.20424938
  (BFGS reports convergence; the log continues directly with the final report at iteration 250)

Iteration 250 -- final significance report

SIGNIFICANCE OF OUTPUTS IN LAYER 1 (sigma, 15 active outputs):
  RANK  1  NODE 12  -->  147.680679
  RANK  2  NODE  1  -->  134.980026
  RANK  3  NODE  3  -->  111.09552
  RANK  4  NODE  9  -->  71.2922287
  RANK  5  NODE  5  -->  58.3463058
  RANK  6  NODE  8  -->  45.3826561
  RANK  7  NODE  4  -->  45.0124092
  RANK  8  NODE  6  -->  43.3682709
  RANK  9  NODE  2  -->  35.5860672
  RANK 10  NODE 11  -->  35.4084816
  RANK 11  NODE  7  -->  35.0602951
  RANK 12  NODE 10  -->  31.8525467

SIGNIFICANCE OF INPUTS TO LAYER 2, sorted by input significance (sigma in, 12 act. / sig out, 1 act.):
  RANK  1  NODE  9  -->  135.627502 in  /  154.454193 out
  RANK  2  NODE  1  -->  109.277031 in  /  102.385628 out
  RANK  3  NODE  8  -->  96.1400223 in  /  106.869164 out
  RANK  4  NODE  5  -->  87.5776291 in  /  97.4732971 out
  RANK  5  NODE 11  -->  85.5209656 in  /  127.332001 out
  RANK  6  NODE  4  -->  78.2530746 in  /  81.1293564 out
  RANK  7  NODE 10  -->  67.0560684 in  /  70.2920532 out
  RANK  8  NODE 14  -->  55.6008072 in  /  56.9322586 out
  RANK  9  NODE 12  -->  53.9534111 in  /  59.9110184 out
  RANK 10  NODE 13  -->  14.3684092 in  /  14.6951685 out
  RANK 11  NODE  7  -->  11.3869658 in  /  12.3399 out
  RANK 12  NODE  2  -->  6.40727234 in  /  5.93344879 out
  RANK 13  NODE  6  -->  4.19130278 in  /  3.87168431 out
  RANK 14  NODE 15  -->  2.73546958 in  /  2.58573508 out
  RANK 15  NODE  3  -->  1.07651377 in  /  0.588916719 out
  (re-ranked by output significance the order is: NODE 9, 11, 8, 1, 5, 4, 10, 12, 14, 13, 7, 2, 6, 15, 3; the in/out values are the same as above)

SIGNIFICANCE OF INPUTS TO LAYER 3:
  RANK  1  NODE  1  -->  300.422058 sigma in 15 active inputs

Learn Path 250 (test sample): loss function -0.524970531, regularisation contribution 0.00338449609, error contribution -0.528355002

END OF LEARNING, export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 33318
Closing output file
done
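One compact way to read the checkpoints above is to track the rank-1 significance of the single input to layer 3: it grows steadily from the early checkpoint to the final report, while node 9 remains the most significant layer-2 input throughout. A short sketch using only values printed above (the first entry is the checkpoint printed just before Learn Path 20, so its iteration label is inferred from context):

    # Rank-1 "significance of inputs to layer 3" at each checkpoint, copied from the log
    layer3_sig = {20: 137.394409, 30: 152.340332, 40: 175.747498, 50: 195.795059, 250: 300.422058}
    for iteration, sig in sorted(layer3_sig.items()):
        print(f"iteration {iteration:3d}: {sig:10.3f} sigma")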