NNInput NNInputs_110.root

Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001

**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree

NNAna::CopyTree: entries= 193027 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1  BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 46092 nbkg = 146935
Bkg Entries: 146935
Sig Entries: 46092
Chosen entries: 46092
Signal fraction: 1
Background fraction: 0.31369
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 146935
Actual Signal Entries: 46092
Entries to split: 46092
Test with : 23046
Train with : 23046

*********************************************
* This product is licensed for educational *
* and scientific use only. Commercial use  *
* is prohibited!                            *
*********************************************

Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developer's licence.
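Editor's note: the steering block above applies the cut lep1_E<400 && lep2_E<400, labels events by the Target branch, weights them by TrainWeight, and then splits the chosen signal entries evenly into test and training halves (23046 each). The sketch below illustrates that selection and 50/50 split with synthetic numpy arrays standing in for the tree branches; it is not the NNAna code, and the real split may well be randomised (the tree carries a rand branch).

```python
# Minimal sketch (not NNAna): apply the steering cut, build the Target==1 /
# Target==0 samples, and split the chosen entries 50/50 into test and
# training halves, mirroring the counts reported in the log above.
# All arrays are synthetic stand-ins for the tree branches.
import numpy as np

rng = np.random.default_rng(0)
n = 193_027
events = {
    "lep1_E": rng.exponential(80.0, n),
    "lep2_E": rng.exponential(40.0, n),
    "Target": (rng.random(n) < 0.24).astype(int),   # 1 = signal, 0 = background
    "TrainWeight": np.ones(n),
}

cut = (events["lep1_E"] < 400) & (events["lep2_E"] < 400)   # steering Constraint
sig = np.flatnonzero(cut & (events["Target"] == 1))         # SigChoice
bkg = np.flatnonzero(cut & (events["Target"] == 0))         # BkgChoice
print(f"nsig = {sig.size}  nbkg = {bkg.size}")

# "Entries to split" is the signal sample; half goes to testing, half to training.
half = sig.size // 2
test_idx, train_idx = sig[:half], sig[half:2 * half]
print(f"Test with : {test_idx.size}  Train with : {train_idx.size}")
```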
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1

Start Set Individual Variable Preprocessing
  Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
  Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
  Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
  Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
  Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
  Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
  Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing

Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes

NNAna::PrepareNBTraining_: Nent= 46092 for Signal
Prepared event 0 for Signal with 46092 events
====Entry 0
  Variable Ht                : 103.037
  Variable LepAPt            : 29.4151
  Variable LepBPt            : 16.2907
  Variable MetSigLeptonsJets : 3.90375
  Variable MetSpec           : 32.7321
  Variable SumEtLeptonsJets  : 70.3044
  Variable VSumJetLeptonsPt  : 47.5713
  Variable addEt             : 78.4382
  Variable dPhiLepSumMet     : 2.69477
  Variable dPhiLeptons       : 0.316822
  Variable dRLeptons         : 0.339778
  Variable lep1_E            : 29.4179
  Variable lep2_E            : 16.3887
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0  DEtaJ1Lep1 = 0  DEtaJ1Lep2 = 0  DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0  DPhiJ1J2 = 0  DPhiJ1Lep1 = 0  DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0  DPhiJ2Lep2 = 0  DRJ1J2 = 0  DRJ1Lep1 = 0
  DRJ1Lep2 = 0  DRJ2Lep1 = 0  DRJ2Lep2 = 0  DeltaRJet12 = 0
  File = 2110  Ht = 103.037  IsMEBase = 0  LRHWW = 0
  LRWW = 0  LRWg = 0  LRWj = 0  LRZZ = 0
  LepAEt = 29.4154  LepAPt = 29.4151  LepBEt = 16.2907  LepBPt = 16.2907
  LessCentralJetEta = 0  MJ1Lep1 = 0  MJ1Lep2 = 0  MJ2Lep1 = 0
  MJ2Lep2 = 0  NN = 0  Met = 32.7321  MetDelPhi = 1.84515
  MetSig = 2.82942  MetSigLeptonsJets = 3.90375  MetSpec = 32.7321  Mjj = 0
  MostCentralJetEta = 1.57834  MtllMet = 78.5262  Njets = 1  SB = 0
  SumEt = 133.83  SumEtJets = 0  SumEtLeptonsJets = 70.3044  Target = 1
  TrainWeight = 1  VSum2JetLeptonsPt = 0  VSum2JetPt = 0  VSumJetLeptonsPt = 47.5713
  addEt = 78.4382  dPhiLepSumMet = 2.69477  dPhiLeptons = 0.316822  dRLeptons = 0.339778
  diltype = 3  dimass = 7.41356  event = 59  jet1_Et = 24.5987
  jet1_eta = 0  jet2_Et = 0  jet2_eta = 0  lep1_E = 29.4179
  lep2_E = 16.3887  rand = 0.999742  run = 229664  weight = 4.83121e-07
===Show End
Prepared event 10000 for Signal with 46092 events
Prepared event 20000 for Signal with 46092 events
Prepared event 30000 for Signal with 46092 events
Prepared event 40000 for Signal with 46092 events

Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes

NNAna::PrepareNBTraining_: Nent= 146935 for Background
Prepared event 0 for Background with 146935 events
====Entry 0
  Variable Ht                : 85.3408
  Variable LepAPt            : 31.1294
  Variable LepBPt            : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec           : 44.1437
  Variable SumEtLeptonsJets  : 41.1959
  Variable VSumJetLeptonsPt  : 41.0132
  Variable addEt             : 85.3408
  Variable dPhiLepSumMet     : 3.10536
  Variable dPhiLeptons       : 0.219342
  Variable dRLeptons         : 0.424454
  Variable lep1_E            : 32.3548
  Variable lep2_E            : 10.1027
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0  DEtaJ1Lep1 = 0  DEtaJ1Lep2 = 0  DEtaJ2Lep1 = 0
  DEtaJ2Lep2 = 0  DPhiJ1J2 = 0  DPhiJ1Lep1 = 0  DPhiJ1Lep2 = 0
  DPhiJ2Lep1 = 0  DPhiJ2Lep2 = 0  DRJ1J2 = 0  DRJ1Lep1 = 0
  DRJ1Lep2 = 0  DRJ2Lep1 = 0  DRJ2Lep2 = 0  DeltaRJet12 = 0
  File = 1  Ht = 85.3408  IsMEBase = 0  LRHWW = 0
  LRWW = 0  LRWg = 0  LRWj = 0  LRZZ = 0
  LepAEt = 31.1297  LepAPt = 31.1294  LepBEt = 10.0674  LepBPt = 10.0664
  LessCentralJetEta = 0  MJ1Lep1 = 0  MJ1Lep2 = 0  MJ2Lep1 = 0
  MJ2Lep2 = 0  NN = 0  Met = 44.1437  MetDelPhi = 3.01191
  MetSig = 3.74558  MetSigLeptonsJets = 6.87768  MetSpec = 44.1437  Mjj = 0
  MostCentralJetEta = 0  MtllMet = 86.2332  Njets = 0  SB = 0
  SumEt = 138.899  SumEtJets = 0  SumEtLeptonsJets = 41.1959  Target = 0
  TrainWeight = 0.240745  VSum2JetLeptonsPt = 0  VSum2JetPt = 0  VSumJetLeptonsPt = 41.0132
  addEt = 85.3408  dPhiLepSumMet = 3.10536  dPhiLeptons = 0.219342  dRLeptons = 0.424454
  diltype = 17  dimass = 7.54723  event = 6717520  jet1_Et = 0
  jet1_eta = 0  jet2_Et = 0  jet2_eta = 0  lep1_E = 32.3548
  lep2_E = 10.1027  rand = 0.999742  run = 271566  weight = 0.00296168
===Show End
Prepared event 10000 for Background with 146935 events
Prepared event 20000 for Background with 146935 events
Prepared event 30000 for Background with 146935 events
Prepared event 40000 for Background with 146935 events
Prepared event 50000 for Background with 146935 events
Prepared event 60000 for Background with 146935 events
Prepared event 70000 for Background with 146935 events
Prepared event 80000 for Background with 146935 events
Prepared event 90000 for Background with 146935 events
Prepared event 100000 for Background with 146935 events
Prepared event 110000 for Background with 146935 events
Prepared event 120000 for Background with 146935 events
Prepared event 130000 for Background with 146935 events
Prepared event 140000 for Background with 146935 events
Warning: found 4584 negative weights.
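Editor's note: the teacher output that follows reports global preprocessing flag 812; by its own description, option 12 equalises each input variable, maps it onto a Gaussian with mean 0 and sigma 1, and then decorrelates the inputs. The per-variable "Transdef" tables printed below appear to be the quantile boundaries used for that flattening step. The sketch below illustrates this kind of transform (rank-based Gaussianisation followed by whitening) on synthetic data; it is an illustration of the idea, not the Phi-T implementation.

```python
# Conceptual sketch of "flatten each input, map it to a unit Gaussian, then
# decorrelate", the transform that preprocessing option 12 describes below.
import numpy as np
from statistics import NormalDist

def gaussianise(col):
    """Rank-transform one input column, then map the ranks to a standard normal."""
    ranks = np.argsort(np.argsort(col))
    quantiles = (ranks + 0.5) / len(col)          # uniform in (0, 1)
    inv = NormalDist().inv_cdf
    return np.array([inv(q) for q in quantiles])  # mean ~0, sigma ~1

def decorrelate(X):
    """Rotate and rescale the Gaussianised inputs so their covariance is the identity."""
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ eigvec / np.sqrt(eigval)

rng = np.random.default_rng(1)
raw = rng.exponential(1.0, size=(5000, 3))
raw[:, 2] += 0.5 * raw[:, 0]                      # introduce a correlation

X = np.column_stack([gaussianise(raw[:, j]) for j in range(raw.shape[1])])
X = decorrelate(X)
print(np.round(np.cov(X, rowvar=False), 2))       # approximately the identity matrix
```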
<< hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 193027 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4584 negative weights. Signal fraction: 62.8978004 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 47.1748543 61.5705338 63.7871857 65.7785645 67.1769257 68.3579254 69.0402222 70.0912628 70.8584671 71.572525 72.3006592 73.1043091 73.8334045 74.3864136 75.0198975 75.5488281 76.1416321 76.7068024 77.1443787 77.6946945 78.1849289 78.7086105 79.2757645 79.7759171 80.3298798 80.9241638 81.4643097 82.0413589 82.5713196 83.2455215 83.915657 84.6212082 85.3401337 86.1522369 86.86026 87.6922226 88.4683151 89.1978607 90.1443787 90.948822 91.7928925 92.7170029 93.4771576 94.2950821 95.1463089 95.8523636 96.5782776 97.3575897 98.1209869 98.85112 99.6120605 100.29715 101.100021 101.868744 102.665207 103.41346 104.252747 105.099213 106.002472 106.748825 107.771301 108.593506 109.57872 110.57634 111.70314 113.011444 114.477081 116.093948 117.72348 119.437698 121.145004 122.747726 124.490944 126.130394 127.90612 129.750458 131.565292 133.476227 135.607727 137.780731 139.776642 142.218658 144.469757 147.088837 149.60022 152.37085 155.530212 158.93988 162.669037 166.857864 171.448257 176.818649 182.341873 188.970932 196.485474 205.944061 217.978149 234.714432 256.738098 294.029541 627.063232 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.2804985 20.5229034 20.7504692 20.9883423 21.189743 21.4273968 21.5925503 21.7869759 21.9643726 22.1455231 22.3330421 22.5408516 22.7392082 22.9420853 23.123848 23.2988243 23.52314 23.6979065 23.8581734 24.0473404 24.2168503 24.4354172 24.6026077 24.8041191 24.9881344 25.171957 25.3418961 25.5212746 25.6908169 25.8850098 26.0996323 26.2913933 26.4750977 26.673481 26.8786697 27.0881767 27.307785 27.5306301 27.7229214 27.9270477 28.0929642 28.2903786 28.5127907 28.7293701 28.9679279 29.1604462 29.3805122 29.6214771 29.8720989 30.1119156 30.3771935 30.5805645 30.7991791 31.0432014 31.3077278 31.5646496 31.8220596 32.0904007 32.3041763 32.5796051 32.8501625 33.1136475 33.4001617 33.6968727 33.955864 34.2480049 34.5635033 34.9277954 35.2651443 35.5721207 35.897522 36.2199173 36.6038132 36.9152641 37.3074226 37.68927 38.0962753 38.5509262 38.9975662 39.456974 39.9594193 40.4619789 41.0198822 41.6228561 42.2914619 43.0625916 43.8348541 44.7349396 45.7332382 46.8191071 47.8936691 49.3501663 50.9276352 52.7300568 54.8272781 57.7599869 
61.4817581 67.2050323 76.992424 183.946381 ------------------------------ Transdef: Tab for variable 4 10.0000381 10.081274 10.1561909 10.2397652 10.312892 10.3927021 10.4693375 10.5466156 10.6268301 10.7018185 10.7755127 10.8517857 10.9337006 11.0121441 11.0911369 11.1845846 11.2713423 11.3499546 11.4251862 11.5139103 11.5954771 11.6902685 11.77981 11.8600388 11.9570923 12.0434246 12.1532135 12.2469597 12.3388033 12.4400082 12.5478964 12.6527653 12.7560863 12.8443851 12.938942 13.0335712 13.1380024 13.2322941 13.3436489 13.4468384 13.5522852 13.6543236 13.7655888 13.8705978 13.9646597 14.082304 14.190856 14.2925224 14.3969021 14.5217342 14.628212 14.7567196 14.8800631 15.006547 15.144928 15.2747784 15.3844452 15.5226994 15.6621723 15.7962418 15.9327545 16.071043 16.2124615 16.366497 16.5262451 16.6859474 16.8291969 16.9745064 17.1563759 17.3370476 17.5305443 17.7170334 17.9079056 18.1228294 18.3312073 18.5187569 18.7536087 18.977829 19.2126923 19.4638977 19.7028885 19.9762878 20.2265892 20.4093132 20.6960526 20.9960518 21.3016472 21.6285744 21.9755058 22.386013 22.7437172 23.246479 23.7809105 24.3876228 25.0634537 25.9358063 26.9218407 28.2969017 30.3667946 33.5127754 69.6593857 ------------------------------ Transdef: Tab for variable 5 1.07720709 2.18470597 2.54210567 2.78297019 2.98922968 3.14090204 3.29263091 3.43829012 3.58011866 3.70846033 3.82415438 3.93325996 4.02886486 4.11588001 4.197649 4.28467417 4.37150335 4.45209837 4.53073978 4.59710407 4.65853786 4.72974443 4.79153109 4.8629756 4.93506289 5.00025511 5.06550646 5.12924767 5.18007708 5.23443508 5.28168106 5.3287859 5.37784004 5.42171764 5.46124172 5.50465393 5.54798603 5.58889198 5.62546873 5.6647501 5.70868969 5.7508173 5.78987789 5.82805824 5.86495161 5.90198135 5.94258261 5.97836637 6.01551819 6.0514307 6.08322239 6.12045956 6.15669584 6.18923807 6.23076773 6.26432657 6.29525471 6.32851028 6.35930347 6.39816904 6.43454981 6.4688921 6.50812817 6.54403353 6.5881691 6.62949276 6.66525269 6.70165539 6.73711252 6.77556515 6.81686783 6.84700918 6.88094807 6.92178726 6.96598911 7.00531292 7.04843903 7.09403944 7.14119911 7.19295692 7.23948193 7.28668308 7.34305096 7.40186501 7.4576664 7.52402639 7.59007931 7.65679836 7.7273407 7.8130827 7.89997959 7.99580765 8.10387611 8.21837807 8.35305405 8.48919201 8.69620895 8.95113564 9.33834648 9.98149014 18.1234589 ------------------------------ Transdef: Tab for variable 6 15.0054646 17.9858131 20.6679649 23.8080959 25.3761673 26.1594963 26.7928791 27.337368 28.0040207 28.588356 29.1673546 29.7417336 30.2497692 30.7791138 31.2934017 31.7913666 32.2792435 32.66716 33.0098724 33.4335365 33.8041 34.1451569 34.5213242 34.8354759 35.1222038 35.4472275 35.7755394 36.0776749 36.3870583 36.6584511 36.9409561 37.2573738 37.5079193 37.7666168 38.0089378 38.2808914 38.5668373 38.8229904 39.1120033 39.3576126 39.6313324 39.8960876 40.1821976 40.4361877 40.7574883 41.01651 41.2906761 41.6214943 41.9602127 42.2704926 42.6092682 42.9155884 43.1795731 43.4986267 43.7949677 44.1194229 44.4681702 44.8259735 45.2308426 45.5612221 45.922699 46.2943878 46.6529999 47.0539551 47.4096947 47.7991142 48.1583595 48.5291595 48.8853989 49.2512169 49.7043495 50.1063499 50.4866104 50.9059143 51.3608398 51.8368378 52.2787781 52.7594223 53.2418213 53.8140793 54.3533783 54.9124069 55.494091 56.1375046 56.8236923 57.5180016 58.3051262 59.2654877 60.2290573 61.3512573 62.5668335 64.0195618 65.6278229 67.412735 69.5536346 72.1303558 75.2960663 79.365036 84.6530304 95.6906586 236.562576 ------------------------------ 
Transdef: Tab for variable 7 30.1199875 32.3656654 33.2461357 33.9028435 34.4953651 35.0017166 35.3961792 35.8148346 36.1805801 36.531765 36.8765335 37.1988907 37.5125008 37.7847366 38.136795 38.3800163 38.650528 38.8716049 39.1346664 39.3882866 39.6470108 39.9830627 40.1993408 40.4087486 40.7501984 41.0478287 41.3316574 41.6145287 41.9123764 42.1960297 42.5552406 42.9221725 43.3249664 43.7319069 44.2212296 44.604866 44.9947433 45.4492111 45.8579102 46.2723389 46.7464676 47.2430077 47.7549286 48.2113152 48.6696014 49.1761627 49.6997528 50.2603683 50.8447495 51.3523788 51.885376 52.479187 53.0332184 53.6168137 54.2657547 55.0372009 55.796608 56.4617157 57.2795448 58.2824974 59.2661781 60.4115105 61.6080704 62.7314301 64.0407257 65.3499832 66.6093445 67.8885422 69.1524277 70.4278336 71.8002243 73.0902252 74.3655319 75.8213043 77.2078171 78.853714 80.5063782 82.1448059 83.9347839 85.9037781 88.0406952 90.2372589 92.4732971 94.695282 97.2367096 100.005875 102.766106 105.584808 108.727203 112.413872 116.470543 121.244125 126.367455 132.666992 139.387817 146.620697 156.417603 169.496307 186.651306 214.158096 422.412994 ------------------------------ Transdef: Tab for variable 8 0.156775147 18.9956322 23.8366737 26.773674 28.7904758 30.1369438 31.1677608 31.8707733 32.3902588 32.8560181 33.2363205 33.6968384 34.0059204 34.3921547 34.7576027 35.0936203 35.3446236 35.6049347 35.8944092 36.1378365 36.4164886 36.6509857 36.8979111 37.1786575 37.3913345 37.6480789 37.8540955 38.069622 38.2772942 38.5145035 38.7052994 38.9041977 39.1410637 39.340004 39.5933914 39.8175507 40.0463181 40.2624817 40.4657631 40.7182846 40.9298248 41.2151718 41.4496689 41.7057457 41.983181 42.2280655 42.5663223 42.9128494 43.1925735 43.5140877 43.8191833 44.1601906 44.440979 44.773838 45.062294 45.3644485 45.7063751 46.0562019 46.4185562 46.7353554 47.1002274 47.4254265 47.7711334 48.1519051 48.5045166 48.8482056 49.234169 49.5840149 50.0017853 50.3431511 50.7657013 51.125061 51.5938187 52.0438309 52.4840012 53.0102844 53.4585495 53.9208412 54.4895477 55.0595284 55.6976395 56.2694016 56.9872055 57.8080177 58.6164246 59.516861 60.6427765 61.761528 63.0789642 64.4143677 65.7574768 67.2760773 69.0795135 71.0554352 73.4368591 76.3793182 79.6513901 84.445755 91.6572723 105.01902 280.344635 ------------------------------ Transdef: Tab for variable 9 47.1748543 60.6289787 62.8431473 64.6525421 66.0670319 67.2293091 68.2263031 68.8894424 69.6054077 70.3899231 71.0698929 71.6217346 72.2331009 72.9012146 73.555191 74.0449371 74.5303497 75.0970688 75.5420532 76.0387726 76.5050659 76.9793625 77.3154907 77.7743149 78.1715393 78.6007309 79.0499115 79.4952545 79.8432007 80.2830963 80.7206573 81.216774 81.6996613 82.1652679 82.5623703 83.0583572 83.6281357 84.1605072 84.6928101 85.2737579 85.8456573 86.4014893 86.9061127 87.5387878 88.1490936 88.6274719 89.2208633 89.8279114 90.5208588 91.0593567 91.6565018 92.2696533 92.8398895 93.3574524 93.9453201 94.5497818 95.1473083 95.7078094 96.2694855 96.835495 97.476181 98.0551147 98.588829 99.17733 99.7503662 100.327423 100.880783 101.465103 102.005157 102.596359 103.172638 103.757874 104.351707 104.983932 105.56102 106.207695 106.873749 107.560333 108.304993 109.017128 109.76944 110.507462 111.377747 112.319771 113.399979 114.436493 115.59845 116.914764 118.392784 120.129684 122.0905 124.186813 126.795761 129.922394 133.832031 138.110596 143.606949 150.854889 161.665497 182.096924 382.729065 ------------------------------ Transdef: Tab for variable 10 0.00497005275 0.833418489 1.07163143 
1.2585156 1.39095306 1.51428175 1.61819315 1.71637988 1.79813588 1.86635971 1.94097197 2.01366854 2.07062364 2.12868214 2.18394184 2.23257303 2.27722263 2.32282877 2.35985518 2.40402555 2.44445086 2.47499299 2.50581598 2.53697157 2.56365299 2.58906317 2.6140151 2.63627386 2.66030073 2.68270922 2.70398784 2.72492266 2.74505663 2.7631309 2.77895093 2.79476738 2.80959177 2.82554245 2.83988428 2.85059953 2.86260843 2.87368965 2.88488817 2.89590454 2.90400839 2.91333818 2.92134142 2.92857695 2.93618536 2.94442987 2.95171118 2.95972013 2.96579146 2.97210121 2.97766328 2.98351765 2.988451 2.99328351 2.99872828 3.00409746 3.00924206 3.01434898 3.01912403 3.02319074 3.02761817 3.03181911 3.0360465 3.04064322 3.04431391 3.04806519 3.051929 3.0560832 3.05935216 3.06284952 3.06626916 3.07040429 3.07378292 3.07655525 3.07979727 3.08282757 3.08607388 3.08918118 3.09278274 3.09585834 3.09870243 3.10114145 3.10372353 3.10629678 3.10933495 3.11218834 3.11543036 3.11819291 3.12067866 3.12329435 3.1261878 3.12845516 3.13126087 3.13347793 3.13598108 3.13844419 3.14159226 ------------------------------ Transdef: Tab for variable 11 1.19226625E-05 0.0059134271 0.0123476367 0.0190971512 0.0259385705 0.0331542492 0.041399762 0.049215138 0.0577087402 0.0662107766 0.0745104551 0.0827959776 0.0910966396 0.0992574692 0.108162284 0.116610721 0.124685779 0.13185668 0.139575988 0.146675706 0.15409258 0.160686016 0.168026537 0.174753428 0.181939244 0.189333856 0.196161106 0.202003241 0.20661974 0.21213007 0.217758149 0.22353065 0.229179144 0.234828383 0.241187453 0.247234806 0.253264308 0.25906229 0.264957786 0.2709167 0.276995897 0.28265202 0.288762271 0.295179456 0.300461352 0.306630135 0.312314391 0.318019211 0.323981702 0.330254197 0.336921573 0.343271017 0.349886894 0.355529189 0.361816764 0.368325889 0.374869049 0.381749988 0.387531459 0.394870877 0.400595009 0.406862497 0.413702458 0.420017362 0.426438332 0.432362199 0.438862562 0.444401264 0.450635195 0.457155704 0.462837338 0.469275475 0.475575507 0.483364522 0.491053462 0.498675942 0.505850315 0.512773514 0.520596445 0.529346824 0.537333667 0.545453012 0.554685831 0.564887464 0.574027658 0.583039999 0.592781782 0.603211462 0.613590419 0.626293898 0.638392925 0.650273561 0.665118694 0.680562377 0.700517595 0.715574503 0.740257144 0.768993974 0.81070143 0.863759637 1.13201857 ------------------------------ Transdef: Tab for variable 12 0.200011998 0.211460114 0.222164974 0.232402295 0.242121786 0.250498176 0.258159995 0.268340468 0.277384549 0.285086751 0.294084311 0.302130878 0.310878873 0.318655789 0.326488942 0.334737182 0.34227556 0.350026995 0.357850134 0.365820289 0.372631133 0.37957412 0.387199372 0.393810034 0.400268257 0.405800372 0.411102146 0.415599108 0.420704275 0.426094353 0.430365503 0.434860021 0.439015836 0.444420218 0.448814213 0.453943431 0.458443403 0.462951362 0.466754377 0.471482575 0.475679338 0.480087817 0.485029817 0.489424884 0.493476152 0.498121947 0.502631783 0.507670701 0.512005448 0.516177297 0.520872951 0.524721622 0.529320121 0.5343858 0.539466023 0.544065952 0.54913038 0.553815007 0.558138967 0.56284672 0.567825794 0.571887851 0.576583266 0.580491126 0.585227966 0.590487301 0.595274091 0.600081682 0.604688466 0.609648228 0.614956141 0.620597005 0.626131117 0.632068992 0.638043165 0.64428854 0.650453925 0.657049716 0.662367821 0.667585075 0.673748553 0.680204868 0.687592328 0.696098089 0.702960551 0.711523533 0.718729377 0.727152288 0.736616611 0.745850205 0.755704761 0.766702652 0.777696371 0.789575875 0.803344667 0.819949985 
0.836695254 0.858682156 0.88625586 0.931100845 1.14129901 ------------------------------ Transdef: Tab for variable 13 20.0293255 21.3645172 22.0123138 22.5015316 23.0845242 23.5095215 23.8984947 24.3379135 24.8042202 25.1615562 25.5143852 25.8778553 26.213623 26.6033516 26.983696 27.2893944 27.6512222 27.9887581 28.2720146 28.5461769 28.8767395 29.2056313 29.4582825 29.7389927 30.0572433 30.3500137 30.6644306 30.9471893 31.2485542 31.5448017 31.813076 32.0856895 32.361145 32.6149673 32.8805618 33.1907196 33.4964066 33.8369293 34.1858635 34.5093307 34.8047562 35.1038704 35.3974457 35.7272148 36.0534592 36.3121719 36.6706924 37.0318832 37.375103 37.7499619 38.1301765 38.4895172 38.8765869 39.2723083 39.6991196 40.1109276 40.4812622 40.9107971 41.365303 41.832119 42.2897377 42.7618065 43.2073174 43.6836777 44.2080002 44.782589 45.3622513 45.8856583 46.4527359 47.0171738 47.6219826 48.1788025 48.8108292 49.51054 50.3230667 51.0692139 51.9605408 52.7226028 53.6747513 54.5624466 55.4281693 56.4287109 57.6337395 58.7012787 59.9622307 61.2886658 62.4989548 63.9504013 65.6255188 67.2208557 69.0512238 70.8904419 73.257988 75.7213669 78.3847351 81.8342514 85.7354584 90.2084961 96.6228409 107.919197 232.066116 ------------------------------ Transdef: Tab for variable 14 10.0028534 10.4553995 10.7460136 10.9560661 11.1976976 11.3968563 11.5901537 11.7516403 11.9379759 12.1160746 12.2804337 12.4467335 12.627943 12.8058853 12.9678392 13.1353397 13.2984695 13.4685516 13.6272535 13.7968559 13.9284792 14.0824661 14.237668 14.4021597 14.5639725 14.7207794 14.8802719 15.0487747 15.1987457 15.3585091 15.5157776 15.6664009 15.8348141 16.0021133 16.1842213 16.3200912 16.4764328 16.6590214 16.8002262 16.9560165 17.1380615 17.3046017 17.4927216 17.6778679 17.8440609 18.0315876 18.2107887 18.4097099 18.6052799 18.7873077 19.0004654 19.16959 19.3908195 19.5948887 19.8188343 20.0331421 20.2507553 20.4697647 20.6833038 20.924984 21.1672363 21.4257622 21.6722755 21.8960648 22.1492271 22.4208355 22.6955605 22.9832497 23.2613945 23.5503273 23.8782997 24.2042885 24.551899 24.8589916 25.2367287 25.5579338 25.9366703 26.2762184 26.6699257 27.0904846 27.5000534 27.9358101 28.3553467 28.8999825 29.4149761 30.0251427 30.753191 31.4154739 32.0958519 33.0059013 33.9528275 34.8313599 35.9816818 37.2293701 38.6023102 40.3879471 42.5275764 45.1819878 48.8707581 56.1275635 118.581718 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 52.5 41.1 16.7 -1.3 24.0 51.6 31.2 46.3 -33.3 -2.1 -9.0 9.7 -12.0 2 52.5 100.0 64.8 39.0 4.8 49.5 93.9 55.5 88.1 -54.6 -15.7 -31.2 34.0 10.7 3 41.1 64.8 100.0 16.3 -14.0 18.8 68.5 41.6 69.9 -24.1 -15.6 -34.3 65.6 -0.9 4 16.7 39.0 16.3 100.0 -7.3 11.5 41.6 26.4 42.5 -12.5 -19.3 -36.9 6.2 66.3 5 -1.3 4.8 -14.0 -7.3 100.0 79.2 -23.0 39.3 35.9 35.9 4.4 8.2 -14.7 -7.8 6 24.0 49.5 18.8 11.5 79.2 100.0 25.8 66.2 69.6 1.8 -3.4 -7.3 4.4 -1.3 7 51.6 93.9 68.5 41.6 -23.0 25.8 100.0 49.0 75.0 -60.6 -15.8 -32.4 38.4 13.9 8 31.2 55.5 41.6 26.4 39.3 66.2 49.0 100.0 70.0 -7.1 -12.9 -21.4 20.9 8.3 9 46.3 88.1 69.9 42.5 35.9 69.6 75.0 70.0 100.0 -26.7 -16.4 -34.2 38.7 14.5 10 -33.3 -54.6 -24.1 -12.5 35.9 1.8 -60.6 -7.1 -26.7 100.0 5.3 10.6 -10.6 0.3 11 -2.1 -15.7 -15.6 -19.3 4.4 -3.4 -15.8 -12.9 -16.4 5.3 100.0 56.1 -5.9 -9.7 12 -9.0 -31.2 -34.3 -36.9 8.2 -7.3 -32.4 -21.4 -34.2 10.6 56.1 100.0 -14.3 -13.8 13 9.7 34.0 65.6 6.2 -14.7 4.4 38.4 20.9 38.7 -10.6 -5.9 -14.3 100.0 35.3 14 -12.0 10.7 -0.9 66.3 -7.8 -1.3 
13.9 8.3 14.5 0.3 -9.7 -13.8 35.3 100.0 TOTAL CORRELATION TO TARGET (diagonal) 111.895816 TOTAL CORRELATION OF ALL VARIABLES 59.2216876 ROUND 1: MAX CORR ( 59.2173765) AFTER KILLING INPUT VARIABLE 5 CONTR 0.714560802 ROUND 2: MAX CORR ( 59.2145278) AFTER KILLING INPUT VARIABLE 9 CONTR 0.580841041 ROUND 3: MAX CORR ( 59.2040282) AFTER KILLING INPUT VARIABLE 11 CONTR 1.11505878 ROUND 4: MAX CORR ( 59.1763032) AFTER KILLING INPUT VARIABLE 6 CONTR 1.8116545 ROUND 5: MAX CORR ( 59.1249945) AFTER KILLING INPUT VARIABLE 7 CONTR 2.46371332 ROUND 6: MAX CORR ( 58.9927066) AFTER KILLING INPUT VARIABLE 8 CONTR 3.95291525 ROUND 7: MAX CORR ( 58.6744232) AFTER KILLING INPUT VARIABLE 13 CONTR 6.11976321 ROUND 8: MAX CORR ( 58.2671753) AFTER KILLING INPUT VARIABLE 10 CONTR 6.90102993 ROUND 9: MAX CORR ( 57.2347347) AFTER KILLING INPUT VARIABLE 3 CONTR 10.9201127 ROUND 10: MAX CORR ( 56.4650234) AFTER KILLING INPUT VARIABLE 12 CONTR 9.35499816 ROUND 11: MAX CORR ( 55.4136759) AFTER KILLING INPUT VARIABLE 4 CONTR 10.8454321 ROUND 12: MAX CORR ( 52.5098057) AFTER KILLING INPUT VARIABLE 14 CONTR 17.7029879 LAST REMAINING VARIABLE: 2 total correlation to target: 59.2216876 % total significance: 113.684438 sigma correlations of single variables to target: variable 2: 52.5098057 % , in sigma: 100.800028 variable 3: 41.0710811 % , in sigma: 78.8417718 variable 4: 16.7326665 % , in sigma: 32.1207292 variable 5: -1.30300914 % , in sigma: 2.50131106 variable 6: 24.0139804 % , in sigma: 46.0982451 variable 7: 51.6355116 % , in sigma: 99.1216961 variable 8: 31.2351406 % , in sigma: 59.9602875 variable 9: 46.2952357 % , in sigma: 88.8702784 variable 10: -33.3403029 % , in sigma: 64.0014454 variable 11: -2.05904668 % , in sigma: 3.95263247 variable 12: -8.95234168 % , in sigma: 17.185291 variable 13: 9.68646476 % , in sigma: 18.5945445 variable 14: -11.971325 % , in sigma: 22.9806582 variables sorted by significance: 1 most relevant variable 2 corr 52.5098038 , in sigma: 100.800024 2 most relevant variable 14 corr 17.7029877 , in sigma: 33.9833985 3 most relevant variable 4 corr 10.8454323 , in sigma: 20.8193472 4 most relevant variable 12 corr 9.35499859 , in sigma: 17.9582481 5 most relevant variable 3 corr 10.9201126 , in sigma: 20.9627067 6 most relevant variable 10 corr 6.90103006 , in sigma: 13.2475071 7 most relevant variable 13 corr 6.11976337 , in sigma: 11.7477547 8 most relevant variable 8 corr 3.95291519 , in sigma: 7.58818198 9 most relevant variable 7 corr 2.46371341 , in sigma: 4.72944771 10 most relevant variable 6 corr 1.81165445 , in sigma: 3.47772795 11 most relevant variable 11 corr 1.11505878 , in sigma: 2.14051365 12 most relevant variable 9 corr 0.580841064 , in sigma: 1.1150069 13 most relevant variable 5 corr 0.714560807 , in sigma: 1.37170093 global correlations between input variables: variable 2: 98.8942672 % variable 3: 93.6746402 % variable 4: 88.1353899 % variable 5: 94.567372 % variable 6: 92.887628 % variable 7: 98.4749757 % variable 8: 81.7583362 % variable 9: 98.3991988 % variable 10: 73.5040474 % variable 11: 56.8703204 % variable 12: 66.9415354 % variable 13: 84.444341 % variable 14: 85.1810776 % significance loss when removing single variables: variable 2: corr = 3.19290422 % , sigma = 6.12923301 variable 3: corr = 8.98782626 % , sigma = 17.2534087 variable 4: corr = 8.38599257 % , sigma = 16.0981035 variable 5: corr = 0.714560802 % , sigma = 1.37170092 variable 6: corr = 1.24275054 % , sigma = 2.38563612 variable 7: corr = 2.40685879 % , sigma = 4.62030719 variable 8: corr = 
2.12430861 % , sigma = 4.07791201 variable 9: corr = 0.839804517 % , sigma = 1.61212401 variable 10: corr = 6.19717837 % , sigma = 11.8963638 variable 11: corr = 1.13919661 % , sigma = 2.18684965 variable 12: corr = 9.27526579 % , sigma = 17.8051897 variable 13: corr = 6.03517353 % , sigma = 11.5853725 variable 14: corr = 11.154177 % , sigma = 21.4120265 Keep only 9 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 10 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 22.5954647 sigma out 15 active outputs RANK 2 NODE 4 --> 15.1770906 sigma out 15 active outputs RANK 3 NODE 3 --> 14.4318714 sigma out 15 active outputs RANK 4 NODE 9 --> 13.9127913 sigma out 15 active outputs RANK 5 NODE 5 --> 13.0287294 sigma out 15 active outputs RANK 6 NODE 6 --> 12.9768362 sigma out 15 active outputs RANK 7 NODE 8 --> 12.9332647 sigma out 15 active outputs RANK 8 NODE 10 --> 12.4671373 sigma out 15 active outputs RANK 9 NODE 2 --> 12.2111282 sigma out 15 active outputs RANK 10 NODE 7 --> 9.38297844 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 25.2321014 sigma in 10act. ( 26.0187187 sig out 1act.) RANK 2 NODE 14 --> 19.629488 sigma in 10act. ( 19.0223675 sig out 1act.) RANK 3 NODE 11 --> 18.1017132 sigma in 10act. ( 17.2903347 sig out 1act.) RANK 4 NODE 12 --> 12.0498581 sigma in 10act. ( 11.5806456 sig out 1act.) RANK 5 NODE 13 --> 11.182456 sigma in 10act. ( 12.6701689 sig out 1act.) RANK 6 NODE 8 --> 10.1111107 sigma in 10act. ( 11.2978411 sig out 1act.) RANK 7 NODE 2 --> 9.88819981 sigma in 10act. ( 11.7913141 sig out 1act.) RANK 8 NODE 15 --> 8.9162302 sigma in 10act. ( 12.1073627 sig out 1act.) RANK 9 NODE 1 --> 7.19699144 sigma in 10act. ( 7.03505421 sig out 1act.) RANK 10 NODE 6 --> 5.35538578 sigma in 10act. ( 7.31715631 sig out 1act.) RANK 11 NODE 3 --> 4.95546341 sigma in 10act. ( 7.35270548 sig out 1act.) RANK 12 NODE 5 --> 4.11918116 sigma in 10act. ( 3.64798689 sig out 1act.) RANK 13 NODE 9 --> 2.83809066 sigma in 10act. ( 3.00165915 sig out 1act.) RANK 14 NODE 10 --> 2.51812959 sigma in 10act. ( 3.29619122 sig out 1act.) RANK 15 NODE 4 --> 2.22247338 sigma in 10act. ( 2.83705807 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 26.0187187 sigma out 1act.( 25.2321014 sig in 10act.) RANK 2 NODE 14 --> 19.0223675 sigma out 1act.( 19.629488 sig in 10act.) RANK 3 NODE 11 --> 17.2903347 sigma out 1act.( 18.1017132 sig in 10act.) RANK 4 NODE 13 --> 12.6701689 sigma out 1act.( 11.182456 sig in 10act.) RANK 5 NODE 15 --> 12.1073627 sigma out 1act.( 8.9162302 sig in 10act.) RANK 6 NODE 2 --> 11.7913141 sigma out 1act.( 9.88819981 sig in 10act.) RANK 7 NODE 12 --> 11.5806456 sigma out 1act.( 12.0498581 sig in 10act.) RANK 8 NODE 8 --> 11.2978411 sigma out 1act.( 10.1111107 sig in 10act.) RANK 9 NODE 3 --> 7.35270548 sigma out 1act.( 4.95546341 sig in 10act.) RANK 10 NODE 6 --> 7.31715631 sigma out 1act.( 5.35538578 sig in 10act.) RANK 11 NODE 1 --> 7.03505421 sigma out 1act.( 7.19699144 sig in 10act.) RANK 12 NODE 5 --> 3.64798689 sigma out 1act.( 4.11918116 sig in 10act.) RANK 13 NODE 10 --> 3.29619122 sigma out 1act.( 2.51812959 sig in 10act.) RANK 14 NODE 9 --> 3.00165915 sigma out 1act.( 2.83809066 sig in 10act.) RANK 15 NODE 4 --> 2.83705807 sigma out 1act.( 2.22247338 sig in 10act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 47.3710403 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 1 --> 24.0493603 sigma out 15 active outputs RANK 2 NODE 10 --> 17.9149513 sigma out 15 active outputs RANK 3 NODE 4 --> 17.3798904 sigma out 15 active outputs RANK 4 NODE 3 --> 17.060112 sigma out 15 active outputs RANK 5 NODE 5 --> 15.6580229 sigma out 15 active outputs RANK 6 NODE 9 --> 14.5979195 sigma out 15 active outputs RANK 7 NODE 6 --> 13.995244 sigma out 15 active outputs RANK 8 NODE 8 --> 13.828969 sigma out 15 active outputs RANK 9 NODE 2 --> 13.7481098 sigma out 15 active outputs RANK 10 NODE 7 --> 10.9551983 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 25.9850712 sigma in 10act. ( 25.2788773 sig out 1act.) RANK 2 NODE 11 --> 17.8540688 sigma in 10act. ( 16.7219105 sig out 1act.) RANK 3 NODE 14 --> 17.8266029 sigma in 10act. ( 17.3916035 sig out 1act.) RANK 4 NODE 13 --> 13.7831993 sigma in 10act. ( 13.1901712 sig out 1act.) RANK 5 NODE 15 --> 13.5566397 sigma in 10act. ( 12.5445204 sig out 1act.) RANK 6 NODE 2 --> 12.3041792 sigma in 10act. ( 11.8509312 sig out 1act.) RANK 7 NODE 12 --> 11.8824825 sigma in 10act. ( 10.8468113 sig out 1act.) RANK 8 NODE 8 --> 11.568819 sigma in 10act. ( 11.4580011 sig out 1act.) RANK 9 NODE 3 --> 11.1411047 sigma in 10act. ( 7.60924149 sig out 1act.) RANK 10 NODE 4 --> 9.81624222 sigma in 10act. ( 3.71670032 sig out 1act.) RANK 11 NODE 1 --> 9.41846848 sigma in 10act. ( 6.53701115 sig out 1act.) RANK 12 NODE 6 --> 8.76838875 sigma in 10act. ( 7.5991292 sig out 1act.) RANK 13 NODE 5 --> 7.99204636 sigma in 10act. ( 3.54640079 sig out 1act.) RANK 14 NODE 10 --> 7.4563036 sigma in 10act. ( 3.78808403 sig out 1act.) RANK 15 NODE 9 --> 5.41532326 sigma in 10act. ( 3.25964403 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 25.2788773 sigma out 1act.( 25.9850712 sig in 10act.) RANK 2 NODE 14 --> 17.3916035 sigma out 1act.( 17.8266029 sig in 10act.) RANK 3 NODE 11 --> 16.7219105 sigma out 1act.( 17.8540688 sig in 10act.) RANK 4 NODE 13 --> 13.1901712 sigma out 1act.( 13.7831993 sig in 10act.) RANK 5 NODE 15 --> 12.5445204 sigma out 1act.( 13.5566397 sig in 10act.) RANK 6 NODE 2 --> 11.8509312 sigma out 1act.( 12.3041792 sig in 10act.) RANK 7 NODE 8 --> 11.4580011 sigma out 1act.( 11.568819 sig in 10act.) RANK 8 NODE 12 --> 10.8468113 sigma out 1act.( 11.8824825 sig in 10act.) RANK 9 NODE 3 --> 7.60924149 sigma out 1act.( 11.1411047 sig in 10act.) RANK 10 NODE 6 --> 7.5991292 sigma out 1act.( 8.76838875 sig in 10act.) RANK 11 NODE 1 --> 6.53701115 sigma out 1act.( 9.41846848 sig in 10act.) RANK 12 NODE 10 --> 3.78808403 sigma out 1act.( 7.4563036 sig in 10act.) RANK 13 NODE 4 --> 3.71670032 sigma out 1act.( 9.81624222 sig in 10act.) RANK 14 NODE 5 --> 3.54640079 sigma out 1act.( 7.99204636 sig in 10act.) RANK 15 NODE 9 --> 3.25964403 sigma out 1act.( 5.41532326 sig in 10act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 46.3852005 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.449611843 *** contribution from regularisation: 0.00354338158 *** contribution from error: -0.45315522 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.478426069 *** contribution from regularisation: 0.00223260117 *** contribution from error: -0.48065868 *********************************************** -----------------> Test sample ENTER BFGS code START -46183.9874 -0.332546055 -0.200560808 EXIT FROM BFGS code FG_START 0. -0.332546055 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.49634555 *** contribution from regularisation: 0.00207322021 *** contribution from error: -0.498418778 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -47903.7968 -0.332546055 -134.642746 EXIT FROM BFGS code FG_LNSRCH 0. -0.365870684 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.517691612 *** contribution from regularisation: 0.00292444043 *** contribution from error: -0.520616055 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -49963.9694 -0.365870684 22.5794582 EXIT FROM BFGS code NEW_X -49963.9694 -0.365870684 22.5794582 ENTER BFGS code NEW_X -49963.9694 -0.365870684 22.5794582 EXIT FROM BFGS code FG_LNSRCH 0. -0.36170575 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.518766522 *** contribution from regularisation: 0.00282571698 *** contribution from error: -0.521592259 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50067.7129 -0.36170575 11.0854092 EXIT FROM BFGS code NEW_X -50067.7129 -0.36170575 11.0854092 ENTER BFGS code NEW_X -50067.7129 -0.36170575 11.0854092 EXIT FROM BFGS code FG_LNSRCH 0. -0.358744144 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.518985987 *** contribution from regularisation: 0.00277396734 *** contribution from error: -0.521759927 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50088.8917 -0.358744144 8.72601604 EXIT FROM BFGS code NEW_X -50088.8917 -0.358744144 8.72601604 ENTER BFGS code NEW_X -50088.8917 -0.358744144 8.72601604 EXIT FROM BFGS code FG_LNSRCH 0. -0.349296749 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.519328952 *** contribution from regularisation: 0.00266872672 *** contribution from error: -0.52199769 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50121.9977 -0.349296749 0.304532021 EXIT FROM BFGS code NEW_X -50121.9977 -0.349296749 0.304532021 ENTER BFGS code NEW_X -50121.9977 -0.349296749 0.304532021 EXIT FROM BFGS code FG_LNSRCH 0. -0.340544999 0. 
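Editor's note: the variable-ranking block earlier in the log (the covariance matrix, the "ROUND k: MAX CORR ... AFTER KILLING INPUT VARIABLE ..." lines, and "Keep only 9 most significant input variables") is the output of a backward-elimination pass: variables are removed one at a time, each round dropping the one whose removal costs the least total correlation to the target. The sketch below shows that kind of greedy pruning, using the multiple correlation of a linear fit as the "total correlation"; the exact NeuroBayes significance measure may differ, and the data are synthetic.

```python
# Sketch of greedy backward elimination by "total correlation to target":
# repeatedly drop the input whose removal reduces the multiple correlation least.
import numpy as np

def total_corr(X, y):
    """Multiple correlation of y on the columns of X (R of a linear least-squares fit)."""
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    r2 = 1.0 - resid.var() / y.var()
    return np.sqrt(max(r2, 0.0))

rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 6))
y = 0.9 * X[:, 0] + 0.4 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(size=5000)

alive = list(range(X.shape[1]))
while len(alive) > 1:
    base = total_corr(X[:, alive], y)
    # the candidate whose removal hurts the total correlation the least
    losses = {v: base - total_corr(X[:, [u for u in alive if u != v]], y) for v in alive}
    victim = min(losses, key=losses.get)
    print(f"ROUND: drop variable {victim}, contribution {losses[victim]:.4f}, "
          f"remaining corr {base - losses[victim]:.4f}")
    alive.remove(victim)
print("last remaining variable:", alive[0])
```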
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.519691885 *** contribution from regularisation: 0.00262582116 *** contribution from error: -0.522317708 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50157.0237 -0.340544999 -5.84021902 EXIT FROM BFGS code NEW_X -50157.0237 -0.340544999 -5.84021902 ENTER BFGS code NEW_X -50157.0237 -0.340544999 -5.84021902 EXIT FROM BFGS code FG_LNSRCH 0. -0.263701111 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.520443916 *** contribution from regularisation: 0.00251909299 *** contribution from error: -0.522962987 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50229.606 -0.263701111 -47.398304 EXIT FROM BFGS code FG_LNSRCH 0. -0.307848155 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 29.6523762 sigma out 15 active outputs RANK 2 NODE 1 --> 21.3248081 sigma out 15 active outputs RANK 3 NODE 4 --> 17.7396755 sigma out 15 active outputs RANK 4 NODE 5 --> 17.2486115 sigma out 15 active outputs RANK 5 NODE 2 --> 13.978632 sigma out 15 active outputs RANK 6 NODE 3 --> 13.6165047 sigma out 15 active outputs RANK 7 NODE 8 --> 13.3115749 sigma out 15 active outputs RANK 8 NODE 9 --> 12.0891342 sigma out 15 active outputs RANK 9 NODE 7 --> 9.6352129 sigma out 15 active outputs RANK 10 NODE 6 --> 7.01908875 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 4 --> 28.1515884 sigma in 10act. ( 33.0036736 sig out 1act.) RANK 2 NODE 7 --> 19.5483608 sigma in 10act. ( 19.0205936 sig out 1act.) RANK 3 NODE 3 --> 18.986311 sigma in 10act. ( 16.7591534 sig out 1act.) RANK 4 NODE 15 --> 15.0142784 sigma in 10act. ( 12.091548 sig out 1act.) RANK 5 NODE 2 --> 13.5056772 sigma in 10act. ( 12.8217278 sig out 1act.) RANK 6 NODE 6 --> 13.3226805 sigma in 10act. ( 11.4265289 sig out 1act.) RANK 7 NODE 8 --> 13.2666588 sigma in 10act. ( 12.0271006 sig out 1act.) RANK 8 NODE 13 --> 10.5067759 sigma in 10act. ( 8.39417648 sig out 1act.) RANK 9 NODE 11 --> 9.85707951 sigma in 10act. ( 8.82891655 sig out 1act.) RANK 10 NODE 9 --> 9.39916134 sigma in 10act. ( 8.03780842 sig out 1act.) RANK 11 NODE 10 --> 7.77623177 sigma in 10act. ( 6.13542223 sig out 1act.) RANK 12 NODE 14 --> 6.6737442 sigma in 10act. ( 4.71291351 sig out 1act.) RANK 13 NODE 5 --> 6.114748 sigma in 10act. ( 4.15068865 sig out 1act.) RANK 14 NODE 12 --> 5.92276239 sigma in 10act. ( 4.23313141 sig out 1act.) RANK 15 NODE 1 --> 4.75880384 sigma in 10act. ( 0.151513249 sig out 1act.) sorted by output significance RANK 1 NODE 4 --> 33.0036736 sigma out 1act.( 28.1515884 sig in 10act.) RANK 2 NODE 7 --> 19.0205936 sigma out 1act.( 19.5483608 sig in 10act.) RANK 3 NODE 3 --> 16.7591534 sigma out 1act.( 18.986311 sig in 10act.) RANK 4 NODE 2 --> 12.8217278 sigma out 1act.( 13.5056772 sig in 10act.) RANK 5 NODE 15 --> 12.091548 sigma out 1act.( 15.0142784 sig in 10act.) RANK 6 NODE 8 --> 12.0271006 sigma out 1act.( 13.2666588 sig in 10act.) RANK 7 NODE 6 --> 11.4265289 sigma out 1act.( 13.3226805 sig in 10act.) RANK 8 NODE 11 --> 8.82891655 sigma out 1act.( 9.85707951 sig in 10act.) RANK 9 NODE 13 --> 8.39417648 sigma out 1act.( 10.5067759 sig in 10act.) 
RANK 10 NODE 9 --> 8.03780842 sigma out 1act.( 9.39916134 sig in 10act.) RANK 11 NODE 10 --> 6.13542223 sigma out 1act.( 7.77623177 sig in 10act.) RANK 12 NODE 14 --> 4.71291351 sigma out 1act.( 6.6737442 sig in 10act.) RANK 13 NODE 12 --> 4.23313141 sigma out 1act.( 5.92276239 sig in 10act.) RANK 14 NODE 5 --> 4.15068865 sigma out 1act.( 6.114748 sig in 10act.) RANK 15 NODE 1 --> 0.151513249 sigma out 1act.( 4.75880384 sig in 10act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 51.2421455 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.521374762 *** contribution from regularisation: 0.00237449282 *** contribution from error: -0.523749232 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -50319.4401 -0.307848155 -19.3607025 EXIT FROM BFGS code NEW_X -50319.4401 -0.307848155 -19.3607025 ENTER BFGS code NEW_X -50319.4401 -0.307848155 -19.3607025 EXIT FROM BFGS code FG_LNSRCH 0. -0.237265185 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.522167265 *** contribution from regularisation: 0.00295508304 *** contribution from error: -0.525122344 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50395.9316 -0.237265185 -61.1357841 EXIT FROM BFGS code FG_LNSRCH 0. -0.271276176 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.523331106 *** contribution from regularisation: 0.00200220849 *** contribution from error: -0.525333285 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50508.2567 -0.271276176 -35.7886696 EXIT FROM BFGS code NEW_X -50508.2567 -0.271276176 -35.7886696 ENTER BFGS code NEW_X -50508.2567 -0.271276176 -35.7886696 EXIT FROM BFGS code FG_LNSRCH 0. -0.23528178 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.524856508 *** contribution from regularisation: 0.00309233018 *** contribution from error: -0.527948856 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50655.4777 -0.23528178 -5.54444647 EXIT FROM BFGS code NEW_X -50655.4777 -0.23528178 -5.54444647 ENTER BFGS code NEW_X -50655.4777 -0.23528178 -5.54444647 EXIT FROM BFGS code FG_LNSRCH 0. -0.240231082 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.525670588 *** contribution from regularisation: 0.00230068783 *** contribution from error: -0.527971268 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50734.0428 -0.240231082 -31.494606 EXIT FROM BFGS code FG_LNSRCH 0. -0.237869754 0. 
--------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.525978327 *** contribution from regularisation: 0.00250076992 *** contribution from error: -0.528479099 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50763.7476 -0.237869754 -16.7654114 EXIT FROM BFGS code NEW_X -50763.7476 -0.237869754 -16.7654114 ENTER BFGS code NEW_X -50763.7476 -0.237869754 -16.7654114 EXIT FROM BFGS code FG_LNSRCH 0. -0.248456955 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.52622813 *** contribution from regularisation: 0.00232015783 *** contribution from error: -0.5285483 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50787.8537 -0.248456955 -4.32560825 EXIT FROM BFGS code NEW_X -50787.8537 -0.248456955 -4.32560825 ENTER BFGS code NEW_X -50787.8537 -0.248456955 -4.32560825 EXIT FROM BFGS code FG_LNSRCH 0. -0.252542943 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.526354074 *** contribution from regularisation: 0.00241738232 *** contribution from error: -0.52877146 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50800.0085 -0.252542943 -1.98889458 EXIT FROM BFGS code NEW_X -50800.0085 -0.252542943 -1.98889458 ENTER BFGS code NEW_X -50800.0085 -0.252542943 -1.98889458 EXIT FROM BFGS code FG_LNSRCH 0. -0.261113912 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.526691914 *** contribution from regularisation: 0.00241004559 *** contribution from error: -0.529101968 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50832.6159 -0.261113912 8.60793972 EXIT FROM BFGS code NEW_X -50832.6159 -0.261113912 8.60793972 ENTER BFGS code NEW_X -50832.6159 -0.261113912 8.60793972 EXIT FROM BFGS code FG_LNSRCH 0. -0.265819937 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.527063191 *** contribution from regularisation: 0.00249807001 *** contribution from error: -0.529561281 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50868.4486 -0.265819937 -29.8384781 EXIT FROM BFGS code NEW_X -50868.4486 -0.265819937 -29.8384781 ENTER BFGS code NEW_X -50868.4486 -0.265819937 -29.8384781 EXIT FROM BFGS code FG_LNSRCH 0. -0.28340885 0. 
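Editor's note: each "Learn Path" block above reports the loss function as the sum of a "contribution from error" (the entropy loss the teacher was configured with) and a "contribution from regularisation"; for example, at iteration 25 the pieces add up as -0.530915916 + 0.00264774938 ≈ -0.528268158. The loss is minimised by a BFGS quasi-Newton optimiser with line searches (the ENTER/EXIT BFGS ... FG_LNSRCH / NEW_X lines). The sketch below reproduces that setup on a toy one-hidden-layer network with SciPy's BFGS as a stand-in for the NeuroBayes-internal implementation; the topology numbers echo the log, but everything else is synthetic.

```python
# Toy version of the quantity each "Learn Path" block reports:
#   loss function = contribution from error (entropy) + contribution from regularisation,
# minimised with a BFGS optimiser.  Not the NeuroBayes code.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n_in, n_hid = 9, 15                     # 9 kept inputs, 15 hidden nodes, 1 output node
X = rng.normal(size=(1000, n_in))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(float)

def unpack(w):
    w1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    w2 = w[n_in * n_hid + n_hid:-1]
    b2 = w[-1]
    return w1, b1, w2, b2

def loss(w, lam=1e-3):
    w1, b1, w2, b2 = unpack(w)
    hidden = np.tanh(X @ w1 + b1)
    p = 1.0 / (1.0 + np.exp(-(hidden @ w2 + b2)))             # network output in (0, 1)
    error = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    regularisation = lam * np.sum(w * w)
    return error + regularisation                             # the two pieces the log quotes

w0 = rng.normal(scale=0.1, size=n_in * n_hid + n_hid + n_hid + 1)
result = minimize(loss, w0, method="BFGS", options={"maxiter": 50})
print(result.fun, result.nit)
```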
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 57.5470924 sigma out 15 active outputs RANK 2 NODE 8 --> 38.0370293 sigma out 15 active outputs RANK 3 NODE 2 --> 37.1531105 sigma out 15 active outputs RANK 4 NODE 9 --> 23.1245747 sigma out 15 active outputs RANK 5 NODE 1 --> 22.9613609 sigma out 15 active outputs RANK 6 NODE 7 --> 22.1297588 sigma out 15 active outputs RANK 7 NODE 5 --> 21.8328972 sigma out 15 active outputs RANK 8 NODE 4 --> 21.3179321 sigma out 15 active outputs RANK 9 NODE 3 --> 11.1810894 sigma out 15 active outputs RANK 10 NODE 6 --> 6.86984015 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 4 --> 79.5592728 sigma in 10act. ( 84.4893036 sig out 1act.) RANK 2 NODE 7 --> 25.5867081 sigma in 10act. ( 27.6907158 sig out 1act.) RANK 3 NODE 2 --> 21.2742043 sigma in 10act. ( 25.7726078 sig out 1act.) RANK 4 NODE 3 --> 20.1156769 sigma in 10act. ( 18.6437988 sig out 1act.) RANK 5 NODE 9 --> 14.792572 sigma in 10act. ( 16.2692394 sig out 1act.) RANK 6 NODE 6 --> 14.7440128 sigma in 10act. ( 16.3084793 sig out 1act.) RANK 7 NODE 8 --> 12.4769478 sigma in 10act. ( 11.3062229 sig out 1act.) RANK 8 NODE 15 --> 11.9896402 sigma in 10act. ( 11.4494514 sig out 1act.) RANK 9 NODE 11 --> 10.4869671 sigma in 10act. ( 11.2855015 sig out 1act.) RANK 10 NODE 12 --> 5.41382122 sigma in 10act. ( 4.64263391 sig out 1act.) RANK 11 NODE 1 --> 4.40891886 sigma in 10act. ( 4.07014799 sig out 1act.) RANK 12 NODE 10 --> 4.35738707 sigma in 10act. ( 1.1660043 sig out 1act.) RANK 13 NODE 14 --> 4.32295132 sigma in 10act. ( 2.98773217 sig out 1act.) RANK 14 NODE 13 --> 3.70357084 sigma in 10act. ( 2.19574189 sig out 1act.) RANK 15 NODE 5 --> 2.55692768 sigma in 10act. ( 0.404438704 sig out 1act.) sorted by output significance RANK 1 NODE 4 --> 84.4893036 sigma out 1act.( 79.5592728 sig in 10act.) RANK 2 NODE 7 --> 27.6907158 sigma out 1act.( 25.5867081 sig in 10act.) RANK 3 NODE 2 --> 25.7726078 sigma out 1act.( 21.2742043 sig in 10act.) RANK 4 NODE 3 --> 18.6437988 sigma out 1act.( 20.1156769 sig in 10act.) RANK 5 NODE 6 --> 16.3084793 sigma out 1act.( 14.7440128 sig in 10act.) RANK 6 NODE 9 --> 16.2692394 sigma out 1act.( 14.792572 sig in 10act.) RANK 7 NODE 15 --> 11.4494514 sigma out 1act.( 11.9896402 sig in 10act.) RANK 8 NODE 8 --> 11.3062229 sigma out 1act.( 12.4769478 sig in 10act.) RANK 9 NODE 11 --> 11.2855015 sigma out 1act.( 10.4869671 sig in 10act.) RANK 10 NODE 12 --> 4.64263391 sigma out 1act.( 5.41382122 sig in 10act.) RANK 11 NODE 1 --> 4.07014799 sigma out 1act.( 4.40891886 sig in 10act.) RANK 12 NODE 14 --> 2.98773217 sigma out 1act.( 4.32295132 sig in 10act.) RANK 13 NODE 13 --> 2.19574189 sigma out 1act.( 3.70357084 sig in 10act.) RANK 14 NODE 10 --> 1.1660043 sigma out 1act.( 4.35738707 sig in 10act.) RANK 15 NODE 5 --> 0.404438704 sigma out 1act.( 2.55692768 sig in 10act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 99.4352341 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.527336478 *** contribution from regularisation: 0.00256407331 *** contribution from error: -0.529900551 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -50894.8242 -0.28340885 47.5170784 EXIT FROM BFGS code NEW_X -50894.8242 -0.28340885 47.5170784 ENTER BFGS code NEW_X -50894.8242 -0.28340885 47.5170784 EXIT FROM BFGS code FG_LNSRCH 0. -0.274686188 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.527571678 *** contribution from regularisation: 0.00255482923 *** contribution from error: -0.530126512 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50917.5249 -0.274686188 28.3661118 EXIT FROM BFGS code NEW_X -50917.5249 -0.274686188 28.3661118 ENTER BFGS code NEW_X -50917.5249 -0.274686188 28.3661118 EXIT FROM BFGS code FG_LNSRCH 0. -0.264149308 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.527710855 *** contribution from regularisation: 0.00255343947 *** contribution from error: -0.530264318 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50930.9601 -0.264149308 -2.78772712 EXIT FROM BFGS code NEW_X -50930.9601 -0.264149308 -2.78772712 ENTER BFGS code NEW_X -50930.9601 -0.264149308 -2.78772712 EXIT FROM BFGS code FG_LNSRCH 0. -0.253170729 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.527894437 *** contribution from regularisation: 0.00259113056 *** contribution from error: -0.53048557 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50948.6757 -0.253170729 -4.66758299 EXIT FROM BFGS code NEW_X -50948.6757 -0.253170729 -4.66758299 ENTER BFGS code NEW_X -50948.6757 -0.253170729 -4.66758299 EXIT FROM BFGS code FG_LNSRCH 0. -0.233213753 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.528143287 *** contribution from regularisation: 0.00262146746 *** contribution from error: -0.530764759 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50972.6927 -0.233213753 -38.6783676 EXIT FROM BFGS code NEW_X -50972.6927 -0.233213753 -38.6783676 ENTER BFGS code NEW_X -50972.6927 -0.233213753 -38.6783676 EXIT FROM BFGS code FG_LNSRCH 0. -0.227205604 0. 
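Editor's note: every tenth iteration the teacher writes the current network to rescue.nb ("SAVING EXPERTISE TO rescue.nb" at iterations 10, 20, 30, ...), so a long run can be recovered if it stops early. Below is a minimal sketch of that periodic-checkpoint pattern; the file written here is a plain NumPy archive, not a NeuroBayes expertise file, and the "update" is a placeholder.

```python
# Sketch of periodic checkpointing inside a training loop: save the current
# parameters every tenth iteration so the run can be resumed after a crash.
import numpy as np

def train(n_iter=250, checkpoint_every=10, path="rescue.npz"):
    rng = np.random.default_rng(0)
    weights = rng.normal(scale=0.1, size=166)                  # placeholder network parameters
    for it in range(1, n_iter + 1):
        weights -= 0.001 * rng.normal(size=weights.size)       # stand-in for one optimiser step
        if it % checkpoint_every == 0:
            np.savez(path, iteration=it, weights=weights)      # "write out current network"
    return weights

train(n_iter=35)
```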
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.528268158 *** contribution from regularisation: 0.00264774938 *** contribution from error: -0.530915916 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50984.7431 -0.227205604 2.63541412 EXIT FROM BFGS code NEW_X -50984.7431 -0.227205604 2.63541412 ENTER BFGS code NEW_X -50984.7431 -0.227205604 2.63541412 EXIT FROM BFGS code FG_LNSRCH 0. -0.22892864 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.528299749 *** contribution from regularisation: 0.00264744065 *** contribution from error: -0.530947208 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50987.7926 -0.22892864 28.2016029 EXIT FROM BFGS code NEW_X -50987.7926 -0.22892864 28.2016029 ENTER BFGS code NEW_X -50987.7926 -0.22892864 28.2016029 EXIT FROM BFGS code FG_LNSRCH 0. -0.225314289 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.528318226 *** contribution from regularisation: 0.00263157114 *** contribution from error: -0.530949771 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50989.5753 -0.225314289 25.0698605 EXIT FROM BFGS code NEW_X -50989.5753 -0.225314289 25.0698605 ENTER BFGS code NEW_X -50989.5753 -0.225314289 25.0698605 EXIT FROM BFGS code FG_LNSRCH 0. -0.202279985 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.528443813 *** contribution from regularisation: 0.00240841857 *** contribution from error: -0.530852258 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -51001.6951 -0.202279985 32.4000664 EXIT FROM BFGS code NEW_X -51001.6951 -0.202279985 32.4000664 ENTER BFGS code NEW_X -51001.6951 -0.202279985 32.4000664 EXIT FROM BFGS code FG_LNSRCH 0. -0.139263928 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.52660042 *** contribution from regularisation: 0.00235702051 *** contribution from error: -0.528957427 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -50823.7845 -0.139263928 -321.574341 EXIT FROM BFGS code FG_LNSRCH 0. -0.197942033 0. --------------------------------------------------- Iteration : 30 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 62.0779533 sigma out 15 active outputs RANK 2 NODE 8 --> 40.649765 sigma out 15 active outputs RANK 3 NODE 2 --> 39.0032806 sigma out 15 active outputs RANK 4 NODE 9 --> 31.4134808 sigma out 15 active outputs RANK 5 NODE 1 --> 28.688427 sigma out 15 active outputs RANK 6 NODE 5 --> 25.7038937 sigma out 15 active outputs RANK 7 NODE 4 --> 24.8376751 sigma out 15 active outputs RANK 8 NODE 7 --> 15.7892466 sigma out 15 active outputs RANK 9 NODE 3 --> 13.6125135 sigma out 15 active outputs RANK 10 NODE 6 --> 7.78592682 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 4 --> 84.8036194 sigma in 10act. ( 91.180603 sig out 1act.) 
RANK 2 NODE 7 --> 31.0367241 sigma in 10act. ( 32.7472878 sig out 1act.)
RANK 3 NODE 2 --> 26.2494774 sigma in 10act. ( 31.4638557 sig out 1act.)
RANK 4 NODE 3 --> 23.1986141 sigma in 10act. ( 22.3473129 sig out 1act.)
RANK 5 NODE 9 --> 17.2753601 sigma in 10act. ( 19.2689419 sig out 1act.)
RANK 6 NODE 6 --> 15.7883863 sigma in 10act. ( 17.5728168 sig out 1act.)
RANK 7 NODE 8 --> 15.4710073 sigma in 10act. ( 14.7972317 sig out 1act.)
RANK 8 NODE 15 --> 13.7202778 sigma in 10act. ( 14.0888481 sig out 1act.)
RANK 9 NODE 11 --> 12.5388775 sigma in 10act. ( 13.5405159 sig out 1act.)
RANK 10 NODE 12 --> 6.9275589 sigma in 10act. ( 6.59451199 sig out 1act.)
RANK 11 NODE 10 --> 3.63933253 sigma in 10act. ( 1.2774682 sig out 1act.)
RANK 12 NODE 1 --> 3.47560358 sigma in 10act. ( 3.0005219 sig out 1act.)
RANK 13 NODE 14 --> 3.36273432 sigma in 10act. ( 2.36040783 sig out 1act.)
RANK 14 NODE 13 --> 2.64325047 sigma in 10act. ( 1.41200376 sig out 1act.)
RANK 15 NODE 5 --> 1.92371285 sigma in 10act. ( 0.287572563 sig out 1act.)
sorted by output significance
RANK 1 NODE 4 --> 91.180603 sigma out 1act.( 84.8036194 sig in 10act.)
RANK 2 NODE 7 --> 32.7472878 sigma out 1act.( 31.0367241 sig in 10act.)
RANK 3 NODE 2 --> 31.4638557 sigma out 1act.( 26.2494774 sig in 10act.)
RANK 4 NODE 3 --> 22.3473129 sigma out 1act.( 23.1986141 sig in 10act.)
RANK 5 NODE 9 --> 19.2689419 sigma out 1act.( 17.2753601 sig in 10act.)
RANK 6 NODE 6 --> 17.5728168 sigma out 1act.( 15.7883863 sig in 10act.)
RANK 7 NODE 8 --> 14.7972317 sigma out 1act.( 15.4710073 sig in 10act.)
RANK 8 NODE 15 --> 14.0888481 sigma out 1act.( 13.7202778 sig in 10act.)
RANK 9 NODE 11 --> 13.5405159 sigma out 1act.( 12.5388775 sig in 10act.)
RANK 10 NODE 12 --> 6.59451199 sigma out 1act.( 6.9275589 sig in 10act.)
RANK 11 NODE 1 --> 3.0005219 sigma out 1act.( 3.47560358 sig in 10act.)
RANK 12 NODE 14 --> 2.36040783 sigma out 1act.( 3.36273432 sig in 10act.)
RANK 13 NODE 13 --> 1.41200376 sigma out 1act.( 2.64325047 sig in 10act.)
RANK 14 NODE 10 --> 1.2774682 sigma out 1act.( 3.63933253 sig in 10act.)
RANK 15 NODE 5 --> 0.287572563 sigma out 1act.( 1.92371285 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 110.536552 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.527871966
*** contribution from regularisation: 0.00298205111
*** contribution from error: -0.530854046
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -50946.5053 -0.197942033 11.6590939
EXIT FROM BFGS code FG_LNSRCH 0. -0.202240482 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.528251529
*** contribution from regularisation: 0.00260075601
*** contribution from error: -0.530852258
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -50983.1387 -0.202240482 32.1472778
EXIT FROM BFGS code FG_LNSRCH 0. -0.20227997 0.
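Editor's note on the rescue dumps: at the "Iteration No: 20" and "Iteration No: 30" blocks above the teacher writes the current network out to "rescue.nb", apparently every tenth iteration, so an interrupted run can be resumed from the last dump; the final network is exported separately at the end. The checkpointing pattern is sketched below with numpy archives as a stand-in, since the real rescue.nb/expert.nb files are in NeuroBayes' own expertise format.

import numpy as np

def save_expertise(path, layers):
    """Write the current network state; the numpy archive is only a stand-in format."""
    np.savez(path, **{f"layer{i}": w for i, w in enumerate(layers)})
    print(f"SAVING EXPERTISE TO {path}")

layers = [np.zeros((15, 15)), np.zeros((15, 1))]   # hypothetical layer weights
for iteration in range(1, 251):
    # ... one optimisation step on the training loss would go here ...
    if iteration % 10 == 0:
        save_expertise("rescue.npz", layers)       # rolling rescue checkpoint
save_expertise("expert.npz", layers)               # final expertise after training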
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.528213918
*** contribution from regularisation: 0.00263826712
*** contribution from error: -0.530852199
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -50979.5117 -0.20227997 32.2982445
EXIT FROM BFGS code FG_LNSRCH 0. -0.202279985 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.528226316
*** contribution from regularisation: 0.00262588868
*** contribution from error: -0.530852199
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -50980.7064 -0.202279985 32.2823792
EXIT FROM BFGS code FG_LNSRCH 0. -0.202279985 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.52822417
*** contribution from regularisation: 0.00262803794
*** contribution from error: -0.530852199
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -50980.499 -0.202279985 32.2692642
EXIT FROM BFGS code FG_LNSRCH 0. -0.202279985 0.
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.528218627
*** contribution from regularisation: 0.00263360585
*** contribution from error: -0.530852258
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -50979.9616 -0.202279985 32.2173309
EXIT FROM BFGS code NEW_X -50979.9616 -0.202279985 32.2173309
ENTER BFGS code NEW_X -50979.9616 -0.202279985 32.2173309
EXIT FROM BFGS code CONVERGENC -50979.9616 -0.202279985 32.2173309
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 10 --> 99.4924927 sigma out 15 active outputs
RANK 2 NODE 8 --> 65.4299774 sigma out 15 active outputs
RANK 3 NODE 2 --> 63.9176712 sigma out 15 active outputs
RANK 4 NODE 9 --> 51.7011681 sigma out 15 active outputs
RANK 5 NODE 1 --> 45.400425 sigma out 15 active outputs
RANK 6 NODE 5 --> 40.9611206 sigma out 15 active outputs
RANK 7 NODE 4 --> 39.0646858 sigma out 15 active outputs
RANK 8 NODE 7 --> 24.8143787 sigma out 15 active outputs
RANK 9 NODE 3 --> 22.4746075 sigma out 15 active outputs
RANK 10 NODE 6 --> 12.8279333 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 4 --> 137.806625 sigma in 10act. ( 148.691788 sig out 1act.)
RANK 2 NODE 7 --> 49.2209244 sigma in 10act. ( 54.0619392 sig out 1act.)
RANK 3 NODE 2 --> 41.4407883 sigma in 10act. ( 51.9138451 sig out 1act.)
RANK 4 NODE 3 --> 36.6152573 sigma in 10act. ( 36.1215363 sig out 1act.)
RANK 5 NODE 9 --> 27.5794373 sigma in 10act. ( 31.2050476 sig out 1act.)
RANK 6 NODE 6 --> 25.2339363 sigma in 10act. ( 28.699482 sig out 1act.)
RANK 7 NODE 8 --> 24.1537724 sigma in 10act. ( 23.6065121 sig out 1act.)
RANK 8 NODE 15 --> 21.6316719 sigma in 10act. ( 22.6793747 sig out 1act.)
RANK 9 NODE 11 --> 19.6073914 sigma in 10act. ( 21.710516 sig out 1act.)
RANK 10 NODE 12 --> 10.3686934 sigma in 10act. ( 10.4285851 sig out 1act.)
RANK 11 NODE 1 --> 5.03639317 sigma in 10act. ( 5.00287676 sig out 1act.)
RANK 12 NODE 14 --> 4.68241882 sigma in 10act. ( 4.01102734 sig out 1act.)
RANK 13 NODE 10 --> 3.7441237 sigma in 10act. ( 2.14500761 sig out 1act.)
RANK 14 NODE 13 --> 3.4739151 sigma in 10act. ( 2.28517795 sig out 1act.)
RANK 15 NODE 5 --> 1.83580732 sigma in 10act. ( 0.480659902 sig out 1act.)
sorted by output significance
RANK 1 NODE 4 --> 148.691788 sigma out 1act.( 137.806625 sig in 10act.)
RANK 2 NODE 7 --> 54.0619392 sigma out 1act.( 49.2209244 sig in 10act.)
RANK 3 NODE 2 --> 51.9138451 sigma out 1act.( 41.4407883 sig in 10act.)
RANK 4 NODE 3 --> 36.1215363 sigma out 1act.( 36.6152573 sig in 10act.)
RANK 5 NODE 9 --> 31.2050476 sigma out 1act.( 27.5794373 sig in 10act.)
RANK 6 NODE 6 --> 28.699482 sigma out 1act.( 25.2339363 sig in 10act.)
RANK 7 NODE 8 --> 23.6065121 sigma out 1act.( 24.1537724 sig in 10act.)
RANK 8 NODE 15 --> 22.6793747 sigma out 1act.( 21.6316719 sig in 10act.)
RANK 9 NODE 11 --> 21.710516 sigma out 1act.( 19.6073914 sig in 10act.)
RANK 10 NODE 12 --> 10.4285851 sigma out 1act.( 10.3686934 sig in 10act.)
RANK 11 NODE 1 --> 5.00287676 sigma out 1act.( 5.03639317 sig in 10act.)
RANK 12 NODE 14 --> 4.01102734 sigma out 1act.( 4.68241882 sig in 10act.)
RANK 13 NODE 13 --> 2.28517795 sigma out 1act.( 3.4739151 sig in 10act.)
RANK 14 NODE 10 --> 2.14500761 sigma out 1act.( 3.7441237 sig in 10act.)
RANK 15 NODE 5 --> 0.480659902 sigma out 1act.( 1.83580732 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 180.36647 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.52822417
*** contribution from regularisation: 0.00262806099
*** contribution from error: -0.530852258
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 28444
Closing output file
done
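Editor's note on the early stop: over iterations 32-35 the loss is frozen at about -0.52822, changing only in the sixth decimal place, after which the driver reports CONVERGENC and the log jumps straight to the closing summary labelled iteration 250. A typical stopping rule of this kind (assumed here, since the actual tolerance is not visible in the log) halts when the relative decrease of the objective falls below a threshold:

def converged(f_prev, f_curr, tol=1e-5):
    """Stop when the relative change of the objective falls below tol (assumed rule)."""
    return abs(f_prev - f_curr) / max(abs(f_prev), abs(f_curr), 1.0) <= tol

# Loss values reported for Learn Paths 32-35 above.
history = [-0.528213918, -0.528226316, -0.52822417, -0.528218627]
for f_prev, f_curr in zip(history, history[1:]):
    print(f"{f_prev:.9f} -> {f_curr:.9f}  converged: {converged(f_prev, f_curr)}")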
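Because every iteration prints a "Learn Path" block with its loss, the training curve can be recovered directly from a saved copy of this log, which is useful for comparing trainings or spotting plateaus. A small parser along these lines does the extraction; the log file name is a placeholder for wherever the output above was captured.

import re

pattern = re.compile(r"Learn Path\s+(\d+).*?loss function:\s*(-?[\d.]+)", re.S)

with open("teacher.log") as fh:            # hypothetical path to the captured log
    blocks = pattern.findall(fh.read())

curve = [(int(n), float(loss)) for n, loss in blocks]
for n, loss in curve:
    print(f"iteration {n:3d}  loss {loss:.9f}")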