NNInput NNInputs_110.root
Options for steering
  Constraint : lep1_E<400&&lep2_E<400&&
  HiLoSbString : SB
  SbString : Target
  WeightString : TrainWeight
  EqualizeSB : 0
  EvaluateVariables : 0
  SetNBProcessingDefault : 1
  UseNeuroBayes : 1
  WeightEvents : 1
  NBTreePrepEvPrint : 1
  NBTreePrepReportInterval : 10000
  NB_Iter : 250
  NBZero999Tol : 0.001
**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters :
Info: No file info in tree
NNAna::CopyTree: entries= 195024 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1
                 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 46682 nbkg = 148342
Bkg Entries: 148342
Sig Entries: 46682
Chosen entries: 46682
Signal fraction: 1
Background fraction: 0.314692
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 148342
Actual Signal Entries: 46682
Entries to split: 46682
Test with : 23341
Train with : 23341
*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag.
This is only permitted with a developers licence.
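The counts above are internally consistent: 46682/148342 ≈ 0.314692 is the quoted background fraction, and the 46682 chosen entries are divided evenly between a test half and a training half. A minimal Python sketch of that bookkeeping (reading "Chosen entries" as the size of the smaller class is an inference from these numbers, not something the log states):

```python
# Bookkeeping reported by NNAna::CopyTree, with the counts taken from the log.
nsig, nbkg = 46682, 148342           # events passing lep1_E<400 && lep2_E<400, split by Target

chosen = min(nsig, nbkg)             # "Chosen entries" (assumed: size of the smaller class)
print("Signal fraction    :", chosen / nsig)              # 1.0
print("Background fraction:", round(chosen / nbkg, 6))    # 0.314692

# The chosen entries are split evenly into an independent test half and a training half.
n_test, n_train = chosen // 2, chosen - chosen // 2
print("Entries to split:", chosen, "-> Test:", n_test, " Train:", n_train)
```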
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
  Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
  Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
  Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
  Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
  Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
  Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
  Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 46682 for Signal
Prepared event 0 for Signal with 46682 events
====Entry 0
  Variable Ht : 103.037
  Variable LepAPt : 29.4151
  Variable LepBPt : 16.2907
  Variable MetSigLeptonsJets : 3.90375
  Variable MetSpec : 32.7321
  Variable SumEtLeptonsJets : 70.3044
  Variable VSumJetLeptonsPt : 47.5713
  Variable addEt : 78.4382
  Variable dPhiLepSumMet : 2.69477
  Variable dPhiLeptons : 0.316822
  Variable dRLeptons : 0.339778
  Variable lep1_E : 29.4179
  Variable lep2_E : 16.3887
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0
  DeltaRJet12 = 0   File = 2110   Ht = 103.037   IsMEBase = 0
  LRHWW = 0   LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0
  LepAEt = 29.4154   LepAPt = 29.4151   LepBEt = 16.2907   LepBPt = 16.2907
  LessCentralJetEta = 0   MJ1Lep1 = 0   MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0
  NN = 0   Met = 32.7321   MetDelPhi = 1.84515   MetSig = 2.82942
  MetSigLeptonsJets = 3.90375   MetSpec = 32.7321   Mjj = 0   MostCentralJetEta = 1.57834
  MtllMet = 78.5262   Njets = 1   SB = 0   SumEt = 133.83   SumEtJets = 0
  SumEtLeptonsJets = 70.3044   Target = 1   TrainWeight = 1
  VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 47.5713   addEt = 78.4382
  dPhiLepSumMet = 2.69477   dPhiLeptons = 0.316822   dRLeptons = 0.339778
  diltype = 3   dimass = 7.41356   event = 59   jet1_Et = 24.5987   jet1_eta = 0
  jet2_Et = 0   jet2_eta = 0   lep1_E = 29.4179   lep2_E = 16.3887
  rand = 0.999742   run = 229664   weight = 4.83121e-07
===Show End
Prepared event 10000 for Signal with 46682 events
Prepared event 20000 for Signal with 46682 events
Prepared event 30000 for Signal with 46682 events
Prepared event 40000 for Signal with 46682 events
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
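Each prepared event hands the teacher only the 13 registered training variables plus the target and the event weight, even though the ntuple carries many more branches, as the Show dump above illustrates. A minimal sketch of that projection step; the variable list and the entry-0 values are taken from the log, the function and the dictionary layout are hypothetical:

```python
# The 13 input variables registered with the teacher, in the order they were added.
NB_VARS = ["Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec",
           "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt", "dPhiLepSumMet",
           "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E"]

def make_pattern(event):
    """Project a full event record onto (inputs, target, weight) for the teacher."""
    x = [float(event[name]) for name in NB_VARS]
    return x, int(event["Target"]), float(event["TrainWeight"])

# Entry 0 of the signal tree, copied (abridged) from the Show dump above.
entry0 = {"Ht": 103.037, "LepAPt": 29.4151, "LepBPt": 16.2907,
          "MetSigLeptonsJets": 3.90375, "MetSpec": 32.7321,
          "SumEtLeptonsJets": 70.3044, "VSumJetLeptonsPt": 47.5713,
          "addEt": 78.4382, "dPhiLepSumMet": 2.69477, "dPhiLeptons": 0.316822,
          "dRLeptons": 0.339778, "lep1_E": 29.4179, "lep2_E": 16.3887,
          "Target": 1, "TrainWeight": 1.0}

x, y, w = make_pattern(entry0)
print(len(x), "inputs, target", y, "weight", w)
```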
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 148342 for Background
Prepared event 0 for Background with 148342 events
====Entry 0
  Variable Ht : 85.3408
  Variable LepAPt : 31.1294
  Variable LepBPt : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec : 44.1437
  Variable SumEtLeptonsJets : 41.1959
  Variable VSumJetLeptonsPt : 41.0132
  Variable addEt : 85.3408
  Variable dPhiLepSumMet : 3.10536
  Variable dPhiLeptons : 0.219342
  Variable dRLeptons : 0.424454
  Variable lep1_E : 32.3548
  Variable lep2_E : 10.1027
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0
  DeltaRJet12 = 0   File = 1   Ht = 85.3408   IsMEBase = 0
  LRHWW = 0   LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0
  LepAEt = 31.1297   LepAPt = 31.1294   LepBEt = 10.0674   LepBPt = 10.0664
  LessCentralJetEta = 0   MJ1Lep1 = 0   MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0
  NN = 0   Met = 44.1437   MetDelPhi = 3.01191   MetSig = 3.74558
  MetSigLeptonsJets = 6.87768   MetSpec = 44.1437   Mjj = 0   MostCentralJetEta = 0
  MtllMet = 86.2332   Njets = 0   SB = 0   SumEt = 138.899   SumEtJets = 0
  SumEtLeptonsJets = 41.1959   Target = 0   TrainWeight = 0.281026
  VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 41.0132   addEt = 85.3408
  dPhiLepSumMet = 3.10536   dPhiLeptons = 0.219342   dRLeptons = 0.424454
  diltype = 17   dimass = 7.54723   event = 6717520   jet1_Et = 0   jet1_eta = 0
  jet2_Et = 0   jet2_eta = 0   lep1_E = 32.3548   lep2_E = 10.1027
  rand = 0.999742   run = 271566   weight = 0.00296168
===Show End
Prepared event 10000 for Background with 148342 events
Prepared event 20000 for Background with 148342 events
Prepared event 30000 for Background with 148342 events
Prepared event 40000 for Background with 148342 events
Prepared event 50000 for Background with 148342 events
Prepared event 60000 for Background with 148342 events
Prepared event 70000 for Background with 148342 events
Prepared event 80000 for Background with 148342 events
Prepared event 90000 for Background with 148342 events
Prepared event 100000 for Background with 148342 events
Prepared event 110000 for Background with 148342 events
Prepared event 120000 for Background with 148342 events
Prepared event 130000 for Background with 148342 events
Prepared event 140000 for Background with 148342 events
Warning: found 4715 negative weights.
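Two of the quantities the teacher reports next depend only on the per-event weights handed over here: the count of negative weights and a signal fraction of 62.7 %, which is well above the raw 46682/195024 ≈ 24 % and is therefore presumably computed with the TrainWeight values (that reading is an inference, not stated in the log). A minimal sketch of both checks with synthetic weights:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the per-event weights: 1.0 for every signal event and a
# spread of mostly positive background weights (negative generator weights do occur).
# The numbers only roughly mimic the log; they are not taken from it.
w_sig = np.ones(46682)
w_bkg = 0.187 * (1.0 + 0.55 * rng.standard_normal(148342))

n_negative = int((w_sig < 0).sum() + (w_bkg < 0).sum())
print("Warning: found", n_negative, "negative weights.")

# Weighted signal fraction, presumably the quantity reported as "Signal fraction" below.
frac = w_sig.sum() / (w_sig.sum() + w_bkg.sum())
print("Signal fraction: %.1f %%" % (100.0 * frac))
```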
[Phi-T ASCII-art banner]
Phi-T(R) NeuroBayes(R) Teacher
Algorithms by Michael Feindt
Implementation by Phi-T Project 2001-2003
Copyright Phi-T GmbH
Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000  NB_MAXNODE = 100
-----------------------------------
found 195024 samples to learn from
preprocessing flags/parameters:
  global preprocessing flag: 812
  individual preprocessing:
now perform preprocessing
*** called with option 12 ***
*** This will do for you:
***   input variable equalisation
***   to Gaussian distribution with mean=0 and sigma=1
***   Then variables are decorrelated
************************************
Warning: found 4715 negative weights.
Signal fraction: 62.7027435 %
------------------------------
Transdef: per-variable transformation tables (100 bin boundaries each); the full
tables are omitted here, only the variable index and value range are kept:
  variable  1 : values -1. and 1. only (binary)
  variable  2 : 60.0004501     .. 717.948669
  variable  3 : 20.0000362     .. 228.385986
  variable  4 : 10.0001202     .. 98.2854004
  variable  5 : 1.13823473     .. 18.1234589
  variable  6 : 15.0050735     .. 290.979675
  variable  7 : 30.1476021     .. 573.509644
  variable  8 : 1.197348       .. 280.344635
  variable  9 : 47.5967369     .. 512.95929
  variable 10 : 0.0111021781   .. 3.14158916
  variable 11 : 6.91413879E-06 .. 1.1301049
  variable 12 : 0.100036338    .. 1.13539624
  variable 13 : 20.0144272     .. 381.504089
  variable 14 : 10.0058851     .. 122.221924
------------------------------
COVARIANCE MATRIX (IN PERCENT)
        1      2      3      4      5      6      7      8      9     10     11     12     13     14
  0   1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
  1 100.0   51.5   40.1   16.9   -2.5   23.2   51.2   31.2   45.4  -32.5    0.6   -2.8   13.3   -7.2
  2  51.5  100.0   64.5   41.0    5.3   50.9   93.8   57.0   89.1  -53.5  -14.7  -26.3   37.0   16.0
  3  40.1   64.5  100.0   18.2  -13.4   19.8   68.1   43.0   70.5  -22.0  -15.2  -29.9   68.4    2.6
  4  16.9   41.0   18.2  100.0   -7.3   12.9   43.7   28.4   44.5  -13.8  -19.8  -34.6    9.5   69.3
  5  -2.5    5.3  -13.4   -7.3  100.0   78.0  -22.4   38.3   34.4   35.9    0.9    3.0  -15.3   -8.5
  6  23.2   50.9   19.8   12.9   78.0  100.0   27.9   66.5   69.2    0.8   -6.3   -9.8    5.9    0.6
  7  51.2   93.8   68.1   43.7  -22.4   27.9  100.0   51.3   76.7  -59.3  -13.8  -26.1   41.8   19.4
  8  31.2   57.0   43.0   28.4   38.3   66.5   51.3  100.0   70.6   -7.3  -14.2  -21.7   24.1   11.3
  9  45.4   89.1   70.5   44.5   34.4   69.2   76.7   70.6  100.0  -27.0  -17.4  -31.8   41.9   19.2
 10 -32.5  -53.5  -22.0  -13.8   35.9    0.8  -59.3   -7.3  -27.0  100.0    2.1    4.7  -11.4   -3.0
 11   0.6  -14.7  -15.2  -19.8    0.9   -6.3  -13.8  -14.2  -17.4    2.1  100.0   60.4   -6.6  -11.1
 12  -2.8  -26.3  -29.9  -34.6    3.0   -9.8  -26.1  -21.7  -31.8    4.7   60.4  100.0  -12.9  -14.3
 13  13.3   37.0   68.4    9.5  -15.3    5.9   41.8   24.1   41.9  -11.4   -6.6  -12.9  100.0   35.9
 14  -7.2   16.0    2.6   69.3   -8.5    0.6   19.4   11.3   19.2   -3.0  -11.1  -14.3   35.9  100.0
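The Transdef tables are the per-variable transformation tables behind preprocessing option 12: each input is mapped through its own cumulative distribution (the 100 quantile boundaries summarized above) onto a flat distribution and then onto a Gaussian with mean 0 and sigma 1, after which the whole set of variables is decorrelated. Judging by the value ranges, variable 1 appears to be the training target and variables 2-14 the thirteen inputs in the order they were registered, though the log does not say so explicitly. A minimal NumPy sketch of the same kind of transformation (not NeuroBayes code, just the idea):

```python
import numpy as np
from scipy.stats import norm   # only used for the inverse Gaussian CDF

rng = np.random.default_rng(2)

# Toy data: n events, 3 correlated, non-Gaussian input variables.
n = 20000
base = rng.exponential(1.0, (n, 1))
X = np.hstack([base + 0.2 * rng.exponential(1.0, (n, 1)) for _ in range(3)])

# Step 1: "equalisation" -- map each variable through its empirical CDF to (0,1),
# then through the inverse normal CDF to a Gaussian with mean 0, sigma 1.
# (NeuroBayes tabulates 100 quantile boundaries per variable; ranks do the same job here.)
ranks = X.argsort(axis=0).argsort(axis=0)
U = (ranks + 0.5) / n
G = norm.ppf(U)

# Step 2: decorrelate the Gaussianized variables (whitening with the inverse
# square root of their covariance matrix).
C = np.cov(G, rowvar=False)
evals, evecs = np.linalg.eigh(C)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T
Z = G @ W

print("covariance after decorrelation:\n", np.round(np.cov(Z, rowvar=False), 3))
```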
TOTAL CORRELATION TO TARGET (diagonal) 109.735735
TOTAL CORRELATION OF ALL VARIABLES 57.944199
ROUND  1: MAX CORR ( 57.9441858) AFTER KILLING INPUT VARIABLE  9  CONTR  0.0390456093
ROUND  2: MAX CORR ( 57.9439967) AFTER KILLING INPUT VARIABLE  5  CONTR  0.148037047
ROUND  3: MAX CORR ( 57.9394916) AFTER KILLING INPUT VARIABLE 11  CONTR  0.722546066
ROUND  4: MAX CORR ( 57.9273082) AFTER KILLING INPUT VARIABLE  6  CONTR  1.18812535
ROUND  5: MAX CORR ( 57.7975142) AFTER KILLING INPUT VARIABLE  7  CONTR  3.87561529
ROUND  6: MAX CORR ( 57.6447564) AFTER KILLING INPUT VARIABLE  8  CONTR  4.19937013
ROUND  7: MAX CORR ( 57.4150409) AFTER KILLING INPUT VARIABLE 13  CONTR  5.14110996
ROUND  8: MAX CORR ( 57.010474)  AFTER KILLING INPUT VARIABLE 10  CONTR  6.80387986
ROUND  9: MAX CORR ( 55.940943)  AFTER KILLING INPUT VARIABLE  3  CONTR 10.9911348
ROUND 10: MAX CORR ( 54.6482433) AFTER KILLING INPUT VARIABLE  4  CONTR 11.9565298
ROUND 11: MAX CORR ( 53.8128213) AFTER KILLING INPUT VARIABLE 12  CONTR  9.5189687
ROUND 12: MAX CORR ( 51.4862509) AFTER KILLING INPUT VARIABLE 14  CONTR 15.6520189
LAST REMAINING VARIABLE: 2
total correlation to target: 57.944199 %
total significance: 112.197648 sigma
correlations of single variables to target:
  variable  2:  51.4862509   % , in sigma: 99.6930906
  variable  3:  40.0679783   % , in sigma: 77.583831
  variable  4:  16.9254933   % , in sigma: 32.7729191
  variable  5:  -2.52937952  % , in sigma:  4.89765047
  variable  6:  23.2319382   % , in sigma: 44.9841205
  variable  7:  51.1821992   % , in sigma: 99.1043537
  variable  8:  31.2042492   % , in sigma: 60.420947
  variable  9:  45.4294364   % , in sigma: 87.9652497
  variable 10: -32.5286984   % , in sigma: 62.9854849
  variable 11:   0.639128344 % , in sigma:  1.23754747
  variable 12:  -2.77655282  % , in sigma:  5.3762534
  variable 13:  13.3246696   % , in sigma: 25.8006257
  variable 14:  -7.21972296  % , in sigma: 13.9795864
variables sorted by significance:
   1 most relevant variable  2  corr 51.4862518   , in sigma: 99.6930923
   2 most relevant variable 14  corr 15.6520185   , in sigma: 30.3070834
   3 most relevant variable 12  corr  9.51896858  , in sigma: 18.4316275
   4 most relevant variable  4  corr 11.9565296   , in sigma: 23.1514893
   5 most relevant variable  3  corr 10.9911346   , in sigma: 21.2821901
   6 most relevant variable 10  corr  6.80387974  , in sigma: 13.1743871
   7 most relevant variable 13  corr  5.14110994  , in sigma:  9.95475743
   8 most relevant variable  8  corr  4.19936991  , in sigma:  8.13126139
   9 most relevant variable  7  corr  3.87561536  , in sigma:  7.50437381
  10 most relevant variable  6  corr  1.18812537  , in sigma:  2.30057322
  11 most relevant variable 11  corr  0.722546041 , in sigma:  1.39906959
  12 most relevant variable  5  corr  0.148037046 , in sigma:  0.286644888
  13 most relevant variable  9  corr  0.0390456095 , in sigma: 0.0756042131
global correlations between input variables:
  variable  2: 98.8810117 %
  variable  3: 93.8850716 %
  variable  4: 88.7368388 %
  variable  5: 94.526143 %
  variable  6: 92.4869913 %
  variable  7: 98.4359077 %
  variable  8: 82.4543785 %
  variable  9: 98.5186386 %
  variable 10: 73.1138407 %
  variable 11: 61.0384907 %
  variable 12: 68.1411805 %
  variable 13: 85.3426223 %
  variable 14: 86.094769 %
significance loss when removing single variables:
  variable  2: corr =  2.69915607  % , sigma =  5.22638967
  variable  3: corr =  7.65272598  % , sigma = 14.8180124
  variable  4: corr =  7.11673941  % , sigma = 13.7801789
  variable  5: corr =  0.149208528 % , sigma =  0.288913235
  variable  6: corr =  0.833277386 % , sigma =  1.61347926
  variable  7: corr =  3.02035243  % , sigma =  5.84832384
  variable  8: corr =  2.36193444  % , sigma =  4.57342571
  variable  9: corr =  0.0390456093 % , sigma = 0.0756042127
  variable 10: corr =  5.50492841  % , sigma = 10.659221
  variable 11: corr =  0.718772528 % , sigma =  1.39176292
  variable 12: corr = 11.1437781   % , sigma = 21.5777545
  variable 13: corr =  5.25191643  % , sigma = 10.1693126
  variable 14: corr =  9.90181648  % , sigma = 19.1729378
Keep only 9 most significant input variables
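The ROUND lines above are a greedy backward elimination: in each round the variable whose removal costs the least total correlation to the target is "killed", the cost at removal time becomes its significance, and in the end only the 9 most significant inputs are kept. A small sketch of that selection logic on an arbitrary correlation matrix; the `total_correlation` formula (the multiple-correlation coefficient) is my reading of NeuroBayes' "total correlation of all variables" and is an assumption:

```python
import numpy as np

def total_correlation(C, idx):
    """Multiple correlation of the target (row/column 0 of C) with the inputs in idx.

    Assumed reading of "total correlation of all variables": sqrt(c^T Cxx^{-1} c).
    """
    idx = list(idx)
    c = C[0, idx]
    Cxx = C[np.ix_(idx, idx)]
    return float(np.sqrt(c @ np.linalg.solve(Cxx, c)))

def backward_eliminate(C, n_keep):
    """Greedily drop the variable whose removal loses the least total correlation."""
    remaining = list(range(1, C.shape[0]))
    order = []                                   # (variable, loss) in kill order
    while len(remaining) > 1:
        best = None
        for v in remaining:
            trial = [u for u in remaining if u != v]
            loss = total_correlation(C, remaining) - total_correlation(C, trial)
            if best is None or loss < best[1]:
                best = (v, loss)
        order.append(best)
        remaining.remove(best[0])
    kept = [v for v, _ in order[len(order) - (n_keep - 1):]] + remaining
    return order, sorted(kept)

# Toy correlation matrix: column 0 plays the role of the target, 4 inputs.
rng = np.random.default_rng(3)
A = rng.normal(size=(5, 200))
C = np.corrcoef(A + 0.5 * A[0])
order, kept = backward_eliminate(C, n_keep=3)
for i, (v, loss) in enumerate(order, 1):
    print(f"ROUND {i}: killed variable {v}, significance {100 * loss:.3f} %")
print("kept:", kept)
```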
-------------------------------------
Teacher: actual network topology:
  Nodes(1) = 10
  Nodes(2) = 15
  Nodes(3) =  1
-------------------------------------
---------------------------------------------------
Iteration : 1
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK  1 NODE  2 --> 15.889142 sigma out 15 active outputs
RANK  2 NODE  5 --> 13.5189533 sigma out 15 active outputs
RANK  3 NODE 10 --> 13.0193052 sigma out 15 active outputs
RANK  4 NODE  6 --> 11.5617495 sigma out 15 active outputs
RANK  5 NODE  3 --> 9.8274107 sigma out 15 active outputs
RANK  6 NODE  7 --> 9.75470638 sigma out 15 active outputs
RANK  7 NODE  1 --> 9.593153 sigma out 15 active outputs
RANK  8 NODE  8 --> 9.36745834 sigma out 15 active outputs
RANK  9 NODE  9 --> 7.27972221 sigma out 15 active outputs
RANK 10 NODE  4 --> 6.66085339 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK  1 NODE  8 --> 16.1273022 sigma in 10act. ( 20.6024208 sig out 1act.)
RANK  2 NODE 13 --> 16.0381641 sigma in 10act. ( 18.1604939 sig out 1act.)
RANK  3 NODE 11 --> 12.5083618 sigma in 10act. ( 13.8539228 sig out 1act.)
RANK  4 NODE  4 --> 10.10326 sigma in 10act. ( 9.96086502 sig out 1act.)
RANK  5 NODE 14 --> 9.72921944 sigma in 10act. ( 10.7329388 sig out 1act.)
RANK  6 NODE  7 --> 9.55423069 sigma in 10act. ( 12.4019194 sig out 1act.)
RANK  7 NODE 15 --> 9.46808052 sigma in 10act. ( 9.91362953 sig out 1act.)
RANK  8 NODE  2 --> 8.06791306 sigma in 10act. ( 7.49183941 sig out 1act.)
RANK  9 NODE  1 --> 6.55355501 sigma in 10act. ( 7.59892273 sig out 1act.)
RANK 10 NODE 10 --> 5.57593966 sigma in 10act. ( 8.64081001 sig out 1act.)
RANK 11 NODE  6 --> 2.64150691 sigma in 10act. ( 3.27524281 sig out 1act.)
RANK 12 NODE  5 --> 1.99329674 sigma in 10act. ( 0.63277787 sig out 1act.)
RANK 13 NODE  9 --> 1.58266318 sigma in 10act. ( 2.42105246 sig out 1act.)
RANK 14 NODE  3 --> 1.37246811 sigma in 10act. ( 0.0783872828 sig out 1act.)
RANK 15 NODE 12 --> 1.17087102 sigma in 10act. ( 1.30962253 sig out 1act.)
sorted by output significance
RANK  1 NODE  8 --> 20.6024208 sigma out 1act.( 16.1273022 sig in 10act.)
RANK  2 NODE 13 --> 18.1604939 sigma out 1act.( 16.0381641 sig in 10act.)
RANK  3 NODE 11 --> 13.8539228 sigma out 1act.( 12.5083618 sig in 10act.)
RANK  4 NODE  7 --> 12.4019194 sigma out 1act.( 9.55423069 sig in 10act.)
RANK  5 NODE 14 --> 10.7329388 sigma out 1act.( 9.72921944 sig in 10act.)
RANK  6 NODE  4 --> 9.96086502 sigma out 1act.( 10.10326 sig in 10act.)
RANK  7 NODE 15 --> 9.91362953 sigma out 1act.( 9.46808052 sig in 10act.)
RANK  8 NODE 10 --> 8.64081001 sigma out 1act.( 5.57593966 sig in 10act.)
RANK  9 NODE  1 --> 7.59892273 sigma out 1act.( 6.55355501 sig in 10act.)
RANK 10 NODE  2 --> 7.49183941 sigma out 1act.( 8.06791306 sig in 10act.)
RANK 11 NODE  6 --> 3.27524281 sigma out 1act.( 2.64150691 sig in 10act.)
RANK 12 NODE  9 --> 2.42105246 sigma out 1act.( 1.58266318 sig in 10act.)
RANK 13 NODE 12 --> 1.30962253 sigma out 1act.( 1.17087102 sig in 10act.)
RANK 14 NODE  5 --> 0.63277787 sigma out 1act.( 1.99329674 sig in 10act.)
RANK 15 NODE  3 --> 0.0783872828 sigma out 1act.( 1.37246811 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK  1 NODE  1 --> 40.2485046 sigma in 15 active inputs
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK  1 NODE  2 --> 22.0921574 sigma out 15 active outputs
RANK  2 NODE  5 --> 16.7386818 sigma out 15 active outputs
RANK  3 NODE 10 --> 15.2478647 sigma out 15 active outputs
RANK  4 NODE  6 --> 13.8303699 sigma out 15 active outputs
RANK  5 NODE  1 --> 12.8278151 sigma out 15 active outputs
RANK  6 NODE  7 --> 12.1724882 sigma out 15 active outputs
RANK  7 NODE  8 --> 12.0676527 sigma out 15 active outputs
RANK  8 NODE  3 --> 11.4719286 sigma out 15 active outputs
RANK  9 NODE  4 --> 9.27918339 sigma out 15 active outputs
RANK 10 NODE  9 --> 8.9104414 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK  1 NODE  8 --> 19.2072086 sigma in 10act. ( 20.4919605 sig out 1act.)
RANK  2 NODE 13 --> 17.9914093 sigma in 10act. ( 18.1548862 sig out 1act.)
RANK  3 NODE  7 --> 14.2866888 sigma in 10act. ( 12.3409786 sig out 1act.)
RANK  4 NODE 11 --> 14.250948 sigma in 10act. ( 13.7342758 sig out 1act.)
RANK  5 NODE 10 --> 11.6278353 sigma in 10act. ( 9.13611221 sig out 1act.)
RANK  6 NODE 14 --> 11.3579292 sigma in 10act. ( 10.5240536 sig out 1act.)
RANK  7 NODE 15 --> 10.6315002 sigma in 10act. ( 9.79237747 sig out 1act.)
RANK  8 NODE  4 --> 9.93951893 sigma in 10act. ( 9.64475441 sig out 1act.)
RANK  9 NODE  1 --> 9.34975338 sigma in 10act. ( 7.91491175 sig out 1act.)
RANK 10 NODE  2 --> 9.14169025 sigma in 10act. ( 7.13501406 sig out 1act.)
RANK 11 NODE  6 --> 8.10063171 sigma in 10act. ( 3.47689033 sig out 1act.)
RANK 12 NODE  5 --> 6.58710957 sigma in 10act. ( 0.605621517 sig out 1act.)
RANK 13 NODE 12 --> 6.10697794 sigma in 10act. ( 1.72481275 sig out 1act.)
RANK 14 NODE  9 --> 5.93167257 sigma in 10act. ( 4.01301336 sig out 1act.)
RANK 15 NODE  3 --> 4.3773489 sigma in 10act. ( 0.192610875 sig out 1act.)
sorted by output significance
RANK  1 NODE  8 --> 20.4919605 sigma out 1act.( 19.2072086 sig in 10act.)
RANK  2 NODE 13 --> 18.1548862 sigma out 1act.( 17.9914093 sig in 10act.)
RANK  3 NODE 11 --> 13.7342758 sigma out 1act.( 14.250948 sig in 10act.)
RANK  4 NODE  7 --> 12.3409786 sigma out 1act.( 14.2866888 sig in 10act.)
RANK  5 NODE 14 --> 10.5240536 sigma out 1act.( 11.3579292 sig in 10act.)
RANK  6 NODE 15 --> 9.79237747 sigma out 1act.( 10.6315002 sig in 10act.)
RANK  7 NODE  4 --> 9.64475441 sigma out 1act.( 9.93951893 sig in 10act.)
RANK  8 NODE 10 --> 9.13611221 sigma out 1act.( 11.6278353 sig in 10act.)
RANK  9 NODE  1 --> 7.91491175 sigma out 1act.( 9.34975338 sig in 10act.)
RANK 10 NODE  2 --> 7.13501406 sigma out 1act.( 9.14169025 sig in 10act.)
RANK 11 NODE  9 --> 4.01301336 sigma out 1act.( 5.93167257 sig in 10act.)
RANK 12 NODE  6 --> 3.47689033 sigma out 1act.( 8.10063171 sig in 10act.)
RANK 13 NODE 12 --> 1.72481275 sigma out 1act.( 6.10697794 sig in 10act.)
RANK 14 NODE  5 --> 0.605621517 sigma out 1act.( 6.58710957 sig in 10act.)
RANK 15 NODE  3 --> 0.192610875 sigma out 1act.( 4.3773489 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK  1 NODE  1 --> 40.2332306 sigma in 15 active inputs
***********************************************
*** Learn Path 1
*** loss function: -0.44785434
*** contribution from regularisation: 0.00321050361
*** contribution from error: -0.451064855
***********************************************
-----------------> Test sample
---------------------------------------------------
Iteration : 2
***********************************************
*** Learn Path 2
*** loss function: -0.469684243
*** contribution from regularisation: 0.0020073757
*** contribution from error: -0.471691608
***********************************************
-----------------> Test sample
ENTER BFGS code  START      -45809.6      0.212811187  -0.0550298579
EXIT FROM BFGS code  FG_START     0.           0.212811187   0.
---------------------------------------------------
Iteration : 3
***********************************************
*** Learn Path 3
*** loss function: -0.481312156
*** contribution from regularisation: 0.00181444013
*** contribution from error: -0.483126611
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_START   -46933.711    0.212811187  -116.283119
EXIT FROM BFGS code  FG_LNSRCH    0.           0.186541006   0.
---------------------------------------------------
Iteration : 4
***********************************************
*** Learn Path 4
*** loss function: -0.510063529
*** contribution from regularisation: 0.0029311385
*** contribution from error: -0.512994647
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -49737.3138   0.186541006  -67.3207703
EXIT FROM BFGS code  NEW_X      -49737.3138   0.186541006  -67.3207703
ENTER BFGS code  NEW_X      -49737.3138   0.186541006  -67.3207703
EXIT FROM BFGS code  FG_LNSRCH    0.           0.169651031   0.
---------------------------------------------------
Iteration : 5
***********************************************
*** Learn Path 5
*** loss function: -0.514355302
*** contribution from regularisation: 0.00302918721
*** contribution from error: -0.51738447
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50155.8122   0.169651031  -15.7757893
EXIT FROM BFGS code  NEW_X      -50155.8122   0.169651031  -15.7757893
ENTER BFGS code  NEW_X      -50155.8122   0.169651031  -15.7757893
EXIT FROM BFGS code  FG_LNSRCH    0.           0.165455952   0.
---------------------------------------------------
Iteration : 6
***********************************************
*** Learn Path 6
*** loss function: -0.514932752
*** contribution from regularisation: 0.00277284044
*** contribution from error: -0.517705619
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50212.1196   0.165455952  -12.4907846
EXIT FROM BFGS code  NEW_X      -50212.1196   0.165455952  -12.4907846
ENTER BFGS code  NEW_X      -50212.1196   0.165455952  -12.4907846
EXIT FROM BFGS code  FG_LNSRCH    0.           0.151372284   0.
---------------------------------------------------
Iteration : 7
***********************************************
*** Learn Path 7
*** loss function: -0.515479982
*** contribution from regularisation: 0.00259304931
*** contribution from error: -0.518073022
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50265.4859   0.151372284  -2.01100683
EXIT FROM BFGS code  NEW_X      -50265.4859   0.151372284  -2.01100683
ENTER BFGS code  NEW_X      -50265.4859   0.151372284  -2.01100683
EXIT FROM BFGS code  FG_LNSRCH    0.           0.1463577     0.
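Each Learn Path block reports the training objective split into its two pieces, and the numbers add up as loss = error + regularisation (for Learn Path 1: -0.451064855 + 0.00321050361 ≈ -0.44785434). The error term is the entropy (cross-entropy) loss selected in the setup and the regularisation term penalises large weights. A small self-contained sketch of such an objective for a network with the 10-15-1 topology quoted above; the tanh/sigmoid choice, the quadratic weight penalty and the sign/normalisation conventions are generic assumptions, not read out of NeuroBayes:

```python
import numpy as np

rng = np.random.default_rng(4)

# Topology quoted above: Nodes(1)=10 inputs, Nodes(2)=15 hidden nodes, Nodes(3)=1 output.
n_in, n_hidden = 10, 15
W1 = 0.1 * rng.standard_normal((n_hidden, n_in + 1))   # +1: bias column
w2 = 0.1 * rng.standard_normal(n_hidden + 1)           # +1: bias weight

def forward(X):
    """10-15-1 MLP: tanh hidden layer, sigmoid output interpreted as P(signal)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    H = np.tanh(Xb @ W1.T)
    Hb = np.hstack([H, np.ones((len(H), 1))])
    return 1.0 / (1.0 + np.exp(-(Hb @ w2)))

def loss(X, y, w, lam=1e-3):
    """Weighted cross-entropy ("entropy" error) plus a quadratic weight penalty.

    Mirrors the two contributions printed per Learn Path; NeuroBayes' own sign and
    normalisation conventions are not reproduced here.
    """
    p = np.clip(forward(X), 1e-9, 1 - 1e-9)
    error = -np.average(y * np.log(p) + (1 - y) * np.log(1 - p), weights=w)
    regularisation = lam * (np.sum(W1 ** 2) + np.sum(w2 ** 2))
    return error + regularisation, error, regularisation

# Tiny synthetic training sample.
X = rng.standard_normal((1000, n_in))
y = (X[:, 0] + 0.5 * rng.standard_normal(1000) > 0).astype(float)
w = np.ones(1000)
total, err, reg = loss(X, y, w)
print(f"loss function: {total:.6f}  error: {err:.6f}  regularisation: {reg:.6f}")
```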
---------------------------------------------------
Iteration : 8
***********************************************
*** Learn Path 8
*** loss function: -0.515827894
*** contribution from regularisation: 0.00253992225
*** contribution from error: -0.518367827
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50299.4117   0.1463577     -5.58516598
EXIT FROM BFGS code  NEW_X      -50299.4117   0.1463577     -5.58516598
ENTER BFGS code  NEW_X      -50299.4117   0.1463577     -5.58516598
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0926241875   0.
---------------------------------------------------
Iteration : 9
***********************************************
*** Learn Path 9
*** loss function: -0.517592251
*** contribution from regularisation: 0.00250881212
*** contribution from error: -0.52010107
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50471.4533   0.0926241875  -6.11520481
EXIT FROM BFGS code  NEW_X      -50471.4533   0.0926241875  -6.11520481
ENTER BFGS code  NEW_X      -50471.4533   0.0926241875  -6.11520481
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0461319052   0.
---------------------------------------------------
Iteration : 10
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK  1 NODE  2 --> 42.8512459 sigma out 15 active outputs
RANK  2 NODE 10 --> 26.9451256 sigma out 15 active outputs
RANK  3 NODE  5 --> 26.8489494 sigma out 15 active outputs
RANK  4 NODE  1 --> 26.4648342 sigma out 15 active outputs
RANK  5 NODE  9 --> 20.927248 sigma out 15 active outputs
RANK  6 NODE  8 --> 13.3200703 sigma out 15 active outputs
RANK  7 NODE  4 --> 12.6629686 sigma out 15 active outputs
RANK  8 NODE  7 --> 12.1999569 sigma out 15 active outputs
RANK  9 NODE  3 --> 8.39327526 sigma out 15 active outputs
RANK 10 NODE  6 --> 8.06263256 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK  1 NODE  7 --> 48.7365952 sigma in 10act. ( 45.9660797 sig out 1act.)
RANK  2 NODE  8 --> 32.7534485 sigma in 10act. ( 32.0015984 sig out 1act.)
RANK  3 NODE 10 --> 22.7471485 sigma in 10act. ( 22.6453686 sig out 1act.)
RANK  4 NODE 13 --> 19.9190693 sigma in 10act. ( 23.2989464 sig out 1act.)
RANK  5 NODE 15 --> 12.5065899 sigma in 10act. ( 13.1738148 sig out 1act.)
RANK  6 NODE 12 --> 11.1458349 sigma in 10act. ( 9.51265144 sig out 1act.)
RANK  7 NODE 14 --> 9.9178915 sigma in 10act. ( 8.78373623 sig out 1act.)
RANK  8 NODE  9 --> 9.83090305 sigma in 10act. ( 7.51880646 sig out 1act.)
RANK  9 NODE  1 --> 6.97746372 sigma in 10act. ( 5.40868521 sig out 1act.)
RANK 10 NODE  6 --> 6.52323198 sigma in 10act. ( 4.37117481 sig out 1act.)
RANK 11 NODE 11 --> 5.9329071 sigma in 10act. ( 4.46795607 sig out 1act.)
RANK 12 NODE  4 --> 5.19037771 sigma in 10act. ( 4.34359741 sig out 1act.)
RANK 13 NODE  2 --> 4.67864561 sigma in 10act. ( 2.33891678 sig out 1act.)
RANK 14 NODE  5 --> 3.97505927 sigma in 10act. ( 0.484529138 sig out 1act.)
RANK 15 NODE  3 --> 3.81702399 sigma in 10act. ( 2.85959935 sig out 1act.)
sorted by output significance
RANK  1 NODE  7 --> 45.9660797 sigma out 1act.( 48.7365952 sig in 10act.)
RANK  2 NODE  8 --> 32.0015984 sigma out 1act.( 32.7534485 sig in 10act.)
RANK  3 NODE 13 --> 23.2989464 sigma out 1act.( 19.9190693 sig in 10act.)
RANK  4 NODE 10 --> 22.6453686 sigma out 1act.( 22.7471485 sig in 10act.)
RANK  5 NODE 15 --> 13.1738148 sigma out 1act.( 12.5065899 sig in 10act.)
RANK  6 NODE 12 --> 9.51265144 sigma out 1act.( 11.1458349 sig in 10act.)
RANK  7 NODE 14 --> 8.78373623 sigma out 1act.( 9.9178915 sig in 10act.)
RANK  8 NODE  9 --> 7.51880646 sigma out 1act.( 9.83090305 sig in 10act.)
RANK  9 NODE  1 --> 5.40868521 sigma out 1act.( 6.97746372 sig in 10act.)
RANK 10 NODE 11 --> 4.46795607 sigma out 1act.( 5.9329071 sig in 10act.)
RANK 11 NODE  6 --> 4.37117481 sigma out 1act.( 6.52323198 sig in 10act.)
RANK 12 NODE  4 --> 4.34359741 sigma out 1act.( 5.19037771 sig in 10act.)
RANK 13 NODE  3 --> 2.85959935 sigma out 1act.( 3.81702399 sig in 10act.)
RANK 14 NODE  2 --> 2.33891678 sigma out 1act.( 4.67864561 sig in 10act.)
RANK 15 NODE  5 --> 0.484529138 sigma out 1act.( 3.97505927 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK  1 NODE  1 --> 68.4940567 sigma in 15 active inputs
***********************************************
*** Learn Path 10
*** loss function: -0.519144475
*** contribution from regularisation: 0.00258031301
*** contribution from error: -0.521724761
***********************************************
-----------------> Test sample
Iteration No: 10
**********************************************
*****   write out current network         ****
*****   to "rescue.nb"                    ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code  FG_LNSRCH  -50622.8149   0.0461319052   8.92449951
EXIT FROM BFGS code  NEW_X      -50622.8149   0.0461319052   8.92449951
ENTER BFGS code  NEW_X      -50622.8149   0.0461319052   8.92449951
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.00366152683   0.
---------------------------------------------------
Iteration : 11
***********************************************
*** Learn Path 11
*** loss function: -0.519387186
*** contribution from regularisation: 0.00298613776
*** contribution from error: -0.522373319
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50646.4856  -0.00366152683   18.9789352
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0238624271   0.
---------------------------------------------------
Iteration : 12
***********************************************
*** Learn Path 12
*** loss function: -0.521017671
*** contribution from regularisation: 0.00203239964
*** contribution from error: -0.52305007
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50805.4742   0.0238624271   18.2327957
EXIT FROM BFGS code  NEW_X      -50805.4742   0.0238624271   18.2327957
ENTER BFGS code  NEW_X      -50805.4742   0.0238624271   18.2327957
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0296637751   0.
---------------------------------------------------
Iteration : 13
***********************************************
*** Learn Path 13
*** loss function: -0.520729125
*** contribution from regularisation: 0.00262807077
*** contribution from error: -0.523357213
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50777.3406   0.0296637751   30.0296841
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0257191211   0.
---------------------------------------------------
Iteration : 14
***********************************************
*** Learn Path 14
*** loss function: -0.521121979
*** contribution from regularisation: 0.00234812452
*** contribution from error: -0.523470104
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50815.646    0.0257191211   21.6895752
EXIT FROM BFGS code  NEW_X      -50815.646    0.0257191211   21.6895752
ENTER BFGS code  NEW_X      -50815.646    0.0257191211   21.6895752
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0212439727   0.
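Every tenth iteration the teacher writes the current network to rescue.nb so a long training can be inspected or recovered if the job dies; only the final network is exported as expert.nb. A minimal sketch of the same checkpointing pattern (file names and the npz format are mine, not NeuroBayes'):

```python
import numpy as np

def train(n_iter=250, checkpoint_every=10, rescue_file="rescue.npz"):
    """Dummy training loop that checkpoints its parameters every few iterations."""
    rng = np.random.default_rng(5)
    weights = rng.standard_normal(100)                 # stand-in for the network weights
    for it in range(1, n_iter + 1):
        weights -= 0.01 * rng.standard_normal(100)     # stand-in for one optimiser step
        if it % checkpoint_every == 0:
            np.savez(rescue_file, iteration=it, weights=weights)
            print(f"Iteration No: {it}  -> SAVING EXPERTISE TO {rescue_file}")
    np.savez("expert.npz", iteration=n_iter, weights=weights)   # final export

train(n_iter=30)
```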
---------------------------------------------------
Iteration : 15
***********************************************
*** Learn Path 15
*** loss function: -0.521286964
*** contribution from regularisation: 0.00258837384
*** contribution from error: -0.523875356
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50831.7338   0.0212439727   11.4808521
EXIT FROM BFGS code  NEW_X      -50831.7338   0.0212439727   11.4808521
ENTER BFGS code  NEW_X      -50831.7338   0.0212439727   11.4808521
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0254841745   0.
---------------------------------------------------
Iteration : 16
***********************************************
*** Learn Path 16
*** loss function: -0.521636546
*** contribution from regularisation: 0.0023886268
*** contribution from error: -0.524025202
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50865.8232   0.0254841745   6.82492161
EXIT FROM BFGS code  NEW_X      -50865.8232   0.0254841745   6.82492161
ENTER BFGS code  NEW_X      -50865.8232   0.0254841745   6.82492161
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0619428083   0.
---------------------------------------------------
Iteration : 17
***********************************************
*** Learn Path 17
*** loss function: -0.522517383
*** contribution from regularisation: 0.00245458633
*** contribution from error: -0.524971962
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -50951.7147   0.0619428083  -41.296032
EXIT FROM BFGS code  NEW_X      -50951.7147   0.0619428083  -41.296032
ENTER BFGS code  NEW_X      -50951.7147   0.0619428083  -41.296032
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0674768165   0.
---------------------------------------------------
Iteration : 18
***********************************************
*** Learn Path 18
*** loss function: -0.523262143
*** contribution from regularisation: 0.00243947003
*** contribution from error: -0.525701642
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51024.3368   0.0674768165  -34.6054001
EXIT FROM BFGS code  NEW_X      -51024.3368   0.0674768165  -34.6054001
ENTER BFGS code  NEW_X      -51024.3368   0.0674768165  -34.6054001
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0491970181   0.
---------------------------------------------------
Iteration : 19
***********************************************
*** Learn Path 19
*** loss function: -0.523499548
*** contribution from regularisation: 0.00231075101
*** contribution from error: -0.525810301
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51047.4908   0.0491970181   4.96148062
EXIT FROM BFGS code  NEW_X      -51047.4908   0.0491970181   4.96148062
ENTER BFGS code  NEW_X      -51047.4908   0.0491970181   4.96148062
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0415278785   0.
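The ENTER/EXIT BFGS lines with their FG_START, FG_LNSRCH, NEW_X and CONVERGENC states are the trace of a quasi-Newton optimiser with a line search, driven by repeated function-and-gradient evaluations of the summed loss. A sketch of the same pattern using SciPy's BFGS driver on a toy quadratic objective (any differentiable loss and gradient could be swapped in; none of this is NeuroBayes' internal code):

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective standing in for the summed network loss: f(x) = sum((x - c)^2).
c = np.linspace(-1.0, 1.0, 20)

def fun_and_grad(x):
    """Return (f, grad f); BFGS asks for both at every line-search point."""
    diff = x - c
    return float(diff @ diff), 2.0 * diff

trace = []
result = minimize(
    fun_and_grad, x0=np.zeros(20), jac=True, method="BFGS",
    callback=lambda xk: trace.append(float(fun_and_grad(xk)[0])),
    options={"maxiter": 250},
)
print("iterations:", result.nit, " converged:", result.success)
print("first few objective values:", [round(v, 6) for v in trace[:5]])
```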
---------------------------------------------------
Iteration : 20
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK  1 NODE  2 --> 69.3088837 sigma out 15 active outputs
RANK  2 NODE 10 --> 41.9340782 sigma out 15 active outputs
RANK  3 NODE  9 --> 38.2468414 sigma out 15 active outputs
RANK  4 NODE  1 --> 36.6160393 sigma out 15 active outputs
RANK  5 NODE  8 --> 27.4143353 sigma out 15 active outputs
RANK  6 NODE  5 --> 23.7765694 sigma out 15 active outputs
RANK  7 NODE  7 --> 23.7542152 sigma out 15 active outputs
RANK  8 NODE  4 --> 23.0026264 sigma out 15 active outputs
RANK  9 NODE  3 --> 19.5801067 sigma out 15 active outputs
RANK 10 NODE  6 --> 10.4609365 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK  1 NODE  7 --> 79.8888931 sigma in 10act. ( 74.3889389 sig out 1act.)
RANK  2 NODE  8 --> 62.5150337 sigma in 10act. ( 65.895401 sig out 1act.)
RANK  3 NODE 13 --> 28.3758183 sigma in 10act. ( 33.4207993 sig out 1act.)
RANK  4 NODE 10 --> 23.3270512 sigma in 10act. ( 25.9975471 sig out 1act.)
RANK  5 NODE 15 --> 15.0375891 sigma in 10act. ( 17.433012 sig out 1act.)
RANK  6 NODE  2 --> 11.425725 sigma in 10act. ( 12.6685581 sig out 1act.)
RANK  7 NODE 12 --> 9.83862495 sigma in 10act. ( 10.0984459 sig out 1act.)
RANK  8 NODE  3 --> 6.98492384 sigma in 10act. ( 7.34299517 sig out 1act.)
RANK  9 NODE  1 --> 6.53010178 sigma in 10act. ( 5.3860507 sig out 1act.)
RANK 10 NODE  4 --> 4.65158987 sigma in 10act. ( 4.00810099 sig out 1act.)
RANK 11 NODE 14 --> 4.5953331 sigma in 10act. ( 2.30645967 sig out 1act.)
RANK 12 NODE  6 --> 4.12839365 sigma in 10act. ( 2.8087163 sig out 1act.)
RANK 13 NODE  9 --> 3.48623133 sigma in 10act. ( 1.16371071 sig out 1act.)
RANK 14 NODE 11 --> 2.69995594 sigma in 10act. ( 0.882645607 sig out 1act.)
RANK 15 NODE  5 --> 2.22708464 sigma in 10act. ( 0.901742637 sig out 1act.)
sorted by output significance
RANK  1 NODE  7 --> 74.3889389 sigma out 1act.( 79.8888931 sig in 10act.)
RANK  2 NODE  8 --> 65.895401 sigma out 1act.( 62.5150337 sig in 10act.)
RANK  3 NODE 13 --> 33.4207993 sigma out 1act.( 28.3758183 sig in 10act.)
RANK  4 NODE 10 --> 25.9975471 sigma out 1act.( 23.3270512 sig in 10act.)
RANK  5 NODE 15 --> 17.433012 sigma out 1act.( 15.0375891 sig in 10act.)
RANK  6 NODE  2 --> 12.6685581 sigma out 1act.( 11.425725 sig in 10act.)
RANK  7 NODE 12 --> 10.0984459 sigma out 1act.( 9.83862495 sig in 10act.)
RANK  8 NODE  3 --> 7.34299517 sigma out 1act.( 6.98492384 sig in 10act.)
RANK  9 NODE  1 --> 5.3860507 sigma out 1act.( 6.53010178 sig in 10act.)
RANK 10 NODE  4 --> 4.00810099 sigma out 1act.( 4.65158987 sig in 10act.)
RANK 11 NODE  6 --> 2.8087163 sigma out 1act.( 4.12839365 sig in 10act.)
RANK 12 NODE 14 --> 2.30645967 sigma out 1act.( 4.5953331 sig in 10act.)
RANK 13 NODE  9 --> 1.16371071 sigma out 1act.( 3.48623133 sig in 10act.)
RANK 14 NODE  5 --> 0.901742637 sigma out 1act.( 2.22708464 sig in 10act.)
RANK 15 NODE 11 --> 0.882645607 sigma out 1act.( 2.69995594 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK  1 NODE  1 --> 111.131767 sigma in 15 active inputs
***********************************************
*** Learn Path 20
*** loss function: -0.523746669
*** contribution from regularisation: 0.00228085555
*** contribution from error: -0.526027501
***********************************************
-----------------> Test sample
Iteration No: 20
**********************************************
*****   write out current network         ****
*****   to "rescue.nb"                    ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code  FG_LNSRCH  -51071.5853   0.0415278785   5.70996141
EXIT FROM BFGS code  NEW_X      -51071.5853   0.0415278785   5.70996141
ENTER BFGS code  NEW_X      -51071.5853   0.0415278785   5.70996141
EXIT FROM BFGS code  FG_LNSRCH    0.           0.0340454839   0.
---------------------------------------------------
Iteration : 21
***********************************************
*** Learn Path 21
*** loss function: -0.523840964
*** contribution from regularisation: 0.00232193014
*** contribution from error: -0.526162922
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51080.7806   0.0340454839   5.34732866
EXIT FROM BFGS code  NEW_X      -51080.7806   0.0340454839   5.34732866
ENTER BFGS code  NEW_X      -51080.7806   0.0340454839   5.34732866
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.013622568    0.
---------------------------------------------------
Iteration : 22
***********************************************
*** Learn Path 22
*** loss function: -0.524401665
*** contribution from regularisation: 0.0023508931
*** contribution from error: -0.526752532
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51135.4536  -0.013622568    16.5831738
EXIT FROM BFGS code  NEW_X      -51135.4536  -0.013622568    16.5831738
ENTER BFGS code  NEW_X      -51135.4536  -0.013622568    16.5831738
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.0135609424   0.
---------------------------------------------------
Iteration : 23
***********************************************
*** Learn Path 23
*** loss function: -0.524323344
*** contribution from regularisation: 0.00242777518
*** contribution from error: -0.526751101
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51127.8161  -0.0135609424  -81.0170746
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.0136023713   0.
---------------------------------------------------
Iteration : 24
***********************************************
*** Learn Path 24
*** loss function: -0.524351001
*** contribution from regularisation: 0.00248858891
*** contribution from error: -0.526839614
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51130.5131  -0.0136023713  -15.545145
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.0136194676   0.
---------------------------------------------------
Iteration : 25
***********************************************
*** Learn Path 25
*** loss function: -0.524327934
*** contribution from regularisation: 0.00244234828
*** contribution from error: -0.526770294
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51128.267   -0.0136194676   11.670785
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.0136224702   0.
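From iteration 22 onward the line search is probing essentially the same point: the loss changes only in the fifth decimal place and the reported step parameter freezes near -0.0136, which is what ultimately makes the BFGS driver report CONVERGENC a few iterations later. A sketch of the kind of stopping rule this corresponds to; the tolerances are illustrative, not NeuroBayes' internal criterion (which in the log fires after iteration 30):

```python
def converged(history, ftol=1e-5, window=5):
    """Stop when the objective has changed by less than ftol (relatively)
    over the last `window` recorded iterations."""
    if len(history) < window + 1:
        return False
    old, new = history[-window - 1], history[-1]
    return abs(new - old) <= ftol * max(1.0, abs(old))

# Loss values reported for Learn Paths 22-30 (copied from the log).
losses = [-0.524401665, -0.524323344, -0.524351001, -0.524327934,
          -0.524326146, -0.524316251, -0.524319589, -0.524319649, -0.524313927]

history = []
for it, loss in enumerate(losses, start=22):
    history.append(loss)
    if converged(history):
        print("convergence declared at iteration", it)
        break
```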
---------------------------------------------------
Iteration : 26
***********************************************
*** Learn Path 26
*** loss function: -0.524326146
*** contribution from regularisation: 0.00242699566
*** contribution from error: -0.526753128
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51128.0929  -0.0136224702   16.4765301
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.013622568    0.
---------------------------------------------------
Iteration : 27
***********************************************
*** Learn Path 27
*** loss function: -0.524316251
*** contribution from regularisation: 0.00243631913
*** contribution from error: -0.526752591
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51127.1237  -0.013622568    16.6399708
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.013622568    0.
---------------------------------------------------
Iteration : 28
***********************************************
*** Learn Path 28
*** loss function: -0.524319589
*** contribution from regularisation: 0.00243292656
*** contribution from error: -0.526752532
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51127.4544  -0.013622568    16.6480579
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.013622568    0.
---------------------------------------------------
Iteration : 29
***********************************************
*** Learn Path 29
*** loss function: -0.524319649
*** contribution from regularisation: 0.00243291585
*** contribution from error: -0.526752591
***********************************************
-----------------> Test sample
ENTER BFGS code  FG_LNSRCH  -51127.4554  -0.013622568    16.6550503
EXIT FROM BFGS code  FG_LNSRCH    0.          -0.013622568    0.
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK  1 NODE  2 --> 71.983902 sigma out 15 active outputs
RANK  2 NODE  1 --> 45.1884804 sigma out 15 active outputs
RANK  3 NODE 10 --> 40.0988235 sigma out 15 active outputs
RANK  4 NODE  9 --> 36.7025795 sigma out 15 active outputs
RANK  5 NODE  4 --> 27.8426571 sigma out 15 active outputs
RANK  6 NODE  8 --> 26.7599773 sigma out 15 active outputs
RANK  7 NODE  5 --> 23.8549347 sigma out 15 active outputs
RANK  8 NODE  7 --> 23.8086452 sigma out 15 active outputs
RANK  9 NODE  3 --> 18.8007622 sigma out 15 active outputs
RANK 10 NODE  6 --> 9.49960804 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK  1 NODE  7 --> 84.6507568 sigma in 10act. ( 75.4072189 sig out 1act.)
RANK  2 NODE  8 --> 64.240036 sigma in 10act. ( 74.5659561 sig out 1act.)
RANK  3 NODE 13 --> 28.534174 sigma in 10act. ( 33.8630333 sig out 1act.)
RANK  4 NODE 10 --> 22.105711 sigma in 10act. ( 25.093359 sig out 1act.)
RANK  5 NODE 15 --> 14.0461836 sigma in 10act. ( 16.2424755 sig out 1act.)
RANK  6 NODE  2 --> 11.6169195 sigma in 10act. ( 12.9471436 sig out 1act.)
RANK  7 NODE 12 --> 10.1339674 sigma in 10act. ( 10.5127678 sig out 1act.)
RANK  8 NODE  1 --> 9.29932976 sigma in 10act. ( 8.85263729 sig out 1act.)
RANK  9 NODE  3 --> 7.4406538 sigma in 10act. ( 7.99798822 sig out 1act.)
RANK 10 NODE  4 --> 4.35619736 sigma in 10act. ( 3.54936624 sig out 1act.)
RANK 11 NODE 14 --> 3.94783497 sigma in 10act. ( 1.29267645 sig out 1act.)
RANK 12 NODE  6 --> 3.90141869 sigma in 10act. ( 2.63333917 sig out 1act.)
RANK 13 NODE  9 --> 3.79021978 sigma in 10act. ( 1.85781312 sig out 1act.)
RANK 14 NODE  5 --> 2.71659064 sigma in 10act. ( 2.05738974 sig out 1act.)
( 2.05738974 sig out 1act.) RANK 15 NODE 11 --> 2.42829871 sigma in 10act. ( 0.397445917 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 75.4072189 sigma out 1act.( 84.6507568 sig in 10act.) RANK 2 NODE 8 --> 74.5659561 sigma out 1act.( 64.240036 sig in 10act.) RANK 3 NODE 13 --> 33.8630333 sigma out 1act.( 28.534174 sig in 10act.) RANK 4 NODE 10 --> 25.093359 sigma out 1act.( 22.105711 sig in 10act.) RANK 5 NODE 15 --> 16.2424755 sigma out 1act.( 14.0461836 sig in 10act.) RANK 6 NODE 2 --> 12.9471436 sigma out 1act.( 11.6169195 sig in 10act.) RANK 7 NODE 12 --> 10.5127678 sigma out 1act.( 10.1339674 sig in 10act.) RANK 8 NODE 1 --> 8.85263729 sigma out 1act.( 9.29932976 sig in 10act.) RANK 9 NODE 3 --> 7.99798822 sigma out 1act.( 7.4406538 sig in 10act.) RANK 10 NODE 4 --> 3.54936624 sigma out 1act.( 4.35619736 sig in 10act.) RANK 11 NODE 6 --> 2.63333917 sigma out 1act.( 3.90141869 sig in 10act.) RANK 12 NODE 5 --> 2.05738974 sigma out 1act.( 2.71659064 sig in 10act.) RANK 13 NODE 9 --> 1.85781312 sigma out 1act.( 3.79021978 sig in 10act.) RANK 14 NODE 14 --> 1.29267645 sigma out 1act.( 3.94783497 sig in 10act.) RANK 15 NODE 11 --> 0.397445917 sigma out 1act.( 2.42829871 sig in 10act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 117.200935 sigma in 15 active inputs *********************************************** *** Learn Path 30 *** loss function: -0.524313927 *** contribution from regularisation: 0.00243860576 *** contribution from error: -0.526752532 *********************************************** -----------------> Test sample Iteration No: 30 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -51126.9006 -0.013622568 16.6598454 EXIT FROM BFGS code NEW_X -51126.9006 -0.013622568 16.6598454 ENTER BFGS code NEW_X -51126.9006 -0.013622568 16.6598454 EXIT FROM BFGS code CONVERGENC -51126.9006 -0.013622568 16.6598454 --------------------------------------------------- Iteration : 250 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 109.16095 sigma out 15 active outputs RANK 2 NODE 1 --> 70.6840744 sigma out 15 active outputs RANK 3 NODE 10 --> 60.1748238 sigma out 15 active outputs RANK 4 NODE 9 --> 58.5127373 sigma out 15 active outputs RANK 5 NODE 4 --> 43.8495712 sigma out 15 active outputs RANK 6 NODE 8 --> 41.7247276 sigma out 15 active outputs RANK 7 NODE 7 --> 38.0444832 sigma out 15 active outputs RANK 8 NODE 5 --> 36.518795 sigma out 15 active outputs RANK 9 NODE 3 --> 29.7269745 sigma out 15 active outputs RANK 10 NODE 6 --> 14.8863726 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 7 --> 131.867706 sigma in 10act. ( 120.501923 sig out 1act.) RANK 2 NODE 8 --> 97.1388168 sigma in 10act. ( 118.838242 sig out 1act.) RANK 3 NODE 13 --> 44.7654648 sigma in 10act. ( 53.5288162 sig out 1act.) RANK 4 NODE 10 --> 34.5445747 sigma in 10act. ( 39.3139191 sig out 1act.) RANK 5 NODE 15 --> 21.7256794 sigma in 10act. ( 25.8860264 sig out 1act.) RANK 6 NODE 2 --> 18.2162514 sigma in 10act. ( 20.6664181 sig out 1act.) RANK 7 NODE 12 --> 15.439867 sigma in 10act. ( 16.9256687 sig out 1act.) RANK 8 NODE 1 --> 13.9577589 sigma in 10act. ( 13.6829319 sig out 1act.) RANK 9 NODE 3 --> 11.4753437 sigma in 10act. ( 12.7422371 sig out 1act.) RANK 10 NODE 4 --> 5.76527452 sigma in 10act. ( 5.72020721 sig out 1act.) RANK 11 NODE 6 --> 4.91466427 sigma in 10act. 
( 4.15179205 sig out 1act.) RANK 12 NODE 9 --> 4.63207626 sigma in 10act. ( 2.91841817 sig out 1act.) RANK 13 NODE 14 --> 4.25332069 sigma in 10act. ( 2.029562 sig out 1act.) RANK 14 NODE 5 --> 3.61648107 sigma in 10act. ( 3.25999212 sig out 1act.) RANK 15 NODE 11 --> 2.46531892 sigma in 10act. ( 0.637267053 sig out 1act.) sorted by output significance RANK 1 NODE 7 --> 120.501923 sigma out 1act.( 131.867706 sig in 10act.) RANK 2 NODE 8 --> 118.838242 sigma out 1act.( 97.1388168 sig in 10act.) RANK 3 NODE 13 --> 53.5288162 sigma out 1act.( 44.7654648 sig in 10act.) RANK 4 NODE 10 --> 39.3139191 sigma out 1act.( 34.5445747 sig in 10act.) RANK 5 NODE 15 --> 25.8860264 sigma out 1act.( 21.7256794 sig in 10act.) RANK 6 NODE 2 --> 20.6664181 sigma out 1act.( 18.2162514 sig in 10act.) RANK 7 NODE 12 --> 16.9256687 sigma out 1act.( 15.439867 sig in 10act.) RANK 8 NODE 1 --> 13.6829319 sigma out 1act.( 13.9577589 sig in 10act.) RANK 9 NODE 3 --> 12.7422371 sigma out 1act.( 11.4753437 sig in 10act.) RANK 10 NODE 4 --> 5.72020721 sigma out 1act.( 5.76527452 sig in 10act.) RANK 11 NODE 6 --> 4.15179205 sigma out 1act.( 4.91466427 sig in 10act.) RANK 12 NODE 5 --> 3.25999212 sigma out 1act.( 3.61648107 sig in 10act.) RANK 13 NODE 9 --> 2.91841817 sigma out 1act.( 4.63207626 sig in 10act.) RANK 14 NODE 14 --> 2.029562 sigma out 1act.( 4.25332069 sig in 10act.) RANK 15 NODE 11 --> 0.637267053 sigma out 1act.( 2.46531892 sig in 10act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 186.711105 sigma in 15 active inputs *********************************************** *** Learn Path 250 *** loss function: -0.524327099 *** contribution from regularisation: 0.00242545176 *** contribution from error: -0.526752532 *********************************************** -----------------> Test sample END OF LEARNING , export EXPERTISE SAVING EXPERTISE TO expert.nb NB_AHISTOUT: storage space 28444 Closing output file done
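Note on the significance tables above: the same "RANK n NODE m --> x sigma" layout is printed at every checkpoint, so the iteration-30 snapshot and the final summary can be compared mechanically. The sketch below is a hypothetical post-processing helper (parse_ranks and RANK_RE are invented here; it assumes only the log format shown above, not any NeuroBayes API) that turns one table into a node -> (rank, significance) mapping.

import re

# Matches entries like "RANK 3 NODE 13 --> 44.7654648 sigma in 10act. ..."
RANK_RE = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+(-?[0-9.eE+-]+)\s+sigma")

def parse_ranks(table_text):
    """Return {node: (rank, significance)} for one significance table."""
    return {int(node): (int(rank), float(sig))
            for rank, node, sig in RANK_RE.findall(table_text)}

# Example with the layer-3 input significance lines quoted above:
iter30 = "RANK 1 NODE 1 --> 117.200935 sigma in 15 active inputs"
final  = "RANK 1 NODE 1 --> 186.711105 sigma in 15 active inputs"
print(parse_ranks(iter30))   # {1: (1, 117.200935)}
print(parse_ranks(final))    # {1: (1, 186.711105)}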