NNInput NNInputs_125.root
Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001
**** List Parameters ****
  Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree
NNAna::CopyTree: entries= 184946 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1  BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 47514  nbkg = 137432
Bkg Entries: 137432   Sig Entries: 47514   Chosen entries: 47514
Signal fraction: 1   Background fraction: 0.345727
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 137432
Actual Signal Entries: 47514
Entries to split: 47514   Test with : 23757   Train with : 23757
*********************************************
* This product is licenced for educational  *
* and scientific use only. Commercial use   *
* is prohibited !                           *
*********************************************
Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
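[Editor's note] For orientation, the event selection and the 50/50 test/train split reported above can be mimicked outside NNAna (NNAna::CopyTree presumably applies the quoted strings as tree selections). The sketch below is a minimal NumPy illustration under that assumption: the branches dictionary is a random stand-in for the branches of NNInputs_125.root, not real data, and only the cut logic and the split are meant to correspond to the log.

import numpy as np

# Random stand-in for the branches of NNInputs_125.root (illustration only).
rng = np.random.default_rng(0)
n = 184946
branches = {
    "lep1_E": rng.uniform(0.0, 500.0, n),
    "lep2_E": rng.uniform(0.0, 500.0, n),
    "Target": rng.integers(0, 2, n),
}

constraint = (branches["lep1_E"] < 400) & (branches["lep2_E"] < 400)
sig_mask = constraint & (branches["Target"] == 1)   # SigChoice in the log
bkg_mask = constraint & (branches["Target"] == 0)   # BkgChoice in the log

sig_idx = np.flatnonzero(sig_mask)
bkg_idx = np.flatnonzero(bkg_mask)
print("nsig =", sig_idx.size, "nbkg =", bkg_idx.size)

# 50/50 split of the chosen signal entries, as in
# "Entries to split: 47514  Test with : 23757  Train with : 23757".
half = sig_idx.size // 2
test_sig, train_sig = sig_idx[:half], sig_idx[half:]

With the real ntuple the counts come out as in the log (nsig = 47514, nbkg = 137432, split 23757/23757); the reported "Background fraction" of 0.345727 is consistent with nsig/nbkg.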
NBTreeReportInterval = 10000
NBTreePrepEvPrint = 1
Start Set Individual Variable Preprocessing
  Not touching individual preprocessing for Ht ( 0 ) in Neurobayes
  Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes
  Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes
  Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes
  Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes
  Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes
  Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes
  Not touching individual preprocessing for addEt ( 7 ) in Neurobayes
  Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes
  Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes
  Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes
  Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes
  Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes
End Set Individual Variable Preprocessing
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 47514 for Signal
Prepared event 0 for Signal with 47514 events
====Entry 0
  Variable Ht                : 120.494
  Variable LepAPt            : 52.8932
  Variable LepBPt            : 13.787
  Variable MetSigLeptonsJets : 6.59008
  Variable MetSpec           : 53.8132
  Variable SumEtLeptonsJets  : 66.6802
  Variable VSumJetLeptonsPt  : 66.139
  Variable addEt             : 120.494
  Variable dPhiLepSumMet     : 2.99118
  Variable dPhiLeptons       : 0.315251
  Variable dRLeptons         : 0.406645
  Variable lep1_E            : 70.3851
  Variable lep2_E            : 15.8115
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0
  DeltaRJet12 = 0   File = 2125   Ht = 120.494   IsMEBase = 0   LRHWW = 0
  LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0   LepAEt = 52.8933
  LepAPt = 52.8932   LepBEt = 13.787   LepBPt = 13.787   LessCentralJetEta = 0   MJ1Lep1 = 0
  MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0   NN = 0   Met = 53.8132
  MetDelPhi = 2.92651   MetSig = 4.22058   MetSigLeptonsJets = 6.59008   MetSpec = 53.8132   Mjj = 0
  MostCentralJetEta = 0   MtllMet = 129.103   Njets = 0   SB = 0   SumEt = 162.568
  SumEtJets = 0   SumEtLeptonsJets = 66.6802   Target = 1   TrainWeight = 1   VSum2JetLeptonsPt = 0
  VSum2JetPt = 0   VSumJetLeptonsPt = 66.139   addEt = 120.494   dPhiLepSumMet = 2.99118   dPhiLeptons = 0.315251
  dRLeptons = 0.406645   diltype = 49   dimass = 10.9671   event = 2092   jet1_Et = 0
  jet1_eta = 0   jet2_Et = 0   jet2_eta = 0   lep1_E = 70.3851   lep2_E = 15.8115
  rand = 0.999742   run = 235158   weight = 1.4937e-06
===Show End
Prepared event 10000 for Signal with 47514 events
Prepared event 20000 for Signal with 47514 events
Prepared event 30000 for Signal with 47514 events
Prepared event 40000 for Signal with 47514 events
Adding variable Ht To Neurobayes
Adding variable LepAPt To Neurobayes
Adding variable LepBPt To Neurobayes
Adding variable MetSigLeptonsJets To Neurobayes
Adding variable MetSpec To Neurobayes
Adding variable SumEtLeptonsJets To Neurobayes
Adding variable VSumJetLeptonsPt To Neurobayes
Adding variable addEt To Neurobayes
Adding variable dPhiLepSumMet To Neurobayes
Adding variable dPhiLeptons To Neurobayes
Adding variable dRLeptons To Neurobayes
Adding variable lep1_E To Neurobayes
Adding variable lep2_E To Neurobayes
NNAna::PrepareNBTraining_: Nent= 137432 for Background
Prepared event 0 for Background with 137432 events
====Entry 0
  Variable Ht                : 85.3408
  Variable LepAPt            : 31.1294
  Variable LepBPt            : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec           : 44.1437
  Variable SumEtLeptonsJets  : 41.1959
  Variable VSumJetLeptonsPt  : 41.0132
  Variable addEt             : 85.3408
  Variable dPhiLepSumMet     : 3.10536
  Variable dPhiLeptons       : 0.219342
  Variable dRLeptons         : 0.424454
  Variable lep1_E            : 32.3548
  Variable lep2_E            : 10.1027
===Show Start
======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0
  DeltaRJet12 = 0   File = 1   Ht = 85.3408   IsMEBase = 0   LRHWW = 0
  LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0   LepAEt = 31.1297
  LepAPt = 31.1294   LepBEt = 10.0674   LepBPt = 10.0664   LessCentralJetEta = 0   MJ1Lep1 = 0
  MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0   NN = 0   Met = 44.1437
  MetDelPhi = 3.01191   MetSig = 3.74558   MetSigLeptonsJets = 6.87768   MetSpec = 44.1437   Mjj = 0
  MostCentralJetEta = 0   MtllMet = 86.2332   Njets = 0   SB = 0   SumEt = 138.899
  SumEtJets = 0   SumEtLeptonsJets = 41.1959   Target = 0   TrainWeight = 0.3468   VSum2JetLeptonsPt = 0
  VSum2JetPt = 0   VSumJetLeptonsPt = 41.0132   addEt = 85.3408   dPhiLepSumMet = 3.10536   dPhiLeptons = 0.219342
  dRLeptons = 0.424454   diltype = 17   dimass = 7.54723   event = 6717520   jet1_Et = 0
  jet1_eta = 0   jet2_Et = 0   jet2_eta = 0   lep1_E = 32.3548   lep2_E = 10.1027
  rand = 0.999742   run = 271566   weight = 0.00296168
===Show End
Prepared event 10000 for Background with 137432 events
Prepared event 20000 for Background with 137432 events
Prepared event 30000 for Background with 137432 events
Prepared event 40000 for Background with 137432 events
Prepared event 50000 for Background with 137432 events
Prepared event 60000 for Background with 137432 events
Prepared event 70000 for Background with 137432 events
Prepared event 80000 for Background with 137432 events
Prepared event 90000 for Background with 137432 events
Prepared event 100000 for Background with 137432 events
Prepared event 110000 for Background with 137432 events
Prepared event 120000 for Background with 137432 events
Prepared event 130000 for Background with 137432 events
Warning: found 4498 negative weights.
[Phi-T ASCII-art logo]
    Phi-T(R) NeuroBayes(R) Teacher
    Algorithms by Michael Feindt
    Implementation by Phi-T Project 2001-2003
    Copyright Phi-T GmbH
    Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000  NB_MAXNODE = 100
-----------------------------------
found 184946 samples to learn from
preprocessing flags/parameters:
  global preprocessing flag: 812
  individual preprocessing:
now perform preprocessing
*** called with option 12
*** This will do for you:
*** input variable equalisation
*** to Gaussian distribution with mean=0 and sigma=1
*** Then variables are decorrelated
************************************
Warning: found 4498 negative weights.
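[Editor's note] The preprocessing announced here (global flag 812, option 12: equalise each input, map it to a Gaussian with mean 0 and sigma 1, then decorrelate the set) can be illustrated with a short NumPy/SciPy sketch. This is only a schematic of the standard flatten-Gaussianise-decorrelate recipe, not NeuroBayes' actual implementation; the quantile tables printed below ("Transdef: Tab for variable N") are the bin edges the teacher uses for the flattening step.

import numpy as np
from scipy.stats import norm

def flatten_gaussianise_decorrelate(X):
    """Schematic of preprocessing option 12: per-variable rank transform to a
    flat distribution, mapping to a unit Gaussian, then linear decorrelation."""
    n, nvar = X.shape

    # 1) equalise: rank-transform each column to the interval (0, 1)
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    flat = (ranks + 0.5) / n

    # 2) map the flat distribution to a Gaussian with mean=0, sigma=1
    gauss = norm.ppf(flat)

    # 3) decorrelate: whiten with the inverse square root of the covariance
    cov = np.cov(gauss, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    whitener = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
    return gauss @ whitener

# toy example with three correlated inputs
rng = np.random.default_rng(1)
raw = rng.normal(size=(10000, 3))
raw[:, 1] += 0.8 * raw[:, 0]
out = flatten_gaussianise_decorrelate(raw)
print(np.round(np.corrcoef(out, rowvar=False), 3))   # approximately the identity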
Signal fraction: 62.194519 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 60.0048981 63.2795029 65.4524384 66.9868164 68.1537628 69.1921539 70.1805725 70.9513245 71.7512817 72.6460419 73.440773 74.04142 74.844101 75.5129776 76.1490936 76.7773895 77.3422318 78.0844727 78.69767 79.2351074 79.791687 80.4825211 81.0693359 81.6780014 82.4071808 82.9332504 83.6130142 84.4643021 85.1341858 86.0355148 86.9279022 87.8186188 88.6545258 89.4642029 90.5356369 91.5618286 92.5388641 93.3877563 94.5280228 95.5188599 96.3469696 97.3778458 98.3357162 99.3534927 100.323624 101.420013 102.296471 103.318741 104.119644 104.913284 105.88385 106.718964 107.660721 108.475067 109.393639 110.407715 111.384186 112.345001 113.28019 114.19017 115.206398 116.250542 117.234283 118.331436 119.487274 120.761658 121.961937 123.204216 124.542892 125.916092 127.464981 129.005432 130.652664 132.267853 134.019745 136.076874 138.024689 140.175476 142.237839 144.43631 146.738708 148.951492 151.383545 153.907623 156.276962 158.964905 161.9953 165.544952 169.39917 173.491287 178.050751 183.453308 188.834717 195.414612 203.258667 213.646713 225.383362 241.109375 266.06012 304.266968 1151.67786 ------------------------------ Transdef: Tab for variable 3 20.0000362 20.302578 20.6260796 20.9209442 21.1687355 21.4177475 21.6168003 21.8693962 22.0766754 22.2626629 22.4529839 22.6441975 22.8386612 23.0402336 23.2374496 23.376255 23.5721397 23.7840691 24.0209408 24.2371521 24.4385891 24.5986652 24.8036404 25.0093822 25.2018356 25.3391399 25.5441437 25.715847 25.9067707 26.0885696 26.2605114 26.4451637 26.6312523 26.8450451 27.0333328 27.1986275 27.3888512 27.6104527 27.798687 27.9789162 28.1909866 28.3869381 28.6131554 28.8178482 28.9956474 29.1953545 29.398222 29.6361809 29.8322735 30.0858231 30.2923203 30.5251312 30.7456017 30.9961662 31.1974525 31.4246807 31.6741867 31.8566074 32.0708847 32.2974701 32.5486755 32.7917786 33.0988617 33.3576889 33.6448288 33.9075623 34.1923218 34.4664612 34.7957001 35.1279068 35.4343262 35.7407074 36.0640717 36.4739304 36.8369827 37.2305565 37.5640564 38.0063896 38.4703674 38.9206161 39.4192314 39.9586334 40.4618454 41.041687 41.6932907 42.3094406 42.9817963 43.7315559 44.513237 45.4101181 46.4242668 47.5418625 48.7630463 50.1471825 51.6667366 53.7684784 56.55299 59.9841843 65.4937668 76.0890503 201.695786 ------------------------------ Transdef: Tab for variable 4 10.0001202 10.1112766 10.213438 10.3353481 10.4320164 10.4936228 10.5848675 10.6992912 10.7950401 10.8977394 11.0015059 11.1078329 11.2137775 11.3199768 11.4284611 11.5329685 11.6477232 11.7680264 11.8888512 11.9984455 12.1358814 12.246479 12.3573341 12.4830437 12.6142168 12.7526741 12.8525457 12.98808 13.1326733 13.2620277 13.3933372 13.518055 13.6657495 13.784565 13.9036322 14.0272179 14.1789532 14.3258362 14.4705925 14.6376257 14.7860861 14.9381447 15.0977449 15.2507324 15.3961143 15.5746098 15.7261047 15.9122868 16.0922623 16.2735653 16.4259033 16.5843239 16.7660599 16.9388237 17.1019936 17.3110695 17.5209293 17.7031937 17.9227142 18.122963 18.3221931 18.5450668 18.7535477 18.9598312 19.1671963 19.3826103 19.623724 19.8599453 
20.0965385 20.2329254 20.3838158 20.5974464 20.8155117 21.060503 21.2652893 21.4887047 21.7113495 21.9377251 22.2105446 22.481144 22.7819576 23.0695 23.3892708 23.7103519 24.0679016 24.4327621 24.7797642 25.1766663 25.6156826 26.052969 26.4984055 27.0726681 27.7594299 28.2705231 29.0744038 30.0158768 31.2335892 32.6582947 35.2904205 39.9882851 100.668335 ------------------------------ Transdef: Tab for variable 5 0.808297217 2.3783319 2.76739693 3.05853987 3.30442977 3.49167633 3.67960072 3.84462214 3.98867607 4.09542751 4.18319225 4.28214264 4.37782097 4.47144604 4.562747 4.64628124 4.7178793 4.79136276 4.86889935 4.92903137 5.00338173 5.05963659 5.12378597 5.17973375 5.23924637 5.29077339 5.34283543 5.39446449 5.44626331 5.49451351 5.53022003 5.5699892 5.61647081 5.658288 5.70132971 5.74587297 5.78662872 5.82652283 5.8677454 5.90978813 5.95108509 5.98608398 6.02212572 6.06322908 6.10342789 6.14058018 6.18286324 6.21962643 6.26137829 6.28801918 6.32661629 6.36962605 6.40848684 6.45216846 6.48845863 6.52129412 6.55818081 6.60187197 6.63567162 6.67630005 6.71691751 6.7538619 6.78810692 6.8281908 6.86912155 6.90123034 6.94530296 6.98977184 7.02969265 7.0701046 7.11920452 7.17116833 7.21602249 7.25459766 7.29993629 7.3438735 7.38615608 7.43483448 7.48514748 7.52886534 7.58501434 7.63740158 7.69544888 7.7597847 7.82296991 7.88063145 7.95463085 8.02501869 8.09950256 8.18218899 8.27057838 8.3714962 8.48031807 8.61789894 8.74435425 8.89766598 9.12119865 9.37772179 9.76957893 10.4217567 17.7251911 ------------------------------ Transdef: Tab for variable 6 15.024827 19.6172657 23.6119347 25.6501045 26.5564308 27.3309593 28.1883583 28.9213505 29.5336189 30.2371407 30.7724705 31.4212494 31.9965687 32.5105705 32.9580383 33.3753891 33.7825317 34.1683578 34.572197 34.9541435 35.3005447 35.6733475 35.9818878 36.3560905 36.6942825 36.9879723 37.3015289 37.6811028 38.0180283 38.3616829 38.6476288 38.9570236 39.3147354 39.6611328 39.9924927 40.2822952 40.5729141 40.9025955 41.2722397 41.6022568 42.0025253 42.3228607 42.7103653 42.9617195 43.2283554 43.6070251 43.8847733 44.1863594 44.532444 44.9088211 45.3304214 45.7948608 46.1733398 46.5890656 47.0319023 47.4743423 47.921524 48.307869 48.7717743 49.2212753 49.5889473 50.0151138 50.4226761 50.8669739 51.3411293 51.8680573 52.3550491 52.8859711 53.3712769 53.902195 54.4233475 54.9031868 55.4030838 55.9566689 56.4730453 56.8959084 57.4037628 57.9303169 58.5203743 59.1037445 59.6920166 60.352951 61.0494576 61.7341614 62.4964752 63.3557663 64.2457809 65.1658859 66.2271194 67.3630829 68.7045898 70.0217514 71.6851196 73.5560379 75.8148193 78.2546768 81.7953339 86.4839783 93.1173401 104.259201 259.730835 ------------------------------ Transdef: Tab for variable 7 30.1648712 32.7598495 33.6425018 34.2438965 34.8903465 35.391468 35.8491058 36.3146362 36.7547035 37.1129074 37.4635811 37.7673264 38.0460892 38.2860489 38.6244812 38.8919563 39.1943893 39.4775543 39.8297424 40.1320419 40.4211044 40.7130356 41.0810585 41.4323006 41.7262688 42.0756073 42.4104195 42.7485428 43.1426468 43.6460876 44.1596184 44.5939865 45.0248489 45.480732 45.9019547 46.42173 46.9975281 47.5558624 48.1990089 48.7239914 49.2566376 49.8503189 50.532608 51.1396103 51.7178345 52.2757072 52.8521194 53.4271164 54.0867615 54.7064018 55.2685204 55.9085388 56.4994965 57.1530457 57.8098145 58.4457016 59.0536423 59.8191032 60.555912 61.1294556 61.9301224 62.8045464 63.7340126 64.6661148 65.6577301 66.7935486 68.1777267 69.3022308 70.6124039 71.8628235 73.2229156 74.4837952 75.9744339 77.1336288 
78.616745 79.9592819 81.599472 83.2240982 84.9643555 86.8920364 88.6903687 90.6649628 92.8073273 94.9545746 97.5435867 100.160034 103.093796 106.209351 109.373962 112.903351 116.835953 121.176559 126.646774 131.900803 138.190735 145.95047 156.123749 168.893433 187.886902 219.046356 731.025879 ------------------------------ Transdef: Tab for variable 8 0.972836435 21.4062672 26.501503 28.8332863 30.5398235 31.777298 32.494442 32.9587402 33.4369583 33.9235992 34.3054047 34.7237587 35.0838699 35.4062576 35.6888657 36.0179329 36.3027802 36.5792847 36.8642006 37.125679 37.3540726 37.5908279 37.8427277 38.0862503 38.344162 38.6005516 38.8755798 39.1335678 39.3565903 39.5977173 39.8191376 40.0820389 40.3516083 40.6766586 40.9603882 41.2647705 41.5358734 41.8732109 42.1942596 42.5162125 42.8161011 43.1287766 43.438736 43.7916946 44.126152 44.4356384 44.8392563 45.219017 45.5877838 45.9899826 46.4599838 46.854435 47.2977142 47.7346687 48.0349503 48.3147583 48.7674294 49.1623497 49.5366211 49.9979782 50.432457 50.897377 51.3183365 51.7331581 52.2070007 52.6798019 53.0965195 53.5595398 54.0717354 54.5465851 54.9988556 55.4639816 55.9482803 56.4667587 56.9437332 57.4467621 57.965538 58.5261803 59.1090469 59.675312 60.2563324 60.9425888 61.6384506 62.4454041 63.3043671 64.1534729 65.1537018 66.1386108 67.2241058 68.60112 70.0537491 71.7139587 73.5692596 75.4440384 78.0039673 80.8071136 84.728775 89.5977249 96.6746063 111.219009 428.068787 ------------------------------ Transdef: Tab for variable 9 49.2546883 62.7101555 64.7318649 66.2757721 67.4027405 68.461441 69.391037 70.1400757 70.861618 71.5436554 72.2938766 72.9749908 73.6395493 74.1765442 74.8632431 75.4603043 75.9836731 76.5557404 77.0276184 77.5557709 78.1414795 78.6407013 79.0944214 79.6494446 80.0877914 80.665329 81.1951904 81.6851044 82.2960358 82.7314224 83.3471375 83.8631439 84.5344238 85.0546417 85.7152557 86.4668732 86.9681091 87.6535492 88.4234467 89.0203857 89.5432129 90.2560654 91.0688477 91.7420578 92.5154724 93.1037521 93.9360428 94.7471466 95.5242538 96.1953659 96.8691635 97.6706238 98.3947906 99.1724701 99.9321442 100.783691 101.542755 102.19416 102.952026 103.687912 104.420486 105.163719 105.925507 106.556641 107.248024 107.961151 108.705322 109.407417 110.14183 110.833374 111.554977 112.276459 113.011375 113.718704 114.461914 115.248428 115.932846 116.718262 117.533997 118.363197 119.191528 120.078316 120.97995 121.914307 122.947617 123.990158 125.096313 126.311996 127.618698 129.221191 130.862183 132.867767 135.089325 137.838257 141.174713 145.350815 150.680954 157.944061 170.058243 190.395767 573.537598 ------------------------------ Transdef: Tab for variable 10 0.00602381537 0.905131876 1.19984078 1.38169944 1.52898383 1.65803123 1.76524091 1.85595345 1.93658674 2.02296782 2.08904767 2.15089798 2.20982409 2.26001072 2.3060174 2.35127974 2.39104843 2.42579317 2.46011305 2.49265766 2.52428579 2.55321503 2.58180594 2.61049485 2.63242626 2.65543699 2.67899656 2.70160556 2.72319317 2.74397326 2.76237202 2.77651525 2.79402637 2.80853152 2.82321167 2.83555532 2.84688711 2.85720205 2.8683238 2.87943196 2.88860464 2.89766693 2.90805411 2.91696978 2.92461538 2.93262458 2.9401536 2.94752121 2.95365715 2.96014404 2.9668026 2.97308922 2.97855973 2.98367405 2.98822784 2.99337149 2.99757743 3.00289869 3.00729227 3.01265812 3.01691532 3.02198267 3.02613163 3.02967978 3.0339694 3.03791261 3.0416317 3.0452795 3.04922962 3.05314493 3.05689502 3.06041098 3.06401968 3.06712794 3.07049227 3.07349229 3.0764389 3.07951736 3.08280301 3.08579969 
3.08913064 3.09212732 3.09554815 3.09855223 3.10144758 3.10412216 3.10684204 3.10918427 3.11209917 3.11442852 3.11670852 3.11960173 3.12213922 3.12451363 3.12726474 3.12926579 3.13174343 3.13440037 3.13645983 3.138978 3.14159226 ------------------------------ Transdef: Tab for variable 11 1.09672546E-05 0.00526451971 0.00997996703 0.0165009499 0.0232088566 0.0288931001 0.0374674834 0.0468120463 0.0565824509 0.0631787777 0.071811974 0.0805489123 0.0885443985 0.0976969004 0.10575366 0.113516994 0.120822333 0.128360391 0.135564148 0.144095719 0.151898861 0.159131482 0.166209817 0.173987389 0.180801749 0.187405169 0.193742037 0.20149529 0.208381176 0.215210557 0.22173214 0.228927046 0.234940171 0.241908789 0.248286426 0.254651785 0.261153817 0.266095042 0.272070974 0.27793473 0.283682644 0.290021181 0.296246409 0.302880824 0.309737921 0.315728068 0.321838737 0.327885807 0.333684385 0.340306222 0.346419215 0.351752996 0.358223677 0.363179564 0.368581116 0.374263644 0.380308688 0.385760784 0.390469909 0.395940781 0.402063608 0.407986283 0.413779616 0.419559956 0.424718261 0.430913806 0.436792254 0.442216396 0.448152542 0.455209255 0.460432351 0.466984153 0.473148108 0.479188472 0.48582232 0.49272418 0.499012828 0.505839586 0.512773275 0.520359457 0.526594877 0.535460711 0.54496789 0.553708911 0.563035667 0.573337674 0.582623839 0.593442559 0.604358792 0.616083443 0.628157258 0.639818788 0.653696537 0.668942511 0.687281609 0.708124757 0.733542919 0.762001991 0.796305776 0.851682663 1.1246134 ------------------------------ Transdef: Tab for variable 12 0.103511363 0.196956038 0.221851408 0.241204381 0.255452454 0.268565893 0.280668557 0.290518582 0.30096525 0.308855742 0.317739516 0.325748384 0.334636509 0.342695117 0.349874377 0.355815381 0.362538636 0.369675457 0.376320779 0.383703172 0.389517188 0.395917088 0.40125525 0.405765474 0.41062814 0.414434552 0.4190045 0.422851562 0.427084148 0.430951059 0.435196579 0.439555258 0.443870932 0.447758436 0.452748626 0.456831217 0.460169137 0.464073122 0.467828631 0.471799254 0.475850463 0.479040354 0.482903719 0.486629367 0.490344107 0.493640542 0.497328103 0.501179218 0.505581737 0.509269416 0.513623595 0.518109858 0.521919906 0.525871456 0.529698372 0.534032226 0.538783073 0.543613195 0.548009276 0.55244875 0.556979179 0.561023951 0.565953374 0.571171761 0.576201022 0.580361724 0.585567594 0.590015769 0.594100595 0.598579168 0.603425622 0.608209908 0.614375055 0.620724261 0.625432611 0.631280541 0.637187958 0.644445658 0.650572598 0.657256246 0.663483381 0.669327378 0.676445901 0.684339881 0.69213748 0.699742675 0.705917478 0.714660645 0.723856926 0.73387289 0.743164301 0.753996193 0.764568746 0.779631376 0.793536067 0.807944 0.826784611 0.853235185 0.877633154 0.919062734 1.13539624 ------------------------------ Transdef: Tab for variable 13 20.0262547 21.4070778 22.1673012 22.7672939 23.3186264 23.8649483 24.310955 24.752821 25.1497116 25.5611343 25.9368439 26.292202 26.6816616 27.0404453 27.3618698 27.6728516 27.9880314 28.3602829 28.6620331 28.9723682 29.2879333 29.5636425 29.8502407 30.1443291 30.3792 30.6647511 30.9449883 31.203867 31.5024128 31.8018341 32.0846329 32.365303 32.6021576 32.8757782 33.174324 33.4974594 33.7823639 34.1247978 34.3623657 34.6650467 34.9769516 35.2550812 35.5587616 35.8857193 36.1389389 36.4171944 36.8050308 37.159462 37.4546432 37.7935562 38.1301575 38.5020676 38.8481827 39.2666855 39.6191254 39.9913597 40.4220886 40.8439102 41.2786026 41.7097931 42.21035 42.6282082 43.0663681 43.5197372 44.0228729 44.5060539 
45.0692902 45.6301193 46.1849213 46.7575188 47.3041687 47.9194946 48.571228 49.2748413 49.9539566 50.6522827 51.4174461 52.1341476 52.9914017 53.9254227 54.8540878 55.7364159 56.89431 58.040062 59.1106339 60.1561775 61.5390396 63.0702133 64.6955414 66.2699814 67.8988495 69.9881516 71.9356995 74.1777191 77.2247696 80.3827209 84.6676559 90.2144394 96.6658173 109.544937 232.717926 ------------------------------ Transdef: Tab for variable 14 10.0028534 10.5356941 10.8583193 11.2114315 11.4319725 11.7153568 11.9511967 12.2427006 12.4192467 12.6695013 12.9384651 13.1935139 13.4005919 13.6271667 13.8478775 14.0927553 14.3226624 14.5397921 14.7403088 14.9341955 15.1855478 15.3958607 15.5939426 15.7959356 15.997736 16.2277622 16.4086895 16.6046982 16.786335 16.9838333 17.1867409 17.4125557 17.647625 17.8403969 18.0680981 18.2691422 18.4960995 18.6865158 18.8896408 19.084919 19.2945633 19.5331802 19.7786331 20.0116043 20.2651424 20.4960899 20.6770477 20.9079475 21.1504555 21.3641453 21.6152382 21.8229904 22.0249271 22.2609921 22.5060577 22.744482 23.0155945 23.2577591 23.5441322 23.7902756 24.0598602 24.3410759 24.5916023 24.8757858 25.1724243 25.4315472 25.6720009 25.9619389 26.2777443 26.5612373 26.869524 27.1914921 27.4972458 27.8877831 28.2579556 28.5957985 29.0300102 29.4269333 29.8861008 30.360384 30.858429 31.3659935 31.8752384 32.3901138 33.0054932 33.6249313 34.3200378 35.1000557 35.9433022 36.7810173 37.7318726 38.8634567 39.8670044 41.080246 42.5821609 44.2534714 46.1335907 48.955986 53.5507965 60.9325256 115.518631 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 54.6 36.1 34.2 6.9 30.9 53.4 37.3 49.4 -30.7 -9.9 -22.5 4.8 5.5 2 54.6 100.0 63.3 49.6 15.0 57.5 94.0 62.9 89.9 -51.0 -23.6 -44.6 30.4 21.9 3 36.1 63.3 100.0 26.8 -6.2 24.3 66.5 45.5 69.2 -19.5 -23.4 -48.1 64.0 8.4 4 34.2 49.6 26.8 100.0 -1.9 21.5 51.7 37.1 54.3 -14.5 -27.5 -53.9 9.4 68.8 5 6.9 15.0 -6.2 -1.9 100.0 79.8 -13.3 44.2 42.4 31.9 0.1 4.0 -11.8 -6.6 6 30.9 57.5 24.3 21.5 79.8 100.0 34.7 71.2 73.6 -2.3 -10.7 -17.6 5.3 5.5 7 53.4 94.0 66.5 51.7 -13.3 34.7 100.0 55.9 77.2 -57.8 -23.0 -45.9 34.8 24.9 8 37.3 62.9 45.5 37.1 44.2 71.2 55.9 100.0 75.1 -8.4 -20.5 -33.7 21.3 17.4 9 49.4 89.9 69.2 54.3 42.4 73.6 77.2 75.1 100.0 -24.9 -25.8 -49.7 35.1 26.1 10 -30.7 -51.0 -19.5 -14.5 31.9 -2.3 -57.8 -8.4 -24.9 100.0 5.8 12.1 -7.7 -4.0 11 -9.9 -23.6 -23.4 -27.5 0.1 -10.7 -23.0 -20.5 -25.8 5.8 100.0 55.9 -12.0 -19.4 12 -22.5 -44.6 -48.1 -53.9 4.0 -17.6 -45.9 -33.7 -49.7 12.1 55.9 100.0 -24.6 -30.5 13 4.8 30.4 64.0 9.4 -11.8 5.3 34.8 21.3 35.1 -7.7 -12.0 -24.6 100.0 38.9 14 5.5 21.9 8.4 68.8 -6.6 5.5 24.9 17.4 26.1 -4.0 -19.4 -30.5 38.9 100.0 TOTAL CORRELATION TO TARGET (diagonal) 121.424377 TOTAL CORRELATION OF ALL VARIABLES 59.5265313 ROUND 1: MAX CORR ( 59.5263566) AFTER KILLING INPUT VARIABLE 6 CONTR 0.144211689 ROUND 2: MAX CORR ( 59.5170861) AFTER KILLING INPUT VARIABLE 11 CONTR 1.05051969 ROUND 3: MAX CORR ( 59.4527227) AFTER KILLING INPUT VARIABLE 8 CONTR 2.76718335 ROUND 4: MAX CORR ( 59.3665997) AFTER KILLING INPUT VARIABLE 2 CONTR 3.19891788 ROUND 5: MAX CORR ( 59.3330059) AFTER KILLING INPUT VARIABLE 9 CONTR 1.99689012 ROUND 6: MAX CORR ( 59.0287512) AFTER KILLING INPUT VARIABLE 10 CONTR 6.00101061 ROUND 7: MAX CORR ( 58.6204121) AFTER KILLING INPUT VARIABLE 14 CONTR 6.93114372 ROUND 8: MAX CORR ( 58.1537548) AFTER KILLING INPUT VARIABLE 12 CONTR 7.38197208 ROUND 9: MAX CORR ( 57.8078039) AFTER KILLING INPUT 
VARIABLE 4 CONTR 6.33379891 ROUND 10: MAX CORR ( 56.8898443) AFTER KILLING INPUT VARIABLE 3 CONTR 10.2609846 ROUND 11: MAX CORR ( 55.3698002) AFTER KILLING INPUT VARIABLE 5 CONTR 13.0629096 ROUND 12: MAX CORR ( 53.3932167) AFTER KILLING INPUT VARIABLE 13 CONTR 14.6621687 LAST REMAINING VARIABLE: 7 total correlation to target: 59.5265313 % total significance: 116.658211 sigma correlations of single variables to target: variable 2: 54.6072119 % , in sigma: 107.017485 variable 3: 36.1251488 % , in sigma: 70.7969229 variable 4: 34.2083597 % , in sigma: 67.0404603 variable 5: 6.92675786 % , in sigma: 13.5748407 variable 6: 30.876897 % , in sigma: 60.5115652 variable 7: 53.3932167 % , in sigma: 104.638336 variable 8: 37.285973 % , in sigma: 73.0718693 variable 9: 49.4389471 % , in sigma: 96.8888832 variable 10: -30.6520217 % , in sigma: 60.0708616 variable 11: -9.89800404 % , in sigma: 19.3977949 variable 12: -22.5460838 % , in sigma: 44.1851011 variable 13: 4.83770742 % , in sigma: 9.48078583 variable 14: 5.54937544 % , in sigma: 10.8754903 variables sorted by significance: 1 most relevant variable 7 corr 53.3932152 , in sigma: 104.638333 2 most relevant variable 13 corr 14.6621685 , in sigma: 28.7344536 3 most relevant variable 5 corr 13.0629091 , in sigma: 25.6002757 4 most relevant variable 3 corr 10.2609844 , in sigma: 20.1091524 5 most relevant variable 4 corr 6.33379889 , in sigma: 12.4127785 6 most relevant variable 12 corr 7.38197231 , in sigma: 14.4669556 7 most relevant variable 14 corr 6.93114376 , in sigma: 13.5834361 8 most relevant variable 10 corr 6.00101042 , in sigma: 11.7605902 9 most relevant variable 9 corr 1.99689007 , in sigma: 3.91344193 10 most relevant variable 2 corr 3.19891787 , in sigma: 6.26913795 11 most relevant variable 8 corr 2.7671833 , in sigma: 5.42303823 12 most relevant variable 11 corr 1.0505197 , in sigma: 2.05877526 13 most relevant variable 6 corr 0.144211695 , in sigma: 0.282621513 global correlations between input variables: variable 2: 99.0405055 % variable 3: 93.4536123 % variable 4: 90.7133181 % variable 5: 95.0124027 % variable 6: 93.4864942 % variable 7: 98.5762413 % variable 8: 85.20342 % variable 9: 98.7291483 % variable 10: 72.8390543 % variable 11: 57.193588 % variable 12: 74.8291496 % variable 13: 84.4893218 % variable 14: 86.4663313 % significance loss when removing single variables: variable 2: corr = 4.06804803 % , sigma = 7.97243173 variable 3: corr = 9.01464676 % , sigma = 17.6666193 variable 4: corr = 12.2810733 % , sigma = 24.0680586 variable 5: corr = 5.07071484 % , sigma = 9.93742639 variable 6: corr = 0.144211689 % , sigma = 0.282621502 variable 7: corr = 3.63029638 % , sigma = 7.1145399 variable 8: corr = 2.60521228 % , sigma = 5.10561254 variable 9: corr = 4.30464077 % , sigma = 8.43609869 variable 10: corr = 5.615355 % , sigma = 11.0047949 variable 11: corr = 1.04687748 % , sigma = 2.05163733 variable 12: corr = 5.92742736 % , sigma = 11.6163845 variable 13: corr = 7.11738345 % , sigma = 13.9484227 variable 14: corr = 6.74891308 % , sigma = 13.2263062 Keep only 11 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 12 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 11 --> 22.8770256 sigma out 15 active outputs RANK 2 NODE 3 --> 16.9567738 sigma out 15 active outputs RANK 3 NODE 7 --> 15.5547915 sigma out 15 active outputs RANK 4 NODE 4 
--> 15.5275183 sigma out 15 active outputs RANK 5 NODE 1 --> 14.712038 sigma out 15 active outputs RANK 6 NODE 5 --> 14.4946613 sigma out 15 active outputs RANK 7 NODE 2 --> 13.215868 sigma out 15 active outputs RANK 8 NODE 10 --> 12.3200569 sigma out 15 active outputs RANK 9 NODE 8 --> 11.2603512 sigma out 15 active outputs RANK 10 NODE 6 --> 10.1821289 sigma out 15 active outputs RANK 11 NODE 9 --> 10.1612177 sigma out 15 active outputs RANK 12 NODE 12 --> 8.4189949 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 15 --> 35.032505 sigma in 12act. ( 35.0873718 sig out 1act.) RANK 2 NODE 2 --> 17.1195621 sigma in 12act. ( 16.5771561 sig out 1act.) RANK 3 NODE 6 --> 14.5269728 sigma in 12act. ( 15.3062592 sig out 1act.) RANK 4 NODE 1 --> 12.6842394 sigma in 12act. ( 13.9829645 sig out 1act.) RANK 5 NODE 12 --> 10.3436823 sigma in 12act. ( 9.74045181 sig out 1act.) RANK 6 NODE 10 --> 9.09179974 sigma in 12act. ( 9.40880108 sig out 1act.) RANK 7 NODE 5 --> 8.12064552 sigma in 12act. ( 12.5091171 sig out 1act.) RANK 8 NODE 13 --> 8.11371994 sigma in 12act. ( 6.39837742 sig out 1act.) RANK 9 NODE 8 --> 7.04386234 sigma in 12act. ( 6.94599485 sig out 1act.) RANK 10 NODE 11 --> 6.92629576 sigma in 12act. ( 9.71792412 sig out 1act.) RANK 11 NODE 7 --> 6.63261795 sigma in 12act. ( 9.07929134 sig out 1act.) RANK 12 NODE 4 --> 6.18319988 sigma in 12act. ( 4.45946407 sig out 1act.) RANK 13 NODE 3 --> 6.03939104 sigma in 12act. ( 5.73077345 sig out 1act.) RANK 14 NODE 14 --> 4.34687996 sigma in 12act. ( 4.66818333 sig out 1act.) RANK 15 NODE 9 --> 1.62412822 sigma in 12act. ( 2.74313259 sig out 1act.) sorted by output significance RANK 1 NODE 15 --> 35.0873718 sigma out 1act.( 35.032505 sig in 12act.) RANK 2 NODE 2 --> 16.5771561 sigma out 1act.( 17.1195621 sig in 12act.) RANK 3 NODE 6 --> 15.3062592 sigma out 1act.( 14.5269728 sig in 12act.) RANK 4 NODE 1 --> 13.9829645 sigma out 1act.( 12.6842394 sig in 12act.) RANK 5 NODE 5 --> 12.5091171 sigma out 1act.( 8.12064552 sig in 12act.) RANK 6 NODE 12 --> 9.74045181 sigma out 1act.( 10.3436823 sig in 12act.) RANK 7 NODE 11 --> 9.71792412 sigma out 1act.( 6.92629576 sig in 12act.) RANK 8 NODE 10 --> 9.40880108 sigma out 1act.( 9.09179974 sig in 12act.) RANK 9 NODE 7 --> 9.07929134 sigma out 1act.( 6.63261795 sig in 12act.) RANK 10 NODE 8 --> 6.94599485 sigma out 1act.( 7.04386234 sig in 12act.) RANK 11 NODE 13 --> 6.39837742 sigma out 1act.( 8.11371994 sig in 12act.) RANK 12 NODE 3 --> 5.73077345 sigma out 1act.( 6.03939104 sig in 12act.) RANK 13 NODE 14 --> 4.66818333 sigma out 1act.( 4.34687996 sig in 12act.) RANK 14 NODE 4 --> 4.45946407 sigma out 1act.( 6.18319988 sig in 12act.) RANK 15 NODE 9 --> 2.74313259 sigma out 1act.( 1.62412822 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 51.2221985 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 11 --> 30.5074902 sigma out 15 active outputs RANK 2 NODE 3 --> 18.6578922 sigma out 15 active outputs RANK 3 NODE 1 --> 16.9956779 sigma out 15 active outputs RANK 4 NODE 7 --> 16.7505779 sigma out 15 active outputs RANK 5 NODE 4 --> 16.4388294 sigma out 15 active outputs RANK 6 NODE 2 --> 15.9211769 sigma out 15 active outputs RANK 7 NODE 5 --> 15.8694372 sigma out 15 active outputs RANK 8 NODE 10 --> 14.3079977 sigma out 15 active outputs RANK 9 NODE 8 --> 13.5696249 sigma out 15 active outputs RANK 10 NODE 9 --> 12.6441479 sigma out 15 active outputs RANK 11 NODE 6 --> 12.141861 sigma out 15 active outputs RANK 12 NODE 12 --> 10.4189482 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 15 --> 37.7665672 sigma in 12act. ( 34.7144279 sig out 1act.) RANK 2 NODE 2 --> 16.231184 sigma in 12act. ( 15.6330929 sig out 1act.) RANK 3 NODE 5 --> 16.1863308 sigma in 12act. ( 13.7090311 sig out 1act.) RANK 4 NODE 6 --> 15.1937456 sigma in 12act. ( 14.4199705 sig out 1act.) RANK 5 NODE 1 --> 14.1327763 sigma in 12act. ( 13.5213909 sig out 1act.) RANK 6 NODE 11 --> 13.5964718 sigma in 12act. ( 9.89502525 sig out 1act.) RANK 7 NODE 7 --> 13.2330418 sigma in 12act. ( 9.73615265 sig out 1act.) RANK 8 NODE 9 --> 12.2634573 sigma in 12act. ( 5.92295456 sig out 1act.) RANK 9 NODE 12 --> 10.5420952 sigma in 12act. ( 8.83675861 sig out 1act.) RANK 10 NODE 10 --> 10.1044865 sigma in 12act. ( 9.31297493 sig out 1act.) RANK 11 NODE 3 --> 9.79294491 sigma in 12act. ( 5.63973761 sig out 1act.) RANK 12 NODE 13 --> 8.97073078 sigma in 12act. ( 5.57890511 sig out 1act.) RANK 13 NODE 8 --> 7.77071142 sigma in 12act. ( 6.76983738 sig out 1act.) RANK 14 NODE 14 --> 6.87616539 sigma in 12act. ( 5.08180237 sig out 1act.) RANK 15 NODE 4 --> 6.20714664 sigma in 12act. ( 3.43356133 sig out 1act.) sorted by output significance RANK 1 NODE 15 --> 34.7144279 sigma out 1act.( 37.7665672 sig in 12act.) RANK 2 NODE 2 --> 15.6330929 sigma out 1act.( 16.231184 sig in 12act.) RANK 3 NODE 6 --> 14.4199705 sigma out 1act.( 15.1937456 sig in 12act.) RANK 4 NODE 5 --> 13.7090311 sigma out 1act.( 16.1863308 sig in 12act.) RANK 5 NODE 1 --> 13.5213909 sigma out 1act.( 14.1327763 sig in 12act.) RANK 6 NODE 11 --> 9.89502525 sigma out 1act.( 13.5964718 sig in 12act.) RANK 7 NODE 7 --> 9.73615265 sigma out 1act.( 13.2330418 sig in 12act.) RANK 8 NODE 10 --> 9.31297493 sigma out 1act.( 10.1044865 sig in 12act.) RANK 9 NODE 12 --> 8.83675861 sigma out 1act.( 10.5420952 sig in 12act.) RANK 10 NODE 8 --> 6.76983738 sigma out 1act.( 7.77071142 sig in 12act.) RANK 11 NODE 9 --> 5.92295456 sigma out 1act.( 12.2634573 sig in 12act.) RANK 12 NODE 3 --> 5.63973761 sigma out 1act.( 9.79294491 sig in 12act.) RANK 13 NODE 13 --> 5.57890511 sigma out 1act.( 8.97073078 sig in 12act.) RANK 14 NODE 14 --> 5.08180237 sigma out 1act.( 6.87616539 sig in 12act.) RANK 15 NODE 4 --> 3.43356133 sigma out 1act.( 6.20714664 sig in 12act.) 
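[Editor's note] The layer numbering in these significance tables follows the topology reported above (Nodes(1) = 12, Nodes(2) = 15, Nodes(3) = 1): presumably the 11 kept input variables plus a bias node feed 15 hidden nodes, which feed the single classification output. Purely as a reference for that structure, here is a minimal NumPy sketch of one feed-forward pass through a 12-15-1 network; the random weights and the tanh/sigmoid activations are illustrative assumptions, not the NeuroBayes internals.

import numpy as np

rng = np.random.default_rng(2)
n_in, n_hidden, n_out = 12, 15, 1        # Nodes(1), Nodes(2), Nodes(3) from the log
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

def forward(x):
    # layer 1 -> layer 2 (tanh assumed), layer 2 -> layer 3 (sigmoid output assumed)
    h = np.tanh(x @ W1)
    return 1.0 / (1.0 + np.exp(-(h @ W2)))

x = np.append(rng.normal(size=n_in - 1), 1.0)   # 11 preprocessed inputs + a bias term
print(forward(x))                                # classification output in (0, 1)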
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 50.6672173 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.432212979 *** contribution from regularisation: 0.00480291061 *** contribution from error: -0.437015891 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.469193429 *** contribution from regularisation: 0.00269185146 *** contribution from error: -0.471885294 *********************************************** -----------------> Test sample ENTER BFGS code START -43396.9707 0.340640366 0.0241746232 EXIT FROM BFGS code FG_START 0. 0.340640366 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.482392251 *** contribution from regularisation: 0.00249786745 *** contribution from error: -0.484890133 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -44608.2582 0.340640366 171.507706 EXIT FROM BFGS code FG_LNSRCH 0. 0.382954478 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.499734759 *** contribution from regularisation: 0.00329473079 *** contribution from error: -0.503029466 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46211.9733 0.382954478 33.8337402 EXIT FROM BFGS code NEW_X -46211.9733 0.382954478 33.8337402 ENTER BFGS code NEW_X -46211.9733 0.382954478 33.8337402 EXIT FROM BFGS code FG_LNSRCH 0. 0.386487275 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.502720833 *** contribution from regularisation: 0.0031309172 *** contribution from error: -0.505851746 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46488.106 0.386487275 18.0888309 EXIT FROM BFGS code NEW_X -46488.106 0.386487275 18.0888309 ENTER BFGS code NEW_X -46488.106 0.386487275 18.0888309 EXIT FROM BFGS code FG_LNSRCH 0. 0.391007543 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.503136337 *** contribution from regularisation: 0.00310326507 *** contribution from error: -0.506239593 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46526.5276 0.391007543 2.26579642 EXIT FROM BFGS code NEW_X -46526.5276 0.391007543 2.26579642 ENTER BFGS code NEW_X -46526.5276 0.391007543 2.26579642 EXIT FROM BFGS code FG_LNSRCH 0. 0.39592728 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.503471434 *** contribution from regularisation: 0.00310390652 *** contribution from error: -0.506575346 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46557.514 0.39592728 -26.0970879 EXIT FROM BFGS code NEW_X -46557.514 0.39592728 -26.0970879 ENTER BFGS code NEW_X -46557.514 0.39592728 -26.0970879 EXIT FROM BFGS code FG_LNSRCH 0. 0.391258746 0. 
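[Editor's note] Each "Learn Path" block above reports the total loss together with its two contributions, and the numbers add up as loss = error + regularisation (for Learn Path 1: -0.437015891 + 0.00480291061 = -0.432212979). The error term is the entropy loss selected in the setup; NeuroBayes prints both pieces in its own normalisation and sign convention. A minimal sketch of that bookkeeping, with a generic weighted cross-entropy and a quadratic weight-decay term standing in for the "standard regularisation" (an assumption), is:

import numpy as np

def entropy_error(y_true, y_pred, event_weights):
    # weighted cross-entropy ("entropy as a loss function")
    eps = 1e-12
    ll = y_true * np.log(y_pred + eps) + (1 - y_true) * np.log(1 - y_pred + eps)
    return -np.average(ll, weights=event_weights)

def total_loss(y_true, y_pred, event_weights, net_weights, lam=1e-3):
    err = entropy_error(y_true, y_pred, event_weights)
    reg = lam * np.sum(net_weights ** 2)   # stand-in for the regularisation term
    return err + reg, err, reg             # total, error part, regularisation part

# toy usage
rng = np.random.default_rng(3)
y_true = rng.integers(0, 2, 1000).astype(float)
y_pred = rng.uniform(0.05, 0.95, 1000)
w_evt = np.ones(1000)
total, err, reg = total_loss(y_true, y_pred, w_evt, rng.normal(size=200))
print(total, err, reg)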
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.503788769 *** contribution from regularisation: 0.00309796399 *** contribution from error: -0.506886721 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46586.8598 0.391258746 -22.71698 EXIT FROM BFGS code FG_LNSRCH 0. 0.372584641 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.504820645 *** contribution from regularisation: 0.00296779349 *** contribution from error: -0.50778842 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46682.2814 0.372584641 -7.29576445 EXIT FROM BFGS code NEW_X -46682.2814 0.372584641 -7.29576445 ENTER BFGS code NEW_X -46682.2814 0.372584641 -7.29576445 EXIT FROM BFGS code FG_LNSRCH 0. 0.318560302 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 11 --> 41.2604408 sigma out 15 active outputs RANK 2 NODE 4 --> 24.0989189 sigma out 15 active outputs RANK 3 NODE 1 --> 23.881567 sigma out 15 active outputs RANK 4 NODE 3 --> 19.1299076 sigma out 15 active outputs RANK 5 NODE 8 --> 17.5885868 sigma out 15 active outputs RANK 6 NODE 2 --> 14.8708601 sigma out 15 active outputs RANK 7 NODE 10 --> 9.64952278 sigma out 15 active outputs RANK 8 NODE 12 --> 9.09469414 sigma out 15 active outputs RANK 9 NODE 9 --> 8.74904823 sigma out 15 active outputs RANK 10 NODE 5 --> 7.50379515 sigma out 15 active outputs RANK 11 NODE 7 --> 6.96784449 sigma out 15 active outputs RANK 12 NODE 6 --> 6.40443039 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 9 --> 40.1620522 sigma in 12act. ( 47.0469055 sig out 1act.) RANK 2 NODE 7 --> 26.0150013 sigma in 12act. ( 27.9347324 sig out 1act.) RANK 3 NODE 11 --> 19.5940323 sigma in 12act. ( 19.58391 sig out 1act.) RANK 4 NODE 10 --> 17.8772926 sigma in 12act. ( 19.3155003 sig out 1act.) RANK 5 NODE 15 --> 17.4306259 sigma in 12act. ( 13.4081726 sig out 1act.) RANK 6 NODE 5 --> 16.5546169 sigma in 12act. ( 14.3368778 sig out 1act.) RANK 7 NODE 1 --> 13.5231037 sigma in 12act. ( 12.7637186 sig out 1act.) RANK 8 NODE 14 --> 10.7940273 sigma in 12act. ( 9.28746414 sig out 1act.) RANK 9 NODE 13 --> 8.36381245 sigma in 12act. ( 7.6764307 sig out 1act.) RANK 10 NODE 2 --> 7.99884605 sigma in 12act. ( 6.4493947 sig out 1act.) RANK 11 NODE 4 --> 6.53327322 sigma in 12act. ( 6.42066765 sig out 1act.) RANK 12 NODE 3 --> 5.58920288 sigma in 12act. ( 3.02558064 sig out 1act.) RANK 13 NODE 8 --> 5.16177607 sigma in 12act. ( 3.56552482 sig out 1act.) RANK 14 NODE 6 --> 4.67921448 sigma in 12act. ( 1.60784185 sig out 1act.) RANK 15 NODE 12 --> 4.61019278 sigma in 12act. ( 3.66310239 sig out 1act.) sorted by output significance RANK 1 NODE 9 --> 47.0469055 sigma out 1act.( 40.1620522 sig in 12act.) RANK 2 NODE 7 --> 27.9347324 sigma out 1act.( 26.0150013 sig in 12act.) RANK 3 NODE 11 --> 19.58391 sigma out 1act.( 19.5940323 sig in 12act.) RANK 4 NODE 10 --> 19.3155003 sigma out 1act.( 17.8772926 sig in 12act.) RANK 5 NODE 5 --> 14.3368778 sigma out 1act.( 16.5546169 sig in 12act.) RANK 6 NODE 15 --> 13.4081726 sigma out 1act.( 17.4306259 sig in 12act.) RANK 7 NODE 1 --> 12.7637186 sigma out 1act.( 13.5231037 sig in 12act.) 
RANK 8 NODE 14 --> 9.28746414 sigma out 1act.( 10.7940273 sig in 12act.) RANK 9 NODE 13 --> 7.6764307 sigma out 1act.( 8.36381245 sig in 12act.) RANK 10 NODE 2 --> 6.4493947 sigma out 1act.( 7.99884605 sig in 12act.) RANK 11 NODE 4 --> 6.42066765 sigma out 1act.( 6.53327322 sig in 12act.) RANK 12 NODE 12 --> 3.66310239 sigma out 1act.( 4.61019278 sig in 12act.) RANK 13 NODE 8 --> 3.56552482 sigma out 1act.( 5.16177607 sig in 12act.) RANK 14 NODE 3 --> 3.02558064 sigma out 1act.( 5.58920288 sig in 12act.) RANK 15 NODE 6 --> 1.60784185 sigma out 1act.( 4.67921448 sig in 12act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 67.5610199 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.506403267 *** contribution from regularisation: 0.00310428208 *** contribution from error: -0.509507537 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -46828.6283 0.318560302 5.63863897 EXIT FROM BFGS code NEW_X -46828.6283 0.318560302 5.63863897 ENTER BFGS code NEW_X -46828.6283 0.318560302 5.63863897 EXIT FROM BFGS code FG_LNSRCH 0. 0.300076783 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.50892508 *** contribution from regularisation: 0.00333195576 *** contribution from error: -0.51225704 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47061.8276 0.300076783 12.4303122 EXIT FROM BFGS code NEW_X -47061.8276 0.300076783 12.4303122 ENTER BFGS code NEW_X -47061.8276 0.300076783 12.4303122 EXIT FROM BFGS code FG_LNSRCH 0. 0.277630448 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.509753644 *** contribution from regularisation: 0.00342140067 *** contribution from error: -0.51317507 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47138.4512 0.277630448 2.06942344 EXIT FROM BFGS code NEW_X -47138.4512 0.277630448 2.06942344 ENTER BFGS code NEW_X -47138.4512 0.277630448 2.06942344 EXIT FROM BFGS code FG_LNSRCH 0. 0.288083553 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.510477126 *** contribution from regularisation: 0.00305322115 *** contribution from error: -0.513530374 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47205.3491 0.288083553 -3.97099233 EXIT FROM BFGS code NEW_X -47205.3491 0.288083553 -3.97099233 ENTER BFGS code NEW_X -47205.3491 0.288083553 -3.97099233 EXIT FROM BFGS code FG_LNSRCH 0. 0.293460518 0. 
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.511180043 *** contribution from regularisation: 0.00307604601 *** contribution from error: -0.51425606 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47270.3534 0.293460518 2.80599427 EXIT FROM BFGS code NEW_X -47270.3534 0.293460518 2.80599427 ENTER BFGS code NEW_X -47270.3534 0.293460518 2.80599427 EXIT FROM BFGS code FG_LNSRCH 0. 0.286382109 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.51186502 *** contribution from regularisation: 0.00332807493 *** contribution from error: -0.515193105 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47333.6956 0.286382109 7.88280869 EXIT FROM BFGS code NEW_X -47333.6956 0.286382109 7.88280869 ENTER BFGS code NEW_X -47333.6956 0.286382109 7.88280869 EXIT FROM BFGS code FG_LNSRCH 0. 0.2788257 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.51337868 *** contribution from regularisation: 0.00321681215 *** contribution from error: -0.516595483 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47473.6694 0.2788257 -21.6445923 EXIT FROM BFGS code NEW_X -47473.6694 0.2788257 -21.6445923 ENTER BFGS code NEW_X -47473.6694 0.2788257 -21.6445923 EXIT FROM BFGS code FG_LNSRCH 0. 0.27023235 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.513765275 *** contribution from regularisation: 0.00323504303 *** contribution from error: -0.517000318 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47509.4163 0.27023235 -18.5387821 EXIT FROM BFGS code NEW_X -47509.4163 0.27023235 -18.5387821 ENTER BFGS code NEW_X -47509.4163 0.27023235 -18.5387821 EXIT FROM BFGS code FG_LNSRCH 0. 0.245785713 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.514215052 *** contribution from regularisation: 0.0034548603 *** contribution from error: -0.517669916 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47551.0109 0.245785713 -4.75902081 EXIT FROM BFGS code NEW_X -47551.0109 0.245785713 -4.75902081 ENTER BFGS code NEW_X -47551.0109 0.245785713 -4.75902081 EXIT FROM BFGS code FG_LNSRCH 0. 0.235389099 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.514743686 *** contribution from regularisation: 0.00329818483 *** contribution from error: -0.518041849 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47599.8928 0.235389099 -9.48735332 EXIT FROM BFGS code NEW_X -47599.8928 0.235389099 -9.48735332 ENTER BFGS code NEW_X -47599.8928 0.235389099 -9.48735332 EXIT FROM BFGS code FG_LNSRCH 0. 0.22618416 0. 
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 11 --> 69.0576477 sigma out 15 active outputs RANK 2 NODE 4 --> 53.3891602 sigma out 15 active outputs RANK 3 NODE 1 --> 30.2136173 sigma out 15 active outputs RANK 4 NODE 3 --> 23.1160488 sigma out 15 active outputs RANK 5 NODE 8 --> 20.3727646 sigma out 15 active outputs RANK 6 NODE 2 --> 19.702013 sigma out 15 active outputs RANK 7 NODE 5 --> 15.5894699 sigma out 15 active outputs RANK 8 NODE 12 --> 15.3553219 sigma out 15 active outputs RANK 9 NODE 7 --> 15.3315668 sigma out 15 active outputs RANK 10 NODE 9 --> 10.4187374 sigma out 15 active outputs RANK 11 NODE 6 --> 5.74208736 sigma out 15 active outputs RANK 12 NODE 10 --> 5.46185637 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 9 --> 70.0905457 sigma in 12act. ( 100.11235 sig out 1act.) RANK 2 NODE 7 --> 52.8011131 sigma in 12act. ( 55.5211182 sig out 1act.) RANK 3 NODE 10 --> 36.8488426 sigma in 12act. ( 42.2795677 sig out 1act.) RANK 4 NODE 11 --> 20.8314171 sigma in 12act. ( 22.4939423 sig out 1act.) RANK 5 NODE 1 --> 20.517231 sigma in 12act. ( 24.7269402 sig out 1act.) RANK 6 NODE 14 --> 15.8409033 sigma in 12act. ( 18.1708889 sig out 1act.) RANK 7 NODE 4 --> 12.3842735 sigma in 12act. ( 14.1954966 sig out 1act.) RANK 8 NODE 5 --> 12.2413282 sigma in 12act. ( 11.1641359 sig out 1act.) RANK 9 NODE 13 --> 11.0015917 sigma in 12act. ( 12.6678076 sig out 1act.) RANK 10 NODE 15 --> 10.2428913 sigma in 12act. ( 9.04201698 sig out 1act.) RANK 11 NODE 6 --> 5.65064478 sigma in 12act. ( 5.35824299 sig out 1act.) RANK 12 NODE 2 --> 3.04880452 sigma in 12act. ( 1.11702073 sig out 1act.) RANK 13 NODE 12 --> 2.78611493 sigma in 12act. ( 2.43838072 sig out 1act.) RANK 14 NODE 3 --> 2.75563097 sigma in 12act. ( 1.26500583 sig out 1act.) RANK 15 NODE 8 --> 2.22535205 sigma in 12act. ( 0.116889693 sig out 1act.) sorted by output significance RANK 1 NODE 9 --> 100.11235 sigma out 1act.( 70.0905457 sig in 12act.) RANK 2 NODE 7 --> 55.5211182 sigma out 1act.( 52.8011131 sig in 12act.) RANK 3 NODE 10 --> 42.2795677 sigma out 1act.( 36.8488426 sig in 12act.) RANK 4 NODE 1 --> 24.7269402 sigma out 1act.( 20.517231 sig in 12act.) RANK 5 NODE 11 --> 22.4939423 sigma out 1act.( 20.8314171 sig in 12act.) RANK 6 NODE 14 --> 18.1708889 sigma out 1act.( 15.8409033 sig in 12act.) RANK 7 NODE 4 --> 14.1954966 sigma out 1act.( 12.3842735 sig in 12act.) RANK 8 NODE 13 --> 12.6678076 sigma out 1act.( 11.0015917 sig in 12act.) RANK 9 NODE 5 --> 11.1641359 sigma out 1act.( 12.2413282 sig in 12act.) RANK 10 NODE 15 --> 9.04201698 sigma out 1act.( 10.2428913 sig in 12act.) RANK 11 NODE 6 --> 5.35824299 sigma out 1act.( 5.65064478 sig in 12act.) RANK 12 NODE 12 --> 2.43838072 sigma out 1act.( 2.78611493 sig in 12act.) RANK 13 NODE 3 --> 1.26500583 sigma out 1act.( 2.75563097 sig in 12act.) RANK 14 NODE 2 --> 1.11702073 sigma out 1act.( 3.04880452 sig in 12act.) RANK 15 NODE 8 --> 0.116889693 sigma out 1act.( 2.22535205 sig in 12act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 130.177261 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.515010118 *** contribution from regularisation: 0.00329771359 *** contribution from error: -0.518307805 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -47624.5332 0.22618416 -10.6028481 EXIT FROM BFGS code NEW_X -47624.5332 0.22618416 -10.6028481 ENTER BFGS code NEW_X -47624.5332 0.22618416 -10.6028481 EXIT FROM BFGS code FG_LNSRCH 0. 0.203132629 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.515470028 *** contribution from regularisation: 0.00336693996 *** contribution from error: -0.518836975 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47667.0577 0.203132629 -12.0851297 EXIT FROM BFGS code NEW_X -47667.0577 0.203132629 -12.0851297 ENTER BFGS code NEW_X -47667.0577 0.203132629 -12.0851297 EXIT FROM BFGS code FG_LNSRCH 0. 0.164929524 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.515772581 *** contribution from regularisation: 0.00344382063 *** contribution from error: -0.519216418 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47695.0361 0.164929524 -0.833481312 EXIT FROM BFGS code NEW_X -47695.0361 0.164929524 -0.833481312 ENTER BFGS code NEW_X -47695.0361 0.164929524 -0.833481312 EXIT FROM BFGS code FG_LNSRCH 0. 0.165571168 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.516403437 *** contribution from regularisation: 0.00321440748 *** contribution from error: -0.519617856 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47753.3746 0.165571168 -7.02341795 EXIT FROM BFGS code NEW_X -47753.3746 0.165571168 -7.02341795 ENTER BFGS code NEW_X -47753.3746 0.165571168 -7.02341795 EXIT FROM BFGS code FG_LNSRCH 0. 0.168174699 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.516544461 *** contribution from regularisation: 0.00317218178 *** contribution from error: -0.51971662 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47766.4132 0.168174699 -5.53935385 EXIT FROM BFGS code NEW_X -47766.4132 0.168174699 -5.53935385 ENTER BFGS code NEW_X -47766.4132 0.168174699 -5.53935385 EXIT FROM BFGS code FG_LNSRCH 0. 0.151287362 0. 
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.51666218 *** contribution from regularisation: 0.00324886874 *** contribution from error: -0.519911051 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47777.3033 0.151287362 -9.32504654 EXIT FROM BFGS code NEW_X -47777.3033 0.151287362 -9.32504654 ENTER BFGS code NEW_X -47777.3033 0.151287362 -9.32504654 EXIT FROM BFGS code FG_LNSRCH 0. 0.139063165 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.516860664 *** contribution from regularisation: 0.00325663248 *** contribution from error: -0.520117283 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47795.6548 0.139063165 -5.3624177 EXIT FROM BFGS code NEW_X -47795.6548 0.139063165 -5.3624177 ENTER BFGS code NEW_X -47795.6548 0.139063165 -5.3624177 EXIT FROM BFGS code FG_LNSRCH 0. 0.126840264 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.517036855 *** contribution from regularisation: 0.00324109336 *** contribution from error: -0.520277977 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47811.9507 0.126840264 2.07339621 EXIT FROM BFGS code NEW_X -47811.9507 0.126840264 2.07339621 ENTER BFGS code NEW_X -47811.9507 0.126840264 2.07339621 EXIT FROM BFGS code FG_LNSRCH 0. 0.123640895 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.517069578 *** contribution from regularisation: 0.00327225379 *** contribution from error: -0.520341814 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47814.9755 0.123640895 -1.02516258 EXIT FROM BFGS code NEW_X -47814.9755 0.123640895 -1.02516258 ENTER BFGS code NEW_X -47814.9755 0.123640895 -1.02516258 EXIT FROM BFGS code FG_LNSRCH 0. 0.12073151 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.517136335 *** contribution from regularisation: 0.00326010957 *** contribution from error: -0.520396471 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -47821.1493 0.12073151 -1.55655301 EXIT FROM BFGS code NEW_X -47821.1493 0.12073151 -1.55655301 ENTER BFGS code NEW_X -47821.1493 0.12073151 -1.55655301 EXIT FROM BFGS code FG_LNSRCH 0. 0.116237134 0. 
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 11 --> 89.1607895 sigma out 15 active outputs
RANK 2 NODE 4 --> 60.0249481 sigma out 15 active outputs
RANK 3 NODE 1 --> 56.2775459 sigma out 15 active outputs
RANK 4 NODE 5 --> 27.1692314 sigma out 15 active outputs
RANK 5 NODE 8 --> 22.9452419 sigma out 15 active outputs
RANK 6 NODE 2 --> 21.8882179 sigma out 15 active outputs
RANK 7 NODE 12 --> 21.4496193 sigma out 15 active outputs
RANK 8 NODE 3 --> 21.1835384 sigma out 15 active outputs
RANK 9 NODE 7 --> 18.8747463 sigma out 15 active outputs
RANK 10 NODE 9 --> 17.7349224 sigma out 15 active outputs
RANK 11 NODE 10 --> 13.219821 sigma out 15 active outputs
RANK 12 NODE 6 --> 10.8728838 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 9 --> 84.4898148 sigma in 12act. ( 114.899635 sig out 1act.)
RANK 2 NODE 7 --> 84.0046921 sigma in 12act. ( 72.4865646 sig out 1act.)
RANK 3 NODE 10 --> 38.7303505 sigma in 12act. ( 45.9313278 sig out 1act.)
RANK 4 NODE 11 --> 33.3854065 sigma in 12act. ( 34.6131859 sig out 1act.)
RANK 5 NODE 1 --> 20.7630405 sigma in 12act. ( 25.4468536 sig out 1act.)
RANK 6 NODE 14 --> 19.4111919 sigma in 12act. ( 22.2602615 sig out 1act.)
RANK 7 NODE 13 --> 15.9423866 sigma in 12act. ( 19.1749783 sig out 1act.)
RANK 8 NODE 4 --> 13.9821157 sigma in 12act. ( 16.3479176 sig out 1act.)
RANK 9 NODE 5 --> 13.1663532 sigma in 12act. ( 12.536046 sig out 1act.)
RANK 10 NODE 15 --> 7.7118926 sigma in 12act. ( 7.73560524 sig out 1act.)
RANK 11 NODE 6 --> 4.64392281 sigma in 12act. ( 4.81432056 sig out 1act.)
RANK 12 NODE 3 --> 2.25448298 sigma in 12act. ( 1.52885818 sig out 1act.)
RANK 13 NODE 12 --> 1.92172647 sigma in 12act. ( 1.73798382 sig out 1act.)
RANK 14 NODE 2 --> 1.88078988 sigma in 12act. ( 0.802092254 sig out 1act.)
RANK 15 NODE 8 --> 1.39250898 sigma in 12act. ( 0.0893721804 sig out 1act.)
sorted by output significance
RANK 1 NODE 9 --> 114.899635 sigma out 1act.( 84.4898148 sig in 12act.)
RANK 2 NODE 7 --> 72.4865646 sigma out 1act.( 84.0046921 sig in 12act.)
RANK 3 NODE 10 --> 45.9313278 sigma out 1act.( 38.7303505 sig in 12act.)
RANK 4 NODE 11 --> 34.6131859 sigma out 1act.( 33.3854065 sig in 12act.)
RANK 5 NODE 1 --> 25.4468536 sigma out 1act.( 20.7630405 sig in 12act.)
RANK 6 NODE 14 --> 22.2602615 sigma out 1act.( 19.4111919 sig in 12act.)
RANK 7 NODE 13 --> 19.1749783 sigma out 1act.( 15.9423866 sig in 12act.)
RANK 8 NODE 4 --> 16.3479176 sigma out 1act.( 13.9821157 sig in 12act.)
RANK 9 NODE 5 --> 12.536046 sigma out 1act.( 13.1663532 sig in 12act.)
RANK 10 NODE 15 --> 7.73560524 sigma out 1act.( 7.7118926 sig in 12act.)
RANK 11 NODE 6 --> 4.81432056 sigma out 1act.( 4.64392281 sig in 12act.)
RANK 12 NODE 12 --> 1.73798382 sigma out 1act.( 1.92172647 sig in 12act.)
RANK 13 NODE 3 --> 1.52885818 sigma out 1act.( 2.25448298 sig in 12act.)
RANK 14 NODE 2 --> 0.802092254 sigma out 1act.( 1.88078988 sig in 12act.)
RANK 15 NODE 8 --> 0.0893721804 sigma out 1act.( 1.39250898 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 154.234116 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.51723659
*** contribution from regularisation: 0.00321581378
*** contribution from error: -0.52045238
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -47830.4196 0.116237134 -3.80453897
EXIT FROM BFGS code NEW_X -47830.4196 0.116237134 -3.80453897
ENTER BFGS code NEW_X -47830.4196 0.116237134 -3.80453897
EXIT FROM BFGS code FG_LNSRCH 0. 0.11336451 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.517126977
*** contribution from regularisation: 0.00326480926
*** contribution from error: -0.520391762
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47820.2818 0.11336451 -2.96667862
EXIT FROM BFGS code FG_LNSRCH 0. 0.115599684 0.
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.51709193
*** contribution from regularisation: 0.00338860555
*** contribution from error: -0.520480514
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47817.0425 0.115599684 -3.60732961
EXIT FROM BFGS code FG_LNSRCH 0. 0.116207808 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.517130196
*** contribution from regularisation: 0.00332405069
*** contribution from error: -0.520454228
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47820.5811 0.116207808 -3.78868651
EXIT FROM BFGS code FG_LNSRCH 0. 0.116237029 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.517147839
*** contribution from regularisation: 0.00330457115
*** contribution from error: -0.52045244
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47822.2128 0.116237029 -3.80978179
EXIT FROM BFGS code FG_LNSRCH 0. 0.116237134 0.
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.517131925
*** contribution from regularisation: 0.00332047907
*** contribution from error: -0.52045238
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47820.7409 0.116237134 -3.80772638
EXIT FROM BFGS code FG_LNSRCH 0. 0.116237134 0.
---------------------------------------------------
Iteration : 36
***********************************************
*** Learn Path 36
*** loss function: -0.517136753
*** contribution from regularisation: 0.00331564946
*** contribution from error: -0.52045238
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47821.1875 0.116237134 -3.8120029
EXIT FROM BFGS code FG_LNSRCH 0. 0.116237134 0.
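From iteration 31 onward the test-sample loss only moves in the fifth to sixth decimal place and the BFGS line-search argument freezes at 0.116237134, which is consistent with the CONVERGENC exit a couple of iterations later rather than the full 250 configured iterations being used. A small plateau check on the Learn Path 30-36 values above; the patience and tolerance are arbitrary choices for this illustration, not NeuroBayes settings:

# Flag a loss plateau: no improvement of the best (most negative) test-sample
# loss by more than `tol` within the last `patience` iterations.
# Values copied from Learn Paths 30-36 above.
losses = {
    30: -0.51723659, 31: -0.517126977, 32: -0.51709193, 33: -0.517130196,
    34: -0.517147839, 35: -0.517131925, 36: -0.517136753,
}

def plateaued(history, patience=5, tol=1e-4):
    """True if the recent iterations did not improve on the earlier best by more than tol."""
    values = [history[k] for k in sorted(history)]
    if len(values) <= patience:
        return False
    best_before = min(values[:-patience])
    best_recent = min(values[-patience:])
    return best_before - best_recent < tol

print(plateaued(losses))   # True: the last 5 entries gain nothing over the iteration-30 value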
---------------------------------------------------
Iteration : 37
***********************************************
*** Learn Path 37
*** loss function: -0.517134845
*** contribution from regularisation: 0.0033175759
*** contribution from error: -0.52045244
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47821.0094 0.116237134 -3.80452037
EXIT FROM BFGS code FG_LNSRCH 0. 0.116237134 0.
---------------------------------------------------
Iteration : 38
***********************************************
*** Learn Path 38
*** loss function: -0.517137349
*** contribution from regularisation: 0.00331503269
*** contribution from error: -0.52045238
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -47821.2446 0.116237134 -3.80308127
EXIT FROM BFGS code NEW_X -47821.2446 0.116237134 -3.80308127
ENTER BFGS code NEW_X -47821.2446 0.116237134 -3.80308127
EXIT FROM BFGS code CONVERGENC -47821.2446 0.116237134 -3.80308127
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 11 --> 135.044464 sigma out 15 active outputs
RANK 2 NODE 4 --> 91.8573837 sigma out 15 active outputs
RANK 3 NODE 1 --> 87.1414566 sigma out 15 active outputs
RANK 4 NODE 5 --> 41.4533119 sigma out 15 active outputs
RANK 5 NODE 8 --> 34.7865448 sigma out 15 active outputs
RANK 6 NODE 2 --> 32.2037888 sigma out 15 active outputs
RANK 7 NODE 3 --> 32.0463142 sigma out 15 active outputs
RANK 8 NODE 12 --> 31.5659084 sigma out 15 active outputs
RANK 9 NODE 7 --> 29.1723728 sigma out 15 active outputs
RANK 10 NODE 9 --> 26.6982288 sigma out 15 active outputs
RANK 11 NODE 10 --> 19.7774792 sigma out 15 active outputs
RANK 12 NODE 6 --> 16.7218037 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 7 --> 128.884308 sigma in 12act. ( 110.953468 sig out 1act.)
RANK 2 NODE 9 --> 128.283691 sigma in 12act. ( 175.589111 sig out 1act.)
RANK 3 NODE 10 --> 59.143322 sigma in 12act. ( 70.5995941 sig out 1act.)
RANK 4 NODE 11 --> 49.9997787 sigma in 12act. ( 53.4987907 sig out 1act.)
RANK 5 NODE 1 --> 31.3325787 sigma in 12act. ( 39.021965 sig out 1act.)
RANK 6 NODE 14 --> 29.2914658 sigma in 12act. ( 34.0749931 sig out 1act.)
RANK 7 NODE 13 --> 23.8367996 sigma in 12act. ( 28.9453545 sig out 1act.)
RANK 8 NODE 4 --> 21.0877457 sigma in 12act. ( 25.1401272 sig out 1act.)
RANK 9 NODE 5 --> 19.5220509 sigma in 12act. ( 19.2804947 sig out 1act.)
RANK 10 NODE 15 --> 11.1855478 sigma in 12act. ( 11.9117393 sig out 1act.)
RANK 11 NODE 6 --> 6.75236177 sigma in 12act. ( 7.16428471 sig out 1act.)
RANK 12 NODE 3 --> 2.85059357 sigma in 12act. ( 2.34190083 sig out 1act.)
RANK 13 NODE 12 --> 2.63721585 sigma in 12act. ( 2.59777665 sig out 1act.)
RANK 14 NODE 2 --> 2.14992046 sigma in 12act. ( 1.2129699 sig out 1act.)
RANK 15 NODE 8 --> 1.42508388 sigma in 12act. ( 0.134864822 sig out 1act.)
sorted by output significance
RANK 1 NODE 9 --> 175.589111 sigma out 1act.( 128.283691 sig in 12act.)
RANK 2 NODE 7 --> 110.953468 sigma out 1act.( 128.884308 sig in 12act.)
RANK 3 NODE 10 --> 70.5995941 sigma out 1act.( 59.143322 sig in 12act.)
RANK 4 NODE 11 --> 53.4987907 sigma out 1act.( 49.9997787 sig in 12act.)
RANK 5 NODE 1 --> 39.021965 sigma out 1act.( 31.3325787 sig in 12act.)
RANK 6 NODE 14 --> 34.0749931 sigma out 1act.( 29.2914658 sig in 12act.)
RANK 7 NODE 13 --> 28.9453545 sigma out 1act.( 23.8367996 sig in 12act.)
RANK 8 NODE 4 --> 25.1401272 sigma out 1act.( 21.0877457 sig in 12act.)
RANK 9 NODE 5 --> 19.2804947 sigma out 1act.( 19.5220509 sig in 12act.)
RANK 10 NODE 15 --> 11.9117393 sigma out 1act.( 11.1855478 sig in 12act.)
RANK 11 NODE 6 --> 7.16428471 sigma out 1act.( 6.75236177 sig in 12act.)
RANK 12 NODE 12 --> 2.59777665 sigma out 1act.( 2.63721585 sig in 12act.)
RANK 13 NODE 3 --> 2.34190083 sigma out 1act.( 2.85059357 sig in 12act.)
RANK 14 NODE 2 --> 1.2129699 sigma out 1act.( 2.14992046 sig in 12act.)
RANK 15 NODE 8 --> 0.134864822 sigma out 1act.( 1.42508388 sig in 12act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 236.052948 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.517133594
*** contribution from regularisation: 0.00331884343
*** contribution from error: -0.52045244
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 29884
Closing output file
done
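The significance tables printed at iteration 30 and in the final report put the same nodes at the top (layer-2 nodes 9 and 7, layer-1 node 11), so the ordering is stable even though the sigma values grow. A sketch for collecting the RANK ... NODE ... --> ... sigma lines into records so such comparisons can be scripted; the file name training.log and the one-record-per-line layout are again assumptions of this note:

import re

# Parse "RANK r NODE n --> x sigma ..." lines into (rank, node, sigma) records.
rank_re = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+(-?\d+(?:\.\d+)?)")

records = []
with open("training.log") as fh:   # assumed location of this log
    for line in fh:
        m = rank_re.search(line)
        if m:
            records.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))

# Quick overview of the largest sigma values; note this mixes all tables
# (layer 1/2/3, input and output significance), so it is only a rough summary.
for rank, node, sigma in sorted(records, key=lambda r: -r[2])[:5]:
    print(f"node {node:2d}  sigma {sigma:10.4f}  (rank {rank} in its table)")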