NNInput NNInputs_115.root

Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001

**** List Parameters ****
Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N
Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found
Determine File Parameters : Info: No file info in tree

NNAna::CopyTree: entries= 181163 file= 0 options= lep1_E<400&&lep2_E<400&& SBRatio= 0 wt= 1 SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0 weight: 1 SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 44827 nbkg = 136336
Bkg Entries: 136336
Sig Entries: 44827
Chosen entries: 44827
Signal fraction: 1
Background fraction: 0.328798
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 136336
Actual Signal Entries: 44827
Entries to split: 44827
Test with : 22413
Train with : 22413

*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************

Your number of nodes in the input layer is: 14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is: 1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
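The steering block above applies the constraint lep1_E<400&&lep2_E<400&& to the 181163 input entries, builds a signal tree from Target==1 (44827 events) and a background tree from Target==0 (136336 events), and reserves half of the patterns for training. A minimal NumPy sketch of that bookkeeping follows; it assumes the branches lep1_E, lep2_E and Target have already been read from NNInputs_115.root into arrays (for example with uproot), and since the log does not spell out exactly which sample is shuffled and halved into the 22413/22413 test/train split, the split below is purely illustrative.

import numpy as np

def select_and_split(branches, seed=0):
    # branches: dict of equal-length NumPy arrays keyed by branch name.
    cut = (branches["lep1_E"] < 400) & (branches["lep2_E"] < 400)
    sig = cut & (branches["Target"] == 1)     # 44827 events in this log
    bkg = cut & (branches["Target"] == 0)     # 136336 events in this log

    n_sig, n_bkg = int(sig.sum()), int(bkg.sum())
    # "Chosen entries" is the smaller class (44827 here);
    # 44827 / 136336 = 0.328798, the "Background fraction" quoted above.
    print(f"nsig = {n_sig}  nbkg = {n_bkg}  ratio = {n_sig / n_bkg:.6f}")

    # Even split of the chosen entries into training and test halves
    # (44827 -> 22413 + 22413 after integer division).
    idx = np.flatnonzero(sig)
    np.random.default_rng(seed).shuffle(idx)
    half = len(idx) // 2
    return idx[:half], idx[half:2 * half]     # train indices, test indices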
NBTreeReportInterval = 10000 NBTreePrepEvPrint = 1 Start Set Inidividual Variable Preprocessing Not touching individual preprocessing for Ht ( 0 ) in Neurobayes Not touching individual preprocessing for LepAPt ( 1 ) in Neurobayes Not touching individual preprocessing for LepBPt ( 2 ) in Neurobayes Not touching individual preprocessing for MetSigLeptonsJets ( 3 ) in Neurobayes Not touching individual preprocessing for MetSpec ( 4 ) in Neurobayes Not touching individual preprocessing for SumEtLeptonsJets ( 5 ) in Neurobayes Not touching individual preprocessing for VSumJetLeptonsPt ( 6 ) in Neurobayes Not touching individual preprocessing for addEt ( 7 ) in Neurobayes Not touching individual preprocessing for dPhiLepSumMet ( 8 ) in Neurobayes Not touching individual preprocessing for dPhiLeptons ( 9 ) in Neurobayes Not touching individual preprocessing for dRLeptons ( 10 ) in Neurobayes Not touching individual preprocessing for lep1_E ( 11 ) in Neurobayes Not touching individual preprocessing for lep2_E ( 12 ) in Neurobayes End Set Inidividual Variable Preprocessing Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 44827 for Signal Prepared event 0 for Signal with 44827 events ====Entry 0 Variable Ht : 147.2 Variable LepAPt : 33.8436 Variable LepBPt : 16.8195 Variable MetSigLeptonsJets : 7.93922 Variable MetSpec : 69.8321 Variable SumEtLeptonsJets : 77.3669 Variable VSumJetLeptonsPt : 74.5437 Variable addEt : 120.496 Variable dPhiLepSumMet : 2.77597 Variable dPhiLeptons : 0.349396 Variable dRLeptons : 0.409585 Variable lep1_E : 34.748 Variable lep2_E : 16.8224 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 2115 Ht = 147.2 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 33.8439 LepAPt = 33.8436 LepBEt = 16.82 LepBPt = 16.8195 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 69.8321 MetDelPhi = 2.54203 MetSig = 5.42558 MetSigLeptonsJets = 7.93922 MetSpec = 69.8321 Mjj = 0 MostCentralJetEta = -1.25048 MtllMet = 121.128 Njets = 1 SB = 0 SumEt = 165.661 SumEtJets = 0 SumEtLeptonsJets = 77.3669 Target = 1 TrainWeight = 1 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 74.5437 addEt = 120.496 dPhiLepSumMet = 2.77597 dPhiLeptons = 0.349396 dRLeptons = 0.409585 diltype = 17 dimass = 9.74565 event = 4187 jet1_Et = 26.7039 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 34.748 lep2_E = 16.8224 rand = 0.999742 run = 237705 weight = 5.90147e-07 ===Show End Prepared event 10000 for Signal with 44827 events Prepared event 20000 for Signal with 44827 events Prepared event 30000 for Signal with 44827 events Prepared event 40000 for Signal with 44827 events Adding variable Ht To Neurobayes Adding variable LepAPt To Neurobayes Adding variable LepBPt To Neurobayes Adding variable MetSigLeptonsJets To Neurobayes Adding variable MetSpec To 
Neurobayes Adding variable SumEtLeptonsJets To Neurobayes Adding variable VSumJetLeptonsPt To Neurobayes Adding variable addEt To Neurobayes Adding variable dPhiLepSumMet To Neurobayes Adding variable dPhiLeptons To Neurobayes Adding variable dRLeptons To Neurobayes Adding variable lep1_E To Neurobayes Adding variable lep2_E To Neurobayes NNAna::PrepareNBTraining_: Nent= 136336 for Background Prepared event 0 for Background with 136336 events ====Entry 0 Variable Ht : 85.3408 Variable LepAPt : 31.1294 Variable LepBPt : 10.0664 Variable MetSigLeptonsJets : 6.87768 Variable MetSpec : 44.1437 Variable SumEtLeptonsJets : 41.1959 Variable VSumJetLeptonsPt : 41.0132 Variable addEt : 85.3408 Variable dPhiLepSumMet : 3.10536 Variable dPhiLeptons : 0.219342 Variable dRLeptons : 0.424454 Variable lep1_E : 32.3548 Variable lep2_E : 10.1027 ===Show Start ======> EVENT:0 DEtaJ1J2 = 0 DEtaJ1Lep1 = 0 DEtaJ1Lep2 = 0 DEtaJ2Lep1 = 0 DEtaJ2Lep2 = 0 DPhiJ1J2 = 0 DPhiJ1Lep1 = 0 DPhiJ1Lep2 = 0 DPhiJ2Lep1 = 0 DPhiJ2Lep2 = 0 DRJ1J2 = 0 DRJ1Lep1 = 0 DRJ1Lep2 = 0 DRJ2Lep1 = 0 DRJ2Lep2 = 0 DeltaRJet12 = 0 File = 1 Ht = 85.3408 IsMEBase = 0 LRHWW = 0 LRWW = 0 LRWg = 0 LRWj = 0 LRZZ = 0 LepAEt = 31.1297 LepAPt = 31.1294 LepBEt = 10.0674 LepBPt = 10.0664 LessCentralJetEta = 0 MJ1Lep1 = 0 MJ1Lep2 = 0 MJ2Lep1 = 0 MJ2Lep2 = 0 NN = 0 Met = 44.1437 MetDelPhi = 3.01191 MetSig = 3.74558 MetSigLeptonsJets = 6.87768 MetSpec = 44.1437 Mjj = 0 MostCentralJetEta = 0 MtllMet = 86.2332 Njets = 0 SB = 0 SumEt = 138.899 SumEtJets = 0 SumEtLeptonsJets = 41.1959 Target = 0 TrainWeight = 0.324755 VSum2JetLeptonsPt = 0 VSum2JetPt = 0 VSumJetLeptonsPt = 41.0132 addEt = 85.3408 dPhiLepSumMet = 3.10536 dPhiLeptons = 0.219342 dRLeptons = 0.424454 diltype = 17 dimass = 7.54723 event = 6717520 jet1_Et = 0 jet1_eta = 0 jet2_Et = 0 jet2_eta = 0 lep1_E = 32.3548 lep2_E = 10.1027 rand = 0.999742 run = 271566 weight = 0.00296168 ===Show End Prepared event 10000 for Background with 136336 events Prepared event 20000 for Background with 136336 events Prepared event 30000 for Background with 136336 events Prepared event 40000 for Background with 136336 events Prepared event 50000 for Background with 136336 events Prepared event 60000 for Background with 136336 events Prepared event 70000 for Background with 136336 events Prepared event 80000 for Background with 136336 events Prepared event 90000 for Background with 136336 events Prepared event 100000 for Background with 136336 events Prepared event 110000 for Background with 136336 events Prepared event 120000 for Background with 136336 events Prepared event 130000 for Background with 136336 events Warning: found 4346 negative weights. << hh tt >> << hh ii tt >> << hh tttttt >> << ppppp hhhhhh ii tt >> << pp pp hhh hh ii ----- tt >> << pp pp hh hh ii ----- tt >> << ppppp hh hh ii tt >> pp pp ////////////////////////////// pp \\\\\\\\\\\\\\\ Phi-T(R) NeuroBayes(R) Teacher Algorithms by Michael Feindt Implementation by Phi-T Project 2001-2003 Copyright Phi-T GmbH Version 20080312 Library compiled with: NB_MAXPATTERN= 1500000 NB_MAXNODE = 100 ----------------------------------- found 181163 samples to learn from preprocessing flags/parameters: global preprocessing flag: 812 individual preprocessing: now perform preprocessing *** called with option 12 *** This will do for you: *** input variable equalisation *** to Gaussian distribution with mean=0 and sigma=1 *** Then variables are decorrelated ************************************ Warning: found 4346 negative weights. 
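Global preprocessing flag 812 ends in option 12, and the log states what that option does: every input variable is equalised (flattened), mapped onto a Gaussian with mean 0 and sigma 1, and the resulting variables are then decorrelated. The "Transdef: Tab for variable N" tables printed below appear to be the per-variable bin boundaries of that flattening step. The NumPy/SciPy sketch below shows one conventional way to perform such a transformation; it is not the licensed NeuroBayes implementation, it ignores event weights (including the 4346 negative weights just reported), and its handling of ties and outliers is certainly simpler than the real thing.

import numpy as np
from scipy.stats import norm

def flatten_to_gauss(x, n_bins=100):
    # Percentile boundaries, analogous to the Transdef tables below.
    boundaries = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    # Rank-based empirical CDF, kept strictly inside (0, 1) so that the
    # inverse normal CDF stays finite.
    u = (np.argsort(np.argsort(x)) + 0.5) / len(x)
    return norm.ppf(u), boundaries

def decorrelate(X):
    # Rotate the Gaussianised inputs into the eigenbasis of their covariance
    # matrix and scale each axis to unit variance (whitening).
    cov = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    return (X - X.mean(axis=0)) @ eigvec / np.sqrt(eigval)

# X_raw: (n_events, 13) array of the inputs listed above, in the same order
# X_gauss = np.column_stack([flatten_to_gauss(col)[0] for col in X_raw.T])
# X_prep  = decorrelate(X_gauss)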
Signal fraction: 62.5100441 % ------------------------------ Transdef: Tab for variable 1 -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. -1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. ------------------------------ Transdef: Tab for variable 2 60.0019798 62.8691864 65.1007233 66.6684265 67.6238251 68.6558838 69.7286072 70.4901276 71.3616486 72.1680298 72.9915237 73.7069855 74.3278427 74.9531555 75.6333466 76.2559128 76.9304657 77.39888 78.0874786 78.6289673 79.3011169 79.866684 80.5285645 81.0538025 81.5749283 82.3037109 82.907135 83.6105499 84.3676605 85.0844955 85.8780975 86.7318192 87.5661697 88.5263062 89.1242981 89.9211884 90.7810059 91.652565 92.391098 93.1572113 93.9961853 94.8564758 95.5926056 96.4510498 97.3082581 98.1580811 98.9328308 99.7290344 100.54837 101.398102 102.134872 102.969597 103.784973 104.607742 105.477264 106.299545 107.030525 107.914894 108.609215 109.502136 110.408165 111.473389 112.509216 113.407364 114.422623 115.66188 116.948303 118.362579 119.717468 121.081604 122.659622 124.313774 126.083527 127.929008 129.497299 131.43277 133.341522 135.197891 137.156372 139.220673 141.322327 143.699036 146.156799 148.429581 151.380905 154.198883 157.535156 160.655212 163.923187 168.079437 172.77655 177.85144 183.82135 190.121277 196.964325 205.358154 217.752838 232.108612 255.19873 293.755127 800.809509 ------------------------------ Transdef: Tab for variable 3 20.0008926 20.2939796 20.5902519 20.8452644 21.1071815 21.3683357 21.5821781 21.7921028 22.002861 22.1892242 22.3843498 22.584568 22.7627907 22.9287071 23.1133385 23.2844963 23.4841576 23.6965523 23.902916 24.1246758 24.3275185 24.5125198 24.7379837 24.9204063 25.1149139 25.2959518 25.4904518 25.68857 25.8927345 26.0755768 26.2538033 26.4599648 26.6518192 26.8738441 27.0912361 27.243248 27.4385643 27.6389542 27.8168583 27.9844284 28.1854591 28.3806248 28.6045113 28.8017197 29.0375214 29.2507477 29.4529953 29.7051601 29.9553108 30.17659 30.4248314 30.6407394 30.8943157 31.1111145 31.3732681 31.5977402 31.8152809 32.0586357 32.3114014 32.5398903 32.7926712 33.0660973 33.317749 33.617485 33.8921356 34.1444778 34.4217682 34.7092705 35.0279617 35.3404007 35.6363716 35.9482307 36.3329124 36.6462784 36.9639893 37.358696 37.708252 38.1433182 38.58321 39.0601616 39.3890457 39.8721924 40.3380051 40.9256744 41.5506363 42.1371231 42.8220444 43.5959435 44.5201073 45.417141 46.3987274 47.4342728 48.5206146 49.9086342 51.6457672 53.9226112 56.6888084 59.942955 64.721138 75.2859726 228.385986 ------------------------------ Transdef: Tab for variable 4 10.0001202 10.0926495 10.2031078 10.2979136 10.378602 10.4521866 10.5197296 10.6086903 10.6876621 10.7788048 10.877655 10.9730339 11.0629349 11.1619282 11.263443 11.3585253 11.4382238 11.5212803 11.6321802 11.7312746 11.8186731 11.9066353 11.9989166 12.1039209 12.2037659 12.3049202 12.4059086 12.5172739 12.6214657 12.7460632 12.8443995 12.9438267 13.0393715 13.15975 13.2820129 13.392458 13.5066957 13.6151352 13.7443867 13.8366947 13.9459324 14.0664167 14.1765919 14.2960215 14.4188614 14.5441465 14.699585 14.8389454 14.9783745 15.1128855 15.250617 15.373539 15.5038128 15.6491795 15.8047752 15.9357052 16.0792446 16.2461815 16.3984013 16.5626335 16.7023144 16.8397446 16.9984226 17.1648407 17.365284 17.5426273 17.7126808 17.9154224 18.101181 
18.2995834 18.4992638 18.7199688 18.9297657 19.1336155 19.3685455 19.5950737 19.8056526 20.0581779 20.3027325 20.5401268 20.7801552 21.0276527 21.2895737 21.5721283 21.8467789 22.0458889 22.37043 22.7089787 23.1083031 23.527504 23.9263306 24.3549137 24.9026184 25.5518875 26.2945747 27.1165123 28.0743179 29.5125885 31.5768204 34.9758453 71.4863358 ------------------------------ Transdef: Tab for variable 5 0.97587049 2.29911041 2.69015074 2.95556831 3.1769886 3.35249996 3.53848934 3.71020842 3.85868216 3.99038363 4.0927124 4.17956448 4.26318693 4.35945129 4.43515015 4.52621937 4.59323215 4.66252613 4.74055672 4.80648184 4.8826623 4.94604206 5.01146317 5.07127666 5.12685585 5.17370987 5.22718763 5.27568007 5.33170986 5.37257767 5.42261457 5.47132587 5.52028227 5.5575037 5.59899998 5.63890696 5.68053341 5.72183895 5.76359272 5.80735588 5.84668541 5.89311409 5.93038273 5.96518373 6.00359249 6.04148579 6.08157825 6.11571598 6.1555624 6.1908927 6.22140694 6.25855064 6.29425335 6.33346415 6.37268257 6.41057253 6.45318699 6.48845863 6.52393007 6.5624671 6.59998512 6.63875008 6.68098545 6.72098827 6.75372314 6.78824234 6.83035851 6.87002897 6.89983559 6.94281578 6.98006678 7.02418566 7.06764889 7.11095858 7.15823984 7.20185661 7.23598576 7.27575016 7.31614113 7.36890984 7.41826344 7.47200203 7.50471878 7.55263519 7.61171341 7.66925049 7.7385788 7.81079817 7.87936878 7.96209574 8.0498848 8.13945007 8.24307346 8.37643433 8.49368 8.65694618 8.8547554 9.09751797 9.4717617 10.1593094 18.6198635 ------------------------------ Transdef: Tab for variable 6 15.0117254 18.7519531 22.8318176 25.3921337 26.2005539 26.9557114 27.5785065 28.3170013 29.0374718 29.58498 30.2652817 30.7942181 31.3298035 31.8261681 32.2596207 32.7026062 33.184761 33.5858879 34.0188293 34.4683151 34.7810898 35.0938644 35.4251404 35.7735519 36.1287003 36.4639359 36.7949829 37.0840836 37.4194946 37.7502289 38.04105 38.313446 38.6454086 38.9319 39.264019 39.5677567 39.9004974 40.2087593 40.4545593 40.7807884 41.0740852 41.3690109 41.7232742 42.0276222 42.3005066 42.6283493 42.9746628 43.2780838 43.6157494 43.8902779 44.1707344 44.50951 44.8412743 45.2173958 45.6088715 45.997818 46.4314117 46.8164062 47.1857986 47.5230255 47.9212875 48.2371597 48.6465759 49.0371628 49.4016838 49.7687531 50.123848 50.4898453 50.8504028 51.2882042 51.7177963 52.1765594 52.6147385 53.1163559 53.5215836 54.03022 54.529808 55.0387535 55.5350876 56.0781403 56.6384048 57.1454086 57.7708893 58.5206375 59.0201797 59.7832909 60.6793289 61.6258469 62.6979332 63.8769989 65.1165085 66.5217209 68.2237091 69.8520889 72.1559525 74.5971146 77.6912231 82.0903015 87.6489944 97.6848831 236.562576 ------------------------------ Transdef: Tab for variable 7 30.1476021 32.5177879 33.5508957 34.215023 34.7249146 35.1626854 35.6238022 36.046669 36.4557419 36.8499908 37.2119789 37.5182266 37.83078 38.1108742 38.3863983 38.6871643 38.9597778 39.2386513 39.5382347 39.7998886 40.0788193 40.3702774 40.6669273 40.9990387 41.3387146 41.6226959 41.9728165 42.3007889 42.6736298 43.0679855 43.4625473 43.9091682 44.3549728 44.7289581 45.1204529 45.4652176 45.9185257 46.3845673 46.9040604 47.3070984 47.8332481 48.3155975 48.8276596 49.3287125 49.8072243 50.3830643 50.9368286 51.4353333 51.9706879 52.4729843 53.0281219 53.5714569 54.1759796 54.7969131 55.4152527 56.0517349 56.7566872 57.5038338 58.2992935 59.1961823 60.2565002 61.1814804 61.9409332 62.9703445 63.9345093 65.2299194 66.506012 67.8976517 69.0983582 70.3965378 71.524826 72.8381042 74.1723175 75.6207733 76.8446426 78.2937622 
79.681282 81.1996155 82.9159088 84.7665253 86.7929993 88.8677063 90.9243927 93.0848923 95.7292633 98.5376434 101.212799 104.068451 107.175003 110.907761 114.844406 118.953033 123.771996 129.403946 135.821808 143.91452 153.16156 165.274475 183.927689 215.44693 510.394806 ------------------------------ Transdef: Tab for variable 8 0.862621069 20.6470718 25.2402859 28.1975822 29.8771057 31.071682 31.8993607 32.5009117 32.9899902 33.5050583 33.856842 34.2100716 34.5422897 34.9219398 35.2236366 35.5256271 35.8656311 36.1207581 36.4153824 36.6774139 36.931488 37.1606178 37.4159889 37.6687088 37.9041367 38.1248016 38.3392487 38.6074715 38.8464355 39.0853882 39.298584 39.5459824 39.7717285 40.0136223 40.3077507 40.5816803 40.8461456 41.0913582 41.3750076 41.6671066 41.9516678 42.2222519 42.5488129 42.8931541 43.1818848 43.4930801 43.7358017 44.0325165 44.3474159 44.6568375 45.0501862 45.3539314 45.7357101 46.1142349 46.4915085 46.8315735 47.1897621 47.5723038 47.9196167 48.3147583 48.6976547 49.0213814 49.3817406 49.7402573 50.0972137 50.4889793 50.8978691 51.3115158 51.6871605 52.1135902 52.5265732 52.9543304 53.3984718 53.8510933 54.2841263 54.7460938 55.2708435 55.8125839 56.4266205 56.9828491 57.6118889 58.2736816 59.0066605 59.7715759 60.575592 61.2613678 62.2199554 63.3243866 64.4675903 65.6668472 67.1112823 68.6810303 70.458252 72.6092758 74.9718323 77.5201569 80.7994003 85.6835327 92.3314362 105.044167 299.881683 ------------------------------ Transdef: Tab for variable 9 48.8581314 62.3820229 64.1855927 65.8378143 66.9571381 67.7684326 68.6704559 69.6099243 70.2282562 70.9740295 71.6781693 72.4353256 73.0843964 73.7319946 74.3132477 74.8143616 75.415329 75.9343109 76.4152222 76.9781647 77.3435211 77.9302521 78.3696136 78.8354645 79.3808899 79.8201904 80.3665466 80.844162 81.2939148 81.7790527 82.3918304 82.8684921 83.4045868 83.9445953 84.6014023 85.1418839 85.7591248 86.3920746 87.0510559 87.6948547 88.3804016 88.8388138 89.3860931 90.0583038 90.5720749 91.2727051 91.893074 92.4657211 93.0682831 93.5945206 94.2729645 94.8900986 95.4633331 95.9927444 96.633812 97.3087006 98.0031433 98.5620804 99.1336136 99.7291794 100.398026 101.004395 101.633385 102.222595 102.829453 103.425957 104.036133 104.644958 105.273636 105.925522 106.46759 107.100113 107.715851 108.331909 109.001389 109.680496 110.34037 111.019493 111.799057 112.530228 113.399391 114.222908 115.021622 115.87291 116.891449 117.984634 119.209343 120.150452 121.521255 123.166656 125.12281 127.264153 129.81076 132.636627 135.951172 140.437057 145.921631 152.084045 162.177917 182.906616 424.390259 ------------------------------ Transdef: Tab for variable 10 0.00483142957 0.887163401 1.14349282 1.33590341 1.47955871 1.59000969 1.69959688 1.79466844 1.88099694 1.95287824 2.02031755 2.08358526 2.14302588 2.19508982 2.24236202 2.28658462 2.33223629 2.37244987 2.41418266 2.4467597 2.48107266 2.51213789 2.54091883 2.5686264 2.59732962 2.61862826 2.64243984 2.66791654 2.6920476 2.71597004 2.73801756 2.75610685 2.77423668 2.79157352 2.80812788 2.82405019 2.83568096 2.84833336 2.85734797 2.86817694 2.87965584 2.88903809 2.89767599 2.90649652 2.91594267 2.92527485 2.93157625 2.94009733 2.94787455 2.95398617 2.96098614 2.96637106 2.97333431 2.97868872 2.9840219 2.98888755 2.99334383 2.99864197 3.00355411 3.0085218 3.01304936 3.01783538 3.02296805 3.02767444 3.0317421 3.03515673 3.03857541 3.0423193 3.04597425 3.04976535 3.05370045 3.0573287 3.06103945 3.06483507 3.06826305 3.07142997 3.07462406 3.07777166 3.08111668 3.0843339 3.0877223 
3.09108138 3.09399414 3.09660625 3.09982753 3.10295439 3.10552526 3.10861158 3.11124611 3.11378503 3.11659002 3.11935139 3.12179446 3.12437034 3.12706089 3.12927389 3.13143945 3.13363051 3.13604736 3.13873577 3.14159131 ------------------------------ Transdef: Tab for variable 11 1.71661377E-05 0.00533723878 0.0131726237 0.0208166838 0.0282037351 0.0373002291 0.0452412963 0.0537819862 0.0614612103 0.0701670647 0.0788338184 0.0874894783 0.0965701342 0.105831623 0.11171627 0.119555444 0.128162622 0.135758877 0.145055115 0.153489113 0.160125732 0.168002367 0.176115811 0.183349013 0.190695763 0.198162377 0.206317484 0.212869838 0.219918251 0.226509094 0.23323822 0.239307523 0.246974945 0.253266573 0.259401441 0.265045345 0.270713329 0.277511179 0.283114791 0.289282084 0.295240402 0.300899744 0.307211399 0.312596768 0.319424272 0.325576663 0.332155526 0.337636918 0.343174219 0.349015594 0.354951143 0.360338449 0.366420746 0.372714579 0.379178703 0.385396004 0.389755964 0.395985723 0.40137428 0.407795161 0.413371712 0.419884443 0.42608583 0.43204999 0.438480616 0.444670409 0.449909449 0.455565035 0.460895628 0.468142986 0.473928869 0.48002106 0.487075567 0.493919134 0.5014503 0.508677721 0.516351938 0.523989201 0.532305181 0.54063201 0.548395276 0.554793775 0.563565135 0.5717237 0.579767644 0.590779543 0.601355553 0.611917973 0.62298429 0.633767962 0.646194041 0.658904314 0.672676265 0.688841999 0.708290458 0.726209641 0.749948859 0.778980613 0.816600382 0.868388772 1.11657131 ------------------------------ Transdef: Tab for variable 12 0.200089261 0.22848919 0.248626828 0.263553619 0.276557922 0.287904918 0.299546182 0.30869019 0.318419218 0.327882469 0.336809754 0.345455229 0.352087229 0.360041767 0.368120551 0.375050038 0.381287813 0.3881495 0.394585788 0.400625557 0.405716181 0.410902828 0.415056318 0.419776231 0.424155951 0.428695172 0.433593988 0.437520981 0.441503584 0.446043313 0.450514674 0.455293357 0.459666014 0.462795913 0.467001081 0.470958233 0.475278974 0.477941692 0.481989503 0.48619169 0.490159988 0.493620574 0.497310996 0.501292109 0.505672097 0.509561658 0.513486743 0.517949104 0.521867931 0.52605933 0.530981183 0.535153031 0.53953141 0.544033527 0.548705339 0.552930236 0.557011127 0.560724497 0.564922512 0.569669843 0.573954761 0.578271508 0.582607865 0.587492049 0.592028916 0.596779704 0.601875782 0.606000423 0.611480534 0.616783082 0.622166753 0.627640247 0.633507609 0.639505863 0.646136165 0.652124763 0.657845676 0.662961245 0.667590678 0.674095213 0.679825068 0.686386466 0.692704022 0.698981106 0.705440342 0.714128196 0.722032666 0.730342507 0.739788651 0.748713315 0.756271958 0.768391728 0.779344261 0.791949272 0.804683924 0.820167303 0.840772152 0.859999299 0.887191296 0.934362173 1.13539624 ------------------------------ Transdef: Tab for variable 13 20.0121021 21.3422966 22.1182442 22.7086563 23.2437172 23.7193451 24.176075 24.6191673 25.0240479 25.4113846 25.8194237 26.2136745 26.6389294 27.0410728 27.3614655 27.6716251 28.0116405 28.3149014 28.6450996 28.9533138 29.2514572 29.5476418 29.8076134 30.1416817 30.4579773 30.7000122 30.9851036 31.2517929 31.5821877 31.8846016 32.2030258 32.4747849 32.7657776 33.0802422 33.3625183 33.6856003 33.9972229 34.3318672 34.5635872 34.9050674 35.1795273 35.452076 35.7475357 36.0836487 36.3542938 36.6983643 37.0493546 37.3193893 37.6870499 38.0226974 38.3606873 38.7229233 39.1338005 39.5282249 39.8903542 40.3093109 40.7071609 41.1688652 41.5520477 42.0014114 42.4847107 42.8874359 43.3651161 43.844635 44.3068619 44.8563309 
45.4458961 46.004364 46.5457916 47.1947784 47.7649384 48.3716125 49.0590782 49.7488251 50.4653473 51.053833 51.8806305 52.6964951 53.5326118 54.3913116 55.2201805 56.1432571 57.2950897 58.4422264 59.6484146 60.8952484 62.2504807 63.8379364 65.2380066 67.0727692 69.0693665 71.2497253 73.6768799 76.7497406 79.7429352 83.1015549 87.602829 90.5404205 96.5735092 107.383972 232.717926 ------------------------------ Transdef: Tab for variable 14 10.0028534 10.5356932 10.8438416 11.0782013 11.3560028 11.5441236 11.7310543 11.930603 12.157753 12.3180561 12.4961843 12.6903343 12.8798256 13.0840588 13.2791376 13.4676456 13.6541672 13.830821 13.9803047 14.170085 14.3523054 14.5176601 14.7136221 14.9024506 15.0741463 15.241066 15.4361334 15.6099701 15.7707796 15.9392605 16.1381054 16.3161049 16.4945221 16.6577911 16.8095284 16.9802933 17.168644 17.37463 17.5932693 17.7751141 17.9486694 18.1323414 18.3083191 18.5312538 18.7360992 18.9360695 19.1243248 19.3114338 19.5161991 19.7335739 19.9388657 20.1393776 20.3788738 20.5893021 20.7958069 21.0459251 21.2629814 21.5070267 21.7205372 21.9344444 22.1461983 22.3998566 22.6514702 22.9306221 23.172966 23.4326801 23.6949501 23.9938831 24.2768173 24.5401077 24.8334274 25.1582375 25.4593391 25.7288265 26.0270042 26.3994179 26.7937965 27.2421989 27.6574974 28.0680199 28.4781837 28.971447 29.4879475 30.0739803 30.6903629 31.3065872 31.9147568 32.5764236 33.357132 33.9762955 34.7984467 35.8803024 37.004837 38.2259445 39.8344116 41.5104599 43.6292953 46.1957016 50.0815582 56.5646133 122.221924 COVARIANCE MATRIX (IN PERCENT) 0 1.0 2.0 3.0 4.0 5.0 6.0 7.0 8.0 9.0 10.0 11.0 12.0 13.0 14.0 0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1.0 1 100.0 51.4 36.1 22.3 1.0 25.5 50.7 32.7 44.9 -32.1 -6.8 -17.8 5.5 -6.4 2 51.4 100.0 63.3 43.0 9.5 53.9 93.7 59.5 88.8 -52.2 -21.2 -41.2 31.5 14.0 3 36.1 63.3 100.0 19.5 -10.1 21.6 66.9 44.4 69.1 -20.1 -20.3 -44.7 63.9 0.9 4 22.3 43.0 19.5 100.0 -5.4 15.3 45.3 30.8 46.9 -13.9 -24.7 -48.2 7.6 66.7 5 1.0 9.5 -10.1 -5.4 100.0 79.0 -19.1 40.6 39.3 34.7 1.1 6.4 -12.1 -7.0 6 25.5 53.9 21.6 15.3 79.0 100.0 30.4 68.6 72.1 0.0 -8.5 -13.7 5.3 0.9 7 50.7 93.7 66.9 45.3 -19.1 30.4 100.0 52.9 75.5 -58.9 -20.3 -42.4 35.9 16.7 8 32.7 59.5 44.4 30.8 40.6 68.6 52.9 100.0 73.0 -7.7 -18.1 -30.5 22.3 11.1 9 44.9 88.8 69.1 46.9 39.3 72.1 75.5 73.0 100.0 -24.8 -23.0 -45.6 36.7 18.1 10 -32.1 -52.2 -20.1 -13.9 34.7 0.0 -58.9 -7.7 -24.8 100.0 5.7 12.9 -6.4 -1.1 11 -6.8 -21.2 -20.3 -24.7 1.1 -8.5 -20.3 -18.1 -23.0 5.7 100.0 54.9 -10.0 -15.2 12 -17.8 -41.2 -44.7 -48.2 6.4 -13.7 -42.4 -30.5 -45.6 12.9 54.9 100.0 -22.8 -23.7 13 5.5 31.5 63.9 7.6 -12.1 5.3 35.9 22.3 36.7 -6.4 -10.0 -22.8 100.0 36.7 14 -6.4 14.0 0.9 66.7 -7.0 0.9 16.7 11.1 18.1 -1.1 -15.2 -23.7 36.7 100.0 TOTAL CORRELATION TO TARGET (diagonal) 110.478607 TOTAL CORRELATION OF ALL VARIABLES 56.5364458 ROUND 1: MAX CORR ( 56.5361723) AFTER KILLING INPUT VARIABLE 6 CONTR 0.175841219 ROUND 2: MAX CORR ( 56.5177468) AFTER KILLING INPUT VARIABLE 11 CONTR 1.4432866 ROUND 3: MAX CORR ( 56.4859152) AFTER KILLING INPUT VARIABLE 9 CONTR 1.89659929 ROUND 4: MAX CORR ( 56.4610998) AFTER KILLING INPUT VARIABLE 5 CONTR 1.67416366 ROUND 5: MAX CORR ( 56.3601302) AFTER KILLING INPUT VARIABLE 7 CONTR 3.37513241 ROUND 6: MAX CORR ( 56.1957796) AFTER KILLING INPUT VARIABLE 8 CONTR 4.30100368 ROUND 7: MAX CORR ( 55.8347191) AFTER KILLING INPUT VARIABLE 13 CONTR 6.36001494 ROUND 8: MAX CORR ( 55.3770116) AFTER KILLING INPUT VARIABLE 12 CONTR 7.13459429 ROUND 9: MAX CORR ( 55.1230462) AFTER KILLING INPUT VARIABLE 3 
CONTR 5.29747032 ROUND 10: MAX CORR ( 54.7664303) AFTER KILLING INPUT VARIABLE 10 CONTR 6.26005876 ROUND 11: MAX CORR ( 53.2003184) AFTER KILLING INPUT VARIABLE 4 CONTR 13.0033845 ROUND 12: MAX CORR ( 51.41225) AFTER KILLING INPUT VARIABLE 14 CONTR 13.6767843 LAST REMAINING VARIABLE: 2 total correlation to target: 56.5364458 % total significance: 107.328467 sigma correlations of single variables to target: variable 2: 51.41225 % , in sigma: 97.600723 variable 3: 36.0798539 % , in sigma: 68.4937893 variable 4: 22.3359799 % , in sigma: 42.4024972 variable 5: 0.962191424 % , in sigma: 1.82661873 variable 6: 25.4735162 % , in sigma: 48.3587783 variable 7: 50.6878825 % , in sigma: 96.2255878 variable 8: 32.6647446 % , in sigma: 62.0105653 variable 9: 44.8651781 % , in sigma: 85.1717989 variable 10: -32.0662493 % , in sigma: 60.8743852 variable 11: -6.77008596 % , in sigma: 12.8522927 variable 12: -17.8345834 % , in sigma: 33.8570717 variable 13: 5.48490628 % , in sigma: 10.4125149 variable 14: -6.36076924 % , in sigma: 12.0752481 variables sorted by significance: 1 most relevant variable 2 corr 51.4122505 , in sigma: 97.6007239 2 most relevant variable 14 corr 13.6767845 , in sigma: 25.9639299 3 most relevant variable 4 corr 13.0033846 , in sigma: 24.6855513 4 most relevant variable 10 corr 6.26005888 , in sigma: 11.8840602 5 most relevant variable 3 corr 5.29747009 , in sigma: 10.0566871 6 most relevant variable 12 corr 7.13459444 , in sigma: 13.5442735 7 most relevant variable 13 corr 6.36001492 , in sigma: 12.0738161 8 most relevant variable 8 corr 4.30100346 , in sigma: 8.1650005 9 most relevant variable 7 corr 3.37513232 , in sigma: 6.407332 10 most relevant variable 5 corr 1.6741637 , in sigma: 3.17822284 11 most relevant variable 9 corr 1.89659929 , in sigma: 3.60049331 12 most relevant variable 11 corr 1.44328654 , in sigma: 2.73992695 13 most relevant variable 6 corr 0.175841212 , in sigma: 0.333815957 global correlations between input variables: variable 2: 98.9505821 % variable 3: 93.7037697 % variable 4: 89.1793683 % variable 5: 94.8385195 % variable 6: 93.206786 % variable 7: 98.4962651 % variable 8: 83.9164007 % variable 9: 98.6187607 % variable 10: 73.0851823 % variable 11: 56.1305171 % variable 12: 72.2643537 % variable 13: 83.8272019 % variable 14: 85.3616025 % significance loss when removing single variables: variable 2: corr = 3.87754089 % , sigma = 7.36110156 variable 3: corr = 7.35039294 % , sigma = 13.9539442 variable 4: corr = 8.75420709 % , sigma = 16.6189369 variable 5: corr = 2.08357937 % , sigma = 3.95545522 variable 6: corr = 0.175841219 % , sigma = 0.33381597 variable 7: corr = 2.67069408 % , sigma = 5.07003044 variable 8: corr = 2.99618896 % , sigma = 5.68794808 variable 9: corr = 1.99277651 % , sigma = 3.78307558 variable 10: corr = 6.07778985 % , sigma = 11.5380417 variable 11: corr = 1.4395708 % , sigma = 2.73287302 variable 12: corr = 5.33851982 % , sigma = 10.1346157 variable 13: corr = 6.38621711 % , sigma = 12.1235582 variable 14: corr = 9.08641204 % , sigma = 17.2495929 Keep only 9 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 10 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 5 --> 23.2629757 sigma out 15 active outputs RANK 2 NODE 9 --> 23.1026173 sigma out 15 active outputs RANK 3 NODE 6 --> 19.3687344 sigma out 15 active outputs RANK 4 NODE 4 --> 15.4657755 
sigma out 15 active outputs RANK 5 NODE 7 --> 15.4050903 sigma out 15 active outputs RANK 6 NODE 10 --> 14.2594099 sigma out 15 active outputs RANK 7 NODE 1 --> 13.5477905 sigma out 15 active outputs RANK 8 NODE 2 --> 13.0312853 sigma out 15 active outputs RANK 9 NODE 8 --> 11.5675077 sigma out 15 active outputs RANK 10 NODE 3 --> 10.1209049 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 8 --> 28.8838768 sigma in 10act. ( 29.366869 sig out 1act.) RANK 2 NODE 7 --> 21.679306 sigma in 10act. ( 21.595377 sig out 1act.) RANK 3 NODE 6 --> 21.114912 sigma in 10act. ( 22.8113174 sig out 1act.) RANK 4 NODE 1 --> 19.1349697 sigma in 10act. ( 19.4938965 sig out 1act.) RANK 5 NODE 10 --> 16.8505001 sigma in 10act. ( 20.3485146 sig out 1act.) RANK 6 NODE 9 --> 7.65253067 sigma in 10act. ( 5.48181534 sig out 1act.) RANK 7 NODE 11 --> 7.32297468 sigma in 10act. ( 7.18724442 sig out 1act.) RANK 8 NODE 3 --> 6.93550348 sigma in 10act. ( 7.984097 sig out 1act.) RANK 9 NODE 2 --> 6.79870701 sigma in 10act. ( 6.38189697 sig out 1act.) RANK 10 NODE 13 --> 5.98607445 sigma in 10act. ( 4.39802599 sig out 1act.) RANK 11 NODE 12 --> 5.9688406 sigma in 10act. ( 3.65092826 sig out 1act.) RANK 12 NODE 5 --> 5.19626331 sigma in 10act. ( 0.390745938 sig out 1act.) RANK 13 NODE 14 --> 2.59905148 sigma in 10act. ( 5.31248331 sig out 1act.) RANK 14 NODE 4 --> 2.16247749 sigma in 10act. ( 0.536629677 sig out 1act.) RANK 15 NODE 15 --> 1.69691765 sigma in 10act. ( 1.09410012 sig out 1act.) sorted by output significance RANK 1 NODE 8 --> 29.366869 sigma out 1act.( 28.8838768 sig in 10act.) RANK 2 NODE 6 --> 22.8113174 sigma out 1act.( 21.114912 sig in 10act.) RANK 3 NODE 7 --> 21.595377 sigma out 1act.( 21.679306 sig in 10act.) RANK 4 NODE 10 --> 20.3485146 sigma out 1act.( 16.8505001 sig in 10act.) RANK 5 NODE 1 --> 19.4938965 sigma out 1act.( 19.1349697 sig in 10act.) RANK 6 NODE 3 --> 7.984097 sigma out 1act.( 6.93550348 sig in 10act.) RANK 7 NODE 11 --> 7.18724442 sigma out 1act.( 7.32297468 sig in 10act.) RANK 8 NODE 2 --> 6.38189697 sigma out 1act.( 6.79870701 sig in 10act.) RANK 9 NODE 9 --> 5.48181534 sigma out 1act.( 7.65253067 sig in 10act.) RANK 10 NODE 14 --> 5.31248331 sigma out 1act.( 2.59905148 sig in 10act.) RANK 11 NODE 13 --> 4.39802599 sigma out 1act.( 5.98607445 sig in 10act.) RANK 12 NODE 12 --> 3.65092826 sigma out 1act.( 5.9688406 sig in 10act.) RANK 13 NODE 15 --> 1.09410012 sigma out 1act.( 1.69691765 sig in 10act.) RANK 14 NODE 4 --> 0.536629677 sigma out 1act.( 2.16247749 sig in 10act.) RANK 15 NODE 5 --> 0.390745938 sigma out 1act.( 5.19626331 sig in 10act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 53.7765465 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 9 --> 23.2210808 sigma out 15 active outputs RANK 2 NODE 5 --> 23.1494179 sigma out 15 active outputs RANK 3 NODE 6 --> 18.6049213 sigma out 15 active outputs RANK 4 NODE 2 --> 17.5340252 sigma out 15 active outputs RANK 5 NODE 4 --> 15.695178 sigma out 15 active outputs RANK 6 NODE 10 --> 15.6402407 sigma out 15 active outputs RANK 7 NODE 7 --> 15.6250257 sigma out 15 active outputs RANK 8 NODE 1 --> 14.8789454 sigma out 15 active outputs RANK 9 NODE 8 --> 12.0779629 sigma out 15 active outputs RANK 10 NODE 3 --> 11.9920902 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 8 --> 27.4592476 sigma in 10act. ( 28.0089703 sig out 1act.) RANK 2 NODE 10 --> 20.4332619 sigma in 10act. 
( 20.1527596 sig out 1act.) RANK 3 NODE 6 --> 20.0797195 sigma in 10act. ( 21.5642414 sig out 1act.) RANK 4 NODE 1 --> 19.2140255 sigma in 10act. ( 19.0061722 sig out 1act.) RANK 5 NODE 7 --> 19.090023 sigma in 10act. ( 19.6208973 sig out 1act.) RANK 6 NODE 14 --> 11.8850174 sigma in 10act. ( 7.87158012 sig out 1act.) RANK 7 NODE 11 --> 10.471899 sigma in 10act. ( 6.80096674 sig out 1act.) RANK 8 NODE 3 --> 9.9479084 sigma in 10act. ( 9.09650517 sig out 1act.) RANK 9 NODE 9 --> 7.90498352 sigma in 10act. ( 4.80816078 sig out 1act.) RANK 10 NODE 2 --> 7.70002508 sigma in 10act. ( 6.05331707 sig out 1act.) RANK 11 NODE 5 --> 6.91594648 sigma in 10act. ( 1.38945293 sig out 1act.) RANK 12 NODE 13 --> 6.26401758 sigma in 10act. ( 3.76099801 sig out 1act.) RANK 13 NODE 4 --> 6.25143242 sigma in 10act. ( 0.320572346 sig out 1act.) RANK 14 NODE 15 --> 5.99969578 sigma in 10act. ( 1.46760201 sig out 1act.) RANK 15 NODE 12 --> 5.72156572 sigma in 10act. ( 2.77700329 sig out 1act.) sorted by output significance RANK 1 NODE 8 --> 28.0089703 sigma out 1act.( 27.4592476 sig in 10act.) RANK 2 NODE 6 --> 21.5642414 sigma out 1act.( 20.0797195 sig in 10act.) RANK 3 NODE 10 --> 20.1527596 sigma out 1act.( 20.4332619 sig in 10act.) RANK 4 NODE 7 --> 19.6208973 sigma out 1act.( 19.090023 sig in 10act.) RANK 5 NODE 1 --> 19.0061722 sigma out 1act.( 19.2140255 sig in 10act.) RANK 6 NODE 3 --> 9.09650517 sigma out 1act.( 9.9479084 sig in 10act.) RANK 7 NODE 14 --> 7.87158012 sigma out 1act.( 11.8850174 sig in 10act.) RANK 8 NODE 11 --> 6.80096674 sigma out 1act.( 10.471899 sig in 10act.) RANK 9 NODE 2 --> 6.05331707 sigma out 1act.( 7.70002508 sig in 10act.) RANK 10 NODE 9 --> 4.80816078 sigma out 1act.( 7.90498352 sig in 10act.) RANK 11 NODE 13 --> 3.76099801 sigma out 1act.( 6.26401758 sig in 10act.) RANK 12 NODE 12 --> 2.77700329 sigma out 1act.( 5.72156572 sig in 10act.) RANK 13 NODE 15 --> 1.46760201 sigma out 1act.( 5.99969578 sig in 10act.) RANK 14 NODE 5 --> 1.38945293 sigma out 1act.( 6.91594648 sig in 10act.) RANK 15 NODE 4 --> 0.320572346 sigma out 1act.( 6.25143242 sig in 10act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 51.7555122 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.423436642 *** contribution from regularisation: 0.00358819356 *** contribution from error: -0.427024841 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.456270784 *** contribution from regularisation: 0.00205444056 *** contribution from error: -0.458325237 *********************************************** -----------------> Test sample ENTER BFGS code START -41338.5221 -0.0219896901 0.291982621 EXIT FROM BFGS code FG_START 0. -0.0219896901 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.474634916 *** contribution from regularisation: 0.00192081858 *** contribution from error: -0.476555735 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -42992.9064 -0.0219896901 116.432091 EXIT FROM BFGS code FG_LNSRCH 0. 0.00376645592 0. 
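The ROUND 1-12 printout above is a backward elimination: variable 1 of the matrix printed as "COVARIANCE MATRIX (IN PERCENT)" (with 100 on the diagonal, so effectively a correlation matrix) is evidently the target itself, variables 2-14 are the 13 inputs, and in each round the input whose removal costs the least total correlation to the target is killed. The cumulative losses define the significance ordering, only the 9 most significant inputs are kept, and together with what appears to be one constant bias node this gives the 10-15-1 topology the teacher reports. The sketch below reproduces that greedy loop, taking "total correlation" to be the standard multiple-correlation coefficient R = sqrt(c^T C^-1 c); whether this matches NeuroBayes' internal definition exactly is an assumption.

import numpy as np

def multiple_corr(corr, target, kept):
    # Multiple correlation R of the variables `kept` with `target`,
    # computed from the full correlation matrix `corr`.
    c = corr[np.ix_(kept, [target])]
    C = corr[np.ix_(kept, kept)]
    return float(np.sqrt((c.T @ np.linalg.solve(C, c)).item()))

def backward_eliminate(corr, target=0, keep=9):
    # Repeatedly kill the variable whose removal reduces the multiple
    # correlation to the target the least (cf. the ROUND printout above).
    alive = [i for i in range(corr.shape[0]) if i != target]
    killed = []                                   # least significant first
    while len(alive) > 1:
        kill = max(alive, key=lambda v: multiple_corr(
            corr, target, [a for a in alive if a != v]))
        alive.remove(kill)
        killed.append(kill)
        print(f"killed {kill}, remaining R = "
              f"{multiple_corr(corr, target, alive):.4f}")
    killed.append(alive[0])                       # last remaining variable
    return killed[::-1][:keep]                    # the `keep` most significant

# ranking = backward_eliminate(corr_matrix, target=0, keep=9)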
--------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.501439214 *** contribution from regularisation: 0.00260781171 *** contribution from error: -0.504047036 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -45420.8632 0.00376645592 22.8063316 EXIT FROM BFGS code NEW_X -45420.8632 0.00376645592 22.8063316 ENTER BFGS code NEW_X -45420.8632 0.00376645592 22.8063316 EXIT FROM BFGS code FG_LNSRCH 0. 0.00871374179 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.502539992 *** contribution from regularisation: 0.00263297022 *** contribution from error: -0.505172968 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -45520.5749 0.00871374179 9.80585098 EXIT FROM BFGS code NEW_X -45520.5749 0.00871374179 9.80585098 ENTER BFGS code NEW_X -45520.5749 0.00871374179 9.80585098 EXIT FROM BFGS code FG_LNSRCH 0. 0.0126843732 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.502903402 *** contribution from regularisation: 0.00265331194 *** contribution from error: -0.505556703 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -45553.4921 0.0126843732 -8.49007034 EXIT FROM BFGS code NEW_X -45553.4921 0.0126843732 -8.49007034 ENTER BFGS code NEW_X -45553.4921 0.0126843732 -8.49007034 EXIT FROM BFGS code FG_LNSRCH 0. 0.010088197 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.503192961 *** contribution from regularisation: 0.002661634 *** contribution from error: -0.505854607 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -45579.7213 0.010088197 -24.9073009 EXIT FROM BFGS code NEW_X -45579.7213 0.010088197 -24.9073009 ENTER BFGS code NEW_X -45579.7213 0.010088197 -24.9073009 EXIT FROM BFGS code FG_LNSRCH 0. -0.00223171944 0. --------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.503651381 *** contribution from regularisation: 0.00263206405 *** contribution from error: -0.506283462 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -45621.2475 -0.00223171944 -40.619545 EXIT FROM BFGS code NEW_X -45621.2475 -0.00223171944 -40.619545 ENTER BFGS code NEW_X -45621.2475 -0.00223171944 -40.619545 EXIT FROM BFGS code FG_LNSRCH 0. -0.124079302 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.505074799 *** contribution from regularisation: 0.00271593896 *** contribution from error: -0.507790744 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -45750.1814 -0.124079302 -103.00251 EXIT FROM BFGS code NEW_X -45750.1814 -0.124079302 -103.00251 ENTER BFGS code NEW_X -45750.1814 -0.124079302 -103.00251 EXIT FROM BFGS code FG_LNSRCH 0. -0.138642967 0. 
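Each "Learn Path" block reports the training objective split into an error part (the entropy loss chosen in the setup) and a regularisation part, the ENTER/EXIT BFGS lines bracket the quasi-Newton line searches, and after every iteration the same quantities are re-evaluated on the held-out test sample. The sketch below minimises an analogous objective for a small one-hidden-layer network with SciPy's BFGS; it is illustrative only, since NeuroBayes' exact loss normalisation (note its negative values above), its regularisation scheme and its own BFGS implementation are not reproduced here.

import numpy as np
from scipy.optimize import minimize

def unpack(p, n_in, n_hidden):
    k = (n_in + 1) * n_hidden
    return p[:k].reshape(n_in + 1, n_hidden), p[k:].reshape(n_hidden + 1, 1)

def forward(p, X, n_hidden):
    w1, w2 = unpack(p, X.shape[1], n_hidden)
    ones = np.ones((len(X), 1))
    h = np.tanh(np.hstack([X, ones]) @ w1)          # hidden layer (+bias)
    z = (np.hstack([h, ones]) @ w2).ravel()
    return 1.0 / (1.0 + np.exp(-z))                 # sigmoid output

def objective(p, X, y, w, n_hidden, lam=1e-3):
    # Weighted cross-entropy ("error") plus an L2 term ("regularisation"),
    # mirroring the split printed in every Learn Path block.
    out = np.clip(forward(p, X, n_hidden), 1e-9, 1 - 1e-9)
    err = -np.average(y * np.log(out) + (1 - y) * np.log(1 - out), weights=w)
    return err + lam * np.dot(p, p)

# X, y, w: preprocessed inputs, Target and TrainWeight of the training half
# n_hidden = 15
# p0 = np.random.default_rng(0).normal(
#          scale=0.1, size=(X.shape[1] + 1) * n_hidden + n_hidden + 1)
# res = minimize(objective, p0, args=(X, y, w, n_hidden),
#                method="BFGS", options={"maxiter": 250})
# objective(res.x, X_test, y_test, w_test, n_hidden)   # "Test sample" check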
--------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 38.9580383 sigma out 15 active outputs RANK 2 NODE 1 --> 26.6711407 sigma out 15 active outputs RANK 3 NODE 10 --> 26.5120983 sigma out 15 active outputs RANK 4 NODE 5 --> 21.9883785 sigma out 15 active outputs RANK 5 NODE 8 --> 20.2259636 sigma out 15 active outputs RANK 6 NODE 3 --> 18.1262894 sigma out 15 active outputs RANK 7 NODE 9 --> 17.9376278 sigma out 15 active outputs RANK 8 NODE 7 --> 11.6162615 sigma out 15 active outputs RANK 9 NODE 4 --> 11.4703054 sigma out 15 active outputs RANK 10 NODE 6 --> 5.61397362 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 14 --> 41.4515915 sigma in 10act. ( 38.6712227 sig out 1act.) RANK 2 NODE 3 --> 33.5291977 sigma in 10act. ( 33.5794983 sig out 1act.) RANK 3 NODE 10 --> 18.8609333 sigma in 10act. ( 17.7593498 sig out 1act.) RANK 4 NODE 1 --> 18.7400055 sigma in 10act. ( 17.6499939 sig out 1act.) RANK 5 NODE 11 --> 16.9962063 sigma in 10act. ( 16.8609066 sig out 1act.) RANK 6 NODE 8 --> 13.7324133 sigma in 10act. ( 12.2354803 sig out 1act.) RANK 7 NODE 2 --> 12.8976898 sigma in 10act. ( 12.8740358 sig out 1act.) RANK 8 NODE 6 --> 12.6985826 sigma in 10act. ( 12.1554155 sig out 1act.) RANK 9 NODE 13 --> 12.391284 sigma in 10act. ( 12.7722826 sig out 1act.) RANK 10 NODE 7 --> 11.2027073 sigma in 10act. ( 11.5224199 sig out 1act.) RANK 11 NODE 9 --> 7.75776529 sigma in 10act. ( 7.09691668 sig out 1act.) RANK 12 NODE 5 --> 5.93355036 sigma in 10act. ( 4.57609367 sig out 1act.) RANK 13 NODE 15 --> 4.54554081 sigma in 10act. ( 2.5958972 sig out 1act.) RANK 14 NODE 4 --> 3.65687871 sigma in 10act. ( 0.428180993 sig out 1act.) RANK 15 NODE 12 --> 3.16276884 sigma in 10act. ( 1.49965048 sig out 1act.) sorted by output significance RANK 1 NODE 14 --> 38.6712227 sigma out 1act.( 41.4515915 sig in 10act.) RANK 2 NODE 3 --> 33.5794983 sigma out 1act.( 33.5291977 sig in 10act.) RANK 3 NODE 10 --> 17.7593498 sigma out 1act.( 18.8609333 sig in 10act.) RANK 4 NODE 1 --> 17.6499939 sigma out 1act.( 18.7400055 sig in 10act.) RANK 5 NODE 11 --> 16.8609066 sigma out 1act.( 16.9962063 sig in 10act.) RANK 6 NODE 2 --> 12.8740358 sigma out 1act.( 12.8976898 sig in 10act.) RANK 7 NODE 13 --> 12.7722826 sigma out 1act.( 12.391284 sig in 10act.) RANK 8 NODE 8 --> 12.2354803 sigma out 1act.( 13.7324133 sig in 10act.) RANK 9 NODE 6 --> 12.1554155 sigma out 1act.( 12.6985826 sig in 10act.) RANK 10 NODE 7 --> 11.5224199 sigma out 1act.( 11.2027073 sig in 10act.) RANK 11 NODE 9 --> 7.09691668 sigma out 1act.( 7.75776529 sig in 10act.) RANK 12 NODE 5 --> 4.57609367 sigma out 1act.( 5.93355036 sig in 10act.) RANK 13 NODE 15 --> 2.5958972 sigma out 1act.( 4.54554081 sig in 10act.) RANK 14 NODE 12 --> 1.49965048 sigma out 1act.( 3.16276884 sig in 10act.) RANK 15 NODE 4 --> 0.428180993 sigma out 1act.( 3.65687871 sig in 10act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 66.1349411 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.507214546 *** contribution from regularisation: 0.00271164207 *** contribution from error: -0.5099262 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -45944.0018 -0.138642967 -91.9679489 EXIT FROM BFGS code NEW_X -45944.0018 -0.138642967 -91.9679489 ENTER BFGS code NEW_X -45944.0018 -0.138642967 -91.9679489 EXIT FROM BFGS code FG_LNSRCH 0. -0.209779724 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.50848335 *** contribution from regularisation: 0.00297327247 *** contribution from error: -0.511456609 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46058.9282 -0.209779724 -24.9209671 EXIT FROM BFGS code NEW_X -46058.9282 -0.209779724 -24.9209671 ENTER BFGS code NEW_X -46058.9282 -0.209779724 -24.9209671 EXIT FROM BFGS code FG_LNSRCH 0. -0.251773655 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.509269178 *** contribution from regularisation: 0.00287683425 *** contribution from error: -0.512145996 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46130.1111 -0.251773655 21.1577988 EXIT FROM BFGS code NEW_X -46130.1111 -0.251773655 21.1577988 ENTER BFGS code NEW_X -46130.1111 -0.251773655 21.1577988 EXIT FROM BFGS code FG_LNSRCH 0. -0.250862479 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.509636283 *** contribution from regularisation: 0.00277127954 *** contribution from error: -0.512407541 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46163.3648 -0.250862479 14.4708529 EXIT FROM BFGS code NEW_X -46163.3648 -0.250862479 14.4708529 ENTER BFGS code NEW_X -46163.3648 -0.250862479 14.4708529 EXIT FROM BFGS code FG_LNSRCH 0. -0.240852982 0. --------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.510017037 *** contribution from regularisation: 0.00276880059 *** contribution from error: -0.512785852 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46197.8556 -0.240852982 -3.3295238 EXIT FROM BFGS code NEW_X -46197.8556 -0.240852982 -3.3295238 ENTER BFGS code NEW_X -46197.8556 -0.240852982 -3.3295238 EXIT FROM BFGS code FG_LNSRCH 0. -0.226915538 0. 
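At iteration 10 above (and again every ten iterations) the teacher writes the current network to "rescue.nb", so an interrupted job still leaves a usable expertise behind. In the SciPy sketch above the same safety net can be added with a callback; the file name and NumPy save format below are placeholders, not the NeuroBayes expertise format.

import numpy as np

def make_rescue_callback(path="rescue.npy", every=10):
    # Save the current parameter vector every `every` BFGS iterations.
    state = {"it": 0}
    def callback(p):
        state["it"] += 1
        if state["it"] % every == 0:
            np.save(path, p)
            print(f"Iteration No: {state['it']}  -> checkpoint written to {path}")
    return callback

# res = minimize(objective, p0, args=(X, y, w, n_hidden), method="BFGS",
#                callback=make_rescue_callback(), options={"maxiter": 250})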
--------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.510692477 *** contribution from regularisation: 0.00281956489 *** contribution from error: -0.513512015 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46259.035 -0.226915538 7.66775513 EXIT FROM BFGS code NEW_X -46259.035 -0.226915538 7.66775513 ENTER BFGS code NEW_X -46259.035 -0.226915538 7.66775513 EXIT FROM BFGS code FG_LNSRCH 0. -0.195606947 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.510832131 *** contribution from regularisation: 0.00296300044 *** contribution from error: -0.513795137 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46271.6856 -0.195606947 -18.8685474 EXIT FROM BFGS code NEW_X -46271.6856 -0.195606947 -18.8685474 ENTER BFGS code NEW_X -46271.6856 -0.195606947 -18.8685474 EXIT FROM BFGS code FG_LNSRCH 0. -0.200572565 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.511481702 *** contribution from regularisation: 0.00289663998 *** contribution from error: -0.514378369 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46330.5263 -0.200572565 -30.5437565 EXIT FROM BFGS code NEW_X -46330.5263 -0.200572565 -30.5437565 ENTER BFGS code NEW_X -46330.5263 -0.200572565 -30.5437565 EXIT FROM BFGS code FG_LNSRCH 0. -0.205229878 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.511563122 *** contribution from regularisation: 0.00290625333 *** contribution from error: -0.514469385 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46337.8997 -0.205229878 -19.896431 EXIT FROM BFGS code NEW_X -46337.8997 -0.205229878 -19.896431 ENTER BFGS code NEW_X -46337.8997 -0.205229878 -19.896431 EXIT FROM BFGS code FG_LNSRCH 0. -0.211810976 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.511651933 *** contribution from regularisation: 0.00290623889 *** contribution from error: -0.514558196 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46345.9455 -0.211810976 -10.5039167 EXIT FROM BFGS code NEW_X -46345.9455 -0.211810976 -10.5039167 ENTER BFGS code NEW_X -46345.9455 -0.211810976 -10.5039167 EXIT FROM BFGS code FG_LNSRCH 0. -0.227901027 0. 
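At iterations 1, 10, 20 and 30 the teacher also prints significance rankings of the individual network nodes (outputs of layer 1, inputs to layer 2, inputs to layer 3), a diagnostic of which hidden nodes actually carry information. The log does not define how these sigma values are computed, so the following is only a generic stand-in for such a ranking: score each hidden node by the spread of its contribution to the output node, a common pruning heuristic, and list the nodes in decreasing order.

import numpy as np

def hidden_node_relevance(w1, w2, X):
    # w1: (n_in + 1, n_hidden) input->hidden weights, last row = bias
    # w2: (n_hidden + 1, 1)    hidden->output weights, last row = bias
    h = np.tanh(np.hstack([X, np.ones((len(X), 1))]) @ w1)
    contrib = h * w2[:-1, 0]              # activation times outgoing weight
    scores = contrib.std(axis=0)
    for rank, node in enumerate(np.argsort(scores)[::-1], start=1):
        print(f"RANK {rank:2d}  NODE {node + 1:2d} --> {scores[node]:.3f}")
    return scores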
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 2 --> 58.1892242 sigma out 15 active outputs RANK 2 NODE 10 --> 36.8975258 sigma out 15 active outputs RANK 3 NODE 8 --> 35.9923096 sigma out 15 active outputs RANK 4 NODE 1 --> 25.750227 sigma out 15 active outputs RANK 5 NODE 9 --> 21.6911201 sigma out 15 active outputs RANK 6 NODE 5 --> 21.5136642 sigma out 15 active outputs RANK 7 NODE 3 --> 18.0704174 sigma out 15 active outputs RANK 8 NODE 7 --> 13.6947632 sigma out 15 active outputs RANK 9 NODE 4 --> 9.63596344 sigma out 15 active outputs RANK 10 NODE 6 --> 7.04432774 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 3 --> 57.2619438 sigma in 10act. ( 63.6493683 sig out 1act.) RANK 2 NODE 14 --> 54.3602142 sigma in 10act. ( 52.5108337 sig out 1act.) RANK 3 NODE 13 --> 23.4083118 sigma in 10act. ( 26.7530022 sig out 1act.) RANK 4 NODE 1 --> 18.8788166 sigma in 10act. ( 19.9378891 sig out 1act.) RANK 5 NODE 11 --> 16.9968605 sigma in 10act. ( 18.8072834 sig out 1act.) RANK 6 NODE 10 --> 15.5117283 sigma in 10act. ( 14.0156012 sig out 1act.) RANK 7 NODE 2 --> 15.3103008 sigma in 10act. ( 17.4109554 sig out 1act.) RANK 8 NODE 6 --> 11.5273685 sigma in 10act. ( 12.1285009 sig out 1act.) RANK 9 NODE 8 --> 9.54797173 sigma in 10act. ( 9.06957722 sig out 1act.) RANK 10 NODE 9 --> 7.84823227 sigma in 10act. ( 7.83163404 sig out 1act.) RANK 11 NODE 7 --> 7.57255554 sigma in 10act. ( 6.35582542 sig out 1act.) RANK 12 NODE 5 --> 4.92251015 sigma in 10act. ( 4.28665257 sig out 1act.) RANK 13 NODE 12 --> 3.6481998 sigma in 10act. ( 3.39251113 sig out 1act.) RANK 14 NODE 4 --> 2.67890811 sigma in 10act. ( 1.80466712 sig out 1act.) RANK 15 NODE 15 --> 2.42963815 sigma in 10act. ( 0.939921916 sig out 1act.) sorted by output significance RANK 1 NODE 3 --> 63.6493683 sigma out 1act.( 57.2619438 sig in 10act.) RANK 2 NODE 14 --> 52.5108337 sigma out 1act.( 54.3602142 sig in 10act.) RANK 3 NODE 13 --> 26.7530022 sigma out 1act.( 23.4083118 sig in 10act.) RANK 4 NODE 1 --> 19.9378891 sigma out 1act.( 18.8788166 sig in 10act.) RANK 5 NODE 11 --> 18.8072834 sigma out 1act.( 16.9968605 sig in 10act.) RANK 6 NODE 2 --> 17.4109554 sigma out 1act.( 15.3103008 sig in 10act.) RANK 7 NODE 10 --> 14.0156012 sigma out 1act.( 15.5117283 sig in 10act.) RANK 8 NODE 6 --> 12.1285009 sigma out 1act.( 11.5273685 sig in 10act.) RANK 9 NODE 8 --> 9.06957722 sigma out 1act.( 9.54797173 sig in 10act.) RANK 10 NODE 9 --> 7.83163404 sigma out 1act.( 7.84823227 sig in 10act.) RANK 11 NODE 7 --> 6.35582542 sigma out 1act.( 7.57255554 sig in 10act.) RANK 12 NODE 5 --> 4.28665257 sigma out 1act.( 4.92251015 sig in 10act.) RANK 13 NODE 12 --> 3.39251113 sigma out 1act.( 3.6481998 sig in 10act.) RANK 14 NODE 4 --> 1.80466712 sigma out 1act.( 2.67890811 sig in 10act.) RANK 15 NODE 15 --> 0.939921916 sigma out 1act.( 2.42963815 sig in 10act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 95.604805 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.511816382 *** contribution from regularisation: 0.00289708725 *** contribution from error: -0.514713466 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -46360.8424 -0.227901027 -3.61856151 EXIT FROM BFGS code NEW_X -46360.8424 -0.227901027 -3.61856151 ENTER BFGS code NEW_X -46360.8424 -0.227901027 -3.61856151 EXIT FROM BFGS code FG_LNSRCH 0. -0.278374285 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.512128174 *** contribution from regularisation: 0.00288376887 *** contribution from error: -0.515011966 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46389.0833 -0.278374285 34.3137207 EXIT FROM BFGS code NEW_X -46389.0833 -0.278374285 34.3137207 ENTER BFGS code NEW_X -46389.0833 -0.278374285 34.3137207 EXIT FROM BFGS code FG_LNSRCH 0. -0.333081782 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.51218164 *** contribution from regularisation: 0.00298521155 *** contribution from error: -0.515166879 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46393.9274 -0.333081782 41.3296471 EXIT FROM BFGS code NEW_X -46393.9274 -0.333081782 41.3296471 ENTER BFGS code NEW_X -46393.9274 -0.333081782 41.3296471 EXIT FROM BFGS code FG_LNSRCH 0. -0.316516459 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.51253283 *** contribution from regularisation: 0.00293928897 *** contribution from error: -0.515472114 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46425.7367 -0.316516459 -13.0659142 EXIT FROM BFGS code NEW_X -46425.7367 -0.316516459 -13.0659142 ENTER BFGS code NEW_X -46425.7367 -0.316516459 -13.0659142 EXIT FROM BFGS code FG_LNSRCH 0. -0.321350753 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.512570739 *** contribution from regularisation: 0.00294580893 *** contribution from error: -0.51551652 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46429.1689 -0.321350753 -7.94918728 EXIT FROM BFGS code NEW_X -46429.1689 -0.321350753 -7.94918728 ENTER BFGS code NEW_X -46429.1689 -0.321350753 -7.94918728 EXIT FROM BFGS code FG_LNSRCH 0. -0.334567219 0. 
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.512701035 *** contribution from regularisation: 0.00290932204 *** contribution from error: -0.515610337 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46440.9718 -0.334567219 -2.63648391 EXIT FROM BFGS code NEW_X -46440.9718 -0.334567219 -2.63648391 ENTER BFGS code NEW_X -46440.9718 -0.334567219 -2.63648391 EXIT FROM BFGS code FG_LNSRCH 0. -0.357883722 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.512849867 *** contribution from regularisation: 0.00289597525 *** contribution from error: -0.515745819 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46454.4522 -0.357883722 6.67167568 EXIT FROM BFGS code NEW_X -46454.4522 -0.357883722 6.67167568 ENTER BFGS code NEW_X -46454.4522 -0.357883722 6.67167568 EXIT FROM BFGS code FG_LNSRCH 0. -0.388500899 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.512913704 *** contribution from regularisation: 0.00289510749 *** contribution from error: -0.515808821 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46460.2346 -0.388500899 19.0650215 EXIT FROM BFGS code NEW_X -46460.2346 -0.388500899 19.0650215 ENTER BFGS code NEW_X -46460.2346 -0.388500899 19.0650215 EXIT FROM BFGS code FG_LNSRCH 0. -0.393111646 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.513022661 *** contribution from regularisation: 0.0029469009 *** contribution from error: -0.515969574 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46470.1044 -0.393111646 -12.6293669 EXIT FROM BFGS code NEW_X -46470.1044 -0.393111646 -12.6293669 ENTER BFGS code NEW_X -46470.1044 -0.393111646 -12.6293669 EXIT FROM BFGS code FG_LNSRCH 0. -0.389943689 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.513064206 *** contribution from regularisation: 0.00295446976 *** contribution from error: -0.516018689 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -46473.87 -0.389943689 -8.75696468 EXIT FROM BFGS code NEW_X -46473.87 -0.389943689 -8.75696468 ENTER BFGS code NEW_X -46473.87 -0.389943689 -8.75696468 EXIT FROM BFGS code FG_LNSRCH 0. -0.396149188 0. 
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 71.7552795 sigma out 15 active outputs
RANK 2 NODE 8 --> 46.543457 sigma out 15 active outputs
RANK 3 NODE 10 --> 45.5743408 sigma out 15 active outputs
RANK 4 NODE 1 --> 38.2768364 sigma out 15 active outputs
RANK 5 NODE 9 --> 34.8167229 sigma out 15 active outputs
RANK 6 NODE 3 --> 29.2148247 sigma out 15 active outputs
RANK 7 NODE 5 --> 25.671793 sigma out 15 active outputs
RANK 8 NODE 7 --> 20.2184906 sigma out 15 active outputs
RANK 9 NODE 4 --> 18.9182205 sigma out 15 active outputs
RANK 10 NODE 6 --> 17.8579731 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 3 --> 72.5618744 sigma in 10act. ( 83.8498383 sig out 1act.)
RANK 2 NODE 14 --> 69.6199951 sigma in 10act. ( 67.4982224 sig out 1act.)
RANK 3 NODE 13 --> 39.0334282 sigma in 10act. ( 47.4058342 sig out 1act.)
RANK 4 NODE 11 --> 29.3643131 sigma in 10act. ( 30.9458122 sig out 1act.)
RANK 5 NODE 10 --> 26.200758 sigma in 10act. ( 25.7687645 sig out 1act.)
RANK 6 NODE 1 --> 25.2035255 sigma in 10act. ( 27.8451576 sig out 1act.)
RANK 7 NODE 2 --> 21.6724739 sigma in 10act. ( 23.2065544 sig out 1act.)
RANK 8 NODE 8 --> 11.6543026 sigma in 10act. ( 12.624897 sig out 1act.)
RANK 9 NODE 6 --> 11.5757217 sigma in 10act. ( 13.1480427 sig out 1act.)
RANK 10 NODE 9 --> 7.41858625 sigma in 10act. ( 7.77344894 sig out 1act.)
RANK 11 NODE 5 --> 4.58489323 sigma in 10act. ( 4.49920368 sig out 1act.)
RANK 12 NODE 7 --> 4.33858824 sigma in 10act. ( 3.3424139 sig out 1act.)
RANK 13 NODE 12 --> 3.33781171 sigma in 10act. ( 3.29858637 sig out 1act.)
RANK 14 NODE 4 --> 1.25786757 sigma in 10act. ( 0.625527382 sig out 1act.)
RANK 15 NODE 15 --> 0.859232605 sigma in 10act. ( 0.177262872 sig out 1act.)
sorted by output significance
RANK 1 NODE 3 --> 83.8498383 sigma out 1act.( 72.5618744 sig in 10act.)
RANK 2 NODE 14 --> 67.4982224 sigma out 1act.( 69.6199951 sig in 10act.)
RANK 3 NODE 13 --> 47.4058342 sigma out 1act.( 39.0334282 sig in 10act.)
RANK 4 NODE 11 --> 30.9458122 sigma out 1act.( 29.3643131 sig in 10act.)
RANK 5 NODE 1 --> 27.8451576 sigma out 1act.( 25.2035255 sig in 10act.)
RANK 6 NODE 10 --> 25.7687645 sigma out 1act.( 26.200758 sig in 10act.)
RANK 7 NODE 2 --> 23.2065544 sigma out 1act.( 21.6724739 sig in 10act.)
RANK 8 NODE 6 --> 13.1480427 sigma out 1act.( 11.5757217 sig in 10act.)
RANK 9 NODE 8 --> 12.624897 sigma out 1act.( 11.6543026 sig in 10act.)
RANK 10 NODE 9 --> 7.77344894 sigma out 1act.( 7.41858625 sig in 10act.)
RANK 11 NODE 5 --> 4.49920368 sigma out 1act.( 4.58489323 sig in 10act.)
RANK 12 NODE 7 --> 3.3424139 sigma out 1act.( 4.33858824 sig in 10act.)
RANK 13 NODE 12 --> 3.29858637 sigma out 1act.( 3.33781171 sig in 10act.)
RANK 14 NODE 4 --> 0.625527382 sigma out 1act.( 1.25786757 sig in 10act.)
RANK 15 NODE 15 --> 0.177262872 sigma out 1act.( 0.859232605 sig in 10act.)
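These RANK tables are easier to track across checkpoints once they are machine-readable. A small parser for the "RANK n NODE m --> x sigma ..." lines, assuming the exact wording printed here (the regex may need loosening for other NeuroBayes versions):

```python
import re

RANK_RE = re.compile(r"RANK\s+(\d+)\s+NODE\s+(\d+)\s+-->\s+([0-9.]+)\s+sigma")

def parse_rank_lines(lines):
    """Return a list of (rank, node, significance) tuples from a block of
    'RANK n NODE m --> x sigma ...' lines as printed in the training log."""
    table = []
    for line in lines:
        m = RANK_RE.search(line)
        if m:
            table.append((int(m.group(1)), int(m.group(2)), float(m.group(3))))
    return table

example = [
    "RANK 1 NODE 2 --> 71.7552795 sigma out 15 active outputs",
    "RANK 2 NODE 8 --> 46.543457 sigma out 15 active outputs",
]
print(parse_rank_lines(example))
# [(1, 2, 71.7552795), (2, 8, 46.543457)]
```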
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 131.168182 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.513133764
*** contribution from regularisation: 0.00294420146
*** contribution from error: -0.516077995
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -46480.1671 -0.396149188 -7.38249397
EXIT FROM BFGS code NEW_X -46480.1671 -0.396149188 -7.38249397
ENTER BFGS code NEW_X -46480.1671 -0.396149188 -7.38249397
EXIT FROM BFGS code FG_LNSRCH 0. -0.429439664 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.513344169
*** contribution from regularisation: 0.00291851489
*** contribution from error: -0.51626271
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46499.2299 -0.429439664 -11.9027166
EXIT FROM BFGS code NEW_X -46499.2299 -0.429439664 -11.9027166
ENTER BFGS code NEW_X -46499.2299 -0.429439664 -11.9027166
EXIT FROM BFGS code FG_LNSRCH 0. -0.483554035 0.
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.512590051
*** contribution from regularisation: 0.00295442785
*** contribution from error: -0.515544474
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46430.9176 -0.483554035 200.891464
EXIT FROM BFGS code FG_LNSRCH 0. -0.436901927 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.513321698
*** contribution from regularisation: 0.00297975144
*** contribution from error: -0.516301453
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46497.1911 -0.436901927 13.3518066
EXIT FROM BFGS code FG_LNSRCH 0. -0.430618435 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.5133093
*** contribution from regularisation: 0.00296111568
*** contribution from error: -0.516270399
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46496.0673 -0.430618435 -7.95648003
EXIT FROM BFGS code FG_LNSRCH 0. -0.429477841 0.
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.513301194
*** contribution from regularisation: 0.0029618456
*** contribution from error: -0.516263068
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46495.3334 -0.429477841 -11.7179451
EXIT FROM BFGS code FG_LNSRCH 0. -0.429439694 0.
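As at iteration 20, the teacher rewrites the current network to rescue.nb at iteration 30 (apparently every tenth iteration), so an interrupted job can be resumed from the last checkpoint instead of restarting from scratch. A generic sketch of that periodic-checkpoint pattern follows; train_one_iteration and save_network are hypothetical stand-ins, not NeuroBayes calls, and only the file name and the visible 10-iteration interval come from the log:

```python
def train_one_iteration(network):
    # Stand-in for one BFGS "learn path"; a real step would update the weights.
    network["iteration"] = network.get("iteration", 0) + 1

def save_network(network, path):
    # Stand-in serialisation; NeuroBayes writes its own binary expertise format.
    with open(path, "w") as f:
        f.write(repr(network))

CHECKPOINT_EVERY = 10          # the log checkpoints at iterations 20, 30, ...
RESCUE_FILE = "rescue.nb"      # file name as printed by the teacher

def run_training(network, n_iterations=40):
    for it in range(1, n_iterations + 1):
        train_one_iteration(network)
        if it % CHECKPOINT_EVERY == 0:
            print(f"Iteration No: {it}")
            print(f"SAVING EXPERTISE TO {RESCUE_FILE}")
            save_network(network, RESCUE_FILE)

run_training({})
```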
---------------------------------------------------
Iteration : 36
***********************************************
*** Learn Path 36
*** loss function: -0.513296843
*** contribution from regularisation: 0.00296584703
*** contribution from error: -0.51626271
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46494.9413 -0.429439694 -11.9141636
EXIT FROM BFGS code FG_LNSRCH 0. -0.429439664 0.
---------------------------------------------------
Iteration : 37
***********************************************
*** Learn Path 37
*** loss function: -0.513308823
*** contribution from regularisation: 0.0029538786
*** contribution from error: -0.51626271
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46496.0267 -0.429439664 -11.9304008
EXIT FROM BFGS code FG_LNSRCH 0. -0.429439664 0.
---------------------------------------------------
Iteration : 38
***********************************************
*** Learn Path 38
*** loss function: -0.513310373
*** contribution from regularisation: 0.00295232097
*** contribution from error: -0.51626271
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46496.1677 -0.429439664 -11.9045143
EXIT FROM BFGS code FG_LNSRCH 0. -0.429439664 0.
---------------------------------------------------
Iteration : 39
***********************************************
*** Learn Path 39
*** loss function: -0.513309062
*** contribution from regularisation: 0.00295364647
*** contribution from error: -0.51626271
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -46496.0477 -0.429439664 -11.8399496
EXIT FROM BFGS code NEW_X -46496.0477 -0.429439664 -11.8399496
ENTER BFGS code NEW_X -46496.0477 -0.429439664 -11.8399496
EXIT FROM BFGS code CONVERGENC -46496.0477 -0.429439664 -11.8399496
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 2 --> 114.523506 sigma out 15 active outputs
RANK 2 NODE 10 --> 77.8936386 sigma out 15 active outputs
RANK 3 NODE 8 --> 76.3544846 sigma out 15 active outputs
RANK 4 NODE 1 --> 63.8484535 sigma out 15 active outputs
RANK 5 NODE 9 --> 58.2192841 sigma out 15 active outputs
RANK 6 NODE 3 --> 46.9494629 sigma out 15 active outputs
RANK 7 NODE 5 --> 41.5595932 sigma out 15 active outputs
RANK 8 NODE 4 --> 32.1513863 sigma out 15 active outputs
RANK 9 NODE 7 --> 31.4762592 sigma out 15 active outputs
RANK 10 NODE 6 --> 27.5465679 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 3 --> 118.800819 sigma in 10act. ( 134.657578 sig out 1act.)
RANK 2 NODE 14 --> 111.135399 sigma in 10act. ( 110.128532 sig out 1act.)
RANK 3 NODE 13 --> 64.1978073 sigma in 10act. ( 80.2251511 sig out 1act.)
RANK 4 NODE 11 --> 50.3612556 sigma in 10act. ( 52.648819 sig out 1act.)
RANK 5 NODE 10 --> 46.7527695 sigma in 10act. ( 47.2263832 sig out 1act.)
RANK 6 NODE 1 --> 41.7822189 sigma in 10act. ( 46.5507889 sig out 1act.)
RANK 7 NODE 2 --> 35.6738701 sigma in 10act. ( 38.1066704 sig out 1act.)
RANK 8 NODE 6 --> 18.350399 sigma in 10act. ( 21.4647541 sig out 1act.)
RANK 9 NODE 8 --> 18.0713387 sigma in 10act. ( 19.6924 sig out 1act.)
RANK 10 NODE 9 --> 10.8502846 sigma in 10act. ( 11.8621626 sig out 1act.)
RANK 11 NODE 5 --> 6.14085102 sigma in 10act. ( 6.18555355 sig out 1act.)
RANK 12 NODE 7 --> 5.03892136 sigma in 10act. ( 4.66821241 sig out 1act.)
RANK 13 NODE 12 --> 4.63728952 sigma in 10act. ( 4.73687744 sig out 1act.)
RANK 14 NODE 4 --> 1.15122187 sigma in 10act. ( 0.754123211 sig out 1act.)
RANK 15 NODE 15 --> 0.635833502 sigma in 10act. ( 0.164070502 sig out 1act.)
sorted by output significance
RANK 1 NODE 3 --> 134.657578 sigma out 1act.( 118.800819 sig in 10act.)
RANK 2 NODE 14 --> 110.128532 sigma out 1act.( 111.135399 sig in 10act.)
RANK 3 NODE 13 --> 80.2251511 sigma out 1act.( 64.1978073 sig in 10act.)
RANK 4 NODE 11 --> 52.648819 sigma out 1act.( 50.3612556 sig in 10act.)
RANK 5 NODE 10 --> 47.2263832 sigma out 1act.( 46.7527695 sig in 10act.)
RANK 6 NODE 1 --> 46.5507889 sigma out 1act.( 41.7822189 sig in 10act.)
RANK 7 NODE 2 --> 38.1066704 sigma out 1act.( 35.6738701 sig in 10act.)
RANK 8 NODE 6 --> 21.4647541 sigma out 1act.( 18.350399 sig in 10act.)
RANK 9 NODE 8 --> 19.6924 sigma out 1act.( 18.0713387 sig in 10act.)
RANK 10 NODE 9 --> 11.8621626 sigma out 1act.( 10.8502846 sig in 10act.)
RANK 11 NODE 5 --> 6.18555355 sigma out 1act.( 6.14085102 sig in 10act.)
RANK 12 NODE 12 --> 4.73687744 sigma out 1act.( 4.63728952 sig in 10act.)
RANK 13 NODE 7 --> 4.66821241 sigma out 1act.( 5.03892136 sig in 10act.)
RANK 14 NODE 4 --> 0.754123211 sigma out 1act.( 1.15122187 sig in 10act.)
RANK 15 NODE 15 --> 0.164070502 sigma out 1act.( 0.635833502 sig in 10act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 215.384964 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.51329571
*** contribution from regularisation: 0.00296699069
*** contribution from error: -0.51626271
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 28444
Closing output file
done
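From iteration 31 onwards the error contribution is frozen at -0.51626271 and the line search keeps returning essentially the same point, until the BFGS driver reports CONVERGENC at iteration 39; the log then jumps straight to the closing summary labelled Iteration : 250 and exports the final expertise to expert.nb. Since the training curve is only available through these printed blocks, one way to monitor it is to scrape the Learn Path / loss function pairs out of a saved copy of the log and apply a simple plateau test. The sketch below assumes the exact wording shown above; the file name "teacher.log", the helper names, and the plateau tolerance are all hypothetical, and the real stopping decision is made inside the BFGS code, not by inspecting printed losses.

```python
import re

PATH_RE = re.compile(r"Learn Path\s+(\d+)")
LOSS_RE = re.compile(r"loss function:\s+(-?[0-9.]+)")

def loss_history(lines):
    """Return [(learn_path, loss), ...] scraped from teacher log lines."""
    history, current = [], None
    for line in lines:
        m = PATH_RE.search(line)
        if m:
            current = int(m.group(1))
            continue
        m = LOSS_RE.search(line)
        if m and current is not None:
            history.append((current, float(m.group(1))))
            current = None
    return history

def has_plateaued(losses, window=5, tol=1e-4):
    """True if the last `window` losses vary by less than `tol` (illustrative only)."""
    return len(losses) >= window and max(losses[-window:]) - min(losses[-window:]) < tol

demo = [
    "*** Learn Path 38",
    "*** loss function: -0.513310373",
    "*** Learn Path 39",
    "*** loss function: -0.513309062",
]
print(loss_history(demo))   # [(38, -0.513310373), (39, -0.513309062)]

# e.g.  pairs  = loss_history(open("teacher.log"))
#       losses = [l for _, l in pairs]
#       print(has_plateaued(losses))
```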