NNInput NNInputs_165.root

Options for steering
  Constraint               : lep1_E<400&&lep2_E<400&&
  HiLoSbString             : SB
  SbString                 : Target
  WeightString             : TrainWeight
  EqualizeSB               : 0
  EvaluateVariables        : 0
  SetNBProcessingDefault   : 1
  UseNeuroBayes            : 1
  WeightEvents             : 1
  NBTreePrepEvPrint        : 1
  NBTreePrepReportInterval : 10000
  NB_Iter                  : 250
  NBZero999Tol             : 0.001

**** List Parameters ****
  Method:H6AONN5MEMLP:MLP : !V:NCycles=250:HiddenLayers=N+1,N

Value for FileString Not found
Value for RandString Not found
Value for DilTypeString Not found

Determine File Parameters : Info: No file info in tree

NNAna::CopyTree: entries= 211110  file= 0  options= lep1_E<400&&lep2_E<400&&  SBRatio= 0  wt= 1  SorB = 2
NNAna::CopyTree: SigChoice: lep1_E<400&&lep2_E<400&&Target==1
                 BkgChoice: lep1_E<400&&lep2_E<400&&Target==0
Creating Signal tree for file: 0  weight: 1  SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==1
Creating Background tree for file: 0  weight: 1  SBRatio = 0
  Using copy string: lep1_E<400&&lep2_E<400&&Target==0
NNAna::CopyTree: nsig = 62768  nbkg = 148342
Bkg Entries: 148342   Sig Entries: 62768   Chosen entries: 62768
Signal fraction: 1   Background fraction: 0.42313
Signal Tree Copy Condition:
Background Tree Copy Condition:
Actual Background Entries: 148342   Actual Signal Entries: 62768
Entries to split: 62768   Test with : 31384   Train with : 31384

*********************************************
* This product is licenced for educational *
* and scientific use only. Commercial use  *
* is prohibited !                           *
*********************************************

Your number of nodes in the input layer is:  14
Your number of nodes in the hidden layer is: 15
Your number of nodes in the output layer is:  1
You want to do classification
You want to use the global preprocessing flag 812
You want to use standard regularisation
You use entropy as a loss function
You want to use the BFGS algorithm
You do not want to fix the shape
You use 50 % of patterns for training
Weight update after 200 events
You want to speed up learning by a factor of 1.
You want to limit the learning rate to 0.5
You want to run 250 iterations
NeuroBayesTeacher::NB_DEF_DEBUG : setting debug level to 0
You are not allowed to change the debug flag. This is only permitted with a developers licence.
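Note: the bookkeeping above (selection cut, signal/background separation on Target, and the 50/50 train/test division of the chosen entries) can be sanity-checked outside NNAna. Below is a minimal NumPy sketch, assuming the tree has already been loaded into a structured array with the branch names printed in this log; the array `arr`, the seed, and the helper itself are hypothetical illustrations, not NNAna code, and the teacher's actual split ordering is not visible in the log.

```python
import numpy as np

def select_and_split(arr, seed=0):
    """Apply the steering cut, separate signal/background by Target,
    and split the chosen sample 50/50 into train and test halves."""
    cut = (arr["lep1_E"] < 400) & (arr["lep2_E"] < 400)       # Constraint string
    sig = arr[cut & (arr["Target"] == 1)]                     # SigChoice
    bkg = arr[cut & (arr["Target"] == 0)]                     # BkgChoice
    print(f"nsig = {len(sig)}  nbkg = {len(bkg)}")            # 62768 / 148342 above
    n_chosen = min(len(sig), len(bkg))                        # "Chosen entries: 62768"
    idx = np.random.default_rng(seed).permutation(n_chosen)
    train, test = idx[: n_chosen // 2], idx[n_chosen // 2:]   # 31384 / 31384 above
    return sig, bkg, train, test
```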
NBTreeReportInterval = 10000   NBTreePrepEvPrint = 1

Start Set Individual Variable Preprocessing
  Not touching individual preprocessing in Neurobayes for: Ht ( 0 ), LepAPt ( 1 ), LepBPt ( 2 ),
  MetSigLeptonsJets ( 3 ), MetSpec ( 4 ), SumEtLeptonsJets ( 5 ), VSumJetLeptonsPt ( 6 ),
  addEt ( 7 ), dPhiLepSumMet ( 8 ), dPhiLeptons ( 9 ), dRLeptons ( 10 ), lep1_E ( 11 ), lep2_E ( 12 )
End Set Individual Variable Preprocessing

Adding variables to Neurobayes: Ht, LepAPt, LepBPt, MetSigLeptonsJets, MetSpec, SumEtLeptonsJets,
VSumJetLeptonsPt, addEt, dPhiLepSumMet, dPhiLeptons, dRLeptons, lep1_E, lep2_E

NNAna::PrepareNBTraining_: Nent= 62768 for Signal
Prepared event 0 for Signal with 62768 events
====Entry 0
  Variable Ht                : 267.874
  Variable LepAPt            : 45.292
  Variable LepBPt            : 35.604
  Variable MetSigLeptonsJets : 7.24556
  Variable MetSpec           : 95.2083
  Variable SumEtLeptonsJets  : 172.666
  Variable VSumJetLeptonsPt  : 107.325
  Variable addEt             : 176.104
  Variable dPhiLepSumMet     : 2.28406
  Variable dPhiLeptons       : 0.0254087
  Variable dRLeptons         : 0.325152
  Variable lep1_E            : 46.0978
  Variable lep2_E            : 40.383
===Show Start ======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0   DeltaRJet12 = 0
  File = 2165   Ht = 267.874   IsMEBase = 0   LRHWW = 0   LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0
  LepAEt = 45.2922   LepAPt = 45.292   LepBEt = 35.604   LepBPt = 35.604   LessCentralJetEta = 0
  MJ1Lep1 = 0   MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0   NN = 0
  Met = 95.2083   MetDelPhi = 2.19471   MetSig = 6.53886   MetSigLeptonsJets = 7.24556   MetSpec = 95.2083
  Mjj = 0   MostCentralJetEta = -1.47546   MtllMet = 179.575   Njets = 1   SB = 0
  SumEt = 212.005   SumEtJets = 0   SumEtLeptonsJets = 172.666   Target = 1   TrainWeight = 1
  VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 107.325   addEt = 176.104
  dPhiLepSumMet = 2.28406   dPhiLeptons = 0.0254087   dRLeptons = 0.325152
  diltype = 52   dimass = 13.1154   event = 77   jet1_Et = 91.7697   jet1_eta = 0   jet2_Et = 0   jet2_eta = 0
  lep1_E = 46.0978   lep2_E = 40.383   rand = 0.999742   run = 237144   weight = 4.01724e-06
===Show End
Prepared events 10000, 20000, 30000, 40000, 50000 and 60000 for Signal with 62768 events
Adding variables to Neurobayes: Ht, LepAPt, LepBPt, MetSigLeptonsJets, MetSpec, SumEtLeptonsJets,
VSumJetLeptonsPt, addEt, dPhiLepSumMet, dPhiLeptons, dRLeptons, lep1_E, lep2_E

NNAna::PrepareNBTraining_: Nent= 148342 for Background
Prepared event 0 for Background with 148342 events
====Entry 0
  Variable Ht                : 85.3408
  Variable LepAPt            : 31.1294
  Variable LepBPt            : 10.0664
  Variable MetSigLeptonsJets : 6.87768
  Variable MetSpec           : 44.1437
  Variable SumEtLeptonsJets  : 41.1959
  Variable VSumJetLeptonsPt  : 41.0132
  Variable addEt             : 85.3408
  Variable dPhiLepSumMet     : 3.10536
  Variable dPhiLeptons       : 0.219342
  Variable dRLeptons         : 0.424454
  Variable lep1_E            : 32.3548
  Variable lep2_E            : 10.1027
===Show Start ======> EVENT:0
  DEtaJ1J2 = 0   DEtaJ1Lep1 = 0   DEtaJ1Lep2 = 0   DEtaJ2Lep1 = 0   DEtaJ2Lep2 = 0
  DPhiJ1J2 = 0   DPhiJ1Lep1 = 0   DPhiJ1Lep2 = 0   DPhiJ2Lep1 = 0   DPhiJ2Lep2 = 0
  DRJ1J2 = 0   DRJ1Lep1 = 0   DRJ1Lep2 = 0   DRJ2Lep1 = 0   DRJ2Lep2 = 0   DeltaRJet12 = 0
  File = 1   Ht = 85.3408   IsMEBase = 0   LRHWW = 0   LRWW = 0   LRWg = 0   LRWj = 0   LRZZ = 0
  LepAEt = 31.1297   LepAPt = 31.1294   LepBEt = 10.0674   LepBPt = 10.0664   LessCentralJetEta = 0
  MJ1Lep1 = 0   MJ1Lep2 = 0   MJ2Lep1 = 0   MJ2Lep2 = 0   NN = 0
  Met = 44.1437   MetDelPhi = 3.01191   MetSig = 3.74558   MetSigLeptonsJets = 6.87768   MetSpec = 44.1437
  Mjj = 0   MostCentralJetEta = 0   MtllMet = 86.2332   Njets = 0   SB = 0
  SumEt = 138.899   SumEtJets = 0   SumEtLeptonsJets = 41.1959   Target = 0   TrainWeight = 0.377498
  VSum2JetLeptonsPt = 0   VSum2JetPt = 0   VSumJetLeptonsPt = 41.0132   addEt = 85.3408
  dPhiLepSumMet = 3.10536   dPhiLeptons = 0.219342   dRLeptons = 0.424454
  diltype = 17   dimass = 7.54723   event = 6717520   jet1_Et = 0   jet1_eta = 0   jet2_Et = 0   jet2_eta = 0
  lep1_E = 32.3548   lep2_E = 10.1027   rand = 0.999742   run = 271566   weight = 0.00296168
===Show End
Prepared events 10000 through 140000 (every 10000) for Background with 148342 events
Warning: found 4725 negative weights.
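Note: the preparation step above hands the teacher 13 input variables per event plus the per-event TrainWeight, and ends with a warning about 4725 negative weights. A hypothetical NumPy sketch of that bookkeeping follows (structured arrays `sig`/`bkg` as in the earlier sketch; treating the percentage printed a few lines below as the weight-weighted signal fraction is an assumption, not something the log states).

```python
import numpy as np

INPUT_VARS = ["Ht", "LepAPt", "LepBPt", "MetSigLeptonsJets", "MetSpec",
              "SumEtLeptonsJets", "VSumJetLeptonsPt", "addEt",
              "dPhiLepSumMet", "dPhiLeptons", "dRLeptons", "lep1_E", "lep2_E"]

def build_training_arrays(sig, bkg):
    """Stack signal and background into (inputs, target, weight) arrays."""
    X = np.concatenate([np.stack([sig[v] for v in INPUT_VARS], axis=1),
                        np.stack([bkg[v] for v in INPUT_VARS], axis=1)])
    target = np.concatenate([np.ones(len(sig)), np.zeros(len(bkg))])
    weight = np.concatenate([sig["TrainWeight"], bkg["TrainWeight"]])
    print(f"found {np.sum(weight < 0)} negative weights")     # cf. the warning above
    frac = weight[target == 1].sum() / weight.sum()           # weighted signal fraction
    print(f"Signal fraction: {100.0 * frac} %")
    return X, target, weight
```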
[Phi-T(R) NeuroBayes(R) ASCII-art banner]
     Phi-T(R) NeuroBayes(R) Teacher
     Algorithms by Michael Feindt
     Implementation by Phi-T
     Project 2001-2003, Copyright Phi-T GmbH
     Version 20080312
Library compiled with: NB_MAXPATTERN= 1500000   NB_MAXNODE = 100
-----------------------------------
found 211110 samples to learn from

preprocessing flags/parameters:
  global preprocessing flag: 812
  individual preprocessing:
now perform preprocessing
*** called with option 12 ***
*** This will do for you:
***   input variable equalisation
***   to Gaussian distribution with mean=0 and sigma=1
***   Then variables are decorrelated
************************************
(A schematic code illustration of this transformation is given later in this excerpt, after the iteration-13 block.)

Warning: found 4725 negative weights.
Signal fraction: 63.1958847 %

Transdef: flat-transformation tables for variables 1-14. The table for variable 1 contains only -1 and +1; the tables for variables 2-14 each list 100 quantile boundaries of that variable's distribution (e.g. variable 2 runs from 60.0048981 to 1107.59326, variable 14 from 10.0086575 to 148.827057). [The full numeric tables are omitted here; see the original log.]

COVARIANCE MATRIX (IN PERCENT)
      0    1.0    2.0    3.0    4.0    5.0    6.0    7.0    8.0    9.0   10.0   11.0   12.0   13.0   14.0
  0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0    1.0
  1  100.0   66.2   52.9   58.2   31.7   51.9   63.8   57.0   65.3  -22.3  -17.6  -29.4   22.4   38.8
  2   66.2  100.0   68.9   65.8   37.6   69.9   95.0   77.7   92.7  -44.3  -22.6  -39.5   39.1   46.8
  3   52.9   68.9  100.0   56.1   10.6   36.4   72.3   56.7   74.0  -16.4  -25.6  -44.6   67.3   39.7
  4   58.2   65.8   56.1  100.0   19.5   41.9   66.2   57.5   72.3  -12.1  -30.6  -53.0   31.6   78.7
  5   31.7   37.6   10.6   19.5  100.0   83.6   12.3   58.9   56.6   20.6   -7.1  -11.5   -2.1   11.1
  6   51.9   69.9   36.4   41.9   83.6  100.0   50.7   79.4   79.1   -8.4  -15.3  -25.5   14.4   27.9
  7   63.8   95.0   72.3   66.2   12.3   50.7  100.0   70.2   82.7  -52.0  -21.0  -38.5   43.9   48.2
  8   57.0   77.7   56.7   57.5   58.9   79.4   70.2  100.0   84.8  -13.2  -21.7  -36.2   31.5   41.4
  9   65.3   92.7   74.0   72.3   56.6   79.1   82.7   84.8  100.0  -22.1  -25.9  -45.5   42.8   52.5
 10  -22.3  -44.3  -16.4  -12.1   20.6   -8.4  -52.0  -13.2  -22.1  100.0    1.2    4.1   -9.0   -7.0
 11  -17.6  -22.6  -25.6  -30.6   -7.1  -15.3  -21.0  -21.7  -25.9    1.2  100.0   59.8   -8.5  -20.4
 12  -29.4  -39.5  -44.6  -53.0  -11.5  -25.5  -38.5  -36.2  -45.5    4.1   59.8  100.0  -18.1  -32.4
 13   22.4   39.1   67.3   31.6   -2.1   14.4   43.9   31.5   42.8   -9.0   -8.5  -18.1  100.0   54.1
 14   38.8   46.8   39.7   78.7   11.1   27.9   48.2   41.4   52.5   -7.0  -20.4  -32.4   54.1  100.0

TOTAL CORRELATION TO TARGET (diagonal)  171.831949
TOTAL CORRELATION OF ALL VARIABLES  73.4795142

ROUND  1: MAX CORR ( 73.4759359) AFTER KILLING INPUT VARIABLE 11  CONTR  0.725154803
ROUND  2: MAX CORR ( 73.4651015) AFTER KILLING INPUT VARIABLE 14  CONTR  1.26175391
ROUND  3: MAX CORR ( 73.4484948) AFTER KILLING INPUT VARIABLE  8  CONTR  1.56196573
ROUND  4: MAX CORR ( 73.4219496) AFTER KILLING INPUT VARIABLE 10  CONTR  1.97451524
ROUND  5: MAX CORR ( 73.3281236) AFTER KILLING INPUT VARIABLE  2  CONTR  3.71065679
ROUND  6: MAX CORR ( 73.1444951) AFTER KILLING INPUT VARIABLE  6  CONTR  5.18618831
ROUND  7: MAX CORR ( 72.8932992) AFTER KILLING INPUT VARIABLE 12  CONTR  6.05673905
ROUND  8: MAX CORR ( 72.1400346) AFTER KILLING INPUT VARIABLE 13  CONTR 10.4521997
ROUND  9: MAX CORR ( 70.776958 ) AFTER KILLING INPUT VARIABLE  9  CONTR 13.9573212
ROUND 10: MAX CORR ( 70.4725228) AFTER KILLING INPUT VARIABLE  3  CONTR  6.55753839
ROUND 11: MAX CORR ( 68.1785394) AFTER KILLING INPUT VARIABLE  4  CONTR 17.8343273
ROUND 12: MAX CORR ( 63.7924814) AFTER KILLING INPUT VARIABLE  5  CONTR 24.0589393
LAST REMAINING VARIABLE: 7
(A sketch of this greedy ranking appears later in this excerpt, after the iteration-24 block.)

total correlation to target: 73.4795142 %
total significance: 164.235122 sigma

correlations of single variables to target:
  variable  2:  66.1552337 % , in sigma: 147.864518
  variable  3:  52.9495079 % , in sigma: 118.348209
  variable  4:  58.2017481 % , in sigma: 130.087567
  variable  5:  31.6980615 % , in sigma:  70.8487947
  variable  6:  51.9207472 % , in sigma: 116.048811
  variable  7:  63.7924814 % , in sigma: 142.583496
  variable  8:  57.0287248 % , in sigma: 127.465726
  variable  9:  65.2578241 % , in sigma: 145.858704
  variable 10: -22.2706728 % , in sigma:  49.7775021
  variable 11: -17.5871329 % , in sigma:  39.3092546
  variable 12: -29.4394423 % , in sigma:  65.8005222
  variable 13:  22.372872  % , in sigma:  50.0059291
  variable 14:  38.7972025 % , in sigma:  86.7161872

variables sorted by significance:
   1 most relevant variable  7  corr 63.7924805  , in sigma: 142.583494
   2 most relevant variable  5  corr 24.058939   , in sigma:  53.7744817
   3 most relevant variable  4  corr 17.8343277  , in sigma:  39.8617632
   4 most relevant variable  3  corr  6.55753851 , in sigma:  14.656849
   5 most relevant variable  9  corr 13.9573212  , in sigma:  31.19621
   6 most relevant variable 13  corr 10.4521999  , in sigma:  23.3618629
   7 most relevant variable 12  corr  6.05673885 , in sigma:  13.5375044
   8 most relevant variable  6  corr  5.18618822 , in sigma:  11.5917241
   9 most relevant variable  2  corr  3.71065688 , in sigma:   8.29374274
  10 most relevant variable 10  corr  1.9745152  , in sigma:   4.41326742
  11 most relevant variable  8  corr  1.5619657  , in sigma:   3.49117208
  12 most relevant variable 14  corr  1.26175392 , in sigma:   2.82016439
  13 most relevant variable 11  corr  0.725154817 , in sigma:  1.620804

global correlations between input variables:
  variable  2: 99.1874754 %
  variable  3: 92.7115161 %
  variable  4: 93.4467647 %
  variable  5: 94.7722058 %
  variable  6: 94.1641858 %
  variable  7: 98.8111326 %
  variable  8: 90.5288375 %
  variable  9: 98.7224392 %
  variable 10: 71.3497269 %
  variable 11: 60.6130502 %
  variable 12: 72.5159808 %
  variable 13: 86.1849005 %
  variable 14: 90.5101438 %

significance loss when removing single variables:
  variable 2: corr =  2.39360129 % , sigma =  5.34997279
  variable 3: corr = 17.0885484  % , sigma = 38.194861
  variable 4: corr = 13.6464547  % , sigma = 30.5013878
  variable 5: corr = 13.2254713  % , sigma = 29.5604418
  variable 6: corr =  4.68618799 % , sigma = 10.4741664
  variable 7: corr =  7.6701268  % , sigma = 17.1436111
  variable 8: corr =
1.60539693 % , sigma = 3.58824584 variable 9: corr = 12.3585808 % , sigma = 27.6228423 variable 10: corr = 1.7290971 % , sigma = 3.86472987 variable 11: corr = 0.725154803 % , sigma = 1.62080396 variable 12: corr = 5.56286959 % , sigma = 12.4336501 variable 13: corr = 8.50347441 % , sigma = 19.0062384 variable 14: corr = 1.19836037 % , sigma = 2.67847256 Keep only 10 most significant input variables ------------------------------------- Teacher: actual network topology: Nodes(1) = 11 Nodes(2) = 15 Nodes(3) = 1 ------------------------------------- --------------------------------------------------- Iteration : 1 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 3 --> 26.9458523 sigma out 15 active outputs RANK 2 NODE 8 --> 21.5110531 sigma out 15 active outputs RANK 3 NODE 6 --> 18.6062202 sigma out 15 active outputs RANK 4 NODE 9 --> 18.4675083 sigma out 15 active outputs RANK 5 NODE 5 --> 15.9145184 sigma out 15 active outputs RANK 6 NODE 11 --> 15.8719721 sigma out 15 active outputs RANK 7 NODE 10 --> 15.8571959 sigma out 15 active outputs RANK 8 NODE 2 --> 15.0441341 sigma out 15 active outputs RANK 9 NODE 1 --> 14.5905857 sigma out 15 active outputs RANK 10 NODE 7 --> 13.5144815 sigma out 15 active outputs RANK 11 NODE 4 --> 11.0628014 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 27.9468918 sigma in 11act. ( 32.9131737 sig out 1act.) RANK 2 NODE 6 --> 23.9163208 sigma in 11act. ( 26.9178696 sig out 1act.) RANK 3 NODE 8 --> 22.4832573 sigma in 11act. ( 23.1017056 sig out 1act.) RANK 4 NODE 7 --> 21.0783825 sigma in 11act. ( 25.9055099 sig out 1act.) RANK 5 NODE 2 --> 19.7107677 sigma in 11act. ( 20.8691902 sig out 1act.) RANK 6 NODE 10 --> 16.8509731 sigma in 11act. ( 14.7282515 sig out 1act.) RANK 7 NODE 4 --> 11.7912512 sigma in 11act. ( 12.9937811 sig out 1act.) RANK 8 NODE 3 --> 10.4738226 sigma in 11act. ( 14.4088221 sig out 1act.) RANK 9 NODE 13 --> 6.75923014 sigma in 11act. ( 8.37155628 sig out 1act.) RANK 10 NODE 11 --> 6.52942705 sigma in 11act. ( 6.08545589 sig out 1act.) RANK 11 NODE 14 --> 5.04764748 sigma in 11act. ( 8.10703659 sig out 1act.) RANK 12 NODE 15 --> 3.36782527 sigma in 11act. ( 5.51614618 sig out 1act.) RANK 13 NODE 9 --> 3.24088883 sigma in 11act. ( 3.24924922 sig out 1act.) RANK 14 NODE 1 --> 2.94388056 sigma in 11act. ( 1.9756 sig out 1act.) RANK 15 NODE 12 --> 2.93166256 sigma in 11act. ( 5.40680218 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 32.9131737 sigma out 1act.( 27.9468918 sig in 11act.) RANK 2 NODE 6 --> 26.9178696 sigma out 1act.( 23.9163208 sig in 11act.) RANK 3 NODE 7 --> 25.9055099 sigma out 1act.( 21.0783825 sig in 11act.) RANK 4 NODE 8 --> 23.1017056 sigma out 1act.( 22.4832573 sig in 11act.) RANK 5 NODE 2 --> 20.8691902 sigma out 1act.( 19.7107677 sig in 11act.) RANK 6 NODE 10 --> 14.7282515 sigma out 1act.( 16.8509731 sig in 11act.) RANK 7 NODE 3 --> 14.4088221 sigma out 1act.( 10.4738226 sig in 11act.) RANK 8 NODE 4 --> 12.9937811 sigma out 1act.( 11.7912512 sig in 11act.) RANK 9 NODE 13 --> 8.37155628 sigma out 1act.( 6.75923014 sig in 11act.) RANK 10 NODE 14 --> 8.10703659 sigma out 1act.( 5.04764748 sig in 11act.) RANK 11 NODE 11 --> 6.08545589 sigma out 1act.( 6.52942705 sig in 11act.) RANK 12 NODE 15 --> 5.51614618 sigma out 1act.( 3.36782527 sig in 11act.) RANK 13 NODE 12 --> 5.40680218 sigma out 1act.( 2.93166256 sig in 11act.) RANK 14 NODE 9 --> 3.24924922 sigma out 1act.( 3.24088883 sig in 11act.) 
RANK 15 NODE 1 --> 1.9756 sigma out 1act.( 2.94388056 sig in 11act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 65.4865799 sigma in 15 active inputs SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 28.3699894 sigma out 15 active outputs RANK 2 NODE 3 --> 25.793272 sigma out 15 active outputs RANK 3 NODE 8 --> 21.6705322 sigma out 15 active outputs RANK 4 NODE 9 --> 20.4651852 sigma out 15 active outputs RANK 5 NODE 6 --> 19.2861538 sigma out 15 active outputs RANK 6 NODE 1 --> 18.263361 sigma out 15 active outputs RANK 7 NODE 5 --> 17.0017681 sigma out 15 active outputs RANK 8 NODE 11 --> 16.6506786 sigma out 15 active outputs RANK 9 NODE 2 --> 15.8357019 sigma out 15 active outputs RANK 10 NODE 7 --> 15.6720247 sigma out 15 active outputs RANK 11 NODE 4 --> 13.0523863 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 5 --> 28.554184 sigma in 11act. ( 30.818491 sig out 1act.) RANK 2 NODE 6 --> 24.8532848 sigma in 11act. ( 24.5487823 sig out 1act.) RANK 3 NODE 7 --> 24.1649437 sigma in 11act. ( 24.2029686 sig out 1act.) RANK 4 NODE 8 --> 20.5180798 sigma in 11act. ( 22.0634556 sig out 1act.) RANK 5 NODE 2 --> 20.1346283 sigma in 11act. ( 20.678175 sig out 1act.) RANK 6 NODE 3 --> 16.795332 sigma in 11act. ( 13.3160744 sig out 1act.) RANK 7 NODE 10 --> 14.5054092 sigma in 11act. ( 12.7613783 sig out 1act.) RANK 8 NODE 14 --> 13.3960142 sigma in 11act. ( 9.29443836 sig out 1act.) RANK 9 NODE 4 --> 13.2572947 sigma in 11act. ( 13.6489468 sig out 1act.) RANK 10 NODE 12 --> 12.5689411 sigma in 11act. ( 8.08300686 sig out 1act.) RANK 11 NODE 15 --> 11.0319052 sigma in 11act. ( 7.7964735 sig out 1act.) RANK 12 NODE 13 --> 10.6134863 sigma in 11act. ( 8.39522934 sig out 1act.) RANK 13 NODE 9 --> 9.59814358 sigma in 11act. ( 3.97716475 sig out 1act.) RANK 14 NODE 1 --> 8.0242691 sigma in 11act. ( 2.74525499 sig out 1act.) RANK 15 NODE 11 --> 7.56549788 sigma in 11act. ( 5.53625584 sig out 1act.) sorted by output significance RANK 1 NODE 5 --> 30.818491 sigma out 1act.( 28.554184 sig in 11act.) RANK 2 NODE 6 --> 24.5487823 sigma out 1act.( 24.8532848 sig in 11act.) RANK 3 NODE 7 --> 24.2029686 sigma out 1act.( 24.1649437 sig in 11act.) RANK 4 NODE 8 --> 22.0634556 sigma out 1act.( 20.5180798 sig in 11act.) RANK 5 NODE 2 --> 20.678175 sigma out 1act.( 20.1346283 sig in 11act.) RANK 6 NODE 4 --> 13.6489468 sigma out 1act.( 13.2572947 sig in 11act.) RANK 7 NODE 3 --> 13.3160744 sigma out 1act.( 16.795332 sig in 11act.) RANK 8 NODE 10 --> 12.7613783 sigma out 1act.( 14.5054092 sig in 11act.) RANK 9 NODE 14 --> 9.29443836 sigma out 1act.( 13.3960142 sig in 11act.) RANK 10 NODE 13 --> 8.39522934 sigma out 1act.( 10.6134863 sig in 11act.) RANK 11 NODE 12 --> 8.08300686 sigma out 1act.( 12.5689411 sig in 11act.) RANK 12 NODE 15 --> 7.7964735 sigma out 1act.( 11.0319052 sig in 11act.) RANK 13 NODE 11 --> 5.53625584 sigma out 1act.( 7.56549788 sig in 11act.) RANK 14 NODE 9 --> 3.97716475 sigma out 1act.( 9.59814358 sig in 11act.) RANK 15 NODE 1 --> 2.74525499 sigma out 1act.( 8.0242691 sig in 11act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 62.5783043 sigma in 15 active inputs *********************************************** *** Learn Path 1 *** loss function: -0.42339161 *** contribution from regularisation: 0.00419000443 *** contribution from error: -0.427581608 *********************************************** -----------------> Test sample --------------------------------------------------- Iteration : 2 *********************************************** *** Learn Path 2 *** loss function: -0.483877152 *** contribution from regularisation: 0.00247493386 *** contribution from error: -0.486352086 *********************************************** -----------------> Test sample ENTER BFGS code START -51086.2088 0.118732132 -0.294145525 EXIT FROM BFGS code FG_START 0. 0.118732132 0. --------------------------------------------------- Iteration : 3 *********************************************** *** Learn Path 3 *** loss function: -0.503941357 *** contribution from regularisation: 0.00245199259 *** contribution from error: -0.506393373 *********************************************** -----------------> Test sample ENTER BFGS code FG_START -53193.5328 0.118732132 -373.500793 EXIT FROM BFGS code FG_LNSRCH 0. 0.0721977055 0. --------------------------------------------------- Iteration : 4 *********************************************** *** Learn Path 4 *** loss function: -0.540909946 *** contribution from regularisation: 0.0030444134 *** contribution from error: -0.543954372 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57095.7499 0.0721977055 40.2930145 EXIT FROM BFGS code NEW_X -57095.7499 0.0721977055 40.2930145 ENTER BFGS code NEW_X -57095.7499 0.0721977055 40.2930145 EXIT FROM BFGS code FG_LNSRCH 0. 0.0768197998 0. --------------------------------------------------- Iteration : 5 *********************************************** *** Learn Path 5 *** loss function: -0.541831613 *** contribution from regularisation: 0.00304895453 *** contribution from error: -0.544880569 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57193.0335 0.0768197998 5.87434578 EXIT FROM BFGS code NEW_X -57193.0335 0.0768197998 5.87434578 ENTER BFGS code NEW_X -57193.0335 0.0768197998 5.87434578 EXIT FROM BFGS code FG_LNSRCH 0. 0.0748036206 0. --------------------------------------------------- Iteration : 6 *********************************************** *** Learn Path 6 *** loss function: -0.543139756 *** contribution from regularisation: 0.00303177675 *** contribution from error: -0.546171546 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57331.1166 0.0748036206 0.793750286 EXIT FROM BFGS code NEW_X -57331.1166 0.0748036206 0.793750286 ENTER BFGS code NEW_X -57331.1166 0.0748036206 0.793750286 EXIT FROM BFGS code FG_LNSRCH 0. 0.0764805749 0. --------------------------------------------------- Iteration : 7 *********************************************** *** Learn Path 7 *** loss function: -0.544421375 *** contribution from regularisation: 0.00302354363 *** contribution from error: -0.54744494 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57466.3987 0.0764805749 -37.7548103 EXIT FROM BFGS code NEW_X -57466.3987 0.0764805749 -37.7548103 ENTER BFGS code NEW_X -57466.3987 0.0764805749 -37.7548103 EXIT FROM BFGS code FG_LNSRCH 0. 0.065516293 0. 
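Note: each "Learn Path" block above reports the total loss as the sum of a regularisation contribution and an error (entropy) contribution; for Learn Path 1, 0.00419000443 + (-0.427581608) = -0.423391604, which agrees with the printed -0.42339161 up to single-precision rounding. The sketch below shows such a decomposition for a weighted cross-entropy loss with a simple quadratic weight penalty; the sign convention, the penalty form, and `lam` are illustrative assumptions, not the NeuroBayes definitions.

```python
import numpy as np

def entropy_loss_with_reg(p, target, weight, net_weights, lam=1e-4):
    """Weighted cross-entropy plus a quadratic penalty on the network weights.
    p: network outputs in (0, 1); lam: assumed regularisation strength."""
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    err = -np.average(target * np.log(p) + (1.0 - target) * np.log(1.0 - p),
                      weights=weight)
    reg = lam * np.sum(net_weights ** 2)
    return err + reg, err, reg   # total, error part, regularisation part
```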
--------------------------------------------------- Iteration : 8 *********************************************** *** Learn Path 8 *** loss function: -0.545261383 *** contribution from regularisation: 0.00308342255 *** contribution from error: -0.548344791 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57555.0646 0.065516293 74.8629456 EXIT FROM BFGS code NEW_X -57555.0646 0.065516293 74.8629456 ENTER BFGS code NEW_X -57555.0646 0.065516293 74.8629456 EXIT FROM BFGS code FG_LNSRCH 0. 0.0756250769 0. --------------------------------------------------- Iteration : 9 *********************************************** *** Learn Path 9 *** loss function: -0.546416104 *** contribution from regularisation: 0.00309423148 *** contribution from error: -0.54951036 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57676.949 0.0756250769 55.9413223 EXIT FROM BFGS code NEW_X -57676.949 0.0756250769 55.9413223 ENTER BFGS code NEW_X -57676.949 0.0756250769 55.9413223 EXIT FROM BFGS code FG_LNSRCH 0. 0.0964820459 0. --------------------------------------------------- Iteration : 10 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 39.92099 sigma out 15 active outputs RANK 2 NODE 1 --> 25.9528446 sigma out 15 active outputs RANK 3 NODE 9 --> 24.3668442 sigma out 15 active outputs RANK 4 NODE 2 --> 23.9690094 sigma out 15 active outputs RANK 5 NODE 11 --> 22.2505207 sigma out 15 active outputs RANK 6 NODE 7 --> 21.2170696 sigma out 15 active outputs RANK 7 NODE 4 --> 20.1703339 sigma out 15 active outputs RANK 8 NODE 8 --> 17.0408077 sigma out 15 active outputs RANK 9 NODE 3 --> 16.5351353 sigma out 15 active outputs RANK 10 NODE 5 --> 15.1332245 sigma out 15 active outputs RANK 11 NODE 6 --> 14.2071733 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 6 --> 30.7264805 sigma in 11act. ( 26.466423 sig out 1act.) RANK 2 NODE 7 --> 23.9853592 sigma in 11act. ( 25.5082817 sig out 1act.) RANK 3 NODE 9 --> 23.7182961 sigma in 11act. ( 20.09828 sig out 1act.) RANK 4 NODE 2 --> 22.3153172 sigma in 11act. ( 21.9036484 sig out 1act.) RANK 5 NODE 13 --> 21.8791714 sigma in 11act. ( 21.6922035 sig out 1act.) RANK 6 NODE 15 --> 20.0985508 sigma in 11act. ( 15.9753513 sig out 1act.) RANK 7 NODE 12 --> 19.9987774 sigma in 11act. ( 16.7008953 sig out 1act.) RANK 8 NODE 1 --> 19.2358055 sigma in 11act. ( 17.5730515 sig out 1act.) RANK 9 NODE 5 --> 18.4634876 sigma in 11act. ( 17.3585052 sig out 1act.) RANK 10 NODE 8 --> 17.6183472 sigma in 11act. ( 15.2676096 sig out 1act.) RANK 11 NODE 4 --> 15.807703 sigma in 11act. ( 12.5601149 sig out 1act.) RANK 12 NODE 14 --> 14.9670277 sigma in 11act. ( 10.7926617 sig out 1act.) RANK 13 NODE 3 --> 13.4603233 sigma in 11act. ( 9.69537544 sig out 1act.) RANK 14 NODE 11 --> 11.1845837 sigma in 11act. ( 9.35549068 sig out 1act.) RANK 15 NODE 10 --> 10.8721247 sigma in 11act. ( 10.5190716 sig out 1act.) sorted by output significance RANK 1 NODE 6 --> 26.466423 sigma out 1act.( 30.7264805 sig in 11act.) RANK 2 NODE 7 --> 25.5082817 sigma out 1act.( 23.9853592 sig in 11act.) RANK 3 NODE 2 --> 21.9036484 sigma out 1act.( 22.3153172 sig in 11act.) RANK 4 NODE 13 --> 21.6922035 sigma out 1act.( 21.8791714 sig in 11act.) RANK 5 NODE 9 --> 20.09828 sigma out 1act.( 23.7182961 sig in 11act.) RANK 6 NODE 1 --> 17.5730515 sigma out 1act.( 19.2358055 sig in 11act.) 
RANK 7 NODE 5 --> 17.3585052 sigma out 1act.( 18.4634876 sig in 11act.) RANK 8 NODE 12 --> 16.7008953 sigma out 1act.( 19.9987774 sig in 11act.) RANK 9 NODE 15 --> 15.9753513 sigma out 1act.( 20.0985508 sig in 11act.) RANK 10 NODE 8 --> 15.2676096 sigma out 1act.( 17.6183472 sig in 11act.) RANK 11 NODE 4 --> 12.5601149 sigma out 1act.( 15.807703 sig in 11act.) RANK 12 NODE 14 --> 10.7926617 sigma out 1act.( 14.9670277 sig in 11act.) RANK 13 NODE 10 --> 10.5190716 sigma out 1act.( 10.8721247 sig in 11act.) RANK 14 NODE 3 --> 9.69537544 sigma out 1act.( 13.4603233 sig in 11act.) RANK 15 NODE 11 --> 9.35549068 sigma out 1act.( 11.1845837 sig in 11act.) SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 68.1849136 sigma in 15 active inputs *********************************************** *** Learn Path 10 *** loss function: -0.547208905 *** contribution from regularisation: 0.00320121273 *** contribution from error: -0.550410092 *********************************************** -----------------> Test sample Iteration No: 10 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -57760.6336 0.0964820459 38.0596504 EXIT FROM BFGS code NEW_X -57760.6336 0.0964820459 38.0596504 ENTER BFGS code NEW_X -57760.6336 0.0964820459 38.0596504 EXIT FROM BFGS code FG_LNSRCH 0. 0.116767444 0. --------------------------------------------------- Iteration : 11 *********************************************** *** Learn Path 11 *** loss function: -0.547738194 *** contribution from regularisation: 0.00311928242 *** contribution from error: -0.550857484 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57816.502 0.116767444 -46.2280731 EXIT FROM BFGS code NEW_X -57816.502 0.116767444 -46.2280731 ENTER BFGS code NEW_X -57816.502 0.116767444 -46.2280731 EXIT FROM BFGS code FG_LNSRCH 0. 0.122773521 0. --------------------------------------------------- Iteration : 12 *********************************************** *** Learn Path 12 *** loss function: -0.54816258 *** contribution from regularisation: 0.00309002749 *** contribution from error: -0.551252604 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57861.3013 0.122773521 -43.9517899 EXIT FROM BFGS code NEW_X -57861.3013 0.122773521 -43.9517899 ENTER BFGS code NEW_X -57861.3013 0.122773521 -43.9517899 EXIT FROM BFGS code FG_LNSRCH 0. 0.170492738 0. --------------------------------------------------- Iteration : 13 *********************************************** *** Learn Path 13 *** loss function: -0.549420416 *** contribution from regularisation: 0.00337292091 *** contribution from error: -0.552793324 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57994.0714 0.170492738 -83.0239487 EXIT FROM BFGS code NEW_X -57994.0714 0.170492738 -83.0239487 ENTER BFGS code NEW_X -57994.0714 0.170492738 -83.0239487 EXIT FROM BFGS code FG_LNSRCH 0. 0.138603181 0. 
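Note: the preprocessing block near the top of this log (global flag 812: each input is equalised, mapped to a Gaussian with mean 0 and sigma 1, and the inputs are then decorrelated; the Transdef tables hold the per-variable percentile boundaries) corresponds to a rank-based gaussianisation followed by a whitening rotation. A schematic NumPy/SciPy illustration of that idea, not the NeuroBayes implementation:

```python
import numpy as np
from scipy.stats import norm
from scipy.interpolate import interp1d

def gaussianise(x, bins=100):
    """Flatten one variable with its own percentile table (cf. Transdef)
    and map the flat distribution onto a unit Gaussian."""
    edges = np.unique(np.percentile(x, np.linspace(0.0, 100.0, bins + 1)))
    flat = interp1d(edges, np.linspace(0.0, 1.0, len(edges)),
                    bounds_error=False, fill_value=(0.0, 1.0))(x)
    eps = 0.5 / len(x)                                   # keep away from exactly 0 or 1
    return norm.ppf(np.clip(flat, eps, 1.0 - eps))

def decorrelate(X):
    """Rotate and rescale the gaussianised inputs so their covariance is ~unit."""
    evals, evecs = np.linalg.eigh(np.cov(X, rowvar=False))
    return (X - X.mean(axis=0)) @ evecs / np.sqrt(np.clip(evals, 1e-12, None))
```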
--------------------------------------------------- Iteration : 14 *********************************************** *** Learn Path 14 *** loss function: -0.549241364 *** contribution from regularisation: 0.00314661115 *** contribution from error: -0.552387953 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -57975.1744 0.138603181 203.697937 EXIT FROM BFGS code FG_LNSRCH 0. 0.155394703 0. --------------------------------------------------- Iteration : 15 *********************************************** *** Learn Path 15 *** loss function: -0.550112009 *** contribution from regularisation: 0.00316618802 *** contribution from error: -0.553278208 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58067.076 0.155394703 47.2034149 EXIT FROM BFGS code NEW_X -58067.076 0.155394703 47.2034149 ENTER BFGS code NEW_X -58067.076 0.155394703 47.2034149 EXIT FROM BFGS code FG_LNSRCH 0. 0.150979504 0. --------------------------------------------------- Iteration : 16 *********************************************** *** Learn Path 16 *** loss function: -0.550342917 *** contribution from regularisation: 0.00307478826 *** contribution from error: -0.553417683 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58091.4448 0.150979504 -14.3785448 EXIT FROM BFGS code NEW_X -58091.4448 0.150979504 -14.3785448 ENTER BFGS code NEW_X -58091.4448 0.150979504 -14.3785448 EXIT FROM BFGS code FG_LNSRCH 0. 0.148662835 0. --------------------------------------------------- Iteration : 17 *********************************************** *** Learn Path 17 *** loss function: -0.550463021 *** contribution from regularisation: 0.00315378141 *** contribution from error: -0.553616822 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58104.1227 0.148662835 -5.91854095 EXIT FROM BFGS code NEW_X -58104.1227 0.148662835 -5.91854095 ENTER BFGS code NEW_X -58104.1227 0.148662835 -5.91854095 EXIT FROM BFGS code FG_LNSRCH 0. 0.141062006 0. --------------------------------------------------- Iteration : 18 *********************************************** *** Learn Path 18 *** loss function: -0.55125159 *** contribution from regularisation: 0.00331845763 *** contribution from error: -0.554570019 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58187.3601 0.141062006 -39.6439095 EXIT FROM BFGS code NEW_X -58187.3601 0.141062006 -39.6439095 ENTER BFGS code NEW_X -58187.3601 0.141062006 -39.6439095 EXIT FROM BFGS code FG_LNSRCH 0. 0.104981028 0. --------------------------------------------------- Iteration : 19 *********************************************** *** Learn Path 19 *** loss function: -0.551850617 *** contribution from regularisation: 0.00312716607 *** contribution from error: -0.554977775 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58250.5891 0.104981028 111.337311 EXIT FROM BFGS code NEW_X -58250.5891 0.104981028 111.337311 ENTER BFGS code NEW_X -58250.5891 0.104981028 111.337311 EXIT FROM BFGS code FG_LNSRCH 0. 0.107195437 0. 
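Note: the ENTER/EXIT BFGS lines interleaved with the iterations trace the optimiser's line-search states (FG_START, FG_LNSRCH, NEW_X). Outside NeuroBayes, the same pattern, a flattened weight vector, a loss-and-gradient callback, and a fixed iteration budget, could be driven by SciPy's BFGS; this is only an analogy to the logged behaviour, not the teacher's code, and `loss_and_grad` is a hypothetical callback.

```python
from scipy.optimize import minimize

def fit_weights(loss_and_grad, w0, max_iter=250):
    """Minimise the training loss over the flattened network weights with BFGS.
    loss_and_grad(w) must return (loss, gradient)."""
    history = []
    result = minimize(loss_and_grad, w0, jac=True, method="BFGS",
                      callback=lambda w: history.append(loss_and_grad(w)[0]),
                      options={"maxiter": max_iter})
    return result.x, history   # optimised weights and per-iteration loss values
```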
--------------------------------------------------- Iteration : 20 SIGNIFICANCE OF OUTPUTS IN LAYER 1 RANK 1 NODE 10 --> 43.1779175 sigma out 15 active outputs RANK 2 NODE 4 --> 41.9179993 sigma out 15 active outputs RANK 3 NODE 7 --> 37.4937172 sigma out 15 active outputs RANK 4 NODE 1 --> 35.4011726 sigma out 15 active outputs RANK 5 NODE 11 --> 31.1814976 sigma out 15 active outputs RANK 6 NODE 8 --> 30.6212311 sigma out 15 active outputs RANK 7 NODE 5 --> 28.5888405 sigma out 15 active outputs RANK 8 NODE 2 --> 26.2302551 sigma out 15 active outputs RANK 9 NODE 9 --> 22.5228996 sigma out 15 active outputs RANK 10 NODE 3 --> 20.6908665 sigma out 15 active outputs RANK 11 NODE 6 --> 18.8796844 sigma out 15 active outputs SIGNIFICANCE OF INPUTS TO LAYER 2 sorted by input significance RANK 1 NODE 2 --> 45.4699554 sigma in 11act. ( 46.1368523 sig out 1act.) RANK 2 NODE 13 --> 42.0315781 sigma in 11act. ( 44.2143402 sig out 1act.) RANK 3 NODE 15 --> 33.0236168 sigma in 11act. ( 32.7936134 sig out 1act.) RANK 4 NODE 7 --> 31.4798965 sigma in 11act. ( 35.1596565 sig out 1act.) RANK 5 NODE 1 --> 30.7845211 sigma in 11act. ( 32.864502 sig out 1act.) RANK 6 NODE 12 --> 25.8272781 sigma in 11act. ( 25.7369747 sig out 1act.) RANK 7 NODE 5 --> 25.3437099 sigma in 11act. ( 24.2370758 sig out 1act.) RANK 8 NODE 6 --> 24.74119 sigma in 11act. ( 22.9959526 sig out 1act.) RANK 9 NODE 9 --> 20.8168697 sigma in 11act. ( 21.6485977 sig out 1act.) RANK 10 NODE 3 --> 19.7212791 sigma in 11act. ( 18.8935642 sig out 1act.) RANK 11 NODE 8 --> 19.6343288 sigma in 11act. ( 18.2217407 sig out 1act.) RANK 12 NODE 10 --> 18.4635887 sigma in 11act. ( 19.94137 sig out 1act.) RANK 13 NODE 4 --> 15.9494476 sigma in 11act. ( 16.6123524 sig out 1act.) RANK 14 NODE 14 --> 14.2561541 sigma in 11act. ( 13.2021284 sig out 1act.) RANK 15 NODE 11 --> 14.0149498 sigma in 11act. ( 14.0390892 sig out 1act.) sorted by output significance RANK 1 NODE 2 --> 46.1368523 sigma out 1act.( 45.4699554 sig in 11act.) RANK 2 NODE 13 --> 44.2143402 sigma out 1act.( 42.0315781 sig in 11act.) RANK 3 NODE 7 --> 35.1596565 sigma out 1act.( 31.4798965 sig in 11act.) RANK 4 NODE 1 --> 32.864502 sigma out 1act.( 30.7845211 sig in 11act.) RANK 5 NODE 15 --> 32.7936134 sigma out 1act.( 33.0236168 sig in 11act.) RANK 6 NODE 12 --> 25.7369747 sigma out 1act.( 25.8272781 sig in 11act.) RANK 7 NODE 5 --> 24.2370758 sigma out 1act.( 25.3437099 sig in 11act.) RANK 8 NODE 6 --> 22.9959526 sigma out 1act.( 24.74119 sig in 11act.) RANK 9 NODE 9 --> 21.6485977 sigma out 1act.( 20.8168697 sig in 11act.) RANK 10 NODE 10 --> 19.94137 sigma out 1act.( 18.4635887 sig in 11act.) RANK 11 NODE 3 --> 18.8935642 sigma out 1act.( 19.7212791 sig in 11act.) RANK 12 NODE 8 --> 18.2217407 sigma out 1act.( 19.6343288 sig in 11act.) RANK 13 NODE 4 --> 16.6123524 sigma out 1act.( 15.9494476 sig in 11act.) RANK 14 NODE 11 --> 14.0390892 sigma out 1act.( 14.0149498 sig in 11act.) RANK 15 NODE 14 --> 13.2021284 sigma out 1act.( 14.2561541 sig in 11act.) 
SIGNIFICANCE OF INPUTS TO LAYER 3 RANK 1 NODE 1 --> 107.035217 sigma in 15 active inputs *********************************************** *** Learn Path 20 *** loss function: -0.552150726 *** contribution from regularisation: 0.00318621215 *** contribution from error: -0.555336952 *********************************************** -----------------> Test sample Iteration No: 20 ********************************************** ***** write out current network **** ***** to "rescue.nb" **** ********************************************** SAVING EXPERTISE TO rescue.nb ENTER BFGS code FG_LNSRCH -58282.2704 0.107195437 14.1406269 EXIT FROM BFGS code NEW_X -58282.2704 0.107195437 14.1406269 ENTER BFGS code NEW_X -58282.2704 0.107195437 14.1406269 EXIT FROM BFGS code FG_LNSRCH 0. 0.103701167 0. --------------------------------------------------- Iteration : 21 *********************************************** *** Learn Path 21 *** loss function: -0.552275836 *** contribution from regularisation: 0.00326378876 *** contribution from error: -0.555539608 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58295.4784 0.103701167 15.8020554 EXIT FROM BFGS code NEW_X -58295.4784 0.103701167 15.8020554 ENTER BFGS code NEW_X -58295.4784 0.103701167 15.8020554 EXIT FROM BFGS code FG_LNSRCH 0. 0.108261086 0. --------------------------------------------------- Iteration : 22 *********************************************** *** Learn Path 22 *** loss function: -0.552511573 *** contribution from regularisation: 0.00324745406 *** contribution from error: -0.555759013 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58320.3615 0.108261086 -40.4865189 EXIT FROM BFGS code NEW_X -58320.3615 0.108261086 -40.4865189 ENTER BFGS code NEW_X -58320.3615 0.108261086 -40.4865189 EXIT FROM BFGS code FG_LNSRCH 0. 0.107608549 0. --------------------------------------------------- Iteration : 23 *********************************************** *** Learn Path 23 *** loss function: -0.552880049 *** contribution from regularisation: 0.00331535004 *** contribution from error: -0.556195378 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58359.2514 0.107608549 30.9646931 EXIT FROM BFGS code NEW_X -58359.2514 0.107608549 30.9646931 ENTER BFGS code NEW_X -58359.2514 0.107608549 30.9646931 EXIT FROM BFGS code FG_LNSRCH 0. 0.128585204 0. --------------------------------------------------- Iteration : 24 *********************************************** *** Learn Path 24 *** loss function: -0.553225636 *** contribution from regularisation: 0.0033559422 *** contribution from error: -0.556581557 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58395.7326 0.128585204 -52.9147224 EXIT FROM BFGS code NEW_X -58395.7326 0.128585204 -52.9147224 ENTER BFGS code NEW_X -58395.7326 0.128585204 -52.9147224 EXIT FROM BFGS code FG_LNSRCH 0. 0.131721124 0. 
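Note: the "ROUND n: MAX CORR ... AFTER KILLING INPUT VARIABLE ... CONTR ..." lines in the variable analysis earlier in this log describe a greedy backward elimination: in each round the variable whose removal leaves the largest total correlation to the target is dropped, and the correlation lost at that point is quoted as its contribution. A compact sketch of that ranking idea, using a plain multiple-correlation estimate as a stand-in for the NeuroBayes measure:

```python
import numpy as np

def total_correlation(X, y):
    """Multiple correlation of the columns of X with the target y."""
    R = np.atleast_2d(np.corrcoef(X, rowvar=False))
    c = np.array([np.corrcoef(X[:, i], y)[0, 1] for i in range(X.shape[1])])
    return float(np.sqrt(max(c @ np.linalg.pinv(R) @ c, 0.0)))

def greedy_ranking(X, y):
    """Drop, round by round, the variable whose removal hurts least."""
    alive = list(range(X.shape[1]))
    order = []                                   # filled least relevant first
    while len(alive) > 1:
        drop = max(alive, key=lambda j: total_correlation(
            X[:, [k for k in alive if k != j]], y))
        order.append(drop)
        alive.remove(drop)
    order.append(alive[0])                       # last remaining = most relevant
    return order[::-1]                           # most relevant first
```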
--------------------------------------------------- Iteration : 25 *********************************************** *** Learn Path 25 *** loss function: -0.553543985 *** contribution from regularisation: 0.00323977182 *** contribution from error: -0.556783736 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58429.338 0.131721124 10.9683743 EXIT FROM BFGS code NEW_X -58429.338 0.131721124 10.9683743 ENTER BFGS code NEW_X -58429.338 0.131721124 10.9683743 EXIT FROM BFGS code FG_LNSRCH 0. 0.139288738 0. --------------------------------------------------- Iteration : 26 *********************************************** *** Learn Path 26 *** loss function: -0.553649366 *** contribution from regularisation: 0.00321845943 *** contribution from error: -0.556867838 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58440.4561 0.139288738 32.7896881 EXIT FROM BFGS code NEW_X -58440.4561 0.139288738 32.7896881 ENTER BFGS code NEW_X -58440.4561 0.139288738 32.7896881 EXIT FROM BFGS code FG_LNSRCH 0. 0.161445022 0. --------------------------------------------------- Iteration : 27 *********************************************** *** Learn Path 27 *** loss function: -0.553800404 *** contribution from regularisation: 0.00322263013 *** contribution from error: -0.557023048 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58456.4033 0.161445022 0.242140248 EXIT FROM BFGS code NEW_X -58456.4033 0.161445022 0.242140248 ENTER BFGS code NEW_X -58456.4033 0.161445022 0.242140248 EXIT FROM BFGS code FG_LNSRCH 0. 0.18088837 0. --------------------------------------------------- Iteration : 28 *********************************************** *** Learn Path 28 *** loss function: -0.553806305 *** contribution from regularisation: 0.0032523647 *** contribution from error: -0.557058692 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58457.0241 0.18088837 114.959885 EXIT FROM BFGS code NEW_X -58457.0241 0.18088837 114.959885 ENTER BFGS code NEW_X -58457.0241 0.18088837 114.959885 EXIT FROM BFGS code FG_LNSRCH 0. 0.184447736 0. --------------------------------------------------- Iteration : 29 *********************************************** *** Learn Path 29 *** loss function: -0.553950131 *** contribution from regularisation: 0.00327493437 *** contribution from error: -0.557225049 *********************************************** -----------------> Test sample ENTER BFGS code FG_LNSRCH -58472.2071 0.184447736 -16.8216991 EXIT FROM BFGS code NEW_X -58472.2071 0.184447736 -16.8216991 ENTER BFGS code NEW_X -58472.2071 0.184447736 -16.8216991 EXIT FROM BFGS code FG_LNSRCH 0. 0.18337062 0. 
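Note: since the same "Learn Path" block repeats for every one of the 250 requested iterations, convergence is easiest to judge by pulling the loss values out of the saved log and plotting the trend. A small parser, assuming the teacher output has been written to a plain-text file (the filename is hypothetical):

```python
import re

def loss_curve(path="teacher.log"):
    """Return (iteration, total loss) pairs from a NeuroBayes teacher log."""
    pattern = re.compile(r"Learn Path\s+(\d+).*?loss function:\s*(-?[\d.]+)", re.S)
    with open(path) as f:
        pairs = pattern.findall(f.read())
    return [(int(n), float(v)) for n, v in pairs]
```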
---------------------------------------------------
Iteration : 30
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 10 --> 50.9665833 sigma out 15 active outputs
RANK 2 NODE 1 --> 46.3433685 sigma out 15 active outputs
RANK 3 NODE 4 --> 46.0197754 sigma out 15 active outputs
RANK 4 NODE 8 --> 43.7914085 sigma out 15 active outputs
RANK 5 NODE 7 --> 42.7455215 sigma out 15 active outputs
RANK 6 NODE 11 --> 41.0332375 sigma out 15 active outputs
RANK 7 NODE 5 --> 40.5267792 sigma out 15 active outputs
RANK 8 NODE 2 --> 29.8050938 sigma out 15 active outputs
RANK 9 NODE 9 --> 28.008009 sigma out 15 active outputs
RANK 10 NODE 3 --> 27.455864 sigma out 15 active outputs
RANK 11 NODE 6 --> 22.6787739 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 2 --> 59.7182236 sigma in 11act. ( 58.5778046 sig out 1act.)
RANK 2 NODE 15 --> 47.7787933 sigma in 11act. ( 46.6735268 sig out 1act.)
RANK 3 NODE 13 --> 47.5736198 sigma in 11act. ( 47.3580284 sig out 1act.)
RANK 4 NODE 1 --> 37.6013756 sigma in 11act. ( 39.9955101 sig out 1act.)
RANK 5 NODE 7 --> 34.9953384 sigma in 11act. ( 40.120079 sig out 1act.)
RANK 6 NODE 12 --> 34.6740685 sigma in 11act. ( 35.167244 sig out 1act.)
RANK 7 NODE 3 --> 29.854311 sigma in 11act. ( 30.7849636 sig out 1act.)
RANK 8 NODE 10 --> 27.6530876 sigma in 11act. ( 30.8048687 sig out 1act.)
RANK 9 NODE 5 --> 27.1749592 sigma in 11act. ( 26.4883499 sig out 1act.)
RANK 10 NODE 8 --> 23.1872349 sigma in 11act. ( 22.9582176 sig out 1act.)
RANK 11 NODE 4 --> 22.5153141 sigma in 11act. ( 25.159502 sig out 1act.)
RANK 12 NODE 6 --> 22.3601627 sigma in 11act. ( 21.3575916 sig out 1act.)
RANK 13 NODE 9 --> 22.0580769 sigma in 11act. ( 23.2668743 sig out 1act.)
RANK 14 NODE 11 --> 17.2250957 sigma in 11act. ( 18.2259407 sig out 1act.)
RANK 15 NODE 14 --> 14.1294985 sigma in 11act. ( 14.061182 sig out 1act.)
sorted by output significance
RANK 1 NODE 2 --> 58.5778046 sigma out 1act.( 59.7182236 sig in 11act.)
RANK 2 NODE 13 --> 47.3580284 sigma out 1act.( 47.5736198 sig in 11act.)
RANK 3 NODE 15 --> 46.6735268 sigma out 1act.( 47.7787933 sig in 11act.)
RANK 4 NODE 7 --> 40.120079 sigma out 1act.( 34.9953384 sig in 11act.)
RANK 5 NODE 1 --> 39.9955101 sigma out 1act.( 37.6013756 sig in 11act.)
RANK 6 NODE 12 --> 35.167244 sigma out 1act.( 34.6740685 sig in 11act.)
RANK 7 NODE 10 --> 30.8048687 sigma out 1act.( 27.6530876 sig in 11act.)
RANK 8 NODE 3 --> 30.7849636 sigma out 1act.( 29.854311 sig in 11act.)
RANK 9 NODE 5 --> 26.4883499 sigma out 1act.( 27.1749592 sig in 11act.)
RANK 10 NODE 4 --> 25.159502 sigma out 1act.( 22.5153141 sig in 11act.)
RANK 11 NODE 9 --> 23.2668743 sigma out 1act.( 22.0580769 sig in 11act.)
RANK 12 NODE 8 --> 22.9582176 sigma out 1act.( 23.1872349 sig in 11act.)
RANK 13 NODE 6 --> 21.3575916 sigma out 1act.( 22.3601627 sig in 11act.)
RANK 14 NODE 11 --> 18.2259407 sigma out 1act.( 17.2250957 sig in 11act.)
RANK 15 NODE 14 --> 14.061182 sigma out 1act.( 14.1294985 sig in 11act.)
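Note: the significance tables above rank the hidden nodes by their significance in units of sigma; the ordering is a plain descending sort on that value (the layer-3 summary follows below). The sketch reproduces such a ranking from (node, sigma) pairs copied from the "sorted by output significance" listing at iteration 30; the column formatting is only approximate.

# Reproduce a "RANK ... NODE ... sigma" table by sorting nodes on significance.
# The (node, sigma) pairs are copied from the layer-2 "sorted by output
# significance" listing at iteration 30; formatting is only approximate.
significances = {
    2: 58.5778046, 13: 47.3580284, 15: 46.6735268, 7: 40.120079,
    1: 39.9955101, 12: 35.167244, 10: 30.8048687, 3: 30.7849636,
}

ranked = sorted(significances.items(), key=lambda kv: kv[1], reverse=True)
for rank, (node, sigma) in enumerate(ranked, start=1):
    print(f"RANK {rank:3d} NODE {node:3d} --> {sigma:12.7f} sigma")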
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 132.605881 sigma in 15 active inputs
***********************************************
*** Learn Path 30
*** loss function: -0.553997934
*** contribution from regularisation: 0.00327007007
*** contribution from error: -0.557268023
***********************************************
-----------------> Test sample
Iteration No: 30
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -58477.2542 0.18337062 -27.9480495
EXIT FROM BFGS code NEW_X -58477.2542 0.18337062 -27.9480495
ENTER BFGS code NEW_X -58477.2542 0.18337062 -27.9480495
EXIT FROM BFGS code FG_LNSRCH 0. 0.181792617 0.
---------------------------------------------------
Iteration : 31
***********************************************
*** Learn Path 31
*** loss function: -0.554183006
*** contribution from regularisation: 0.00328728999
*** contribution from error: -0.557470322
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58496.7898 0.181792617 -41.6418266
EXIT FROM BFGS code NEW_X -58496.7898 0.181792617 -41.6418266
ENTER BFGS code NEW_X -58496.7898 0.181792617 -41.6418266
EXIT FROM BFGS code FG_LNSRCH 0. 0.182028219 0.
---------------------------------------------------
Iteration : 32
***********************************************
*** Learn Path 32
*** loss function: -0.554442227
*** contribution from regularisation: 0.00330204819
*** contribution from error: -0.557744265
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58524.1473 0.182028219 -38.4731712
EXIT FROM BFGS code NEW_X -58524.1473 0.182028219 -38.4731712
ENTER BFGS code NEW_X -58524.1473 0.182028219 -38.4731712
EXIT FROM BFGS code FG_LNSRCH 0. 0.181902245 0.
---------------------------------------------------
Iteration : 33
***********************************************
*** Learn Path 33
*** loss function: -0.554855168
*** contribution from regularisation: 0.00337509438
*** contribution from error: -0.558230281
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58567.7364 0.181902245 33.1295738
EXIT FROM BFGS code NEW_X -58567.7364 0.181902245 33.1295738
ENTER BFGS code NEW_X -58567.7364 0.181902245 33.1295738
EXIT FROM BFGS code FG_LNSRCH 0. 0.201279551 0.
---------------------------------------------------
Iteration : 34
***********************************************
*** Learn Path 34
*** loss function: -0.554646254
*** contribution from regularisation: 0.00338799623
*** contribution from error: -0.558034241
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58545.6827 0.201279551 5.23626709
EXIT FROM BFGS code FG_LNSRCH 0. 0.189321399 0.
---------------------------------------------------
Iteration : 35
***********************************************
*** Learn Path 35
*** loss function: -0.555104971
*** contribution from regularisation: 0.00325171137
*** contribution from error: -0.558356702
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58594.1045 0.189321399 19.1375217
EXIT FROM BFGS code NEW_X -58594.1045 0.189321399 19.1375217
ENTER BFGS code NEW_X -58594.1045 0.189321399 19.1375217
EXIT FROM BFGS code FG_LNSRCH 0. 0.194517925 0.
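Note: the "Iteration No: 20/30" messages above show that a snapshot of the current network is written to rescue.nb every tenth iteration, so an interrupted training can be recovered. The sketch below illustrates that checkpointing pattern only; save_expertise() is a hypothetical stand-in rather than a NeuroBayes call, and the 10-iteration period is inferred from the log.

# Periodic checkpointing pattern suggested by the "rescue.nb" messages above:
# every 10th iteration the current network is written out so a crashed or
# interrupted training can be resumed. save_expertise() is a hypothetical
# stand-in, not a NeuroBayes API call.
CHECKPOINT_EVERY = 10  # assumption: inferred from iterations 20, 30, 40 in the log

def save_expertise(path: str, iteration: int) -> None:
    # Placeholder: a real implementation would serialise the network weights.
    with open(path, "wb") as f:
        f.write(iteration.to_bytes(4, "little"))

for iteration in range(1, 251):
    # ... one BFGS learn path would run here ...
    if iteration % CHECKPOINT_EVERY == 0:
        print(f'***** write out current network to "rescue.nb" (iteration {iteration})')
        save_expertise("rescue.nb", iteration)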
---------------------------------------------------
Iteration : 36
***********************************************
*** Learn Path 36
*** loss function: -0.555160582
*** contribution from regularisation: 0.00329743396
*** contribution from error: -0.55845803
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58599.9764 0.194517925 27.3772793
EXIT FROM BFGS code NEW_X -58599.9764 0.194517925 27.3772793
ENTER BFGS code NEW_X -58599.9764 0.194517925 27.3772793
EXIT FROM BFGS code FG_LNSRCH 0. 0.199757785 0.
---------------------------------------------------
Iteration : 37
***********************************************
*** Learn Path 37
*** loss function: -0.55520618
*** contribution from regularisation: 0.00328406063
*** contribution from error: -0.558490217
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58604.7881 0.199757785 15.1483316
EXIT FROM BFGS code NEW_X -58604.7881 0.199757785 15.1483316
ENTER BFGS code NEW_X -58604.7881 0.199757785 15.1483316
EXIT FROM BFGS code FG_LNSRCH 0. 0.215381607 0.
---------------------------------------------------
Iteration : 38
***********************************************
*** Learn Path 38
*** loss function: -0.555326521
*** contribution from regularisation: 0.00324944709
*** contribution from error: -0.558575988
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58617.4888 0.215381607 22.0956554
EXIT FROM BFGS code NEW_X -58617.4888 0.215381607 22.0956554
ENTER BFGS code NEW_X -58617.4888 0.215381607 22.0956554
EXIT FROM BFGS code FG_LNSRCH 0. 0.25567928 0.
---------------------------------------------------
Iteration : 39
***********************************************
*** Learn Path 39
*** loss function: -0.555300593
*** contribution from regularisation: 0.00321409409
*** contribution from error: -0.558514714
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58614.7545 0.25567928 -207.733185
EXIT FROM BFGS code FG_LNSRCH 0. 0.232628092 0.
---------------------------------------------------
Iteration : 40
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 1 --> 63.0311394 sigma out 15 active outputs
RANK 2 NODE 10 --> 62.2225266 sigma out 15 active outputs
RANK 3 NODE 4 --> 51.1295547 sigma out 15 active outputs
RANK 4 NODE 8 --> 48.9646873 sigma out 15 active outputs
RANK 5 NODE 5 --> 48.087616 sigma out 15 active outputs
RANK 6 NODE 11 --> 47.1299896 sigma out 15 active outputs
RANK 7 NODE 7 --> 45.6518364 sigma out 15 active outputs
RANK 8 NODE 2 --> 38.1439018 sigma out 15 active outputs
RANK 9 NODE 3 --> 34.8292999 sigma out 15 active outputs
RANK 10 NODE 9 --> 27.8798962 sigma out 15 active outputs
RANK 11 NODE 6 --> 27.7880383 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 2 --> 64.2052002 sigma in 11act. ( 62.8097916 sig out 1act.)
RANK 2 NODE 15 --> 58.3802376 sigma in 11act. ( 55.421936 sig out 1act.)
RANK 3 NODE 13 --> 56.2005768 sigma in 11act. ( 56.2344551 sig out 1act.)
RANK 4 NODE 7 --> 46.4427299 sigma in 11act. ( 53.6882439 sig out 1act.)
RANK 5 NODE 3 --> 45.8889427 sigma in 11act. ( 47.0547371 sig out 1act.)
RANK 6 NODE 1 --> 44.5648384 sigma in 11act. ( 45.7820206 sig out 1act.)
RANK 7 NODE 12 --> 42.2788353 sigma in 11act. ( 41.5331879 sig out 1act.)
RANK 8 NODE 10 --> 35.2466621 sigma in 11act. ( 39.9139061 sig out 1act.)
RANK 9 NODE 5 --> 28.9015408 sigma in 11act. ( 29.9555073 sig out 1act.)
RANK 10 NODE 4 --> 27.2058258 sigma in 11act. ( 29.5606709 sig out 1act.)
RANK 11 NODE 8 --> 23.9866333 sigma in 11act. ( 24.4654827 sig out 1act.)
RANK 12 NODE 6 --> 22.5738525 sigma in 11act. ( 22.9381447 sig out 1act.)
RANK 13 NODE 9 --> 22.319458 sigma in 11act. ( 24.2125626 sig out 1act.)
RANK 14 NODE 11 --> 19.5063915 sigma in 11act. ( 20.6365795 sig out 1act.)
RANK 15 NODE 14 --> 13.5976686 sigma in 11act. ( 14.8558493 sig out 1act.)
sorted by output significance
RANK 1 NODE 2 --> 62.8097916 sigma out 1act.( 64.2052002 sig in 11act.)
RANK 2 NODE 13 --> 56.2344551 sigma out 1act.( 56.2005768 sig in 11act.)
RANK 3 NODE 15 --> 55.421936 sigma out 1act.( 58.3802376 sig in 11act.)
RANK 4 NODE 7 --> 53.6882439 sigma out 1act.( 46.4427299 sig in 11act.)
RANK 5 NODE 3 --> 47.0547371 sigma out 1act.( 45.8889427 sig in 11act.)
RANK 6 NODE 1 --> 45.7820206 sigma out 1act.( 44.5648384 sig in 11act.)
RANK 7 NODE 12 --> 41.5331879 sigma out 1act.( 42.2788353 sig in 11act.)
RANK 8 NODE 10 --> 39.9139061 sigma out 1act.( 35.2466621 sig in 11act.)
RANK 9 NODE 5 --> 29.9555073 sigma out 1act.( 28.9015408 sig in 11act.)
RANK 10 NODE 4 --> 29.5606709 sigma out 1act.( 27.2058258 sig in 11act.)
RANK 11 NODE 8 --> 24.4654827 sigma out 1act.( 23.9866333 sig in 11act.)
RANK 12 NODE 9 --> 24.2125626 sigma out 1act.( 22.319458 sig in 11act.)
RANK 13 NODE 6 --> 22.9381447 sigma out 1act.( 22.5738525 sig in 11act.)
RANK 14 NODE 11 --> 20.6365795 sigma out 1act.( 19.5063915 sig in 11act.)
RANK 15 NODE 14 --> 14.8558493 sigma out 1act.( 13.5976686 sig in 11act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 157.538452 sigma in 15 active inputs
***********************************************
*** Learn Path 40
*** loss function: -0.555290699
*** contribution from regularisation: 0.00335389469
*** contribution from error: -0.558644593
***********************************************
-----------------> Test sample
Iteration No: 40
**********************************************
***** write out current network ****
***** to "rescue.nb" ****
**********************************************
SAVING EXPERTISE TO rescue.nb
ENTER BFGS code FG_LNSRCH -58613.7086 0.232628092 -75.4790573
EXIT FROM BFGS code FG_LNSRCH 0. 0.219097704 0.
---------------------------------------------------
Iteration : 41
***********************************************
*** Learn Path 41
*** loss function: -0.555269957
*** contribution from regularisation: 0.00333200931
*** contribution from error: -0.558601975
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58611.5209 0.219097704 1.18471742
EXIT FROM BFGS code FG_LNSRCH 0. 0.215650827 0.
---------------------------------------------------
Iteration : 42
***********************************************
*** Learn Path 42
*** loss function: -0.555272162
*** contribution from regularisation: 0.00330593972
*** contribution from error: -0.558578074
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58611.7523 0.215650827 20.4811077
EXIT FROM BFGS code FG_LNSRCH 0. 0.215383723 0.
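Note: from iteration 40 onwards the loss changes only in the sixth decimal place, and the iterations below flatten out completely until the minimiser reports CONVERGENC. A relative-improvement test of the kind sketched below produces exactly this behaviour; the 1e-5 threshold is an assumption, not the criterion NeuroBayes actually applies.

# Relative-improvement stopping test of the kind that ends the loop early once
# the loss plateaus (iterations 43-47 below). The 1e-5 threshold is an assumption,
# not the criterion NeuroBayes actually uses.
def converged(prev_loss: float, curr_loss: float, tol: float = 1e-5) -> bool:
    scale = max(abs(prev_loss), abs(curr_loss), 1.0)
    return abs(curr_loss - prev_loss) / scale < tol

losses = [-0.555273116, -0.555274308, -0.555276752, -0.555275619, -0.555274367]
for prev, curr in zip(losses, losses[1:]):
    print(prev, "->", curr, "converged:", converged(prev, curr))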
---------------------------------------------------
Iteration : 43
***********************************************
*** Learn Path 43
*** loss function: -0.555273116
*** contribution from regularisation: 0.00330287381
*** contribution from error: -0.558575988
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58611.8516 0.215383723 21.9731312
EXIT FROM BFGS code FG_LNSRCH 0. 0.215381607 0.
---------------------------------------------------
Iteration : 44
***********************************************
*** Learn Path 44
*** loss function: -0.555274308
*** contribution from regularisation: 0.00330165471
*** contribution from error: -0.558575988
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58611.978 0.215381607 21.9640942
EXIT FROM BFGS code FG_LNSRCH 0. 0.215381607 0.
---------------------------------------------------
Iteration : 45
***********************************************
*** Learn Path 45
*** loss function: -0.555276752
*** contribution from regularisation: 0.00329921581
*** contribution from error: -0.558575988
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58612.2354 0.215381607 21.9411373
EXIT FROM BFGS code FG_LNSRCH 0. 0.215381607 0.
---------------------------------------------------
Iteration : 46
***********************************************
*** Learn Path 46
*** loss function: -0.555275619
*** contribution from regularisation: 0.00330032874
*** contribution from error: -0.558575928
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58612.1179 0.215381607 21.9189758
EXIT FROM BFGS code FG_LNSRCH 0. 0.215381607 0.
---------------------------------------------------
Iteration : 47
***********************************************
*** Learn Path 47
*** loss function: -0.555274367
*** contribution from regularisation: 0.00330159394
*** contribution from error: -0.558575988
***********************************************
-----------------> Test sample
ENTER BFGS code FG_LNSRCH -58611.9844 0.215381607 21.8979301
EXIT FROM BFGS code NEW_X -58611.9844 0.215381607 21.8979301
ENTER BFGS code NEW_X -58611.9844 0.215381607 21.8979301
EXIT FROM BFGS code CONVERGENC -58611.9844 0.215381607 21.8979301
---------------------------------------------------
Iteration : 250
SIGNIFICANCE OF OUTPUTS IN LAYER 1
RANK 1 NODE 1 --> 89.8224411 sigma out 15 active outputs
RANK 2 NODE 10 --> 88.214592 sigma out 15 active outputs
RANK 3 NODE 4 --> 73.2636642 sigma out 15 active outputs
RANK 4 NODE 8 --> 69.3115158 sigma out 15 active outputs
RANK 5 NODE 5 --> 68.6817703 sigma out 15 active outputs
RANK 6 NODE 11 --> 66.9586182 sigma out 15 active outputs
RANK 7 NODE 7 --> 64.828125 sigma out 15 active outputs
RANK 8 NODE 2 --> 54.8092613 sigma out 15 active outputs
RANK 9 NODE 3 --> 48.9858894 sigma out 15 active outputs
RANK 10 NODE 9 --> 40.3235931 sigma out 15 active outputs
RANK 11 NODE 6 --> 38.6791916 sigma out 15 active outputs
SIGNIFICANCE OF INPUTS TO LAYER 2
sorted by input significance
RANK 1 NODE 2 --> 90.7493591 sigma in 11act. ( 89.912178 sig out 1act.)
RANK 2 NODE 15 --> 83.860054 sigma in 11act. ( 79.3543549 sig out 1act.)
RANK 3 NODE 13 --> 79.3031616 sigma in 11act. ( 80.0084534 sig out 1act.)
RANK 4 NODE 7 --> 65.4912872 sigma in 11act. ( 75.7992249 sig out 1act.)
RANK 5 NODE 3 --> 64.7276688 sigma in 11act. ( 66.7196121 sig out 1act.)
RANK 6 NODE 1 --> 63.970459 sigma in 11act. ( 66.0105896 sig out 1act.)
RANK 7 NODE 12 --> 60.4344864 sigma in 11act. ( 60.1373482 sig out 1act.)
RANK 8 NODE 10 --> 50.2898865 sigma in 11act. ( 57.5578728 sig out 1act.)
RANK 9 NODE 5 --> 41.5632133 sigma in 11act. ( 43.1001129 sig out 1act.)
RANK 10 NODE 4 --> 39.159153 sigma in 11act. ( 42.9703865 sig out 1act.)
RANK 11 NODE 8 --> 34.4029388 sigma in 11act. ( 35.5354996 sig out 1act.)
RANK 12 NODE 6 --> 32.0897598 sigma in 11act. ( 33.0614204 sig out 1act.)
RANK 13 NODE 9 --> 31.9226151 sigma in 11act. ( 34.794754 sig out 1act.)
RANK 14 NODE 11 --> 27.7751465 sigma in 11act. ( 29.6896648 sig out 1act.)
RANK 15 NODE 14 --> 19.4035435 sigma in 11act. ( 21.2354908 sig out 1act.)
sorted by output significance
RANK 1 NODE 2 --> 89.912178 sigma out 1act.( 90.7493591 sig in 11act.)
RANK 2 NODE 13 --> 80.0084534 sigma out 1act.( 79.3031616 sig in 11act.)
RANK 3 NODE 15 --> 79.3543549 sigma out 1act.( 83.860054 sig in 11act.)
RANK 4 NODE 7 --> 75.7992249 sigma out 1act.( 65.4912872 sig in 11act.)
RANK 5 NODE 3 --> 66.7196121 sigma out 1act.( 64.7276688 sig in 11act.)
RANK 6 NODE 1 --> 66.0105896 sigma out 1act.( 63.970459 sig in 11act.)
RANK 7 NODE 12 --> 60.1373482 sigma out 1act.( 60.4344864 sig in 11act.)
RANK 8 NODE 10 --> 57.5578728 sigma out 1act.( 50.2898865 sig in 11act.)
RANK 9 NODE 5 --> 43.1001129 sigma out 1act.( 41.5632133 sig in 11act.)
RANK 10 NODE 4 --> 42.9703865 sigma out 1act.( 39.159153 sig in 11act.)
RANK 11 NODE 8 --> 35.5354996 sigma out 1act.( 34.4029388 sig in 11act.)
RANK 12 NODE 9 --> 34.794754 sigma out 1act.( 31.9226151 sig in 11act.)
RANK 13 NODE 6 --> 33.0614204 sigma out 1act.( 32.0897598 sig in 11act.)
RANK 14 NODE 11 --> 29.6896648 sigma out 1act.( 27.7751465 sig in 11act.)
RANK 15 NODE 14 --> 21.2354908 sigma out 1act.( 19.4035435 sig in 11act.)
SIGNIFICANCE OF INPUTS TO LAYER 3
RANK 1 NODE 1 --> 225.542679 sigma in 15 active inputs
***********************************************
*** Learn Path 250
*** loss function: -0.555275559
*** contribution from regularisation: 0.00330041326
*** contribution from error: -0.558575988
***********************************************
-----------------> Test sample
END OF LEARNING , export EXPERTISE
SAVING EXPERTISE TO expert.nb
NB_AHISTOUT: storage space 30761
Closing output file
done
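Note: once learning ends the expertise is exported to expert.nb and the job closes its output file. For monitoring a run like this one, the per-iteration losses can be pulled straight out of a saved copy of the teacher log; the sketch below does that with two regular expressions keyed on the "Learn Path" and "loss function:" lines shown above. The filename teacher.log is an assumption.

# Minimal sketch: pull the per-iteration loss values out of a saved copy of the
# teacher log above, e.g. to plot the training curve. "teacher.log" is an
# assumed filename; the regular expressions rely only on the "Learn Path" and
# "loss function:" lines shown in the output.
import re

path_re = re.compile(r"\*\*\* Learn Path\s+(\d+)")
loss_re = re.compile(r"\*\*\* loss function:\s*(-?\d+\.\d+)")

def read_losses(logfile: str):
    losses = {}
    current = None
    with open(logfile) as f:
        for line in f:
            m = path_re.search(line)
            if m:
                current = int(m.group(1))
                continue
            m = loss_re.search(line)
            if m and current is not None:
                losses[current] = float(m.group(1))
                current = None
    return losses

if __name__ == "__main__":
    for it, loss in sorted(read_losses("teacher.log").items()):
        print(f"Learn Path {it:4d}  loss {loss:.9f}")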