ML — xgboost & GBM: Training xgboost and GBM on the Higgs Boson dataset (Kaggle competition) for binary classification, with a head-to-head performance comparison of the two models

Contents

Output Results

Design Approach

Core Code


Output Results



finish loading from csv 
weight statistics: wpos=1522.37, wneg=904200, ratio=593.94

loading data end, start to boost trees
training GBM from sklearn
      Iter       Train Loss   Remaining Time 
         1           1.2069           49.52s
         2           1.1437           43.51s
         3           1.0909           37.43s
         4           1.0471           30.96s
         5           1.0096           25.09s
         6           0.9775           19.90s
         7           0.9505           15.22s
         8           0.9264            9.94s
         9           0.9058            4.88s
        10           0.8878            0.00s
sklearn.GBM total costs: 50.88141202926636 seconds
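
The log above matches the format of the public xgboost kaggle-higgs speed-test demo, so the data-loading and sklearn GBM portion can be sketched as follows. This is a minimal sketch rather than the post's original code: the file path data/training.csv, the Kaggle column layout (EventId, 30 features, Weight, s/b Label), and the max_depth value are assumptions.

import time
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Load the Kaggle Higgs Boson training file (assumed path and layout:
# column 0 = EventId, columns 1..30 = features, 31 = Weight, 32 = Label 's'/'b').
dtrain = np.loadtxt('data/training.csv', delimiter=',', skiprows=1,
                    converters={32: lambda x: int(x in (b's', 's'))})
print('finish loading from csv ')

label = dtrain[:, 32]
data = dtrain[:, 1:31]
weight = dtrain[:, 31]

# Weight statistics printed in the log: total signal weight, total background
# weight, and the background/signal ratio (later used as scale_pos_weight).
sum_wpos = weight[label == 1.0].sum()
sum_wneg = weight[label == 0.0].sum()
print('weight statistics: wpos=%g, wneg=%g, ratio=%g' % (sum_wpos, sum_wneg, sum_wneg / sum_wpos))

print('loading data end, start to boost trees')
print('training GBM from sklearn')

# 10 boosting rounds with verbose progress, matching the iteration table above.
tmp = time.time()
gbm = GradientBoostingClassifier(n_estimators=10, max_depth=6, verbose=1)
gbm.fit(data, label)
print('sklearn.GBM total costs: %s seconds' % str(time.time() - tmp))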


training xgboost
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 1 thread costs: 24.5108642578125 seconds
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 2 thread costs: 11.449955940246582 seconds
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 4 thread costs: 8.809934616088867 seconds
[0]	train-ams@0.15:3.69849
[1]	train-ams@0.15:3.96339
[2]	train-ams@0.15:4.26978
[3]	train-ams@0.15:4.32619
[4]	train-ams@0.15:4.41415
[5]	train-ams@0.15:4.49395
[6]	train-ams@0.15:4.64614
[7]	train-ams@0.15:4.64058
[8]	train-ams@0.15:4.73064
[9]	train-ams@0.15:4.79447
XGBoost with 8 thread costs: 7.875434875488281 seconds
XGBoost total costs: 52.64618968963623 seconds
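
The xgboost timings come from repeating the same 10 boosting rounds while only changing the number of threads. Below is a minimal sketch that reuses data, label, weight, sum_wpos and sum_wneg from the previous snippet; the binary:logitraw objective and the eta/max_depth values are assumptions consistent with the public Higgs demo, and ams@0.15 is xgboost's approximate-median-significance metric evaluated at a 15% selection threshold.

import time
import xgboost as xgb

# Build a DMatrix from the arrays loaded above; -999.0 marks missing values
# in the Higgs features, and per-event weights are attached.
xgmat = xgb.DMatrix(data, label=label, missing=-999.0, weight=weight)

param = {
    'objective': 'binary:logitraw',            # raw score from logistic loss
    'scale_pos_weight': sum_wneg / sum_wpos,   # re-balance signal vs. background
    'eta': 0.1,
    'max_depth': 6,
    'eval_metric': 'ams@0.15',                 # AMS at a 15% selection threshold
}
watchlist = [(xgmat, 'train')]
num_round = 10

print('training xgboost')
total = time.time()
# Re-train with 1, 2, 4 and 8 threads to measure multi-threaded scaling.
for nthread in [1, 2, 4, 8]:
    param['nthread'] = nthread
    tmp = time.time()
    bst = xgb.train(param, xgmat, num_round, watchlist)
    print('XGBoost with %d thread costs: %s seconds' % (nthread, str(time.time() - tmp)))
print('XGBoost total costs: %s seconds' % str(time.time() - total))

Note that "XGBoost total costs" sums the four runs (1, 2, 4 and 8 threads), so it is not directly comparable to the single sklearn run; per run, xgboost trains in roughly 8-25 seconds versus about 51 seconds for sklearn's GBM, while reaching a train AMS of about 4.79 after 10 rounds.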

 

Design Approach

As the output log shows, the overall workflow is: load the Kaggle Higgs Boson training CSV, compute the signal/background weight statistics, train sklearn's GradientBoostingClassifier for 10 boosting rounds, then train xgboost for the same 10 rounds with 1, 2, 4, and 8 threads, and compare the two models' AMS scores and wall-clock training times.
Core Code

Copyright notice: this article comes from CSDN. The original author's post is licensed under CC 4.0 BY-SA; please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/qq_41185868/article/details/90897097