

I'm working on some analysis using a regression tree, but the accuracy of the tree is coming out low. Can anyone suggest some methods that will improve the accuracy of a regression tree while still being able to represent the splitting logic by building a tree?

I tried a random forest. It increases the accuracy, but I'm not able to represent my final outcome as a tree.
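To make the trade-off concrete, here is a minimal sketch (synthetic data and scikit-learn, both my assumptions, not from the thread) of how a single regression tree's splitting logic can be printed as readable rules, something a random forest cannot give you as one tree:

```python
# Sketch: a single regression tree's splits can be exported as nested
# if/else rules. Data is synthetic, for illustration only.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 2))
y = 3.0 * X[:, 0] + np.sin(X[:, 1]) + rng.normal(0, 0.1, 200)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)
rules = export_text(tree, feature_names=["x0", "x1"])
print(rules)  # the splitting logic, as threshold rules per node
```

A random forest averages hundreds of such trees, so there is no single rule set to display; that is exactly the problem described above.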

Thanks in advance


Replies to This Discussion

Boosted additive models are the state-of-the-art technique when it comes to classification trees. The other benefit is that boosted additive models also handle nonlinearity in the underlying data very well.

Actually, I need to show the decision logic as well, not just improve the accuracy. That's the reason I'm not using random forest. Do you think I can show the decision logic using a boosted additive model? Also, I need it for a regression tree.

The great thing about boosted additive models, or gradient boosted models (GBMs) as they are called, is that they capture the relationship between the dependent and independent variables very well, just as you do in regression. On top of that, they have the accuracy of a machine learning algorithm.

The decision logic of regression and accuracy of machine learning - that is GBM.
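As a hedged sketch of that accuracy claim (synthetic data and scikit-learn's `GradientBoostingRegressor` are my choices here, not something specified in the thread), a GBM typically beats a single shallow regression tree on nonlinear data:

```python
# Sketch: compare test MSE of a single depth-3 regression tree against
# a gradient boosted ensemble of depth-3 trees on synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 3))
y = X[:, 0] ** 2 + 2 * np.sin(X[:, 1]) + rng.normal(0, 0.2, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_tr, y_tr)
gbm = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                learning_rate=0.05,
                                random_state=0).fit(X_tr, y_tr)

tree_mse = mean_squared_error(y_te, tree.predict(X_te))
gbm_mse = mean_squared_error(y_te, gbm.predict(X_te))
print("single tree MSE:", tree_mse)
print("GBM MSE:", gbm_mse)
```

The hyperparameters (`n_estimators`, `learning_rate`) are illustrative; in practice they would be tuned by cross-validation.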

But you are not answering my basic question: IS IT CAPABLE OF SHOWING THE DECISION LOGIC FOR THE SPLIT???

Of course; I thought I answered that. You might want to check your software, though.
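One caveat worth spelling out, with a sketch under my own assumptions (scikit-learn, synthetic data): each boosting stage of a GBM is itself a small regression tree whose split logic can be printed, but the overall prediction is a sum over all stages, so no single tree shows the whole decision logic the way one regression tree does.

```python
# Sketch: inspect the split logic of one stage inside a fitted GBM.
# In scikit-learn, gbm.estimators_ holds the per-stage regression trees.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import export_text

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 2))
y = X[:, 0] + 0.5 * X[:, 1]

gbm = GradientBoostingRegressor(n_estimators=5, max_depth=2,
                                random_state=0).fit(X, y)

first_stage = gbm.estimators_[0, 0]  # shape (n_estimators, 1) for regression
rules = export_text(first_stage, feature_names=["x0", "x1"])
print(rules)  # threshold splits of the first boosting stage only
```

So the splits are visible per stage, but interpreting the full ensemble means reading many such trees, not one.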

Can you please suggest some material on GBM?

Chapter 10 of The Elements of Statistical Learning by Hastie, Tibshirani and Friedman. It should be available for download.

Thank you


On Data Science Central

© 2020   TechTarget, Inc.