Lih Heng, Chan and Shaikh Salleh, Sheikh Hussain (2007) Classification for breast cancer diagnosis using adaboost. In: Recent Advancement In Biomedical Engineering. Penerbit UTM, Johor, pp. 50-62. ISBN 978-983-52-0559-0
Boosting is a general method that can be applied to any learning algorithm to improve its performance. Throughout the evolution of boosting-based algorithms, the term “weak learner” has recurred; literally, it refers to a weak learning algorithm that performs only slightly better than random guessing. Schapire, R.E. (1990) showed that these so-called weak learners can be efficiently combined, or “boosted”, to build a strong, accurate classifier. The boosting algorithm applies a weak learning algorithm multiple times to the instance space under different distributions, and finally constructs a strong hypothesis from the numerous weak hypotheses. Freund, Y. et al. (1997) first introduced the theory of adaptive boosting (AdaBoost), which significantly reduces the error of any learning algorithm that consistently generates classifiers satisfying one condition: “better than random guessing”. In AdaBoost algorithms, the distribution over the instance space of the training set is adjusted adaptively according to the errors of the weak hypotheses. This moves the weak learner toward the “harder” parts of the classification space more efficiently.
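The reweighting loop described above can be sketched in a few dozen lines. The following is a minimal, illustrative AdaBoost implementation using one-level decision stumps as the weak learner; the dataset, function names, and stump representation are assumptions for demonstration, not the chapter's actual implementation.

```python
import math

def train_stump(X, y, w):
    """Find the axis-aligned threshold stump minimizing weighted error.

    Returns ((feature, threshold, polarity), weighted_error).
    The stump predicts `polarity` when x[feature] >= threshold, else -polarity.
    """
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for t in sorted(set(x[f] for x in X)):
            for polarity in (1, -1):
                preds = [polarity if x[f] >= t else -polarity for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if err < best_err:
                    best, best_err = (f, t, polarity), err
    return best, best_err

def stump_predict(stump, x):
    f, t, polarity = stump
    return polarity if x[f] >= t else -polarity

def adaboost(X, y, rounds=10):
    """Boost threshold stumps on labels in {-1, +1}."""
    n = len(X)
    w = [1.0 / n] * n          # start with a uniform distribution over instances
    ensemble = []
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        if err >= 0.5:
            break               # weak learner no better than random guessing
        err = max(err, 1e-10)   # guard against division by zero / log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        # Reweight adaptively: misclassified ("harder") examples gain weight,
        # so the next weak learner concentrates on them.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, x))
             for wi, x, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, x):
    """Strong hypothesis: weighted vote of the weak hypotheses."""
    score = sum(alpha * stump_predict(stump, x) for alpha, stump in ensemble)
    return 1 if score >= 0 else -1

# Toy 1-D problem: the positive class is the middle interval, which no
# single stump can classify, but a boosted combination can.
X = [[0.0], [1.0], [2.0], [3.0], [4.0]]
y = [-1, 1, 1, 1, -1]
ensemble = adaboost(X, y, rounds=3)
```

After three rounds the weighted vote of three stumps separates the interval correctly, illustrating how hypotheses that individually err on the "harder" region combine into a strong classifier.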
Item Type: Book Section
Subjects: R Medicine > RZ Other systems of medicine
Divisions: ?? FBSK ??
Deposited By: Liza Porijo
Deposited On: 15 Aug 2011 05:26
Last Modified: 15 Aug 2011 05:26