Monday, April 22, 2013

Some observations on "Multi-class classification with binary classifiers" and "Traditional multi-class classification"


After experimenting with different datasets and analyzing different algorithms for "Multi-class classification with binary classifiers" and "Traditional multi-class classification", we arrived at the following observations:

  1. Point-1: When a large number of features (or decision criteria) is used at once, the quality of the decision process tends to degrade as the feature count grows. This happens not only in "multi-criteria decision making" but also in classification systems. Nevertheless, some results show that direct "multi-class classification" performs better than "multi-class classification achieved by using binary classifiers". This is possible when the dataset contains features with at least a noticeable margin between classes, i.e. features that do not lose their discriminative importance in the presence of the features of other classes. Algorithmic steps that widen this margin between the features of different classes also improve classification quality.
  2. Point-2: Techniques for "multi-class classification achieved by using binary classifiers" are based on mutual comparison, especially in the "one-vs-one" mode, where each class is compared with every other class one at a time, and in the error-correcting output code strategy. Here the margin carried by a feature is therefore less likely to be washed out by the collective presence of the features of all classes.
  3. Point-3: The nature of the data therefore becomes more important here. According to my observations, "one-vs-one" classifiers perform slightly better than, or on par with, the error-correcting output code strategy. However, feature-boosting strategies of the kind used with direct "multi-class classification" can also be helpful here (see the sketch after this list).
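
As a minimal sketch of the three strategies compared above, the following Python snippet uses scikit-learn; the library, the digits dataset, and the base classifiers are illustrative choices of mine, not the exact setup used in these experiments. It trains a direct multi-class model, a one-vs-one ensemble of binary classifiers, and an error-correcting output code ensemble on the same split and reports test accuracy.

```python
# Illustrative comparison only: dataset, classifiers, and split parameters
# are assumptions, not the experimental setup described in this post.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC
from sklearn.multiclass import OneVsOneClassifier, OutputCodeClassifier
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

strategies = {
    # Direct ("traditional") multi-class classification: one model fits
    # all classes jointly (multinomial objective in recent scikit-learn).
    "direct multi-class": LogisticRegression(max_iter=2000),
    # One-vs-one: one binary classifier per pair of classes; the pairwise
    # comparisons then vote on the final label.
    "one-vs-one": OneVsOneClassifier(LinearSVC(dual=False)),
    # Error-correcting output codes: each class gets a binary code word,
    # and one binary classifier is trained per code bit.
    "error-correcting codes": OutputCodeClassifier(
        LinearSVC(dual=False), code_size=2, random_state=0),
}

for name, clf in strategies.items():
    clf.fit(X_train, y_train)
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name:25s} accuracy = {acc:.3f}")
```

On data where the classes already have a clear margin, the direct multi-class model tends to match or beat the binary-classifier ensembles, consistent with Point-1; where classes overlap more, the pairwise one-vs-one comparisons tend to hold up better, as noted in Point-2 and Point-3.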
