Enhanced active learning in developing highly interpretable decision support system
| Main Authors: | , |
|---|---|
| Format: | Article |
| Subjects: | |
| Online Access: | http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5564802 |
| Summary: | Developing highly interpretable decision support systems commonly presents significant challenges. In previous research, partial information yielded poor results in the problem of learning classifiers, and the behavior of some learning algorithms can only be explored through uncertainty analyses. We propose a novel information-extraction approach that applies a fuzzy measure within active learning to focus on the most informative instances. By integrating expert knowledge as weights into the existing datasets, we overcome the uncertainty and appropriately assign partial datasets to the nearest clusters for classification. With appropriate weights chosen for pre-labeled data, the nearest-neighbor classifier consistently improves on the original classifier. |
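The summary's idea of weighting pre-labeled data so that trusted points exert more pull on a nearest-neighbor classifier can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method: the function name, the triple format, and the rule of dividing distance by the expert weight are all assumptions made here for clarity.

```python
import math

def weighted_nn_predict(prototypes, query):
    """Classify `query` with a weight-adjusted 1-NN rule.

    `prototypes` is a list of (features, label, weight) triples.
    The expert-assigned weight shrinks the effective distance, so a
    heavily weighted (more trusted) pre-labeled point attracts more
    queries than an equally distant unweighted one.
    """
    best_label, best_score = None, float("inf")
    for features, label, weight in prototypes:
        dist = math.dist(features, query)
        score = dist / weight  # larger weight -> smaller effective distance
        if score < best_score:
            best_label, best_score = label, score
    return best_label

# Toy example: two clusters, with one pre-labeled point given double weight
# by a hypothetical expert.
protos = [
    ((0.0, 0.0), "A", 1.0),
    ((1.0, 1.0), "A", 2.0),   # trusted point, double weight
    ((5.0, 5.0), "B", 1.0),
]
print(weighted_nn_predict(protos, (2.5, 2.5)))  # -> A
```

With the weights all equal this reduces to ordinary 1-NN; the abstract's claim is that choosing the weights from expert knowledge lets the classifier consistently improve on the unweighted baseline.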