After some searching, I ended up using Weka, specifically its naive Bayes classifier. Its data structures are a bit arcane, but it works, and it works quickly.
package agent.agenttype.ijcai;
import weka.classifiers.Classifier;
import weka.classifiers.bayes.NaiveBayes;
import weka.core.Attribute;
import weka.core.FastVector;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.SparseInstance;
public class Example {

    public static enum ClassLabel { A, B };

    Instances trainingSet;
    FastVector att = new FastVector(2);
    FastVector cl = new FastVector(2);

    public Example() {
        // add class labels
        cl.addElement(ClassLabel.values()[0].name());
        cl.addElement(ClassLabel.values()[1].name());
        // set the name of our numeric value attribute
        Attribute valueAttribute = new Attribute("Value");
        // set the name of our class label attribute (nominal, using the labels above)
        Attribute classAttribute = new Attribute("Label", cl);
        att.addElement(valueAttribute);
        att.addElement(classAttribute);
        // create a training set that uses our attributes to interpret instances
        trainingSet = new Instances("TrainingSet", att, 2);
        trainingSet.setClassIndex(1); // the second attribute (index 1) is the class label
    }

    public void addObservationToEdge(int value, ClassLabel classLabel) {
        Instance instance = new SparseInstance(2);
        instance.setValue((Attribute) att.elementAt(0), value);             // set the numeric value
        instance.setValue((Attribute) att.elementAt(1), classLabel.name()); // set the class label
        trainingSet.add(instance);
    }

    public ClassLabel classifyValue(int value) throws Exception {
        // the instance needs room for both attributes; the class value is simply left unset
        Instance instanceForClassification = new SparseInstance(2);
        instanceForClassification.setValue((Attribute) att.elementAt(0), value);
        // make the instance inherit attribute definitions from the training set
        instanceForClassification.setDataset(trainingSet);
        // build a naive Bayes classifier (rebuilt from scratch on every call, for simplicity)
        Classifier cModel = new NaiveBayes();
        cModel.buildClassifier(trainingSet);
        int labelNumber = (int) cModel.classifyInstance(instanceForClassification);
        return ClassLabel.values()[labelNumber];
    }

    public static void main(String[] args) {
        Example example = new Example();
        example.addObservationToEdge(1, ClassLabel.A);
        example.addObservationToEdge(2, ClassLabel.A);
        example.addObservationToEdge(5, ClassLabel.A);
        example.addObservationToEdge(11, ClassLabel.A);
        example.addObservationToEdge(9, ClassLabel.B);
        example.addObservationToEdge(12, ClassLabel.B);
        example.addObservationToEdge(15, ClassLabel.B);
        example.addObservationToEdge(20, ClassLabel.B);
        try {
            // print classification results
            for (int i = 0; i < 20; i++) {
                System.out.println("Value: " + i + " Class Label: " + example.classifyValue(i));
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
OUTPUT:
Value: 0 Class Label:A
Value: 1 Class Label:A
Value: 2 Class Label:A
Value: 3 Class Label:A
Value: 4 Class Label:A
Value: 5 Class Label:A
Value: 6 Class Label:A
Value: 7 Class Label:A
Value: 8 Class Label:A
Value: 9 Class Label:A
Value: 10 Class Label:B
Value: 11 Class Label:B
Value: 12 Class Label:B
Value: 13 Class Label:B
Value: 14 Class Label:B
Value: 15 Class Label:B
Value: 16 Class Label:B
Value: 17 Class Label:B
Value: 18 Class Label:B
Value: 19 Class Label:B
Even if all the observations are mutually independent and the data is only one-dimensional, there is still no way to classify values that fall in the overlapping region. – nullPointer
Why is that? Wouldn't the decision boundary lie somewhere within the overlap? Is it because the notion of distance isn't fully defined? –
Perhaps I'm misunderstanding your question, but consider the following case: [1,A] [2,A] [5,A] [11,A] [9,B] [12,B] [15,B] [20,B] — how would you classify 10? – nullPointer
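For one-dimensional data like this, naive Bayes does resolve the overlap: with equal priors, it fits one Gaussian per class and assigns the value to whichever class-conditional density is larger. Below is a minimal stand-alone sketch (plain Java, no Weka dependency; `GaussianNBSketch` is a hypothetical name, and Weka's `NaiveBayes` additionally applies a precision heuristic, so its densities may differ slightly) that computes this by hand for the eight points in the comment above:

```java
// Hand-rolled Gaussian naive Bayes for the eight 1-D training points
// from the comment, assuming equal class priors. Each class gets a
// Gaussian fitted with its sample mean and (n-1) variance; the class
// with the larger density at x wins.
public class GaussianNBSketch {
    static final double[] A_VALUES = {1, 2, 5, 11};
    static final double[] B_VALUES = {9, 12, 15, 20};

    // Gaussian density at x, using the sample mean and unbiased variance of data.
    static double density(double x, double[] data) {
        double mean = 0;
        for (double v : data) mean += v;
        mean /= data.length;
        double var = 0;
        for (double v : data) var += (v - mean) * (v - mean);
        var /= (data.length - 1);
        return Math.exp(-(x - mean) * (x - mean) / (2 * var))
                / Math.sqrt(2 * Math.PI * var);
    }

    // Equal priors, so the larger class-conditional density decides.
    static String classify(double x) {
        return density(x, A_VALUES) >= density(x, B_VALUES) ? "A" : "B";
    }

    public static void main(String[] args) {
        for (int x = 8; x <= 12; x++) {
            System.out.println("Value: " + x + " -> " + classify(x));
        }
    }
}
```

Under these assumptions the densities cross between 9 and 10, so 10 lands on the B side — consistent with the decision boundary in the Weka output above.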