You are correct: you need one HMM per gesture. However, if you use the HiddenMarkovClassifier class (which is a wrapper around several HMMs, one created for each class you wish to detect), the framework can already provide this construction for you.
If each image has 4 features, you will need to assume a probability distribution that can model multivariate features. One simple choice is to assume that your features are independent of each other and that each of them follows a Normal distribution.
You can therefore use the example code below to create the models. It assumes your database has only two training sequences, but in reality you would need many more.
double[][][] sequences = new double[][][]
{
    new double[][] // This is the first sequence with label = 0
    {
        new double[] { 0, 1, 2, 1 }, // <-- this is the 4-feature vector for
        new double[] { 1, 2, 5, 2 }, //     the first image of the first sequence
        new double[] { 2, 3, 2, 5 },
        new double[] { 3, 4, 1, 1 },
        new double[] { 4, 5, 2, 2 },
    },

    new double[][] // This is the second sequence with label = 1
    {
        new double[] { 4, 3, 4, 1 }, // <-- this is the 4-feature vector for
        new double[] { 3, 2, 2, 2 }, //     the first image of the second sequence
        new double[] { 2, 1, 1, 1 },
        new double[] { 1, 0, 2, 2 },
        new double[] { 0, -1, 1, 2 },
    }
};
// Labels for the sequences
int[] labels = { 0, 1 };
The code above shows how to set up the learning database. Once it has been set up, you can create a hidden Markov classifier for the 4 Normal features (assumed independent of each other) as:
// Create one base Normal distribution to be replicated across the states
var initialDensity = new MultivariateNormalDistribution(4); // we have 4 features

// Creates a sequence classifier containing 2 hidden Markov Models with 2 states
// and an underlying multivariate Normal distribution as the state density.
var classifier = new HiddenMarkovClassifier<MultivariateNormalDistribution>(
    classes: 2, topology: new Forward(2), initial: initialDensity);

// Configure the learning algorithms to train the sequence classifier
var teacher = new HiddenMarkovClassifierLearning<MultivariateNormalDistribution>(
    classifier,

    // Train each model until the log-likelihood changes less than 0.0001
    modelIndex => new BaumWelchLearning<MultivariateNormalDistribution>(
        classifier.Models[modelIndex])
    {
        Tolerance = 0.0001,
        Iterations = 0,

        FittingOptions = new NormalOptions()
        {
            Diagonal = true,      // only diagonal covariance matrices
            Regularization = 1e-5 // avoid non-positive definite errors
        }

        // PS: Setting Diagonal = true means the features will be
        // assumed independent of each other. This can also be
        // achieved by using an Independent<NormalDistribution>
        // density instead of a diagonal multivariate Normal distribution.
    }
);
Finally, we can train the models and test their outputs on the learning data:
// Train the sequence classifier using the algorithm
double logLikelihood = teacher.Run(sequences, labels);

// Calculate the probability that the given
// sequences originated from the model
double likelihood, likelihood2;

// Try to classify the 1st sequence (output should be 0)
int c1 = classifier.Compute(sequences[0], out likelihood);

// Try to classify the 2nd sequence (output should be 1)
int c2 = classifier.Compute(sequences[1], out likelihood2);
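As mentioned in the comments inside the learning setup, the same independence assumption can also be expressed explicitly by using an Independent<NormalDistribution> density (one univariate Normal per feature) instead of a diagonal multivariate Normal. Below is a minimal sketch of that alternative; it follows Accord.NET's naming conventions, but exact constructor signatures may vary between framework versions, so treat it as illustrative rather than definitive:

```csharp
// Sketch (assumption: Independent<T> accepts one component
// distribution per feature). Build a joint density from four
// independent univariate Normal distributions, one per feature:
var density = new Independent<NormalDistribution>(
    new NormalDistribution(), new NormalDistribution(),
    new NormalDistribution(), new NormalDistribution());

// Then create the classifier over that density instead of the
// diagonal MultivariateNormalDistribution used above:
var classifier2 = new HiddenMarkovClassifier<Independent<NormalDistribution>>(
    classes: 2, topology: new Forward(2), initial: density);
```

The end result is equivalent to the Diagonal = true option shown earlier, but this form makes the per-feature independence explicit in the model's type.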