Description
Problem 1. You are given a face image database of 10 subjects. Each subject has 10 images of 112 × 92 pixels. Convert each image to a vector of length D = 112 × 92 = 10304.
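The flattening step can be done with NumPy's reshape; a minimal sketch, where the zero array merely stands in for a real face image loaded from the database:

```python
import numpy as np

# Stand-in for one loaded 112x92 grayscale face image (real data would
# come from the database files; the variable names here are placeholders).
img = np.zeros((112, 92))
vec = img.reshape(-1)  # row-major flattening into a length-10304 vector
```

Stacking one such vector per image as the rows of a matrix gives the (100, 10304) data set used below.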
1.1 Apply the principal-component analysis (PCA) method to the data set for face feature extraction. Use different rank values d = 1, 2, 3, 6, 10, 20, and 30 (i.e., find d principal components). Project the face images onto the rank-d subspace (i.e., onto the d principal components) and apply the nearest-neighbor classifier in the projection space. Plot the recognition accuracy rate (the number of correct classifications / the total number of test images, in %) versus the different d values.
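The PCA + nearest-neighbor pipeline for one train/test split can be sketched as follows. The function name and the `X_train`/`y_train`/`X_test`/`y_test` arrays are placeholders, not part of the assignment:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

def pca_nn_accuracy(X_train, y_train, X_test, y_test, d):
    """Project onto d principal components, then classify with 1-NN."""
    pca = PCA(n_components=d).fit(X_train)   # learn d principal components
    Z_train = pca.transform(X_train)         # rank-d projection of train set
    Z_test = pca.transform(X_test)           # same projection for test set
    knn = KNeighborsClassifier(n_neighbors=1).fit(Z_train, y_train)
    return knn.score(Z_test, y_test) * 100   # accuracy rate in percent
```

Calling this once per value of d = 1, 2, 3, 6, 10, 20, 30 yields the points for the accuracy-versus-d curve.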
1.2 Use Fisher's Linear Discriminant (FLD) method to find the projection directions that reduce the face image dimension, followed by a nearest-neighbor classifier to perform face recognition. Before applying FLD, first use PCA to reduce the dimensionality of the face images to d0 = 40. The final reduced dimensions of the images are d = 1, 2, 3, 6, 10, 20, 30.
Use the PCA and LinearDiscriminantAnalysis libraries from sklearn. Mean subtraction is not needed. The following code snippet is for your reference:
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

pca0 = PCA(n_components=d0)
pca0_operator = pca0.fit(L)  # L is the training data set; each row is one training image
L0 = pca0_operator.transform(L)  # reduced-dim data: rows are data points

# the input of LDA is the reduced-dim data from PCA:
lda = LinearDiscriminantAnalysis(n_components=d)  # FLD / LDA
lda_operator = lda.fit(filled by yourself)
train_proj_lda = lda_operator.transform(filled by yourself).transpose()  # columns are examples

Run 20 independent experiments. In each experiment, randomly choose 8 images per class to form the training set; the remaining images form the test set. Plot the recognition accuracy rate (the number of correct classifications / the total number of test images, in %) versus the different d values for both the PCA and FLD methods on the same figure with different colors, show the legends for both curves, and show the xlabel and ylabel on the figure. Comment on the results.
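The full experiment can be sketched as below. This is one possible layout, assuming `images` is an (n, 10304) array and `labels` holds the subject ids; both names, the function name, and the random-seed handling are assumptions. Note that sklearn's LDA can produce at most (number of classes - 1) discriminant directions, so for 10 subjects the FLD dimension is clamped to 9 even when d = 10, 20, or 30:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def run_experiments(images, labels, ds=(1, 2, 3, 6, 10, 20, 30),
                    d0=40, n_runs=20, n_train=8, seed=0):
    """Average 1-NN accuracy (%) of PCA and PCA(d0)+FLD over random splits."""
    rng = np.random.default_rng(seed)
    classes = np.unique(labels)
    acc_pca = np.zeros((n_runs, len(ds)))
    acc_fld = np.zeros((n_runs, len(ds)))
    for r in range(n_runs):
        # randomly pick n_train images per class for training; rest is test
        train_idx = np.hstack([rng.permutation(np.where(labels == c)[0])[:n_train]
                               for c in classes])
        mask = np.zeros(len(labels), dtype=bool)
        mask[train_idx] = True
        Xtr, ytr = images[mask], labels[mask]
        Xte, yte = images[~mask], labels[~mask]
        # shared rank-d0 PCA applied before FLD, per the assignment
        pca0 = PCA(n_components=d0).fit(Xtr)
        Ltr, Lte = pca0.transform(Xtr), pca0.transform(Xte)
        for j, d in enumerate(ds):
            # PCA branch: project onto d principal components, then 1-NN
            pca = PCA(n_components=d).fit(Xtr)
            acc_pca[r, j] = KNeighborsClassifier(n_neighbors=1).fit(
                pca.transform(Xtr), ytr).score(pca.transform(Xte), yte)
            # FLD branch: LDA allows at most (n_classes - 1) components,
            # so d is clamped to 9 for the 10-subject database
            lda = LinearDiscriminantAnalysis(
                n_components=min(d, len(classes) - 1)).fit(Ltr, ytr)
            acc_fld[r, j] = KNeighborsClassifier(n_neighbors=1).fit(
                lda.transform(Ltr), ytr).score(lda.transform(Lte), yte)
    return 100 * acc_pca.mean(axis=0), 100 * acc_fld.mean(axis=0)
```

The two returned curves can then be drawn on one matplotlib figure, e.g. `plt.plot(ds, pca_acc, label='PCA')`, `plt.plot(ds, fld_acc, label='FLD')`, followed by `plt.xlabel('d')`, `plt.ylabel('recognition accuracy (%)')`, and `plt.legend()`, which satisfies the legend/xlabel/ylabel requirements above.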




