I am trying to implement a Laplacian eigenmaps algorithm, which involves:
1) constructing a graph (I use kNN: there is an edge to each of the k nearest neighbours)
2) associating each edge with a weight
3) defining the diagonal degree matrix (i.e. the row sums placed on the diagonal)
4) performing a generalized eigendecomposition (it should be Lv = lambda*Dv, where L and D are computed in the code below)
I think this can somehow be solved with scipy.linalg.eig(vals), but I don't understand how to pass in my two matrices correctly. Could someone help me understand how to perform the generalized eigendecomposition step?
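For the generalized step itself, scipy.linalg.eig accepts a second matrix b and then solves a v = lambda * b v. A minimal sketch with a hypothetical 2x2 pair (not the matrices from the question) to show how the two matrices are passed:

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical 2x2 example: solve a v = lambda * b v by passing
# b as the second argument of scipy.linalg.eig
a = np.array([[2.0, 0.0],
              [0.0, 3.0]])
b = np.array([[1.0, 0.0],
              [0.0, 2.0]])

eigvals, eigvecs = eig(a, b)     # generalized eigenproblem a v = lambda b v
eigvals = np.sort(eigvals.real)  # eig returns complex values; here they are real
print(eigvals)                   # eigenvalues are a[i,i]/b[i,i]: 1.5 and 2.0
```

For the Laplacian eigenmaps step, L would go in as the first argument and the degree matrix D as the second.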
import numpy as np
from math import exp
from scipy.spatial import distance

def rweights(shape):
    # random data matrix with entries in [0, 1)
    return np.random.random_sample(shape)

def vEuclidean(v, m):
    # Euclidean distance from vector v to every row of m
    return np.apply_along_axis(lambda x: distance.euclidean(v, x), 1, m)

def mEuclideans(m):
    # pairwise Euclidean distance matrix
    return np.apply_along_axis(lambda v: vEuclidean(v, m), 1, m)

def neighbours(vector, neigh):
    # zero out the (n - neigh - 1) largest distances, keeping the
    # neigh nearest neighbours (the self-distance is already 0)
    size = vector.shape[0] - neigh
    for i in range(1, size):
        vector[np.argmax(vector)] = 0.0
    return vector

def kNN(m, k):
    me = mEuclideans(m)
    return np.array([neighbours(v, k) for v in me])

def diag(m):
    # degree matrix: row sums on the diagonal
    return np.diag(np.sum(m, axis=1))

def vectorWeight(v, sigma):
    # heat-kernel weight; note exp(0) == 1, so the zeroed (non-edge)
    # entries also receive weight 1
    f = lambda x: exp(-x / sigma ** 2)
    for i in range(v.shape[0]):
        v[i] = f(v[i])
    return v

def weight(m):
    return np.apply_along_axis(lambda v: vectorWeight(v, 0.5), 1, m)

if __name__ == "__main__":
    np.random.seed(666)
    m = rweights((5, 3))
    w = weight(kNN(m, 2))
    D = diag(w)
    L = D - w
Best answer
Well, this answer was helped along by Warren (so he deserves the credit), but I also found a video on spectral clustering, https://www.youtube.com/watch?v=Ln0mgyvXNQE , in which the Laplacian of a graph is used. I thought it would be good to check my implementation against his results. So I added:
from scipy.linalg import eig

def distanceM():
    return np.array([[0.0, 0.8, 0.6, 0.1, 0.0, 0.0],
                     [0.8, 0.0, 0.9, 0.0, 0.0, 0.0],
                     [0.6, 0.9, 0.0, 0.0, 0.0, 0.2],
                     [0.1, 0.0, 0.0, 0.0, 0.6, 0.7],
                     [0.0, 0.0, 0.0, 0.6, 0.0, 0.8],
                     [0.0, 0.0, 0.2, 0.7, 0.8, 0.0]])

if __name__ == "__main__":
    w = distanceM()
    D = diag(w)
    L = D - w
    w, vr = eig(L)
    print(vr)
I found that I got the same Laplacian matrix as in the video, and the same eigenvectors (the second column of vr).
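The answer above checks the ordinary decomposition eig(L); to perform the generalized decomposition Lv = lambda*Dv from the question on the same toy graph, the degree matrix can be passed as the second argument of eig. A self-contained sketch, rebuilding W, D and L from the answer's distance matrix:

```python
import numpy as np
from scipy.linalg import eig

# Weight matrix from the spectral-clustering video example
W = np.array([[0.0, 0.8, 0.6, 0.1, 0.0, 0.0],
              [0.8, 0.0, 0.9, 0.0, 0.0, 0.0],
              [0.6, 0.9, 0.0, 0.0, 0.0, 0.2],
              [0.1, 0.0, 0.0, 0.0, 0.6, 0.7],
              [0.0, 0.0, 0.0, 0.6, 0.0, 0.8],
              [0.0, 0.0, 0.2, 0.7, 0.8, 0.0]])

D = np.diag(W.sum(axis=1))  # degree matrix
L = D - W                   # graph Laplacian

# Generalized eigenproblem L v = lambda D v: D goes in as the b argument
eigvals, eigvecs = eig(L, D)
eigvals = eigvals.real      # symmetric problem, so the eigenvalues are real
order = np.argsort(eigvals)
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# For a connected graph the smallest eigenvalue is 0,
# with a constant eigenvector (since L @ ones == 0)
print(eigvals)
```

For symmetric L and positive-definite D, scipy.linalg.eigh(L, D) would also work and returns real, sorted eigenvalues directly.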
The original question, "python - How to do a generalized eigendecomposition here?", can be found on Stack Overflow: https://stackoverflow.com/questions/34184562/