
UFLDL Exercises (PCA and Whitening && Softmax Regression)

2013-03-21 


I was stuck on softmax for two days; the reason was that I had accidentally modified the main script.

As usual, I'm just posting the code. If you need more background, go read the UFLDL tutorial.

The results are the same as in UFLDL, so I won't repeat the figures. PS: the code is MATLAB, not Python.

Softmax Regression:

softmaxCost.m

function [cost, grad] = softmaxCost(theta, numClasses, inputSize, lambda, data, labels)
% numClasses - the number of classes
% inputSize  - the size N of the input vector
% lambda     - weight decay parameter
% data       - the N x M input matrix, where each column data(:, i) corresponds to
%              a single test set
% labels     - an M x 1 matrix containing the labels corresponding for the input data

% Unroll the parameters from theta
theta = reshape(theta, numClasses, inputSize);
numCases = size(data, 2);

groundTruth = full(sparse(labels, 1:numCases, 1));  % numClasses x M indicator matrix
cost = 0;
thetagrad = zeros(numClasses, inputSize);

M = theta*data;                         % (numClasses,N)*(N,M)
M = bsxfun(@minus, M, max(M, [], 1));   % subtract each column's max for numerical stability
h = exp(M);
h = bsxfun(@rdivide, h, sum(h));        % column-wise softmax probabilities
cost = -1/numCases*sum(sum(groundTruth.*log(h))) + lambda/2*sum(sum(theta.^2));
thetagrad = -1/numCases*((groundTruth-h)*data') + lambda*theta;

% The following is the non-vectorized version of the key part:
% for i = 1:numCases
%     s = groundTruth(:,i).*log(h(:,i));
%     cost = cost + sum(s);
% end
% cost = cost*(-1)/numCases + lambda/2*sum(sum(theta.^2));
% for i = 1:numClasses
%     for j = 1:numCases
%         k = ((groundTruth(:,j)-h(:,j))*data(:,j)');
%         thetagrad(i,:) = thetagrad(i,:) + k(i,:);
%     end
%     thetagrad(i,:) = -thetagrad(i,:)/numCases + lambda*theta(i,:);
% end

grad = [thetagrad(:)];
end
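For illustration only (the exercise itself is in MATLAB), here is how the same vectorized cost and gradient could be sketched in NumPy. The function name `softmax_cost` and its argument layout are my own; the math mirrors the MATLAB code above, including the max-subtraction trick for numerical stability.

```python
import numpy as np

def softmax_cost(theta, num_classes, input_size, lam, data, labels):
    """Sketch of a NumPy port of softmaxCost.m.

    data   : (input_size, m) array, one example per column
    labels : length-m array of class indices in 0..num_classes-1
    """
    theta = theta.reshape(num_classes, input_size)
    m = data.shape[1]

    # Indicator matrix: ground_truth[c, i] = 1 iff labels[i] == c
    ground_truth = np.zeros((num_classes, m))
    ground_truth[labels, np.arange(m)] = 1.0

    scores = theta @ data                        # (num_classes, m)
    scores -= scores.max(axis=0, keepdims=True)  # numerical stability
    h = np.exp(scores)
    h /= h.sum(axis=0, keepdims=True)            # column-wise softmax

    cost = -np.sum(ground_truth * np.log(h)) / m + lam / 2 * np.sum(theta ** 2)
    grad = -((ground_truth - h) @ data.T) / m + lam * theta
    return cost, grad.ravel()
```

A quick finite-difference gradient check (as the UFLDL tutorial recommends) is a good way to confirm a port like this agrees with the analytic gradient.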

