Semantic segmentation on aerial image #39

Open
wjNam opened this issue Oct 7, 2016 · 1 comment
wjNam commented Oct 7, 2016

Hi, I'm studying semantic segmentation on aerial images.

I'm trying to reimplement the Caffe source code from this GitHub repository:
https://github.com/mitmul/ssai

I followed the framework described in the paper, but it doesn't work.

The input is a 64x64x3 patch, and the ground truth is the 16x16x3 center part of the input image.
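
For context, this is roughly how one input/label pair is built (the file names and the way I collapse the three ground-truth channels to class indices are only illustrative):

% Sketch: build one input/label pair from a 64x64x3 patch and its
% ground-truth image (illustrative file names).
patch = single(imread('patch.png')) ;   % 64x64x3 aerial patch
gt    = imread('patch_gt.png') ;        % 64x64x3 ground-truth image
c     = 25:40 ;                         % the 16x16 center of a 64x64 patch
gtCenter = gt(c, c, :) ;                % 16x16x3 center crop
% Collapse the three channels {building, roads, others} into a single
% 16x16 map of class indices 1..3 for the softmaxlog loss.
[~, label] = max(single(gtCenter), [], 3) ;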

I found another paper that applied this task to a VGG model, so I reimplemented it with the VGG framework.

But that doesn't work either.

The classes are {building, roads, others}, and I use the softmaxlog loss in vl_nnloss.
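
As far as I understand the vl_nnloss interface, the prediction should be an H x W x C x N array and the labels should be class indices; a standalone sanity check with my sizes (16x16 output, 3 classes) would look something like this:

% Sketch: softmaxlog loss on a random 16x16 prediction with 3 classes.
pred  = randn(16, 16, 3, 1, 'single') ;   % class scores
label = randi(3, 16, 16, 1, 1) ;          % class indices in {1,2,3}
loss  = vl_nnloss(pred, label, [], 'loss', 'softmaxlog') ;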

Below is my model-initialization code.

function net = init_building(varargin)
opts.networkType = 'dagnn' ;
opts = vl_argparse(opts, varargin) ;

lr = [.1 .2] ;  % learning-rate multipliers for the [filters, biases] of each conv layer
drop1 = struct('name', 'dropout1', 'type', 'dropout', 'rate' , 0.5) ;
drop2 = struct('name', 'dropout2', 'type', 'dropout', 'rate' , 0.5) ;
% Define a VGG-16-style network (simplenn format, converted to DagNN below)
net.layers = {} ;

% Block 1
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,3,64, 'single'), zeros(1, 64, 'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,64,64, 'single'), zeros(1,64,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', [2 2], ...
                           'pad', [0 0 0 0]) ; % Emulate caffe

% Block 2
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,64,128, 'single'), zeros(1,128,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,128,128, 'single'), zeros(1,128,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', [2 2], ...
                           'pad', [0 0 0 0]) ; % Emulate caffe

% Block 3
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,128,256, 'single'), zeros(1,256,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,256,256, 'single'), zeros(1,256,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,256,256, 'single'), zeros(1,256,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', [2 2], ...
                           'pad', [0 0 0 0]) ; % Emulate caffe
% Block 4

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,256,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,512,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,512,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', [2 2], ...
                           'pad', [0 0 0 0]) ; % Emulate caffe
% Block 5
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,512,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,512,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(3,3,512,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1], ...
                           'pad', [1 1 1 1]) ;
net.layers{end+1} = struct('type', 'relu') ;

net.layers{end+1} = struct('type', 'pool', ...
                           'method', 'max', ...
                           'pool', [2 2], ...
                           'stride', [2 2], ...
                           'pad', [0 0 0 0]) ; % Emulate caffe
% Block 6: the VGG fully connected layers emulated as convolutions
% (the feature map is 2x2 after five 2x2 poolings of a 64x64 input,
% so fc6 uses a 2x2 kernel)
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(2,2,512,4096, 'single'), zeros(1,4096,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1],...
                           'pad', [0 0 0 0]) ;
net.layers{end+1} = struct('type', 'relu') ;
net.layers{end+1} = drop1;
net.layers{end+1} = struct('type', 'conv', ...
                           'weights', {{0.01*randn(1,1,4096,4096, 'single'), zeros(1,4096,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1],...
                           'pad', [0 0 0 0]) ;
net.layers{end+1} = struct('type', 'relu') ;
%net.layers{end+1} = drop2;
% fc8: final prediction layer (1x1 conv, 512 output channels)
net.layers{end+1} = struct('name','fc8','type', 'conv', ...
                           'weights', {{0.01*randn(1,1,4096,512, 'single'), zeros(1,512,'single')}}, ...
                           'learningRate', lr, ...
                           'stride', [1 1],...
                           'pad', [0 0 0 0]) ;
% Loss layer (placeholder; removed after the DagNN conversion and replaced below)
net.layers{end+1} = struct('name', 'prob','type', 'softmaxloss') ;

% Meta parameters
net.meta.inputSize = [16 16 3] ;
net.meta.trainOpts.learningRate = 0.0001 * ones(1,50) ;
net.meta.trainOpts.weightDecay = 0.0005 ;
net.meta.trainOpts.batchSize = 100 ;
net.meta.trainOpts.numEpochs = numel(net.meta.trainOpts.learningRate) ;


% Convert to DagNN and attach the custom loss and accuracy layers
net = dagnn.DagNN.fromSimpleNN(net, 'canonicalNames', true) ;

net.removeLayer('prob') ;

net.addLayer('objective', ...
  SegmentationLoss_building('loss', 'softmaxlog'), ...
  {'prediction', 'label'}, 'objective') ;

% Add accuracy layer
net.addLayer('accuracy', ...
    SegmentationAccuracy2(), ...
    {'prediction', 'label'}, 'accuracy') ;
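
For completeness, this is roughly how I train it, using MatConvNet's example trainer cnn_train_dag (the imdb fields and the getBatch function are only a schematic of my own data loading):

% Sketch: training invocation (imdb layout and getBatch are illustrative).
net  = init_building() ;
info = cnn_train_dag(net, imdb, @getBatch, ...
                     net.meta.trainOpts, 'expDir', 'exp/building') ;

% getBatch (in its own file) returns the cell-array format DagNN expects:
function inputs = getBatch(imdb, batch)
images = imdb.images.data(:,:,:,batch) ;    % 64x64x3xN input patches
labels = imdb.images.labels(:,:,:,batch) ;  % 16x16x1xN class indices (1..3)
inputs = {'input', images, 'label', labels} ;
end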

Is there a problem in my initialization? I'm really stuck on this.

Please help me.

@daofeng2007

What do you mean by "not working"? Did it produce incorrect results, or did the training procedure throw an error?
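
For example, a single forward pass on a dummy batch would tell you whether it crashes or just gives a poor objective (a sketch, reusing the variable names from the DagNN you built above):

% Sketch: one forward pass with dummy data.
im = randn(64, 64, 3, 4, 'single') ;   % dummy batch of four 64x64x3 patches
lb = randi(3, 16, 16, 1, 4) ;          % dummy 16x16 labels with classes 1..3
net.eval({'input', im, 'label', lb}) ;
obj = net.vars(net.getVarIndex('objective')).value ;
fprintf('objective = %g\n', obj) ;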
