Loading HDF5 data larger than 2 GB with Caffe's HDF5 layer

Problem:

Datatype class: H5T_FLOAT,

Check failed: error == cudaSuccess (2 vs. 0) out of memory.

The HDF5 layer can load at most 2 GB per file; anything larger fails with the errors above [1].

Solution:

Some people use h5repart -m1g to split the dataset into multiple files of a fixed size, but when I tried it the data ended up truncated [1].
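For reference, that invocation looks roughly like the line below; the -m1g member size comes from the report above, while the destination name with a printf-style %d (one member file per index) is my assumption about typical h5repart usage and may vary with the HDF5 tools version:

     h5repart -m1g data/train.h5 data/train_part%d.h5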

Following store2hdf5.m from Caffe, I wrote a small splitting script myself: it first reads the data out of the original file, then writes it back out in batches.

 % Split data/train.h5 (8000 samples here) into 1000-sample files data/train0.h5 ... data/train7.h5
 num = 0;
 for i = 1:1000:8000
     % Read a 1000-sample slice of data and labels starting at sample i (4th dimension)
     data  = h5read('data/train.h5', '/data',  [1,1,1,i], [64,64,48,1000]);
     label = h5read('data/train.h5', '/label', [1,1,1,i], [64,64,1,1000]);
     
     count = size(data,4);
     chunksz = 100;          % samples written per store2hdf5 call
     created_flag = false;   % create the output file on the first write only
     totalct = 0;            % samples already written to the current output file
     
     savepath = ['data/train', num2str(num), '.h5'];
     num = num + 1;
     
     % Append the slice to the new file, chunksz samples at a time
     for batchno = 1:floor(count/chunksz)
         last_read = (batchno-1)*chunksz;
         batchdata = data(:,:,:,last_read+1:last_read+chunksz);
         batchlabs = label(:,:,1,last_read+1:last_read+chunksz);
         
         % Write position (4th dim) for data and labels in the output file
         startloc = struct('dat',[1,1,1,totalct+1], 'lab',[1,1,1,totalct+1]);
         curr_dat_sz = store2hdf5(savepath, batchdata, batchlabs, ~created_flag, startloc, chunksz);
         created_flag = true;
         totalct = curr_dat_sz(end);
     end
     h5disp(savepath);
 end

Adjust the data dimensions and dataset names in the script as needed.
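Once the data is split across several files, Caffe's HDF5Data layer can consume all of them: its source parameter points to a plain text file listing one HDF5 file per line, and the layer cycles through them during training. A minimal sketch, where the list-file path, layer names, and batch size are placeholders for this example:

 data/train_h5_list.txt:

     data/train0.h5
     data/train1.h5
     ...
     data/train7.h5

 Layer definition in the network prototxt:

     layer {
       name: "data"
       type: "HDF5Data"
       top: "data"
       top: "label"
       include { phase: TRAIN }
       hdf5_data_param {
         source: "data/train_h5_list.txt"
         batch_size: 50
       }
     }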

Ref:

[1] https://github.com/BVLC/caffe/issues/1470