EDIT: I've updated all the code to organize this question; it's the same issue and question, though.
import glob
import random

import numpy as np
import scipy as sp
import scipy.misc  # makes sp.misc available
from keras import backend as K
from skimage import color, io

# model (VGG), KNN, sgd, and Colorize are defined earlier in the script (omitted here).

def extract_hypercolumn(model, layer_indexes, instance):
    # Grab the feature maps of the requested layers for a single image
    # and upsample each one back to the 224x224 input resolution.
    layers = [model.layers[li].output for li in layer_indexes]
    get_feature = K.function([model.layers[0].input], layers)
    assert instance.shape == (1, 3, 224, 224)
    feature_maps = get_feature([instance])
    hypercolumns = []
    for convmap in feature_maps:
        for fmap in convmap[0]:
            upscaled = sp.misc.imresize(fmap, size=(224, 224),
                                        mode="F", interp='bilinear')
            hypercolumns.append(upscaled)
    return np.asarray(hypercolumns)

def get_arrays(each_file):
    # L channel is the input; the a/b channels are the labels.
    img = color.rgb2lab(io.imread(each_file)[..., :3])
    X = img[:, :, :1]
    y = img[:, :, 1:]
    X_rows, X_columns, X_channels = X.shape
    y_rows, y_columns, y_channels = y.shape
    # Channels-first, add a batch dimension, and tile the grayscale L channel
    # to three channels so it can be fed through VGG.
    X_channels_first = np.transpose(X, (2, 0, 1))
    X_sample = np.expand_dims(X_channels_first, axis=0)
    X_3d = np.tile(X_sample, (1, 3, 1, 1))
    hc = extract_hypercolumn(model, [3, 8], X_3d)
    hc_expand_dims = np.expand_dims(hc, axis=0)
    y_reshaped = np.reshape(y, (y_rows * y_columns, y_channels))
    classed_pixels_first = KNN.predict_proba(y_reshaped)
    classed_classes_first = np.transpose(classed_pixels_first, (1, 0))
    classed_expand_dims = np.expand_dims(classed_classes_first, axis=0)
    print "hypercolumn shape: ", hc_expand_dims.shape, "classified output color shape: ", classed_expand_dims.shape
    return hc_expand_dims, classed_expand_dims

def generate_batch():
    files = glob.glob('../manga-resized/sliced/*.png')
    while True:
        random.shuffle(files)
        for fl in files:
            yield get_arrays(fl)

colorize = Colorize()
colorize.compile(optimizer=sgd, loss='categorical_crossentropy', metrics=["accuracy"])
colorize.fit_generator(generate_batch(), samples_per_epoch=1, nb_epoch=5)
Here is the traceback (using TensorFlow):
Using TensorFlow backend.
output shape: (None, 112, 228, 228)
output_shape after reshaped: (None, 112, 51984)
Epoch 1/5
Traceback (most recent call last):
  File "load.py", line 152, in <module>
    colorize.fit_generator(generate_batch(),samples_per_epoch=1,nb_epoch=5)
  File "/Users/alex/anaconda2/lib/python2.7/site-packages/keras/models.py", line 651, in fit_generator
    max_q_size=max_q_size)
  File "/Users/alex/anaconda2/lib/python2.7/site-packages/keras/engine/training.py", line 1358, in fit_generator
    'or (x, y). Found: ' + str(generator_output))
Exception: output of generator should be a tuple (x, y, sample_weight) or (x, y). Found: None
Exception in thread Thread-1:
Traceback (most recent call last):
  File "/Users/alex/anaconda2/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/Users/alex/anaconda2/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/Users/alex/anaconda2/lib/python2.7/site-packages/keras/engine/training.py", line 404, in data_generator_task
    generator_output = next(generator)
StopIteration
And using Theano - note that here the hypercolumn and the classed labels are successfully printed - it seems like this is closer to working:
UPDATE: it works using Theano! I'm satisfied. However, the question still stands with TensorFlow, I guess.
Now, when I try:
for a, b in generate_batch():
    print(a, b)
or
from itertools import islice
print list(islice(generate_batch(), 3))
EDIT: New development - they work!
Both work perfectly - at least they print out numpy arrays rather than erroring out. However, the Keras issue remains.
This makes me wonder if I am simply running into a limitation of Keras. Since there is so much preprocessing of the data - feeding the image through VGG, extracting the hypercolumns, performing the KNN classification on the labels, etc. - fit_generator is doing a ton of work to produce each batch. Perhaps it is too much, and Keras just sees the return value as empty because of the memory/bandwidth it takes.
I know TensorFlow, for instance, has an entire queueing system built out for this exact issue. It would be awesome to know if this is what I'm experiencing, as opposed to an implementation error. Any Keras experts out there care to weigh in? :)
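One workaround I'm considering, to test the too-much-work-per-batch theory: precompute the expensive VGG + KNN preprocessing once and cache the arrays to disk, so the generator handed to fit_generator only does cheap np.load calls. This is just a sketch - the features/ directory and the two function names are made up, and it reuses get_arrays from above:

import glob
import os
import random
import numpy as np

def precompute_features():
    # One-off pass: run the heavy VGG forward pass + KNN labeling per image
    # and cache the resulting arrays (assumes the features/ directory exists).
    for each_file in glob.glob('../manga-resized/sliced/*.png'):
        X, y = get_arrays(each_file)
        base = os.path.splitext(os.path.basename(each_file))[0]
        np.save('../manga-resized/features/%s_X.npy' % base, X)
        np.save('../manga-resized/features/%s_y.npy' % base, y)

def generate_batch_cached():
    # Same infinite-generator contract as generate_batch, but each batch
    # is now just two disk reads instead of a full VGG + KNN pipeline.
    x_files = glob.glob('../manga-resized/features/*_X.npy')
    while True:
        random.shuffle(x_files)
        for xf in x_files:
            yield np.load(xf), np.load(xf.replace('_X.npy', '_y.npy'))

For what it's worth, fit_generator also has a max_q_size argument (you can see it in the traceback above) that controls how many batches the background thread pre-queues.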
1 Answer
The generator passed to fit_generator should be infinite (i.e. it should loop over its data indefinitely). Without a while True loop, the generator is exhausted after a single pass over the files and raises StopIteration.
Cf. the Keras documentation on fit_generator:
The generator is expected to loop over its data indefinitely.
Try changing your generate_batch function to:
def generate_batch():
    files = glob.glob('../manga-resized/sliced/*.png')
    while True:
        random.shuffle(files)
        for fl in files:
            yield get_arrays(fl)
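You can sanity-check that the generator is now infinite before handing it to fit_generator - a throwaway check using the same islice trick as in your question:

from itertools import islice

gen = generate_batch()
for X, y in islice(gen, 3):
    # with the while True loop, next(gen) never raises StopIteration
    print X.shape, y.shape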
Also:
I think the problem in your code comes from the line
y_reshaped = (y,(y_rows*y_columns,y_channels))
This line doesn't do any reshaping at all. It just creates a tuple with two elements: the numpy array y and the tuple (y_rows*y_columns, y_channels).
I suppose you should write something like
y_reshaped = np.reshape(y,(y_rows*y_columns,y_channels))
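A quick way to see the difference (plain numpy, nothing specific to this project):

import numpy as np

y = np.arange(8).reshape(2, 2, 2)   # stand-in for the (rows, columns, channels) array

wrong = (y, (4, 2))                  # a 2-tuple: (array, shape tuple) - no reshape happens
right = np.reshape(y, (4, 2))        # an actual (4, 2) array

print type(wrong)                    # <type 'tuple'>
print right.shape                    # (4, 2)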