
Commit

fixing conflict
jakeret committed Mar 27, 2017
1 parent a4601fa commit ecf67c9
Showing 5 changed files with 143 additions and 156 deletions.
3 changes: 2 additions & 1 deletion AUTHORS.rst
@@ -5,12 +5,13 @@ Credits
Development Lead
----------------

- * Joel Akeret <[email protected]>
+ * Joel Akeret <[email protected]>

Contributors
------------

* `@FelixGruen <https://github.com/FelixGruen>`_
+ * `@ameya005 <https://github.com/ameya005>`_

Citations
---------
111 changes: 66 additions & 45 deletions demo/demo_radio_data.ipynb

Large diffs are not rendered by default.

178 changes: 71 additions & 107 deletions demo/demo_toy_problem.ipynb

Large diffs are not rendered by default.

3 changes: 2 additions & 1 deletion requirements.txt
@@ -1,3 +1,4 @@
click
numpy
-Pillow
+Pillow
+tensorflow>=1.0.0
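For context, the new pin can be checked against the installed package before running the notebooks; a minimal sketch (the version comparison via distutils is an assumption for illustration, not part of the commit):

# Sketch: verify the installed TensorFlow satisfies the new requirement.
import tensorflow as tf
from distutils.version import LooseVersion

assert LooseVersion(tf.__version__) >= LooseVersion("1.0.0"), \
    "tf_unet now expects TensorFlow >= 1.0.0 (tf.mul was renamed to tf.multiply)"
print("TensorFlow", tf.__version__)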
4 changes: 2 additions & 2 deletions tf_unet/unet.py
@@ -215,11 +215,11 @@ def _get_cost(self, logits, cost_name, cost_kwargs):
if class_weights is not None:
class_weights = tf.constant(np.array(class_weights, dtype=np.float32))

-weight_map = tf.mul(flat_labels, class_weights)
+weight_map = tf.multiply(flat_labels, class_weights)
weight_map = tf.reduce_sum(weight_map, axis=1)

loss_map = tf.nn.softmax_cross_entropy_with_logits(flat_logits, flat_labels)
-weighted_loss = tf.mul(loss_map, weight_map)
+weighted_loss = tf.multiply(loss_map, weight_map)

loss = tf.reduce_mean(weighted_loss)

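The renamed ops follow the TensorFlow 1.0 API, where tf.mul became tf.multiply. Below is a minimal sketch of the class-weighted cross-entropy in the new style; the placeholder tensors, shapes, and example weights stand in for the values _get_cost receives and are assumptions for illustration only:

# Sketch: class-weighted softmax cross-entropy with TensorFlow >= 1.0 ops.
import numpy as np
import tensorflow as tf

flat_logits = tf.placeholder(tf.float32, [None, 2])   # hypothetical [pixels, classes]
flat_labels = tf.placeholder(tf.float32, [None, 2])    # hypothetical one-hot labels
class_weights = tf.constant(np.array([0.2, 0.8], dtype=np.float32))

weight_map = tf.multiply(flat_labels, class_weights)   # was tf.mul before 1.0
weight_map = tf.reduce_sum(weight_map, axis=1)

loss_map = tf.nn.softmax_cross_entropy_with_logits(logits=flat_logits,
                                                    labels=flat_labels)
weighted_loss = tf.multiply(loss_map, weight_map)      # was tf.mul before 1.0

loss = tf.reduce_mean(weighted_loss)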
