A small question regarding conv_cond_concat #40

Open
miqbal23 opened this issue Aug 27, 2018 · 0 comments

Comments

@miqbal23
Hi, I've recently been studying your code, especially the conditional DCGAN you made for the MNIST dataset.

I see that you concatenate the condition onto every layer right after BatchNorm and ReLU, but I'm still puzzled by the conv_cond_concat function you use to concatenate the condition into the hidden layers. On some layers you simply use T.concatenate to join them, but on others you join them using the conv_cond_concat function shown below:

def conv_cond_concat(x, y):
    """ 
    concatenate conditioning vector on feature map axis 
    """
    return T.concatenate([x, y*T.ones((x.shape[0], y.shape[1], x.shape[2], x.shape[3]))], axis=1)

My questions are:

  • Why use this function instead of a simple T.concatenate?
  • Judging from the reshaping of y, I assume you are depth-concatenating it (along the channel axis). Am I correct?
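For context, here is my understanding of what the function does, sketched in NumPy instead of Theano (the shapes are my assumption: x is a feature map of shape (batch, c_x, H, W) and y is a condition tensor of shape (batch, c_y, 1, 1), e.g. one-hot labels):

```python
import numpy as np

def conv_cond_concat(x, y):
    """Tile the condition y over x's spatial dims, then join on the channel axis.

    x: feature map, shape (batch, c_x, H, W)
    y: condition,   shape (batch, c_y, 1, 1)
    returns: shape (batch, c_x + c_y, H, W)
    """
    # Multiplying y by a ones tensor broadcasts it across H and W,
    # so each condition channel becomes a constant feature map.
    tiled = y * np.ones((x.shape[0], y.shape[1], x.shape[2], x.shape[3]),
                        dtype=x.dtype)
    return np.concatenate([x, tiled], axis=1)  # depth/channel concatenation

# Hypothetical example: batch of 2, 8 feature channels, 4x4 maps,
# conditioned on 10-class one-hot labels.
x = np.random.rand(2, 8, 4, 4).astype("float32")
y = np.eye(10, dtype="float32")[[3, 7]].reshape(2, 10, 1, 1)
out = conv_cond_concat(x, y)
print(out.shape)  # (2, 18, 4, 4)
```

If this reading is right, the function is needed precisely because x and y have different spatial shapes, so a plain T.concatenate on axis=1 would fail until y is broadcast up to (batch, c_y, H, W).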