
Changing the ReLU function on MobileNetV2 #10

Open
dagnichz opened this issue Sep 24, 2019 · 0 comments
I'm trying to do guided backpropagation on MobileNetV2 using your code with TensorFlow 1.14. Unfortunately, TensorFlow does not respond to the gradient override you suggested:

import tensorflow as tf
from tensorflow.python.framework import ops
from tensorflow.python.ops import gen_nn_ops

@ops.RegisterGradient("GuidedRelu")
def _GuidedReluGrad(op, grad):
    return tf.where(0. < grad, gen_nn_ops.relu_grad(grad, op.outputs[0]), tf.zeros(grad.get_shape()))

I also tried changing the override line to "Relu6" (since MobileNetV2 uses ReLU6 activations), but still nothing:

with graph.gradient_override_map({'Relu6': 'GuidedRelu'}):
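
For completeness, here is the full variant I'm experimenting with. This is only a sketch of my attempt, not working code: I'm assuming gen_nn_ops.relu6_grad is the right backing op for the Relu6 gradient, that the override only affects ops created while the context manager is active (so the network has to be rebuilt inside it), and build_mobilenet_v2 / input_images / class_idx are placeholder names for my own model-building code:

import tensorflow as tf
from tensorflow.python.framework import ops
from tensorflow.python.ops import gen_nn_ops

# Guided variant of the Relu6 gradient: relu6_grad masks positions
# where the forward output was outside (0, 6); tf.where additionally
# zeroes positions where the incoming gradient is non-positive.
@ops.RegisterGradient("GuidedRelu6")
def _GuidedRelu6Grad(op, grad):
    return tf.where(0. < grad,
                    gen_nn_ops.relu6_grad(grad, op.outputs[0]),
                    tf.zeros_like(grad))

graph = tf.get_default_graph()
# gradient_override_map only affects ops *created* inside this block,
# so I try to (re)build the MobileNetV2 graph here rather than load it beforehand.
with graph.gradient_override_map({'Relu6': 'GuidedRelu6'}):
    logits = build_mobilenet_v2(input_images)  # placeholder for my model code
grads = tf.gradients(logits[:, class_idx], input_images)[0]

Even with this, the computed gradients look identical to plain backprop.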

Maybe you tried doing it yourself on MobileNet and can offer a solution?
