Corrected a bug leading to infinite loops with flat objectives #1542

Open
wants to merge 1 commit into base: master
10 changes: 6 additions & 4 deletions pylearn2/optimization/batch_gradient_descent.py
@@ -350,9 +350,10 @@ def minimize(self, * inputs):
             if self.verbose:
                 logger.info('\t{0} {1}'.format(alpha, obj))
 
-            # Use <= rather than = so if there are ties
-            # the bigger step size wins
-            if obj <= best_obj:
+            # Should not use <= instead of < because, for a flat
+            # objective, this leads to an infinite loop due to the
+            # condition used to grow the step sizes
+            if obj < best_obj:
                 best_obj = obj
                 best_alpha = alpha
                 best_alpha_ind = ind
@@ -374,7 +375,8 @@
             alpha_list = [alpha / 3. for alpha in alpha_list]
             if self.verbose:
                 logger.info('shrinking the step size')
-        elif best_alpha_ind > len(alpha_list) - 2:
+        elif best_alpha_ind >= len(alpha_list) - 1:
+            # Grow the step size if the last step size was used
             alpha_list = [alpha * 2. for alpha in alpha_list]
             if self.verbose:
                 logger.info('growing the step size')
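The failure mode this PR fixes can be sketched in isolation. The following is a minimal, simplified model of the line search's tie-breaking logic, not pylearn2's actual API; `pick_best_alpha` and `strict` are hypothetical names. With a flat objective every candidate step size produces the same value, so under `<=` the largest alpha always wins the tie, the grow-the-step-size branch fires on every iteration, and the optimizer never terminates. Under `<` the first alpha wins instead, so the grow branch never triggers on a flat objective.

```python
def pick_best_alpha(alpha_list, objective, strict):
    """Return the index of the winning step size for one line search.

    strict=True uses `<` (the PR's fix); strict=False uses `<=`
    (the old behavior that loops forever on flat objectives).
    """
    best_obj = float('inf')
    best_ind = 0
    for ind, alpha in enumerate(alpha_list):
        obj = objective(alpha)
        better = (obj < best_obj) if strict else (obj <= best_obj)
        if better:
            best_obj = obj
            best_ind = ind
    return best_ind


# A flat objective: every step size looks equally good.
flat = lambda alpha: 1.0

alphas = [0.1, 0.2, 0.4]

# Old `<=` behavior: the last (largest) alpha wins the tie, so
# `best_alpha_ind >= len(alpha_list) - 1` holds and the optimizer
# doubles every alpha and repeats -- forever, since the objective
# never improves.
print(pick_best_alpha(alphas, flat, strict=False))  # index 2, the last

# With `<`, the first alpha wins, the grow branch never fires on a
# flat objective, and the search can make progress toward stopping.
print(pick_best_alpha(alphas, flat, strict=True))  # index 0, the first
```

The fix deliberately trades the old tie-breaking preference (bigger step wins) for termination: on a genuinely flat objective there is nothing to gain from a larger step anyway.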