
Add some new features for ANN #146

Open · wants to merge 5 commits into base: master

Conversation

ZhangYuef

To extend the general capabilities of the ANN model, I add the following new features:

  • Add a dropout layer
  • Add an embedding layer
  • Add two activation functions: relu and tanh
  • Fix some typos and incorrect descriptions in comments

Please review and comment on the code :)
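For context, this is a minimal sketch of how an inverted-dropout layer of the kind described here typically works; the class and method names are illustrative, not Alink's actual API, and the 0.2 rate mirrors the Iris experiment mentioned in the commits:

```java
import java.util.Arrays;
import java.util.Random;

// Illustrative inverted-dropout layer (hypothetical names, not Alink's API).
// During training each activation is zeroed with probability `rate` and the
// survivors are scaled by 1 / (1 - rate), so inference needs no rescaling.
class DropoutLayer {
    private final double rate;
    private final Random rnd;

    DropoutLayer(double rate, long seed) {
        if (rate < 0.0 || rate >= 1.0) {
            throw new IllegalArgumentException("dropout rate must be in [0, 1)");
        }
        this.rate = rate;
        this.rnd = new Random(seed);
    }

    double[] forward(double[] input, boolean training) {
        double[] out = new double[input.length];
        if (!training || rate == 0.0) {
            // At inference (or rate 0) dropout is the identity.
            System.arraycopy(input, 0, out, 0, input.length);
            return out;
        }
        double scale = 1.0 / (1.0 - rate);
        for (int i = 0; i < input.length; i++) {
            out[i] = rnd.nextDouble() < rate ? 0.0 : input[i] * scale;
        }
        return out;
    }

    public static void main(String[] args) {
        DropoutLayer layer = new DropoutLayer(0.2, 42L);
        double[] x = {1.0, 2.0, 3.0, 4.0};
        System.out.println(Arrays.toString(layer.forward(x, false))); // identity
        System.out.println(Arrays.toString(layer.forward(x, true)));  // randomly masked
    }
}
```

Scaling the survivors at training time (rather than scaling at inference) keeps the expected activation unchanged, which is the usual reason a rate of 0 reduces exactly to the plain MLP.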

- Add a dropout layer and its corresponding model.
- Tested on the Iris dataset: with a dropout rate of 0.2 the accuracy is 0.96; without the dropout layer (dropout rate = 0) the accuracy is 0.88.
- Add an embedding layer and its corresponding model for ANN.
- This is NOT a default setting for constructing the MLP feed-forward topology.
- Add relu and tanh activation methods.
- Use them in the feed-forward topology (path: `alink/operator/common/classification/ann/FeedForwardTopology.java`).
- Keep sigmoid as the default.
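An embedding layer like the one added here is, at its core, a trainable lookup table mapping integer token ids to dense vectors. The sketch below shows that lookup; all names are hypothetical and not taken from this PR's code:

```java
import java.util.Random;

// Illustrative embedding lookup (hypothetical names, not Alink's classes).
// The weight matrix has shape [vocabSize][embedDim]; forward() selects one
// row per token id, which is the operation the layer trains via backprop.
class EmbeddingLayer {
    private final double[][] weights;

    EmbeddingLayer(int vocabSize, int embedDim, long seed) {
        Random rnd = new Random(seed);
        weights = new double[vocabSize][embedDim];
        for (double[] row : weights) {
            for (int j = 0; j < embedDim; j++) {
                row[j] = (rnd.nextDouble() - 0.5) * 0.1; // small random init
            }
        }
    }

    // One dense vector per token id; ids must lie in [0, vocabSize).
    double[][] forward(int[] tokenIds) {
        double[][] out = new double[tokenIds.length][];
        for (int i = 0; i < tokenIds.length; i++) {
            out[i] = weights[tokenIds[i]].clone();
        }
        return out;
    }
}
```

Since the lookup output feeds into the dense layers of the MLP, it makes sense that (as the commit notes) this is opt-in rather than part of the default forward topology.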
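The three activations involved (relu and tanh added, sigmoid kept as the default) are all element-wise functions over a layer's pre-activations. A minimal sketch, with names chosen for illustration rather than matching `FeedForwardTopology.java`:

```java
// Illustrative element-wise activations (hypothetical helper, not Alink's API).
class Activations {
    static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }
    static double relu(double x)    { return Math.max(0.0, x); }
    static double tanh(double x)    { return Math.tanh(x); }

    // Apply the named activation to each pre-activation in z.
    static double[] apply(double[] z, String name) {
        double[] out = new double[z.length];
        for (int i = 0; i < z.length; i++) {
            switch (name) {
                case "relu": out[i] = relu(z[i]); break;
                case "tanh": out[i] = tanh(z[i]); break;
                default:     out[i] = sigmoid(z[i]); // sigmoid as the default
            }
        }
        return out;
    }
}
```

Dispatching on a name with sigmoid as the fallback mirrors the behavior described in the commits: existing models are unchanged unless a user explicitly selects relu or tanh.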
@CLAassistant

CLAassistant commented Sep 20, 2020

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.

@ZhangYuef ZhangYuef mentioned this pull request Sep 20, 2020
@LastBlackRose LastBlackRose mentioned this pull request Feb 9, 2022