How to get a binary bipolar activation function for output as +1 and -1 in keras?
I want y_pred to contain only +1 or -1: no intermediate real values, and not even zero.

from keras.models import Sequential
from keras.layers import Dense

classifier = Sequential()

# Adding the input layer and the first hidden layer
classifier.add(Dense(units = 6, kernel_initializer = 'uniform', activation = 'relu', input_shape = (22,)))

# Adding the second hidden layer
classifier.add(Dense(units = 6, kernel_initializer = 'uniform', activation = 'relu'))

# Adding the output layer
classifier.add(Dense(units = 1, kernel_initializer = 'uniform', activation = 'tanh'))

# Compiling the neural network
classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])

# Fitting the model
classifier.fit(x_train, y_train, batch_size = 10, epochs = 100)

# Predicting the test set results
y_pred = classifier.predict(x_test)

The values in y_pred fall anywhere in the range [-1, 1], but I expected each one to be exactly +1 or -1.
python-3.x machine-learning keras deep-learning anaconda
Set a threshold, say 0, anything above zero is 1 and below it is -1
– Oswald
Jan 3 at 7:22
@Oswald is there any modification I can make to the loss or activation parameters instead of post-processing y_pred with y_pred[y_pred > 0] = 1; y_pred[y_pred <= 0] = -1?
– Umesh Desai
Jan 3 at 7:51
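The thresholding suggested in the comments can be done in one vectorised step. This sketch assumes y_pred is a NumPy array of tanh outputs, like the one returned by classifier.predict in the question:

```python
import numpy as np

# Stand-in for classifier.predict(x_test): tanh outputs in [-1, 1]
y_pred = np.array([[0.73], [-0.12], [0.0], [0.95]])

# Map every value to +1 or -1; an exact 0 is sent to -1, so no zeros remain
y_pred_bipolar = np.where(y_pred > 0, 1, -1)
print(y_pred_bipolar.ravel())  # [ 1 -1 -1  1]
```

Unlike the pair of boolean-mask assignments in the comment, np.where handles both branches in a single pass and leaves the original y_pred untouched.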
edited Jan 3 at 7:49
Umesh Desai
asked Jan 3 at 7:19


2 Answers
To train properly, neural networks require a differentiable activation function that produces continuous, non-integer values. If you need strictly discrete output, you have to translate the output values yourself after prediction.
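One way to do that translation without leaving the model object is to wrap the trained network in an inference-only head that hardens the tanh output. A hard sign has zero gradient almost everywhere, so this wrapper is only usable for prediction, never for training. A sketch using tf.keras; the layer sizes mirror the question, but the wrapper itself is an addition, not part of this answer:

```python
import numpy as np
import tensorflow as tf

# Stand-in for the trained classifier: a tanh output head
trained = tf.keras.Sequential([
    tf.keras.layers.Dense(6, activation='relu', input_shape=(22,)),
    tf.keras.layers.Dense(1, activation='tanh'),
])

# Inference-only wrapper: harden the output to exactly +1 / -1 (0 maps to -1)
bipolar = tf.keras.Sequential([
    trained,
    tf.keras.layers.Lambda(lambda t: tf.where(t > 0, tf.ones_like(t), -tf.ones_like(t))),
])

x = np.random.rand(4, 22).astype('float32')
out = bipolar.predict(x, verbose=0)
print(sorted(np.unique(out)))  # only -1.0 and/or 1.0
```

tf.where is used instead of tf.sign so that an exact zero maps to -1 rather than 0, matching the question's "not even zero" requirement.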
answered Jan 3 at 7:59
Sami Hult
When you use binary_crossentropy loss in your code, the accompanying accuracy metric applies a threshold of 0.5 to the output: anything above 0.5 counts as 1 and anything below as 0. The predictions returned by predict are never thresholded, and unfortunately there is no easy way in Keras to change this behaviour; you will have to write your own loss function for -1/+1 targets.
Here is a Stack Overflow link that will guide you in doing that.
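As an alternative to a hand-written loss, Keras ships a built-in hinge loss that is designed for -1/+1 targets and pairs naturally with a tanh output. This is a suggested sketch, not part of the answer above; the final hard decision still needs a sign step after predict:

```python
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(6, activation='relu', input_shape=(22,)),
    tf.keras.layers.Dense(1, activation='tanh'),
])
# 'hinge' expects labels in {-1, +1}, unlike binary_crossentropy's {0, 1}
model.compile(optimizer='adam', loss='hinge')

# Toy data standing in for the question's x_train / y_train
x = np.random.rand(64, 22).astype('float32')
y = np.random.choice([-1.0, 1.0], size=(64, 1))
model.fit(x, y, epochs=1, verbose=0)

# Harden the continuous tanh outputs to exactly +1 / -1
y_pred = np.where(model.predict(x, verbose=0) > 0, 1, -1)
```

With hinge loss the labels can stay in {-1, +1} end to end, so no 0/1 relabelling is needed during training.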
answered Jan 3 at 8:30
Saket Kumar Singh