from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

model = Sequential()
# change your Dense layer to leave out the activation argument
model.add(Dense(90))
# then add the LeakyReLU activation as its own layer
model.add(LeakyReLU(alpha=0.05))
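For context, here is a minimal runnable sketch of the same pattern in a complete model. The input shape of (10,), the final Dense(1) output layer, and the compile settings are illustrative assumptions, not part of the original answer; this assumes a Keras 2-style API where LeakyReLU takes an alpha argument.

from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

# assumed toy setup: 10 input features, single regression output
model = Sequential()
model.add(Dense(90, input_shape=(10,)))   # no activation here
model.add(LeakyReLU(alpha=0.05))          # LeakyReLU applied as a separate layer
model.add(Dense(1))                       # illustrative output layer
model.compile(optimizer="adam", loss="mse")
model.summary()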