feat: ✨ Finish AI part
@@ -0,0 +1,15 @@
The activation function used for each neuron in the hidden and output layers is the \textit{sigmoid} (logistic) function,
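\[
\sigma(x) = \frac{1}{1 + e^{-x}},
\]
which is implemented by the \texttt{sigmoid} function: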
\begin{verbatim}
/* Logistic activation; exp() is declared in <math.h>. */
double sigmoid(double x) {
    /* Saturate for large |x|: sigmoid(20) is already within
       about 2e-9 of 1, so clamping skips a needless exp()
       call and avoids overflow for extreme negative inputs. */
    if (x > 20) return 1.0;
    if (x < -20) return 0.0;
    double z = exp(-x);
    return 1.0 / (1.0 + z);
}
\end{verbatim}
This function is bounded between 0 and 1, which keeps each neuron's activation in a normalized range. The derivative of the sigmoid, \texttt{sigmoid\_derivative}, is used in backpropagation to compute gradients. It relies on the identity
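\[
\sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr),
\]
so the derivative can be computed from the activation alone; the function therefore takes the already-computed value $\sigma(x)$ as its argument rather than the raw input: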
\begin{verbatim}
/* Expects an activation value already produced by sigmoid():
   for a = sigmoid(z), sigmoid'(z) = a * (1 - a). */
double sigmoid_derivative(double x) {
    return x * (1.0 - x);
}
\end{verbatim}
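To illustrate how the two functions interact, the following sketch computes the error term of a single output neuron under a mean-squared-error loss. The function name \texttt{output\_delta} and its parameters are illustrative and not taken from the project's code:

\begin{verbatim}
/* Hypothetical helper: error term (delta) of one output
   neuron under a mean-squared-error loss. */
double output_delta(double weighted_sum, double target) {
    double output = sigmoid(weighted_sum);  /* forward pass */
    /* Pass the activation, not weighted_sum, to the
       derivative, per the convention noted above. */
    return (target - output) * sigmoid_derivative(output);
}
\end{verbatim}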