feat: Finish AI part

2024-10-31 16:45:15 +01:00
parent aa45331010
commit 4b2ab03dab
10 changed files with 76 additions and 3 deletions


@@ -0,0 +1,15 @@
The activation function used for each neuron in the hidden and output layers is the \textit{sigmoid}, implemented by the \texttt{sigmoid} function:
\begin{verbatim}
double sigmoid(double x) {
    /* Requires <math.h> for exp(). Beyond |x| = 20 the sigmoid is
       already saturated, so return the limit and skip the exp call. */
    if (x > 20) return 1.0;
    if (x < -20) return 0.0;
    double z = exp(-x);
    return 1.0 / (1.0 + z);
}
\end{verbatim}
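In closed form, this computes
\[
\sigma(x) = \frac{1}{1 + e^{-x}}.
\]
The early returns for $|x| > 20$ only short-circuit the saturated tails: since $e^{-20} \approx 2 \times 10^{-9}$, the clamped result differs from the true value by less than $2 \times 10^{-9}$.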
The sigmoid is bounded between 0 and 1, so every neuron's activation stays in a normalized range. Its derivative, computed by \texttt{sigmoid\_derivative}, is used during backpropagation to obtain the gradients:
\begin{verbatim}
double sigmoid_derivative(double x) {
    /* x is expected to be an activation a = sigmoid(z), so this
       evaluates sigmoid'(z) = a * (1 - a) without recomputing exp. */
    return x * (1.0 - x);
}
\end{verbatim}
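Note that \texttt{sigmoid\_derivative} takes the \emph{activation} $a = \sigma(z)$ rather than the pre-activation $z$, exploiting the identity
\[
\sigma'(z) = \sigma(z)\bigl(1 - \sigma(z)\bigr) = a(1 - a),
\]
so the backward pass reuses values stored during the forward pass instead of recomputing the exponential. A minimal usage sketch (variable names are illustrative, not taken from the network code):
\begin{verbatim}
double z = 0.5;                   /* pre-activation of a neuron  */
double a = sigmoid(z);            /* forward pass: activation    */
double g = sigmoid_derivative(a); /* backward: a * (1 - a)       */
\end{verbatim}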