The article, part of an "Understanding" series, covers the implementation of logistic regression in PyTorch, distinguishing its binary and multi-class forms. Binary logistic regression handles binary output labels, using the sigmoid activation function and binary cross-entropy (BCE) loss; the article explains how these functions convert raw outputs to probabilities and how the loss is computed from them. Multi-class logistic regression, which handles more than two output classes, swaps in the softmax activation and cross-entropy loss, a generalization of BCE. The article also highlights the differences in neural network architecture between the binary and multi-class cases and walks through a practical implementation, comparing results computed with NumPy against PyTorch to give a comprehensive view of the concepts involved.
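The activation and loss functions the summary mentions can be sketched in plain NumPy; this is a minimal illustration of the math (PyTorch's `torch.sigmoid`, `nn.BCELoss`, `torch.softmax`, and `nn.CrossEntropyLoss` compute the same quantities), not code from the article itself:

```python
import numpy as np

def sigmoid(z):
    # Binary activation: squashes a raw score into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def bce(p, y):
    # Binary cross-entropy between predicted probabilities p and 0/1 labels y.
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def softmax(z):
    # Multi-class activation: maps a score vector to a probability distribution.
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(p, y):
    # Cross-entropy for integer class labels y; generalizes BCE beyond 2 classes.
    return -np.mean(np.log(p[np.arange(len(y)), y]))

# Binary case: raw scores for 3 samples, labels 1/0/1.
z = np.array([2.0, -1.0, 0.5])
y = np.array([1.0, 0.0, 1.0])
loss_binary = bce(sigmoid(z), y)

# Multi-class case: 3 samples, 3 classes, integer labels.
Z = np.array([[2.0, 0.1, -1.0],
              [0.0, 1.5, 0.2],
              [-0.5, 0.3, 2.2]])
Y = np.array([0, 1, 2])
loss_multi = cross_entropy(softmax(Z), Y)
```

With two classes and one score fixed at zero, softmax plus cross-entropy reduces exactly to sigmoid plus BCE, which is the sense in which cross-entropy extends the BCE concept.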