# Understanding "glm.fit: fitted probabilities numerically 0 or 1 occurred"


When fitting a generalized linear model (GLM) to data, one common problem is the occurrence of fitted probabilities that are numerically equal to 0 or 1. R users will recognize this as the warning `glm.fit: fitted probabilities numerically 0 or 1 occurred`. In this article, we will explore what causes these situations, why they can be problematic, and how to address them.

## What is a GLM?


A generalized linear model is a statistical model that is used to analyze data that do not conform to the assumptions of ordinary linear regression. GLMs are particularly useful for modeling data that have a non-normal distribution, such as binary data (e.g. yes/no responses) or count data (e.g. number of accidents per month). They consist of three main components:

• A random component that specifies the probability distribution of the response variable
• A linear predictor, formed as a linear combination of the predictor variables and their coefficients
• A link function that connects the mean of the response variable to the linear predictor
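For logistic regression, the GLM where 0-or-1 fitted probabilities most often arise, these three components can be sketched in a few lines of plain Python (the coefficient values are made up for illustration):

```python
import math

def inverse_logit(eta):
    """Inverse of the logit link: maps the linear predictor to a probability."""
    return 1.0 / (1.0 + math.exp(-eta))

# Linear predictor: intercept plus coefficient times predictor value
beta0, beta1 = -1.0, 2.0   # illustrative coefficients
x = 0.5
eta = beta0 + beta1 * x    # eta = 0.0

# Link function: logit(p) = log(p / (1 - p)); applying its inverse
# transforms the linear predictor back to the probability scale
p = inverse_logit(eta)

# Random component: the response is modeled as Bernoulli(p)
print(p)  # → 0.5
```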

## Why do fitted probabilities become 0 or 1?


The most common reason why fitted probabilities in a GLM become numerically equal to 0 or 1 is separation. Separation occurs when a predictor variable, or a linear combination of predictors, perfectly (or almost perfectly) splits the observed outcomes. It often arises when a predictor is very strongly associated with the response, or when a small number of observations take extreme predictor values far from the rest of the data.

Statisticians distinguish two cases. Complete separation occurs when some linear combination of the predictors classifies every observation correctly; quasi-complete separation occurs when it classifies all but a few boundary observations correctly. In either case, the maximum-likelihood estimates of the affected coefficients do not exist: the likelihood keeps improving as those coefficients grow without bound.
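The effect of complete separation can be demonstrated with a short, self-contained sketch (plain Python, Newton's method for a one-coefficient logistic model on made-up toy data; no GLM library is assumed):

```python
import math

def inv_logit(t):
    return 1.0 / (1.0 + math.exp(-t))

# Completely separated toy data: y = 1 exactly when x > 0
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]

beta = 0.0
for _ in range(30):
    # Score and observed information for a no-intercept logistic model
    grad = sum((y - inv_logit(beta * x)) * x for x, y in zip(xs, ys))
    info = sum(inv_logit(beta * x) * (1 - inv_logit(beta * x)) * x * x for x in xs)
    beta += grad / info  # Newton step

probs = [inv_logit(beta * x) for x in xs]
print(round(beta, 2))                 # large, and still growing with more steps
print([round(p, 6) for p in probs])   # → [0.0, 0.0, 1.0, 1.0]
```

Each Newton step increases `beta` by roughly a constant amount, so the algorithm never converges; it simply stops when an iteration limit or tolerance is reached, with fitted probabilities that are numerically 0 or 1. This is exactly the situation R's `glm.fit` warns about.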

## Why are 0 or 1 probabilities problematic?


When fitted probabilities become numerically equal to 0 or 1, it can lead to several issues with model estimation and inference. For example:

• The maximum-likelihood estimates of the affected coefficients are infinite, so the reported coefficients, standard errors, and p-values reflect where the algorithm stopped rather than meaningful estimates
• Estimation algorithms may fail to converge, or may stop with extremely large coefficients and enormous standard errors
• Model selection and comparison become difficult, because likelihood-based criteria such as AIC and BIC are unreliable when the likelihood has no finite maximum
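The root of these problems is that under separation the likelihood has no finite maximum. A small self-contained check in plain Python (toy separated data; the numerically stable `log1pexp` helper is written out by hand):

```python
import math

def log1pexp(t):
    """Numerically stable log(1 + exp(t))."""
    return t + math.log1p(math.exp(-t)) if t > 0 else math.log1p(math.exp(t))

def log_lik(beta, data):
    """Bernoulli log-likelihood of a no-intercept logistic model."""
    return sum(y * beta * x - log1pexp(beta * x) for x, y in data)

# Completely separated toy data: y = 1 exactly when x > 0
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]

# The log-likelihood increases monotonically in beta, approaching 0 from
# below but never attaining a maximum at any finite coefficient value
lls = [log_lik(b, data) for b in (1.0, 5.0, 10.0, 20.0)]
print([round(v, 6) for v in lls])
```

Because every increase in `beta` improves the fit on separated data, the optimizer pushes the coefficient toward infinity, which is what produces the huge estimates and standard errors described above.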

## How can you address 0 or 1 probabilities?


There are several methods that can be used to address the problem of 0 or 1 probabilities in a GLM. Some of the most common include:

• Fitting a penalized model, such as Firth's bias-reduced logistic regression or a ridge or lasso penalty, which adds a penalty term to the likelihood so that the coefficient estimates remain finite and stable
• Using a Bayesian approach, which can incorporate prior knowledge and regularization to improve estimation and reduce overfitting
• Dropping variables that are highly correlated or have low variance, which can reduce the risk of separation and improve model stability
• Merging categories of categorical variables that have similar effects, which can reduce the number of levels and improve model stability
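As an illustration of the first remedy, adding a ridge (L2) penalty to the log-likelihood restores a finite optimum even under complete separation. A minimal sketch in plain Python (gradient ascent; the data and the penalty strength `lam` are made-up illustrative choices):

```python
import math

def inv_logit(t):
    return 1.0 / (1.0 + math.exp(-t))

# Completely separated toy data: y = 1 exactly when x > 0
xs = [-2.0, -1.0, 1.0, 2.0]
ys = [0, 0, 1, 1]
lam = 0.5  # ridge penalty strength (illustrative)

beta = 0.0
for _ in range(2000):
    # Gradient of the penalized log-likelihood: score minus penalty term
    grad = sum((y - inv_logit(beta * x)) * x for x, y in zip(xs, ys)) - lam * beta
    beta += 0.1 * grad

# Unlike the unpenalized fit, beta settles at a finite value
print(round(beta, 3))
```

The penalty term `lam * beta` eventually balances the score, so the iteration converges instead of diverging. Firth's method and Bayesian priors achieve the same stabilizing effect through different penalty or prior choices.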

## Conclusion

In summary, fitted probabilities that are numerically equal to 0 or 1 are a common issue when fitting GLMs to binary data, and they usually signal separation. The problem leads to coefficient estimates that diverge rather than converge, inflated standard errors, and difficulties with model selection and comparison. However, several methods can address it, including penalized regression (for example, Firth's method), Bayesian priors, variable selection, and merging sparse categories.