

Another way we can do dimensionality reduction is through feature pruning. With feature pruning, we remove any features that we expect to be unimportant to our analysis. For example, after exploring a dataset we may find that out of the 10 features, 7 of them have a high correlation with the output while the other 3 have very low correlation. Those 3 low-correlation features probably aren’t worth the compute, and we can likely drop them from our analysis without hurting the output.
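As a rough sketch of what this might look like in practice, here’s one way to prune features by their correlation with the output using pandas. The function name, the `df`/`y` inputs, and the 0.1 threshold are all illustrative assumptions, not a fixed recipe:

```python
import pandas as pd

def prune_low_correlation_features(df: pd.DataFrame, y: pd.Series,
                                   threshold: float = 0.1) -> pd.DataFrame:
    """Keep only features whose absolute correlation with the target
    clears the (illustrative) threshold."""
    # Absolute Pearson correlation of each feature column with the output
    correlations = df.corrwith(y).abs()
    # Features that clear the cutoff survive the pruning
    keep = correlations[correlations >= threshold].index
    return df[keep]

# Hypothetical usage:
# pruned = prune_low_correlation_features(features_df, target_series)
# print(pruned.columns)  # e.g. the 7 features worth keeping
```

Of course, the right threshold depends on the dataset, and low linear correlation doesn’t always mean a feature is useless, so it’s worth sanity-checking the model’s performance before and after pruning.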