[ Paper Summary ] Brain Network from a Sparse and Topological Point of View


This post is for my future self. I have limited medical knowledge, but I just wanted to challenge myself.



A sparse brain network can be obtained in two ways: by thresholding the connectivity matrix, or by imposing a sparseness constraint during the estimation of the connectivity matrix. (This seems very similar to k-sparse autoencoders and imposing sparsity via KL divergence.) However, it is not yet known what threshold or sparseness level is best for recovering the hidden connectivity structure of the brain. So the authors of this paper show the equivalence between sparseness and thresholding, and additionally observe topological changes when varying the threshold/sparseness in brain networks of ADHD children.


When we have a matrix X composed of n-dimensional vectors, we can create a sparse connectivity matrix C (a correlation or partial correlation matrix) either by thresholding it or by imposing sparseness, i.e., minimizing the L1 norm of C.
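To make the two routes concrete, here is a toy sketch (my own, not from the paper; the function name and the data are made up): compute a correlation matrix from data X, then sparsify it either by hard thresholding or by the closed-form minimizer of the L1-penalized problem, which is soft thresholding.

```python
import numpy as np

def connectivity_matrices(X, threshold=0.5, lam=0.3):
    """Two routes to a sparse connectivity matrix from data X (n_samples x n_regions)."""
    C = np.corrcoef(X, rowvar=False)               # full correlation matrix
    # Route 1: hard-threshold, zero out weak correlations.
    C_thresh = np.where(np.abs(C) >= threshold, C, 0.0)
    # Route 2: soft-threshold, the closed-form minimizer of
    #   (1/2) * ||B - C||_F^2 + lam * ||B||_1   (an L1 / sparseness penalty).
    C_soft = np.sign(C) * np.maximum(np.abs(C) - lam, 0.0)
    return C_thresh, C_soft

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))                  # 200 toy scans, 5 toy brain regions
C_thresh, C_soft = connectivity_matrices(X)
print(C_thresh.shape, C_soft.shape)                # → (5, 5) (5, 5)
```

Both routes kill the same weak entries; soft thresholding additionally shrinks the surviving ones toward zero, which is why the two approaches end up selecting the same set of edges.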

To show that thresholding is equivalent to imposing sparseness, the authors introduce penalized linear regressions to estimate the correlation and partial correlation. The two objective functions are solved via a gradient descent method. Barcodes (a visualization tool from persistent homology) were used to track the topology of the resulting networks.
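The 0-dimensional information a barcode encodes is how connected components are born and merge as the threshold sweeps through its range. A minimal sketch of that idea (mine, not the paper's implementation), using a union-find over the thresholded graph:

```python
import numpy as np

def betti0_curve(C, thresholds):
    """Number of connected components of the graph that keeps edge (i, j)
    whenever |C[i, j]| >= t, evaluated at each threshold t."""
    n = C.shape[0]
    counts = []
    for t in thresholds:
        parent = list(range(n))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path halving
                i = parent[i]
            return i
        for i in range(n):
            for j in range(i + 1, n):
                if abs(C[i, j]) >= t:
                    ri, rj = find(i), find(j)
                    if ri != rj:
                        parent[ri] = rj        # union the two components
        counts.append(len({find(i) for i in range(n)}))
    return counts

# Toy connectivity: two tightly coupled pairs, weakly linked to each other.
C = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
print(betti0_curve(C, [0.95, 0.5, 0.05]))      # → [4, 2, 1]
```

Lowering the threshold adds edges and components merge; a barcode records the interval of thresholds over which each component survives.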


They found that the brain networks of the ASD and ADHD groups might be more difficult to merge into a single connected component, due to common under-connectivity combined with local over-connectivity. (I guess this is related to the brain symptoms; honestly, I am not sure what the above statement means.)


The authors were able to show the equivalence between the sparsity level and the threshold of the network. (I really have no idea what they are talking about, but from the diagram alone it seems that, as the hyperparameter lambda changes, the brain regions become either more or less connected.)
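One way to see the equivalence (a toy illustration under my own assumptions, not the paper's proof): for any L1 penalty lambda, soft thresholding zeros out exactly the entries that hard thresholding at t = lambda removes, so the two sparse networks share the same set of surviving edges.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 6))
C = np.corrcoef(X, rowvar=False)

for lam in (0.05, 0.1, 0.2):
    soft = np.sign(C) * np.maximum(np.abs(C) - lam, 0.0)  # sparseness (L1) route
    hard = np.where(np.abs(C) > lam, C, 0.0)              # thresholding route
    # Same support: the two routes keep exactly the same edges.
    assert np.array_equal(soft != 0, hard != 0)
    print(lam, int((soft != 0).sum()))
```

Sweeping lambda therefore traces out the same family of networks as sweeping the threshold, which matches the diagram's behavior of regions becoming more or less connected.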

Final Words

If any errors are found, please email me at jae.duk.seo@gmail.com; if you wish to see the list of all of my writing, please view my website here.

Meanwhile, follow me on my Twitter here, and visit my website or my YouTube channel for more content. I also implemented Wide Residual Networks; please click here to view the blog post.


  1. (2018). Citeseerx.ist.psu.edu. Retrieved 19 September 2018, from http://citeseerx.ist.psu.edu/viewdoc/download?doi=

https://jaedukseo.me I love to make my own notes my guy, let's get LIT with KNOWLEDGE in my GARAGE
