# Inception: Going deeper with convolutions

## 2 Motivation

### 2.1 The Problem

Their main result states that if the probability distribution of the dataset is representable by a large, very sparse deep neural network, then the optimal network topology can be constructed layer after layer by analyzing the correlation statistics of the preceding layer activations and clustering neurons with highly correlated outputs.

neurons that fire together, wire together
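To make the Arora et al. result concrete, here is a minimal NumPy sketch (not from the paper) of the layer-wise construction it describes: compute the correlation statistics of a layer's activations, then greedily group units whose outputs are highly correlated. The simulated activations, the 0.8 threshold, and the greedy grouping rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated activations of a 6-unit layer over 1000 inputs:
# units 0-2 share one latent signal, units 3-5 share another.
latent = rng.normal(size=(1000, 2))
activations = np.concatenate(
    [latent[:, :1] + 0.1 * rng.normal(size=(1000, 3)),
     latent[:, 1:] + 0.1 * rng.normal(size=(1000, 3))], axis=1)

# Correlation statistics of the preceding layer's outputs.
corr = np.corrcoef(activations, rowvar=False)

def cluster_correlated(corr, threshold=0.8):
    """Greedily group units whose pairwise correlation exceeds a threshold."""
    unassigned = set(range(corr.shape[0]))
    clusters = []
    while unassigned:
        seed = min(unassigned)
        group = {j for j in unassigned if abs(corr[seed, j]) >= threshold}
        clusters.append(sorted(group))
        unassigned -= group
    return clusters

clusters = cluster_correlated(corr)
print(clusters)  # → [[0, 1, 2], [3, 4, 5]]
```

Each cluster would then become one "unit group" of the next layer, which is exactly the "fire together, wire together" intuition quoted above.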

### 2.2 Proposed Solution

The vast literature on sparse matrix computations suggests that clustering sparse matrices into relatively dense submatrices tends to give competitive performance for sparse matrix multiplication.
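The trick that literature exploits can be sketched in a few lines of NumPy (an illustrative toy, not any particular sparse-matrix library): when the nonzeros of a sparse matrix are clustered into a few dense blocks, a matrix-vector product only needs one small dense multiply per nonzero block. The block size and block positions below are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
B = 4    # block size
nb = 8   # blocks per side -> a 32 x 32 matrix

# Block-sparse matrix: only 4 of the 64 B x B blocks are nonzero,
# and those nonzeros form dense submatrices.
A = np.zeros((nb * B, nb * B))
nonzero_blocks = [(0, 1), (2, 2), (5, 7), (6, 3)]
for i, j in nonzero_blocks:
    A[i*B:(i+1)*B, j*B:(j+1)*B] = rng.normal(size=(B, B))

x = rng.normal(size=(nb * B,))

# Multiply using only the dense submatrices: one small dense
# product per nonzero block instead of one big sparse product.
y = np.zeros(nb * B)
for i, j in nonzero_blocks:
    y[i*B:(i+1)*B] += A[i*B:(i+1)*B, j*B:(j+1)*B] @ x[j*B:(j+1)*B]

assert np.allclose(y, A @ x)  # same result, touching only 4 of 64 blocks
```

Dense blocks keep the hardware-friendly memory access of dense GEMM while skipping the zero regions, which is why this layout tends to be competitive.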

## 3 Technical Approach

### 3.1 Formation of the Inception Module

a layer-by-layer construction where one should analyze the correlation statistics of the last layer and cluster them into groups of units with high correlation.

**A convolution operation is essentially equivalent to a sparse connection, and convolution kernels of different scales correspond to different sparse connectivity patterns, so combining the outputs of these convolutions is equivalent to some underlying sparse operation [6].** This is precisely the optimal local sparse structure, built from convolutional components, that the authors were looking for.

Inspired by a neuroscience model of the primate visual cortex, Serre et al. used a series of fixed Gabor filters of different sizes to handle multiple scales. We use a similar strategy here.

the design follows the practical intuition that visual information should be processed at various scales and then aggregated
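That multi-scale-then-aggregate intuition can be sketched with plain NumPy (a single-channel toy, not the paper's actual GoogLeNet implementation): apply filters of several sizes in parallel with "same" padding so the spatial dimensions match, then stack the resulting feature maps channel-wise. The `conv2d_same` helper and the averaging kernels are illustrative assumptions.

```python
import numpy as np

def conv2d_same(x, k):
    """Naive 'same'-padded 2D cross-correlation for a single channel."""
    kh, kw = k.shape
    pad = ((kh // 2, kh - 1 - kh // 2), (kw // 2, kw - 1 - kw // 2))
    xp = np.pad(x, pad)
    h, w = x.shape
    out = np.empty((h, w), dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(xp[i:i+kh, j:j+kw] * k)
    return out

def inception_block(x, kernels):
    """Process the input at several scales in parallel, then
    aggregate by stacking the feature maps channel-wise."""
    return np.stack([conv2d_same(x, k) for k in kernels], axis=0)

x = np.arange(36, dtype=float).reshape(6, 6)
# One 1x1, one 3x3, and one 5x5 filter, echoing the Inception branch sizes.
kernels = [np.ones((1, 1)), np.ones((3, 3)) / 9, np.ones((5, 5)) / 25]
out = inception_block(x, kernels)
print(out.shape)  # → (3, 6, 6): one feature map per scale, concatenated
```

Because every branch uses "same" padding, the outputs align spatially and can be concatenated along the channel axis, which is how an Inception module feeds all scales to the next stage at once.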

## 4 Summary

### References

[1] Szegedy C, Liu W, Jia Y, et al. Going deeper with convolutions[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2015: 1-9.

[2] https://leonardoaraujosantos.gitbook.io/artificial-inteligence/machine_learning/deep_learning/object_localization_and_detection

[3] S. Arora, A. Bhaskara, R. Ge, and T. Ma. Provable bounds for learning some deep representations. CoRR, abs/1310.6343, 2013.

[4] https://zhuanlan.zhihu.com/p/19939960

[6] https://www.programmersought.com/article/4186752331/
