| field | value | date |
|---|---|---|
| author | Jules <jules@asdf.us> | 2018-10-30 21:15:00 -0400 |
| committer | Jules <jules@asdf.us> | 2018-10-30 21:15:00 -0400 |
| commit | ab81e78a0bca427ba9b0283ec3a1b5fc2d98cf2d (patch) | |
| tree | b450c93d40da09fde384d134151ef2ac2fbe04cb /datasets/scholar/entries/From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv | |
| parent | 4345e7ba370113c56afbd7e0eda6a1696146a328 (diff) | |
| parent | 93b3392d9346226c328ea2a878ff968d0303f826 (diff) | |
Merge branch 'master' of asdf.us:megapixels_dev
Diffstat (limited to 'datasets/scholar/entries/From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv')
| mode | path | lines |
|---|---|---|
| -rw-r--r-- | datasets/scholar/entries/From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv | 1 |

1 file changed, 1 insertion, 0 deletions
```diff
diff --git a/datasets/scholar/entries/From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv b/datasets/scholar/entries/From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv
new file mode 100644
index 00000000..e22f032b
--- /dev/null
+++ b/datasets/scholar/entries/From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv
@@ -0,0 +1 @@
+From facial parts responses to face detection: A deep learning approach|http://scholar.google.com/https://www.cv-foundation.org/openaccess/content_iccv_2015/html/Yang_From_Facial_Parts_ICCV_2015_paper.html|2015|213|12|1818335115841631894|None|http://scholar.google.com/scholar?cites=1818335115841631894&as_sdt=2005&sciodt=0,5&hl=en|http://scholar.google.com/scholar?cluster=1818335115841631894&hl=en&as_sdt=0,5|None|In this paper, we propose a novel deep convolutional network (DCN) that achieves outstanding performance on FDDB, PASCAL Face, and AFW. Specifically, our method achieves a high recall rate of 90.99% on the challenging FDDB benchmark, outperforming the state-of-the-art method by a large margin of 2.91%. Importantly, we consider finding faces from a new perspective through scoring facial parts responses by their spatial structure and arrangement. The scoring mechanism is carefully formulated considering challenging …
```
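
The committed file is a single pipe-delimited record. Below is a minimal sketch of reading such an entry file back into a dict; the field names are assumptions inferred from the values in this row (title, link, year, citation count, version count, cluster id, two Scholar query URLs, and a truncated abstract) and are not defined anywhere in this repository.

```python
import csv

# Assumed column names, inferred from the values in the committed row;
# the repository does not ship a header for these single-row entry files.
FIELDS = [
    "title", "url", "year", "num_citations", "num_versions",
    "cluster_id", "pdf_url", "cited_by_url", "versions_url",
    "related_url", "abstract",
]

def load_scholar_entry(path):
    """Read a single-row, pipe-delimited scholar entry file into a dict."""
    with open(path, newline="", encoding="utf-8") as f:
        row = next(csv.reader(f, delimiter="|"))
    return dict(zip(FIELDS, row))

entry = load_scholar_entry(
    "datasets/scholar/entries/"
    "From Facial Parts Responses to Face Detection: A Deep Learning Approach.csv"
)
print(entry["title"], entry["year"], entry["num_citations"])
```

Since the fields never contain the `|` character in this row, a plain `split("|")` would also work; `csv.reader` is used here only to keep the parsing explicit.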
