utzap50k (6 files)
ut-zap50k-lexi.zip | 213.85MB
ut-zap50k-images-square.zip | 144.23MB
ut-zap50k-images.zip | 305.32MB
readme.txt | 7.70kB
ut-zap50k-data.zip | 8.56MB
ut-zap50k-feats.zip | 215.05MB
Type: Dataset
Tags:
Bibtex:
@InProceedings{utzap50k,
  title    = {UT Zappos50K (Version 2.1)},
  author   = {Aron Yu and Kristen Grauman},
  url      = {http://vision.cs.utexas.edu/projects/finegrained/utzap50k/},
  abstract = {UT Zappos50K (UT-Zap50K) is a large shoe dataset consisting of 50,025 catalog images collected from Zappos.com. The images are divided into 4 major categories (shoes, sandals, slippers, and boots), followed by functional types and individual brands. The shoes are centered on a white background and pictured in the same orientation for convenient analysis. The dataset was created in the context of an online shopping task, where users pay special attention to fine-grained visual differences. For instance, a shopper is more likely to be deciding between two pairs of similar men's running shoes than between a woman's high heel and a man's slipper. GIST and LAB color features are provided. In addition, each image has 8 associated meta-data labels (gender, materials, etc.) that are used to filter the shoes on Zappos.com.

Preview: https://i.imgur.com/RoVL6qr.jpg

# Citation

This dataset is for academic, non-commercial use only. If you use this dataset in a publication, please cite the following papers:

A. Yu and K. Grauman. "Fine-Grained Visual Comparisons with Local Learning". In CVPR, 2014.

@InProceedings{finegrained,
  author    = {A. Yu and K. Grauman},
  title     = {Fine-Grained Visual Comparisons with Local Learning},
  booktitle = {Computer Vision and Pattern Recognition (CVPR)},
  month     = {Jun},
  year      = {2014}
}

A. Yu and K. Grauman. "Semantic Jitter: Dense Supervision for Visual Comparisons via Synthetic Images". In ICCV, 2017.}
}
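As a quick sanity check after extracting ut-zap50k-images.zip, the sketch below walks the image tree and counts images per top-level category. It is a minimal illustration only: the root folder name and the category/functional-type/brand directory layout are assumptions inferred from the description above, not something this page specifies.

```python
import os
from collections import defaultdict

def index_images(root="ut-zap50k-images"):
    """Group image paths by their top-level category folder.

    NOTE: the root folder name and directory depth are assumptions;
    adjust them to match the layout of the extracted archive.
    """
    by_category = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.lower().endswith(".jpg"):
                continue
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            category = rel.split(os.sep)[0]  # e.g. "Boots"
            by_category[category].append(rel)
    return by_category

if __name__ == "__main__":
    index = index_images()
    for category, paths in sorted(index.items()):
        print(f"{category}: {len(paths)} images")
```

If the totals across the four categories add up to 50,025, the archive was extracted completely.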