Commit

Update README.md
ParitoshParmar authored Oct 11, 2022
1 parent f718b50 commit ead59e5
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion README.md
@@ -4,7 +4,7 @@
Current video/action understanding systems have demonstrated impressive performance on large recognition tasks. However, they might be limiting themselves to learning to recognize spatiotemporal patterns, rather than attempting to thoroughly understand the actions. To spur progress in the direction of a truer, deeper understanding of videos, we introduce the task of win-fail action recognition -- differentiating between successful and failed attempts at various activities. We introduce a first-of-its-kind **paired** win-fail action understanding dataset with samples from the following domains: "General Stunts," "Internet Wins-Fails," "Trick Shots," and "Party Games." Unlike existing action recognition datasets, intra-class variation is high, making the task challenging, yet feasible. We systematically analyze the characteristics of the win-fail task/dataset with prototypical action recognition networks and a novel video retrieval task. While current action recognition methods work well on our task/dataset, they still leave a large gap to high performance. We hope to motivate more work towards a true understanding of actions/videos.

## Dataset
- Dataset can be downloaded from: https://drive.google.com/drive/folders/1q_El9GQJQgG8Agl8eUY3JUJvsYyb8UOP?usp=sharing. Please 7-zip to uncompress the dataset.
+ Dataset can be downloaded from: https://drive.google.com/drive/folders/1TEuOeOJwN0ehCdNkI0TAPfqWOR4LHImH?usp=sharing. Please use 7-zip to uncompress the dataset.

### Annotation format
1. All the samples are pairwise -- win and the corresponding fail.
