
Situational Scene Graph for Structured Human-centric Situation Understanding

This repository contains the official PyTorch implementation of the paper:

Situational Scene Graph for Structured Human-centric Situation Understanding
Chinthani Sugandhika, Chen Li, Deepu Rajan, Basura Fernando
WACV 2025

arXiv

Abstract

Graph-based representations have been widely used to model spatio-temporal relationships in video understanding. Although effective, existing graph-based approaches focus on capturing human-object relationships while ignoring fine-grained semantic properties of the action components. These semantic properties are crucial for understanding the current situation, such as where the action takes place, what tools are used, and the functional properties of the objects. In this work, we propose a graph-based representation called Situational Scene Graph (SSG) to encode both human-object relationships and the corresponding semantic properties. The semantic details are represented as predefined roles and values inspired by the situation frame, which was originally designed to represent a single action. Based on our proposed representation, we introduce the task of situational scene graph generation and propose a multi-stage pipeline, Interactive and Complementary Network (InComNet), to address the task. Given that the existing datasets are not applicable to the task, we further introduce an SSG dataset whose annotations consist of semantic role-value frames for humans, objects, and verb predicates of human-object relations. Finally, we demonstrate the effectiveness of our proposed SSG representation by testing it on different downstream tasks. Experimental results show that the unified representation not only benefits predicate classification and semantic role-value classification, but also benefits reasoning tasks on human-centric situation understanding.

Figure: Overview of the Situational Scene Graph representation.
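To make the representation concrete, below is a minimal, hypothetical sketch of how one SSG instance could be laid out in Python: human-object relations plus semantic role-value frames for the person, the objects, and the verb predicates. All field names, roles, and values here are assumptions for illustration and do not reflect the actual annotation schema; see the SSG dataset for the real format.

```python
# Hypothetical illustration of a single Situational Scene Graph (SSG) instance.
# Field names, roles, and values are illustrative only, not the actual schema.
ssg_example = {
    "frame_id": "video_0001/frame_000042",   # assumed identifier format
    "person": {
        # semantic role-value frame for the person (roles are illustrative)
        "roles": {"age": "adult", "posture": "standing"},
    },
    "objects": [
        {
            "category": "broom",
            # object-level role-value frame, e.g. functional properties
            "roles": {"function": "cleaning tool", "material": "plastic"},
        }
    ],
    "relations": [
        {
            "subject": "person",
            "predicate": "holding",            # human-object predicate
            "object": 0,                        # index into "objects"
            # role-value frame for the verb predicate, e.g. where/how
            "roles": {"place": "kitchen", "body_part": "hand"},
        }
    ],
}
```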

We split our repository into three sections:

  1. SSG Dataset
  2. InComNet Model
  3. Data Annotation Tool

1. SSG Dataset

The SSG dataset is built on top of the Action Genome dataset.

  • Download and process the Action Genome video frames with the Toolkit.
  • Download the SSG annotations from the SSG dataset (a minimal loading sketch, with assumed paths and file names, follows below).
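The following is a minimal loading sketch once both downloads are in place. The directory layout, file name, and pickle format are assumptions for illustration; consult the dataset release and InComNet/ for the actual paths and loaders.

```python
import os
import pickle

# Paths below are assumptions; adjust to wherever you placed the extracted
# Action Genome frames and the downloaded SSG annotations.
AG_FRAMES_DIR = "data/ag/frames"                       # hypothetical frames directory
SSG_ANNOTATION_FILE = "data/ssg/ssg_annotations.pkl"   # hypothetical file name

# Load the SSG annotations (format assumed to be a pickled Python object).
with open(SSG_ANNOTATION_FILE, "rb") as f:
    ssg_annotations = pickle.load(f)
print(f"Loaded {len(ssg_annotations)} annotated entries")

# Sanity-check that the Action Genome frames were extracted.
num_videos = len(os.listdir(AG_FRAMES_DIR))
print(f"Found frame folders for {num_videos} videos")
```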

2. InComNet Model

Please refer to InComNet/

3. Data Annotation Tool

Please refer to AnnotationTool/

Citation

If you use this code for your research, please cite our paper:

@article{sugandhika2024situational,
  title={Situational Scene Graph for Structured Human-centric Situation Understanding},
  author={Sugandhika, Chinthani and Li, Chen and Rajan, Deepu and Fernando, Basura},
  journal={arXiv preprint arXiv:2410.22829},
  year={2024}
}

Acknowledgments

This research/project is supported by the National Research Foundation, Singapore, under its NRF Fellowship (Award# NRF-NRFF14-2022-0001) and funding allocation to B.F. by A*STAR under its SERC Central Research Fund (CRF).
