
MotionCtrl: A Unified and Flexible Motion Controller for Video Generation

[SIGGRAPH 2024 CONFERENCE PROCEEDINGS]

👉 MotionCtrl for [SVD], for [VideoCrafter], for [AnimateDiff]

Paper   arXiv   Project Page

🤗 HF Demo (SVD)   🤗 HF Demo (VideoCrafter)


🔥🔥🔥 We have released both the training and inference code of MotionCtrl deployed on AnimateDiff.

🔥🔥 We have released the code, models, and demos for MotionCtrl on Stable Video Diffusion (SVD).


[Video: motionctrl_small.mp4]

Official implementation of MotionCtrl: A Unified and Flexible Motion Controller for Video Generation.

MotionCtrl can independently control the complex camera motion and object motion of generated videos with a single unified model.

Results of MotionCtrl+AnimateDiff

Results of MotionCtrl+SVD

More results are in showcase_svd and our Project Page.

Results of MotionCtrl+VideoCrafter

More results are in our Project Page.


πŸ“ Changelog

  • 20231225: Release MotionCtrl deployed on LVDM/VideoCrafter.
  • 20231225: Gradio demo available. 🤗 HF Demo
  • 20231228: Provide a local Gradio demo for convenience.
  • 20240115: More camera poses used for testing are provided in dataset/camera_poses.
  • 20240115: Release MotionCtrl deployed on SVD. The code is in the branch svd and a Gradio demo is available in 🤗 HF Demo.
  • ❗❗❗ A Gradio demo of MotionCtrl deployed on VideoCrafter2 is available in 🤗 HF Demo. You can also run it locally with python -m app --share.
  • ❗❗❗ MotionCtrl deployed on AnimateDiff is available in the branch animatediff, containing both training and inference code.
  • 20240920: Provide scripts for collecting object trajectories with ParticleSfM.
  • 20240920: Provide HandyTrajDrawer to customize object trajectories more conveniently.

βš™οΈ Environment

conda create -n motionctrl python=3.10.6
conda activate motionctrl
pip install -r requirements.txt
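
To verify the setup, a quick sanity check (a minimal sketch; assumes the PyTorch dependency pulled in by requirements.txt):

    # Print the installed torch version and whether CUDA is visible.
    python -c "import torch; print(torch.__version__, torch.cuda.is_available())"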

💫 Inference

  • Run the local inference script

  1. Download the MotionCtrl weights motionctrl.pth and put them into ./checkpoints.
  2. Open configs/inference/run.sh and set condtype to 'camera_motion', 'object_motion', or 'both'.
  • condtype=camera_motion controls only the camera motion in the generated video.
  • condtype=object_motion controls only the object motion in the generated video.
  • condtype=both controls the camera motion and object motion in the generated video simultaneously.
  3. Run the script: sh configs/inference/run.sh (a sketch for cycling through all three modes follows below).
  • Run the local Gradio demo

    python -m app --share
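
For convenience, here is a minimal sketch that runs all three conditioning modes back to back; it assumes run.sh assigns condtype on its own line (verify the exact assignment in your copy before using the in-place edit):

    # Hypothetical loop over the three conditioning modes (GNU sed syntax).
    for condtype in camera_motion object_motion both; do
        # Rewrite the condtype assignment inside run.sh, then run it.
        sed -i "s/^condtype=.*/condtype='${condtype}'/" configs/inference/run.sh
        sh configs/inference/run.sh
    done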
    

🔥🔥 Training 👉 Details

Preparing Dataset

  • RealEstate10K

    1. Follow https://github.com/cashiwamochi/RealEstate10K_Downloader to download and process the videos.
    2. The corresponding captions and file lists are provided in GoogleDrive.
  • WebVid with Object Trajectories

    1. Prepare ParticleSfM. Our experiments run on CentOS 8.5, and we provide detailed installation notes in dataset/object_trajectories/ParticleSfM_Install_Note.pdf.

    2. Move dataset/object_trajectories/prepare_webvideo_len32.py and dataset/object_trajectories/run_particlesfm_obj_traj.py to the ParticleSfM project.

    3. Step 1: Prepare sub-videos with a length of 32 frames and a size of 256 x 256.

      ## start_idx and end_idx are used to process a subset of the dataset on different machines in parallel
      
      python prepare_webvideo_len32.py --start_idx 0 --end_idx 1000
      
    4. Step 2: Extract object trajectories (see the sharding sketch after this list).

        root_dir="WebVid/train_256_32"
        start_idx=0
        end_idx=1000

        CUDA_VISIBLE_DEVICES=0 python run_particlesfm_obj_traj.py \
            --root_dir $root_dir \
            --start_idx $start_idx \
            --end_idx $end_idx
  • You can customize object trajectories with our HandyTrajDrawer.
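
The start_idx/end_idx arguments make it straightforward to shard trajectory extraction across several GPUs or machines. A minimal sketch, assuming 4 local GPUs and 4000 sub-videos (both numbers are placeholders to adjust):

      # Hypothetical sharding: split the videos evenly across 4 GPU workers.
      total=4000
      workers=4
      chunk=$(( total / workers ))
      for i in $(seq 0 $(( workers - 1 ))); do
          CUDA_VISIBLE_DEVICES=$i python run_particlesfm_obj_traj.py \
              --root_dir "WebVid/train_256_32" \
              --start_idx $(( i * chunk )) \
              --end_idx $(( (i + 1) * chunk )) &
      done
      wait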

📚 Citation

If you make use of our work, please cite our paper.

@inproceedings{wang2024motionctrl,
  title={Motionctrl: A unified and flexible motion controller for video generation},
  author={Wang, Zhouxia and Yuan, Ziyang and Wang, Xintao and Li, Yaowei and Chen, Tianshui and Xia, Menghan and Luo, Ping and Shan, Ying},
  booktitle={ACM SIGGRAPH 2024 Conference Papers},
  pages={1--11},
  year={2024}
}

🤗 Acknowledgment

The current version of MotionCtrl is built on VideoCrafter. We appreciate the authors for sharing their awesome codebase.

❓ Contact

For any questions, feel free to email wzhoux@connect.hku.hk or zhouzi1212@gmail.com.