Commit

Update Personal Info
Orion-Zheng committed Apr 14, 2024
1 parent 2629e45 commit 7dadbf9
Showing 4 changed files with 5 additions and 6 deletions.
11 changes: 5 additions & 6 deletions _pages/cv.md
@@ -8,7 +8,7 @@ redirect_from:
---

{% include base_path %}
-The PDF format is available [here](/files/ZhengZian's_CV-240115.pdf).
+The PDF format is available [here](/files/ZhengZian's_CV.pdf).

Education
======
@@ -27,9 +27,9 @@ Research Experience

Keywords: Data-Centric methods, Mixture-of-Experts Model

-* Working on OpenMoE project (second author) with [Fuzhao Xue](https://xuefuzhao.github.io), which is the **first open-source, decoder-only MoE language model**. We released the code and checkpoint and got ~**750 stars** on [GitHub](https://github.com/XueFuzhao/OpenMoE).
+* Working on OpenMoE project (second author) with [Fuzhao Xue](https://xuefuzhao.github.io), which is the **first open-source, decoder-only MoE language model**. We released the code and checkpoint and got **1k+ stars** on [GitHub](https://github.com/XueFuzhao/OpenMoE).
* Investigated publicly available pre-training corpus (English, Chinese, multilingual, code, etc), preprocessing methods and tokenization techniques. Do experiments comparing tokenizers. Prepare the pre-training, SFT and evaluation datasets in TFDS format.
-* Worked on the Pytorch implementation of OpenMoE with ColossalAI team. Now conducting literature review of Mixture of Experts models and writing the paper.
+* Worked on the Pytorch implementation of OpenMoE with the ColossalAI team. Performing model evaluations and contributed to the paper writing.


Work Experience
@@ -46,6 +46,5 @@ Work Experience

Publications
======
-* OpenMoE: Open Mixture-of-Experts Language Models [[Code]](https://github.com/XueFuzhao/OpenMoE) [[Blog]](https://www.notion.so/Aug-2023-OpenMoE-v0-2-Release-43808efc0f5845caa788f2db52021879) [[Twitter]](https://x.com/XueFz/status/1693696988611739947?s=20) \
-Fuzhao Xue, **Zian Zheng**, Yao Fu, Jinjie Ni, Zangwei Zheng, Wangchunshu Zhou and Yang You
-***GitHub repository***
+* OpenMoE: An Early Effort on Open Mixture-of-Experts Language Models [[Code]](https://github.com/XueFuzhao/OpenMoE) [[Paper]](https://arxiv.org/abs/2402.01739) [[Twitter]](https://x.com/XueFz/status/1693696988611739947?s=20) \
+Fuzhao Xue, **Zian Zheng**, Yao Fu, Jinjie Ni, Zangwei Zheng, Wangchunshu Zhou and Yang You
Binary file removed files/ZhengZian's_CV-240115.pdf
Binary file not shown.
Binary file added files/ZhengZian's_CV.docx
Binary file not shown.
Binary file added files/ZhengZian's_CV.pdf
Binary file not shown.