Multi-Stage Prompting for Making Pre-trained Language Models Better Translators

How can we leverage and improve prompt tuning techniques to make pre-trained LMs better translators? We answer this question in our ACL 2022 paper. (Read More)
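As a rough illustration of the underlying idea (not the multi-stage procedure from the paper), the sketch below prepends trainable prompt vectors to the input embeddings of a frozen language model in plain PyTorch. The `embed`/`forward_embeds` interface, prompt length, and dimensions are illustrative assumptions, not part of any real library API.

```python
import torch
import torch.nn as nn

class SoftPromptWrapper(nn.Module):
    """Prepend trainable prompt vectors to a frozen LM's input embeddings.

    Assumed (hypothetical) LM interface: lm.embed(ids) -> (B, T, D) embeddings,
    lm.forward_embeds(x) -> logits. Real libraries expose this differently.
    """
    def __init__(self, lm, prompt_len=20, d_model=768):
        super().__init__()
        self.lm = lm
        for p in self.lm.parameters():          # keep the pretrained LM frozen
            p.requires_grad = False
        self.prompt = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)

    def forward(self, input_ids):
        tok = self.lm.embed(input_ids)                                  # (B, T, D)
        prompt = self.prompt.unsqueeze(0).expand(tok.size(0), -1, -1)   # (B, P, D)
        x = torch.cat([prompt, tok], dim=1)      # prompt vectors + source tokens
        return self.lm.forward_embeds(x)         # only self.prompt receives gradients
```

During training, only the prompt parameters are updated, so the same frozen LM can serve multiple tasks by swapping prompts.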

Molecule Generation by Principal Subgraph Mining and Assembling

We propose the notion of principal subgraph and design a theoretically efficient two-step generation framework for molecule generation. (Read More)

On Robust Prefix-Tuning for Text Classification

Prefix-tuning lacks robustness, and existing defense methods hamper the modularity of the prefixes. We tune an additional prefix during inference to steer the pretrained LM toward correct activations, which significantly improves robustness. (Read More)
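A minimal sketch of the inference-time idea, under assumptions: a hypothetical model interface `model(input_ids, prefix=...)` returning logits and intermediate activations, and a precomputed tensor of canonical activations. The loss and optimizer settings below are illustrative, not the exact objective from the paper.

```python
import torch

def tune_robust_prefix(model, task_prefix, input_ids, target_acts, steps=10, lr=1e-2):
    """Tune an extra prefix at inference time to steer activations (illustrative only).

    Assumed interface: model(input_ids, prefix=...) -> (logits, activations);
    `target_acts` stands in for canonical activations collected on clean data.
    """
    extra = torch.zeros_like(task_prefix, requires_grad=True)
    opt = torch.optim.Adam([extra], lr=lr)
    for _ in range(steps):
        _, acts = model(input_ids, prefix=task_prefix + extra)
        loss = ((acts - target_acts) ** 2).mean()   # pull activations toward the canonical ones
        opt.zero_grad()
        loss.backward()
        opt.step()
    return task_prefix + extra.detach()             # combined prefix used for the final prediction
```

The original task prefix stays untouched, so the plug-and-play modularity of prefix-tuning is preserved.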

A Class of Short-term Recurrence Anderson Mixing Methods and Their Applications

We develop a novel class of short-term recurrence Anderson mixing methods and validate their effectiveness in several applications, including solving fixed-point problems and training neural networks. (Read More)
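For reference, the snippet below shows textbook Anderson mixing for a fixed-point problem x = g(x) in NumPy; it is the standard limited-memory least-squares variant, not the short-term recurrence scheme developed in the paper.

```python
import numpy as np

def anderson_fixed_point(g, x0, m=5, max_iter=100, tol=1e-8):
    """Anderson mixing for x = g(x), keeping at most m residual differences."""
    x = np.asarray(x0, dtype=float)
    G_hist, F_hist = [], []              # histories of g(x_k) and residuals f_k = g(x_k) - x_k
    for _ in range(max_iter):
        gx = g(x)
        f = gx - x
        if np.linalg.norm(f) < tol:
            return x
        G_hist.append(gx)
        F_hist.append(f)
        if len(F_hist) > m + 1:          # limited memory: keep only the last m differences
            G_hist.pop(0)
            F_hist.pop(0)
        if len(F_hist) == 1:
            x = gx                        # plain fixed-point step until history builds up
        else:
            dF = np.stack([F_hist[i + 1] - F_hist[i] for i in range(len(F_hist) - 1)], axis=1)
            dG = np.stack([G_hist[i + 1] - G_hist[i] for i in range(len(G_hist) - 1)], axis=1)
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)   # least-squares mixing coefficients
            x = gx - dG @ gamma           # mixed (accelerated) update
    return x
```

For instance, `anderson_fixed_point(np.cos, np.array([0.5]))` converges to the fixed point of the cosine map (about 0.739) in a handful of iterations, far faster than plain fixed-point iteration.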