In this entry, I wanted to highlight work done by Farhan Damani, Vishnu Sresht, and me, in which we cast molecular optimization as a neural translation problem. Our preprint is available on arXiv and was accepted at the 2020 NeurIPS Workshop on Machine Learning for Molecules.
The goal is to map an input compound to a target compound in property space, with the expectation that "translated" molecules exhibit improved biochemical properties. Interestingly, we observe that when generated molecules are iteratively fed back into the translator as inputs, compound attributes improve with each step, akin to self-distillation dynamics. We demonstrate that this finding is largely invariant to the choice of translation model, making the procedure a "black box" algorithm; accordingly, we call this method Black Box Recursive Translation (BBRT), a new inference method for molecular property optimization.
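The recursive loop described above can be sketched in a few lines. This is a minimal illustration, not our actual implementation: `translate` stands in for any seq2seq molecular translator treated as a black box, and `score` for any property oracle; both names, and the toy stand-ins below, are hypothetical.

```python
from typing import Callable, List

def bbrt(seed_smiles: List[str],
         translate: Callable[[str], List[str]],  # black-box translator (hypothetical name)
         score: Callable[[str], float],          # property oracle (hypothetical name)
         n_rounds: int = 5,
         top_k: int = 10) -> List[str]:
    """Sketch of recursive translation: feed the translator's best
    outputs back in as inputs for the next round."""
    pool = list(seed_smiles)
    for _ in range(n_rounds):
        candidates: List[str] = []
        for smi in pool:
            # the translator is only queried, never inspected -- hence "black box"
            candidates.extend(translate(smi))
        # keep the top-k scoring molecules as the next round's inputs
        pool = sorted(set(candidates), key=score, reverse=True)[:top_k]
    return pool

# Toy stand-ins so the loop runs end to end: a "translator" that appends
# atoms to a SMILES-like string, and a "score" equal to string length.
def toy_translate(s: str) -> List[str]:
    return [s + "C", s + "CC"]

def toy_score(s: str) -> float:
    return len(s)

best = bbrt(["C"], toy_translate, toy_score, n_rounds=3, top_k=2)
# each round can add up to two atoms, so scores increase monotonically
```

With a real translator, the same loop applies unchanged, which is the sense in which the method is model-agnostic.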