TY - JOUR
A1 - Charron, Nicholas
A1 - Musil, Félix
A1 - Guljas, Andrea
A1 - Chen, Yaoyi
A1 - Bonneau, Klara
A1 - Pasos-Trejo, Aldo
A1 - Venturin, Jacopo
A1 - Gusew, Daria
A1 - Zaporozhets, Iryna
A1 - Krämer, Andreas
A1 - Templeton, Clark
A1 - Kelkar, Atharva
A1 - Durumeric, Aleksander
A1 - Olsson, Simon
A1 - Pérez, Adrià
A1 - Majewski, Maciej
A1 - Husic, Brooke
A1 - Patel, Ankit
A1 - De Fabritiis, Gianni
A1 - Noé, Frank
A1 - Clementi, Cecilia
T1 - Navigating protein landscapes with a machine-learned transferable coarse-grained model
JF - arXiv
N2 - The most popular and universally predictive protein simulation models employ all-atom molecular dynamics (MD), but they come at extreme computational cost. The development of a universal, computationally efficient coarse-grained (CG) model with similar prediction performance has been a long-standing challenge. By combining recent deep learning methods with a large and diverse training set of all-atom protein simulations, we here develop a bottom-up CG force field with chemical transferability, which can be used for extrapolative molecular dynamics on new sequences not used during model parametrization. We demonstrate that the model successfully predicts folded structures, intermediates, metastable folded and unfolded basins, and the fluctuations of intrinsically disordered proteins while it is several orders of magnitude faster than an all-atom model. This showcases the feasibility of a universal and computationally efficient machine-learned CG model for proteins.
Y1 - 2023
U6 - https://doi.org/10.48550/arXiv.2310.18278
ER -
TY - JOUR
A1 - Durumeric, Aleksander
A1 - Charron, Nicholas
A1 - Templeton, Clark
A1 - Musil, Félix
A1 - Bonneau, Klara
A1 - Pasos-Trejo, Aldo
A1 - Chen, Yaoyi
A1 - Kelkar, Atharva
A1 - Noé, Frank
A1 - Clementi, Cecilia
T1 - Machine learned coarse-grained protein force-fields: Are we there yet?
JF - Current Opinion in Structural Biology
N2 - The successful recent application of machine learning methods to scientific problems includes the learning of flexible and accurate atomic-level force-fields for materials and biomolecules from quantum chemical data. In parallel, the machine learning of force-fields at coarser resolutions is rapidly gaining relevance as an efficient way to represent the higher-body interactions needed in coarse-grained force-fields to compensate for the omitted degrees of freedom. Coarse-grained models are important for the study of systems at time and length scales exceeding those of atomistic simulations. However, the development of transferable coarse-grained models via machine learning still presents significant challenges. Here, we discuss recent developments in this field and current efforts to address the remaining challenges.
Y1 - 2023
U6 - https://doi.org/10.1016/j.sbi.2023.102533
VL - 79
ER -
TY - JOUR
A1 - Krämer, Andreas
A1 - Durumeric, Aleksander
A1 - Charron, Nicholas
A1 - Chen, Yaoyi
A1 - Clementi, Cecilia
A1 - Noé, Frank
T1 - Statistically optimal force aggregation for coarse-graining molecular dynamics
JF - The Journal of Physical Chemistry Letters
N2 - Machine-learned coarse-grained (CG) models have the potential for simulating large molecular complexes beyond what is possible with atomistic molecular dynamics. However, training accurate CG models remains a challenge. A widely used methodology for learning bottom-up CG force fields maps forces from all-atom molecular dynamics to the CG representation and matches them with a CG force field on average. We show that there is flexibility in how to map all-atom forces to the CG representation and that the most commonly used mapping methods are statistically inefficient and potentially even incorrect in the presence of constraints in the all-atom simulation. We define an optimization statement for force mappings and demonstrate that substantially improved CG force fields can be learned from the same simulation data when using optimized force maps. The method is demonstrated on the miniproteins chignolin and tryptophan cage and published as open-source code.
Y1 - 2023
U6 - https://doi.org/10.1021/acs.jpclett.3c00444
VL - 14
IS - 17
SP - 3970
EP - 3979
ER -