
S-PLM: Structure-Aware Protein Language Model via Contrastive Learning Between Sequence and Structure

Keywords:
Source:
Advanced Science
Source URL:
https://onlinelibrary.wiley.com/doi/10.1002/advs.202404212
Type:
Academic literature
Language:
English
Original publication date:
2024-12-12
Abstract:
Proteins play an essential role in various biological and engineering processes. Large protein language models (PLMs) present excellent potential to reshape protein research by accelerating the determination of protein functions and the design of proteins with desired functions. The prediction and design capacity of PLMs relies on the representations learned from protein sequences. However, the lack of crucial 3D structure information in most PLMs restricts their predictive capacity in various applications, especially those heavily dependent on 3D structures. To address this issue, S-PLM is introduced as a 3D structure-aware PLM that utilizes multi-view contrastive learning to align the sequence and 3D structure of a protein in a coordinated latent space. S-PLM applies a Swin-Transformer to AlphaFold-predicted protein structures to embed the structural information and fuses it into the sequence-based embedding from ESM2. Additionally, a library of lightweight tuning tools is provided to adapt S-PLM to diverse downstream protein prediction tasks. The results demonstrate S-PLM's superior performance over sequence-only PLMs on all protein clustering and classification tasks, achieving performance competitive with state-of-the-art methods that require both sequence and structure inputs. S-PLM and its lightweight tuning tools are available at https://github.com/duolinwang/S-PLM/.
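The contrastive alignment the abstract describes can be illustrated with a CLIP-style symmetric InfoNCE objective: per-protein sequence embeddings and structure embeddings are normalized, and each sequence is trained to score highest against its own structure (and vice versa). The following is a minimal NumPy sketch of that idea; the function names, temperature value, and toy embeddings are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def log_softmax(x, axis):
    """Numerically stable log-softmax along the given axis."""
    x = x - x.max(axis=axis, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def contrastive_alignment_loss(seq_emb, struct_emb, temperature=0.07):
    """Symmetric InfoNCE loss: the i-th sequence embedding should be
    most similar to the i-th structure embedding, and vice versa."""
    s = seq_emb / np.linalg.norm(seq_emb, axis=1, keepdims=True)
    t = struct_emb / np.linalg.norm(struct_emb, axis=1, keepdims=True)
    logits = (s @ t.T) / temperature           # pairwise cosine similarities
    diag = np.arange(logits.shape[0])
    loss_seq2struct = -log_softmax(logits, axis=1)[diag, diag].mean()
    loss_struct2seq = -log_softmax(logits, axis=0)[diag, diag].mean()
    return 0.5 * (loss_seq2struct + loss_struct2seq)

# Toy check: matched sequence/structure pairs incur a lower loss
# than deliberately mismatched (rolled) pairs.
rng = np.random.default_rng(0)
seq = rng.normal(size=(8, 32))                 # 8 proteins, 32-dim embeddings
aligned = seq + 0.01 * rng.normal(size=seq.shape)
mismatched = np.roll(aligned, 1, axis=0)
assert contrastive_alignment_loss(seq, aligned) < contrastive_alignment_loss(seq, mismatched)
```

Minimizing this loss pulls a protein's sequence and structure views together in the shared latent space while pushing apart non-matching pairs, which is the general mechanism behind the coordinated latent space described above.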