VTE-BERT
Disclaimer: This demo version is built with a standard set of generic semantic preprocessing rules. In the production environment, these rules can be customized to align with specific institutional formatting requirements. The demo runs on CPU only, while the production version will leverage GPU acceleration for significantly faster processing. The model is trained for precision, so if a result appears negative, consider running other notes from the same encounter.
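Because the model favors precision, a negative result on a single note is not conclusive for the encounter as a whole. The snippet below is a minimal sketch of one possible aggregation strategy; `classify_note` is a hypothetical per-note classifier used only for illustration, not part of the released tooling.

```python
# Minimal sketch of encounter-level aggregation (assumption: classify_note is a
# hypothetical callable that returns True when VTE-BERT labels a single note positive).
def encounter_is_positive(notes, classify_note):
    """Flag an encounter as VTE-positive if any of its notes is classified positive."""
    return any(classify_note(note) for note in notes)
```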
Resources:
- VTE-BERT Model: Our fine-tuned model optimized for VTE classification is available under gated access on Hugging Face (a minimal usage sketch follows this list).
- GitHub: NLPMed-Engine, a robust and extensible NLP engine tailored for medical text, internally using VTE-BERT.
- Publications:
  - Jafari, O., Ma, S., Lam, B. D., Jiang, J. Y., Zhou, E., Ranjan, M., ... & Li, A. Development and Validation of VTE-BERT Natural Language Processing Model for Venous Thromboembolism. Journal of Thrombosis and Haemostasis. DOI: 10.1016/j.jtha.2025.07.021 (Open Access)
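Below is a minimal, unofficial sketch of how the gated model might be loaded and run on CPU with the Hugging Face `transformers` library. The repository ID, access token, and example note are placeholders, not values confirmed by this card; request gated access first and consult the model files for the actual identifiers and label mapping.

```python
# Minimal CPU inference sketch (assumptions: MODEL_ID and HF_TOKEN are placeholders;
# the real repository ID requires gated access on Hugging Face).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "<org>/VTE-BERT"  # placeholder repository ID
HF_TOKEN = "hf_..."          # personal access token with gated access granted

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, token=HF_TOKEN)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, token=HF_TOKEN)
model.eval()  # CPU-only inference, matching the demo configuration described above

note = "Impression: acute pulmonary embolism involving the right lower lobe."
inputs = tokenizer(note, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)  # the id-to-label mapping is model-specific (see model.config.id2label)
```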