Poster
Quantitative Multimodal Anisotropy Imaging enables machine learning prediction of NASH CRN fibrosis stage without manual annotation
AASLD 2022
Study Background
Staging fibrosis severity in non-alcoholic steatohepatitis (NASH) requires pathologist review of tissue stained to visualize collagen. Staging accuracy can be affected both by stain quality and by variability in pathologists’ interpretation of the stain.
PathAI’s Quantitative Multimodal Anisotropy Imaging (QMAI) can highlight collagen in tissue and can be used for quantification and staging of NASH fibrosis (1). AIM-NASH is a machine learning model developed by PathAI using 26,000 pathologist annotations on whole slide images (WSI) of Masson’s Trichrome (MT)-stained tissue to accurately and reproducibly predict NASH Clinical Research Network (CRN) fibrosis stage (2; manuscript in preparation). Here, QMAI provides detailed, unbiased annotations of fibrosis on MT-stained tissue that are used to train deep neural network (DNN)-based ML models to infer a QMAI fibrosis pattern (iQMAI), which is then used by graph neural networks (GNNs) to predict slide-level CRN fibrosis scores.
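The two-stage structure described above (a patch-level model inferring a fibrosis pattern, followed by graph-based aggregation to a slide-level CRN stage) can be sketched as follows. This is an illustrative toy, not PathAI's implementation: the functions, the neighbor-smoothing aggregation, and the stage thresholds are all hypothetical stand-ins, since the actual models are learned DNNs and GNNs.

```python
# Toy sketch of the pipeline: patch-level iQMAI inference, then
# graph-level aggregation to a slide-level CRN fibrosis stage (0-4).
# All functions and thresholds are hypothetical, for illustration only.

def infer_iqmai(patch_collagen_score):
    """Stand-in for the DNN: returns a fibrosis fraction in [0, 1]
    for one tissue patch (here the input is already a collagen score)."""
    return min(max(patch_collagen_score, 0.0), 1.0)

def slide_level_crn_stage(patch_scores, neighbors):
    """Stand-in for the GNN: smooths each patch's iQMAI score over its
    graph neighbors, then maps the slide-level mean to a CRN stage via
    hypothetical cut points (the real model is learned, not thresholded)."""
    smoothed = []
    for i, score in enumerate(patch_scores):
        nbr = [patch_scores[j] for j in neighbors[i]]
        smoothed.append((score + sum(nbr)) / (1 + len(nbr)))
    mean_fibrosis = sum(smoothed) / len(smoothed)
    for stage, cutoff in enumerate([0.05, 0.15, 0.30, 0.50]):
        if mean_fibrosis < cutoff:
            return stage
    return 4

# Four patches on a small adjacency graph (a chain of neighboring tiles).
scores = [infer_iqmai(p) for p in [0.02, 0.10, 0.40, 0.35]]
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
stage = slide_level_crn_stage(scores, neighbors)  # an integer in 0..4
```

The point of the sketch is the division of labor: the patch model sees only local tissue, while the graph step pools information across neighboring patches before committing to a single slide-level stage.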
Authors
Tahir et al.