        West China Medical Publishers
        Keyword search for "Graph convolutional network": 4 results
        • Image classification of osteoarthritis based on improved shifted windows transformer and graph convolutional networks

          Osteoarthritis is a common degenerative joint disease that is usually assessed from X-ray images; however, readers with limited clinical experience can easily misdiagnose it. Although deep learning has made significant progress in medical image processing, existing models still have difficulty capturing subtle lesion features such as the joint space. This paper proposes an automatic diagnosis method for osteoarthritis based on an improved shifted windows Transformer (Swin Transformer) and a graph convolutional network, which is expected to improve the accuracy of early diagnosis by strengthening the modeling of joint-space features and fusing features across layers. First, a shifted windows horizontal attention mechanism (SW-HAM) is designed to enhance feature extraction in the horizontal direction. Second, a central-attention GraphSAGE (CAG-SAGE) is introduced to perform weighted aggregation of lesion-area features through a dynamic attention mechanism. Finally, cross-layer connections are used to fuse multi-layer features efficiently. Experimental results show that the SW-HAM and CAG-SAGE modules and the cross-layer connections significantly improve model performance: classification accuracy, recall, precision, F1 score, and area under the curve reach 94.59%, 95.14%, 94.05%, 94.41%, and 96.30% respectively, all superior to classical networks and existing methods. The approach provides a new and effective option for the classification and diagnosis of osteoarthritis. (An illustrative sketch of attention-weighted graph aggregation appears after the results list.)

          Release date: 2025-12-22 10:16
        • Research on classification method of multimodal magnetic resonance images of Alzheimer’s disease based on generalized convolutional neural networks

          Alzheimer’s disease (AD) is a progressive and irreversible neurodegenerative disease, and neuroimaging based on magnetic resonance imaging (MRI) is one of the most intuitive and reliable methods for AD screening and diagnosis. Clinical head MRI examinations produce multimodal image data, and to address the problem of processing and fusing this multimodal information, this paper proposes a structural and functional MRI feature extraction and fusion method based on generalized convolutional neural networks (gCNN). The method comprises a three-dimensional residual U-shaped network with a hybrid attention mechanism (3D HA-ResUNet) for feature representation and classification of structural MRI, and a U-shaped graph convolutional neural network (U-GCN) for node feature representation and classification of brain functional networks derived from functional MRI. After the two types of image features are fused, the optimal feature subset is selected by discrete binary particle swarm optimization and the prediction is produced by a machine learning classifier. Validation on a multimodal dataset from the open-source Alzheimer’s Disease Neuroimaging Initiative (ADNI) database shows that the proposed models perform well in their respective data domains, and that the gCNN framework, by combining the advantages of the two models, further improves on the single-modal MRI methods, raising classification accuracy and sensitivity by 5.56% and 11.11%, respectively. In conclusion, the gCNN-based multimodal MRI classification method proposed in this paper can provide a technical basis for the auxiliary diagnosis of Alzheimer’s disease. (A hedged sketch of the fusion and binary particle swarm feature-selection stage appears after the results list.)

          Release date: 2023-06-25 02:49
        • Identification of breast cancer subtypes based on graph convolutional network

          Identification of the molecular subtypes of malignant tumors plays a vital role in individualized diagnosis, personalized treatment, and prognosis prediction for cancer patients. The continuous growth of comprehensive tumor genomics databases and ongoing breakthroughs in deep learning have driven further advances in computer-aided tumor classification. Although existing classification methods based on the Gene Expression Omnibus database account for the complexity of cancer molecular classification, they ignore the internal correlation and synergy among genes. To address this, we propose a multi-layer graph convolutional network model for breast cancer subtype classification combined with a hierarchical attention network. The model constructs graph-embedding datasets of patients’ genes and forms a new end-to-end multi-class model that can effectively recognize the molecular subtypes of breast cancer. Extensive experiments demonstrate the strong performance of the new model in breast cancer subtype classification, with clear advantages over the original graph convolutional network and two mainstream graph neural network classification algorithms. The accuracy, weighted F1-score, weighted recall, and weighted precision of our model reach 0.8517, 0.8235, 0.8517 and 0.7936 respectively in the seven-class task, and 0.9285, 0.8949, 0.9285 and 0.8650 respectively in the four-class task. In addition, compared with the latest breast cancer subtype classification algorithms, the proposed method also achieves the highest classification accuracy. In summary, the model may serve as an auxiliary diagnostic technology, providing a reliable option for precise classification of breast cancer subtypes and laying a theoretical foundation for computer-aided tumor classification. (A minimal sketch of the underlying graph-convolution building block appears after the results list.)

          Release date: 2024-04-24 09:40
        • Brain computer interface nursing bed control system based on deep learning and dual visual feedback

          To meet the need for autonomous control by patients with severe limb disorders, this paper designs a nursing bed control system based on a motor imagery brain-computer interface (MI-BCI). To address the poor cross-subject decoding performance and the dynamic fluctuation of cognitive state in existing MI-BCI technology, both the neural network structure and the user interaction feedback are improved. First, the optimized dual-branch graph convolution multi-scale neural network integrates dynamic graph convolution with multi-scale convolution; its average classification accuracy is higher than that of the multi-scale attention temporal convolution network, the Gramian angular field combined with convolutional long short-term memory hybrid network, the Transformer-based graph convolution network, and other existing methods. Second, a dual visual feedback mechanism is constructed, in which electroencephalogram (EEG) topographic map feedback improves the discrimination of spatial patterns and attention-state feedback enhances the temporal stability of the signals; compared with single EEG topographic map feedback and a non-feedback system, the average classification accuracy of the proposed method is also greatly improved. Finally, in the four-class nursing bed control task, the average control accuracy of the system is 90.84% and the information transfer rate is 84.78 bits/min. In summary, this paper provides a reliable technical solution for improving the autonomous interaction ability of patients with severe limb disorders, with both theoretical significance and application value. (A worked example of the standard information-transfer-rate calculation appears after the results list.)

          Release date: 2025-10-21 03:48
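        For the first result (osteoarthritis classification), the CAG-SAGE module is described only as attention-weighted aggregation of lesion-area features. The following minimal PyTorch sketch illustrates that general idea: a GraphSAGE-style layer in which the usual neighbour mean is replaced by learned attention weights. The layer and variable names are hypothetical, and the design is an assumption for illustration, not the paper's implementation.

          # Minimal sketch, assuming PyTorch; illustrative only, not the paper's CAG-SAGE.
          import torch
          import torch.nn as nn
          import torch.nn.functional as F

          class AttnSAGELayer(nn.Module):
              """GraphSAGE-style layer whose neighbour aggregation uses learned attention weights."""
              def __init__(self, in_dim, out_dim):
                  super().__init__()
                  self.attn = nn.Linear(2 * in_dim, 1)        # scores a (centre, neighbour) feature pair
                  self.update = nn.Linear(2 * in_dim, out_dim)

              def forward(self, x, adj):
                  # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency matrix
                  n = x.size(0)
                  ctr = x.unsqueeze(1).expand(n, n, -1)       # centre node features, repeated per neighbour
                  nbr = x.unsqueeze(0).expand(n, n, -1)       # candidate neighbour features
                  scores = self.attn(torch.cat([ctr, nbr], dim=-1)).squeeze(-1)
                  scores = scores.masked_fill(adj == 0, float('-inf'))
                  alpha = torch.nan_to_num(torch.softmax(scores, dim=1))  # attention over each node's neighbours
                  agg = alpha @ x                             # attention-weighted neighbour aggregation
                  return F.relu(self.update(torch.cat([x, agg], dim=-1)))

          # Toy usage: 5 nodes with 8-dimensional features on a ring graph.
          x = torch.randn(5, 8)
          adj = torch.roll(torch.eye(5), 1, dims=1) + torch.roll(torch.eye(5), -1, dims=1)
          print(AttnSAGELayer(8, 16)(x, adj).shape)           # torch.Size([5, 16])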

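        For the second result (gCNN-based multimodal MRI classification), the final stage fuses structural and functional MRI features, selects a feature subset with discrete binary particle swarm optimization, and classifies with a machine learning model. The sketch below runs that stage on synthetic data, using a sigmoid-transfer binary PSO and logistic regression; all sizes, hyperparameters, and the choice of classifier are assumptions, not the paper's settings.

          # Minimal sketch, assuming NumPy and scikit-learn; synthetic stand-in data.
          import numpy as np
          from sklearn.linear_model import LogisticRegression
          from sklearn.model_selection import cross_val_score

          rng = np.random.default_rng(0)
          smri = rng.normal(size=(120, 32))        # placeholder structural-MRI feature vectors
          fmri = rng.normal(size=(120, 32))        # placeholder functional-MRI feature vectors
          X = np.hstack([smri, fmri])              # fusion by concatenation (assumed)
          y = rng.integers(0, 2, size=120)         # placeholder AD / control labels

          def fitness(mask):
              # Cross-validated accuracy of a simple classifier on the selected feature columns.
              if mask.sum() == 0:
                  return 0.0
              clf = LogisticRegression(max_iter=200)
              return cross_val_score(clf, X[:, mask == 1], y, cv=3).mean()

          n_particles, n_feat, n_iter = 10, X.shape[1], 20
          pos = rng.integers(0, 2, size=(n_particles, n_feat))     # binary positions = feature masks
          vel = rng.normal(size=(n_particles, n_feat))
          pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
          gbest = pbest[pbest_fit.argmax()].copy()

          for _ in range(n_iter):
              r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
              vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
              pos = (rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(int)  # sigmoid transfer
              fit = np.array([fitness(p) for p in pos])
              better = fit > pbest_fit
              pbest[better], pbest_fit[better] = pos[better], fit[better]
              gbest = pbest[pbest_fit.argmax()].copy()

          print("selected features:", int(gbest.sum()), "cv accuracy:", round(pbest_fit.max(), 3))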
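        For the third result (breast cancer subtype classification), the core building block is a graph convolution over a gene graph. The sketch below uses the standard Kipf-and-Welling propagation rule with mean pooling into a subtype logit vector; the hierarchical attention component and the gene-graph construction are omitted, and the architecture shown is an assumption rather than the paper's model.

          # Minimal sketch, assuming PyTorch; standard GCN propagation, not the paper's exact network.
          import torch
          import torch.nn as nn

          class GCNLayer(nn.Module):
              def __init__(self, in_dim, out_dim):
                  super().__init__()
                  self.lin = nn.Linear(in_dim, out_dim, bias=False)

              def forward(self, x, adj):
                  # x: (N, in_dim) gene features; adj: (N, N) adjacency without self-loops
                  a_hat = adj + torch.eye(adj.size(0))                  # add self-loops
                  d_inv_sqrt = a_hat.sum(1).pow(-0.5)                   # D^{-1/2}
                  a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
                  return torch.relu(a_norm @ self.lin(x))               # H' = ReLU(D^-1/2 A_hat D^-1/2 H W)

          class GCNClassifier(nn.Module):
              """Two stacked GCN layers, mean-pooled into per-patient subtype logits."""
              def __init__(self, in_dim, hid_dim, n_classes):
                  super().__init__()
                  self.g1 = GCNLayer(in_dim, hid_dim)
                  self.g2 = GCNLayer(hid_dim, hid_dim)
                  self.head = nn.Linear(hid_dim, n_classes)

              def forward(self, x, adj):
                  h = self.g2(self.g1(x, adj), adj)
                  return self.head(h.mean(0))                           # graph-level pooling

          # Toy usage: 30 genes with 16-dimensional embeddings, 7 subtypes.
          x = torch.randn(30, 16)
          adj = (torch.rand(30, 30) > 0.8).float()
          adj = ((adj + adj.T) > 0).float().fill_diagonal_(0)           # symmetric, no self-loops
          print(GCNClassifier(16, 32, 7)(x, adj).shape)                 # torch.Size([7])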

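        The fourth result reports 90.84% average accuracy and an information transfer rate of 84.78 bits/min for a four-class control task. BCI papers usually relate these figures through the Wolpaw information-transfer-rate formula, evaluated below. The 1-second selection interval is an assumption made only to make the arithmetic concrete (the abstract does not state the timing), although with that value the formula reproduces the reported bits/min.

          # Worked example of the standard Wolpaw ITR formula (plain Python).
          import math

          def itr_bits_per_min(n_classes, accuracy, seconds_per_selection):
              # bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)), scaled to one minute
              p, n = accuracy, n_classes
              bits = math.log2(n) + p * math.log2(p)
              if p < 1.0:
                  bits += (1 - p) * math.log2((1 - p) / (n - 1))
              return bits * 60.0 / seconds_per_selection

          # 4 classes and 90.84% accuracy are from the abstract; 1.0 s per selection is assumed.
          print(round(itr_bits_per_min(4, 0.9084, 1.0), 2))   # 84.78 bits/min under this assumption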
