Bio-Medical Materials and Engineering - Volume 26, issue s1
Impact Factor 2024: 1.0
The aim of Bio-Medical Materials and Engineering is to promote human welfare and health. It is an interdisciplinary international journal that publishes original research papers, review articles and brief notes on materials and engineering for biological and medical systems.
Articles in this peer-reviewed journal cover a wide range of topics, including, but not limited to: engineering applied to improving the diagnosis, therapy and prevention of disease and injury, and to better substitutes for damaged or disabled human organs; studies of biomaterial interactions with the human body, biocompatibility, and interfacial and interaction problems; biomechanical behavior under biological and/or medical conditions; mechanical and biological properties of membrane biomaterials; cellular and tissue engineering, and physiological, biophysical and biochemical bioengineering aspects; implant failure and degradation of implants; biomimetic engineering and materials, including system analysis, as support for elderly people and for rehabilitation; bioengineering and materials technology applied to decontamination and environmental problems; biosensors, bioreactors, and bioprocess instrumentation and control systems; applications in food engineering; standardization problems for biomaterials and related products; assessment of the reliability and safety of biomedical materials and man-machine systems; and product liability of biomaterials and related products.
Abstract: The support vector machine (SVM) is one of the most effective classification methods for cancer detection. The efficiency and quality of an SVM classifier depend strongly on several important features and a set of proper parameters. Here, a series of classification analyses, with one set of photoacoustic data from ovarian tissues ex vivo and a widely used breast cancer dataset, the Wisconsin Diagnostic Breast Cancer (WDBC) dataset, revealed how the accuracy of an SVM classification varies with the number of features used and the parameters selected. A pattern recognition system is proposed by means of SVM-based Recursive Feature Elimination (SVM-RFE) with the Radial Basis Function (RBF) kernel. To improve the effectiveness and robustness of the system, an optimized tuning ensemble algorithm called SVM-RFE(C), with a correlation filter, was implemented to quantify feature and parameter information based on cross-validation. The proposed algorithm is first shown to outperform SVM-RFE on the WDBC dataset. A best accuracy of 94.643% and sensitivity of 94.595% were then achieved when SVM-RFE(C) was used to test 57 new photoacoustic data from 19 patients. The experimental results show that a classifier constructed with the SVM-RFE(C) algorithm is able to learn additional information from new data and has significant potential in ovarian cancer diagnosis.
Keywords: Support vector machines, ovarian cancer detection, feature selection, correlation filter, SVM-RFE
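As an illustrative sketch only (not the authors' implementation), the recursive elimination loop and the correlation pre-filter behind an SVM-RFE(C)-style pipeline can be written as follows; a plain least-squares fit stands in for the linear SVM weight vector, and the 0.95 correlation threshold is an assumed value.

```python
import numpy as np

def correlation_filter(X, threshold=0.95):
    """Keep the first of any pair of features whose |correlation|
    exceeds the threshold; drop the redundant later one."""
    corr = np.corrcoef(X, rowvar=False)
    keep = []
    for j in range(X.shape[1]):
        if all(abs(corr[j, k]) < threshold for k in keep):
            keep.append(j)
    return keep

def rfe_ranking(X, y):
    """Recursive feature elimination: repeatedly fit a linear model
    (a least-squares surrogate for the linear SVM) and discard the
    feature with the smallest absolute weight."""
    remaining = list(range(X.shape[1]))
    eliminated = []  # least important first
    while remaining:
        w, *_ = np.linalg.lstsq(X[:, remaining], y, rcond=None)
        worst = int(np.argmin(np.abs(w)))
        eliminated.append(remaining.pop(worst))
    return eliminated[::-1]  # most important first
```

In the paper's pipeline, the surviving features would then feed an RBF-kernel SVM whose parameters are tuned by cross-validation.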
Abstract: A grid-driven gridding (GDG) method is proposed to uniformly re-sample non-Cartesian raw data acquired with PROPELLER, in which a trajectory window for each Cartesian grid point is first computed. The intensity of the reconstructed image at this grid point is the weighted average of the raw data in this window. Given the single instruction, multiple data (SIMD) character of the proposed GDG, a CUDA-accelerated method is then proposed to improve its performance. Two groups of raw data sampled by PROPELLER at two resolutions are reconstructed by the proposed method. To balance the computation resources of the GPU and obtain the best performance improvement, four thread-block strategies are adopted. Experimental results demonstrate that although the proposed GDG is more time-consuming than traditional DDG, the CUDA-accelerated GDG is almost 10 times faster than traditional DDG.
Keywords: Gridding, magnetic resonance reconstruction, uniform re-sampling, CUDA acceleration
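A CPU-only sketch of the grid-driven idea (one output value per Cartesian grid point, averaging the raw samples that fall inside that point's trajectory window) might look like this; the window radius and the triangular weighting are assumptions, and the CUDA thread-block mapping is omitted.

```python
import numpy as np

def grid_driven_gridding(coords, data, grid_size, window=1.0):
    """For each Cartesian grid point, take the weighted average of the
    non-Cartesian samples inside its trajectory window. Each grid point
    is computed independently (the SIMD property the CUDA version exploits)."""
    grid = np.zeros((grid_size, grid_size), dtype=complex)
    for i in range(grid_size):
        for j in range(grid_size):
            d = np.hypot(coords[:, 0] - i, coords[:, 1] - j)
            inside = d <= window
            if inside.any():
                w = window - d[inside] + 1e-6  # assumed triangular weight
                grid[i, j] = np.sum(w * data[inside]) / np.sum(w)
    return grid
```

Because every `grid[i, j]` depends only on read-only inputs, the double loop maps directly onto one GPU thread per grid point.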
Abstract: Diffusion tensor imaging (DTI) allows for the non-invasive in vivo mapping of brain tractography. However, fiber bundles have complex structures, such as fiber crossings, fiber branchings and fibers with large curvatures, that DTI cannot accurately handle. This study presents a novel brain white matter tractography method using Q-ball imaging (QBI) as the data source instead of DTI, because QBI can provide accurate information about multiple fiber crossings and branchings within a single voxel using an orientation distribution function (ODF). The presented method also uses graph theory to construct a Bayesian model-based graph, so that fiber tracking between two voxels can be represented as the shortest path in the graph. Our experiments showed that the new method can accurately handle brain white matter fiber crossings and branchings, and can reconstruct brain tractography in both phantom and real brain data.
Keywords: Brain white matter, graph theory, QBI, tractography, Bayesian model
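The shortest-path formulation can be illustrated with a plain Dijkstra search. In the paper's setting the nodes would be voxels and the edge costs would come from the Bayesian/ODF model; in this sketch the costs are supplied directly as a stand-in.

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra on a weighted digraph: {node: [(neighbour, cost), ...]}.
    Returns the minimum-cost path and its total cost."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, c in graph.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], dist[goal]
```

With edge costs derived from ODF alignment between neighbouring voxels, the returned path is the reconstructed fiber track.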
Abstract: A laser can precisely deliver a quantitative dose of energy to a desired region in a non-contact way. Since it can stimulate regions while finely controlling parameters such as the intensity, duration and frequency of the stimulus, laser stimulation is often used in areas such as low-power laser treatment and clinical physiology. This study proposes stimulation using a pulsed diode laser with reliable output, and identifies laser parameters that can elicit a variety of somatosensory sensations. It is found that, typically, as frequency and energy increase, the proportion of perceived sensations increases, and the dominant sensation shifts from the sense of heat through the tactile sense to pain. This study will serve as baseline data for studies of the sense of heat, the tactile sense and pain, contribute to neurophysiological research, and be applicable to basic clinical research.
Abstract: Segmentation is widely used to reduce noise propagation from transmission scanning in positron emission tomography. The conventional routine is to perform reconstruction and segmentation sequentially. A smoothness penalty is also usually used to reduce noise, and it can be imposed on both the maximum-likelihood (ML) and weighted least-squares (WLS) estimators. In this paper we replace the smoothness penalty with a segmentation penalty that biases the object toward a piecewise-homogeneous reconstruction. Two updating algorithms are developed to solve the penalized ML and WLS estimates; both monotonically decrease their cost functions. Experimental results on a simulated phantom and real clinical data demonstrate the effectiveness and efficiency of the proposed algorithms.
Keywords: Segmentation penalty, fuzzy c-means clustering (FCM), space alternating descent, auxiliary function
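One concrete ingredient, the fuzzy c-means clustering named in the keywords, can be sketched on 1-D intensities as follows. This is generic textbook FCM, not the authors' penalized reconstruction; the resulting class means are the kind of piecewise-homogeneous targets a segmentation penalty would bias toward.

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=50, seed=0):
    """Standard FCM on 1-D intensities: alternate the fuzzy-membership
    update u_ik = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1)) with the
    membership-weighted center update."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)
    p = 2.0 / (m - 1.0)
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        inv = d ** (-p)
        u = inv / inv.sum(axis=1, keepdims=True)
    return centers, u
```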
Abstract: This paper aims to solve the automated feature selection problem in brain-computer interfaces (BCIs). To automate the feature selection process, we propose a novel EEG feature selection method based on a decision tree (DT). During electroencephalogram (EEG) signal processing, a feature extraction method based on principal component analysis (PCA) is used, and the decision-tree-based selection process is performed by searching the feature space and automatically selecting optimal features. Considering that EEG signals are non-linear, a support vector machine (SVM) was chosen as the classifier. To test the validity of the proposed method, we applied the decision-tree-based EEG feature selection method to BCI Competition II dataset Ia, and the experiments showed encouraging results.
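The two automated steps, PCA feature extraction and tree-style feature scoring, can be sketched as follows. The depth-one "stump" score is an illustrative stand-in for the paper's full decision-tree search, not its actual criterion.

```python
import numpy as np

def pca(X, k):
    """Project X onto its top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def stump_score(x, y):
    """Best single-threshold accuracy of one feature -- a one-node
    decision tree, used here to rank candidate features."""
    best = 0.0
    for t in np.unique(x):
        pred = x > t
        best = max(best, np.mean(pred == y), np.mean(pred != y))
    return best
```

Features (or components) would then be kept in decreasing order of score and passed to the SVM.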
Abstract: Computed tomography (CT) has been widely used to acquire volumetric anatomical information in the diagnosis and treatment of illness in many clinics. However, the algebraic reconstruction technique (ART) for reconstruction from under-sampled and noisy projections is still time-consuming. The goal of our work is to improve a block-wise approximate parallel implementation of the ART algorithm on a CUDA-enabled GPU, making the ART algorithm applicable in the clinical environment. The resulting method has several compelling features: (1) the rays are allotted into blocks, so that the rays within a block can be processed in parallel; (2) the GPU implementation caters to actual industrial and medical application demands. We tested the algorithm on a digital Shepp-Logan phantom, and the results indicate that our method is more efficient than the existing CPU implementation. The high computational efficiency achieved by our algorithm makes it possible for clinicians to obtain real-time 3D images.
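The sequential core that the GPU version parallelizes block-wise is the classic ART (Kaczmarz) sweep, sketched here on a small dense system; the block allotment and CUDA kernels are omitted.

```python
import numpy as np

def art(A, b, iters=200, relax=1.0):
    """ART/Kaczmarz: for each ray (row a_i, measurement b_i), project the
    current image x onto the hyperplane a_i . x = b_i. The block-parallel
    variant applies this step to groups of (near-)orthogonal rays at once."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        for ai, bi in zip(A, b):
            x = x + relax * (bi - ai @ x) / (ai @ ai) * ai
    return x
```

For consistent systems the sweeps converge to a solution; the relaxation factor trades speed against noise amplification.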
Abstract: A radiotherapy treatment plan may need to be replanned due to changes in the tumors and organs at risk (OARs) during treatment. Deformable image registration (DIR)-based computed tomography (CT) contour propagation in the routine clinical setting is expected to reduce the time needed for the necessary manual tumor and OAR delineations and to increase the efficiency of replanning. In this study, a DIR method was developed for CT contour propagation. Prior structure delineations were incorporated into Demons DIR by adding an intensity-matching term for the delineated tissue pairs to the Demons energy function. The performance of our DIR was evaluated on five clinical head-and-neck and five lung cancer cases. The experimental results verified the improved accuracy of the proposed registration method compared with conventional registration and Demons DIR.
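The underlying Demons update that the paper extends (without its added intensity-matching term for delineated tissue pairs) can be sketched in one dimension:

```python
import numpy as np

def demons_1d(fixed, moving, iters=200, step=1.0):
    """Classic Demons in 1-D: displacement update driven by
    diff * grad(fixed) / (|grad(fixed)|^2 + diff^2), followed by a
    small moving-average smoothing to regularize the field."""
    x = np.arange(len(fixed), dtype=float)
    u = np.zeros_like(x)
    for _ in range(iters):
        warped = np.interp(x + u, x, moving)   # moving image resampled at x+u
        diff = warped - fixed
        grad = np.gradient(fixed)
        u -= step * diff * grad / (grad ** 2 + diff ** 2 + 1e-9)
        u = np.convolve(u, np.ones(3) / 3.0, mode="same")  # regularize
    return u
```

The paper's variant adds a matching term over delineated tissues to this energy, pulling corresponding contours together.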
Abstract: Arrhythmia diagnosis is highly significant for human health. In this paper, a new model is developed for arrhythmia diagnosis. A salient feature of the algorithm is its synergistic combination of statistical and fuzzy set-based techniques. It is distribution-free and operates in an unsupervised mode. Arrhythmia diagnosis is viewed as a statistical hypothesis test. 'Abnormal' is typically a rather complex concept, so it is described with fuzzy sets, which bring a facet of robustness to the overall scheme and play an important role in the subsequent hypothesis-testing step. Fuzzification is used in parameter determination, which is self-adaptive: no parameter needs to be specified by the user. The algorithm is validated with a number of experiments, which demonstrate its effectiveness for arrhythmia diagnosis.
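One way the statistical and fuzzy ingredients can combine is illustrated below with a distribution-free abnormality degree: a robust (median/MAD) z-score pushed through a soft threshold. This is a generic sketch, not the paper's model, and the cut-off k=3 is an assumed value.

```python
import numpy as np

def abnormality_membership(x, k=3.0):
    """Fuzzy 'abnormal' degree per sample: a robust z-score (median/MAD,
    keeping the scheme distribution-free) mapped through a sigmoid so
    membership varies smoothly in [0, 1] instead of a hard cut."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) + 1e-12
    z = 0.6745 * np.abs(x - med) / mad   # 0.6745 makes MAD consistent with sigma
    return 1.0 / (1.0 + np.exp(-(z - k)))
```

Samples with membership near 1 would then be the candidates flagged by the hypothesis-testing step.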
Abstract: Epilepsy is a neurological disorder characterized by the sudden abnormal discharge of brain neurons, which can lead to electroencephalographic (EEG) abnormalities. In this study, data were obtained from epileptic patients with intracranial depth electrodes and analyzed using wavelet entropy algorithms in order to locate the epileptic foci. Significant increases in the wavelet entropy of the epileptic signals were identified during multiple episodes of clinical seizures. The results indicated that the algorithm was capable of identifying entropy changes at the epileptic sources. Furthermore, the correlations among the electrocorticogram (ECoG) signals of different channels, determined using an amplitude-amplitude coupling method, verified that the epileptic foci exhibited significantly higher coupling strengths. Thus, cross-frequency coupling (CFC) could offer insight into the energy and signal transmission modes of seizures and thereby improve diagnostic processes.
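A minimal wavelet-entropy computation, using a hand-rolled Haar decomposition rather than the authors' (unspecified) wavelet, illustrates the quantity being tracked: a signal whose energy is spread across many frequency bands scores higher entropy than a narrowband one.

```python
import numpy as np

def haar_wavelet_entropy(x, levels=4):
    """Shannon entropy of the relative Haar-band energies.
    Each level splits the running approximation into a detail band
    (difference of neighbours) and a coarser approximation (sum)."""
    approx = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        even, odd = approx[0::2], approx[1::2]
        detail = (even - odd) / np.sqrt(2)
        approx = (even + odd) / np.sqrt(2)
        energies.append(np.sum(detail ** 2))
    energies.append(np.sum(approx ** 2))
    p = np.array(energies) / (np.sum(energies) + 1e-12)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))
```

Tracking this value over sliding EEG windows per channel is the kind of computation used to flag entropy increases at seizure onset.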