Archive for the ‘Machine Learning’ Category

Machine learning developed a CD8+ exhausted T cells signature for predicting prognosis, immune infiltration and drug … – Nature.com

Identification of TRGs and their prognostic value

From the single-cell RNA-seq analyses of OC tissue (GSE184880 dataset), we identified six major cell types: T/NK cells, myeloid cells, epithelial cells, fibroblasts, B cells and endothelial cells (Fig. 2A). Figure 2B shows the expression of cell markers. We then extracted the T/NK cells for further analysis. As a result, T/NK cells could be re-clustered into CD8+ cytotoxic T, CD8+ exhausted T, NK, CD4+ exhausted T and CD4+ naïve T cells based on the expression pattern of cell markers (Fig. 2C,D). Developmental trajectory analyses of T/NK cells revealed that CD4+ naïve T, CD8+ cytotoxic T, and NK cells were enriched in the initial differentiation phase, while CD4+ exhausted T and CD8+ exhausted T cells were enriched in the terminal differentiation phase (Fig. 2E). Based on the FindAllMarkers function of the Seurat package, we identified 384 TRGs. Compared with normal tissues, we obtained 9638 DEGs in OC tissues (Fig. 2F), including 248 TRGs (Fig. 2G), in the TCGA dataset. Among these differentially expressed TRGs, a total of 41 genes were significantly associated with the prognosis of OC patients in the TCGA dataset (Fig. 2H, P < 0.05).
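The intersection step above (DEGs overlapped with TRGs, as in Fig. 2G) can be sketched in a few lines of Python. This is a minimal illustration with invented gene names and expression values; the paper's actual pipeline used Seurat's FindAllMarkers and a dedicated differential-expression test, which is approximated here by a simple log2 fold-change screen:

```python
import math

def log2_fold_change(tumor_vals, normal_vals):
    """log2 of mean tumor expression over mean normal expression."""
    eps = 1e-9  # avoid division by zero for unexpressed genes
    return math.log2((sum(tumor_vals) / len(tumor_vals) + eps) /
                     (sum(normal_vals) / len(normal_vals) + eps))

# Hypothetical expression values per gene: (tumor samples, normal samples)
expr = {
    "GENE_A": ([5.1, 4.8, 5.3], [1.0, 1.2, 0.9]),   # up in tumor
    "GENE_B": ([1.1, 0.9, 1.0], [1.0, 1.1, 0.95]),  # unchanged
    "GENE_C": ([0.2, 0.3, 0.1], [2.0, 2.2, 1.9]),   # down in tumor
}

trgs = {"GENE_A", "GENE_C", "GENE_D"}  # hypothetical T-cell-related genes

# Flag DEGs by fold change, then intersect with the TRG list
degs = {g for g, (t, n) in expr.items() if abs(log2_fold_change(t, n)) > 1}
de_trgs = degs & trgs  # differentially expressed TRGs
print(sorted(de_trgs))
```

The candidate genes surviving this overlap would then be screened by univariate Cox regression, as in Fig. 2H.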

Identification of TRGs and their prognostic value. (A) t-SNE plot showing the identified cell types from 7 ovarian cancer samples. (B) Dot plot showing average expression levels of cell markers. (C,D) t-SNE plot of T-cell subtypes and dot plot of the expression pattern of cell markers. (E) Developmental trajectory of T cells inferred by Monocle, colored by pseudotime and cell subtype. (F) Volcano plot showing DEGs in ovarian cancer. (G) Overlap between DEGs and TRGs. (H) Potential biomarkers identified by univariate Cox analysis.

These 41 potential prognostic biomarkers were submitted to an integrative machine learning procedure comprising 10 methods, with which we developed a stable TRPS. In total, we obtained 101 prognostic models; their C-indices in the training and testing cohorts are shown in Fig. 3A. The signature constructed by the Enet (alpha = 0.3) method, built from 18 TRGs, was chosen as the optimal TRPS because it had the highest average C-index (0.58; Fig. 3A). The risk-score formula is given in the Supplementary methods and results. Using the best cut-off value, we then divided ovarian cancer cases into high and low TRPS score groups. As expected, OC patients with a high risk score had a poorer OS rate in the TCGA cohort (P < 0.001), GSE14764 cohort (P = 0.0146), GSE26193 cohort (P = 0.0039), GSE26712 cohort (P = 0.0013), GSE63885 cohort (P < 0.001) and GSE140082 cohort (P = 0.0032) (Fig. 3B–G). The AUCs at 2, 3, and 4 years were 0.728, 0.783, and 0.773 in the TCGA cohort; 0.629, 0.642, and 0.739 in GSE14764; 0.617, 0.644, and 0.616 in GSE26193; 0.607, 0.587, and 0.591 in GSE26712; 0.672, 0.646, and 0.721 in GSE63885; and 0.608 and 0.617 in GSE140082 (Fig. 3B–G).
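Two computations drive this paragraph: the risk score (a weighted sum of TRG expression values, as Enet-style models produce) and the C-index used to rank the 101 models. Both can be sketched as below; all coefficients, follow-up times, and patients are invented for illustration, and the real 18-gene formula is in the paper's supplement:

```python
def risk_score(expression, coefficients):
    """Enet-style risk score: weighted sum of gene expression values."""
    return sum(coefficients[g] * expression[g] for g in coefficients)

def c_index(scores, times, events):
    """Harrell's concordance index: fraction of usable patient pairs in
    which the patient with the higher risk score has the earlier event."""
    concordant, usable = 0.0, 0
    n = len(scores)
    for i in range(n):
        for j in range(n):
            # a pair is usable when patient i had an observed event
            # before patient j's follow-up time
            if events[i] and times[i] < times[j]:
                usable += 1
                if scores[i] > scores[j]:
                    concordant += 1
                elif scores[i] == scores[j]:
                    concordant += 0.5
    return concordant / usable if usable else float("nan")

# Hypothetical coefficients for 3 of the 18 TRGs, and 4 patients
coefs = {"TRG1": 0.8, "TRG2": -0.5, "TRG3": 0.3}
patients = [
    {"TRG1": 2.0, "TRG2": 0.5, "TRG3": 1.0},
    {"TRG1": 0.5, "TRG2": 2.0, "TRG3": 0.2},
    {"TRG1": 1.5, "TRG2": 1.0, "TRG3": 0.8},
    {"TRG1": 0.2, "TRG2": 1.5, "TRG3": 0.1},
]
scores = [risk_score(p, coefs) for p in patients]
times = [12, 60, 24, 55]   # months of follow-up
events = [1, 0, 1, 1]      # 1 = death observed, 0 = censored
print(round(c_index(scores, times, events), 3))
```

A C-index of 0.5 means no discrimination and 1.0 perfect discrimination, which is why the average C-index across cohorts is a natural criterion for picking one model out of 101.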

Identification of TRPS by machine learning. (A) The C-indices of the 101 prognostic models constructed by 10 machine learning algorithms in the training and testing cohorts. (B–G) Survival curves of ovarian cancer patients stratified by TRPS score, and the corresponding ROC curves, in the TCGA, GSE14764, GSE26193, GSE26712, GSE63885 and GSE140082 cohorts.

To compare the performance of the TRPS with other prognostic signatures in predicting the OS of OC cases, we collected 45 OC-related prognostic signatures (Supplementary Table 1) and calculated their C-indices. The C-index of the TRPS was higher than that of most of these signatures in the TCGA dataset (Fig. 4A). Moreover, the C-index of the TRPS was higher than that of tumor grade and clinical stage in the training and testing cohorts (Fig. 4B–F). This evidence suggests that the TRPS predicts the clinical outcome of OC patients better than most published signatures and clinical characteristics. However, we could not perform this comparison in the GSE26712 cohort because tumor grade and clinical stage data were missing. Based on univariate and multivariate Cox regression analyses, the TRPS served as an independent risk factor for the clinical outcome of OC patients in the TCGA, GSE14764, GSE26193, GSE63885 and GSE140082 cohorts (Fig. 4G,H, all P < 0.05). To predict the 1-, 3- and 5-year OS rates of OC patients, we then constructed a nomogram based on TRPS, clinical stage and tumor grade using the TCGA dataset (Fig. 4I). The predicted calibration curve closely matched the ideal curve in the TCGA dataset (Fig. 4J). The AUC of the nomogram was higher than that of the TRPS, clinical stage, and tumor grade in the TCGA dataset (Fig. 4K).

Evaluation of the performance of the TRPS in predicting the prognosis of OC patients. (A) C-indices of the TRPS and 45 other established signatures in predicting the prognosis of OC patients. (B–F) C-indices of the TRPS, tumor grade and clinical stage in the TCGA, GSE14764, GSE26193, GSE63885 and GSE140082 cohorts. (G,H) Univariate and multivariate Cox regression analyses considering grade, stage and TRPS in the training and testing cohorts. (I,J) Predictive nomogram and calibration curves evaluating the 1-, 3- and 5-year overall survival rates of OC patients. (K) ROC curves evaluating the performance of the nomogram in predicting the prognosis of OC patients.

As shown in Fig. 5A, the TRPS showed significant correlations with the abundance of immune cells in the TCGA dataset (all P < 0.05). More specifically, the TRPS was negatively correlated with the infiltration of immune-activated cells, such as CD8+ T cells, plasma cells, M1 macrophages and NK cells, in the TCGA dataset (Fig. 5B–E, all P < 0.05). Interestingly, a higher risk score indicated a higher level of cancer-associated fibroblasts in the TCGA dataset (Fig. 5F). Similar results were obtained in the ssGSEA analysis, which suggested a higher abundance of immune-activated cells, including aDCs, B cells, CD8+ T cells, neutrophils, NK cells, Tfh and TIL, in the low risk score group in the TCGA dataset (Fig. 5G, all P < 0.05). Previous studies showed that macrophage M2/M1 polarization plays a vital role in cancer progression9,10. Our study showed that OC patients with a high risk score had higher macrophage M2/M1 polarization in the TCGA, GSE26712, and GSE140082 cohorts (Fig. 5H, all P < 0.05). Further analysis suggested higher stromal, immune and ESTIMATE scores in the low risk score group in the TCGA dataset (Fig. 5I, all P < 0.001). Moreover, a higher risk score indicated higher APC co-stimulation, CCR, cytolytic activity, inflammation-promoting, parainflammation and T cell co-stimulation scores in the TCGA dataset (Fig. 5J).
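Correlations between a continuous risk score and estimated immune-cell abundance are usually rank (Spearman) correlations. A minimal implementation on invented values is below; in the paper the abundances came from seven published deconvolution algorithms, which this sketch does not reproduce:

```python
def rank(values):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Invented values: as the TRPS score rises, CD8+ T-cell abundance falls
trps_score = [0.1, 0.4, 0.7, 1.2, 1.9, 2.5]
cd8_abundance = [0.30, 0.28, 0.22, 0.15, 0.09, 0.05]
print(round(spearman(trps_score, cd8_abundance), 2))
```

A rho near −1, as in this toy example, corresponds to the negative TRPS-versus-CD8+ trend described for Fig. 5B.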

Correlation between the immune microenvironment and the TRPS in OC. (A) Seven state-of-the-art algorithms evaluating the correlation between the TRPS and immune cell infiltration in OC. (B–F) Correlation between the TRPS and the abundance of CD8+ T cells, plasma cells, M1 macrophages and CAFs. (G) Levels of immune cells in the different TRPS score groups based on ssGSEA analysis. (H) Macrophage M2/M1 ratio in the different TRPS score groups in the TCGA, GSE26712 and GSE140082 datasets. (I,J) Stromal score, immune score, ESTIMATE score and immune-related function scores in the different TRPS score groups. *P < 0.05, **P < 0.01, ***P < 0.001.

High HLA-related gene expression indicates a wider range of antigen presentation, increasing the likelihood of presenting more immunogenic antigens and of benefiting from immunotherapy11. We found that OC patients with a low risk score had higher expression of HLA-related genes in the TCGA dataset (Fig. 6A, all P < 0.05). Immune checkpoints play a vital role in the immune escape of cancer. In our results, the expression of most immune checkpoints was higher in the high risk score group in the TCGA dataset (Fig. 6B, all P < 0.05). A previous study showed that a high TMB score correlates with a better response to immunotherapy12. The IPS is a superior predictor of response to anti-CTLA-4 and anti-PD-1 antibodies, and a high IPS indicates a better response to immunotherapy13. A high TIDE score indicates a greater likelihood of immune escape and less effective ICI treatment14. As shown in Fig. 6C–F, OC patients with a low risk score had a higher TMB score; higher PD1, CTLA4, and PD1&CTLA4 immunophenoscores; and lower immune escape, TIDE, and T cell exclusion and dysfunction scores in the TCGA dataset. Thus, OC patients with a low risk score may benefit more from immunotherapy. To verify the predictive value of the TRPS for immunotherapy benefit, we then applied it to two immunotherapy cohorts. As shown in Fig. 6G, the risk score of non-responders was significantly higher than that of responders in the IMvigor210 cohort (P < 0.01). Moreover, a high risk score indicated a poor clinical outcome and a lower response rate in the IMvigor210 cohort (Fig. 6G). Similar results were obtained in the GSE91061 cohort (Fig. 6H). Given the vital role of chemotherapy, targeted therapy and endocrine therapy in the treatment of OC, we also estimated the IC50 values of common drugs in OC patients.
We found that the IC50 values of 5-Fluorouracil, Camptothecin, Cisplatin, Gemcitabine, Foretinib, a KRAS inhibitor, Erlotinib, and Tamoxifen were higher in OC patients with a high risk score in the TCGA dataset (Fig. 7A, all P < 0.05). Moreover, the risk score was positively correlated with the IC50 values of these drugs in the TCGA dataset (Fig. 7B). Thus, OC patients with a low risk score may be more sensitive to chemotherapy and targeted therapy.

TRPS as an indicator of immunotherapy response in OC. (A,B) Levels of HLA-related genes and immune checkpoints in the different TRPS score groups. (C–F) TMB score, immunophenoscore, immune escape score, and TIDE, T cell dysfunction and exclusion scores in the different TRPS score groups. (G,H) Overall survival rate and immunotherapy response rate in patients with high and low risk scores in the GSE91061 and IMvigor210 cohorts. *P < 0.05, **P < 0.01, ***P < 0.001.

IC50 values of common drugs in the different TRPS score groups. (A) A low risk score indicated lower IC50 values of common drugs. (B) Correlation between the IC50 values of common drugs and the TRPS score.

We finally performed gene set enrichment analysis to explore the potential mechanisms underlying the differences in clinical outcome, immune infiltration, and therapy response among OC patients. A high risk score indicated higher scores for angiogenesis, DNA repair, EMT, G2M checkpoint, glycolysis, hypoxia, IL2-STAT5 signaling, IL6-JAK-STAT3 signaling, MTORC1 signaling, NOTCH signaling, the P53 pathway, and PI3K-AKT-mTOR signaling in OC in the TCGA dataset (Fig. 8A–L, all P < 0.05).
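The hallmark-pathway scores above come from gene set enrichment analysis, whose core statistic is a running enrichment score. The simplified, unweighted Kolmogorov-Smirnov-style walk below illustrates the idea on an invented ranked gene list and a hypothetical "hypoxia-like" set (the real analysis uses weighted steps and permutation-based significance):

```python
def enrichment_score(ranked_genes, gene_set):
    """Unweighted GSEA-style running sum: step up when a ranked gene is
    in the set, down otherwise; ES is the maximum deviation from zero."""
    hits = sum(1 for g in ranked_genes if g in gene_set)
    misses = len(ranked_genes) - hits
    up, down = 1.0 / hits, 1.0 / misses
    running, best = 0.0, 0.0
    for g in ranked_genes:
        running += up if g in gene_set else -down
        if abs(running) > abs(best):
            best = running
    return best

# Invented ranking (most up-regulated in high-TRPS tumors first)
ranked = ["VEGFA", "HIF1A", "TP53", "GAPDH", "ACTB", "MYC", "CDK1", "EGFR"]
hypoxia_like = {"VEGFA", "HIF1A", "MYC"}  # hypothetical gene set
print(round(enrichment_score(ranked, hypoxia_like), 2))
```

A strongly positive ES, as here, means the set's genes cluster near the top of the ranking, i.e. the pathway is enriched in the high-risk group.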

Gene set enrichment analysis in the different TRPS score groups. A high risk score indicated a higher score for angiogenesis (A), DNA repair (B), EMT (C), G2M checkpoint (D), glycolysis (E), hypoxia (F), IL2-STAT5 signaling (G), IL6-JAK-STAT3 signaling (H), MTORC1 signaling (I), NOTCH signaling (J), the P53 pathway (K), and PI3K-AKT-mTOR signaling (L).

To further verify the performance of the TRPS, we selected ARL6IP5, the gene contributing most to the TRPS, for further analysis. We first examined the expression of ARL6IP5 in OC cell lines and found it to be lower than in normal cells (Fig. 9A). Typical immunohistochemical staining of ARL6IP5 in OC and normal tissues is shown in Fig. 9B. In the follow-up study, CCK-8 assays showed that overexpression of ARL6IP5 markedly inhibited the proliferation of SKOV3 and TOV21G cells (Fig. 9C,D).

Validation of the potential function of ARL6IP5 in OC by in vitro assays. (A) Comparison of ARL6IP5 expression in normal and OC cell lines. (B) Typical immunohistochemical staining of ARL6IP5 in OC and normal tissues. (C,D) CCK-8 assays showed that overexpression of ARL6IP5 markedly inhibited the proliferation of SKOV3 and TOV21G cells. *P < 0.05, **P < 0.01.

See the original post:
Machine learning developed a CD8+ exhausted T cells signature for predicting prognosis, immune infiltration and drug ... - Nature.com

Single Transit Detection In Kepler With Machine Learning And Onboard Spacecraft Diagnostics – Astrobiology – Astrobiology News

Best-fit transit models for all 8 of the visible transits of KOI 1271.01 within Kepler. Overlaid with the models is a scatter plot of the normalized flux values for the 4-day window, with the associated error bars given by Kepler. The models were found using EXOPLANET, and the only fitted parameter was the time of transit center. The title of each subplot is the epoch number along with the best-fit time of the transit center. We fit the models over a 4-day range around the predicted time of transit using the ephemeris of KOI 1271.01. Therefore, the location of the transit within the window gives a hint of the order of magnitude of the epoch's TTV.
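The caption's point, that a transit's offset within the search window reflects its timing variation, reduces to subtracting the linear-ephemeris prediction from the fitted transit center. A minimal sketch, with illustrative numbers that are not KOI 1271.01's actual ephemeris:

```python
def predicted_center(t0, period, epoch):
    """Linear ephemeris: transit center expected at t0 + epoch * period."""
    return t0 + epoch * period

def ttv(observed_center, t0, period, epoch):
    """Transit timing variation: observed minus predicted center (days)."""
    return observed_center - predicted_center(t0, period, epoch)

# Illustrative values only (days, arbitrary time zero point)
t0, period = 135.20, 161.95
observed = 1107.31            # hypothetical best-fit center for epoch 6
print(round(ttv(observed, t0, period, epoch=6), 2))
```

A TTV of a few tenths of a day would place the transit visibly off-center in a 4-day window, which is exactly the visual cue the caption describes.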

Exoplanet discovery at long orbital periods requires reliably detecting individual transits without additional information about the system. Techniques like phase-folding of light curves and periodogram analysis of radial velocity data are more sensitive to planets with shorter orbital periods, leaving a dearth of planet discoveries at long periods.

We present a novel technique using an ensemble of Convolutional Neural Networks that incorporates Kepler's onboard spacecraft diagnostics to classify transits within a light curve. We create a pipeline to recover the locations of individual transits, and the period of the orbiting planet, which maintains >80% transit recovery sensitivity out to an 800-day orbital period.
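The detection task can be illustrated with a toy stand-in for the paper's method: here a simple box-shaped dip detector replaces the learned CNN classifier, while the ensemble-style step (averaging scores from several shifted windows before accepting a detection) is kept. All names, window sizes, and thresholds are invented for illustration:

```python
def dip_score(window):
    """Depth of the window's center relative to its edges; a crude
    stand-in for a learned transit classifier's score."""
    edge = (sum(window[:3]) + sum(window[-3:])) / 6
    center = sum(window[len(window)//2 - 1 : len(window)//2 + 2]) / 3
    return edge - center  # positive when the center is fainter

def detect_transit(flux, width=9, threshold=0.005):
    """Ensemble-style vote: average dip scores from windows at small
    offsets around each position, return the best-scoring center."""
    best_pos, best_score = None, threshold
    for pos in range(width, len(flux) - width):
        votes = []
        for offset in (-1, 0, 1):  # tiny "ensemble" of shifted windows
            w = flux[pos + offset - width//2 : pos + offset + width//2 + 1]
            votes.append(dip_score(w))
        avg = sum(votes) / len(votes)
        if avg > best_score:
            best_pos, best_score = pos, avg
    return best_pos

# Synthetic normalized light curve: flat at 1.0 with one 1% dip
flux = [1.0] * 60
for k in range(28, 33):
    flux[k] = 0.99
print(detect_transit(flux))
```

The averaging step is what an ensemble buys you: a single noisy window score can fire spuriously, but agreement across shifted windows (or, in the paper, across independently trained networks) suppresses false alarms.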

Our neural network pipeline has the potential to discover additional planets in the Kepler dataset and, crucially, within the η-Earth regime. We report our first candidate from this pipeline, KOI 1271.02. KOI 1271.01 is known to exhibit strong Transit Timing Variations (TTVs), so we jointly model the TTVs and transits of both transiting planets to constrain the orbital configuration and planetary parameters. We conclude with a series of potential parameters for KOI 1271.02, as there is currently not enough data to uniquely constrain the system.

We conclude that KOI 1271.02 has a radius of 5.32 ± 0.20 R⊕ and a mass of 28.94 (+0.23/−0.47) M⊕. Future constraints on the nature of KOI 1271.02 require measuring additional TTVs of KOI 1271.01 or observing a second transit of KOI 1271.02.

Matthew T. Hansen, Jason A. Dittmann

Comments: 23 pages, 23 figures, submitted to AJ
Subjects: Earth and Planetary Astrophysics (astro-ph.EP); Instrumentation and Methods for Astrophysics (astro-ph.IM); Machine Learning (cs.LG)
Cite as: arXiv:2403.03427 [astro-ph.EP] (or arXiv:2403.03427v1 [astro-ph.EP] for this version)
Submission history: From: Matthew Hansen [v1] Wed, 6 Mar 2024 03:16:47 UTC (1,474 KB)
https://arxiv.org/abs/2403.03427


Originally posted here:
Single Transit Detection In Kepler With Machine Learning And Onboard Spacecraft Diagnostics - Astrobiology - Astrobiology News

Putting the AI in NIA: New opportunities in artificial intelligence – National Institute on Aging

Acknowledgments: Many thanks to the NIA AI Working Group members for their contributions to this blog post.

Artificial intelligence (AI), the science of computer systems that can mimic human-like thinking and decision-making processes, has continued to evolve since our 2022 blog on this topic. With that growth comes added fascination with AI's possibilities and caution about its potential pitfalls.

Beyond the headlines, the aging science community is most excited about how AI and the related field of machine learning (ML) can turbocharge tools and models to accelerate research in Alzheimer's disease and related dementias as well as other complex health challenges.

As NIA continues to expand its portfolio of AI/ML initiatives, be sure to check out our latest funding opportunity on multi-scale computational models in aging and Alzheimer's (RFA-AG-25-016), with an application deadline of June 13, 2024. This RFA encompasses a variety of computational approaches, such as mathematical and computational modeling, image analysis, AI, and ML, to better understand aging processes and Alzheimer's and related dementias across molecules, cells, and cellular networks, and how they affect cognition and behavior.

If you're interested in learning more, the NIH Center for Alzheimer's and Related Dementias (CARD) has numerous training opportunities, open-access resources, and tools to help investigators take advantage of AI and ML capabilities. For example, GenoML, an open-source project created by CARD staff and collaborators, offers a streamlined approach to machine learning in genomics and has been downloaded more than 15,000 times since its launch.

NIA also participates in broad efforts to advance cutting-edge AI research in partnership with other federal and international funders through programs such as:

NIA recognizes the transformative potential of AI in analyzing complex datasets, accelerating the understanding of Alzheimer's pathology, and identifying novel treatment avenues. Together, we hope these advanced tools and methods will help us better understand the aging process and find a cure for dementia and other age-related diseases.

To be a part of the next chapter, apply for the latest multi-scale computational models in aging and Alzheimer's funding opportunity by June 13. To learn more, visit the NIA AI page. As always, we invite comments below!

Read the rest here:
Putting the AI in NIA: New opportunities in artificial intelligence - National Institute on Aging

Uncertainty-aware deep learning for trustworthy prediction of long-term outcome after endovascular thrombectomy … – Nature.com

Visit link:
Uncertainty-aware deep learning for trustworthy prediction of long-term outcome after endovascular thrombectomy ... - Nature.com

AI Engineer Salary: The Lucrative World of AI Engineering – Simplilearn

A few decades ago, the term Artificial Intelligence was reserved for scientific circles and tech enthusiasts who wanted to sound cool. But ever since the term was coined in 1955, AI has only grown in popularity. Today, you wouldn't find a technology magazine that doesn't mention artificial intelligence in every other paragraph.

Here's a quick video explaining the rise in demand for AI engineers and trends in an AI engineer's salary worldwide.

An AI Engineer is a professional skilled in developing, programming, and implementing artificial intelligence (AI) systems and applications. Their expertise lies in using algorithms, data sets, and machine learning (ML) principles to create intelligent systems that perform tasks typically requiring human intelligence. These tasks may include problem-solving, decision-making, natural language processing, and understanding human speech.

AI Engineers work across various stages of AI project development, from conceptualizing and designing AI models to deploying and maintaining these systems in production environments. Their responsibilities often encompass:

AI Engineers typically have a strong foundation in computer science, mathematics, and statistics, with specialized knowledge in machine learning, deep learning, natural language processing, and computer vision. They must also be proficient in programming languages commonly used in AI, such as Python, and tools and frameworks like TensorFlow, PyTorch, and Keras.

Due to the interdisciplinary nature of AI, engineers often collaborate with data scientists, software engineers, and domain experts to develop solutions tailored to specific business needs or research objectives. The role requires continuous learning to keep up with the rapidly evolving field of artificial intelligence.

Before getting to the question at hand, we need to know the top AI engineer job roles. Machine Learning (ML) Engineer, Data Scientist, Data Analyst, Computer Vision Engineer, Business Intelligence Developer, and Algorithm Engineer are just some of the many positions that come under the umbrella of AI engineering. Each of these positions entails a different job profile, but, generally speaking, most AI engineers deal with designing and creating AI models. Everything from maintenance to performance supervision of the model is the responsibility of the AI engineer.

Most AI engineers come from a computer science background and have strong programming skills, a non-negotiable part of an AI engineer's position. Proficiency in Python and object-oriented programming is highly desirable. But for an AI engineer, what matters even more than specific languages is programming aptitude. Since the whole point of an AI system is to work without human supervision, AI algorithms are very different from traditional code, so the AI engineer must be able to design algorithms that are adaptable and capable of evolving.

Other than programming, an AI engineer needs to be conversant in an assortment of disciplines like robotics, physics, and mathematics. Mathematical knowledge is especially crucial as linear algebra and statistics play a vital role in designing AI models.
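As an illustrative example of why linear algebra matters (a sketch added here, not from the original article): fitting even the simplest model, linear regression, reduces to solving a matrix least-squares problem, which NumPy handles in one call.

```python
import numpy as np

# Fit y ~ X @ w by minimising ||X w - y||^2 -- a pure linear-algebra
# problem, solved here with NumPy's least-squares routine.
X = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])          # first column models the intercept
y = np.array([3.0, 5.0, 7.0, 9.0])  # generated from y = 1 + 2x

w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # recovers intercept ~1 and slope ~2
```

The statistics side enters the same way: the residuals of this fit are what concepts like variance and confidence intervals are computed from.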

Read More: Gaurav Tyagi's love for learning inspired him to upskill with our AI For Decision Making: Business Strategies And Applications course. Read about his journey and his experience with our course in his Simplilearn AI Program Review.

At the moment, AI engineering is one of the most lucrative career paths in the world. The AI job market has been growing at a phenomenal rate for some time now. The entry-level annual average AI engineer salary in India is around 10 lakhs, which is significantly higher than the average salary of any other engineering graduate. At high-level positions, the AI engineer salary can be as high as 50 lakhs.

AI engineers earn an average salary of well over $100,000 annually. According to Glassdoor, the average national salary is over $110,000, and salaries at the high end reach $150,000.

However, you must note that these figures can vary significantly based on several factors, including experience, job role, industry, and location.

Companies Hiring for Artificial Intelligence Engineers:

Companies and startups currently hiring in AI include IBM, Fractal.ai, JPMorgan, Intel, Oracle, and Microsoft.

City (India)    Average Salary (Annual)
Bangalore       12,00,000
Hyderabad       10,00,000
Mumbai          15,00,000
Chennai         8,00,000
Delhi           12,00,000

The salary for AI professionals in India can vary based on a variety of factors, including experience, job role, industry, and location; broadly, salaries rise with experience.

It's important to note that these figures are just estimates and can vary based on individual circumstances. Industry and location also play a role in determining AI salaries: industries such as finance, healthcare, and technology typically pay more, and cities such as Bangalore, Mumbai, and Delhi generally pay more than other cities in India.

If you're interested in pursuing a career in Artificial Intelligence (AI), a few concrete steps can help you get started.

By following these steps, you can build a successful career in AI and become a valuable contributor to the field.

The top 7 countries with the maximum opportunities for Artificial Intelligence (AI) Professionals are:

There are various positions that an AI engineer can take up, and an AI engineer's salary depends on the market demand for his/her job profile. Presently, ML engineers are in greater demand and hence command a relatively higher package than other AI engineers. Similarly, the greater the experience in artificial intelligence, the higher the salary companies will offer. Although you can become an AI engineer without a Master's degree, it is imperative that you keep updating and growing your skill set to remain competitive in the ever-evolving world of AI engineering.

There are a number of exciting and in-demand jobs in the field of artificial intelligence (AI). Here are some of the top AI jobs that you may want to consider:

As a machine learning engineer, you will be responsible for developing and implementing algorithms that enable computers to learn from data. This includes working with large data sets, designing and testing machine learning models, and tuning algorithms for efficient execution.
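That train-then-test loop can be sketched in a few lines. This is an illustrative toy (a nearest-centroid classifier on synthetic data, assumed for this example, not taken from the article), but it shows the core discipline of the role: hold out data the model never saw during training and measure performance on it.

```python
import numpy as np

# Toy 1-D dataset: two classes drawn around centres 0 and 5.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(0, 1, 50), rng.normal(5, 1, 50)])
y = np.array([0] * 50 + [1] * 50)

# Split: the test set is held out and never used for training.
idx = rng.permutation(100)
train, test = idx[:80], idx[80:]

# "Train": compute one centroid per class from training data only.
centroids = {c: X[train][y[train] == c].mean() for c in (0, 1)}

# "Test": predict the nearest centroid, then score on held-out labels.
pred = np.array([min(centroids, key=lambda c: abs(x - centroids[c]))
                 for x in X[test]])
accuracy = (pred == y[test]).mean()
print(f"test accuracy: {accuracy:.2f}")
```

Real ML engineering swaps the toy classifier for neural networks or gradient-boosted trees and adds cross-validation and hyperparameter tuning, but the train/evaluate separation stays the same.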

Data scientists use their expertise in statistics, mathematics, and computer science to analyze complex data sets. They work with organizations to gain insights that can be used to improve decision-making.

As an AI researcher, you will be responsible for investigating and developing new artificial intelligence algorithms and applications. This includes conducting research, writing papers, and presenting your findings at conferences.

Software engineers develop the software that enables computers to function. This includes creating algorithms, testing code, and debugging programs.

Systems engineers design and oversee the implementation of complex systems. This includes planning and coordinating system development, ensuring compatibility between components, and troubleshooting issues.

Hardware engineers design and oversee the manufacture of computer hardware components. This includes circuit boards, processors, and memory devices.

Network engineers design and implement computer networks. This includes configuring networking equipment, developing network architectures, and troubleshooting network problems.

Database administrators maintain databases and ensure that data is stored securely and efficiently. This includes designing database structures, implementing security measures, and backing up data.

Information security analysts plan and implement security measures to protect computer networks and systems. This includes researching security threats, assessing risks, and developing countermeasures.

User experience designers create user interfaces that are both effective and efficient. This includes developing navigation schemes, designing graphical elements, and testing prototypes.

These are just a few of the many exciting and in-demand jobs in the field of artificial intelligence. With the right skills and experience, you can find a position that matches your interests and abilities.

Just as AI is transforming the business landscape, it is also opening up new opportunities in the recruiting sphere, with companies and recruiters across technology, finance, and other industries actively hiring for AI roles.

These are just some of the top companies and recruiters who are hiring for AI roles. If you have the right skills and experience, don't hesitate to apply!

There are a few key things you can do to help boost your AI salary. First, focus on acquiring in-demand skills. One of the best ways to do this is to enroll in a top-rated certification program. Second, keep up with the latest industry trends and developments. Finally, consider pursuing management or leadership roles within your organization. By taking these steps, you can position yourself for success and earn a higher salary in the AI field.

Supercharge your career in AI and ML with Simplilearn's comprehensive courses. Gain the skills and knowledge to transform industries and unleash your true potential. Enroll now and unlock limitless possibilities!

Even as you read this article, the demand for AI is booming across the globe. AI engineer salaries will keep rising as industries like tech, financial services, and medical research turn to artificial intelligence. As more global brands like Google and Nvidia dive deeper into Artificial Intelligence (AI), the demand and the salaries for AI engineers will only go upwards in 2024 and the decades to follow. Even government agencies in many developed and developing nations will open up AI engineer positions as they realize the enormous impact AI can have on the defense and governance sector.

Given the current pandemic scenario, an active job hunt may be better left until next year; the time you have right now will be far better utilized in upgrading your AI repertoire.

Unlike most other fields, the AI of tomorrow will look nothing like the AI of today. It is evolving at a breathtaking speed, and to ensure your Artificial Intelligence (AI) skills stay relevant to current market needs, you need to keep upgrading them. If you wish to get a step closer to these lucrative salaries, sharpen your AI skills with the world-class Artificial Intelligence Engineer program, and, before you know it, you will be standing in the world of AI engineers!

The salary of an AI Engineer in India can range from 8 lakhs to 50 lakhs annually.

The starting salary for an AI Engineer in India can be around 8 lakhs annually.

50 lakhs is the highest salary for an AI Engineer in India.

As experience and position increase, so does the salary.

IT is one of the highest-paying industries for AI Engineers.

Popular skills for AI Engineers to have are programming languages, data engineering, exploratory data analysis, deploying, modelling, and security.

Average Artificial Intelligence Engineer salary in the US is around $100k annually.

Top 5 Artificial Intelligence Jobs in the US are Machine Learning Engineer, Data Scientist, Business Intelligence Developer, Research Scientist, and Big Data Engineer/Architect.

The lowest salary for an AI Engineer in the US is around $100k annually.

The highest salary can go over $150k to $200k annually.

See the original post here:
AI Engineer Salary: The Lucrative World of AI Engineering - Simplilearn