Archive for the ‘Machine Learning’ Category

The Machine Learning Guide for Predictive Accuracy: Interpolation and Extrapolation – Towards Data Science

# Imports inferred from the class body (not shown in the original excerpt)
import numpy as np
import xgboost as xgb
import lightgbm as lgbm
import plotly.graph_objects as go
from plotly.subplots import make_subplots
from gplearn.genetic import SymbolicRegressor
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, VotingRegressor, StackingRegressor
from sklearn.preprocessing import MinMaxScaler


class ModelFitterAndVisualizer:
    def __init__(self, X_train, y_train, y_truth, scaling=False, random_state=41):
        """
        Initialize the ModelFitterAndVisualizer class with training and testing data.

        Parameters:
            X_train (pd.DataFrame): Training data features
            y_train (pd.Series): Training data target
            y_truth (pd.Series): Ground truth for predictions
            scaling (bool): Flag to indicate if scaling should be applied
            random_state (int): Seed for random number generation
        """
        self.X_train = X_train
        self.y_train = y_train
        self.y_truth = y_truth

        self.initialize_models(random_state)

        self.scaling = scaling

    # Initialize models
    # -----------------------------------------------------------------
    def initialize_models(self, random_state):
        """
        Initialize the models to be used for fitting and prediction.

        Parameters:
            random_state (int): Seed for random number generation
        """
        # Define kernel for GPR
        kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)

        # Define Ensemble Models Estimator
        # Decision Tree + Kernel Method
        estimators_rf_svr = [
            ('rf', RandomForestRegressor(n_estimators=30, random_state=random_state)),
            ('svr', SVR(kernel='rbf')),
        ]
        estimators_rf_gpr = [
            ('rf', RandomForestRegressor(n_estimators=30, random_state=random_state)),
            ('gpr', GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=random_state))
        ]
        # Decision Trees
        estimators_rf_xgb = [
            ('rf', RandomForestRegressor(n_estimators=30, random_state=random_state)),
            ('xgb', xgb.XGBRegressor(random_state=random_state)),
        ]

        self.models = [
            SymbolicRegressor(random_state=random_state),
            SVR(kernel='rbf'),
            GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=random_state),
            DecisionTreeRegressor(random_state=random_state),
            RandomForestRegressor(random_state=random_state),
            xgb.XGBRegressor(random_state=random_state),
            lgbm.LGBMRegressor(n_estimators=50, num_leaves=10, min_child_samples=3, random_state=random_state),
            VotingRegressor(estimators=estimators_rf_svr),
            StackingRegressor(estimators=estimators_rf_svr, final_estimator=RandomForestRegressor(random_state=random_state)),
            VotingRegressor(estimators=estimators_rf_gpr),
            StackingRegressor(estimators=estimators_rf_gpr, final_estimator=RandomForestRegressor(random_state=random_state)),
            VotingRegressor(estimators=estimators_rf_xgb),
            StackingRegressor(estimators=estimators_rf_xgb, final_estimator=RandomForestRegressor(random_state=random_state)),
        ]

        # Define graph titles
        self.titles = [
            "Ground Truth", "Training Points",
            "SymbolicRegressor", "SVR", "GPR",
            "DecisionTree", "RForest",
            "XGBoost", "LGBM",
            "Vote_rf_svr", "Stack_rf_svr__rf",
            "Vote_rf_gpr", "Stack_rf_gpr__rf",
            "Vote_rf_xgb", "Stack_rf_xgb__rf",
        ]

    def fit_models(self):
        """
        Fit the models to the training data.

        Returns:
            self: Instance of the class with fitted models
        """
        if self.scaling:
            scaler_X = MinMaxScaler()
            self.X_train_scaled = scaler_X.fit_transform(self.X_train)
        else:
            self.X_train_scaled = self.X_train.copy()

        for model in self.models:
            model.fit(self.X_train_scaled, self.y_train)

        return self

    def visualize_surface(self, x0, x1, width=400, height=500,
                          num_panel_columns=5,
                          vertical_spacing=0.06, horizontal_spacing=0,
                          output=None, display=False, return_fig=False):
        """
        Visualize the prediction surface for each model.

        Parameters:
            x0 (np.ndarray): Meshgrid for feature 1
            x1 (np.ndarray): Meshgrid for feature 2
            width (int): Width of the plot
            height (int): Height of the plot
            output (str): File path to save the plot
            display (bool): Flag to display the plot
        """
        num_plots = len(self.models) + 2
        num_panel_rows = num_plots // num_panel_columns

        whole_width = width * num_panel_columns
        whole_height = height * num_panel_rows

        specs = [[{'type': 'surface'} for _ in range(num_panel_columns)] for _ in range(num_panel_rows)]
        fig = make_subplots(rows=num_panel_rows, cols=num_panel_columns,
                            specs=specs, subplot_titles=self.titles,
                            vertical_spacing=vertical_spacing,
                            horizontal_spacing=horizontal_spacing)

        for i, model in enumerate([None, None] + self.models):
            # Assign the subplot panels
            row = i // num_panel_columns + 1
            col = i % num_panel_columns + 1

            # Plot training points
            if i == 1:
                fig.add_trace(go.Scatter3d(x=self.X_train[:, 0], y=self.X_train[:, 1], z=self.y_train,
                                           mode='markers',
                                           marker=dict(size=2, color='darkslategray'),
                                           name='Training Data'),
                              row=row, col=col)

                surface = go.Surface(z=self.y_truth, x=x0, y=x1, showscale=False, opacity=.4)
                fig.add_trace(surface, row=row, col=col)

            # Plot predicted surface for each model and ground truth
            else:
                y_pred = self.y_truth if model is None else model.predict(np.c_[x0.ravel(), x1.ravel()]).reshape(x0.shape)
                surface = go.Surface(z=y_pred, x=x0, y=x1, showscale=False)
                fig.add_trace(surface, row=row, col=col)

            fig.update_scenes(dict(
                xaxis_title='x0',
                yaxis_title='x1',
                zaxis_title='y',
            ), row=row, col=col)

        fig.update_layout(title='Model Predictions and Ground Truth',
                          width=whole_width,
                          height=whole_height)

        # Change camera angle
        camera = dict(
            up=dict(x=0, y=0, z=1),
            center=dict(x=0, y=0, z=0),
            eye=dict(x=-1.25, y=-1.25, z=2)
        )
        for i in range(num_plots):
            fig.update_layout(**{f'scene{i+1}_camera': camera})

        if display:
            fig.show()

        if output:
            fig.write_html(output)

        if return_fig:
            return fig
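
A minimal usage sketch follows, assuming a synthetic two-feature surface; the meshgrid, training points, and output file name are illustrative and not from the original article.

# Hypothetical usage of the class above on synthetic data (not from the original article).
import numpy as np

rng = np.random.default_rng(41)

# Ground-truth surface y = sin(x0) * cos(x1) evaluated on a 50 x 50 meshgrid
x0, x1 = np.meshgrid(np.linspace(-3, 3, 50), np.linspace(-3, 3, 50))
y_truth = np.sin(x0) * np.cos(x1)

# A small set of noisy training points drawn from the same surface
X_train = rng.uniform(-3, 3, size=(40, 2))
y_train = np.sin(X_train[:, 0]) * np.cos(X_train[:, 1]) + rng.normal(0, 0.05, size=40)

mfv = ModelFitterAndVisualizer(X_train, y_train, y_truth, scaling=False, random_state=41)
mfv.fit_models()
mfv.visualize_surface(x0, x1, output="model_surfaces.html", display=False)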

Read this article:
The Machine Learning Guide for Predictive Accuracy: Interpolation and Extrapolation - Towards Data Science

Machine learning-based decision support model for selecting intra-arterial therapies for unresectable hepatocellular … – Nature.com

See the rest here:
Machine learning-based decision support model for selecting intra-arterial therapies for unresectable hepatocellular ... - Nature.com

Here are 7 free AI classes you can take online from top tech firms, universities – Fortune

Almost a quarter of global jobs are expected to change within the next five years thanks to AI, and with only a small percentage of workers skilled in the field, the rush to learn the ins and outs of AI is more important than ever.

"AI is providing people with on-demand learning anywhere they are, at any time of day, on any day," says Jared Curhan, a professor of work and organizational studies at MIT's Sloan School of Management. Curhan recently launched two new AI-powered courses focused on strategic negotiation and says the technology is making education more accessible overall through personalized feedback and coaching.

While there are an increasing number of full-fledged AI degree programs, including within business schools, some students may be looking for a simpler or self-paced route. If you're interested in learning more about this in-demand field, several top tech firms and universities offer free online courses that serve as an introduction to AI technologies.

Amazon has more than 100 free and low-cost AI courses and learning resources available through AWS. Learners can pick up basic skills in machine learning, generative AI, and foundation models. The company has committed to providing free AI skills training to 2 million people by 2025.

The machine learning plan includes nearly seven hours of free content in which individuals can learn the foundations of the technology, including relevant terminology and decision-making processes. It also teaches users how to use Amazon SageMaker, the company's machine learning platform used by companies like AT&T and LG.

Google offers a beginner course for anyone who may be interested in how AI is being used in the real world. Google AI for Everyone, which is offered through the online education platform edX, is a self-paced course that takes about four weeks to complete, assuming you dedicate two to three hours per week. Participants learn about both AI and machine learning principles and real-world applications of the technologies.

Google also covers what AI programming looks like and the process of teaching a computer how to learn. The course is taught by Laurence Moroney, who leads AI Advocacy at Google as part of the Google Research into Machine Intelligence (RMI) team. Nearly 12,000 people have enrolled in this free online course, according to edX.

If you're one of the 5.7 million people who have taken Harvard University's CS50 Introduction to Computer Science course through edX, then the university's introductory AI class might be the best option for you. CS50, which is one of the most popular free online courses of all time, is a prerequisite for Harvard's Introduction to Artificial Intelligence with Python course.

This seven-week course covers AI algorithms, game-playing engines, handwriting recognition, and machine translation. Students have to commit between 10 and 30 hours per week to complete the course, which includes hands-on projects and lectures. The course is taught by David J. Malan, a renowned computer scientist and Harvard professor.

IBM, which is recognized as a revolutionary leader in emerging technologies, offers an AI Foundations for Everyone specialization through Coursera. The specialization includes three courses:

The entire specialization takes about three months to complete, assuming you dedicate two hours per week to coursework. Students will learn the basics of what AI is, as well as its applications and ethical concerns. They'll also hear from experts about starting a career in AI. The program is taught by Rav Ahuja and Antonio Cangiano, who work for IBM's Skills Network. Participants earn a certificate upon completion.

Intel has a goal to provide more than 30 million people with AI skills by 2030. As part of this commitment, the company provides dozens of free self-paced courses online on subjects such as deep learning for robotics, deep learning, and natural language processing.

Intel also has several AI Concepts educational pages that will walk you through definitions, real-world examples, tools, and resources for topics such as generative AI, AI inference, and transfer learning. Additionally, the company provides free on-demand webinars on more advanced AI use cases such as optimizing transformer models, optimizing AI workloads, and AI performance tuning.

As part of its Computational Social Science specialization through Coursera, the University of California, Davis offers a course focused on AI: Big Data, Artificial Intelligence, and Ethics. During this four-week course, participants learn about big data and its limitations, the history of artificial intelligence, and research ethics. The entire self-paced course takes about 12 total hours to complete.

The course is taught by Martin Hilbert, who is a professor at UC Davis and serves as a chair for computational social science. The course uses case studies to help participants learn AI concepts. More than 31,000 participants have completed this course, and those who do earn a certificate that can be shared on LinkedIn.

For someone who may be looking to break into AI or who wants to learn more about the applications of this new technology to different industries, the University of Pennsylvania offers a string of courses focused on artificial intelligence. The AI for Business specialization includes four courses:

These beginner courses take a total of about four months to complete and culminate in an applied learning project. Program participants complete peer-reviewed exercises to illustrate what they've learned about data analytics, machine learning tools, and people management. The specialization is taught by eight UPenn professors from the Wharton School, a top-ranked business school by Fortune Education, and other professors from the university. The courses are offered through the online education platform Coursera, and students can earn a certificate that can be displayed on their LinkedIn profile.

There is no one best course or program since AI is still so new. What ultimately matters is your curiosity to learn about AI, which you can do by working directly with prompt engineering or machine learning to gain hands-on skills.

You can certainly learn the foundations of AI in three months, especially if you already have a background in computer science. It is important to keep in mind that because AI is always changing and developing, you will need to keep up to date with the latest trends if you are looking to pursue a career focused on working with the technology.

Taking free AI courses on platforms such as Udemy or Codecademy is a great way to learn AI if you're a beginner. You can also learn AI by watching YouTube videos or reading through AI subreddits. The number of ways to learn AI is only growing, so there is ultimately no perfect path. Above all, just be curious, ask important questions, and don't be afraid to dive down rabbit holes.

Check out all of Fortune's rankings of degree programs, and learn more about specific career paths.

Sydney Lake contributed to this piece.

Read the original:
Here are 7 free AI classes you can take online from top tech firms, universities - Fortune

Identification and validation of potential common biomarkers for papillary thyroid carcinoma and Hashimoto’s thyroiditis … – Nature.com

Identify shared differential genes

When conducting PCA on the expression matrices of GSE35570 (Fig. 2a) and GSE29315 (Fig. 2d), we observed a clear separation between the disease and control samples in both datasets. In the analysis of the GSE35570 dataset, a total of 1572 distinct genes were detected as being differentially expressed. These DEGs comprised 824 up-regulated and 748 down-regulated genes (Fig. 2b). Similarly, we observed 423 DEGs in the GSE29315 dataset, including 271 up-regulated and 152 down-regulated DEGs (Fig. 2e). Next, the DEGs of the two datasets are displayed as heatmaps (Fig. 2c,f). Furthermore, we employed a Venn diagram to identify the overlapping genes with the same directional trend, resulting in 64 up-regulated (Fig. 2g) and 37 down-regulated genes (Fig. 2h).

Differential gene expression analysis, functional enrichment analysis and pathway enrichment analysis. (a) The PCA plot of GSE35570. (b, c) The volcano plot and heatmap of DEGs in GSE35570. (d) The PCA plot of GSE29315. (e, f) The volcano plot and heatmap of DEGs in GSE29315. (g) Venn plot of the up-regulated DEGs. (h) Venn plot of the down-regulated DEGs. (i) The KEGG enrichment analyses of DEGs. (j) The GO enrichment analyses of DEGs.
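
As a rough sketch of the differential-expression and overlap step described above (assuming log2 expression matrices with genes as rows and a simple two-sample t-test; the thresholds, variable names, and test choice are placeholders, not the study's actual pipeline):

# Hypothetical sketch of the DEG screening and Venn-overlap step; data, test, and cutoffs are placeholders.
import numpy as np
import pandas as pd
from scipy.stats import ttest_ind

def degs(expr: pd.DataFrame, is_disease: np.ndarray, lfc_cut: float = 1.0, p_cut: float = 0.05):
    """Return (up, down) gene sets from a log2 expression matrix (genes x samples)."""
    disease = expr.loc[:, is_disease]
    control = expr.loc[:, ~is_disease]
    lfc = disease.mean(axis=1) - control.mean(axis=1)     # log2 fold change per gene
    pvals = ttest_ind(disease, control, axis=1).pvalue    # per-gene two-sample t-test
    up_mask = (lfc.values > lfc_cut) & (pvals < p_cut)
    down_mask = (lfc.values < -lfc_cut) & (pvals < p_cut)
    return set(expr.index[up_mask]), set(expr.index[down_mask])

# Shared DEGs with the same directional trend, as in the Venn diagrams (Fig. 2g,h):
# up_shared = up_35570 & up_29315
# down_shared = down_35570 & down_29315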

In order to enhance our comprehension of the fundamental biological functions linked to the 101 DEGs, GO and KEGG enrichment analyses were conducted using the clusterProfiler package in R. The GO analysis highlighted that these shared genes were mainly enriched in leukocyte mediated immunity, myeloid leukocyte activation, and antigen processing and presentation (Fig. 2j). Additionally, the DEGs exhibited significant enrichment across the top five KEGG pathways, including Tuberculosis, Phagosome, Viral myocarditis, Inflammatory bowel disease, and Th1 and Th2 cell differentiation (Fig. 2i). Evidently, the functions of the differentially expressed genes are closely associated with the immune function of the body; the core genes primarily serve to activate immune cells.

To carry out the PPI analysis, we utilized the STRING online tool and visualized the results in Cytoscape (Supplementary Fig. S1a). The PPI network contained 68 nodes and 498 edges. The DC value of each node was calculated, with a median value of 11. Based on this, we identified 17 hub genes of the PPI network: TYROBP, ITGB2, STAT1, HLA-DRA, C1QB, MMP9, FCER1G, IL10RA, LCP2, LY86, CD53, CD14, CD163, HCK, MNDA, HLA-DPA1, and ALOX5AP. Subsequently, we employed the MCODE plug-in to identify six modules (Supplementary Fig. S1b,c), which together included 29 common DEGs: LCP2, TYROBP, CD53, LY86, ITGB2, FCER1G, MNDA, C1QB, HCK, IL10RA, HLA-DRA, ALOX5AP, MT1G, MT1F, MT1E, MT1X, ISG15, IFIT3, PSMB9, GBP2, CD14, CD163, VSIG4, CAV1, TIMP1, S100A4, SDC2, FGFR2, and STAT1. The most important module comprises 12 genes (LCP2, TYROBP, CD53, LY86, ITGB2, FCER1G, MNDA, C1QB, HCK, IL10RA, HLA-DRA, ALOX5AP), which were further analyzed using the ClueGO plug-in in Cytoscape. The investigation revealed that these genes primarily function in activating neutrophils to participate in the immune response and in activating innate immunity (Supplementary Fig. S1d).
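
A small illustrative sketch of the degree-based hub selection follows; the study computed DC values in Cytoscape on a STRING export, so the edge-list columns and the at-or-above-median cutoff here are assumptions.

# Hypothetical sketch: pick hub genes whose degree is at or above the network median.
import networkx as nx
import pandas as pd

def hub_genes_by_degree(edges: pd.DataFrame) -> list:
    """edges: a two-column frame ('gene_a', 'gene_b') of PPI interactions exported from STRING."""
    g = nx.from_pandas_edgelist(edges, source="gene_a", target="gene_b")
    degree = pd.Series(dict(g.degree()))                       # node degree for every gene
    return sorted(degree[degree >= degree.median()].index)     # keep nodes at or above the median degree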

In this study, we analyzed a total of 26 genes from the six modules extracted by MCODE. To determine the importance of each gene, we employed the RF algorithm in two datasets, GSE35570 (Fig. 3a) and GSE29315 (Fig. 3b). By comparing the rankings of gene importance in both datasets, we identified the top eight genes that were consistently ranked highly. To visualize this overlap, we created a Venn diagram (Fig. 3c), which revealed three genes (CD53, FCER1G and TYROBP) shared between the two datasets. Remarkably, these three genes overlap with the hub genes identified through the PPI analysis based on DC values, as well as with the genes found in the most significant module. These three genes showed promising diagnostic potential for HT and PTC. To evaluate the diagnostic value of the common hub genes, we computed the cutoff value, sensitivity, specificity, AUC, and 95% CI for each gene in the four datasets (Table 1). In the GSE35570 dataset (Fig. 3d), the AUC values were as follows: CD53 (AUC 0.71, 95% CI 0.61–0.82), FCER1G (AUC 0.81, 95% CI 0.73–0.89), and TYROBP (AUC 0.79, 95% CI 0.71–0.88). In the GSE29315 dataset (Fig. 3e), the AUC values were as follows: CD53 (AUC 1.00, 95% CI 1.00–1.00), FCER1G (AUC 1.00, 95% CI 1.00–1.00), and TYROBP (AUC 1.00, 95% CI 1.00–1.00). In the TCGA dataset (Fig. 3f), we validated the diagnostic value of the common hub genes for PTC. The AUC values were as follows: CD53 (AUC 0.71, 95% CI 0.61–0.82), FCER1G (AUC 0.74, 95% CI 0.64–0.89), and TYROBP (AUC 0.80, 95% CI 0.70–0.89). To further evaluate the diagnostic value of the common hub genes for PTC in HT, we computed the AUC and 95% CI for each gene using GSE138198. In the GSE138198 dataset (Fig. 3g), the AUC values were as follows: CD53 (AUC 0.83, 95% CI 0.57–1.00), FCER1G (AUC 0.92, 95% CI 0.72–1.00), and TYROBP (AUC 1.00, 95% CI 1.00–1.00). We also analyzed the difference box plots between the two groups in the four datasets (Supplementary Fig. S2). The box plots revealed a marked disparity in gene expression between the HT group and the control group in GSE29315, which explains why the AUC values of all three hub genes in GSE29315 were 1.

Screening of hub genes and the diagnostic value of hub genes. (a) The rankings of gene importance in GSE35570. (b) The rankings of gene importance in GSE29315. (c) Venn plot of the top eight genes in GSE35570 and GSE29315. (d) Diagnostic value of hub genes in the GSE35570. (e) Diagnostic value of hub genes in the GSE29315. (f) Diagnostic value of hub genes in the TCGA. (g) Diagnostic value of hub genes in the GSE138198.
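
A hedged sketch of the importance-ranking and per-gene ROC step described above (assuming a samples-by-genes expression frame and binary labels; the classifier settings and the bootstrap CI are illustrative, not the paper's exact code):

# Hypothetical sketch: rank candidate genes by random-forest importance, then score single genes by ROC AUC.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

def rank_gene_importance(X: pd.DataFrame, y: np.ndarray, random_state: int = 0) -> pd.Series:
    """X: samples x candidate genes; y: 1 = disease, 0 = control."""
    rf = RandomForestClassifier(n_estimators=500, random_state=random_state).fit(X, y)
    return pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False)

def single_gene_auc(expr: np.ndarray, y: np.ndarray, n_boot: int = 2000, seed: int = 0):
    """AUC of a single gene's expression as a classifier, with a bootstrap 95% CI."""
    auc = roc_auc_score(y, expr)
    rng = np.random.default_rng(seed)
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), len(y))
        if len(np.unique(y[idx])) < 2:          # a resample needs both classes
            continue
        boots.append(roc_auc_score(y[idx], expr[idx]))
    low, high = np.percentile(boots, [2.5, 97.5])
    return auc, (low, high)

# shared_hubs = set(rank_gene_importance(X_35570, y_35570).head(8).index) & \
#               set(rank_gene_importance(X_29315, y_29315).head(8).index)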

By using the GSE35570 dataset, we developed three diagnostic models specifically for PTC, incorporating the pivotal genes identified through our analysis. The ANN model (Fig. 4a) had 4 hidden units and a penalty of 0.0108, and was trained for 537 epochs. The ANN model achieved an AUC of 0.94 (95% CI 0.91–0.98) in the training set, while in the test set the AUC was 0.94 (95% CI 0.83–1.00) (Fig. 4b). The XGBoost model had mtry = 8, min_n = 6, max_depth = 3, learn_rate = 0.001, loss_reduction = 0.07, and sample_size = 0.97. The XGBoost model achieved an AUC of 0.84 (95% CI 0.75–0.93) in the training set, while in the test set the AUC was 0.62 (95% CI 0.42–0.83) (Supplementary Fig. S3a). The DT model had cost_complexity = 0.0003, tree_depth = 5, and min_n = 6. The DT model achieved an AUC of 0.93 (95% CI 0.90–0.97) in the training set, while in the test set the AUC was 0.83 (95% CI 0.65–1.00) (Supplementary Fig. S3b). Supplementary Table S1 displays the predictive performance of the three machine learning models. The results indicate that the ANN model outperformed the other models, leading us to choose the ANN model for further analysis. The TCGA dataset was utilized as an external validation dataset to assess the diagnostic performance of the ANN model for PTC, yielding an AUC of 0.77 (95% CI 0.66–0.87) (Fig. 4c). The GSE138198 dataset was used to evaluate the ANN model's diagnostic efficacy for PTC in HT; in this dataset (Fig. 4d), the ANN model demonstrated a perfect AUC of 1.00 (95% CI 1.00–1.00). To provide clinicians with a better understanding of variable contributions, we utilized the SHAP algorithm to interpret the ANN prediction results. Figure 4e–g illustrates how the attributed importance of features changed as their values varied. Our findings reveal that CD53 had the most significant impact on the output of the ANN model. Initially, it was positively associated with the risk of PTC and then became negatively correlated after a turning point of approximately 6. TYROBP and FCER1G showed a positive correlation with the occurrence of PTC.

ANN model construction and feature importance analysis. (a) The ANN was constructed based on the shared hub genes. (b) Diagnostic value of the ANN model in the GSE35570. (c) Diagnostic value of the ANN model in the TCGA. (d) Diagnostic value of the ANN model in the GSE138198. (e) A score calculated by SHAP was used for each input feature. (f, g) Distribution of the impact of each feature on the full model output estimated using the SHAP values.
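
The following is a minimal sketch of the ANN-plus-SHAP step described above, assuming an sklearn MLP stands in for the paper's 4-hidden-unit ANN (which was built with a different toolchain) and using the model-agnostic KernelExplainer; the hyperparameters simply echo the reported values.

# Hypothetical sketch of fitting a small ANN on the three hub genes and explaining it with SHAP.
import pandas as pd
import shap
from sklearn.neural_network import MLPClassifier

def fit_and_explain(X: pd.DataFrame, y, background_size: int = 50, random_state: int = 0):
    """X: samples x ['CD53', 'FCER1G', 'TYROBP']; y: 1 = PTC, 0 = normal."""
    ann = MLPClassifier(hidden_layer_sizes=(4,), alpha=0.0108, max_iter=537,
                        random_state=random_state).fit(X, y)
    # Model-agnostic SHAP: KernelExplainer only needs a probability function and background data
    background = shap.sample(X, background_size)
    explainer = shap.KernelExplainer(lambda a: ann.predict_proba(a)[:, 1], background)
    shap_values = explainer.shap_values(X)
    shap.summary_plot(shap_values, X, feature_names=list(X.columns))
    return ann, shap_values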

We analyzed the protein expression of the hub genes based on the HPA database (Supplementary Fig. S4). CD53 was highly expressed in both tumor and normal tissues, while FCER1G and TYROBP showed higher expression in tumors compared to normal tissues. Furthermore, IF staining was performed to measure the expression of CD53, FCER1G, and TYROBP in our clinical samples, comprising 10 HT-related PTC tissues and 6 samples of normal tissue adjacent to the tumor (NAT). The IF analysis (Fig. 5) yielded semi-quantitative results indicating significantly elevated fluorescence signal intensities for CD53, FCER1G, and TYROBP in the HT-related PTC group compared to the NAT group (P < 0.05).

Microscopy scan of IF staining showing the distribution of CD53 (green), FCER1G (green), and TYROBP (green) in HT-related PTC tissues and normal tissues adjacent to the tumour (NAT), as well as the diagnostic value of CD53, FCER1G and TYROBP. MFI: Mean Fluorescence Intensity.

Considering the important roles of immune and inflammatory responses in the development of HT and PTC, we analyzed the differences in immune cell infiltration patterns between PTC, HT, and normal samples using the CIBERSORT algorithm. Using the GSE35570 dataset, we identified 12 immune subgroups that differed significantly between PTC and normal samples (Supplementary Fig. S5a). Additionally, analysis of the GSE29315 dataset revealed 5 immune subgroups that were significantly different between HT and normal samples (Supplementary Fig. S5b). Among these, 4 common immune subpopulations were significantly higher in both PTC and HT samples compared to normal samples: CD8 T cells, resting memory CD4 T cells, M1 macrophages, and resting mast cells. Additionally, we conducted Spearman correlation analysis between the hub genes and immune cells (Supplementary Fig. S5c,d). The results suggested that immune responses could potentially contribute to the involvement of the hub genes in PTC and HT progression. IF staining was used to identify immune cell infiltration in 5 cases of PTC in HT tissues and 5 cases of NAT (Fig. 6). The expression levels of the CD4+ T-cell marker Cd4, the CD8+ T-cell marker Cd8, and the macrophage marker Cd86 were significantly higher in the PTC in HT group than in the NAT group. The IF staining results provided some degree of verification of the immune infiltration analysis results.

Microscopy scan of IF staining showing the distribution of Cd4 (green), Cd8 (green), and Cd86 (green) in HT-related PTC tissues and normal tissues adjacent to the tumour (NAT). MFI: Mean Fluorescence Intensity.
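
A short sketch of the hub-gene versus immune-cell correlation analysis described above; the CIBERSORT fractions themselves would come from the CIBERSORT tool, and the frame layouts below are assumptions.

# Hypothetical sketch: Spearman correlation between hub-gene expression and CIBERSORT cell fractions.
import pandas as pd
from scipy.stats import spearmanr

def gene_immune_correlations(expr: pd.DataFrame, fractions: pd.DataFrame) -> pd.DataFrame:
    """expr: samples x hub genes; fractions: samples x immune cell types (CIBERSORT output)."""
    rows = []
    for gene in expr.columns:
        for cell in fractions.columns:
            rho, p = spearmanr(expr[gene], fractions[cell])
            rows.append({"gene": gene, "cell_type": cell, "rho": rho, "p_value": p})
    return pd.DataFrame(rows)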

Based on the three core genes screened by the RF algorithm, we searched the DGIdb database for potentially relevant drugs. The results showed that only FCER1G had relevant drugs, while no relevant drugs were found for CD53 or TYROBP. FCER1G was predicted to have two potential drugs: benzylpenicilloyl polylysine and aspirin. Of these, benzylpenicilloyl polylysine had the highest score of 29.49, while aspirin had a score of only 1.26. We hypothesise that benzylpenicilloyl polylysine and aspirin may be effective in the treatment of HT and PTC and may prevent HT carcinogenesis.

See original here:
Identification and validation of potential common biomarkers for papillary thyroid carcinoma and Hashimoto's thyroiditis ... - Nature.com

Looking to break into AI? These 6 schools offer master’s in artificial intelligence programs – Fortune

While buzz about artificial intelligence (AI) has largely focused on the growing popularity of generative AI tools such as ChatGPT, demand for jobs and growth in the sector is booming. In fact, AI and machine learning specialist roles are growing faster than any other occupation in the world, according to the World Economic Forum's Future of Jobs Report.

Ryan Aytay, CEO of Tableau, says AI and big data's rapid growth in popularity has created a need for everyone to learn the appropriate skills, as well as to more broadly adopt a philosophy of lifelong learning.

"[AI] only seems to have accelerated this need for everyone, not just business users, not just analysts, really everyone, to have the ability to not only see and understand but also use that data to make decisions with, regardless of what they need to be focused on," Aytay says.

Over the past few months, more universities have sought to meet the AI demand head on by creating degree programs specifically focused on the subject. For example, just in March 2024, Purdue University, a school known for its strong engineering arm, announced a brand-new online master's in AI.

If AI from a business perspective interests you, you're in luck, too. Many business schools now offer MBA specializations in AI as well as certifications focused on the subject.

And while there are also options to take free online courses in artificial intelligence, many schools now offer full-fledged degree tracks. Fortune compiled a list of six master's in AI programs to check out if you're looking to make a career switch.

At Duke University, students in the artificial intelligence for product innovation master of engineering program can complete courses in person in 12 to 16 months or online within 24 months. Students can also choose from a variety of learning tracks, or focus areas, including data science and machine learning.

The program also includes a capstone project and summer internship. Graduates often take jobs as machine learning engineers, AI engineers, data scientists, and data engineers for companies including OpenAI, DoorDash, and Target's AI Lab within six months of graduation. All students must complete an online data science and Python bootcamp the summer before the start of their program.

Students complete 10 courses during the program, covering topics including AI, machine learning, operations, and management. The management courses are offered through Duke's Law School and Fuqua School of Business, which Fortune ranks as having one of the top full-time MBA programs in the U.S.

Applicants are expected to have an undergraduate degree in science or engineering (or equivalent technical work experience), at least one year of programming experience, and two completed semesters of calculus, and to meet English proficiency admission requirements (for international students).

The cost of Duke's program depends on the modality (online or in-person) and the amount of time taken to complete the degree. Applications require transcripts, short-answer essay responses, a resume, three letters of recommendation, and an introductory video. Prospective students have the option to submit GRE scores.

Format: Online or in-person

Cost: $99,734 (online); $113,892 (in-person)

Deadlines: Round 1: January 15 (online and in-person); Round 2: March 15 (in-person), April 15 (online)

Johns Hopkins University offers both a master's degree and a graduate certificate in artificial intelligence through its Whiting School of Engineering. The online master's in AI includes 10 courses (four core courses and six electives), and students can take up to five years to complete them.

The curriculum includes algorithms, applied machine learning, and creating AI-enabled systems. Johns Hopkins does require several prerequisite courses, including calculus, programming, and linear algebra, but will offer provisional admission for students to complete the required courses prior to enrollment.

GRE scores aren't required to apply, but most admitted students have at least a 3.0 undergraduate GPA.

Format: Online

Cost: $52,700 (estimated total program price)

Deadlines: Open year-round (terms begin in spring, summer, and fall)

Northwestern University's master's in artificial intelligence seeks to train those with a desire to become architects of intelligent systems. Through the program, students learn the psychological and design implications of AI and how business needs may be satisfied.

Students can take a traditional track or choose the MSAI+X program and combine AI with their original field of study. The program is limited to approximately 40 students per year and lasts for 15 months.

Applicants should have a bachelor's in computer science or a related field, and preference will be given to those with at least two years of relevant work experience.

Format: In-person

Cost: ~$110,000

Deadlines: December 15 (priority); March 15 (final)

Purdue's new master's in artificial intelligence seeks to prepare students to succeed in today's increasingly tech-reliant world. Students will learn practical skills in AI and computing as well as professional skills like leadership and project management and technical skills like programming and machine learning.

Participants can choose between two major tracks: AI and machine learning, or AI management and policy. Admissions requirements differ depending on which major is chosen. There is no application fee. While English proficiency testing is required for international students, GRE and GMAT scores are not needed.

Format: Online

Cost: ~$28,000

Deadlines: August 1 (fall); December 1 (spring); April 1 (summer)

The master's in AI at the University of Michigan–Dearborn teaches students the foundational theory and practice of AI. The program is very flexible in the sense that students can choose to learn online, in person, or in a hybrid format, and can study either full- or part-time. Because of the latter offering, courses are held in the late afternoon or evening hours.

Students can focus on four different concentrations: computer vision, intelligent interaction, machine learning, or knowledge management and reasoning. Admission into the program requires students to have graduated with a bachelor's degree in a STEM field with a B average. Mathematics coursework, such as calculus III and linear algebra, is recommended but not required.

Format: Online, in-person, or hybrid

Cost: $50,208/year (direct + indirect costs, out-of-state)

Deadlines: Rolling admission

UT Austin offers its online master's program in AI through its department of computer science and machine learning laboratory, and the degree can be completed at your own pace. The degree covers about two years' worth of content. The program is offered through the online education platform edX and costs $10,000 to complete, making it one of the more affordable options.

The degree covers AI-related topics, including natural language processing, reinforcement learning, computer vision, and deep learning, which prepares graduates for AI jobs in engineering, research and development, product management, and consulting.

The program quickly skyrocketed in popularity, with more than 4,000 prospective students requesting more information from the university within 24 hours of its launch announcement.

Prospective students must submit an application to the Graduate School at The University of Texas at Austin as well as a statement of purpose, resume, and transcripts. Letters of recommendation and GRE scores are optional to submit.

Format: Online

Cost: $10,000 (2023–24 academic year)

Deadlines: Fall: April 1 (priority), May 1 (final); Spring: August 15 (priority), September 15 (final)

Yes, having a master's in AI can be very beneficial for those wanting to become AI experts. However, it is also important to keep in mind that AI is always evolving. By the time you complete your program, some of the skills and best practices you initially learned could be out of date.

Yes, you will need to learn how to code if you plan to study AI in an advanced degree program. Python is generally considered to be the most relevant programming language for AI. Having skills in Java, SQL, C++, and R also couldn't hurt. Some master's in AI programs, like Duke's, require students to have some programming experience as well as to enroll in a Python bootcamp.

The best degree pathway for those interested in AI truly depends on their interests.

A master's in AI will likely give you a solid entry point into careers in AI, data science, machine learning, and beyond. If you know that a particular specialization in the tech space interests you more than another, that is a great place to start. Above all, keep in mind that because AI master's programs are new, there is no perfect path; it's up to you to define it.

Check out all of Fortune's rankings of degree programs, and learn more about specific career paths.

Sydney Lake contributed to this piece.

Original post:
Looking to break into AI? These 6 schools offer master's in artificial intelligence programs - Fortune