Archive for the ‘Machine Learning’ Category

Can Machine Learning be the Best Remedy in the Education Sector? – Analytics Insight

Classrooms in the present era are not only expanding their use of technologies and digital tools; they are also embracing machine learning

Technology in the classroom is becoming more and more popular as we move through the 21st century. Laptops are replacing our textbooks, and on our smartphones we can study just about anything we want. Social media has become ubiquitous, and technology has fundamentally changed the way we live our lives.

Technology has become the core component of distance education programs. It enables teachers and students to interconnect digitally and exchange material and student work while retaining a human link, which is important for the growth of young minds. Enhanced connections and customized experiences can allow educators to recognize opportunities for learning skills and enhance a student's potential.

Hence, classrooms in the present era are not only expanding their use of technologies and digital tools; they are also embracing machine learning.

Machine learning is an element of artificial intelligence (AI) that lets machines or computers learn from previous data and make smart decisions. A machine learning architecture involves gathering and storing a rich collection of information and turning it into a standardized knowledge base for various uses in different fields. In education, machine learning could save educators time in their non-classroom duties.
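
To make that definition concrete, here is a minimal sketch, not tied to any particular education product, of a model learning from past data and then making a decision. The study-hours and pass/fail figures are invented purely for illustration:

```python
# Minimal sketch: a model "learns" from previous data, then makes a decision.
# The study-hours / pass-fail data below is hypothetical.
from sklearn.linear_model import LogisticRegression

hours_studied = [[1], [2], [3], [5], [6], [8]]   # past observations
passed        = [0,   0,   0,   1,   1,   1]     # past outcomes

model = LogisticRegression()
model.fit(hours_studied, passed)                 # learn from previous knowledge

print(model.predict([[4]]))                      # decision for an unseen case
```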

For instance, teachers may use virtual assistants that work with their students directly from home. This form of assistance helps to improve students' learning environment and can promote growth and educational success.

According to ODSC, last year's report by MarketWatch revealed that machine learning in education will remain one of the top industries to drive investment, with the U.S. and China becoming the top key players by 2030. Major companies, like Google and IBM, are getting involved in making school education more progressive and innovative.

Analyzing all-round material

By making content more up-to-date and applicable to a specific request, the use of machine learning in education aims to bring the online learning sector to a new stage. How? ML technologies evaluate the content of online courses and help assess whether the quality of the knowledge presented meets the applicable criteria. They also track how users interpret the data and whether they understand what is being explained. Users then receive content matched to their particular preferences and expertise, and the overall learning experience improves dramatically.

Customized Learning

This is the greatest application of machine learning. It is adaptable and takes care of individual needs. Students are able to guide their own learning through this education system. They can set their own pace and decide what to study and how to learn it. They can select the topics they are interested in, the instructor they want to learn from, and the program they want to pursue.

Effective Grading

Another application of machine learning in education deals with grades and scoring. Since each online course reflects the learning skills of a large number of students, grading them becomes a challenge. ML technology reduces grading to a matter of seconds, particularly in the exact sciences. There are places where teachers cannot be replaced by computers, but even in such situations, machine learning can enhance current approaches to grading and evaluation.
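
As a rough illustration of how grading in the exact sciences can become a matter of seconds, here is a hedged sketch; the answer key, tolerance, and submission are all hypothetical:

```python
# Sketch of automated grading for exact-science answers: compare each
# submitted numeric answer against the key within a tolerance.
# Answer key and submission values are invented for illustration.
answer_key = {"q1": 3.14159, "q2": 42.0, "q3": 9.81}

def grade(submission: dict, tol: float = 1e-2) -> float:
    """Return the fraction of answers within `tol` of the key."""
    correct = sum(
        1 for q, right in answer_key.items()
        if abs(submission.get(q, float("inf")) - right) <= tol
    )
    return correct / len(answer_key)

print(grade({"q1": 3.14, "q2": 41.999, "q3": 9.0}))  # 0.666...
```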

According to TechXplore, researchers at the University of Tübingen and the Leibniz-Institut für Wissensmedien in Germany, as well as the University of Colorado Boulder, have recently investigated the potential of machine-learning techniques for assessing student engagement in the context of classroom research. More specifically, they devised a deep-neural-network-based architecture that can estimate student engagement by analyzing video footage collected in classroom environments.

They also described the approach: "We used camera data collected during lessons to teach a deep-neural-network-based model to predict student engagement levels," Enkelejda Kasneci, the leading HCI researcher in the multidisciplinary team that carried out the study, told TechXplore. "We trained our model on ground-truth data (e.g., expert ratings of students' level of engagement based on the videos recorded in the classroom). After this training, the model was able to predict, for instance, whether data obtained from a particular student at a particular point in time indicates high or low levels of engagement."
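
The researchers' exact architecture is not described here, but the general shape of such a system can be sketched. The toy PyTorch model below maps a single video frame to a high/low engagement logit and runs one training step against stand-in "expert ratings"; every shape, label, and layer choice is an assumption for illustration only:

```python
# Toy sketch (not the authors' architecture): a small convolutional net
# scoring a video frame as low/high engagement, trained on stand-in
# expert ratings. All shapes and labels are invented.
import torch
import torch.nn as nn

class EngagementNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2)   # logits: low vs. high engagement

    def forward(self, frames):         # frames: (batch, 3, H, W)
        x = self.features(frames).flatten(1)
        return self.head(x)

model = EngagementNet()
frames = torch.randn(4, 3, 64, 64)    # stand-in for classroom footage
labels = torch.tensor([0, 1, 1, 0])   # stand-in expert ratings
loss = nn.CrossEntropyLoss()(model(frames), labels)
loss.backward()                       # gradients for one training step
```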

Read more:
Can Machine Learning be the Best Remedy in the Education Sector? - Analytics Insight

Microchip Accelerates Machine Learning and Hyperscale Computing Infrastructure with the World’s First PCI Express 5.0 Switches – EE Journal

Switchtec PFX PCIe Gen 5 high performance switches double the data rate of PCIe Gen 4.0 solutions while delivering ultra-low latency and advanced diagnostics

CHANDLER, Ariz., Feb. 02, 2021 (GLOBE NEWSWIRE) -- Applications such as data analytics, autonomous driving and medical diagnostics are driving extraordinary demands for machine learning and hyperscale compute infrastructure. To meet these demands, Microchip Technology Inc. (Nasdaq: MCHP) today announced the world's first PCI Express (PCIe) 5.0 switch solutions, the Switchtec PFX PCIe 5.0 family, doubling the interconnect performance for dense compute, high speed networking and NVM Express (NVMe) storage. Together with the XpressConnect retimers, Microchip is the industry's only supplier of both PCIe Gen 5 switches and PCIe Gen 5 retimer products, delivering a complete portfolio of PCIe Gen 5 infrastructure solutions with proven interoperability.

Accelerators, graphics processing units (GPUs), central processing units (CPUs) and high-speed network adapters continue to drive the need for higher performance PCIe infrastructure. "Microchip's introduction of the world's first PCIe 5.0 switch doubles the PCIe Gen 4 interconnect link rates to 32 GT/s to support the most demanding next-generation machine learning platforms," said Andrew Dieckmann, associate vice president of marketing and applications engineering for Microchip's data center solutions business unit. "Coupled with our XpressConnect family of PCIe 5.0 and Compute Express Link (CXL) 1.1/2.0 retimers, Microchip offers the industry's broadest portfolio of PCIe Gen 5 infrastructure solutions with the lowest latency and end-to-end interoperability."
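
For context, the per-lane arithmetic behind that doubling is easy to check. PCIe 4.0 signals at 16 GT/s and PCIe 5.0 at 32 GT/s, both using 128b/130b encoding, so a back-of-envelope calculation of usable bandwidth looks like this:

```python
# Back-of-envelope PCIe lane bandwidth: raw transfer rate times the
# 128b/130b encoding efficiency, per direction. Rates per the PCIe spec.
def lane_gbps(gt_per_s: float) -> float:
    return gt_per_s * (128 / 130)  # usable Gb/s per lane

for gen, rate in [("Gen 4", 16.0), ("Gen 5", 32.0)]:
    per_lane = lane_gbps(rate)
    print(f"{gen}: {per_lane:.2f} Gb/s/lane, x16 ~ {per_lane * 16 / 8:.0f} GB/s per direction")
# Gen 4: 15.75 Gb/s/lane, x16 ~ 32 GB/s per direction
# Gen 5: 31.51 Gb/s/lane, x16 ~ 63 GB/s per direction
```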

The Switchtec PFX PCIe 5.0 switch family comprises high density, high reliability switches supporting 28 to 100 lanes and up to 48 non-transparent bridges (NTBs). The Switchtec technology devices support high reliability capabilities, including hot- and surprise-plug as well as secure boot authentication. With PCIe 5.0 data rates of 32 GT/s, signal integrity and complex system topologies pose significant development and debug challenges. To accelerate time-to-market, the Switchtec PFX PCIe 5.0 switch provides a comprehensive suite of debug and diagnostic features, including sophisticated internal PCIe analyzers supporting Transaction Layer Packet (TLP) generation and analysis, and on-chip non-obtrusive SerDes eye capture capabilities. Rapid system bring-up and debug is further supported with ChipLink, an intuitive graphical user interface (GUI) based device configuration and topology viewer that provides full access to the PFX PCIe switch's registers, counters, diagnostics and forensic capture capabilities.

"Intel's upcoming Sapphire Rapids Xeon processors will implement PCI Express 5.0 and Compute Express Link running up to 32.0 GT/s to deliver the low-latency and high-bandwidth I/O solutions our customers need to deploy," said Dr. Debendra Das Sharma, Intel fellow and director of I/O technology and standards. "We are pleased to see Microchip's PCIe 5.0 switch and retimer investment strengthen the ecosystem and drive broader deployment of PCIe 5.0 and CXL enabled solutions."

Development Tools

Microchip has released a full set of design-in collateral, reference designs, evaluation boards and tools to support customers building systems that take advantage of the high bandwidth of PCIe 5.0.

In addition to PCIe technology, Microchip also provides data center infrastructure builders worldwide with total system solutions including RAID over NVMe, storage, memory, timing and synchronization systems, stand-alone secure boot, secure firmware and authentication, wireless products, touch-enabled displays to configure and monitor data center equipment and predictive fan controls.

Availability

The Switchtec PFX PCIe 5.0 family of switches is sampling now to qualified customers. For additional information, contact a Microchip sales representative.


About Microchip Technology

Microchip Technology Inc. is a leading provider of smart, connected and secure embedded control solutions. Its easy-to-use development tools and comprehensive product portfolio enable customers to create optimal designs which reduce risk while lowering total system cost and time to market. The company's solutions serve more than 120,000 customers across the industrial, automotive, consumer, aerospace and defense, communications and computing markets. Headquartered in Chandler, Arizona, Microchip offers outstanding technical support along with dependable delivery and quality. For more information, visit the Microchip website at www.microchip.com.


More:
Microchip Accelerates Machine Learning and Hyperscale Computing Infrastructure with the World's First PCI Express 5.0 Switches - EE Journal

The POWER Interview: The Importance of AI and Machine Learning – POWER magazine

Artificial intelligence (AI) and machine learning (ML) are becoming integral to the operation of power generation facilities. The increased digitization of power plants, from equipment to software, involves both thermal generation and renewable energy installations.

Both AI and ML will be key elements for the design of future energy systems, supporting the growth of smart grids and improving the efficiency of power generation, along with the interaction among electricity customers and utilities.

The technology group Wärtsilä is a global leader in using data to improve operations in the power generation sector. The company helps generators make better asset management decisions, which supports predictive maintenance. The company uses AI, advanced diagnostics, and its deep equipment expertise to greatly enhance the safety, reliability, and efficiency of power equipment and systems.

Luke Witmer, general manager, Data Science, Energy Storage & Optimization at Wärtsilä, talked with POWER about the importance of AI and ML to the future of power generation and electricity markets.

POWER: How can artificial intelligence (AI) be used in power trading, and with regard to forecasts and other issues?

Witmer: Artificial intelligence is a very wide field. Even a simple if/else statement is technically AI (a computer making a decision). Forecasts for price and power are generated by AI (some algorithm with some historic data set), and represent the expected trajectory or probability distribution of that value.

Power trading is also a wide field. There are many different markets that span different time periods and different electricity (power) services that power plants provide. It's more than just buying low and selling high, though that is a large piece of it. Forecasts are generally not very good at predicting exactly when electricity price spikes will happen. There is always a tradeoff between saving some power capacity for the biggest price spikes and allocating more of your power at marginal prices. In the end, as a power trader, it is important to remember that historical data is not a picture of the future, but rather a statistical distribution that can be leveraged to inform the most probable outcome of the unknown future. AI is more capable of leveraging statistics than people will ever be.
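
Witmer's tradeoff can be illustrated with a toy simulation. The price distribution below is invented (mostly $30/MWh hours with rare $400/MWh spikes) and simply compares the revenue of always selling one megawatt against holding capacity back for spikes:

```python
# Toy sketch of the reserve-vs-sell tradeoff, with invented numbers:
# sample a spiky hourly price distribution, then compare two strategies.
import random

random.seed(0)
# Hypothetical hourly prices ($/MWh): ~95% marginal hours, ~5% spikes.
prices = [random.choice([30] * 95 + [400] * 5) for _ in range(8760)]

always_sell = sum(prices)                          # sell 1 MW every hour
spike_only  = sum(p for p in prices if p >= 100)   # hold capacity for spikes

print(f"always sell: ${always_sell:,}, spikes only: ${spike_only:,}")
```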

POWER: Machine learning and AI in power generation rely on digitalization. As the use of data becomes more important, what steps need to be taken to support AI and machine learning while still accounting for cybersecurity?

Witmer: A lot of steps. Sorry for the lame duck answer here. Regular whitehat penetration testing by ethical hackers is probably the best first step. The second step should be to diligently and quickly address each critical issue that is discovered through that process. This can be done by partnering with technology providers who have the right solution (cyber security practices, certifications, and technology) to enable the data flow that is required.

POWER: How can the power generation industry benefit from machine learning?

Witmer: The benefit is higher utilization of the existing infrastructure. There is a lot of under-utilized infrastructure in the power generation industry. This can be accomplished with greater intelligence at the edges of the network (out at each substation and at each independent generation facility) coupled with greater intelligence at the points of central dispatch.

POWER: Can machines used in power generation learn from their experiences; would an example be that a machine could perform more efficiently over time based on past experience?

Witmer: Yes and no. It depends on what you mean by machines. A machine itself is simply pieces of metal. An analogy would be that your air conditioner at home can't learn anything, but your smart thermostat can. Your air conditioner just needs to operate as efficiently as possible when it's told to operate, constrained by physics. Power generation equipment is the same. The controls, however, whether at some point of aggregation, or a transmission intersection, or at a central dispatch center, can certainly apply machine learning to operate differently as time goes on, adapting in real time to changing trends and conditions in the electricity grids and markets of the world.

POWER: What are some of the uses of artificial intelligence in the power industry?

Witmer: As mentioned in the response to question 1, I think it appropriate to point you at some definitions and descriptions of AI. I find Wikipedia to be the best organized and moderated by experts.

In the end, it's a question of intelligent control. There are many uses of AI in the power industry. Any list of them will be incomplete, but, to give some idea, I would say that we use AI in the form of rules that automatically ramp power plants up and down by speeding up or slowing down their speed governors, in the form of neural networks that perform load forecasting based on historic data and present state data (time of day, metering values, etc.), in the form of economic dispatch systems that leverage these forecasts, and in the form of reinforcement learning for statistically based automated bid generation in open markets. Our electricity grids, combined with their associated controls and markets, are arguably the most complex machines that humans have built.
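
As a sketch of the load-forecasting use Witmer mentions, the snippet below trains a small neural network on time-of-day and meter-reading features. The sinusoidal load profile, network size, and feature choice are assumptions chosen purely for illustration:

```python
# Sketch of neural-network load forecasting: map present state
# (hour of day, current meter reading) to next-hour load.
# The synthetic data stands in for real metering history.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = rng.integers(0, 24, size=1000)
prev_load = 50 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, 1000)
next_load = 50 + 30 * np.sin(2 * np.pi * (hours + 1) / 24) + rng.normal(0, 2, 1000)

X = np.column_stack([hours, prev_load])   # present state features
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, next_load)

print(model.predict([[18, 65.0]]))        # forecast for 6 p.m., 65 MW now
```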

POWER: How can AI benefit centralized generation, and can it provide cost savings for power customers?

Witmer: Centralized power systems continue to thrive from significant economies of scale. Centralized power systems enable equal access to clean power at the lowest cost, reducing economic inequality. I view large renewable power plants that are owned by independent power producers as centralized power generation, dispatched by centralized grid operators. Regardless of whether the path forward is more or less centralized, AI brings value to all parties. Not only does it maximize revenue for any specific asset (thus the asset owner), it also reduces overall electricity prices for all consumers.

POWER: How important is AI to smart grids? How important is AI to the integration of e-mobility (electric vehicles, etc.) to the grid?

Witmer: AI is very important to smart grids. AI is extremely important to the integration of smart charging of electric vehicles, and to leveraging those mobile batteries for grid services when they are plugged into the grid (vehicle-to-grid, or V2G). However, the more important piece is for the right market forces to be created (economics), so that people can realize the value (actually get paid) for allowing their vehicles to participate in these kinds of services.

The mobile batteries of EVs will be under-utilized if we do not integrate the controls for charging and discharging this equipment in a way that both gives consumers the ability to opt in or out of any service and lets centralized dispatch leverage this equipment as well. It's less a question of AI, and more a question of economics and human behavioral science. Once the economics are leveraged and the right tools are in place, then AI will be able to forecast the availability and subsequent utility that the grid will be able to extract from the variable infrastructure of plugged-in EVs.

POWER: How important is AI to the design and construction of virtual power plants?

Witmer: Interesting question. On one level, this is a question that raises an existential threat to aspects of my own job (but that's a good thing, because if a computer can do it, I don't want to do it!). It's a bit of a chicken-and-egg scenario. Today, any power plant (virtual or actual) is designed through a process that involves a lot of modeling, or simulations of what-if scenarios. That model must be as accurate as possible, including the controls behavior of not only the new plant in question, but also the rest of the grid and/or markets nearby.

As more AI is used in the actual context of this new potential power plant, the model must also contain a reflection of that same AI. No model is perfect, but as more AI gets used in the actual dispatch of power plants, more AI will be needed in the design and creation process for new power plants or aggregations of power generation equipment.

POWER: What do you see as the future of AI and machine learning for power generation / utilities?

Witmer: The short-term future is simply an extension of what we see today. As more renewables come onto the grids, we will see more negative price events and more price volatility. AI will be able to thrive in that environment. I suspect that as time goes on, the existing market structures will cease to be the most efficient for society. In fact, AI is likely going to be able to take advantage of some of those legacy features (think Enron).

Hopefully the independent system operators of the world can adapt quickly enough to the changing conditions, but I remain skeptical of that in all scenarios. With growing renewables that have free fuel, the model of vertically integrated utilities with an integrated resource planning (IRP) process will likely yield the most economically efficient structure. I think that we will see growing inefficiencies in regions that have too many manufactured rules and structure imposed by legacy markets, designed around marginal costs of operating fossil fuel-burning plants.

Darrell Proctor is associate editor for POWER (@POWERmagazine).

Read more from the original source:
The POWER Interview: The Importance of AI and Machine Learning - POWER magazine

BioSig and Mayo Clinic Collaborate on New R&D Program to Develop Transformative AI and Machine Learning Technologies for its PURE EP System – BioSpace

Westport, CT, Feb. 02, 2021 (GLOBE NEWSWIRE) --

BioSig Technologies, Inc. (NASDAQ: BSGM) ("BioSig" or the "Company"), a medical technology company commercializing an innovative signal processing platform designed to improve signal fidelity and uncover the full range of ECG and intra-cardiac signals, today announced a strategic collaboration with the Mayo Foundation for Medical Education and Research to develop a next-generation AI- and machine learning-powered software for its PURE EP system.

The new collaboration will include an R&D program that will expand the clinical value of the Company's proprietary hardware and software with advanced signal processing capabilities and aim to develop novel technological solutions by combining the electrophysiological signals delivered by the PURE EP and other data sources. The development program will be conducted under the leadership of Samuel J. Asirvatham, M.D., Mayo Clinic's Vice-Chair of Innovation and Medical Director, Electrophysiology Laboratory, and Alexander D. Wissner-Gross, Ph.D., Managing Director of Reified LLC.

The global market for AI in healthcare is expected to grow from $4.9 billion in 2020 to $45.2 billion by 2026 at an estimated compound annual growth rate (CAGR) of 44.9%.[1] According to Accenture, key clinical health AI applications, when combined, can potentially create $150 billion in annual savings for the United States healthcare economy by 2026.[2]
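
Those figures are internally consistent; a quick check of the implied compound annual growth rate:

```python
# Sanity check of the cited growth figures: $4.9B (2020) to $45.2B (2026).
start, end, years = 4.9, 45.2, 6
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~44.8%, matching the cited ~44.9% CAGR
```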

"AI-powered algorithms that are developed on superior data from multiple biomarkers could drastically improve the way we deliver therapies, and therefore may help address the rising global demand for healthcare," commented Kenneth L. Londoner, Chairman and CEO of BioSig Technologies, Inc. "We believe that combining the clinical science of Mayo Clinic with the best-in-class domain expertise of Dr. Wissner-Gross and the technical leadership of our engineering team will enable us to develop powerful applications and help pave the way toward improved patient outcomes in cardiology and beyond."

"Artificial intelligence presents a variety of novel opportunities for extracting clinically actionable information from existing electrophysiological signals that might otherwise be inaccessible. We are excited to contribute to the advancement of this field," said Dr. Wissner-Gross.

BioSig announced its partnership with Reified LLC, a provider of advanced artificial intelligence-focused technical advisory services to the private sector, in late 2019. The new research program builds upon the progress achieved by this collaboration in 2020, which included an abstract, "Computational Reconstruction of Electrocardiogram Lead Placement," presented during the 2020 Computing in Cardiology Conference in Rimini, Italy, and the development of an initial suite of electrophysiological analytics for the PURE EP System.

BioSig signed a 10-year collaboration agreement with Mayo Clinic in March 2017. In November 2019, the Company announced that it signed three new patent and know-how license agreements with the Mayo Foundation for Medical Education and Research.

About BioSig Technologies

BioSig Technologies is a medical technology company commercializing a proprietary biomedical signal processing platform designed to improve signal fidelity and uncover the full range of ECG and intra-cardiac signals (www.biosig.com).

The Company's first product, the PURE EP System, is a computerized system intended for acquiring, digitizing, amplifying, filtering, measuring and calculating, displaying, recording and storing of electrocardiographic and intracardiac signals for patients undergoing electrophysiology (EP) procedures in an EP laboratory.

Forward-looking Statements

This press release contains forward-looking statements. Such statements may be preceded by the words "intends," "may," "will," "plans," "expects," "anticipates," "projects," "predicts," "estimates," "aims," "believes," "hopes," "potential" or similar words. Forward-looking statements are not guarantees of future performance, are based on certain assumptions and are subject to various known and unknown risks and uncertainties, many of which are beyond the Company's control, and cannot be predicted or quantified and consequently, actual results may differ materially from those expressed or implied by such forward-looking statements. Such risks and uncertainties include, without limitation, risks and uncertainties associated with (i) the geographic, social and economic impact of COVID-19 on our ability to conduct our business and raise capital in the future when needed, (ii) our inability to manufacture our products and product candidates on a commercial scale on our own, or in collaboration with third parties; (iii) difficulties in obtaining financing on commercially reasonable terms; (iv) changes in the size and nature of our competition; (v) loss of one or more key executives or scientists; and (vi) difficulties in securing regulatory approval to market our products and product candidates. More detailed information about the Company and the risk factors that may affect the realization of forward-looking statements is set forth in the Company's filings with the Securities and Exchange Commission (SEC), including the Company's Annual Report on Form 10-K and its Quarterly Reports on Form 10-Q. Investors and security holders are urged to read these documents free of charge on the SEC's website at http://www.sec.gov. The Company assumes no obligation to publicly update or revise its forward-looking statements as a result of new information, future events or otherwise.

[1] Artificial Intelligence in Healthcare Market with COVID-19 Impact Analysis by Offering, Technology, End-Use Application, End User and Region - Global Forecast to 2026; Markets and Markets

[2] Artificial Intelligence (AI): Healthcare's New Nervous System, https://www.accenture.com/us-en/insight-artificial-intelligence-healthcare

Read this article:
BioSig and Mayo Clinic Collaborate on New R&D Program to Develop Transformative AI and Machine Learning Technologies for its PURE EP System - BioSpace

Five trends in machine learning-enhanced analytics to watch in 2021 – Information Age

AI usage is growing rapidly. What does 2021 hold for the world of analytics, and how will AI drive it?

The adoption of AI-powered operations looks set to grow this year.

As the world prepares to recover from the Covid-19 pandemic, businesses will need to increasingly rely on analytics to deal with new consumer behaviour.

According to Gartner analyst Rita Sallam, "In the face of unprecedented market shifts, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to accelerate innovation and forge new paths to a post-Covid-19 world."

Machine learning and artificial intelligence are finding increasingly significant use cases in data analytics for business. Here are five trends to watch out for in 2021.

Gartner predicts that by 2024, 75% of enterprises will shift towards putting AI and ML into operation. A big reason for this is the way the pandemic has changed consumer behaviour. Regression learning models that rely on historical data might not be valid anymore. In their place, reinforcement and distributed learning models will find more use, thanks to their adaptability.

A large share of businesses have already democratised their data through the use of embedded analytics dashboards. The use of AI to generate augmented analytics to drive business decisions will increase as businesses seek to react faster to shifting conditions. Powering data democratisation efforts with AI will help non-technical users make a greater number of business decisions, without having to rely on IT support to query data.

Companies such as Sisense already offer the ability to integrate powerful analytics into custom applications. As AI algorithms become smarter, it's a given that they'll help companies use low-latency alerts to help managers react to quantifiable anomalies that indicate changes in their business. Also, AI is expected to play a major role in delivering dynamic data stories and might reduce a user's role in data exploration.

A fact that's often forgotten in AI conversations is that these technologies are still nascent. Many of the major developments have been driven by open source efforts, but 2021 will see an increasing number of companies commercialise AI through product releases.

This event will truly be a marker of AI going mainstream. While open source has been highly beneficial to AI, scaling these projects for commercial purposes has been difficult. With companies investing more in AI research, expect a greater proliferation of AI technology in project management, data reusability, and transparency products.

Using AI for better data management is a particular focus of big companies right now. A Pathfinder report in 2018 found that a lack of skilled resources in data management was hampering AI development. However, with ML growing increasingly sophisticated, companies are beginning to use AI to manage data, which fuels even faster AI development.

As a result, metadata management becomes streamlined, and architectures become simpler. Moving forward, expect an increasing number of AI-driven solutions to be released commercially instead of on open source platforms.

Vendors such as Informatica are already using AI and ML algorithms to help develop better enterprise data management solutions for their clients. Everything from data extraction to enrichment is optimised by AI, according to the company.


Voice search and voice data are increasing by the day. With products such as Amazon's Alexa and Google's Assistant finding their way into smartphones, and growing adoption of smart speakers in our homes, natural language processing will increase.

Companies will wake up to the immense benefits of voice analytics and will provide their customers with voice tools. The benefits of enhanced NLP include better social listening, sentiment analysis, and increased personalisation.

Companies such as AX Semantics provide self-service natural language generation software that allows customers to self-automate text commands. Companies such as Porsche, Deloitte and Nivea are among their customers.

As augmented analytics make their way into embedded dashboards, low-level data analysis tasks will be automated. An area that is ripe for automation is data collection and synthesis. Currently, data scientists spend large amounts of time cleaning and collecting data. Automating these tasks by specifying standardised protocols will help companies employ their talent in tasks better suited to their abilities.
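
As a sketch of the kind of low-level task such standardised protocols could automate, the snippet below runs a tiny, hypothetical cleaning pipeline (deduplication, column normalisation, gap filling) over invented data:

```python
# Sketch of an automated cleaning step: drop duplicates, normalise
# column names, fill numeric gaps. The tiny frame is invented.
import pandas as pd

raw = pd.DataFrame({
    "Customer ID": [1, 1, 2, 3],
    "Spend ($)":   [100.0, 100.0, None, 250.0],
})

def clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates()
    df.columns = [c.lower().replace(" ", "_").strip("()$_") for c in df.columns]
    return df.fillna(df.mean(numeric_only=True))

print(clean(raw))   # 3 rows, columns customer_id / spend, gap filled with mean
```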

A side effect of data analysis automation will be the speeding up of analytics and reporting. As a result, we can expect businesses to make decisions faster along with installing infrastructure that allows them to respond and react to changing conditions quickly.

As the worlds of data and analytics come closer together, vendors who provide end-to-end stacks will provide better value to their customers. Combine this with increased data democratisation and it's easy to see why legacy enterprise software vendors such as SAP offer everything from data management to analytics to storage solutions to their clients.


IoT devices are making their way into not just B2C products but B2B, enterprise and public projects as well, from smart cities to Industry 4.0.

Data is being generated at unprecedented rates, and to make sense of it, companies are increasingly turning to AI. With so much signal to sift through, AI is a key aid in arriving at insights.

While the rise of embedded and augmented analytics has already been discussed, it's critical to point out that the sources of data are more varied than ever before. This makes the use of AI critical, since manual processes cannot process such large volumes efficiently.

As AI technology continues to make giant strides, the business world is gearing up to take full advantage of it. We've reached a stage where AI is powering further AI development, and the rate of progress will only increase.

See more here:
Five trends in machine learning-enhanced analytics to watch in 2021 - Information Age