Archive for the ‘Machine Learning’ Category

The POWER Interview: The Importance of AI and Machine Learning – POWER magazine

Artificial intelligence (AI) and machine learning (ML) are becoming integral to the operation of power generation facilities. The increasing digitization of power plants, from equipment to software, spans both thermal generation and renewable energy installations.

Both AI and ML will be key elements in the design of future energy systems, supporting the growth of smart grids and improving the efficiency of power generation, as well as the interaction between electricity customers and utilities.

The technology group Wärtsilä is a global leader in using data to improve operations in the power generation sector. The company helps generators make better asset management decisions, which supports predictive maintenance. It combines AI with advanced diagnostics and deep equipment expertise to greatly enhance the safety, reliability, and efficiency of power equipment and systems.

Luke Witmer, general manager, Data Science, Energy Storage & Optimization at Wärtsilä, talked with POWER about the importance of AI and ML to the future of power generation and electricity markets.

POWER: How can artificial intelligence (AI) be used in power trading, and with regard to forecasts and other issues?

Witmer: Artificial intelligence is a very wide field. Even a simple if/else statement is technically AI (a computer making a decision). Forecasts for price and power are generated by AI (some algorithm with some historic data set), and represent the expected trajectory or probability distribution of that value.
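Taken literally, that definition fits in a few lines of code. Here is a minimal, hypothetical sketch (the function name and threshold are invented for illustration, not taken from Wärtsilä's software) of a one-rule "AI" making a dispatch decision:

```python
# A minimal sketch: even a hard-coded rule is, strictly speaking,
# a computer making a decision.
def dispatch_decision(price_eur_mwh: float, threshold: float = 60.0) -> str:
    """A one-rule 'AI': discharge storage when the price clears a threshold."""
    return "discharge" if price_eur_mwh > threshold else "hold"

print(dispatch_decision(85.0))  # -> discharge
```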

Power trading is also a wide field. There are many different markets that span different time periods and different electricity (power) services that power plants provide. It's more than just buying low and selling high, though that is a large piece of it. Forecasts are generally not very good at predicting exactly when electricity price spikes will happen. There is always a tradeoff between saving some power capacity for the biggest price spikes versus allocating more of your power at marginal prices. In the end, as a power trader, it is important to remember that the historical data is not a picture of the future, but rather a statistical distribution that can be leveraged to inform the most probable outcome of the unknown future. AI is more capable of leveraging statistics than people will ever be.
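One way to make that tradeoff concrete is to treat historical prices as a distribution and vary the dispatch threshold. The sketch below uses synthetic prices and invented numbers, purely for illustration: a higher quantile reserves capacity for rarer spikes, while a lower one dispatches far more hours at marginal prices.

```python
import numpy as np

# Synthetic stand-in for a historical hourly price series (heavy right tail).
rng = np.random.default_rng(0)
prices = rng.lognormal(mean=3.5, sigma=0.6, size=8760)

# A high quantile saves capacity for the biggest spikes; a low one
# allocates more power at marginal prices.
for q in (0.50, 0.90, 0.99):
    threshold = np.quantile(prices, q)
    dispatched = prices[prices > threshold]
    print(f"q={q:.2f}  threshold={threshold:6.1f}  "
          f"hours={dispatched.size:5d}  revenue={dispatched.sum():9.0f}")
```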

POWER: Machine learning and AI in power generation rely on digitalization. As the use of data becomes more important, what steps need to be taken to support AI and machine learning while still accounting for cybersecurity?

Witmer: A lot of steps. Sorry for the lame-duck answer here. Regular white-hat penetration testing by ethical hackers is probably the best first step. The second step should be to diligently and quickly address each critical issue that is discovered through that process. This can be done by partnering with technology providers who have the right solution (cybersecurity practices, certifications, and technology) to enable the data flow that is required.

POWER: How can the power generation industry benefit from machine learning?

Witmer: The benefit is higher utilization of the existing infrastructure. There is a lot of under-utilized infrastructure in the power generation industry. This can be accomplished with greater intelligence on the edges of the network (out at each substation and at each independent generation facility) coupled with greater intelligence at the points of central dispatch.

POWER: Can machines used in power generation learn from their experiences; would an example be that a machine could perform more efficiently over time based on past experience?

Witmer: Yes and no. It depends on what you mean by machines. A machine itself is simply pieces of metal. An analogy would be that your air conditioner at home can't learn anything, but your smart thermostat can. Your air conditioner just needs to operate as efficiently as possible when it's told to operate, constrained by physics. Power generation equipment is the same. The controls, however, whether at some point of aggregation, at a transmission intersection, or at a central dispatch center, can certainly apply machine learning to operate differently as time goes on, adapting in real time to changing trends and conditions in the electricity grids and markets of the world.

POWER: What are some of the uses of artificial intelligence in the power industry?

Witmer: As mentioned in the response to question 1, I think it appropriate to point you at some definitions and descriptions of AI. I find Wikipedia to be the best organized and moderated by experts.

In the end, it's a question of intelligent control. There are many uses of AI in the power industry, and any list will be incomplete, but to give some idea: we use AI in the form of rules that automatically ramp power plants up and down by speeding up or slowing down their speed governors; in the form of neural networks that perform load forecasting based on historic data and present-state data (time of day, metering values, etc.); in the form of economic dispatch systems that leverage those forecasts; and in the form of reinforcement learning for statistically based automated bid generation in open markets. Our electricity grids, combined with their associated controls and markets, are arguably the most complex machines that humans have built.
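As a hedged illustration of the load-forecasting use case (a toy model on synthetic data, not Wärtsilä's production system), a small neural network can be fit on time-of-day and lagged metering values:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic hourly load with a daily cycle plus noise.
rng = np.random.default_rng(1)
hour = np.arange(24 * 365) % 24
load = 500 + 200 * np.sin((hour - 6) / 24 * 2 * np.pi) + rng.normal(0, 20, hour.size)

X = np.column_stack([hour[1:], load[:-1]])   # features: time of day, lagged load
y = load[1:]

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
model.fit(X[:-24], y[:-24])                  # hold out the last day
print("next-hour forecasts:", model.predict(X[-24:])[:3].round(1))
```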

POWER: How can AI benefit centralized generation, and can it provide cost savings for power customers?

Witmer: Centralized power systems continue to benefit from significant economies of scale. Centralized power systems enable equal access to clean power at the lowest cost, reducing economic inequality. I view large renewable power plants that are owned by independent power producers as centralized power generation, dispatched by centralized grid operators. Regardless of whether the path forward is more or less centralized, AI brings value to all parties. Not only does it maximize revenue for any specific asset (and thus for the asset owner), it also reduces overall electricity prices for all consumers.

POWER: How important is AI to smart grids? How important is AI to the integration of e-mobility (electric vehicles, etc.) to the grid?

Witmer: AI is very important to smart grids. AI is extremely important to the integration of smart charging of electric vehicles, and to leveraging those mobile batteries for grid services when they are plugged into the grid (vehicle-to-grid, or V2G). However, the more important piece is for the right market forces to be created (economics), so that people can realize the value (actually get paid) for allowing their vehicles to participate in these kinds of services.

The mobile batteries of EVs will be under-utilized if we do not integrate the controls for charging and discharging this equipment in a way that both gives consumers the ability to opt in or out of any service and lets centralized dispatch leverage the equipment. It's less a question of AI and more a question of economics and human behavioral science. Once the economics are right and the right tools are in place, AI will be able to forecast the availability of the variable infrastructure of plugged-in EVs, and the utility the grid can subsequently extract from it.

POWER: How important is AI to the design and construction of virtual power plants?

Witmer: Interesting question. On one level, this is a question that raises an existential threat to aspects of my own job (but that's a good thing, because if a computer can do it, I don't want to do it!). It's a bit of a chicken-and-egg scenario. Today, any power plant (virtual or actual) is designed through a process that involves a lot of modeling, or simulations of what-if scenarios. That model must be as accurate as possible, including the controls behavior of not only the new plant in question, but also the rest of the grid and/or markets nearby.

As more AI is used in the actual context of this new potential power plant, the model must also contain a reflection of that same AI. No model is perfect, but as more AI gets used in the actual dispatch of power plants, more AI will be needed in the design and creation process for new power plants or aggregations of power generation equipment.
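A toy what-if study illustrates why the design model has to include the controller's behavior. The numbers and policies below are entirely synthetic and invented for illustration: the same simulated storage asset scores very differently depending on the controls it is modeled with.

```python
import numpy as np

rng = np.random.default_rng(2)
prices = rng.lognormal(3.5, 0.6, 8760)   # one modelled hourly price scenario
budget = 500                              # storage throughput: 500 MWh/year

def revenue(threshold: float) -> float:
    """Dispatch 1 MW whenever price clears the threshold, until the budget runs out."""
    hours = np.flatnonzero(prices > threshold)[:budget]
    return prices[hours].sum()

# Same hardware, different controls, very different simulated outcomes.
for name, thr in [("naive fixed threshold", 30.0),
                  ("threshold tuned to the top 500 hours",
                   float(np.quantile(prices, 1 - budget / prices.size)))]:
    print(f"{name}: {revenue(thr):,.0f}")
```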

POWER: What do you see as the future of AI and machine learning for power generation / utilities?

Witmer: The short-term future is simply an extension of what we see today. As more renewables come onto the grids, we will see more negative price events and more price volatility. AI will be able to thrive in that environment. I suspect that as time goes on, the existing market structures will cease to be the most efficient for society. In fact, AI is likely going to be able to take advantage of some of those legacy features (think Enron).

Hopefully the independent system operators of the world can adapt quickly enough to the changing conditions, but I remain skeptical of that in all scenarios. With growing renewables that have free fuel, the model of vertically integrated utilities with an integrated resource planning (IRP) process will likely yield the most economically efficient structure. I think that we will see growing inefficiencies in regions that have too many manufactured rules and structure imposed by legacy markets, designed around marginal costs of operating fossil fuel-burning plants.

Darrell Proctor is associate editor for POWER (@POWERmagazine).

Read more from the original source:
The POWER Interview: The Importance of AI and Machine Learning - POWER magazine

BioSig and Mayo Clinic Collaborate on New R&D Program to Develop Transformative AI and Machine Learning Technologies for its PURE EP System – BioSpace

Westport, CT, Feb. 02, 2021 (GLOBE NEWSWIRE) --

BioSig Technologies, Inc. (NASDAQ: BSGM) ("BioSig" or the "Company"), a medical technology company commercializing an innovative signal processing platform designed to improve signal fidelity and uncover the full range of ECG and intra-cardiac signals, today announced a strategic collaboration with the Mayo Foundation for Medical Education and Research to develop next-generation AI- and machine learning-powered software for its PURE EP system.

The new collaboration will include an R&D program that will expand the clinical value of the Company's proprietary hardware and software with advanced signal processing capabilities and aim to develop novel technological solutions by combining the electrophysiological signals delivered by the PURE EP and other data sources. The development program will be conducted under the leadership of Samuel J. Asirvatham, M.D., Mayo Clinic's Vice-Chair of Innovation and Medical Director, Electrophysiology Laboratory, and Alexander D. Wissner-Gross, Ph.D., Managing Director of Reified LLC.

The global market for AI in healthcare is expected to grow from $4.9 billion in 2020 to $45.2 billion by 2026, at an estimated compound annual growth rate (CAGR) of 44.9%.[1] According to Accenture, key clinical health AI applications, when combined, can potentially create $150 billion in annual savings for the United States healthcare economy by 2026.[2]
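As a quick arithmetic check (assuming simple annual compounding over the six years from 2020 to 2026), the cited CAGR and dollar figures are mutually consistent:

```python
# 44.9% compounded for six years roughly reproduces the $45.2B forecast.
start_bn, years = 4.9, 6
print(round(start_bn * 1.449 ** years, 1))        # ~45.4, close to 45.2
print(round((45.2 / 4.9) ** (1 / years) - 1, 3))  # implied CAGR ~0.448
```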

"AI-powered algorithms that are developed on superior data from multiple biomarkers could drastically improve the way we deliver therapies, and therefore may help address the rising global demand for healthcare," commented Kenneth L. Londoner, Chairman and CEO of BioSig Technologies, Inc. "We believe that combining the clinical science of Mayo Clinic with the best-in-class domain expertise of Dr. Wissner-Gross and the technical leadership of our engineering team will enable us to develop powerful applications and help pave the way toward improved patient outcomes in cardiology and beyond."

"Artificial intelligence presents a variety of novel opportunities for extracting clinically actionable information from existing electrophysiological signals that might otherwise be inaccessible. We are excited to contribute to the advancement of this field," said Dr. Wissner-Gross.

BioSig announced its partnership with Reified LLC, a provider of advanced artificial intelligence-focused technical advisory services to the private sector, in late 2019. The new research program builds upon the progress achieved by this collaboration in 2020, which included an abstract, "Computational Reconstruction of Electrocardiogram Lead Placement," presented during the 2020 Computing in Cardiology Conference in Rimini, Italy, and the development of an initial suite of electrophysiological analytics for the PURE EP System.

BioSig signed a 10-year collaboration agreement with Mayo Clinic in March 2017. In November 2019, the Company announced that it signed three new patent and know-how license agreements with the Mayo Foundation for Medical Education and Research.

About BioSig Technologies

BioSig Technologies is a medical technology company commercializing a proprietary biomedical signal processing platform designed to improve signal fidelity and uncover the full range of ECG and intra-cardiac signals (www.biosig.com).

The Company's first product, PURE EP System, is a computerized system intended for acquiring, digitizing, amplifying, filtering, measuring and calculating, displaying, recording, and storing electrocardiographic and intracardiac signals for patients undergoing electrophysiology (EP) procedures in an EP laboratory.

Forward-looking Statements

This press release contains forward-looking statements. Such statements may be preceded by the words "intends," "may," "will," "plans," "expects," "anticipates," "projects," "predicts," "estimates," "aims," "believes," "hopes," "potential," or similar words. Forward-looking statements are not guarantees of future performance, are based on certain assumptions, and are subject to various known and unknown risks and uncertainties, many of which are beyond the Company's control and cannot be predicted or quantified; consequently, actual results may differ materially from those expressed or implied by such forward-looking statements. Such risks and uncertainties include, without limitation, risks and uncertainties associated with (i) the geographic, social and economic impact of COVID-19 on our ability to conduct our business and raise capital in the future when needed; (ii) our inability to manufacture our products and product candidates on a commercial scale on our own, or in collaboration with third parties; (iii) difficulties in obtaining financing on commercially reasonable terms; (iv) changes in the size and nature of our competition; (v) loss of one or more key executives or scientists; and (vi) difficulties in securing regulatory approval to market our products and product candidates. More detailed information about the Company and the risk factors that may affect the realization of forward-looking statements is set forth in the Company's filings with the Securities and Exchange Commission (SEC), including the Company's Annual Report on Form 10-K and its Quarterly Reports on Form 10-Q. Investors and security holders are urged to read these documents free of charge on the SEC's website at http://www.sec.gov. The Company assumes no obligation to publicly update or revise its forward-looking statements as a result of new information, future events or otherwise.

[1] Artificial Intelligence in Healthcare Market with COVID-19 Impact Analysis by Offering, Technology, End-Use Application, End User and Region – Global Forecast to 2026; Markets and Markets

[2] Artificial Intelligence (AI): Healthcare's New Nervous System, https://www.accenture.com/us-en/insight-artificial-intelligence-healthcare

Read this article:
BioSig and Mayo Clinic Collaborate on New R&D Program to Develop Transformative AI and Machine Learning Technologies for its PURE EP System - BioSpace

Five trends in machine learning-enhanced analytics to watch in 2021 – Information Age

AI usage is growing rapidly. What does 2021 hold for the world of analytics, and how will AI drive it?

The growth of AI-powered operations looks set to continue this year.

As the world prepares to recover from the Covid-19 pandemic, businesses will need to increasingly rely on analytics to deal with new consumer behaviour.

According to Gartner analyst Rita Sallam, "In the face of unprecedented market shifts, data and analytics leaders require an ever-increasing velocity and scale of analysis in terms of processing and access to accelerate innovation and forge new paths to a post-Covid-19 world."

Machine learning and artificial intelligence are finding increasingly significant use cases in data analytics for business. Here are five trends to watch out for in 2021.

Gartner predicts that by 2024, 75% of enterprises will shift towards putting AI and ML into operation. A big reason for this is the way the pandemic has changed consumer behaviour. Regression learning models that rely on historical data might not be valid anymore. In their place, reinforcement and distributed learning models will find more use, thanks to their adaptability.

A large share of businesses have already democratised their data through the use of embedded analytics dashboards. The use of AI to generate augmented analytics to drive business decisions will increase as businesses seek to react faster to shifting conditions. Powering data democratisation efforts with AI will help non-technical users make a greater number of business decisions, without having to rely on IT support to query data.

Vendors such as Sisense already offer companies the ability to integrate powerful analytics into custom applications. As AI algorithms become smarter, it's a given that they'll help companies use low-latency alerts so managers can react to quantifiable anomalies that indicate changes in their business. AI is also expected to play a major role in delivering dynamic data stories, and may reduce a user's role in data exploration.
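As one hedged sketch of what such a low-latency alert might look like under the hood (a rolling z-score on a KPI stream; commercial products like Sisense's are far more sophisticated):

```python
from collections import deque
import statistics

window = deque(maxlen=48)  # e.g. the last 48 hourly readings of a KPI

def check(value: float, z_threshold: float = 3.0) -> bool:
    """Fire an alert when a reading deviates sharply from recent history."""
    alert = False
    if len(window) >= 10:
        mean = statistics.fmean(window)
        spread = statistics.pstdev(window)
        alert = spread > 0 and abs(value - mean) / spread > z_threshold
    window.append(value)
    return alert

for reading in [100, 101, 99, 102, 98, 100, 101, 99, 100, 102, 101, 180]:
    if check(reading):
        print("anomaly alert on", reading)  # fires on 180
```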

A fact that's often forgotten in AI conversations is that these technologies are still nascent. Many of the major developments have been driven by open source efforts, but 2021 will see an increasing number of companies commercialise AI through product releases.

This shift will be a true marker of AI going mainstream. While open source has been highly beneficial to AI, scaling these projects for commercial purposes has been difficult. With companies investing more in AI research, expect a greater proliferation of AI technology in project management, data reusability, and transparency products.

Using AI for better data management is a particular focus of big companies right now. A Pathfinder report in 2018 found that a lack of skilled resources in data management was hampering AI development. However, with ML growing increasingly sophisticated, companies are beginning to use AI to manage data, which fuels even faster AI development.

As a result, metadata management becomes streamlined, and architectures become simpler. Moving forward, expect an increasing number of AI-driven solutions to be released commercially instead of on open source platforms.

Vendors such as Informatica are already using AI and ML algorithms to help develop better enterprise data management solutions for their clients. Everything from data extraction to enrichment is optimised by AI, according to the company.

Voice search and voice data are increasing by the day. With products such as Amazon's Alexa and Google's Assistant finding their way into smartphones, and with growing adoption of smart speakers in our homes, the use of natural language processing will increase.

Companies will wake up to the immense benefits of voice analytics and will provide their customers with voice tools. The benefits of enhanced NLP include better social listening, sentiment analysis, and increased personalisation.

AX Semantics, for example, provides self-service natural language generation software that lets customers automate their own text production; Porsche, Deloitte, and Nivea are among its customers.

As augmented analytics make their way into embedded dashboards, low-level data analysis tasks will be automated. An area that is ripe for automation is data collection and synthesis. Currently, data scientists spend large amounts of time cleaning and collecting data. Automating these tasks by specifying standardised protocols will help companies employ their talent in tasks better suited to their abilities.
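A minimal example of such a standardised protocol (hypothetical column names; pandas used purely for illustration) might be a single cleaning function that every inbound dataset passes through before analysis:

```python
import pandas as pd

def standard_clean(df: pd.DataFrame) -> pd.DataFrame:
    """Baseline cleaning rules applied identically to every inbound dataset."""
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(how="all")            # drop fully empty rows
    return df.fillna({"amount": 0.0})    # column-specific defaults

raw = pd.DataFrame({"Customer ID": [1, 1, 2], "Amount": [10.0, 10.0, None]})
print(standard_clean(raw))
```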

A side effect of data analysis automation will be faster analytics and reporting. As a result, we can expect businesses to make decisions faster and to install infrastructure that allows them to respond to changing conditions quickly.

As the worlds of data and analytics come closer together, vendors who offer end-to-end stacks will provide better value to their customers. Combine this with increased data democratisation and it's easy to see why legacy enterprise software vendors such as SAP offer everything from data management to analytics to storage solutions to their clients.

IoT devices are making their way into not just B2C products but also B2B, enterprise, and public projects, from smart cities to Industry 4.0.

Data is being generated at unprecedented rates, and to make sense of it, companies are increasingly turning to AI. With so much signal to sift through, AI has become a key aid in arriving at insights.

While the rise of embedded and augmented analytics has already been discussed, it's worth pointing out that the sources of data are more varied than ever before. This makes the use of AI essential, since manual processes cannot handle such large volumes efficiently.

As AI technology continues to make giant strides, the business world is gearing up to take full advantage of it. We've reached a stage where AI is powering further AI development, and the rate of progress will only increase.

See more here:
Five trends in machine learning-enhanced analytics to watch in 2021 - Information Age

When Are We Going to Start Designing AI With Purpose? Machine Learning Times – The Predictive Analytics Times

Originally published in UX Collective, Jan 19, 2021.

For an industry that prides itself on moving fast, the tech community has been remarkably slow to adapt to the differences of designing with AI. Machine learning is an intrinsically fuzzy science, yet when it inevitably returns unpredictable results, we tend to react like it's a puzzle to be solved; believing that with enough algorithmic brilliance, we can eventually fit all the pieces into place and render something approaching objective truth. But objectivity and truth are often far afield from the true promise of AI, as we'll soon discuss.

I think a lot of the confusion stems from language; in particular, the way we talk about machine-like efficiency. Machines are expected to make precise measurements about whatever they're pointed at; to produce data.

But machine learning doesn't produce data. Machine learning produces predictions about how observations in the present overlap with patterns from the past. In this way, it's literally an inversion of the classic if-this-then-that logic that's driven conventional software development for so long. My colleague Rick Barraza has a great way of describing the distinction:

To continue reading this article, click here.
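The excerpt ends before Barraza's description, but the inversion itself is easy to sketch (an illustrative toy, not taken from the article): conventional code encodes an authored rule, while a learned model infers a statistical pattern from past observations and returns a prediction, not a datum.

```python
from sklearn.linear_model import LogisticRegression

# If-this-then-that: the rule is authored by a human.
def spam_rule(subject: str) -> bool:
    return "free money" in subject.lower()

# Machine learning: the "rule" is fit to history and outputs a probability.
X = [[0], [1], [0], [1], [1], [0]]   # feature: suspicious phrase present?
y = [0, 1, 0, 1, 1, 0]               # label: was the message spam?
model = LogisticRegression().fit(X, y)

print(spam_rule("FREE MONEY inside"))     # True, by decree
print(model.predict_proba([[1]])[0, 1])   # high probability, by inference
```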

Here is the original post:
When Are We Going to Start Designing AI With Purpose? Machine Learning Times - The Predictive Analytics Times

Learn in-demand technical skills in Python, machine learning, and more with this academy – The Next Web

Credit: Clément Hélardot/Unsplash

TLDR: With access to the Zenva Academy, users can take over 250 tech courses packed with real-world programming training to become knowledgeable, hirable professional coders.

The tech industry is expected to grow by as many as 13 million new jobs in the U.S. alone over the next five years, with another 20 million likely to spring up in the EU.

And you can rest assured that coding will be at the heart of almost every single one of those new positions.

It's no surprise that programming courses are being taught to our youngest students these days. From web development to gaming to data science, all the tech innovations we'll see over the next five years and beyond will come from innovators who understand how to make those static lines of code get together and dance.

If you feel behind the programming curve or just want a stockpile of tech training to have you ready for anything, the Zenva Academy ($139.99 for a one-year subscription) may be just the bootcamp you need to grab one of those new jobs.

This access unlocks everything in the Zenva Academy's vast archives, a collection of more than 250 courses that dive into every aspect of learning to build games, websites, apps, and more.

With courses taught by knowledgeable industry professionals, even newbies coming in with zero experience receive world-class training on in-demand programming skills on their way to becoming professionals themselves. Classes are based entirely around your own schedule with no deadlines or due dates so you can work at your own pace on bolstering your abilities.

Whether a student is interested in crafting mobile apps, mastering data science, or exploring machine learning and AI, these courses don't just tell you how to interact with these disciplines, they actually show you. Zenva coursework is based around creating real projects in tandem with the learning.

As you build a VR or AR app, or craft your first artificial neural networks using Python and TensorFlow, or create an awesome game, you'll be building work for a professional portfolio that can help you land one of these prime coding positions. And with Zenva's ties to elite developer programs from outlets like Intel, Microsoft, and CompTIA, students can get on the fast track toward getting hired.
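For flavour, the canonical "first neural network" that TensorFlow beginner courses tend to build looks something like this (a generic sketch, not actual Zenva course material):

```python
import numpy as np
import tensorflow as tf

# Learn y = 2x - 1 from six example points.
x = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 * x - 1

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mse")
model.fit(x, y, epochs=200, verbose=0)

print(model.predict(np.array([[10.0]])))  # should be close to 19
```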

Regularly $169 for a year of Zenva Academy access, you can get it for only $139.99 for a limited time.

Prices are subject to change.

See more here:
Learn in-demand technical skills in Python, machine learning, and more with this academy - The Next Web