Archive for the ‘Machine Learning’ Category

There is a direct correlation between AI adoption and superior business outcomes – Help Net Security

Adoption of artificial intelligence (AI) is growing worldwide, according to an IDC survey of more than 2,000 IT and line of business (LoB) decision makers.

Over a quarter of all AI initiatives are already in production and more than one third are in advanced development stages. And organizations are reporting an increase in their AI spending this year.

Delivering a better customer experience was identified as the leading driver for AI adoption by more than half the large companies surveyed. At the same time, a similar number of respondents indicated that AI's greatest impact is in helping employees get better at their jobs.

Whether it is an improved customer experience or better employee experience, there is a direct correlation between AI adoption and superior business outcomes.

Early adopters report an improvement of almost 25 percent in customer experience, accelerated rates of innovation, higher competitiveness, higher margins, and a better employee experience with the rollout of AI solutions.

"Organizations worldwide are adopting AI in their business transformation journey, not just because they can but because they must to be agile, resilient, innovative, and able to scale," said Ritu Jyoti, program vice president, Artificial Intelligence Strategies.

While there is considerable agreement on the benefits of AI, there is some divergence in how companies deploy AI solutions. IT automation, intelligent task/process automation, automated threat analysis and investigation, supply and logistics, automated customer service agents, and automated human resources are the top use cases where AI is being currently employed.

While automated customer service agents and automated human resources are a priority for larger companies (5,000+ employees), IT automation is the priority for small and medium-sized companies (fewer than 1,000 employees).

Despite the benefits, deploying AI continues to present challenges, particularly with regard to data. A lack of adequate volume and quality of training data remains a significant development challenge. Data security, governance, performance, and latency are the top data integration challenges.

Solution price, performance, and scale are the top data management issues, and enterprises report the cost of the solution to be the number one challenge for implementing AI. As enterprises scale up their efforts, fragmented pricing across different services and pay-as-you-go pricing may present barriers to AI adoption.

"An AI-ready data architecture, MLOps, and trustworthy AI are critical for realizing AI and machine learning at scale," added Jyoti.

Read more from the original source:
There is a direct correlation between AI adoption and superior business outcomes - Help Net Security

Machine Learning Definition – Investopedia

What Is Machine Learning?

Machine learning is the concept that a computer program can learn and adapt to new data without human intervention. Machine learning is a field of artificial intelligence (AI) that keeps a computer's built-in algorithms current regardless of changes in the worldwide economy.

Various sectors of the economy are dealing with huge amounts of data available in different formats from disparate sources. The enormous amount of data, known as big data, is becoming easily available and accessible due to the progressive use of technology. Companies and governments realize the huge insights that can be gained from tapping into big data but lack the resources and time required to comb through its wealth of information. As such, artificial intelligence measures are being employed by different industries to gather, process, communicate, and share useful information from data sets. One method of AI that is increasingly utilized for big data processing is machine learning.

The various data applications of machine learning are formed through a complex algorithm or source code built into the machine or computer. This programming code creates a model that identifies the data and builds predictions around the data it identifies. The model uses parameters built into the algorithm to form patterns for its decision-making process. When new or additional data becomes available, the algorithm automatically adjusts the parameters to check for a pattern change, if any. However, the model itself shouldn't change.
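To make the idea concrete, here is a minimal sketch, assuming scikit-learn and synthetic data (neither is named in the article): an online regressor whose parameters adjust as new data arrives, while the model itself, a linear regressor, stays the same.

```python
# Sketch of parameters adjusting to new data while the model is fixed.
# SGDRegressor and the synthetic data are assumptions for illustration.
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)

# Initial training data: a simple linear relationship with noise.
X = rng.uniform(0, 10, size=(200, 1))
y = 3.0 * X.ravel() + rng.normal(0, 1, size=200)

model = SGDRegressor(random_state=0)
model.partial_fit(X, y)          # initial fit

# New data arrives: the algorithm adjusts its parameters,
# but the model itself (a linear regressor) does not change.
X_new = rng.uniform(0, 10, size=(50, 1))
y_new = 3.5 * X_new.ravel() + rng.normal(0, 1, size=50)
model.partial_fit(X_new, y_new)  # incremental update

print(model.coef_, model.intercept_)
```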

Machine learning is used in different sectors for various reasons. Trading systems can be calibrated to identify new investment opportunities. Marketing and e-commerce platforms can be tuned to provide accurate and personalized recommendations to their users based on the users' internet search history or previous transactions. Lending institutions can incorporate machine learning to predict bad loans and build a credit risk model. Information hubs can use machine learning to cover huge amounts of news stories from all corners of the world. Banks can create fraud detection tools from machine learning techniques. The potential applications of machine learning in the digital era are endless as businesses and governments become more aware of the opportunities that big data presents.

How machine learning works can be better explained by an illustration from the financial world. Traditionally, investment players in the securities market, such as financial researchers, analysts, asset managers, and individual investors, scour through a lot of information from different companies around the world to make profitable investment decisions. However, some pertinent information may not be widely publicized by the media and may be privy to only a select few who have the advantage of being employees of the company or residents of the country where the information stems from. In addition, there's only so much information humans can collect and process within a given time frame. This is where machine learning comes in.

An asset management firm may employ machine learning in its investment analysis and research area. Say the asset manager only invests in mining stocks. The model built into the system scans the web and collects all types of news events from businesses, industries, cities, and countries, and this information gathered makes up the data set. The asset managers and researchers of the firm would not have been able to gather the information in the data set on their own. The parameters built alongside the model extract only data about mining companies, regulatory policies on the exploration sector, and political events in select countries from the data set. Say a mining company XYZ just discovered a diamond mine in a small town in South Africa; the machine learning app would highlight this as relevant data. The model could then use an analytics tool called predictive analytics to make predictions on whether the mining industry will be profitable for a time period, or which mining stocks are likely to increase in value at a certain time. This information is relayed to the asset manager to analyze and make a decision for their portfolio. The asset manager may then decide to invest millions of dollars in XYZ stock.
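The filtering step described above might look something like the following sketch. Everything here, the keywords, the feed, and the helper function, is a hypothetical illustration, not a real trading system.

```python
# Hypothetical sketch: keep only news items relevant to mining,
# then flag items that describe notable events.
MINING_KEYWORDS = {"mine", "mining", "diamond", "exploration", "ore"}
EVENT_KEYWORDS = {"discovered", "strike", "regulation", "policy"}

def is_relevant(item: str) -> bool:
    """True if a news item mentions the mining sector."""
    words = set(item.lower().split())
    return bool(words & MINING_KEYWORDS)

news_feed = [
    "Mining company XYZ discovered a diamond mine in South Africa",
    "Central bank leaves interest rates unchanged",
    "South African miners go on strike over pay",
]

dataset = [item for item in news_feed if is_relevant(item)]
for item in dataset:
    flagged = bool(set(item.lower().split()) & EVENT_KEYWORDS)
    print(f"{'EVENT' if flagged else 'info '} | {item}")
```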

In the wake of an unfavorable event, such as South African miners going on strike, the computer algorithm adjusts its parameters automatically to create a new pattern. This way, the computational model built into the machine stays current even with changes in world events and without needing a human to tweak its code to reflect the changes. Because the asset manager received this new data in a timely manner, they are able to limit their losses by exiting the stock.

View original post here:
Machine Learning Definition - Investopedia

Machine Learning Tutorial for Beginners – Guru99

What is Machine Learning?

Machine learning is a system that can learn from examples through self-improvement, without being explicitly coded by a programmer. The breakthrough comes with the idea that a machine can learn from the data (i.e., examples) on its own to produce accurate results.

Machine learning combines data with statistical tools to predict an output, which businesses can then turn into actionable insights. Machine learning is closely related to data mining and Bayesian predictive modeling. The machine receives data as input and uses an algorithm to formulate answers.

A typical machine learning task is to provide a recommendation. For those who have a Netflix account, all recommendations of movies or series are based on the user's historical data. Tech companies are using unsupervised learning to improve the user experience with personalized recommendations.

Machine learning is also used for a variety of tasks such as fraud detection, predictive maintenance, portfolio optimization, and task automation.


Traditional programming differs significantly from machine learning. In traditional programming, a programmer codes all the rules in consultation with an expert in the industry for which the software is being developed. Each rule is based on a logical foundation; the machine executes an output following the logical statement. As the system grows more complex, more rules need to be written, and it can quickly become unsustainable to maintain.

Machine learning is meant to overcome this issue. The machine learns how the input and output data are correlated and writes a rule itself. The programmers do not need to write new rules each time there is new data. The algorithms adapt in response to new data and experience to improve their efficacy over time.
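A toy contrast between the two approaches, under the assumption of a hand-written pricing rule and a simple least-squares fit (both invented for illustration):

```python
# Traditional programming vs. machine learning, side by side.
import numpy as np

# Traditional programming: a human writes the rule explicitly.
def shipping_cost_rule(weight_kg: float) -> float:
    if weight_kg < 1:
        return 5.0
    elif weight_kg < 5:
        return 8.0
    return 8.0 + 1.5 * (weight_kg - 5)

# Machine learning: the rule is inferred from input/output examples.
weights = np.array([0.5, 2.0, 4.0, 6.0, 8.0, 10.0])
costs = np.array([shipping_cost_rule(w) for w in weights])

# Fit a line to the examples; the "rule" is now the learned coefficients.
slope, intercept = np.polyfit(weights, costs, deg=1)
print(f"learned rule: cost = {slope:.2f} * weight + {intercept:.2f}")

# New data does not require new hand-written rules, only refitting.
```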

Machine learning is the brain where all the learning takes place. The way the machine learns is similar to the way a human being learns. Humans learn from experience: the more we know, the more easily we can predict. By analogy, when we face an unknown situation, the likelihood of success is lower than in a known situation. Machines are trained the same way. To make an accurate prediction, the machine sees examples. When we give the machine a similar example, it can figure out the outcome. However, like a human, when it is fed a previously unseen example, the machine has difficulty predicting.

The core objectives of machine learning are learning and inference. First of all, the machine learns through the discovery of patterns. This discovery is made thanks to the data. One crucial task of the data scientist is to choose carefully which data to provide to the machine. The list of attributes used to solve a problem is called a feature vector. You can think of a feature vector as a subset of the data that is used to tackle a problem.

The machine uses sophisticated algorithms to simplify reality and transform this discovery into a model. The learning stage is therefore used to describe the data and summarize it into a model.

For instance, suppose the machine is trying to understand the relationship between an individual's wage and the likelihood of going to a high-end restaurant. If the machine finds a positive relationship between wage and dining at a high-end restaurant, that relationship is the model.
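A minimal sketch of such a model, assuming logistic regression on synthetic wage data (the tutorial names no algorithm or dataset):

```python
# Learn the wage -> high-end-restaurant relationship from examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic data: higher wages -> higher chance of high-end dining.
wages = rng.uniform(20_000, 150_000, size=500).reshape(-1, 1)
p_fancy = 1 / (1 + np.exp(-(wages.ravel() - 80_000) / 15_000))
goes_fancy = rng.random(500) < p_fancy

model = LogisticRegression().fit(wages, goes_fancy)

# The learned positive coefficient is "the model": the discovered
# relationship between wage and high-end restaurant visits.
print("coefficient:", model.coef_[0][0])
print("P(fancy | wage=120k):", model.predict_proba([[120_000.0]])[0][1])
```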

Once the model is built, it is possible to test how well it performs on never-before-seen data. The new data are transformed into a feature vector, passed through the model, and a prediction is returned. This is the beautiful part of machine learning: there is no need to update the rules or retrain the model. You can use the previously trained model to make inferences on new data.

The life of a machine learning program is straightforward: define a question, collect data, train the algorithm, test it, collect feedback, and refine the algorithm until the results are satisfactory.

Once the algorithm gets good at drawing the right conclusions, it applies that knowledge to new sets of data.

Machine learning can be grouped into two broad learning tasks, supervised and unsupervised, and within each there are many different algorithms.

An algorithm uses training data and feedback from humans to learn the relationship between given inputs and a given output. For instance, a practitioner can use marketing expenses and weather forecasts as input data to predict the sales of cans.

You can use supervised learning when the output data are known in advance. The algorithm then predicts outcomes for new data.

There are two categories of supervised learning: classification and regression.

Imagine you want to predict the gender of a customer for a commercial. You would start by gathering data on the height, weight, job, salary, purchasing basket, etc. from your customer database. You know the gender of each of your customers; it can only be male or female. The objective of the classifier is to assign a probability of being a male or a female (i.e., the label) based on the information (i.e., the features you have collected). Once the model has learned to recognize male or female, you can use new data to make a prediction. For instance, if you just got new information on an unknown customer and want to know whether the customer is male or female, and the classifier predicts male = 70%, then the algorithm is 70% sure that this customer is a male and 30% sure it is a female.

The label can have two or more classes. The above example has only two classes, but if a classifier needs to predict objects, there can be dozens of classes (e.g., glass, table, shoes; each object represents a class).
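A hedged sketch of such a classifier, using a random forest on synthetic customer features (the tutorial prescribes no algorithm, and all numbers here are invented for illustration):

```python
# Classification: predict a label and a per-class probability.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(7)
n = 1000

# Synthetic features loosely separated by class label.
labels = rng.integers(0, 2, size=n)             # 0 = female, 1 = male
height = rng.normal(165 + 12 * labels, 7)       # cm
weight = rng.normal(62 + 18 * labels, 9)        # kg
salary = rng.normal(50_000, 15_000, size=n)     # uninformative feature

X = np.column_stack([height, weight, salary])
clf = RandomForestClassifier(random_state=0).fit(X, labels)

# For a new, unknown customer the model returns a probability per class,
# e.g. [0.30, 0.70], meaning 70% sure the customer is male.
new_customer = [[180.0, 80.0, 55_000.0]]
print(clf.predict_proba(new_customer))
```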

When the output is a continuous value, the task is regression. For instance, a financial analyst may need to forecast the value of a stock based on a range of features such as equity, previous stock performance, and macroeconomic indexes. The system will be trained to estimate the price of the stocks with the lowest possible error.
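A minimal regression sketch in the same spirit, assuming synthetic data and a linear model (the tutorial names neither):

```python
# Regression: predict a continuous value with the lowest possible error.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400

# Invented features: equity, past performance, a macro index.
equity = rng.uniform(1, 100, n)
past_perf = rng.normal(0, 1, n)
macro_index = rng.normal(100, 10, n)
price = 0.5 * equity + 5 * past_perf + 0.2 * macro_index + rng.normal(0, 2, n)

X = np.column_stack([equity, past_perf, macro_index])
X_train, X_test, y_train, y_test = train_test_split(X, price, random_state=0)

reg = LinearRegression().fit(X_train, y_train)

# "Trained to estimate the price with the lowest possible error":
print("MAE on held-out data:", mean_absolute_error(y_test, reg.predict(X_test)))
```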

In unsupervised learning, an algorithm explores input data without being given an explicit output variable (e.g., it explores customer demographic data to identify patterns).

You can use it when you do not know how to classify the data and you want the algorithm to find patterns and classify the data for you.

Algorithm | Description | Type
K-means clustering | Puts data into a number of groups (k), each containing data with similar characteristics (as determined by the model, not in advance by humans) | Clustering
Gaussian mixture model | A generalization of k-means clustering that provides more flexibility in the size and shape of groups (clusters) | Clustering
Hierarchical clustering | Splits clusters along a hierarchical tree to form a classification system; can be used to cluster loyalty-card customers | Clustering
Recommender system | Helps to define the relevant data for making a recommendation | Clustering
PCA/T-SNE | Mostly used to decrease the dimensionality of the data; the algorithms reduce the number of features to three or four vectors with the highest variance | Dimension reduction
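As a concrete illustration of the first row of the table, here is a minimal k-means sketch on synthetic blob data (an assumption, since the tutorial shows no code); the model, not a human, decides which group each point belongs to.

```python
# K-means: group unlabeled data by similarity.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Unlabeled data drawn from three latent groups.
X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

print("cluster sizes:", [int((kmeans.labels_ == k).sum()) for k in range(3)])
print("cluster centers:\n", kmeans.cluster_centers_)
```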

There are plenty of machine learning algorithms. The choice of the algorithm is based on the objective.

In the example below, the task is to predict the type of flower among three varieties. The predictions are based on the length and the width of the petal. The picture depicts the results of ten different algorithms. The picture on the top left is the dataset. The data is classified into three categories: red, light blue, and dark blue. There are some groupings. For instance, in the second image, everything in the upper left belongs to the red category; in the middle part there is a mixture of uncertainty and light blue; and the bottom corresponds to the dark blue category. The other images show how the different algorithms try to classify the data.
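Since the original picture is not reproduced here, the following sketch recreates the spirit of the comparison: several classifiers predicting the flower variety from petal length and width. The iris dataset matches the description; the particular algorithms chosen are assumptions.

```python
# Compare several classifiers on the same flower-prediction task.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

iris = load_iris()
X = iris.data[:, 2:4]  # petal length and petal width only
y = iris.target        # three flower varieties

classifiers = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "k-nearest neighbors": KNeighborsClassifier(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "SVM": SVC(),
}

for name, clf in classifiers.items():
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: {score:.2f} accuracy")
```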

The primary challenge of machine learning is a lack of data or a lack of diversity in the dataset. A machine cannot learn if there is no data available, and a dataset without diversity gives the machine a hard time: a machine needs heterogeneity to learn meaningful insights, and it is rare that an algorithm can extract information when there are no or few variations. It is recommended to have at least 20 observations per group to help the machine learn. A shortage of data leads to poor evaluation and prediction.

Machine learning applications fall into two broad groups: augmentation, where the machine assists humans with their tasks, and automation, where the machine works on its own. Industries putting it to use include the finance industry, government organizations, the healthcare industry, and marketing.

Example of a Machine Learning Application in the Supply Chain

Machine learning gives terrific results for visual pattern recognition, opening up many potential applications in physical inspection and maintenance across the entire supply chain network.

Unsupervised learning can quickly search for comparable patterns in a diverse dataset. In turn, the machine can perform quality inspection throughout the logistics hub, flagging shipments with damage and wear.

For instance, IBM's Watson platform can determine shipping container damage. Watson combines visual and systems-based data to track, report and make recommendations in real-time.

In past years, stock managers relied extensively on traditional methods to evaluate and forecast inventory. Combining big data and machine learning has enabled better forecasting techniques (an improvement of 20 to 30% over traditional forecasting tools). In terms of sales, this means an increase of 2 to 3% due to the potential reduction in inventory costs.

Example of Machine Learning: the Google Car

For example, everybody knows the Google car. The car is covered with lasers on the roof, which tell it where it is relative to the surrounding area, and it has radar in the front, which informs the car of the speed and motion of the cars around it. It uses all of that data not only to figure out how to drive but also to predict what the drivers around it are going to do. What's impressive is that the car processes almost a gigabyte of data per second.

Machine learning is the best tool so far for analyzing, understanding, and identifying patterns in data. One of the main ideas behind machine learning is that the computer can be trained to automate tasks that would be exhaustive or impossible for a human being. The clear break from traditional analysis is that machine learning can make decisions with minimal human intervention.

Take the following example: a real estate agent can estimate the price of a house based on their own experience and their knowledge of the market.

A machine can be trained to translate the knowledge of an expert into features. The features are all the characteristics of a house, its neighborhood, the economic environment, etc., that make the price differ. For the expert, it probably took years to master the art of estimating the price of a house, and that expertise gets better and better after each sale.

For the machine, it takes millions of examples to master this art. At the very beginning of its learning, the machine makes mistakes, somewhat like a junior salesman. But once the machine has seen all the examples, it has gained enough knowledge to make its estimates with impressive accuracy, and it can adjust for its mistakes accordingly.
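A minimal sketch of this house-price example, with invented features and an invented data-generating process, using a random forest (the tutorial names no algorithm):

```python
# The expert's knowledge becomes features; the model learns from examples.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 2000

area_m2 = rng.uniform(40, 300, n)
rooms = rng.integers(1, 8, n)
neighborhood_score = rng.uniform(0, 10, n)   # proxy for location quality
price = 2_000 * area_m2 + 15_000 * rooms + 30_000 * neighborhood_score \
        + rng.normal(0, 20_000, n)

X = np.column_stack([area_m2, rooms, neighborhood_score])
X_train, X_test, y_train, y_test = train_test_split(X, price, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Early on (few examples) the model errs like a junior salesman;
# with enough examples its estimates become accurate.
print("R^2 on unseen houses:", round(model.score(X_test, y_test), 3))
```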

Most big companies have understood the value of machine learning and of holding data. McKinsey has estimated that the value of analytics ranges from $9.5 trillion to $15.4 trillion, of which $5 trillion to $7 trillion can be attributed to the most advanced AI techniques.

Continued here:
Machine Learning Tutorial for Beginners - Guru99

Machine Learning – India | IBM

Machine-learning techniques are required to improve the accuracy of predictive models. Depending on the nature of the business problem being addressed, there are different approaches based on the type and volume of the data. In this section, we discuss the categories of machine learning.

Supervised learning

Supervised learning typically begins with an established set of data and a certain understanding of how that data is classified. Supervised learning is intended to find patterns in data that can be applied to an analytics process. This data has labeled features that define the meaning of the data. For example, you can create a machine-learning application that distinguishes between millions of animals based on images and written descriptions.

Unsupervised learning

Unsupervised learning is used when the problem requires a massive amount of unlabeled data. For example, social media applications, such as Twitter, Instagram, and Snapchat, all have large amounts of unlabeled data. Understanding the meaning behind this data requires algorithms that classify the data based on the patterns or clusters they find. Unsupervised learning conducts an iterative process, analyzing data without human intervention. It is used with email spam-detecting technology: there are far too many variables in legitimate and spam emails for an analyst to tag unsolicited bulk email, so machine-learning classifiers based on clustering and association are applied to identify unwanted email.

Reinforcement learning

Reinforcement learning is a behavioral learning model. The algorithm receives feedback from the data analysis, guiding the user to the best outcome. Reinforcement learning differs from other types of supervised learning because the system isn't trained with a sample data set. Rather, the system learns through trial and error: a sequence of successful decisions results in the process being reinforced, because it best solves the problem at hand.
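A minimal trial-and-error sketch, tabular Q-learning on a toy corridor world, illustrates the idea; this is an invented example, not IBM's.

```python
# Tabular Q-learning: successful decisions are reinforced over episodes.
import numpy as np

n_states, n_actions = 5, 2          # positions 0..4; actions: 0=left, 1=right
Q = np.zeros((n_states, n_actions)) # learned value of each action per state
alpha, gamma, epsilon = 0.5, 0.9, 0.5
rng = np.random.default_rng(0)

for episode in range(300):
    state = 0
    for _ in range(200):                         # cap steps per episode
        if rng.random() < epsilon:               # explore occasionally
            action = int(rng.integers(n_actions))
        else:                                    # otherwise act greedily
            action = int(Q[state].argmax())
        next_state = max(0, state - 1) if action == 0 else state + 1
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # Successful decisions are reinforced via the Q-update:
        Q[state, action] += alpha * (
            reward + gamma * Q[next_state].max() - Q[state, action]
        )
        state = next_state
        if state == n_states - 1:                # reached the goal
            break

# Expect action 1 (right) for every non-terminal state after training.
print("best action per state:", Q.argmax(axis=1))
```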

Deep learning

Deep learning is a specific method of machine learning that incorporates neural networks in successive layers to learn from data in an iterative manner. Deep learning is especially useful when you're trying to learn patterns from unstructured data. Deep learning's complex neural networks are designed to emulate how the human brain works, so computers can be trained to deal with poorly defined abstractions and problems. The average five-year-old child can easily recognize the difference between his teacher's face and the face of the crossing guard; the computer, in contrast, must do a lot of work to figure out who is who. Neural networks and deep learning are often used in image recognition, speech, and computer vision applications.
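A minimal sketch of a small multi-layer network, assuming scikit-learn's MLPClassifier (the article names no framework), learning a pattern a single linear layer cannot:

```python
# Successive layers let a network learn non-linear patterns (here, XOR).
import numpy as np
from sklearn.neural_network import MLPClassifier

# XOR is not linearly separable, so stacked layers are needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])

net = MLPClassifier(hidden_layer_sizes=(8, 8), activation="tanh",
                    solver="lbfgs", max_iter=2000, random_state=0)
net.fit(X, y)

print(net.predict(X))  # expected: [0 1 1 0]
```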

See the original post here:
Machine Learning - India | IBM

Microsoft and Udacity partner in new $4 million machine-learning scholarship program for Microsoft Azure – TechRepublic

Applications are now open for the nanodegree program, which will help Udacity train developers on the Microsoft Azure cloud infrastructure.

Microsoft and Udacity are teaming up to invest $4 million in a machine learning (ML) training collaboration, beginning with the Machine Learning Scholarship Program for Microsoft Azure, which starts today.

The program focuses on artificial intelligence, which is continuing to grow at a fast pace. AI engineers are in high demand, particularly as enterprises build new cloud applications and move old ones to the cloud. The average AI salary in the US is $114,121 a year, based on data from Glassdoor.

"AI is driving transformation across organizations and there is increased demand for data science skills," said Julia White, corporate vice president, Azure Marketing, Microsoft, in a Microsoft blog post. "Through our collaboration with Udacity to offer low-code and advanced courses on Azure Machine Learning, we hope to expand data science expertise as experienced professionals will truly be invaluable resources to solving business problems."


The interactive scholarship courses begin with a two-month-long course, "Introduction to machine learning on Azure with a low-code experience."

Students will work with live Azure environments directly within the Udacity classroom and build on these foundations with advanced techniques such as ensemble learning and deep learning.

To earn a spot in the foundations course, students will need to submit an application. According to the blog post, "Successful applicants will ideally have basic programming knowledge in any language, preferably Python, and be comfortable writing scripts and performing loop operations."

Udacity's nanodegrees have been growing in popularity: monthly enrollment has increased by a factor of four since the beginning of the coronavirus lockdown. Among Udacity's consumer customers, in the three weeks starting March 9, the company saw a 56% jump in weekly active users and a 102% increase in new enrollments, and they've stayed at or just below those levels since then, according to a Udacity spokesperson.

After students complete the foundations course, Udacity will select top performers to receive a scholarship to the new machine learning nanodegree program with Microsoft Azure.

The nanodegree program typically takes four months to complete.

Students who aren't selected for the scholarship will still be able to enroll in the nanodegree program when it is available to the general public.

Anyone interested in becoming an Azure Machine Learning engineer and learning from experts at the forefront of the field can apply for the scholarship here. Applications will be open from June 10 to June 30.



See the original post here:
Microsoft and Udacity partner in new $4 million machine-learning scholarship program for Microsoft Azure - TechRepublic