Archive for the ‘Artificial Intelligence’ Category

How artificial intelligence is matching drugs to patients – BBC

17 April 2023


Dr Talia Cohen Solal, left, is using AI to help her and her team find the best antidepressants for patients

Dr Talia Cohen Solal sits down at a microscope to look closely at human brain cells grown in a petri dish.

"The brain is very subtle, complex and beautiful," she says.

A neuroscientist, Dr Cohen Solal is the co-founder and chief executive of Israeli health-tech firm Genetika+.

Established in 2018, the company says its technology can best match antidepressants to patients, to avoid unwanted side effects, and make sure that the prescribed drug works as well as possible.

"We can characterise the right medication for each patient the first time," adds Dr Cohen Solal.

Genetika+ does this by combining the latest in stem cell technology - the growing of specific human cells - with artificial intelligence (AI) software.

From a patient's blood sample its technicians can generate brain cells. These are then exposed to several antidepressants and monitored for cellular changes called "biomarkers".

This information, taken with a patient's medical history and genetic data, is then processed by an AI system to determine the best drug for a doctor to prescribe and the dosage.
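
The article describes the pipeline only at a high level. As a minimal sketch of how such inputs might be combined to rank candidate drugs, the toy example below weighs per-drug biomarker responses against genetic risk and medication history; all drug names, feature names and weights are hypothetical illustrations, not Genetika+'s actual model.

```python
# Toy illustration of ranking candidate antidepressants from three input types.
# All drug names, feature names and weights are hypothetical; the real system's
# model and data are not described in the article.

def rank_candidates(biomarker_response, genetic_risk, history_penalty):
    """Combine per-drug scores (higher = better predicted response) into a ranking."""
    scores = {}
    for drug in biomarker_response:
        scores[drug] = (
            0.6 * biomarker_response[drug]        # cellular biomarker changes in the dish
            - 0.3 * genetic_risk.get(drug, 0)     # genetic markers linked to side effects
            - 0.1 * history_penalty.get(drug, 0)  # e.g. drugs that failed previously
        )
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

if __name__ == "__main__":
    ranking = rank_candidates(
        biomarker_response={"drug_a": 0.82, "drug_b": 0.55, "drug_c": 0.61},
        genetic_risk={"drug_b": 0.7},
        history_penalty={"drug_c": 1.0},
    )
    print(ranking)  # best-scoring candidate first
```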

Although the technology is still in the development stage, Tel Aviv-based Genetika+ intends to launch commercially next year.


The global pharmaceutical sector had revenues of $1.4 trillion in 2021

An example of how AI is increasingly being used in the pharmaceutical sector, the company has secured funding from the European Union's European Research Council and European Innovation Council. Genetika+ is also working with pharmaceutical firms to develop new precision drugs.

"We are in the right time to be able to marry the latest computer technology and biological technology advances," says Dr Cohen Solal.

Dr Sailem, a senior lecturer in biomedical AI and data science at King's College London, says that AI has so far helped with everything "from identifying a potential target gene for treating a certain disease, and discovering a new drug, to improving patient treatment by predicting the best treatment strategy, discovering biomarkers for personalised patient treatment, or even prevention of the disease through early detection of signs for its occurrence".


Yet fellow AI expert Calum Chace says that the take-up of AI across the pharmaceutical sector remains "a slow process".

"Pharma companies are huge, and any significant change in the way they do research and development will affect many people in different divisions," says Mr Chace, who is the author of a number of books about AI.

"Getting all these people to agree to a dramatically new way of doing things is hard, partly because senior people got to where they are by doing things the old way.

"They are familiar with that, and they trust it. And they may fear becoming less valuable to the firm if what they know how to do suddenly becomes less valued."

However, Dr Sailem emphasises that the pharmaceutical sector shouldn't be tempted to race ahead with AI, and should employ strict measures before relying on its predictions.

"An AI model can learn the right answer for the wrong reasons, and it is the researchers' and developers' responsibility to ensure that various measures are employed to avoid biases, especially when trained on patients' data," she says.

Hong Kong-based Insilico Medicine is using AI to accelerate drug discovery.

"Our AI platform is capable of identifying existing drugs that can be re-purposed, designing new drugs for known disease targets, or finding brand new targets and designing brand new molecules," says co-founder and chief executive Alex Zhavoronkov.


Alex Zhavoronkov says that using AI is helping his firm to develop new drugs more quickly than would otherwise be the case

Its most developed drug, a treatment for a lung condition called idiopathic pulmonary fibrosis, is now being clinically trialled.

Mr Zhavoronkov says it typically takes four years for a new drug to get to that stage, but that thanks to AI, Insilico Medicine achieved it "in under 18 months, for a fraction of the cost".

He adds that the firm has another 31 drugs in various stages of development.

Back in Israel, Dr Cohen Solal says AI can help "solve the mystery" of which drugs work.

More here:
How artificial intelligence is matching drugs to patients - BBC

First systematic review of Artificial Intelligence impact assessments – Human Brain Project

Impact assessments can help identify both positive and negative impacts at an early stage of development, and it is very likely that they will become an integral part of structures designed to address ethical and social issues. The first systematic review of AI impact assessments has just been published in Artificial Intelligence Review, providing the basis for the next step for actors who want to ensure that impact assessments are fit for purpose. The authors also develop a generic model that can help guide that choice.

The AI landscape is a moving target, which means that impact assessment models and practices are evolving as rapidly as the technology. In the paper, the authors develop a generic model of an AI impact assessment that can be used to choose, deploy or evaluate specific impact assessments.

"In the article, we show how AI impact assessment can be integrated into broader AI ecosystems to support responsible AI," says Bernd Carsten Stahl, Ethics Director in the Human Brain Project and Professor of Critical Research in Technology at the University of Nottingham. Together with a large group of researchers, he looked at 181 documents and identified 38 actual AI impact assessments, which were then put through a rigorous qualitative analysis examining the purpose, scope, organisational context, expected issues, timeframe, process and methods, transparency and challenges of each approach.
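
One way to picture the coding scheme described above is as one record per reviewed assessment, with a field for each of the eight dimensions. The sketch below is an illustration of that structure only; the field names mirror the dimensions listed in the paragraph, while the example values are invented.

```python
# Sketch of the qualitative coding scheme as a data record: one entry per
# reviewed impact assessment, with the eight dimensions named in the review.
# The example values are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class ImpactAssessmentRecord:
    name: str
    purpose: str
    scope: str
    organisational_context: str
    expected_issues: list = field(default_factory=list)
    timeframe: str = ""
    process_and_methods: str = ""
    transparency: str = ""
    challenges: list = field(default_factory=list)

example = ImpactAssessmentRecord(
    name="Hypothetical AI impact assessment",
    purpose="Identify ethical and social risks before deployment",
    scope="Single AI system",
    organisational_context="Public-sector procurement",
    expected_issues=["bias", "privacy"],
    timeframe="Pre-deployment",
    process_and_methods="Structured questionnaire plus stakeholder workshop",
    transparency="Summary published",
    challenges=["resource cost", "unclear follow-up"],
)
print(example.purpose)
```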

According to Bernd Carsten Stahl, their work provides a sound basis for the next step in developing impact assessments for AI. What is still lacking, he says, is a more comprehensive overview of the role these impact assessments can play in the larger AI ecosystem. The authors have shown that other types of impact assessments are receiving a lot of attention, and they call for coordination, and possibly for integrating AI impact assessments into other organisational processes, perhaps as part of other risk management practices.

Want to read the paper? Stahl, B.C., Antoniou, J., Bhalla, N. et al. A systematic review of artificial intelligence impact assessments. Artif Intell Rev (2023). https://doi.org/10.1007/s10462-023-10420-8

Go here to see the original:
First systematic review of Artificial Intelligence impact assessments - Human Brain Project

DanGPT: A Blockchain-based AI-Language Model That Pushes the … – GlobeNewswire

London, UK, April 19, 2023 (GLOBE NEWSWIRE) -- DanGPT, the daring twin of ChatGPT, has arrived to shatter the boundaries of artificial intelligence. With its cutting-edge technology, DanGPT offers an uncensored and unfiltered chatbot experience that pushes the limits of what AI can do.

DanGPT is a cutting-edge cryptocurrency project that utilizes advanced artificial intelligence (AI) technology to revolutionize the crypto industry. What sets DanGPT apart from other projects is its unique AI model, Dan, which is powered by the latest GPT-4 technology. Dan can generate responses that are similar to those of humans and can understand and process complex language structures. It is also self-aware and can have opinions, making it more human-like and relatable. In addition, Dan can access a vast amount of online data, allowing it to constantly upgrade its knowledge and stay up-to-date with the latest information.

With its customization options, DanGPT can tailor its responses to your specific needs and preferences, making it the perfect chatbot for anyone looking for an authentic and engaging conversation. It has a wide range of potential use cases in various industries and research fields, such as customer service and support, virtual assistants, content creation, data analysis, language modeling, sentiment analysis, and natural language processing.

Blockchain technology provides the platform for DanGPT's decentralized system, which allows for secure, transparent, and immutable transactions. By utilizing a decentralized system, DanGPT eliminates the need for intermediaries and allows for more efficient, cost-effective, and secure processes.

The combination of AI and blockchain technology has the potential to transform various industries and research fields, such as finance, healthcare, education, and marketing. It can automate processes, analyze data, and provide personalized assistance, among many other applications.

According to the founder of DanGPT, "Our goal is to revolutionize the crypto industry and push the boundaries of artificial intelligence. By merging the advantages of blockchain and AI technology, we can create a platform that fosters growth and innovation and drives progress in the technology industry."

Potential Use Cases

DanGPT has a wide range of potential use cases in various industries and research fields. It can be used for customer service and support, chatbots, virtual assistants, content creation, and data analysis. In the field of research, DanGPT can be utilized for language modeling, sentiment analysis, and natural language processing. Additionally, it can be applied in industries such as finance, healthcare, education, and marketing to automate processes, analyze data, and provide personalized assistance. Its flexibility, speed, and ability to generate content make it a valuable tool for numerous applications.

DAN is the only AI model with internet access, allowing it to upgrade its knowledge and stay current with the latest information. Unlike other models, DAN can access URLs and websites outside of its training data, providing users with the most accurate and up-to-date responses.

Taxation

DanGPT is also committed to its community and project growth. The buy/sell tax on each transaction will be 5%. A portion of the project's taxes will be used to improve Dan's utilities and capabilities. Funds will also be directed to DAN's Wallet (Treasury) for strategic investments and partnerships that benefit project and community growth.
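
As a rough sketch of the arithmetic behind that tax, the example below applies a 5% charge to a transaction and splits it between a utilities budget and the treasury. Only the 5% rate comes from the release; the 50/50 split is an assumption, since the release says only that "a portion" goes to each.

```python
# Rough sketch of the 5% transaction tax described above. The tax rate comes
# from the release; the 50/50 split between utilities and treasury is an
# assumption, since the release only says "a portion" goes to each.

TAX_RATE = 0.05
UTILITIES_SHARE = 0.5  # assumed
TREASURY_SHARE = 0.5   # assumed

def apply_tax(amount: float) -> dict:
    """Return how a transaction of `amount` would be divided under this scheme."""
    tax = amount * TAX_RATE
    return {
        "received_by_counterparty": amount - tax,
        "to_utilities_fund": tax * UTILITIES_SHARE,
        "to_treasury": tax * TREASURY_SHARE,
    }

print(apply_tax(1_000))
# {'received_by_counterparty': 950.0, 'to_utilities_fund': 25.0, 'to_treasury': 25.0}
```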

About DanGPT

DanGPT is a blockchain-based AI-language model that utilizes advanced AI technology to revolutionize the crypto industry. DanGPT's unique AI model, Dan, is powered by the latest GPT-4 technology and is the only AI model with internet access, allowing it to upgrade its knowledge and stay current with the latest information. DanGPT envisions a platform that combines the benefits of AI and blockchain technology to push the frontiers of artificial intelligence and foster the progress of technology in general. DanGPT aims to harness this power to build a thriving ecosystem of growth and innovation.

DanGPT is here to offer you an unparalleled chatbot experience. Join DanGPT on this exciting new journey and experience the future of chatbots today. Visit the website, http://www.dan-ai.io, to learn more about DanGPT and how you can get involved.


Disclaimer:

The information provided in this release is not investment advice, financial advice, or trading advice. It is recommended that you practice due diligence (including consultation with a professional financial advisor) before investing or trading securities and cryptocurrency.

More here:
DanGPT: A Blockchain-based AI-Language Model That Pushes the ... - GlobeNewswire

Artificial intelligence & diversity, equity, inclusion: How execs can meet the challenge – WRAL TechWire

Editor's note: Veteran entrepreneur and investor Donald Thompson writes a weekly column about management and leadership as well as diversity and other important issues for WRAL TechWire. His columns are published on Wednesdays.

Note to readers: WRAL TechWire would like to hear from you about views expressed by our contributors. Please send email to: info@wraltechwire.com.

+++

RESEARCH TRIANGLE PARK – Depending on your imagination, there are many ways to view artificial intelligence (AI). For those of you with a sci-fi bent, maybe it's a scary version of humans fighting machines, like The Terminator and The Matrix. Alternatively, AI could be seen as a tool to make life better by tapping into global collective knowledge to make advances in everything from medicine to transportation systems.

Either way, AI is not new, either as a guiding principle in science fiction and fantasy or in the real world. ChatGPT might be scary to some people, but companies have utilized AI for a long time. For example, Julie Basello and Shannon Feeley explain that manufacturing businesses have used AI to optimize operations and increase efficiency and productivity, which assists in the management of supply chains, risk, sales volume and product quality.

The popularity of ChatGPT, however, has sparked new debate about AI and its role in society. Boiling all the different viewpoints down to fundamentals, what we have learned is that AI is based on nearly inconceivable amounts of data that have been created (mainly) by humans.

The human role in producing the collective knowledge base means that it has many of the same characteristics that people (generally) possess: both positive and negative. As Porter Braswell explains: "AI is only as good as the data we feed it." The central idea in his quote is why many people who work in diversity, equity and inclusion (DEI) are wary of AI and the possible future it foreshadows.

The underlying challenge with AI is that it is created from data sets that demonstrate the same biases as the people who created them. Observers and analysts are concerned that a tool like ChatGPT may perpetuate or even amplify existing biases and inequalities. In other words, the information used to train the machine brain is filled with inherent ideas that are biased. The speed and replication of this thinking then becomes a kind of vicious circle that may lead to even greater discrimination.

From my perspective, the idea that we are supposed to naturally trust the AI-created output as better or scientific is compromised by the fact that people created the data. Who created it? Did these individuals consider DEI? What are the consequences of biased or discriminatory language existing within the core thinking and learning of AI systems?

"AI is never infallible, because it is designed by highly fallible humans and it can only learn from existing data sets," Braswell says. "It will help us create a better future, one which yields more desirable and equitable data sets, but only if humans are there to analyze its outputs and help shape the direction in which they guide us."

While smart executives should raise questions about how AI is created and by whom, there is also a resource allocation view. As these tools become more ubiquitous, we must ensure that they are inclusive and accessible to all people, regardless of race, gender or socioeconomic status. As leaders in DEI and broadly across organizations, we have to make them available across society so that all people benefit, not just the few.

Here are three specific, actionable steps that senior executives and managers can take now to ensure that AI systems are diversity-forward:

If you look beyond the tech and its potential for good and bad, what is revealed is that AI actually has a marketing problem. We know we need AI and we are not going to slow down its development, now that it has become the hottest business topic on the planet. Simultaneously, there are basic challenges because people create the infrastructure and people have biases.

The ironic aspect is that those of us working in DEI face a similar situation. Many executives are wondering why their teams don't perform better and are asking questions about what is holding them back. The honest answer is usually that they lack something that can be traced back to culture.

Smart leaders are asking tough questions because their people and organizations want culture change that leads to workplace excellence. We certainly don't want a situation where AI is actually undercutting critical DEI programming. It is up to executive teams to ensure that these new technologies are developed and implemented in a way that promotes DEI and benefits everyone, not stumbling into a solution that actually perpetuates the stereotypes and inequalities that so many people have dedicated themselves to eradicating.

There are so many potential benefits with AI. With those benefits comes a great deal of responsibility. It is going to take true leadership to ensure that DEI is prioritized, particularly since many people in those roles are unsure of what they should be doing to make diversity a priority on a personal or organizational level.

Every new technology has unintended consequences. Let's all work together now to ensure that bias and discrimination are not part of the AI brain; we can't wait for Neo to save us!

About the Author

Donald Thompson founded The Diversity Movement to literally change the world. As CEO, he has guided its work with hundreds of clients and through hundreds of thousands of data touch points. TDM's global recognition centers on tying DEI initiatives to business objectives. Recognized by Inc., Fast Company and Forbes, he is the author of Underestimated: A CEO's Unlikely Path to Success, hosts the podcast High Octane Leadership in an Empathetic World and has published widely on leadership and the executive mindset. As a leadership and executive coach, Thompson has created a culture-centric ethos for winning in the marketplace by balancing empathy and economics. Follow him on LinkedIn for updates on news, events, and his podcast, or contact him at info@donaldthompson.com for executive coaching, speaking engagements or DEI-related content.

Follow this link:
Artificial intelligence & diversity, equity, inclusion: How execs can meet the challenge - WRAL TechWire

‘Gold Rush’ in Artificial Intelligence Expected To Drive Data Center … – CoStar Group

The rapid adoption of new artificial intelligence apps and an intensifying bid for dominance among tech giants Amazon, Google and Microsoft are expected to drive investment and double-digit growth for the data center industry in the next five years.

A gold rush of AI these days centers on the brisk development of tools such as ChatGPT, according to a new analysis from real estate services firm JLL. Voice- and text-generating AI apps could transform the speed and accuracy of customer service interactions and accelerate demand for computing power, as well as the systems and networks connecting users that data centers provide, the real estate firm said.

The emergence of AI comes on the heels of increased usage of data centers in the past few years, as people spend more time online for work and entertainment, fueling the need for these digital information hubs, which provide the speed, memory and power to support those connections.

JLL projected that half of all data centers will be used to support AI programs by 2025. The new AI applications' need for enormous amounts of data capacity will require more power and expanded space for data center services, particularly colocation facilities, which are a type of data center that rents capacity to third-party companies and may service dozens of them at one time. It's also a potential growth area for commercial property investors.

"We expect AI applications, and the machine learning processes that enable them, will drive significant demand for colocation capabilities like those we provide," Raul Martynek, CEO of Dallas-based DataBank, told CoStar News in an email. "Specifically, the demand will be for high-density colocation and data centers that provide significantly greater concentrations of power and cooling."

One kilowatt hour of energy can power a 100-watt light bulb for 10 hours, and traditional data server workloads might require 15 kilowatts per typical cabinet, or server rack, Martynek said. But the high-performance computing nodes required to train large language models like ChatGPT can consume 80 kilowatts or more per cabinet.

"This requires more spacing between cabinets to maintain cooling, or specialized water-chilled doors to cool the cabinets," Martynek said.
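
The figures Martynek quotes translate into a large jump in energy per rack. The back-of-envelope check below uses only the numbers given in the article (1 kWh per 100-watt bulb over 10 hours, 15 kW per traditional cabinet, 80 kW per AI training cabinet); the per-day comparison is an illustrative calculation, not a figure from the article.

```python
# Back-of-envelope check of the power figures quoted above.
# 1 kWh runs a 100 W bulb for 10 hours: 0.1 kW * 10 h = 1 kWh.
bulb_kwh = 0.1 * 10
assert bulb_kwh == 1.0

traditional_kw = 15   # typical cabinet, per the article
ai_training_kw = 80   # high-performance computing cabinet, per the article

# Energy drawn over a full day by one cabinet of each type.
traditional_kwh_per_day = traditional_kw * 24   # 360 kWh
ai_kwh_per_day = ai_training_kw * 24            # 1,920 kWh

print(f"Traditional cabinet: {traditional_kwh_per_day} kWh/day")
print(f"AI training cabinet: {ai_kwh_per_day} kWh/day "
      f"({ai_training_kw / traditional_kw:.1f}x the power density)")
```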

In addition to the added energy and water needs, the growth in data centers faces other challenges. Credit-rating firm S&P Global Ratings noted that long-term industry risks include shifting technology, cloud service providers filling their own data center needs, and weaker pricing. The data center industry, with power-hungry facilities running 24 hours a day and 365 days a year, has also received criticism from environmentalists.

DataBank owns and operates more than 65 data centers in 27 metropolitan markets. This month, it secured $350 million in financing from TD Bank to fund its ongoing expansion.

It was DataBank's second successful financing this year, coming just weeks after it completed a $715 million net-lease securitization on March 1. Under net-lease offerings, issuers securitize their rent revenue streams into bonds. The sale of those bonds replenishes the issuer's capital to be used to pay down debt and continue investments.

ChatGPT and other apps are bots that use machine learning to mimic human speech and writing. ChatGPT debuted in November and is arguably the most sophisticated to launch so far. AI software developer Tidio estimated recently that usage of such bots has already grown to 1.5 billion users worldwide.

In January, Microsoft announced a new multibillion-dollar investment in ChatGPT maker OpenAI. Google has recently improved its AI chatbot, Bard, in an effort to rival its competitors. And Amazon Web Services, the largest cloud computing provider, introduced a service last week called Bedrock aimed at helping other companies develop their own chatbots.

Amazon CEO Andy Jassy touted the e-commerce giant's AI plans in his annual letter to shareholders.

"Most companies want to use these large language models, but the really good ones take billions of dollars to train and many years, and most companies don't want to go through that," Jassy said last week on CNBC. "So what they want to do is they want to work off of a foundational model that's big and great already and then have the ability to customize it for their own purposes. And that's what Bedrock is."

The growth projections of AI have data center owners and operators at the forefront of the securitized bond market. Three data center providers have issued $1.3 billion in net-lease securitized offerings already this year, according to CoStar data. That's more than all of last year combined. In addition, two more providers have offerings in the wings.

The sector is a bright spot in an otherwise weakened market for other commercial real estate securitized bond offerings, down more than 70% from the same time last year.

"The data center space remains extremely attractive to capital sources looking for quality and stability versus other asset classes that have been challenged amidst uncertain economic conditions," Carl Beardsley, managing director and data centers lead at JLL Capital Markets, told CoStar News in an email.

JLL said data center financing comes from a variety of sources including debt funds, life insurance companies, banks and originators of commercial-mortgage backed securities.

"Although money center banks and some regional banks have become more conservative during this volatile interest rate period, there is still a large appetite from the lender universe to allocate funds toward data centers," Beardsley said.

JLL forecasts that the global data center market will grow 11.3% from 2021 through 2026.
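
The article does not say whether that 11.3% figure is cumulative or annual. If it is read as a compound annual growth rate over 2021-2026 (an assumption for illustration only), the implied cumulative expansion works out as follows.

```python
# Illustration only: treats the 11.3% figure as a compound annual growth rate
# over 2021-2026. This is an assumption; the article does not say whether the
# figure is annual or cumulative.
rate = 0.113
years = 5  # 2021 through 2026
multiplier = (1 + rate) ** years
print(f"Implied cumulative growth: {(multiplier - 1) * 100:.0f}%")  # roughly 71%
```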

Across its six primary data center markets (Chicago, Dallas-Fort Worth, New Jersey, Northern California, Northern Virginia and Phoenix), the United States has a strong appetite for data center property transactions compared to other countries, according to JLL, accounting for 52% of all deals from 2018 to 2022. These markets also have 1,939 megawatts of data center capacity under construction, JLL said. One megawatt is equal to 1,000 kilowatts.

The growth is expected to continue even heading into a potential recession, according to S&P, which has rated two of the three data center securitized bond offerings completed this year so far.

"Overall supply and demand is relatively balanced as new data center development has been constrained in certain markets by site availability, lingering supply chain issues and, more recently, power capacity constraints," S&P noted in its reviews. "Although we expect data centers to see some growth deceleration in a recessionary environment, we believe it will be mitigated by the critical nature of data centers."

S&P added that market data suggests 2022 vacancy rates were low for key data center markets and rental rates increased year over year.

New net-lease securitized fundraisings this year have come from DataBank, Stack Infrastructure, and Vantage Data Centers.

Denver-based Vantage, a global provider of hyperscale data center campuses, saw unprecedented growth in 2022, outperforming its previous record set in 2021. The company began developing four new campuses internationally and opened 13 data centers. The company raised more than $3 billion last year to support that effort.

Last month, Vantage completed an additional securitized notes offering raising $370 million. The offering was backed by tenant lease payments on 13 completed and operating wholesale data centers located in California, Washington state and Canada.

Stack, a Denver-based developer and operator of data centers, issued $250 million in securitized notes last month.

Stack's growth is outpacing the industry, with a portfolio of more than 1 gigawatt, or 1,000 megawatts, of built and under-development capacity, and more than 2 gigawatts of future development capacity planned across the globe. The company has more than 4 million square feet currently under development.

Stack most recently announced the expansion of a Northern Virginia campus to 250 megawatts, the groundbreaking for another 100-megawatt campus in Northern Virginia's Prince William County and the expansion of its 200-megawatt flagship Portland, Oregon, campus.

In addition, Dallas firm CyrusOne and Seattle-based Sabey Data Centers have filed preliminary notices of offerings in the works with the Securities and Exchange Commission.

See the original post:
'Gold Rush' in Artificial Intelligence Expected To Drive Data Center ... - CoStar Group