Archive for the ‘Quantum Computer’ Category

Material found in paint may hold the key to a technological revolution – Advanced Science News

The flakes of paint you strip off the walls might not be so useless after all.

Image credit: Sandia National Laboratories

For the next generation of computer processors, one persistent challenge for researchers is finding novel ways to make non-volatile memory at an ever-smaller scale. As shrinking processors approach a finite limit on space, and therefore on processing power, quantum computing and new materials that move away from traditional silicon chips are seen as ways to overcome this barrier.

Now, researchers at Sandia National Laboratories, California, and the University of Michigan, publishing in Advanced Materials, have taken a step toward solving this problem by using a new material in processing chips for machine-learning applications, one that gives these computers more processing power than conventional chips. The specific obstacle the authors wanted to overcome was the limitation of filamentary resistive random access memory (RRAM), in which defects occur within the nanosized filaments. The team instead set out to create filament-free bulk RRAM cells.

The material the authors use, titanium dioxide or TiO2, may sound like a rather mundane inorganic substance to readers unfamiliar with it, but it is in fact a lot more common than most people realize. If you ever watched Bob Ross's wonderful The Joy of Painting, you may know TiO2 better as titanium white, the name it is given when used as a pigment in paints. In fact, TiO2 is ubiquitous in paints, not just on the landscape artist's palette, but in house paints, varnishes, and other coatings. It is also found in sunscreen and toothpaste.

The point is, TiO2 is cheap and easy to make, which is one of the reasons this new-found application in computer technology is so exciting.

A. Alec Talin of Sandia National Laboratories, lead author of the paper, explained why this cheap, nontoxic substance is ideal for his team's novel processing chip: "It's an oxide, there's already oxygen there. But if you take a few out, you create what are called oxygen vacancies. It turns out that when you create oxygen vacancies, you make this material electrically conductive."

These vacancies can also store electrical data, a key ingredient of computing power. The oxygen vacancies are created by heating a computer chip with a titanium dioxide coating to 150 °C; through basic electrochemistry, some of the oxygen can be removed from the coating, creating the vacancies.

"When it cools off, it stores any information you program it with," Talin said.

Furthermore, their TiO2-based processor not only offers a new way of processing digital information, it also has the potential to fundamentally alter the way computers operate. Currently, computers work by storing data in one place and processing that same data in another place. In other words, energy is wasted in moving data from one place to another before it can be processed.

"What we've done is make the processing and the storage at the same place," said Yiyang Li of the University of Michigan and first author of the paper. "What's new is that we've been able to do it in a predictable and repeatable manner."

This is particularly important for machine-learning and deep neural network applications, where so much of the computing power is needed for processing data rather than for moving it.

Li explained: "If you have autonomous vehicles, making decisions about driving consumes a large amount of energy to process all the inputs. If we can create an alternative material for computer chips, they will be able to process information more efficiently, saving energy and processing a lot more data."
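The paper itself describes analog, filament-free RRAM cells; as a loose illustration of why computing where the data is stored helps, here is a minimal NumPy sketch of a crossbar-style analog multiply-accumulate. The array sizes and values are made up, and this is not a model of the authors' device.

```python
import numpy as np

# Hypothetical illustration of compute-in-memory: weights are stored as
# analog conductances in a crossbar, and a matrix-vector product happens
# where the data lives instead of shuttling it to a separate processor.
rng = np.random.default_rng(0)

weights = rng.uniform(0.0, 1.0, size=(4, 8))   # conductances (arbitrary units)
inputs = rng.uniform(0.0, 1.0, size=8)         # input voltages (arbitrary units)

# Ohm's law per cell (I = G * V) and summing currents along each line
# give the multiply-accumulate "for free" in the analog domain.
currents = weights * inputs                    # per-cell currents
line_sums = currents.sum(axis=1)               # summed output currents

# Same result as a conventional digital matrix-vector product.
assert np.allclose(line_sums, weights @ inputs)
print(line_sums)
```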

Talin also sees applications in everyday devices that are already ubiquitous. "Think about your cell phone," he said. "If you want to give it a voice command, you need to be connected to a network that transfers the command to a central hub of computers that listen to your voice and then send a signal back telling your phone what to do." With a chip like this, voice recognition and other functions could instead happen right in the phone.

In an age where digital privacy is important, it may be attractive to consumers to know that sensitive data, such as the sound of their own voice, stays on their phone rather than being sent to the Cloud first, where accountability and control are less clear-cut.

Like many advances in science, the discovery of this technological application of TiO2 is, as Bob Ross would call it, yet another "happy accident," one with real-world, positive applications.

Reference: Yiyang Li et al., "Filament-Free Bulk Resistive Memory Enables Deterministic Analogue Switching," Advanced Materials (2020). DOI: 10.1002/adma.202003984

Quotes adapted from the Sandia National Laboratories press release.

Read more:
Material found in paint may hold the key to a technological revolution - Advanced Science News

What is Quantum Computing, and How does it Help Us? – Analytics Insight

The term quantum computing gained momentum in the late 20th century. These systems aim to harness quantum-mechanical effects to become highly efficient, using quantum bits, or qubits, instead of the simple ones and zeros manipulated by existing binary-based computers. Qubits can also enter a state called superposition, in which they simultaneously represent both one and zero. Instead of analyzing ones and zeros sequentially, two qubits in superposition can represent four scenarios at the same time. We may therefore be at the cusp of a computing revolution in which future systems have capabilities far beyond today's calculations and algorithms.
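To make the "four scenarios at once" claim concrete, here is a minimal NumPy sketch (illustrative only, not tied to any particular quantum hardware) of a two-qubit state as a vector of four amplitudes:

```python
import numpy as np

# Illustrative only: the state of two qubits is a vector of 2**2 = 4
# complex amplitudes, so a uniform superposition spans all four basis
# states (00, 01, 10, 11) at once.
plus = np.array([1, 1]) / np.sqrt(2)       # single qubit in equal superposition
two_qubits = np.kron(plus, plus)           # tensor product of two such qubits

for bits, amplitude in zip(["00", "01", "10", "11"], two_qubits):
    print(bits, amplitude)                 # each amplitude is 0.5
```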

Quantum computers also follow the principle of entanglement, which Albert Einstein famously referred to as "spooky action at a distance." Entanglement refers to the observation that the states of particles from the same quantum system cannot be described independently of each other. Even when they are separated by great distances, they remain part of the same system.
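A similarly minimal sketch of entanglement, using the standard Bell state as an example (chosen here for illustration; the article does not name a specific state):

```python
import numpy as np

# Illustrative sketch of entanglement: the Bell state (|00> + |11>)/sqrt(2)
# cannot be written as two independent single-qubit states, and measuring
# one qubit fixes the outcome of the other.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # amplitudes for 00, 01, 10, 11

rng = np.random.default_rng(1)
probs = np.abs(bell) ** 2
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)   # only "00" and "11" ever appear: the outcomes are correlated
```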

Several nations, giant tech firms, universities, and startups are currently exploring quantum computing and its range of potential applications. IBM, Google, Microsoft, Amazon, and other companies are investing heavily in developing large-scale quantum computing hardware and software. Google and UCSB have a partnership to develop a 50-qubit computer; its state would correspond to roughly 2^50, or more than a quadrillion, numbers, which would take petabyte-scale memory on a classical computer to store. A petabyte is the unit above a terabyte and represents 1,024 terabytes, roughly the storage needed for 4,000 digital photos taken every day over a lifetime. Meanwhile, names like Rigetti Computing, D-Wave Systems, 1QBit Information Technologies, Quantum Circuits, QC Ware, and Zapata Computing are emerging as notable players in quantum computing.
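As a back-of-the-envelope check on the petabyte claim, here is a quick calculation that assumes a statevector representation with 16 bytes per complex amplitude (an assumption, since the article does not specify how the numbers would be stored):

```python
# Rough arithmetic only: a full statevector for n qubits has 2**n complex
# amplitudes; at 16 bytes each, 50 qubits lands in the petabyte range.
n_qubits = 50
amplitudes = 2 ** n_qubits              # ~1.13e15 numbers
bytes_needed = amplitudes * 16
petabytes = bytes_needed / 2 ** 50      # 1 PiB = 2**50 bytes
print(f"{amplitudes:.3e} amplitudes, about {petabytes:.0f} PiB to store")
```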

The IEEE Standards Association's Quantum Computing Working Group is developing two technical standards for quantum computing. One covers quantum computing definitions and nomenclature, so we can all speak the same language. The other addresses performance metrics and benchmarking to measure quantum computers' performance against classical computers and, ultimately, against each other. New standards will be added over time if required.

The rapid growth in the quantum tech sector over the past five years has been exciting because quantum computing presents immense potential. For instance, a quantum system can help scientists conduct virtual experiments and sift through vast amounts of data. Quantum parallelism lets a quantum algorithm perform a large number of computations simultaneously, while quantum interference combines their results into something meaningful that can be measured according to the laws of quantum mechanics. Chinese scientists are also looking to develop a quantum internet, a more secure communication system in which information is stored and transmitted with advanced cryptography.

Researchers at Case Western Reserve University used quantum algorithms to transform MRI scans for cancer, allowing the scans to be performed three times faster and improving their quality by 30%. In practice, this can mean patients won't need to be sedated to stay still for the length of an MRI, and physicians could track the success of chemotherapy at the earliest stages of treatment.

The Laboratoire de Photonique Numérique et Nanosciences in France has built a hybrid device that pairs a quantum accelerometer with a classical one and uses a high-pass filter to subtract the classical data from the quantum data. This has the potential to offer a highly precise quantum compass that would eliminate the bias and scale-factor drifts commonly associated with gyroscopic components. Meanwhile, the University of Bristol has developed a quantum solution to growing security threats. Researchers at the University of Virginia School of Medicine are working to uncover the potential quantum computers hold for understanding genetic diseases. Scientists are also using quantum computing in the search for a COVID-19 vaccine and treatments for other life-threatening diseases.
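The article gives only a one-line description of the filtering scheme. As a loose, hypothetical illustration of the general idea of fusing a fast-but-drifting classical sensor with a slower, drift-free quantum one, here is a toy complementary-filter sketch; all signals and constants are invented, and this is not the lab's actual signal chain.

```python
import numpy as np

# Toy sketch: a classical accelerometer is fast but drifts, a quantum
# accelerometer is slower but bias-free. A simple complementary filter keeps
# the classical sensor's high-frequency content and the quantum sensor's
# low-frequency (drift-free) content.
rng = np.random.default_rng(2)
t = np.linspace(0, 10, 1000)
true_accel = np.sin(2 * np.pi * 0.5 * t)

drift = 0.05 * t                                            # slowly growing bias
classical = true_accel + drift + 0.01 * rng.standard_normal(t.size)
quantum = true_accel + 0.05 * rng.standard_normal(t.size)   # unbiased, noisier

alpha = 0.98                                                # filter constant
fused = np.empty_like(t)
fused[0] = quantum[0]
for i in range(1, t.size):
    # high-pass the classical signal, low-pass the quantum signal
    fused[i] = (alpha * (fused[i - 1] + classical[i] - classical[i - 1])
                + (1 - alpha) * quantum[i])

print("classical drift error:", np.abs(classical - true_accel).max())
print("fused error:          ", np.abs(fused - true_accel).max())
```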

In July 2017, in collaboration with commercial photonics tools provider M Squared, QuantIC demonstrated how a quantum gravimeter detects the presence of deeply hidden objects by measuring disturbances in the gravitational field. If such a device becomes practical and portable, the team believes it could become invaluable in an early-warning system for predicting seismic events and tsunamis.

Continued here:
What is Quantum Computing, and How does it Help Us? - Analytics Insight

The Future of Computing: Hype, Hope, and Reality – CIOReview

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law, the exponential increase in the power of computers over the last several decades, have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that both software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors, each optimized for specialized tasks. Writing software that takes full advantage of these new chips is extremely challenging, so companies like SambaNova Systems are developing operating systems and software compilers that optimize application code automatically and allocate resources to compute tasks dynamically, in real time, as computing demands change.

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were well suited to the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And at the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.

Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to desktop and laptop distributed computers. With the development of a high-speed Internet, the thinking shifted, and an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires processing massive AI models and making very accurate judgments in milliseconds. For these tasks, the new special-purpose chips discussed above and below are fighting for design wins.
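As a minimal sketch of the "filter at the edge" idea described above (the readings, threshold, and payload format are made up for illustration):

```python
import json
import statistics

# Hypothetical edge filter: only readings that look anomalous are forwarded
# to the cloud; routine data stays local, saving bandwidth and storage.
def filter_at_edge(readings, z_threshold=2.0):
    """Return only the readings worth sending upstream."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings) or 1.0
    return [r for r in readings if abs(r - mean) / stdev > z_threshold]

sensor_readings = [20.1, 20.3, 19.9, 20.0, 35.7, 20.2, 20.1, 19.8]
to_cloud = filter_at_edge(sensor_readings)
print(json.dumps({"forwarded": to_cloud,
                  "dropped": len(sensor_readings) - len(to_cloud)}))
```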

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate on analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximations. (How much money do you have in your bank account?) Some problems, like AI inference and monitoring sensor data, do not need six-sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.
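To see why approximate arithmetic can still yield exact decisions, here is an illustrative sketch in which analog-style noise perturbs the scores of a toy classifier without (usually) changing which score is largest. The weights and noise level are arbitrary and not drawn from any of the companies mentioned.

```python
import numpy as np

# Illustrative only: noise is fatal for an exact quantity like a bank
# balance, but often harmless for a decision that only depends on which
# output score is largest.
rng = np.random.default_rng(3)

weights = rng.standard_normal((3, 16))
x = rng.standard_normal(16)

exact_scores = weights @ x
noise = 0.02 * np.abs(exact_scores).max() * rng.standard_normal(3)
noisy_scores = exact_scores + noise

print("exact argmax:", exact_scores.argmax(), " noisy argmax:", noisy_scores.argmax())
print("exact scores:", np.round(exact_scores, 3))
print("noisy scores:", np.round(noisy_scores, 3))  # values differ, decision usually doesn't
```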

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portrays as an imminent cyber-apocalypse in which robots rebel against their human masters and take over the world, we are a long way from the science fiction world imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on Earth. But computer scientists think there is a path to creating an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neurobiology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, IBM, and several startup companies, such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that uses neuromorphic principles to deliver very powerful computing on very small semiconductor chips.
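As a rough illustration of "spikes, not bytes," here is a toy leaky integrate-and-fire neuron, the simplest spiking-neuron model; real neuromorphic hardware, and the designs of the companies named above, are far more involved.

```python
import numpy as np

# Toy leaky integrate-and-fire neuron: membrane potential integrates input
# current, leaks toward zero, and emits a discrete spike when it crosses a
# threshold. All constants are arbitrary illustration values.
rng = np.random.default_rng(4)

dt, tau, threshold, v_reset = 1e-3, 20e-3, 1.0, 0.0
v = 0.0
input_current = 1.2 + 0.3 * rng.standard_normal(1000)   # arbitrary drive

spike_times = []
for step, i_in in enumerate(input_current):
    v += dt / tau * (-v + i_in)          # leaky integration of the input
    if v >= threshold:                   # emit a spike and reset
        spike_times.append(step * dt)
        v = v_reset

print(f"{len(spike_times)} spikes in 1 s; first few at "
      f"{[round(s, 3) for s in spike_times[:3]]}")
```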

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.

Already, big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.

See the rest here:
The Future of Computing: Hype, Hope, and Reality - CIOReview

Ten-year Forecasts for Quantum Networking Opportunities and Deployments Over the Coming Decade – WFMZ Allentown

DUBLIN, Oct. 12, 2020 /PRNewswire/ -- The "Quantum Networking: A Ten-year Forecast and Opportunity Analysis" report has been added to ResearchAndMarkets.com's offering.

This report presents detailed ten-year forecasts for quantum networking opportunities and deployments over the coming decade.

Today there is increasing talk about the Quantum Internet. This network will have the same geographical breadth of coverage as today's Internet, but where the Internet carries bits, the Quantum Internet will carry qubits, represented by quantum states. The Quantum Internet will provide a powerful platform for communications among quantum computers and other quantum devices. It will also further enable a quantum version of the Internet of Things. Finally, quantum networks can be the most secure networks ever built, completely invulnerable if constructed properly.

Already there are sophisticated roadmaps showing how the Quantum Internet will come to be. At present, however, real-world quantum networking consists of three research and commercialization efforts. Quantum Key Distribution (QKD) adds unbreakable coding of key distribution to public-key encryption. Cloud/network access to quantum computers is core to the business strategies of leading quantum computer companies. And quantum sensor networks promise enhanced navigation and positioning, more sensitive medical imaging modalities, and more. This report provides ten-year forecasts for all three of these sectors.
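For readers unfamiliar with QKD, the sketch below shows the "sifting" step of BB84, the best-known QKD protocol (named here only for illustration; the report does not single out a protocol). No quantum channel, eavesdropper, or error correction is modeled.

```python
import secrets

# Highly simplified BB84-style sketch: random bits are encoded in randomly
# chosen bases, and only the positions where sender and receiver happened to
# use the same basis survive the public basis comparison ("sifting").
n = 32
alice_bits = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
bob_bases = [secrets.randbelow(2) for _ in range(n)]

# Bob measures correctly only when his basis matches Alice's; mismatched
# positions are discarded.
sifted_key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
print("sifted key:", "".join(map(str, sifted_key)))
```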

This report provides a detailed quantitative analysis of where the emerging opportunities can be found today and how they will emerge in the future.

With regard to the scope of the report, the focus is, of course, on quantum networking opportunities of all kinds. It looks especially, however, at three areas: quantum key distribution (QKD), quantum computer networking/quantum clouds, and quantum sensor networks. The forecasts also include breakouts by all the end-user segments of this market, including military and intelligence, law enforcement, banking and financial services, and general business applications, as well as niche applications. There are also breakouts by hardware, software, and services as appropriate.

In addition, there is some discussion of the latest research into quantum networking, including the critical work on quantum repeaters, which allow entanglement between quantum devices over long distances. Most experts predict repeaters will begin to appear in real-world prototypes in about five years, but this is far from certain.

This report will be essential reading for equipment companies, service providers, telephone companies, data center managers, cybersecurity firms, IT companies and investors of various kinds.

Key Topics Covered:

Executive Summary
E.1 Goals, Scope and Methodology of this Report
E.1.1 A Definition of Quantum Networking
E.2 Quantum Networks Today: QKD, Quantum Clouds and Quantum Networked Sensors
E.2.1 Towards the Quantum Internet: Possible Business Opportunities
E.2.2 Quantum Key Distribution
E.2.3 Quantum Computer Networks/Quantum Clouds
E.2.4 Quantum Sensor Networks
E.3 Summary of Quantum Networking Market by Type of Network
E.4 The Need for Quantum Repeaters to Realize Quantum Networking's Potential
E.5 Plan of this Report

Chapter One: Ten-year Forecast of Quantum Key Distribution
1.1 Opportunities and Drivers for Quantum Key Distribution Networks
1.1.1 QKD vs. PQC
1.1.2 Evolution of QKD
1.1.3 Technology Assessment
1.2 Ten-year Forecasts of QKD Markets
1.2.1 QKD Equipment and Services
1.2.2 A Note on Mobile QKD
1.3 Key Takeaways from this Chapter

Chapter Two: Ten-Year Forecast of Quantum Computing Clouds
2.1 Quantum Computing: State of the Art
2.2 Current State of Quantum Clouds and Networks
2.3 Commercialization of Cloud Access to Quantum Computers
2.4 Ten-Year Forecast for Cloud Access to Quantum Computers
2.4.1 Penetration of Clouds in the Quantum Computing Space
2.4.2 Revenue from Network Equipment for Quantum Computer Networks by End-User Industry
2.4.3 Revenue from Network Equipment Software by End-User Industry
2.5 Key Takeaways from this Chapter

Chapter Three: Ten-Year Forecast of Quantum Sensor Networks
3.1 The Emergence of Networked Sensors
3.1.1 The Demand for Quantum Sensors Seems to be Real
3.2 The Future of Networked Sensors
3.3 Forecasts for Networked Quantum Sensors
3.4 Five Companies that will Shape the Future of the Quantum Sensor Business: Some Speculations

Chapter Four: Towards the Quantum Internet
4.1 A Roadmap for the Quantum Internet
4.1.1 The Quantum Internet in Europe
4.1.2 The Quantum Internet in China
4.1.3 The Quantum Internet in the U.S.
4.2 Evolution of Repeater Technology: Ten-year Forecast
4.3 Evolution of the Quantum Network
4.4 About the Analyst
4.5 Acronyms and Abbreviations Used In this Report

For more information about this report visit https://www.researchandmarkets.com/r/rksyxu

About ResearchAndMarkets.com
ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets
Laura Wood, Senior Manager
press@researchandmarkets.com

For E.S.T. Office Hours Call +1-917-300-0470
For U.S./CAN Toll Free Call +1-800-526-8630
For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907
Fax (outside U.S.): +353-1-481-1716

Read more here:
Ten-year Forecasts for Quantum Networking Opportunities and Deployments Over the Coming Decade - WFMZ Allentown

Billionaire Investor Vinod Khosla Speaks Out On AI’s Future and the COVID-19 Economy – EnterpriseAI

Vinod Khosla, a co-founder of the former Sun Microsystems and a longtime technology entrepreneur, venture capitalist and IT sage, makes billions of dollars betting on new technologies.

Khosla shared some of his technology and investment thoughts at a recent tech conference about the future of AI in business, AI chip design and quantum computing -- and even gave some advice to AI developers and companies about how they can successfully navigate the tumultuous times of the COVID-19 pandemic. Khosla gave his remarks at an Ask Me Anything Industry Luminary Keynote at the virtual AI Hardware Summit earlier in October. The Q&A was hosted by Rene Haas, the president of Arm's IP Products Group and a former executive with AI chipmaker Nvidia.

Khosla, who is ranked #353 on the Forbes 400 2020 list, has a net worth today of $2.6 billion, largely earned through his investment successes in the tech field. He founded his VC firm, Khosla Ventures, in 2004.

Here are edited segments from that 30-minute Q&A, which centered on questions asked by viewers of the virtual conference:

Rene Haas: What has been the most significant technological advancement in AI in the last year or two? And how do you anticipate it is going to change the landscape of business?

Vinod Khosla: What's surprised me the most is the bifurcation along two lines: one that argues that deep learning goes all the way, and others who argue that AGI (artificial general intelligence) requires very different kinds [of uses]. My bet is that each will be good at certain functions. Now, I don't worry about AGI. Being a philosopher, I do worry about AI and AGI being used for most valuable economic functions human beings do. That's where the big opportunity is. What surprised me most is there's been great progress in language models and algorithms. But the outsize role of hardware in building models that are much more powerful, trillions of parameters per model, and how effective they can be, has been surprising. I'm somewhat biased because we are large investors in OpenAI. On the flip side, we are large investors in companies like Vicarious, which is taking a very different approach to AGI.

Haas: Building on that a little bit, there are a lot of AI hardware startup companies. Some are well funded, some with high burn rates. When you think about competing with the software support ecosystem, like Nvidia has, how can startups really rely on the strength of their architecture alone? What are the kinds of things that you look at in terms of guidelines for startups in this space?

Khosla: There's many different markets, you have to be clear. There is a training market in the data center. There's an inferencing market in the data center. There's a market for edge devices where the criteria are very different. And then there's this emerging area of what quantum computing might do in hardware. We can talk about any of these, but what's really interesting to me is how much innovation we are seeing. Companies like Nvidia and the big cloud providers, especially Google and others, have very strong efforts.

And probably the thing we've learned in semiconductors, having access to process technology and process nodes that others don't, that's where the software ecosystem gives them such a large advantage. It's hard for startups to compete. Now, I could be wrong, but we've tended to avoid digital architectures for the data center or for inferencing. We've looked at a dozen of those and chosen not to jump in, because there are bigger players with huge software, process, and resource advantages. On the analog side, it's a whole different ballgame. We've invested in Analog Inference. There have been multiple analog efforts. I think some haven't addressed enough of the problem to get a large enough power advantage.

So the bottom line for a startup is that to do better than Nvidia or one of the other larger players or cloud providers, you've got to offer a 20X to 100X advantage in TeraOPS per watt. I think if you're not in the hundred-TeraOPS-per-watt range, it's going to be hard to sustain a large advantage. And I see most digital efforts sitting in this one-to-10-TeraOPS-per-watt power range. So I find the edge much more promising than the data center.

Haas: What about the difficulties of startups or companies trying to enter this field? Much of it is horizontal in nature. Do they need some kind of vertical stack or some tie into the ecosystems? Do the same challenges apply, relative to being a horizontal versus vertical business or do you think there are some different opportunities there?

Khosla: I think there will be classes of algorithms. There's clearly one class of algorithms around deep learning and things like that. The question of how architecture maps to different types of algorithms, and algorithmic approaches, is a little too early to predict, and that will determine what architectures work best.

On the edge, what's clearly going to be important is power efficiency. The really high-volume markets are under five watts and $5, with a couple of hundred TeraOPS. That's the price point I look at as differentiated enough for edge devices to do a lot of interesting things. Every speaker, every microphone, every sensor. You start to see price points that go from tens of pennies to a few dollars for these very high-volume devices. I think that would be a different architecture than the stuff in the data center.
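(As a quick check on those figures, the sketch below simply combines the numbers Khosla quotes above; none of them are measured parts.)

```python
# Quick arithmetic on the figures quoted above: "a couple of hundred TeraOPS"
# in an under-five-watt edge device, versus the 1-10 TeraOPS-per-watt range
# ascribed to most digital efforts.
edge_teraops = 200                    # "a couple of hundred TeraOPS"
edge_watts = 5                        # "under five watts"
typical_digital_tops_per_watt = 10    # top of the 1-10 TOPS/W range cited

edge_tops_per_watt = edge_teraops / edge_watts
advantage = edge_tops_per_watt / typical_digital_tops_per_watt
print(f"implied edge efficiency: {edge_tops_per_watt:.0f} TOPS/W, "
      f"about {advantage:.0f}x a typical digital effort")
```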

In the data center, whether inferencing and training are the same architecture or the same software stack even, I still think it's open for debate. I think in inferencing, cost matters and efficiency matters. In training, especially for the really large algorithms, probably not so much. So, hard to tap.

And then there's this really surprising thing of what quantum computing will do, and what kinds of algorithms it will run. The thing we are most interested in is very specialized applications for quantum computing. We have one effort in drug discovery for quantum computing. I think materials science with quantum computing is going to be interesting, possibly some financial services products based on quantum computing. So, plenty of these interesting options. I think for a while we'll see more of a bifurcation, but if I were to predict five years from now, I think we'll see more unification around the types of algorithms that do certain economic tasks well.

Haas: Quantum is something that has been written about for a long time and now you're starting to see some things product-wise that are looking a bit more real. As an investor, and looking at private company opportunities around quantum, do you feel like the time is now to start investing in companies that are doing things around the hardware space in quantum? Or do you look at it and say it's still years away from being commercially viable?

Khosla: In the big company world, it's definitely time for the big companies to be investing, and they're investing heavily. But that's Microsoft, Google, IBM and others. There's also a whole slew of startups where the market and products have emerged slower. And whenever things emerge slower especially on the hardware side, the big companies have an advantage because they can catch up. Whenever it takes lots and lots of resources, then the big companies have an advantage. Autonomous driving is the one area where that's mostly true, but not completely true. We've seen some radical innovation out of startups there.

So, it depends on the pace of development of a technology or deployment. I do think the time is very ripe for quantum software applications, specialized applications, to develop. But given how complex quantum is to use, such as the interface between quantum and the regular computing world, and the full stack of software and how it runs algorithms, I think specialized algorithms will do better there.

Haas: You're obviously involved in AI chip startups. Looking at the last four years of AI chip startups, are you bullish, in general, looking back? And if so, which areas are you most excited about?

Khosla: When there's radical innovation, it's still interesting. We've seen a lot of startups, but I wouldn't say we've seen radical innovation in architectures or performance or power efficiency. And when I say power efficiency, it's really TeraOPS per watt, which is performance per watt that is really the key metric. If you see the kinds of large jumps, like 20X, 50X, 100X, then that's really interesting. Still, there's less room for it in the data center, more room for it in the edge, but every time I say something like this then some really clever person surprises me with a counter-narrative that actually is pretty compelling. So would I say I'm open for architectures? Yes. Radical changes, yes, and I think that will happen, but it's just very hard to predict today. The predictability on where things go is still low on innovation. But I always say, improbables are not unimportant. We just don't know which improbable is important. In the meantime, the traditional digital data center, even the digital edge, will probably belong to the larger players.

I do want to encourage the folks out there trying to build products. When we did the NexGen product to compete with Intel, we very quickly got to 50% market share of the under-$1,000 PC market, where we were competing on an x86 architecture with Intel. So surprises are possible, and for people who take specialized approaches in market segments, there can be very interesting innovation to be done.

Haas: How large is the economic opportunity around AI and what do you think drives it?

Khosla: I'm probably more bullish. Whether you call it AI or AGI, I think this area will be able to do most economically valuable human functions within the next decade, probably a lot sooner. They will take time, integrating into regular workflows and traditional systems and all that. But the way I look at it, if we can replace human judgment in a task, you're saving far more money than selling a chip or a computer or something. So, if you can replace a security analyst and do their job, or have one security analyst do the job of five security analysts, or have one physician do the job of five physicians, you're saving gobs of money. And then you get to share in the human labor savings, which is where the large opportunities are. That could belong to both these combination software and hardware systems. I think that opportunity is orders of magnitude larger than any estimate I've seen today.

Haas: 2020 has been a very turbulent year. What advice would you give to tech entrepreneurs who are pushing through a recession and the remarkable situation involving the COVID-19 pandemic, while trying to build a product and build a company? What advice would you give to those entrepreneurs?

Khosla: I think the best ideas survive turbulent times. I find recessions are really the times when bigger companies cut back on some of their spending. I haven't seen that happen in this particular area. That's when people with the best ideas or with passion for a particular vision, leave those companies. So, I do see very good startups during turbulent times in general. Now, one has to be just pragmatic and adapt to the times. When money's cheap, you raise lots of money. When money is not cheap or not easily available you spend less, and take more time doing some fundamental work and getting it right. Which by the way is usually a better strategy than raising lots of money.

I do think that there is lots of opportunity. I think they have to adapt to the times and be much more thoughtful, maybe even more radical in their approach. Take larger leaps, because you can take more time before you start spending the money to go to market. One of the things to keep in mind with most technologies: thinking about the technology has huge implications downstream, but takes very little money. It takes very special talent. Then there's the building of the technology. And then there's the selling, and the sales and marketing usually end up costing the most. Now's a good time to trade off for a more compelling product and postpone some of the sales and marketing while the markets are uncertain. You can't afford to spend lots of money on that. So you have to adjust strategy as an entrepreneur, and entrepreneurs do that fairly well.

Haas: What is your own investment philosophy, particularly when it comes to tech companies, and how does your overall portfolio, reflect that philosophy?

Khosla: We like the higher-risk, higher-upside things. I find investors generally reduce risk for good reasons, but that makes the consequences of success relatively inconsequential. I personally prefer larger risk, which is why I like analog right now, and making the consequences of success consequential: being 50X or 100X better than what's available in the digital domain. I do see plenty of those kinds of opportunities still. I am not discouraged. I'm actually quite encouraged about the opportunities in this area. But entrepreneurs usually find specialized paths to get to that first MVP product, that early traction, and then use it to broaden.

Haas: Model performance has been increasing slowly in the field of AI. Can you share your insights about that?

Khosla: In certain dimensions, I think that's true. When a technology plays out a certain way, it makes rapid progress in the beginning and then starts to peter out. Software models themselves are getting to a level of saturation. The progress on the hardware side, just scaling hardware, has been stunningly valuable, as GPT-3 shows. It may give more of an advantage to the large cloud providers, the people who can build 500,000-CPU/GPU systems. But that's not for everyday use. I think that still needs to be told.

There are alternative approaches that still need to be discovered. I gave you the example of Vicarious, the robotics company we've invested in. Instead of needing 10 million or 100 million cats to recognize a cat, they're saying can we do it from 10 cats? So, maybe data becomes a lot less important. And what implications does that have for hardware architectures? It's very clear to me seeing the early results at Vicarious that it is entirely possible for AI systems to learn as rapidly and with as few examples as humans do, if the architecture is different than deep learning.

My bet is different approaches will be very good at different points, and we'll see that kind of specialization of architectures. A long time ago, 25 to 30 years ago, when you looked at Lego blocks, they came in large yellow, white, red, black, and blue blocks, and there were three or four types of components. I think that's where software algorithms in AI may be today. Now, you couldn't build the Sydney Opera House out of Lego blocks back then, but then they got all these specialized components. The possibilities explode exponentially, so the combinations allow a lot more flexibility in what can happen, what systems can do. So it might be that we just need different types of algorithms to explore the capability of end-use systems. And that might have large implications for which hardware architectures work.

Hardware scaling may matter in some of these, and clever architectures may matter in others. That's why I'm tracking what quantum computing may do for algorithms. Not just your standard quantum computing, Shor's algorithm, etc., but real applications like drug discovery or materials science. Or could you do better battery materials? Those are really interesting now.

Haas: What advice do you have for first-time hardware entrepreneurs with strong architecture ideas and really smart engineers, who don't really have a track record and haven't done this before -- how do you advise them to position themselves to get into this segment?

Khosla: Silicon Valley is very good at recognizing thoughtful, clever people -- they don't have to have a track record. Most successful entrepreneurs don't have track records. So, I wouldn't be afraid of that. I don't think you need a lot of management experience. Building great teams is probably the single piece of advice I give to entrepreneurs: great and multi-dimensional teams to go after the problem, even if they haven't done it yet. Also, the cleverness of your architecture isn't as important as the end results you deliver. Can you deliver that 20X, 50X over what the traditional players will do for your market? I think people underappreciate how much of an advantage you need in your architecture to make it worthwhile to do that startup.

And one more thing. There are a whole lot of tricks, both on the models on the software side and on the hardware side. You can do hardware tricks, and there are half a dozen that are very common in hardware and half a dozen that are pretty common in software, like reducing the model size. Everybody eventually gets there. Other advantages are fundamental and long-lasting, and if you're doing the startup, focus not on the tricks that give you a 5X improvement, because others will catch up to those, either in software or hardware. Instead, focus on what will be the fundamental innovations five years from now, where you'll still have an advantage.

Read the rest here:
Billionaire Investor Vinod Khosla Speaks Out On AI's Future and the COVID-19 Economy - EnterpriseAI