Archive for the ‘Quantum Computer’ Category

New York needs to be reimagined with technology and job training – Crain’s New York Business

Our response to Covid-19 offers a similar opportunity. Although there's no doubt we must focus on addressing immediate problems (schools, contact tracing, saving small businesses), we also should put thought into New York's future. Repairing is one thing, but designing a foundation is another. The new street grid, transit reforms and development policies that came out of 9/11 attest to the importance of the latter.

New York leaders should therefore take a few steps to chart the 21st century. In addition to controlling the virus and helping people in need, we must develop a grand strategy that recognizes the economic changes that were already happening before the pandemic, and leverage them in a way that benefits everyone.

Step one: capitalizing on emerging industries. Here the tech sector is a good starting point. Not only will tech companies continue to grow, but tech will also aid and fuel the growth of every other kind of business. The areas we should invest in include cybersecurity, quantum computing, artificial intelligence, transportation and smart manufacturing; they are slated to create many jobs and will increasingly undergird every other industry. A recent study on the projected impact of quantum computing on the New York economy, for instance, found that more than 57,000 new jobs will be generated in this area during the next five years, with that number expected to keep growing as the technology advances. Policymakers and entrepreneurs need to work together to keep that momentum moving into the next decade and to create the right business conditions for New York to become an emerging tech hub.

Another way of putting this is reinvention by necessity. With more and more of our lives happening in a virtual world, the safety and efficiency challenges facing organizations have changed. Cyber threats, for example, are now a regular vulnerability for businesses and governments alike. Companies need rapid data processing like never before. Quantum computing and advanced malware detection are crucial for the economy. Not only will emerging tech generate growth, but it will also be a necessary component for the economy of tomorrow.

The next steps are doubling down on workforce development and ensuring that people can actually break into these sectors. Job openings in AI and cybersecurity don't mean much if New Yorkers aren't qualified for them. We therefore need to expand our roster of digital skills programming, which includes computer science in the classroom, boot camps for aspiring coders, and a bevy of private training classes for entrepreneurs and workers. If the tech economy is to be inclusive, we'll need to put as much emphasis on teaching people the requisite skills as we do teaching them arithmetic.

Closing the digital divide is another step. Before Covid-19, we were already spending a lot of time online. In the midst of the pandemic, that trend has been amplified. People now need speedy, affordable internet connections to do their job, go to school, pay bills and get through each day. The fact that there are disparities in internet access is an impediment to the economy and only exacerbates existing inequalities. A strong 5G network throughout the city and state would help solve that issue and ultimately allow workers to take the necessary steps to move into the tech sector.

The good news is we already have parts of the foundation. New York has nearly unlimited investment resources, and state and local leaders have shown their appreciation for what tech can do.

The key is tying all the parts together and creating a new economy that offers opportunities to all.

Lynn McMahon is the managing director of Accenture's metro New York office. Julie Samuels is the executive director of Tech:NYC.

Read more from the original source:
New York needs to be reimagined with technology and job training - Crain's New York Business

The Importance of Funding Quantum Physics, Even in a Pandemic – Inside Philanthropy

Let's get subatomic. In philanthropic circles, arcane topics such as theoretical physics and quantum mechanics have a tough time attracting significant funding. Grantseekers can find it challenging to convey to potential donors the importance of subjects that are not only outside the ken of most non-scientists, but which may not seem as pressing as emergencies like global pandemics, poverty or climate change. Even within science funding, public and private, the life sciences dominate.

But the Perimeter Institute, a center for theoretical physics based in Waterloo, Ontario, has been successfully attracting funding through a pioneering public-private funding model. We wrote about Perimeter and its approach last year in the wake of the 20-year-old institute's contribution to developing the world's first image of a black hole.

In short, Perimeter draws a blend of support from government, industry and private funders, and has become a worldwide leader in advancing talent and new discoveries in theoretical physics.

Just last week, Perimeter announced its new Clay Riddell Centre for Quantum Matter, a research hub where scientists will study the subatomic world of quantum mechanics to understand and discover new states of matter: you know, states of matter other than the familiar solid, liquid, gas and plasma that you learned about in high school. (Don't ask us to explain plasma.)

The new center is the culmination of a 10-year, $25 million investment in quantum matter research, made possible by a $10 million founding donation from the Riddell Family Charitable Foundation. Clay Riddell, who died in 2018, was a Canadian entrepreneur and philanthropist. Physicists believe that study of quantum science and matter will eventually lead to useful technologies and abilities that stretch the imagination.

That the theoretical science of today leads to the technologies of tomorrow is a key message in basic science, and especially in funding for basic science, explained Greg Dick, Perimeter's executive director of advancement and senior director of public engagement. Consider the theory of special relativity and curved space: One hundred years after Einstein proposed it, Dick said, special relativity is a necessary element of GPS navigation systems in cars and other settings. The theories of quantum mechanics led in just a few decades to the computer age. And before all that, the theories of magnetism and electricity eventually translated into practically every single thing we use every day.

"When electricity and magnetism were discovered, the problem of the day was air pollution in New York City from the manure that horse hoofs pulverized into dust," said Dick. "But fortunately, people were thinking about esoteric questions of electricity and magnetism, and that changed society."

In other words, society can ill afford to stop funding basic and theoretical science. "The exciting thing is that the time from new theory to useful technology is getting shorter," Dick said. Perhaps in a decade, the study of quantum matter could lead to solutions for next-generation quantum computers, medical diagnostics, transportation, superconductors for energy grids and cryptography for data security and communications.

But just as likely, said Dick, the study of quantum matter will enable the creation of exotic materials and technologies no one currently expects or imagines.

And this brings us to why the coronavirus pandemic, which has demanded so much of the world's attention, is helping science grantseekers connect with funders.

"Obviously, when COVID started, there was a pause (in fundraising), but interestingly, COVID has also moved the relevance and value of foundational science to the forefront of people's minds," said Dick. "Yes, the theoretical physics that we do is nuanced, but COVID has put science on a pedestal. It's actually easier to have that conversation about the value of science."

Whatever their understanding of physics, prospective donors can easily grasp the importance of the basic research that has enabled today's search for treatments and vaccines for COVID-19.

In a related manner, the COVID-19 pandemic changed the nature of social interactions with potential donors, said Dick. "In the past, we'd host big events and parties, but now, the pivot to digital communication has really opened up new ways to connect with supporters." Those person-to-person video calls can actually enable more personal and deeper conversations, he said.

Perimeter was established in 1999, seeded with $100 million from Mike Lazaridis, the founder of the Blackberry smartphone pioneer Research In Motion. Bringing the public along as enthusiastic partners was always a requirement, said Dick. "Mike's vision right at the beginning was world-class research, for sure, but he also wanted that message of foundational science baked into Perimeter from the very beginning."

As a result, Perimeter also offers classroom-ready educational resources used by teachers around the world, reaching millions of students.

Originally posted here:
The Importance of Funding Quantum Physics, Even in a Pandemic - Inside Philanthropy

Material found in paint may hold the key to a technological revolution – Advanced Science News

The waste chips of paint you strip off the walls might not be so useless after all.

Image credit: Sandia National Laboratories

For the next generation of computer processors, one persistent challenge for researchers is finding novel ways to make non-volatile memory on an ever-smaller scale. As ever-smaller processors approach a finite limit on space, and therefore on processing power, quantum computing and new materials that move away from traditional silicon chips are seen as ways to overcome this barrier.

Now, researchers at Sandia National Laboratories, California, and the University of Michigan, publishing in Advanced Materials, have taken a step toward solving this problem: a new material for processing chips used in machine-learning applications that gives these computers more processing power than conventional chips. The specific obstacle the authors wanted to overcome was the limitations of filamentary resistive random-access memory (RRAM), in which defects occur within the nanosized filaments. The team instead wanted to create filament-free bulk RRAM cells.

The material the authors use, titanium dioxide or TiO2, may sound like a rather mundane inorganic substance to readers unfamiliar with it, but it is in fact a lot more common than most people realize. If you ever watched Bob Ross's wonderful The Joy of Painting, you may be more familiar with TiO2 as titanium white, the name it is given when used as a pigment in paints. In fact, TiO2 is ubiquitous in paints, not just on the landscape artist's palette, but in house paints, varnishes, and other coatings. It is also found in sunscreen and toothpaste.

The point is, TiO2 is cheap and easy to make, which is one of the reasons this new-found application in computer technology is so exciting.

A. Alec Talin of Sandia National Laboratories, lead author of the paper, explained why this cheap, nontoxic substance is ideal for his team's novel processing chip: "It's an oxide; there's already oxygen there. But if you take a few out, you create what are called oxygen vacancies. It turns out that when you create oxygen vacancies, you make this material electrically conductive."

These vacancies can also store electrical data, a key ingredient of computing power. The vacancies are created by heating a computer chip with a titanium dioxide coating to 150 °C; through basic electrochemistry, some of the oxygen in the TiO2 coating is removed, creating oxygen vacancies.

"When it cools off, it stores any information you program it with," Talin said.

Furthermore, their TiO2-based processor not only offers a new way of processing digital information, it also has the potential to fundamentally alter the way computers operate. Currently, computers work by storing data in one place and processing that same data in another place. In other words, energy is wasted in moving data from one place to another before it can be processed.

"What we've done is make the processing and the storage at the same place," said Yiyang Li of the University of Michigan and first author of the paper. "What's new is that we've been able to do it in a predictable and repeatable manner."

This is particularly important for machine-learning and deep neural network applications, where computing power is needed for processing data rather than moving it.

Li explained: "If you have autonomous vehicles, making decisions about driving consumes a large amount of energy to process all the inputs. If we can create an alternative material for computer chips, they will be able to process information more efficiently, saving energy and processing a lot more data."

Talin also sees applications in everyday devices that are already ubiquitous. "Think about your cell phone," he said. "If you want to give it a voice command, you need to be connected to a network that transfers the command to a central hub of computers that listen to your voice and then send a signal back telling your phone what to do." With chips like these, voice recognition and other functions could instead happen right in your phone.

In an age where digital privacy is important, it may be attractive to consumers to know that sensitive data, such as the sound of their own voice, stays in their phone rather than being sent to the cloud first, where accountability and control are less clear-cut.

Like many advances in science, the discovery of this technological application of TiO2 is, as Bob Ross would call it, yet another "happy accident," one with real-world, positive applications.

Reference: Yiyang Li et al., Filament-Free Bulk Resistive Memory Enables Deterministic Analogue Switching, Advanced Materials (2020). DOI: 10.1002/adma.202003984

Quotes adapted from the Sandia National Laboratory press release.

Read more:
Material found in paint may hold the key to a technological revolution - Advanced Science News

What is Quantum Computing, and How does it Help Us? – Analytics Insight

The term quantum computing gained momentum in the late 20th century. These systems aim to harness quantum-mechanical effects to become highly efficient. They use quantum bits, or qubits, instead of the simple manipulation of ones and zeros in existing binary-based computers. A qubit can also be placed in a superposition, a state that represents both one and zero at the same time. Instead of analyzing a one or a zero sequentially, two qubits in superposition can represent four scenarios at once. So we are at the cusp of a computing revolution, where future systems will have capability beyond mathematical calculations and algorithms.
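A minimal NumPy state-vector sketch (our own illustration, not from the article) makes the counting concrete: putting each of two qubits into an equal superposition yields a joint state with four amplitudes, one per classical scenario.

```python
import numpy as np

# Sketch of superposition with plain NumPy state vectors (no quantum
# hardware or SDK involved; this is classical simulation for intuition).

zero = np.array([1.0, 0.0])                 # the |0> basis state
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)    # Hadamard gate

q = H @ zero                                # (|0> + |1>) / sqrt(2)

# The joint state of two independent qubits is the tensor (Kronecker)
# product: equal weight on all four basis states 00, 01, 10, 11.
pair = np.kron(q, q)

print(pair)       # [0.5 0.5 0.5 0.5]
print(len(pair))  # 4 amplitudes from just 2 qubits (2**n in general)
```

The doubling per added qubit (2**n amplitudes for n qubits) is the source of the exponential capacity quoted later for 50-qubit machines.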

Quantum computers also follow the principle of entanglement, which Albert Einstein referred to as "spooky action at a distance." Entanglement refers to the observation that the states of particles from the same quantum system cannot be described independently of each other. Even when they are separated by great distances, they are still part of the same system.
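The same state-vector picture can illustrate entanglement. The sketch below (again our own NumPy illustration, not from the article) builds the Bell state (|00> + |11>)/sqrt(2): the two qubits' measurement outcomes are perfectly correlated, even though neither qubit alone has a definite value.

```python
import numpy as np

# Bell state over two qubits; basis order is 00, 01, 10, 11.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)   # amplitude only on |00> and |11>

probs = bell ** 2                    # Born rule: measurement probabilities

# Only 00 and 11 ever occur: the two qubits always agree, even though
# each qubit, viewed on its own, is a 50/50 coin flip.
print(probs)                         # [0.5 0.  0.  0.5]
```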

Several nations, giant tech firms, universities, and startups are currently exploring quantum computing and its range of potential applications. IBM, Google, Microsoft, Amazon, and other companies are investing heavily in developing large-scale quantum computing hardware and software. Google and UCSB have a partnership to develop a 50-qubit computer; its state would represent 2^50, more than a quadrillion, numbers, which would take petabyte-scale memory for a modern computer to store. A petabyte is the unit above a terabyte and represents 1,024 terabytes, roughly the storage needed for hundreds of millions of digital photos. Meanwhile, names like Rigetti Computing, D-Wave Systems, 1Qbit Information Technologies, Inc., Quantum Circuits, Inc., QC Ware, and Zapata Computing, Inc. are emerging as notable players in quantum computing.
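The petabyte claim checks out with back-of-the-envelope arithmetic (our own, under the assumption of 16 bytes per complex amplitude):

```python
# An n-qubit state vector holds 2**n complex amplitudes; we assume
# double-precision complex numbers (complex128, 16 bytes each).

n_qubits = 50
amplitudes = 2 ** n_qubits            # about 1.13 quadrillion numbers
bytes_needed = amplitudes * 16
petabytes = bytes_needed / 1024 ** 5  # 1 PB = 1,024 TB = 1024**5 bytes

print(amplitudes)         # 1125899906842624
print(round(petabytes))   # 16 -> petabyte-scale, as the article says
```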

The IEEE Standards Association Quantum Computing Working Group is developing two technical standards for quantum computing. One covers quantum computing definitions and nomenclature, so we can all speak the same language. The other addresses performance metrics and benchmarking to measure quantum computers' performance against classical computers and, ultimately, against each other. New standards will be added over time as required.

The rapid growth in the quantum tech sector over the past five years has been exciting because quantum computing presents immense potential. For instance, a quantum system can help scientists conduct virtual experiments and sift through vast amounts of data. Quantum parallelism allows a quantum algorithm to perform a large number of computations simultaneously, while quantum interference combines their results into something meaningful that can be measured according to the laws of quantum mechanics. Chinese scientists are also looking to develop a quantum internet, which would be a more secure communication system in which information is stored and transmitted with advanced cryptography.

Researchers at Case Western Reserve University used quantum algorithms to transform MRI scans for cancer, allowing the scans to be performed three times faster and to improve their quality by 30%. In practice, this can mean patients won't need to be sedated to stay still for the length of an MRI, and physicians could track the success of chemotherapy at the earliest stages of treatment.

The Laboratoire de Photonique Numérique et Nanosciences in France has built a hybrid device that pairs a quantum accelerometer with a classical one and uses a high-pass filter to subtract the classical data from the quantum data. This approach could yield a highly precise quantum compass, eliminating the bias and scale-factor drifts commonly associated with gyroscopic components. Meanwhile, the University of Bristol has developed a quantum solution to growing security threats. Researchers at the University of Virginia School of Medicine are working to uncover the potential quantum computers hold for understanding genetic diseases. Scientists are also applying quantum computing to the search for vaccines for COVID-19 and other life-threatening diseases.

In July 2017, in collaboration with commercial photonics tools provider M Squared, QuantIC demonstrated how a quantum gravimeter detects the presence of deeply hidden objects by measuring disturbances in the gravitational field. If such a device becomes practical and portable, the team believes it could become invaluable in an early warning system for predicting seismic events and tsunamis.

Continued here:
What is Quantum Computing, and How does it Help Us? - Analytics Insight

The Future of Computing: Hype, Hope, and Reality – CIOReview

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law (the exponential increase in the power of computers over the last several decades) have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors optimized for specialized tasks. Writing software that takes full advantage of these new chips is extremely challenging, so companies like SambaNova Systems are developing operating systems and software compilers that optimize application code automatically and allocate resources to compute tasks dynamically, in real time, as computing demands change.

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were nicely designed for doing the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.


Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to distributed desktop and laptop computers. With the development of high-speed Internet, the thinking shifted: an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires processing massive AI models and making very accurate judgments in milliseconds. For these tasks, the new special-purpose chips discussed above and below are fighting for design wins.
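The "simplest level" of edge computing described above can be sketched in a few lines of Python (the threshold and field names here are hypothetical, purely for illustration):

```python
# Filter sensor readings locally and upload only the notable ones,
# trading a little edge compute for a lot of bandwidth and storage.

THRESHOLD = 80.0  # assumed alert level; a real deployment would tune this

def filter_at_edge(readings):
    """Keep only readings worth the bandwidth of sending to the cloud."""
    return [r for r in readings if r["value"] >= THRESHOLD]

readings = [
    {"sensor": "temp-01", "value": 21.5},
    {"sensor": "temp-02", "value": 97.3},   # anomalous: worth uploading
    {"sensor": "temp-03", "value": 22.1},
]

to_cloud = filter_at_edge(readings)
print(len(to_cloud))   # 1 -> only one of three readings leaves the device
```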

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate with analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximate values. (How much money do you have in your bank account?) Some problems, like AI inference and monitoring sensor data, do not need six-sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portrays as the imminent cyber-apocalypse where robots rebel against their human masters and take over the world, we are a long way away from the science fiction world imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on earth. But computer scientists think there is a path to create an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neuro-biology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, IBM, and several startup companies, such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that uses neuromorphic principles to deliver very high-power computing on very small semiconductor chips.

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.

Already big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.

See the rest here:
The Future of Computing: Hype, Hope, and Reality - CIOReview