Archive for the ‘Quantum Computer’ Category

Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing – NewsClick

Image Courtesy: Smithsonian Magazine. Image depicts some of the skull caps excavated from Ngandong.

In the development of science, what should matter most are the findings that help humanity: those with the potential to open up new paradigms, change our understanding of the past, or open our eyes to the future. The year 2019 witnessed several such findings in the science world.

HUMAN HISTORY THROUGH GENETICS

Tracing human history has also advanced through genetics research. The year 2019 witnessed several breakthroughs about human history based on the analysis of ancient DNA recovered from fossils and other sources.

One such important finding makes a claim about the origin of modern humans: anatomically modern humans first appeared in the southern part of Africa. A wetland that covered present-day Botswana, Namibia and Zimbabwe was where the first humans lived some 200,000 years ago, before eventually migrating out of the region. How was the study conducted? Researchers gathered blood samples from 200 living people in groups whose DNA is poorly known, including foragers and hunter-gatherers in Namibia and South Africa. The authors analyzed mitochondrial DNA (mtDNA), a type of DNA inherited only from mothers, and compared it to mtDNA in databases from more than 1,000 other Africans, mostly from southern Africa. The researchers then sorted out how all the samples were related to each other on a family tree. The data reveal that one mtDNA lineage in the Khoisan speakers, known as L0, is the oldest known mtDNA lineage in living people. The work also tightens the date of origin of L0 to about 200,000 years ago.
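As a rough illustration of the kind of analysis involved (a minimal sketch with invented toy sequences, not the study's actual data or pipeline), pairwise differences between aligned mtDNA fragments can be turned into a distance matrix and clustered into a simple family tree:

```python
# Toy distance-based clustering of aligned mtDNA fragments.
# The sequences below are invented placeholders, NOT real data.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

samples = {
    "Khoisan_1":  "ACGTACGTAC",
    "Khoisan_2":  "ACGTACGTAT",
    "SouthAfr_1": "ACGTTCGTAC",
    "EastAfr_1":  "AGGTTCGAAC",
}
names = list(samples)
seqs = list(samples.values())

# p-distance: fraction of aligned positions that differ.
n = len(seqs)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        diffs = sum(a != b for a, b in zip(seqs[i], seqs[j]))
        dist[i, j] = dist[j, i] = diffs / len(seqs[i])

# Average-linkage clustering: deeper merges stand in for older splits
# in the maternal (mtDNA) lineage.
tree = linkage(squareform(dist), method="average")
print(names)
print(tree)  # each row: cluster_a, cluster_b, merge distance, cluster size
```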

Another very important and interesting finding in this field is that Homo erectus, an ancestor of modern humans, made its last stand on the island of Java, Indonesia. The team of scientists estimated that the species survived at a place known as Ngandong, near the Solo River, based on the dating of animal fossils from a bone bed where Homo erectus skull caps and leg bones had been found earlier. Scientists used to believe that Homo erectus migrated out of Africa and into Asia some two million years ago, and that this early human ancestor went extinct around 400,000 years ago. But the new findings indicate that the species continued to exist at Ngandong until about 117,000 to 108,000 years ago.

Until recently, everything known about the Denisovans, the mysterious archaic human species, was confined to Denisova Cave in the Altai Mountains of Siberia, because remains of this ancient species had been found only in that cave. But a recent report published in Nature on the discovery of a Denisovan jawbone in a cave on the Tibetan Plateau has revealed many interesting facts about these archaic humans. The fossil has been found to be 160,000 years old, with a powerful jaw and unusually large teeth resembling those of the most primitive Neanderthals. Protein analysis of the fossil revealed that it is most closely related to the Siberian Denisovans.

Image Courtesy: dawn.com

QUANTUM COMPUTING AND SUPREMACY

Image Courtesy: Quantum magazine.

Computer scientists nowadays are concentrating on going far beyond the speed that the present generation of computing can achieve, and the principles of quantum mechanics are now being incorporated into next-generation computing. There have been some advances, but the issue in this realm that has sparked controversy is Google's claim to have achieved quantum supremacy.

Sycamore, Google's 53-qubit computer, solved a problem in 200 seconds that would have taken even a supercomputer 10,000 years. It is, in fact, only a first step, but it has shown that a quantum computer can perform a useful computation and that quantum computing does indeed solve a special class of problems much faster than conventional computers.
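The benchmark behind the claim was random circuit sampling: drawing bitstrings from the output distribution of a random quantum circuit. The toy Python sketch below (my own illustration, not Google's code) shows why this is easy for a handful of qubits but explodes classically, since the state vector needs 2^n amplitudes:

```python
# Toy random-circuit sampling on a few qubits. Sycamore used 53 qubits,
# for which the 2**53-entry state vector below would be hopelessly large.
import numpy as np

rng = np.random.default_rng(0)
n_qubits = 4
dim = 2 ** n_qubits

def haar_unitary(d, rng):
    """Random unitary matrix, a stand-in for a layer of random gates."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # multiply columns of q by unit phases

# Start in |0...0> and apply a few random "layers".
state = np.zeros(dim, dtype=complex)
state[0] = 1.0
for _ in range(5):
    state = haar_unitary(dim, rng) @ state

# Measure: sample bitstrings from the Born-rule probabilities.
probs = np.abs(state) ** 2
probs /= probs.sum()
outcomes = rng.choice(dim, size=10, p=probs)
print([format(x, f"0{n_qubits}b") for x in outcomes])
```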

On the other hand, IBM researchers have countered, saying that Google hadn't done anything special. This clash highlights the intense commercial interest in quantum computing.

NATURE, CLIMATE AND AMAZON FOREST

Image Courtesy: NASA Earth Observatory.

Man-made climate change has already reached a critical state. Climate research has shown how crossing critical thresholds would bring irreversible changes to the global climate and an accompanying disaster for humanity.

In 2019 too, the world witnessed widespread devastation in the form of storms, floods and wildfires.

Apart from the extreme weather events that climate change is driving, nature itself is in its most perilous state ever, and the reason is human-made environmental destruction.

The global assessment report by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) reviewed some 15,000 scientific papers and other sources of data on trends in biodiversity and its ability to provide people with everything from food and fiber to clean water and air.

The report notes that of the 8 million known species of animals and plants, almost 1 million are threatened with extinction, including more than 40% of amphibian species and almost a third of marine mammals.

The month of August witnessed unprecedented wildfires in the Amazon, the world's biggest rainforest. The fires were so extensive that their smoke covered nearby cities in dark clouds. Brazil's National Institute for Space Research (INPE) reported recording over 72,000 fires during the year, an increase of about 80% over the previous year. More worrisome was the fact that more than 9,000 of these fires had taken place in a single week.

The fires engulfed several large Amazonian states in northwestern Brazil. On August 11, NASA noted that the fires were large enough to be spotted from space.

The main cause of the Amazon fires is large-scale deforestation driven by policy-level changes made by the Bolsonaro regime. Many parts of the forest, even its deeper reaches, have been opened up for companies to set up business ventures, leading to massive deforestation.

NEW DIMENSION TO THE TREATMENT OF EBOLA

Image Courtesy: UN News.

In the past, there were no drugs that could cure Ebola.

However, two of four experimental treatments tested in the Democratic Republic of the Congo were found to be highly effective in saving patients' lives. The new treatments used a combination of existing drugs and newly developed ones. Known as the PALM trial, the study tested monoclonal antibodies and antiviral agents.

Monoclonal antibodies are antibodies made by identical immune cells that are all clones of a unique parent cell. They bind to specific cells or proteins, with the objective of stimulating the patient's immune system to attack those targets.

KILOGRAM REDEFINED

Image courtesy: phys.org

The kilogram, the unit of mass, was long defined by a hunk of metal in France. This hunk of metal, known as the International Prototype Kilogram or Big K, is a platinum-iridium alloy cylinder with a mass of exactly 1 kilogram, housed at the International Bureau of Weights and Measures in France since 1889. Copies of the IPK around the world are used to calibrate scales to make sure that the whole world follows a standard system of measurement.

But the definition of the kilogram is no longer the same. On World Metrology Day this year, the way the kilogram had been defined for more than a century was changed completely. The kilogram is now defined in terms of the Planck constant, a quantity of nature that does not change.
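For reference, the redefinition can be written out explicitly (standard SI relations, reproduced here as an illustration rather than taken from the article):

```latex
% The 2019 SI redefinition assigns the Planck constant the exact value
%   h = 6.626 070 15 x 10^{-34} kg m^2 s^{-1}.
% With the second fixed by the caesium frequency and the metre by the
% speed of light c, the kilogram then follows from h:
\[
  1\,\mathrm{kg}
  = \frac{h}{6.626\,070\,15 \times 10^{-34}}\,\mathrm{m^{-2}\,s}
  \;\approx\; 1.4755 \times 10^{40}\,\frac{h\,\Delta\nu_{\mathrm{Cs}}}{c^{2}} .
\]
```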

See the rest here:
Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing - NewsClick

AI, ML and quantum computing to cement position in 2020: Alibaba's Jeff Zhang – Tech Observer

From the emergence of cognitive intelligence, in-memory computing, fault-tolerant quantum computing and new materials-based semiconductor devices, to the faster growth of industrial IoT, large-scale collaboration between machines, production-grade blockchain applications, modular chip design, and AI technologies that protect data privacy, more technology advancements and breakthroughs are expected to gain momentum and have a big impact on our daily lives.

"We are in an era of rapid technology development. In particular, technologies such as cloud computing, artificial intelligence, blockchain, and data intelligence are expected to accelerate the pace of the digital economy," said Jeff Zhang, Head of Alibaba DAMO Academy and President of Alibaba Cloud Intelligence.

The following are highlights from the Alibaba DAMO Academy predictions for the top 10 trends in the tech community for this year:

Artificial intelligence has reached or surpassed humans in areas of perceptual intelligence such as speech-to-text, natural language processing and video understanding, but in the field of cognitive intelligence, which requires external knowledge, logical reasoning or domain transfer, it is still in its infancy. Cognitive intelligence will draw inspiration from cognitive psychology, brain science and human social history, combined with techniques such as cross-domain knowledge graphs, causal inference and continual learning, to establish effective mechanisms for the stable acquisition and expression of knowledge. These will enable machines to understand and utilize knowledge, achieving key breakthroughs in the progression from perceptual intelligence to cognitive intelligence.

In the von Neumann architecture, memory and processor are separate, and computation requires data to be moved back and forth between them. With the rapid development of data-driven AI algorithms in recent years, hardware has become the bottleneck in the exploration of more advanced algorithms. In a processing-in-memory (PIM) architecture, by contrast, memory and processor are fused together and computations are performed where the data is stored, with minimal data movement. As a result, computational parallelism and power efficiency can be significantly improved. We believe innovations in PIM architecture will be the ticket to next-generation AI.
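As a loose, software-level analogy for the data-movement bottleneck described above (my own sketch; real PIM works at the circuit level, not in NumPy), compare copying a buffer before operating on it with updating it in place:

```python
# Software-level analogy only: "copy then compute" vs "compute in place".
import time
import numpy as np

data = np.random.rand(20_000_000)   # ~160 MB of doubles

t0 = time.perf_counter()
moved = data.copy()                 # mimic shipping the data to the processor
moved *= 2.0                        # then do the (cheap) arithmetic
t_copy = time.perf_counter() - t0

t0 = time.perf_counter()
data *= 2.0                         # do the arithmetic where the data lives
t_inplace = time.perf_counter() - t0

print(f"copy-then-compute: {t_copy:.3f} s, in-place: {t_inplace:.3f} s")
```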

In 2020, 5G, the rapid development of IoT devices, cloud computing and edge computing will accelerate the fusion of information systems, communication systems and industrial control systems. Through the advanced Industrial IoT, manufacturing companies can automate machinery, in-factory logistics and production scheduling as a way to realize C2B smart manufacturing. In addition, interconnected industrial systems can adjust and coordinate the production capacity of both upstream and downstream vendors. Ultimately, this will significantly increase manufacturers' productivity and profitability. For manufacturers with production goods worth hundreds of trillions of RMB, a productivity increase of 5-10% would translate into additional trillions of RMB of value.

Traditional single-agent intelligence cannot meet the real-time perception and decision-making needs of large-scale intelligent devices. The development of collaborative sensing technologies for the Internet of Things and of 5G communication will enable collaboration among multiple agents: machines will cooperate and compete with one another to complete target tasks. The group intelligence brought about by the cooperation of multiple intelligent agents will further amplify the value of intelligent systems: large-scale intelligent traffic-light dispatching will allow dynamic, real-time adjustment, warehouse robots will work together to sort cargo more efficiently, driverless cars will perceive overall traffic conditions on the road, and collaborating unmanned aerial vehicles (UAVs) will handle last-mile delivery more efficiently.

The traditional model of chip design cannot efficiently respond to the fast-evolving, fragmented and customized needs of chip production. Open-source SoC design based on RISC-V, high-level hardware description languages, and IP-based modular chip design methods have accelerated the development of agile design methodologies and of an open-source chip ecosystem. In addition, the chiplet-based modular design method uses advanced packaging to combine chiplets with different functions, making it possible to quickly customize and deliver chips that meet the specific requirements of different applications.

BaaS (Blockchain-as-a-Service) will further reduce the barriers to entry for enterprise blockchain applications. A variety of hardware chips embedding core algorithms, deployed at the edge and in the cloud and designed specifically for blockchain, will also emerge, allowing assets in the physical world to be mapped onto the blockchain, further expanding the boundaries of the Internet of Value and realizing multi-chain interconnection. In the future, a large number of innovative blockchain application scenarios featuring multi-dimensional collaboration across industries and ecosystems will emerge, and large-scale production-grade blockchain applications with more than 10 million DAI (Daily Active Items) will gain mass adoption.

In 2019, the race to reach quantum supremacy brought the focus back to quantum computing. The demonstration, using superconducting circuits, boosts overall confidence that superconducting quantum computing can lead to a large-scale quantum computer. In 2020, the field of quantum computing will receive increasing investment, which will come with intensified competition. The field is also expected to see a speed-up in industrialization and the gradual formation of an ecosystem. In the coming years, the next milestones will be the realization of fault-tolerant quantum computing and the demonstration of quantum advantage on real-world problems. Either would be a great challenge given present knowledge. Quantum computing is entering a critical period.

Under the pressure of both Moore's Law and the explosive demand for computing power and storage, it is difficult for classic silicon-based transistors to sustain the development of the semiconductor industry. To date, major semiconductor manufacturers have no clear answer for chips beyond 3 nm. New materials will enable new logic, storage and interconnection devices through new physical mechanisms, driving continuous innovation in the semiconductor industry. For example, topological insulators and two-dimensional superconducting materials that can achieve lossless transport of electrons and spin could become the basis for new high-performance logic and interconnect devices, while new magnetic materials and new resistive-switching materials could enable high-performance magnetic memory such as SOT-MRAM and resistive memory.

The compliance costs imposed by recent data protection laws and regulations on data transfer are higher than ever before. In light of this, there has been growing interest in using AI technologies to protect data privacy. The essence is to enable a data user to compute a function over input data from different data providers while keeping those data private. Such AI technologies promise to solve the problems of data silos and the lack of trust in today's data sharing practices, and will truly unleash the value of data in the foreseeable future.
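One family of techniques behind this idea is secure multi-party computation. The sketch below (my own illustration, not Alibaba's technology) uses additive secret sharing so that several providers can jointly reveal only the sum of their private values, never the values themselves:

```python
# Additive secret sharing over a large prime field. Provider values are
# invented examples; only their aggregate is ever reconstructed.
import random

MOD = 2**61 - 1  # a large Mersenne prime

def make_shares(value, n_parties):
    """Split `value` into n random shares that sum to it modulo MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

providers = {"provider_A": 120, "provider_B": 75, "provider_C": 240}
n = len(providers)

# Provider k sends its i-th share to computing party i.
all_shares = [make_shares(v, n) for v in providers.values()]

# Each party adds up the shares it received; no single party learns any input.
party_sums = [sum(shares[i] for shares in all_shares) % MOD for i in range(n)]

# Combining the per-party sums reveals only the aggregate of all inputs.
print(sum(party_sums) % MOD)  # 435
```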

With the ongoing development of cloud computing technology, the cloud has grown far beyond the scope of IT infrastructure and has gradually evolved into the center of all IT technology innovation. The cloud has a close relationship with almost all IT technologies, including new chips, new databases, self-driving adaptive networks, big data, AI, IoT, blockchain, quantum computing and so forth. Meanwhile, it is creating new technologies, such as serverless computing, cloud-native software architecture, software-hardware integrated design, and intelligent automated operations. Cloud computing is redefining every aspect of IT, making new IT technologies more accessible to the public. The cloud has become the backbone of the entire digital economy.

Read the rest here:
AI, ML and quantum computing to cement position in 2020: Alibaba's Jeff Zhang - Tech Observer

Perspective: End Of An Era | WNIJ and WNIU – WNIJ and WNIU

David Gunkel's "Perspective" (January 8, 2020).

The holiday shopping is over and everyone is busy playing with their new toys. But what was remarkable about Christmas 2019 might have been the conspicuous absence of such toys.

Previous holiday seasons saw the introduction of impressive technological wonders -- tablet computers, the iPhone, the Nintendo Wii and the Xbox. But this year, there was no stand-out, got-to-have technological object.

On the one hand, this may actually be a good thing. The amount of waste generated by discarded consumer electronics is a massive global problem that we are not even close to managing responsibly. On the other hand, it may be an indication of the beginning of the end of an era -- the era of Moore's Law.

In 1965, Gordon Moore, who would go on to co-found Intel, predicted that the number of transistors on a microchip would double roughly every two years, meaning that computer chip performance would grow at an exponential rate. But even Moore knew there was a physical limit to this dramatic escalation in computing power, and we are beginning to see it top out. That may be one reason why there were no new, got-to-have technological gizmos and gadgets this holiday season.
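As a back-of-the-envelope illustration of what "doubling every two years" implies (my own toy numbers, not figures from the commentary), here is a short Python sketch:

```python
# Illustrative only: compound transistor counts under a strict
# "double every two years" rule, starting from the ~2,300 transistors
# of the Intel 4004 (1971). Real chips deviate from this idealized curve.
START_YEAR, START_COUNT = 1971, 2_300

for year in range(1971, 2030, 10):
    count = START_COUNT * 2 ** ((year - START_YEAR) / 2)
    print(f"{year}: ~{count:,.0f} transistors")
```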

Sure, quantum computing is already being positioned as the next big thing. But it will be years, if not decades, before it finds its way into consumer products. So do not ask Santa to fill your stocking with a brand-new quantum device. It will, for now at least, continue to be filled with lumps of increasingly disappointing silicon.

I'm David Gunkel, and that's my perspective.

See the article here:
Perspective: End Of An Era | WNIJ and WNIU - WNIJ and WNIU

Quanta’s Year in Math and Computer Science (2019) – Quanta Magazine

For mathematicians and computer scientists, this was often a year of double takes and closer looks. Some reexamined foundational principles, while others found shockingly simple proofs, new techniques or unexpected insights in long-standing problems. Some of these advances have broad applications in physics and other scientific disciplines. Others are purely for the sake of gaining new knowledge (or just having fun), with little to no known practical use at this time.

Quanta covered the decade-long effort to rid mathematics of the rigid equal sign and replace it with the more flexible concept of equivalence. We also wrote about emerging ideas for a general theory of neural networks, which could give computer scientists a coveted theoretical basis to understand why deep learning algorithms have been so wildly successful.

Meanwhile, ordinary mathematical objects like matrices and networks yielded unexpected new insights in short, elegant proofs, and decades-old problems in number theory suddenly gave way to new solutions. Mathematicians also learned more about how regularity and order arise from chaotic systems, random numbers and other seemingly messy arenas. And, like a steady drumbeat, machine learning continued to grow more powerful, altering the approach and scope of scientific research, while quantum computers (probably) hit a critical milestone.

Read more here:

Quanta's Year in Math and Computer Science (2019) - Quanta Magazine

From the image of a black hole to ‘artificial embryos’, 2019 was the year of many firsts in science – Economic Times

NEW DELHI: An image of a black hole, the stuff of science fiction for decades, was at the centre of a year that saw science breach new frontiers, with exciting firsts such as the development of a quantum computer that can outperform its classical counterparts and the creation of artificial embryos.

Cutting-edge innovations in research and technology celebrated science and advanced humankind's understanding of the complex realities of the universe. The year will also be remembered as one of testing biological and ethical limits in the laboratory, helping researchers find new avenues in the treatment of critical diseases.

In April, the International Event Horizon Telescope collaboration, consisting of a global network of radio telescopes, unveiled the first actual image of a black hole, a place in space where gravity pulls so much that even light cannot escape.

To produce the image, the researchers combined data from a network of radio telescopes to take simultaneous readings from around the world.

Science magazine named the image of the supermassive black hole situated at the centre of the Messier 87 galaxy, 54 million light years away, as the 2019 Breakthrough of the Year.

"The imaging of the black hole is a fantastic revelation that is simultaneously a validation and a celebration of science," Ayan Banerjee, from the Indian Institute of Science Education and Research (IISER) in Kolkata, told PTI.

"Although it does not uncover something that we did not know earlier, it does convert science fiction into science -- which is crucial for the acceptance of science in the daily lives of human beings, and the generation of future scientists," Banerjee said.

In a year that marked the 50th anniversary of the Apollo Moon landings, lunar exploration was high on the agendas of space agencies.

In January, China's Chang'e-4 probe became the first spacecraft to land safely on the far side of the Moon. Its rover Yutu-2 continues to roll across the dusty soils of Von Karman crater on the lunar body.

Other attempts to explore the Earth's natural satellite were not so successful.

In April, an Israeli-led effort to put the first private spacecraft on the Moon's surface ended in a crash landing. The same fate was met by India's ambitious Chandrayaan-2 Vikram lander in September.

The ongoing Mars missions returned a host of results. In April, NASA announced that its robotic Mars InSight lander had recorded a marsquake for the first time ever.

The 'marsquake' is the first recorded trembling that appears to have come from inside the planet, as opposed to being caused by forces above the surface, such as wind.

There were many firsts in the micro world of laboratories too.

US researchers restored cellular function in 32 pig brains that had been dead for hours, opening up a new avenue in treating brain disease -- and shaking our definition of brain death to its core.

In a study announced in April in the journal Nature, researchers at the Yale University School of Medicine described a system roughly analogous to a dialysis machine, called BrainEx, that restores circulation and oxygen flow to a dead brain.

In another out-of-body experiment, scientists grew monkey embryos in a dish for nearly three weeks -- longer than primate embryos have ever been grown in the laboratory before.

The advance raised ethical concerns about whether lab-grown human embryos should be allowed to develop beyond 14 days, a restriction imposed in most countries.

In September, researchers at the University of Michigan in the US offered a possible way around the 14-day limit by using human stem cells to make 'artificial embryos' that mimic the early development of a real human embryo.

"Our stem cell structures that mimic embryos can help fill critical gaps in knowledge about early human development, and that could lead to a lot of good," Jianping Fu, an associate professor at Michigan who led the study, said in a statement.

In October, Google took a quantum leap in computer science. Using its state-of-the-art quantum computer, called Sycamore, the tech giant claimed "quantum supremacy" over the most powerful supercomputers in the world by solving a problem considered virtually impossible for normal machines.

The quantum computer completed the complex computation in 200 seconds. That same calculation would take even the most powerful supercomputer approximately 10,000 years to finish, according to researchers from the University of California, Santa Barbara, who published their results in the journal Nature.

"A fantastic discovery has been that of Google's 53-qubit quantum computer ('quantum supremacy')," Banerjee said.

And in July, for the first time, an artificial intelligence (AI) bot beat human champions at multiplayer poker.

The AI programme developed by Carnegie Mellon University in the US in collaboration with Facebook AI defeated leading professionals in six-player no-limit Texas hold'em poker, the world's most popular form of poker.

The AI, called Pluribus, defeated poker professional Darren Elias, who holds the record for most World Poker Tour titles, and Chris Ferguson, winner of six World Series of Poker events.

In August, researchers from Oxford University and IBM Research made the first-ever ring-shaped molecule of pure carbon in the lab by using an atomic-force microscope to manipulate individual molecules.

Carbon can be arranged in a number of configurations. For example, when each of its atoms is bonded to three other carbon atoms, it forms relatively soft graphite.

A ring of carbon atoms, in which each atom is bonded to just two others and nothing else, had eluded scientists for 50 years. Their best attempts had resulted in a gaseous carbon ring that quickly dissipated.


See original here:

From the image of a black hole to 'artificial embryos', 2019 was the year of many firsts in science - Economic Times