Archive for the ‘Quantum Computer’ Category

University of Sheffield Launches Quantum Center to Develop the Technologies of Tomorrow – HPCwire

Jan. 22, 2020: The Sheffield Quantum Centre, which will be officially opened by Lord Jim O'Neill, Chair of Chatham House and University of Sheffield alumnus, is bringing together more than 70 of the University's leading scientists and engineers to develop new quantum technologies.

Quantum technologies are a broad range of new materials, devices and information technology protocols in physics and engineering. They promise unprecedented capabilities and performance by exploiting phenomena that cannot be explained by classical physics.

Quantum technologies could lead to the development of more secure communications technologies and computers that can solve problems far beyond the capabilities of existing computers.

Research into quantum technologies is a high priority for the UK and many countries around the world. The UK government has invested heavily in quantum research as part of a national program and has committed £1 billion in funding over 10 years.

Led by the University's Department of Physics and Astronomy, Department of Electronic and Electrical Engineering and Department of Computer Science, the Sheffield Quantum Centre will join a group of northern universities that are playing a significant role in the development of quantum technologies.

The University of Sheffield has a strong presence in quantum research with world leading capabilities in crystal growth, nanometre scale device fabrication and device physics research. A spin-out company has already been formed to help commercialize research, with another in preparation.

Professor Maurice Skolnick, Director of the Sheffield Quantum Centre, said: "The University of Sheffield already has very considerable strengths in the highly topical area of quantum science and technology. I have strong expectation that the newly formed center will bring together these diverse strengths to maximize their impact, both internally and more widely across UK universities and funding bodies."

During the opening ceremony, the Sheffield Quantum Centre will also launch its new £2.1 million Quantum Technology Capital equipment.

Funded by the Engineering and Physical Sciences Research Council (EPSRC), the equipment is a molecular beam epitaxy cluster tool designed to grow very high quality wafers of semiconductor materials, types of materials that have numerous everyday applications, such as in mobile phones and the lasers that drive the internet.

The semiconductor materials also have many new quantum applications which researchers are focusing on developing.

Professor Jon Heffernan from the University's Department of Electronic and Electrical Engineering added: "The University of Sheffield has a 40-year history of pioneering developments in semiconductor science and technology and is host to the National Epitaxy Facility. With the addition of this new quantum technologies equipment I am confident our new research center will lead to many new and exciting technological opportunities that can exploit the strange but powerful concepts from quantum science."

For more information on the Sheffield Quantum Centre, including how to study or collaborate with its researchers, visit: Sheffield Quantum Centre

About the University of Sheffield

With almost 29,000 of the brightest students from over 140 countries, learning alongside over 1,200 of the best academics from across the globe, the University of Sheffield is one of the world's leading universities. A member of the UK's prestigious Russell Group of leading research-led institutions, Sheffield offers world-class teaching and research excellence across a wide range of disciplines. Unified by the power of discovery and understanding, staff and students at the university are committed to finding new ways to transform the world we live in.

Source: University of Sheffield

Continue reading here:
University of Sheffield Launches Quantum Center to Develop the Technologies of Tomorrow - HPCwire

Inside the race to quantum-proof our vital infrastructure – www.computing.co.uk

"We were on the verge of giving up a few years ago because people were not interested in quantum at the time. Our name became a joke," said Andersen Cheng, CEO of the UK cybersecurity firm Post-Quantum. After all, he continued, how can you be post-something that hasn't happened yet?

But with billions of pounds, renminbi, euros and dollars (US, Canadian and Australian) being pumped into the development of quantum computers by both governments and the private sector and with that research starting to bear fruit, exemplified by Google's achievement of quantum supremacy, no-one's laughing now.

One day, perhaps quite soon, the tried and trusted public-key cryptography algorithms that protect internet traffic will be rendered obsolete. Overnight, a state in possession of a workable quantum computer could start cracking open its stockpiles of encrypted secrets harvested over the years from rival nations. Billions of private conversations and passwords would be laid bare and critical national infrastructure around the world would be open to attack.

The situation is often compared with the Y2K problem, and the impact could be disastrous. As with Y2K, no-one can be quite sure what the exact consequences will be; unlike Y2K, the timing is unclear. But with possible scenarios ranging from massive database hacks to unstoppable cyberattacks on the military, transport systems, power generation and health services, this is clearly a risk not to be taken lightly.

Critical infrastructure including power generation would be vulnerable to quantum computers

Post-quantum cryptography uses mathematical theory and computer science to devise algorithms that are as hard to crack as possible, even when faced with the massive parallel processing power of a quantum computer. However, such algorithms must also be easy to deploy and use or they will not gain traction.

In 2016, the US National Institute of Standards and Technology (NIST) launched its competition for Public-Key Post-Quantum Cryptographic Algorithms, with the aim of arriving at quantum-safe standards across six categories by 2024. The successful candidates will supplement or replace the three standards considered most vulnerable to quantum attack: FIPS 186-4 (digital signatures), plus NIST SP 800-56A and NIST SP 800-56B (public-key cryptography).

Not all types of cryptography are threatened by quantum computers. Symmetric algorithms (where the same key is used for encryption and decryption) such as AES, which are often deployed to protect data at rest, and hashing algorithms like SHA, used to prove the integrity of files, should be immune to the quantum menace, although they will eventually need larger keys to withstand increases in classical computing power. But the asymmetric cryptosystems like RSA and elliptic curve cryptography (ECC) which form the backbone of secure communications are certainly in danger.
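The need for larger symmetric keys comes from Grover's algorithm, which gives a quantum computer a quadratic speedup on brute-force key search, roughly halving the effective key length. A back-of-the-envelope sketch (a simplification; real-world security margins involve more than key length):

```python
# Grover's algorithm searches an n-bit keyspace in ~2^(n/2) steps instead of
# ~2^n, so an n-bit symmetric key offers roughly n/2 bits of quantum security.
def effective_security_bits(key_bits: int, quantum: bool = False) -> int:
    """Rough effective security of an n-bit symmetric key."""
    return key_bits // 2 if quantum else key_bits

for key_bits in (128, 256):
    print(f"AES-{key_bits}: classical ~{effective_security_bits(key_bits)} bits, "
          f"quantum ~{effective_security_bits(key_bits, quantum=True)} bits")
```

This is why AES-256 (128 bits of post-quantum security) is usually recommended over AES-128 for long-lived data.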

Asymmetric cryptography and public-key infrastructure (PKI) address the problem of how parties can exchange encryption keys where there's a chance that an eavesdropper could intercept and use them. Two keys (a keypair) are generated at the same time: a public key for encrypting data and a private key for decrypting it. These keys are related by a mathematical function that's trivial to perform in one direction (as when generating the keys) but very difficult in the other (trying to derive the private key from the corresponding public key). One example of such a 'one-way' function is factorising very large integers into primes. This is used in the ubiquitous RSA algorithms that form the basis of the secure internet protocols SSL and TLS. Another such function, deriving the relationship between points on a mathematical elliptic curve, forms the basis of ECC, which is sometimes used in place of RSA where short keys and reduced load on the CPU are required, as in IoT and mobile devices.

It is no exaggeration to say that in the absence of SSL and TLS the modern web with its ecommerce and secure messaging could not exist. These protocols allow data to be transmitted securely between email correspondents and between customers and their banks with all the encryption and decryption happening smoothly and seamlessly in the background. Unfortunately, though, factorising large integers and breaking ECC will be a simple challenge for a quantum computer. Such a device running something like Shor's algorithm will allow an attacker to decrypt data locked with RSA-2048 in minutes or hours rather than the billions of years theoretically required by a classical computer to do the same. This explains NIST's urgency in seeking alternatives that are both quantum-proof and flexible enough to replace RSA and ECC.
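To see why factorisation is the crux, here is a toy RSA round trip with deliberately tiny primes (real RSA uses 2048-bit moduli; the numbers and helper names here are purely illustrative):

```python
from math import gcd

def toy_rsa_keypair(p: int, q: int, e: int = 65537):
    """Generate a toy RSA keypair from two (tiny) primes."""
    n = p * q
    phi = (p - 1) * (q - 1)
    assert gcd(e, phi) == 1
    d = pow(e, -1, phi)          # private exponent: inverse of e mod phi(n)
    return (n, e), (n, d)

def factor_by_trial(n: int):
    """Classical attack: trial division, exponential in the bit length of n.
    Shor's algorithm would factor n in polynomial time on a quantum computer."""
    f = next(f for f in range(2, n) if n % f == 0)
    return f, n // f

public, private = toy_rsa_keypair(1009, 1013)
n, e = public
cipher = pow(42, e, n)                      # encrypt with the public key
assert pow(cipher, private[1], n) == 42     # decrypt with the private key

# An attacker who can factor n recovers the private key outright:
p, q = factor_by_trial(n)
d = pow(e, -1, (p - 1) * (q - 1))
assert pow(cipher, d, n) == 42
```

With a 2048-bit n, the trial-division step (and every known classical improvement on it) becomes infeasible; Shor's algorithm removes exactly that barrier.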

NIST is not the only organisation trying to get to grips with the issue. The private sector has been involved too. Since 2016 Google has been investigating post-quantum cryptography in the Chrome browser using NewHope, one of the NIST candidates. Last year Cloudflare announced it was collaborating with Google in evaluating the performance of promising key-exchange algorithms in the real world on actual users' devices.

Of the original 69 algorithms submitted to NIST in 2016, 26 have made it through the vetting process as candidates for replacing the endangered protocols; this number includes NewHope in the 'lattice-based' category.

One of the seven remaining candidates in the 'code-based' category is Post-Quantum's Never-The-Same Key Encapsulation Mechanism (NTS-KEM), which is based on the McEliece cryptosystem. First published in 1978, McEliece never really took off at the time because of the large size of its public and private keys (100kB to several MB). However, it is a known quantity to cryptographers, who have had plenty of time to attack it, and it's agreed to be 'NP-hard' (a mathematical term that in this context translates very roughly as 'extremely difficult to break in a human timescale, even with a quantum computer'). This is because it introduces randomisation into the ciphertext with error correction codes.

"We actually introduce random errors every time we encrypt the same message," Cheng explained. "If I encrypt the letters ABC I might get a ciphertext of 123. And if I encrypt ABC again you'd expect to get 123, right? But we introduce random errors, so this time we get 123, next time we get 789."

The error correction codes allow the recipient of the encrypted message to cut out the random noise added to the message when decrypting it, a facility not available to any eavesdropper intercepting the message.
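The interplay of random errors and error correction can be sketched with a deliberately simple stand-in: a 5x repetition code instead of the binary Goppa codes real McEliece uses. Only the error-adding and error-correcting steps are modelled here; the actual public-key machinery (scrambled generator matrices) is omitted:

```python
import secrets

REPEAT = 5       # each bit is encoded as 5 copies
MAX_ERRORS = 2   # majority vote corrects up to 2 flipped bits per group

def encode(bits):
    """Encode bits with the repetition code, then add random errors,
    as the McEliece sender does on every encryption."""
    codeword = [b for b in bits for _ in range(REPEAT)]
    rng = secrets.SystemRandom()
    for g in range(len(bits)):
        for pos in rng.sample(range(REPEAT), MAX_ERRORS):
            codeword[g * REPEAT + pos] ^= 1   # flip a random bit in this group
    return codeword

def decode(codeword):
    """Majority vote within each group of REPEAT bits removes the errors."""
    return [int(sum(codeword[i:i + REPEAT]) > REPEAT // 2)
            for i in range(0, len(codeword), REPEAT)]

message = [1, 0, 1, 1, 0]
c1, c2 = encode(message), encode(message)
print(c1 != c2)   # almost always True: fresh random errors each encryption
assert decode(c1) == decode(c2) == message
```

The legitimate receiver corrects the errors by design; an eavesdropper without the code's structure sees only randomised ciphertext.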

With today's powerful computers, McEliece's large key size is much less of an issue than in the past. Indeed, McEliece has some advantages of its own - encryption/decryption is quicker than RSA, for example - but it still faces implementation challenges compared with RSA, particularly for smaller devices. So for the past decade, Cheng's team has been working on making the technology easier to implement. "We have patented some know-how in order to make our platform work smoothly and quickly to shorten the keys to half the size," he said.

Post-Quantum has open-sourced its code (a NIST requirement so that the successful algorithms can be swiftly distributed) and packaged it into libraries to make it as 'drop-in' as possible and backwards-compatible with existing infrastructure.

Nevertheless, whichever algorithms are chosen, replacing the incumbents like-with-like won't be easy. "RSA is very elegant," Cheng admits. "You can do both encryption and signing. For McEliece and its derivatives because it's so powerful in doing encryption you cannot do signing."

An important concept in quantum resistance is 'crypto-agility': the facility to change and upgrade defences as the threat landscape evolves. Historically, industry has been the very opposite of crypto-agile: upgrading US bank ATMs from insecure DES to 3DES took an entire decade. Such leisurely timescales are not an option now that a quantum computer capable of cracking encryption could be just three to five years away.

Because of the wide range of environments, bolstering defences for the quantum age is not as simple as switching crypto libraries. In older infrastructure and applications encryption may be hard-coded, for example. Some banks and power stations still rely on yellowing ranks of servers that they dare not decommission but where the technicians who understand how the encryption works have long since retired. Clearly, more than one approach is needed.

It's worth pointing out that the threat to existing cryptosystems comes not only from quantum computers. The long-term protection afforded by encryption algorithms has often been wildly overestimated even against 'bog-standard' classical supercomputers. RSA-768, introduced in the 1970s, was thought to be safe for 7,000 years, yet it was broken in 2010.

For crypto-agility, algorithms need to be swappable

Faced with the arrival of quantum computers and a multiplicity of use cases and environments, cryptographers favour a strength-in-depth or hybridised approach. Cheng uses the analogy of a universal electrical travel plug which can be used in many different countries.

"You can have your RSA, the current protocol, with a PQ [post-quantum] wrapper and make the whole thing almost universal, like a plug with round pins, square pins or a mixture of both. Then when the day comes customers can just turn off RSA and switch over to the chosen PQ algorithm."
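One minimal sketch of such a hybrid wrapper, with hypothetical stand-ins for the two key exchanges, derives the session key from both a classical and a post-quantum shared secret, so it stays safe as long as either scheme remains unbroken:

```python
import hashlib
import secrets

# The two *_exchange functions below are placeholders for real implementations
# (e.g. an ECDH handshake and a NIST-candidate KEM); here they just return
# random bytes so the derivation step can be demonstrated.

def classical_exchange() -> bytes:
    return secrets.token_bytes(32)   # stand-in for an ECDH shared secret

def post_quantum_exchange() -> bytes:
    return secrets.token_bytes(32)   # stand-in for a PQ KEM shared secret

def hybrid_session_key(classical: bytes, pq: bytes) -> bytes:
    # Deriving the key from the concatenation of both secrets means an
    # attacker must break BOTH components to recover the session key.
    return hashlib.sha256(classical + pq).digest()

key = hybrid_session_key(classical_exchange(), post_quantum_exchange())
assert len(key) == 32
```

Dropping RSA later then means swapping out one input to the derivation, not redesigning the protocol - which is the crypto-agility Cheng describes.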

Code-based systems like NTS-KEM are not the only type being tested by NIST. The others fall into two main categories: multivariate cryptography, which involves solving complex polynomial equations, and lattice-based cryptography, which is a geometric approach to encrypting data. According to Cheng, the latter offers advantages of adaptability but at the expense of raw encryption power.

"Lattice is less powerful but you can do both encryption and signing, but it has not been proven to be NP-hard," he said, adding: "In the PQ world everyone's concluded you need to mix-and-match your crypto protocols in order to cover everything."

Professor Alan Woodward of Surrey University's Department of Computing said that it's still too early to guess which will ultimately prove successful.

"Lattice-based schemes seem to be winning favour, if you go by numbers still in the race, but there is a lot of work being done on the cryptanalysis and performance issues to whittle it down further," he said. "If I had to bet, I'd say some combination of lattice-based crypto and possibly supersingular isogeny-based schemes will emerge for both encryption and signature schemes."

Quantum mechanics can be an aid in the generation of secure classical encryption keys. Because of their deterministic nature, classical computers cannot generate truly random numbers; instead they produce pseudo-random numbers that are predictable, even if only to a tiny degree. One of Edward Snowden's revelations was that the NSA had compromised a random number generator distributed by RSA Security. More recently, weaknesses in the random number generation used to create RSA keys were discovered in some IoT devices, where one in 172 certificates was found to share a factor with another. However, a quantum random number generator (QRNG) produces numbers that are truly random, according to quantum theory, resolving this key area of vulnerability.
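The IoT weakness is exploitable with nothing more than a greatest-common-divisor computation: if two RSA moduli were generated with the same prime, a single gcd factors both instantly. A sketch with tiny illustrative primes (real moduli are 2048 bits, but the attack is identical):

```python
from math import gcd

# Two RSA moduli that accidentally share a prime factor, as happens
# when devices generate keys with poor randomness.
p_shared, q1, q2 = 1009, 1013, 1019
n1, n2 = p_shared * q1, p_shared * q2

common = gcd(n1, n2)          # recovers the shared prime in microseconds
assert common == p_shared
assert (n1 // common, n2 // common) == (q1, q2)   # both keys fully factored
```

No quantum computer required: this is a purely classical attack enabled by bad entropy, which is the gap a QRNG closes.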

QKD commonly uses polarised photons to represent ones and zeros

Whereas post-quantum cryptography is based on maths, the other major area of research interest, quantum key distribution (QKD), is rooted in physics, specifically the behaviour of subatomic particles. QKD is concerned with key exchange, using quantum mechanics to ensure that eavesdroppers cannot intercept the keys without being noticed.

In BB84, the first proposed QKD scheme and still the basis for many implementations, a quantum mechanical property of a subatomic particle, such as the polarisation of a photon, is manipulated to represent either a zero or a one. A stream of such photons, polarised at random, is then sent by one party to a detector controlled by the other.

Before they reach the detector, each photon must pass through a filter. One type of filter will allow 'ones' to pass, the other 'zeros'; as with the polarisation process, the filters are selected at random, so we'd expect half of the photons to be blocked by the filtering process. Counterintuitively, however, their quantum mechanical properties mean that even those photons that are 'blocked' by a filter still have a 50 per cent chance of passing their correct value to the detector. Thus, we'd expect an overall agreement between transmission and detection of 75 per cent (50 per cent that pass straight through plus 25 per cent that are 'blocked' but still communicate their correct value).

Once enough photons have been transmitted to produce a key of the required length, the parties compare, over a separate channel, the sequence of emitted ones and zeros with the filter used for each, discarding the individual results where they disagree. A classical symmetric encryption key is then created from the remaining string of ones and zeros. This key can then be used as an uncrackable 'one-time pad' to encrypt data such as a message or a login.

Should a man-in-the-middle intercept the stream of photons, the parties will be alerted because of the observer effect: measuring the state of a quantum particle will change it. Statistically, the number of photons registered as 'correct' by the detector will drop from 75 per cent to around 62.5 per cent, and this will be noticed when the two parties compare a random sample of their results at the end of the process. Any such discrepancy will cause the key to be rejected. Properly implemented, QKD can be considered as a provably unbreakable method of exchanging keys.
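The sifting arithmetic above can be checked with a short simulation: a simplified, purely classical model of BB84 with no eavesdropper, in which a bit measured in the wrong basis comes out random and is discarded during sifting:

```python
import random

def bb84_sift(n_photons: int, rng: random.Random):
    """Simulate BB84 transmission and sifting; returns the sifted key and
    the raw agreement rate between sent and measured bits."""
    bits = [rng.randint(0, 1) for _ in range(n_photons)]        # sender's bits
    send_bases = [rng.randint(0, 1) for _ in range(n_photons)]  # sender's bases
    recv_bases = [rng.randint(0, 1) for _ in range(n_photons)]  # receiver's bases
    # Matching basis: correct readout. Mismatched basis: 50/50 coin flip.
    measured = [b if sb == rb else rng.randint(0, 1)
                for b, sb, rb in zip(bits, send_bases, recv_bases)]
    # Sifting: keep only positions where the bases matched (~half of them).
    key = [m for m, sb, rb in zip(measured, send_bases, recv_bases) if sb == rb]
    agreement = sum(m == b for m, b in zip(measured, bits)) / n_photons
    return key, agreement

key, agreement = bb84_sift(100_000, random.Random(1))
print(f"sifted key length: {len(key)}")   # roughly half the photons survive
print(f"raw agreement: {agreement:.3f}")  # close to the 0.75 described above
```

An eavesdropper measuring in her own random bases would disturb the states and pull the agreement statistic down, which is exactly the discrepancy the parties test for.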

Switzerland is a QKD pioneer, deploying the technology to secure electoral votes as far back as 2007. The company that helped to achieve this feat, Geneva University spin-off ID Quantique (IDQ), has since become one of the main manufacturers of QKD and QRNG hardware. CEO Grégoire Ribordy has seen a recent upsurge of interest, beginning in 2016 when the European Commission unveiled its €1 billion, ten-year Quantum Flagship programme. The market is now starting to mature, he said, adding that his company boasts customers in government, finance and "other organisations that have high-value IP to protect".

There's a certain rivalry between physics and maths, between QKD and post-quantum encryption, not least because funding has been hard to come by. Being hardware-based, QKD has so far gobbled up the lion's share of the research grants, but it's possible that when NIST returns its verdicts more money will flow into PQ. Arguments also rage over the practical limits of security.

"The physicists tend to talk about QKD as being 'perfectly secure', which sets the cryptographers on edge, as there is no such thing in practice," Woodward said.

Ribordy is adamant that both techniques will be required. As with the hybrid approach to adopting algorithms, it's not an either-or situation; it all depends on the use case.

"I think they're actually complementary. Quantum crypto [another name for QKD] will provide a higher security and should be used maybe in backbone networks where there's a lot at stake, big pipes must be protected with more security, and then the quantum-resistant algorithms can find an application in areas where security is not as critical or maybe where there's less data at stake."

One company that's looking to scale up QKD on a national basis is the startup Quantum Xchange. Based in Bethesda, Maryland, it was founded in 2018 with VC funding to provide ultra-secure data networks. President and CEO John Prisco bemoaned the fact that his country, while forging ahead with quantum computers, is behind the curve when it comes to defending against them. It's possible that by 2024, when NIST selects its winning algorithms, the game will already be up.

"Everybody is saying, OK, let's fight quantum with quantum, and I subscribe to that," he said. "We've got quantum computers that are offensive weapons and quantum keys that are the defensive counterpart to that. The rest of the world outside of the United States is embracing this a lot more quickly - Europe, Japan and China."

Quantum particles are uniquely sensitive to any kind of disturbance, so while China may have successfully transmitted quantum keys between Earth and the Micius satellite, this was only possible because of ideal weather conditions at the time (although, interestingly, Woodward believes it could ultimately be the winning approach).

Particles transmitted through the more common fibreoptic cable are also limited by the tendency of the polarised photons to react with the medium. Even with the most pristine fibre, this limits real-world transmission distance to around 100km. After that, you need intermediary repeaters and 'trusted nodes' to relay the signal. Since it's not possible to directly clone quantum states, the quantum signal must be converted to classical and then back to quantum again, representing a weak point in the otherwise unbreakable chain. So trusted nodes must be very thoroughly secured, which inevitably increases costs and limits current applications. It is also possible for an attacker to interfere with emitters and detectors to corrupt the key generation process.

Other issues? Well, there's a lack of standards and certifications and the equipment is costly. Also, without some sort of secure signature process, how can parties exchanging keys be sure who they are exchanging them with? In addition, it's restricted to point-to-point communications and it's also incompatible with existing networks.

The theory is sound, said Woodward, but the engineering is still a challenge.

"It's in practice that QKD is encountering difficulties. For example, QKD is not yet at a stage where it is using single photons - it uses pulses of light. Hence, the very basis of not being able to clone the quantum state of a photon is put in question as there is more than one of them."

Woodward added that even after the kinks in QKD - be that via satellite, fibreoptic cables or over the airwaves - have been ironed out, the technology will still likely be confined to highly sensitive data and backbone networks because PQ cryptography will be easier to slot into existing infrastructure.

"Whichever [QKD] scheme proves most reliable and robust they all require that expensive infrastructure over what we have now, and so I can envisage it being used for, possibly, government communications but not for home users whose machines are picking a means to communicate securely with their bank's website," he said.

"The post-quantum schemes in the NIST competition would simply replace the software we already have in places such as TLS so the cost would be much lower, and the level of disruption needed for adoption by end-users would be far less."

However, Quantum Xchange is working on overcoming some of these limitations. The firm already operates a small number of high security QKD connections between financial institutions in New York and datacentres in nearby New Jersey over dedicated fibreoptic cables using trusted nodes to extend the reach of its QKD infrastructure. But it is also working on a hybrid system called Phio TX. This will allow the transmission of electronic quantum keys (i.e. keys created using a QRNG) or classical symmetric keys created from the quantum key via a secure channel separate from that used for the encrypted data. The idea is to make the technology more widely applicable by straddling the QKD-PQ divide and removing the point-to-point restrictions.

"The point is to be crypto-agile," Prisco said. "If a company is trying to come up with a quantum-safe strategy they can implement this product that has quantum-resistant algorithms, electronic quantum keys and optical quantum keys, so it becomes a level-of-service discussion. If you have a link that absolutely has to be protected by the laws of physics, you'd use an optical quantum key. If there's virtually no chance of someone intercepting the data with your key you could use a trusted exchange and the combination of the quantum-resistant algorithm with the quantum random number generated key is very powerful."

Edit: the original article stated the $1.2 billion National Quantum Initiative Act was passed by the House of Representatives in December 2019, whereas this took place in December 2018.

See more here:
Inside the race to quantum-proof our vital infrastructure - http://www.computing.co.uk

Tucson Morning Blend Top 5 Tech Trends you’ll love this year. Heather Rowe 1:27 – KGUN

NEW TECH STUFF TO MAKE OUR LIVES BETTER IN 2020: In the decade now drawing to a close, every part of our lives, our personal lives, our businesses and careers, became fully digital. And with the 2020s now upon us, we're going to see even more massive changes as the tech we use gets further refined and as technology that was dreamed up only recently becomes part of our daily routines! Here are five of the top technologies that IBM says will revolutionize the year and decade ahead:

1. Artificial Intelligence will turbo-charge productivity both personally, and professionally.

While artificial intelligence probably won't take your job, it will change how you work. In the coming decade, expect to see AI making its way into all sorts of workplaces around the world, automating routine tasks and freeing up your time to concentrate on the parts of your job that are more satisfying and meaningful. And there will be lots of new jobs and career possibilities for those who gain the skills to work in technology fields.

2. Blockchain will help to make the food you eat safer than ever.

Food recalls keep consumers constantly on their toes, affecting their shopping habits and calling produce and pantry items into question. But blockchain networks like IBM Food Trust (which is used by a growing number of retailers including Walmart, Albertsons and Carrefour, as well as major food suppliers like Dole) are helping to trace foods from the farm to your fork. What is blockchain? It's a digital ledger, which means consumers now have unprecedented insight into exactly where their food has come from. And it doesn't stop with food: blockchain now tracks global shipments, marriages and more. Right now we're able to track food shipments on the blockchain via apps, and in the next decade we'll see this cutting-edge technology become a part of everyday life.

3. Edge Computing will have a big impact on retail, and on the tech you use on your cell phone.

Today's consumer electronics, cars and electric vehicles, and all sorts of other digital devices are equipped with sensors that collectively generate tons of data. Today there's an estimated 15 billion intelligent devices operating on the outer edges of the network, and by 2022 that number is expected to reach 55 billion. In order to make sense of all the information from these devices, we'll see massive growth in what's called edge computing: the use of compact, efficient computer servers located at the network's edge, near these smart devices, that can process data locally instead of sending it all back to a data center via the cloud. The next decade will see a surge in edge computing, aided by the rollout of 5G technology, and while consumers won't see edge computing directly, it will transform the way retailers stock the latest goods you buy, and it will affect how cellphone carriers support mobile gaming, augmented reality and more.

4. From cloud computing to the Hybrid Cloud: what you need to know.

You know how, when you're getting ready to pack for a big trip, you need to gather stuff from all over the place to make your vacation work? You might have clothes and shoes spread out between multiple closets, your suitcase is in the basement, and your passport (which needs to stay super secure) is in a safe. Well, businesses with lots of data are the same way: they might have some info in one type of cloud, some info in another, and more stuff on three servers in two different states. That's why more and more businesses are turning to hybrid cloud: it's a technology infrastructure that makes it easy for companies to quickly access data wherever it's stored, to make it usable and easy to analyze. For consumers, this means they're being helped by retailers and companies more quickly, all with their data being safer than ever.

5. Quantum computing moves from the realm of the theoretical (and from being a sci-fi movie plotline!) into the world of practical experiments and applications.

It's not necessary to be a quantum physicist to grasp the main point of quantum computing: it seeks to solve complex problems that have been considered unsolvable using classical computers alone. IBM is a leader in making quantum technology available to industry, academia and anyone else inspired by quantum computing's potential. As the next decade unspools, we'll see quantum computing moving from the lab to the mainstream, and it will start to solve problems in chemistry, medicine and more.

Go here to see the original:
Tucson Morning Blend Top 5 Tech Trends you'll love this year. Heather Rowe 1:27 - KGUN

The case of the elusive Majorana: The so-called ‘angel particle’ still a mystery – Penn State News

UNIVERSITY PARK, Pa. A 2017 report of the discovery of a particular kind of Majorana fermion, the chiral Majorana fermion, referred to as the 'angel particle', is likely a false alarm, according to new research. Majorana fermions are enigmatic particles that act as their own antiparticle and were first hypothesized to exist in 1937. They are of immense interest to physicists because their unique properties could allow them to be used in the construction of a topological quantum computer.

A team of physicists at Penn State and the University of Würzburg in Germany, led by Cui-Zu Chang, an assistant professor of physics at Penn State, studied over three dozen devices similar to the one used to produce the angel particle in the 2017 report. They found that the feature claimed to be the manifestation of the angel particle was unlikely to be induced by the existence of the angel particle. A paper describing the research appears Jan. 3 in the journal Science.

"When the Italian physicist Ettore Majorana predicted the possibility of a new fundamental particle which is its own antiparticle, little could he have envisioned the long-lasting implications of his imaginative idea," said Nitin Samarth, Downsbrough Department Head and professor of physics at Penn State. "Over 80 years after Majorana's prediction, physicists continue to actively search for signatures of the still-elusive Majorana fermion in diverse corners of the universe."

In one such effort, particle physicists are using underground observatories to determine whether the ghost-like particle known as the neutrino, a subatomic particle that rarely interacts with matter, might be a Majorana fermion.

On a completely different front, condensed matter physicists are seeking to discover manifestations of Majorana physics in solid-state devices that combine exotic quantum materials with superconductors. In such devices, electrons are theorized to dress themselves as Majorana fermions by stitching together a fabric constructed from core aspects of quantum mechanics, relativistic physics and topology. This analogous version of Majorana fermions has particularly captured the attention of condensed-matter physicists because it may provide a pathway for constructing a topological quantum computer whose qubits (quantum versions of binary 0s and 1s) are inherently protected from environmental decoherence, the loss of information that results when a quantum system is not perfectly isolated and a major hurdle in the development of quantum computers.

"An important first step toward this distant dream of creating a topological quantum computer is to demonstrate definitive experimental evidence for the existence of Majorana fermions in condensed matter," said Chang. "Over the past seven or so years, several experiments have claimed to show such evidence, but the interpretation of these experiments is still debated."

The team studied devices fashioned from a quantum material known as a quantum anomalous Hall insulator, wherein the electrical current flows only at the edge. A recent study predicted that when the edge current is in clean contact with a superconductor, propagating chiral Majorana fermions are created and the electrical conductance of the device should be half-quantized, a value of e²/2h, where e is the electron charge and h is Planck's constant, when subject to a precise magnetic field. The Penn State-Würzburg team studied over three dozen devices with several different material configurations and found that devices with a clean superconducting contact always show the half-quantized value regardless of magnetic field conditions. This occurs because the superconductor acts like an electrical short and is thus not indicative of the presence of the Majorana fermion, the researchers said.
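As a quick numerical check of the value at the center of the dispute, the half-quantized conductance e²/2h can be computed directly from the exact SI values of e and h; this is a minimal sketch, not part of the study itself:

```python
# Compute the half-quantized conductance e^2/2h predicted for chiral
# Majorana edge modes, using the exact 2019 SI values of e and h.
e = 1.602176634e-19   # elementary charge, in coulombs (exact by definition)
h = 6.62607015e-34    # Planck constant, in J*s (exact by definition)

g_half = e**2 / (2 * h)   # half-quantized conductance, in siemens
r_half = 1 / g_half       # corresponding resistance, in ohms

print(f"e^2/2h = {g_half:.4e} S")          # about 1.94e-5 S
print(f"2h/e^2 = {r_half / 1e3:.1f} kOhm") # about 51.6 kOhm
```

The corresponding resistance, about 51.6 kilohms, is twice the von Klitzing constant h/e², which is why a superconducting short that merely halves the measured resistance can mimic the predicted signature.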

"The fact that two laboratories, at Penn State and at Würzburg, found completely consistent results using a wide variety of device configurations casts serious doubt on the validity of the theoretically proposed experimental geometry and questions the 2017 claim of observing the angel particle," said Moses Chan, Evan Pugh Professor Emeritus of Physics at Penn State.

"I remain optimistic that the combination of quantum anomalous Hall insulators and superconductivity is an attractive scheme for realizing chiral Majoranas," said Morteza Kayyalha, a postdoctoral research associate at Penn State who carried out the device fabrication and measurements. "But our theorist colleagues need to rethink the device geometry."

"This is an excellent illustration of how science should work," said Samarth. "Extraordinary claims of discovery need to be carefully examined and reproduced. All of our postdocs and students worked really hard to make sure they carried out very rigorous tests of the past claims. We are also making sure that all of our data and methods are shared transparently with the community so that our results can be critically evaluated by interested colleagues."

In addition to Chang, Samarth, Chan and Kayyalha, the research team includes Penn State faculty member Qi Li, and Würzburg faculty members Laurens Molenkamp and Charles Gould. The project relied on materials synthesis carried out at Penn State's 2D Crystal Consortium, a user facility for the synthesis of 2D quantum materials, and was funded by the U.S. National Science Foundation, the Office of Naval Research, the U.S. Department of Energy, the Army Research Office, and the European Research Council.

See the article here:
The case of the elusive Majorana: The so-called 'angel particle' still a mystery - Penn State News

Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing – NewsClick

Image Courtesy: Smithsonian Magazine. Image depicts some of the skull caps excavated from Ngandong.

In the development of science, what should matter most are the findings that help humanity, that have the potential to open up new paradigms, or that change our understanding of the past or open our eyes to the future. The year 2019 witnessed several such findings in the science world.

HUMAN HISTORY THROUGH GENETICS

Tracing human history has also been advanced through genetics research. The year 2019 witnessed several breakthroughs concerning human history, based on analyses of ancient DNA recovered from fossils and other sources.

One such important finding makes a claim about the origin of modern humans: anatomically modern humans first appeared in the southern part of Africa. A wetland that covered present-day Botswana, Namibia and Zimbabwe was where the first humans lived some 200,000 years ago. Eventually, humans migrated out of this region. How was the study conducted? Researchers gathered blood samples from 200 living people in groups whose DNA is poorly known, including foragers and hunter-gatherers in Namibia and South Africa. The authors analyzed mitochondrial DNA (mtDNA), a type of DNA inherited only from mothers, and compared it to mtDNA in databases from more than 1,000 other Africans, mostly from southern Africa. The researchers then sorted out how all the samples were related to each other on a family tree. The data reveal that one mtDNA lineage in the Khoisan speakers, called L0, is the oldest known mtDNA lineage in living people. The work also tightens the date of origin of L0 to about 200,000 years ago.

Another very important and interesting finding in this field is that Homo erectus, the closest ancestor of modern humans, made its last stand on the island of Java, Indonesia. The team of scientists estimated that the species survived at a place known as Ngandong near the Solo River, based on dating of animal fossils from a bone bed where Homo erectus skull caps and leg bones had been found earlier. Scientists used to believe that Homo erectus migrated out of Africa into Asia some two million years ago, and that this early human ancestor became extinct around 400,000 years ago. But the new findings indicate that the species continued to exist in Ngandong until about 117,000 to 108,000 years ago.

Until recently, everything known about the Denisovans, the mysterious archaic human species, was confined to the Denisova Cave in the Altai Mountains of Siberia, because remnants of this ancient species had been discovered only in fossils from that cave. But a recent report published in Nature on the discovery of a Denisovan jawbone in a cave on the Tibetan Plateau has revealed many interesting facts about archaic humans. The fossil has been found to be 160,000 years old, with a powerful jaw and unusually large teeth resembling those of the most primitive Neanderthals. Protein analysis of the fossil revealed that it is closely related to the Siberian Denisovans.

Image Courtesy: dawn.com

QUANTUM COMPUTING AND SUPREMACY

Image Courtesy: Quantum magazine.

Computer scientists nowadays are concentrating on going far beyond the speeds that the present generation of computing can achieve, and the principles of quantum mechanics are being incorporated into next-generation computing. There have been some advances, but the issue in this realm that has sparked controversy is Google's claim to have achieved quantum supremacy.

Sycamore, Google's 53-qubit computer, solved a problem in 200 seconds that would have taken even a supercomputer 10,000 years. It is only a first step, but it has shown that a quantum computer can do a functional computation, and that quantum computing does indeed solve a special class of problems much faster than conventional computers.

On the other hand, IBM researchers have countered that Google had not done anything special, arguing that the same task could be simulated on a classical supercomputer in a matter of days rather than millennia. This clash indeed highlights the intense commercial interest in quantum computing.

NATURE, CLIMATE AND AMAZON FOREST

Image Courtesy: NASA Earth Observatory.

Man-made climate change has already reached a critical state. Climate research has shown how crossing this critical threshold would bring irreversible changes to the global climate and an accompanying disaster for humanity.

In 2019, too, the world witnessed much devastation in the form of storms, floods and wildfires.

Apart from the extreme weather events that climate change is driving, nature itself is in its most perilous state ever, and the reason is human-made environmental destruction.

The global report by the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) reviewed some 15,000 scientific papers and other sources of data on trends in biodiversity and its ability to provide people with everything from food and fiber to clean water and air.

The report notes that of the 8 million known species of animals and plants, almost 1 million are threatened with extinction, including more than 40% of amphibian species and almost a third of marine mammals.

The month of August witnessed unprecedented wildfires in the Amazon rainforest, the biggest in the world. The fires were so large-scale that their smoke covered nearby cities in dark clouds. Brazil's National Institute for Space Research (INPE) recorded over 72,000 fires this year, an increase of about 80% from last year. More worrisome is the fact that more than 9,000 of these fires took place in the last week alone.

The fires engulfed several large Amazonian states in northwestern Brazil. NASA noted on August 11 that the fires were large enough to be spotted from space.

The main cause of the Amazon fires is wide-scale deforestation, driven by policy changes made by the Bolsonaro regime: many parts of the forest, even its deeper reaches, have been opened up for companies to set up business ventures. This has led to massive deforestation.

NEW DIMENSION TO THE TREATMENT OF EBOLA

Image Courtesy: UN News.

In the past, there were no drugs that could cure Ebola.

However, two of four experimental treatments trialled in the Democratic Republic of Congo were found to be highly effective in saving patients' lives. The new treatment approach used a combination of existing drugs and newly developed ones. Named the PALM trial, it uses monoclonal antibodies and antiviral agents.

Monoclonal antibodies are antibodies made by identical immune cells that are all clones of a unique parent cell. They bind to specific cells or proteins, the objective being to stimulate the patient's immune system to attack those cells.

KILOGRAM REDEFINED

Image courtesy: phys.org

The kilogram, the unit of mass, was long defined by a hunk of metal in France. This hunk of metal, known as the International Prototype Kilogram or Big K, is a platinum-iridium alloy with a mass of exactly 1 kilogram, housed at the International Bureau of Weights and Measures in France since 1889. The IPK has many copies around the world, which are used to calibrate scales so that the whole world follows a standard system of measurement.

But the definition of the kilogram is no longer the same. On World Metrology Day this year, the way the kilogram has been defined for more than a century was changed completely: the kilogram is now defined using the Planck constant, a quantity that does not change.
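To illustrate how fixing the Planck constant pins down a unit of mass, here is a minimal sketch (not the actual Kibble-balance procedure used by metrologists) relating one kilogram to defined constants via E = hν = mc²:

```python
# Sketch: in the revised SI, h and c have exact defined values, so a mass
# can be expressed through them. One illustration: the photon frequency
# whose energy equals the rest energy of 1 kilogram (h*nu = m*c^2).
h = 6.62607015e-34   # Planck constant, J*s (exact by definition since 2019)
c = 299792458.0      # speed of light, m/s (exact by definition)

m = 1.0                       # one kilogram
nu = m * c**2 / h             # frequency equivalent to 1 kg, in hertz
print(f"nu = {nu:.4e} Hz")    # about 1.356e50 Hz
```

Because h and c are now exact numbers rather than measured quantities, this relation runs in reverse: any laboratory that can realize the relation between energy and frequency can realize the kilogram without reference to the metal artifact.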

See the rest here:
Year 2019 in Science: History of Humans, Ebola Treatment and Quantum Computing - NewsClick