Archive for the ‘Quantum Computing’ Category

DoD Buys Two New Supercomputers That Rank Among Its Most Powerful Ever – Breaking Defense

Sandia National Laboratory Computer Annex conducts the hourly walk-through of the Thunderbird supercomputer at 2 a.m.

WASHINGTON: The Pentagon recently completed a $68 million acquisition of two new supercomputing platforms and related technical services that rank among its most powerful supercomputers ever and will be among the top 100 performers globally.

"These are significant assets," Kevin Newmeyer, deputy director of the Defense Department's High Performance Computing Modernization Program (HPCMP), told Breaking Defense. "They bring to us an increase in our computing capacity and the latest advanced chips for artificial intelligence work, and storage to support applications of both computational and machine learning concepts within the same computer, that we hope will deliver products and services to the warfighter faster."

It's the HPCMP's job to give DoD military, civilian, and defense contractor scientists, engineers, and technologists access to such supercomputers to solve some of the military's most computationally complex problems.

The problems range from climate/weather/ocean modeling and simulation, space/astrophysical sciences, and acoustics to signal/image processing, data/decision analytics, and electronics, networks, and C4I systems. Newmeyer said the most common use case is computational fluid dynamics, which is required for making complicated calculations in areas such as aircraft and ship design and engineering.

For the latest acquisition, the Pentagon chose Penguin Computing's TrueHPC supercomputing platform. The two new supercomputers, according to the company, will provide DoD with a combined total of over 365,000 cores, more than 775 terabytes of memory, and a total of 47 petabytes of high-performance storage, including over 5 petabytes of high-performance flash storage.

"That's about 150,000 computers all stacked together, operating as one thing," Newmeyer said. "If you laid them end to end, you would work your way pretty much across the country."

What does all that compute power get you? An additional 17.6 petaFLOPS, in total. FLOPS, or floating point operations per second, are the standard measure of a supercomputer's performance. FLOPS are determined by how many real numbers a computer can process per second while accounting for the trade-off between range and precision of calculations.

"FLOPS are a measure of computational power for solving computer-based problems. It's the horsepower of a machine," Penguin's Vice President of Federal Sales Tom Ireland told Breaking Defense.

A petaFLOPS is one quadrillion (1,000,000,000,000,000) FLOPS. To put that in perspective, HPCMP currently has a total capacity across all of its supercomputers of approximately 100 petaFLOPS, according to Newmeyer. That includes the Navy's most powerful (known) supercomputer, Narwhal, which is capable of 12.8 petaFLOPS. The known part of the Air Force's most powerful supercomputer, Mustang, is capable of 4.87 petaFLOPS. (Part of Mustang is classified, Newmeyer noted.) Penguin's two TrueHPC supercomputers, expected to register at 8.5 petaFLOPS and 9 petaFLOPS, will be two of HPCMP's most powerful computers ever, Ireland said.
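
To make the unit conversion and the scale of the upgrade concrete, here is a minimal back-of-the-envelope sketch in Python. It uses only the figures quoted above; the split of 8.5 and 9 petaFLOPS between the two sites is not specified in the article, and none of these numbers are official benchmark results.

```python
# Rough arithmetic with the reported figures; not official LINPACK benchmark results.
PETA = 10 ** 15  # one petaFLOPS = one quadrillion floating point operations per second

new_systems_pflops = [8.5, 9.0]        # the two TrueHPC platforms, per Ireland
existing_capacity_pflops = 100.0       # approximate HPCMP-wide capacity, per Newmeyer

added = sum(new_systems_pflops)
print(f"Added capacity: {added:.1f} petaFLOPS "
      f"= {added * PETA:.2e} floating point operations per second")
print(f"Relative to prior HPCMP capacity: {added / existing_capacity_pflops:.0%}")
```

Summing the two rounded per-system figures gives 17.5 petaFLOPS, consistent with the 17.6 petaFLOPS total quoted above once rounding is accounted for.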

According to the Top500 Project, the fastest supercomputer in the world, as of June 2021, is Japan's Fugaku, which registered 442.01 petaFLOPS in November 2020, taking the top spot from IBM's Summit (148.6 petaFLOPS), which is housed at the Department of Energy's Oak Ridge National Laboratory.

The Pentagons upgrade in supercomputing power comes amid an intense technological race against near-peer rival China. According to the Top500, China currently leads the world in the total number of supercomputers with 188, but when ranked by performance, the US has five of the top 10 most powerful supercomputers in the world, while China has two of the top 10. No other country has more than one in the top 10.

Ireland noted that Penguin, which has been building supercomputers for 20 years, has for years been running programs at the Department of Energy, which has the most powerful (known) supercomputers in the US. Fifteen of Penguin's debuts over 20 years have made the Top500, and were DoD to run official benchmarks on these two new supercomputers, they would rank within the top 100 worldwide, Ireland said.

The Navy's DoD Supercomputing Resource Center (DSRC) at Stennis Space Center in Mississippi will house one of the new platforms, while the other will go to the Air Force Research Lab's DSRC at Wright-Patterson Air Force Base in Dayton, Ohio.

But first Penguin has to build, deploy, and integrate them into HPCMP's network, known as the Defense Research and Engineering Network (DREN). Ireland said Penguin's TrueHPC consists of about 1,500 nodes, which must be engineered to work as one giant machine.

"The trick with distributed computing (meaning it's taking what heretofore was done on a mainframe-style computer, where it's all on a board, and it's broken up into separate, discrete servers) is making sure that is an adequate platform for any given application," Penguin's Chief Strategy Officer Matt Jacobs told Breaking Defense. "To make sure that balance between the elements is right and there's an appropriate amount of compute to solve the problem."

Jacobs said some of the key elements include data patterns, network traffic, and storage capacity, which all must be brought together in a way that doesn't strand investment in any given element of those resources and that makes it an effective production platform for the workload application. "That's really the art," he added.
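
The balancing act Jacobs describes is, at the programming level, what message-passing frameworks expose: one problem split across many discrete servers, with the partial results recombined so the cluster behaves like a single machine. A minimal sketch in Python using mpi4py (the workload, problem size, and process counts here are illustrative assumptions, not details of the TrueHPC systems):

```python
# Illustrative sketch: split one computation across many processes (potentially on
# many nodes), then combine the partial results on a single root process.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()        # this process's ID within the job
size = comm.Get_size()        # total number of processes across all nodes

N = 10_000_000                # overall problem size (assumed for illustration)
chunk = N // size             # each process takes an equal slice of the work

# Each process sums the squares of its own slice of indices.
start = rank * chunk
local = np.arange(start, start + chunk, dtype=np.float64)
local_sum = float(np.sum(local ** 2))

# reduce() combines the per-process partial sums into one total on rank 0.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print(f"Sum of squares over {size * chunk} terms from {size} processes: {total:.3e}")
```

Launched with, for example, mpirun -n 8 python sketch.py, each process works on its own slice; in a production machine, the hard part Jacobs points to is sizing compute, network, and storage so that none of those resources sits stranded.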

Jacobs said that Penguin generally builds these types of platforms in a couple of months, but like many companies worldwide, Penguin has encountered challenges in the global supply chain, especially around chips. Jacobs and Ireland said the supply chain hiccups are beyond the company's control, but said they still wouldn't significantly delay the project.

Notably, the platforms will include over 100 NVIDIA graphics processing units, or GPUs, to bolster DoD's AI and machine learning capabilities, Ireland said.

Ultimately, Ireland said, the project is about keeping the US warfighter equipped with state-of-the-art technologies to solve compute problems. "We're keeping our warfighters current. You don't want them fighting wars with F-14s when there's F-22s."

It's unclear how long the era of supercomputers will last, as the US and China, among others, race ahead towards quantum computing, which uses quantum mechanics to make a technological leap in processing power. But Newmeyer said he's not concerned traditional supercomputing platforms will become obsolete anytime soon.

"You'll still have a use for these types of machines," he said. "Any quantum computer built in the near future is going to be highly expensive to operate, and [quantum computers] are only more useful for certain applications: maybe in some stuff around hypersonics, certainly cryptology, navigation; there quantum has a key role. But for general computation, [quantum] is an awful lot of money."

Read the rest here:
DoD Buys Two New Supercomputers That Rank Among Its Most Powerful Ever - Breaking Defense

Why We Live In A Golden Age Of Innovation – Forbes

Self-organizing team

Despite horrific daily headlines, the important news is that we are living in a golden age of innovation. It has been called the digital age, among other names, although that label is wrongly taken to imply that this is just about technology. In reality it's a combination of new technology and a radically different kind of management. It is the combination of the two, new technology and new management, that makes things fundamentally different from what was happening in the 20th century.

The new digital technologies are amazing: the Internet, the cloud, algorithmic decision making, blockchain, artificial intelligence, quantum computing, and 3-D printing. Moreover, the technologies are evolving rapidly and, as Vivek Wadhwa et al. point out in their book From Incremental to Exponential (Berrett-Koehler, 2020), they are interacting with each other to create even further possibilities.

Yet alone, technology doesn't make much difference. We have learned over the last two decades that few benefits ensue from the new technologies unless there is also a different mindset towards managing: human beings creating value for other human beings. It is a new meta-model of management that is fundamentally different from industrial-era management, which was based on efficiency and outputs. Firms such as Haier and Microsoft, among scores of other firms, have demonstrated the new way, not just in tech, but in cars, finance, health, agriculture, music, movies, retail, restaurants, gaming and hoteling. The new management mindset includes:

Embracing a goal of creating value for the customer as the primary foundation of everything the firm does.

Unleashing talent in self-organizing teams and micro-enterprises. Talent is now driving strategy, rather than vice versa.

Operating as a network of competence, rather than a hierarchy of authority.

Enabling a firm to create new businesses, new business models, platforms and ecosystems, and managing data as an asset.

And once both technology and management are in place, then dramatic benefits start appearing, along three dimensions.

First, the impressive benefits for customers are almost magic, transforming how we work, how we communicate, how we get around, how we shop, how we play games, how we deliver health care and education, how we raise our children, how we entertain ourselves, how we read, how we listen to music, how we watch theater, go to the movies, and how we worship: in short, how we live. This combination of new technology and new ways of running companies is changing most of our lives, even those in developing countries.

Second, it has changed the workplace, potentially for the better. When those doing the work are collaborating in self-organizing teams, focused on delivering value for customers, work can be meaningful and uplifting. At its best, human beings are delivering value for other human beings, as opposed to individuals producing things in accordance with instructions from bosses.

Third, it is much more profitable for the firms themselves, once they get fully into this mode. Microsoft is a striking example. It made a commitment to the new way in 2014, under the leadership of CEO (and now chairman) Satya Nadella. Since his taking over and implementing this different way of running a company, Microsoft has added $1.5 trillion to its market capitalization.

Some critics ask whether there is anything really new in the new way of managing. And indeed, management innovators have been working on some of these changes for at least a century, beginning with Mary Parker Follett in the 1920s. Yet until recently, there was little enduring success, as Art Kleiner noted in his book, The Age of Heretics (Jossey-Bass, 2008).

As Gary Hamel explained last year: "You can go back and read about the precursors of Agile management, about early attempts at building self-managing teams, about more participatory decision structures. A lot of this work, in the '60s and '70s, produced extraordinary results (huge gains in productivity and engagement), but few of the changes scaled up. Most of these efforts were ultimately aborted or marginalized. In the end, the empire struck back."

Thus, this new meta-model of management isn't just another variant of 20th century management. True, it is not yet everywhere. But the extraordinary gains being made by firms that have embraced the new meta-model create massive incentives for other firms to make the shift, as well as huge disincentives not to make the shift.

What makes Microsoft such a remarkable story is that it is often cited as an example of a stagnant bureaucracy that would never change. Its transformation shows that change is possible.

Meanwhile, many big old industrial-era firms are struggling. That's why it's dangerous to think of the new age as the fourth industrial age, pace Klaus Schwab and his book The Fourth Industrial Revolution (Currency, 2017). If firms think of this as a continuation, or evolution, of the industrial age, they are unlikely to succeed with the new meta-model of management needed for handling digital technologies.

Nobel-Prize-winning economist Edmund Phelps has suggested in his book, Mass Flourishing: How Grassroots Innovation Created Jobs, Challenge, and Change (Princeton, 2013), that we could be on the brink of a Mass Human Flourishing. And when we see what is happening in the best firms today, it begins to look plausible. But there is also a legitimate concern that we could be on the brink of a Mass Human Repression, if these new technologies were to be used for malign purposes. This is a choice that societies around the world are now facing. Are we heading, as Phelps suggests, towards a Mass Human Flourishing? Or a Mass Human Repression? The choice is ours. But to make that choice, we first have to understand it.


See the original post here:
Why We Live In A Golden Age Of Innovation - Forbes

View from Washington: Aukus looms over AI and quantum – E&T Magazine

The new US-UK-Australia alliance is set to shake up how all three countries carry out research in key emerging technologies.

Most of the talk has been about submarines, but another important aspect of the new Aukus alliance between Australia, the UK and the US is that it defines emerging technologies, particularly artificial intelligence and quantum computing, as first-order national security issues.

As Tom Tugendhat, chair of the Commons Foreign Affairs Committee, said in a Twitter thread:

"Bringing together the military industrial complex of these three allies is a step change in the relationship. We've always been interoperable, but this aims at much more. From artificial intelligence to advanced technology, the US, UK and Australia will now be able to cost save by increasing platform sharing and innovation costs. Particularly for the smaller two, that's game-changing."

Tugendhat is right. The game has changed, and in ways that are only just coming to light. For example, digital innovation has been driven by communications, e-commerce, consumer electronics and the PC since the mid-1980s, even though the sector originally depended on the defence industry. This alliance puts government and security back at the forefront.

In other ways though, Aukus reflects a consolidation of how the technology landscape has evolved during the last five years amid greater competition between China and the West and recurrent talk of decoupling.

As well as restricting the US activities of several Chinese companies through its Entities List (most notably, but not exclusively, Huawei), Washington has blocked Chinese-led takeovers of companies it considers particularly sensitive, such as Lattice Semiconductor. Aukus itself is then consistent with the March recommendation of the US National Security Commission on AI, chaired by former Google chief Eric Schmidt, that the US needed to not just increase its own efforts but also "rally our closest allies and partners to defend and compete in the coming era of AI-accelerated competition and conflict".

UK regulators are still mulling over US company Nvidia's proposed $54bn bid for Cambridge-based Arm, partly for national security issues (more on that later), and Prime Minister Boris Johnson has launched probes into a clutch of others. These include a Chinese-backed deal for semiconductor manufacturer Newport Wafer Fab and one with suggested Chinese involvement, the takeover of a Welsh graphene specialist, Perpetuus Group.

For its part, China has hardly sought to hide that it views AI and other emerging technologies as key to defence as well as future economic prosperity. Its New Generation Artificial Intelligence Development Plan was published in 2017 and calls for the country to be the world leader in the sector by 2030. It became a policy priority following the 2016 defeat of the world Go champion, Lee Sedol, in a tournament against AlphaGo, an AI developed by UK-based Google subsidiary DeepMind Technologies. China's military leaders see Go as an important proving ground in the development of strategic thinking for the battlefield.

The landscape has changed greatly since, also in 2016, Theresa May's government nodded through SoftBank's original acquisition of Arm, back then passing over concerns raised in the Ministry of Defence similar to those being taken more seriously today.

With the AI race well under way, whether you like it or not, the consolidation within Aukus of the research efforts of the three countries not only promises the technological benefits Tugendhat identifies but also feels like a necessary acceleration.

But there will be a price.

The cycle for delivering consumer and other branches of civilian innovation has shortened from the 18 months in Moore's Law to one that is now, to all intents and purposes, annual. Of late, defence applications have often used the benefits of programmable logic to which various secret combinations of spices and sauces would be added. However, it has been clear for a while that the worlds of hardware and software are eliding for AI, and quantum computing will require a shift to entirely new architectures. As a result, what companies can and cannot release to the public, and when, is likely to come under much tighter official scrutiny. Time-to-market vs. Defence of the Realm(s).

Consolidation as well as greater cooperation across the three countries is also a possibility, and this brings things back to Arm-Nvidia. Given that the two are world-class companies operating in the technology spaces covered by Aukus, and given the environment the alliance seeks to create, it may be much harder for UK regulators to block the deal. Indeed, they may now want to encourage it. Meanwhile the EU, which has serious antitrust concerns over the union of the leading IP provider with a leading chipmaker, may feel, understandable French anger notwithstanding, that there is too much political risk in objecting, particularly with some members nervous about the extent of President Joe Biden's commitment to Nato.

Then, some of the more notable consequences may be for the global research infrastructure, one that had become increasingly freewheeling since the fall of the Berlin Wall.

Some familiar voices are already proclaiming Aukus as evidence of the Brexit dividend. Never mind the facts that technological collaboration between the three members is already taking place through the Five Eyes intelligence alliance (with New Zealand and Canada, both not part of this agreement); that the US and UK have been sharing the nuclear propulsion research covered since 1958 and already overlap hugely in defence research (e.g., BAE Systems); and that the technological and national security trends in AI and quantum have only surfaced since the referendum vote (Let's spend £250m a week on R&D, anyone? Anyone?)

That said, as emerging technologies are considered more sensitive, governments are going to reconsider how far they can go in undertaking certain types of cutting-edge work through multinational economic bodies like the EU and other civil partnerships rather than military alliances operating under strict secrecy. Just how open exchanges in technical conferences covering those areas can be in future is also now even more up for debate.

These issues have always been there. And they have always been tricky. But are we at a point where they are about to be as tricky as they were half a century ago, and when those who knew how to navigate such territory have either retired or passed away? And, of course, we do not yet know where any boundaries are going to be set.

Many in the UK technology sector will see Aukus as a great opportunity. They are probably right to do so. But, even if not entirely in public view, the three powers involved need to communicate how they expect commercial and academic collaboration to work, clearly and, given the alliance positions the areas within its scope as pressing and serious, quickly. Submarines may look like item one on the agenda, but everything else is equally immediate.

Once you've shaken things up, you must still reorder them. Ad hoc simply isn't an option. The months ahead will be busy ones. Well, they had better be.


Follow this link:
View from Washington: Aukus looms over AI and quantum - E&T Magazine

Ohio State-led QuSTEAM initiative awarded $5 million from NSF – The Ohio State University News

A multidisciplinary, multi-institutional program led by The Ohio State University is taking the next step in its aim to develop a diverse, effective and contemporary quantum-ready workforce by revolutionizing and creating more equitable pathways to quantum science education.

QuSTEAM: Convergence Undergraduate Education in Quantum Science, Technology, Engineering, Arts and Mathematics, was awarded a $5 million cooperative agreement over a two-year period from the National Science Foundation's (NSF) Convergence Accelerator. Following QuSTEAM's initial assessment period, Phase I, the award will fund Phase II's objective to build transformative, modular quantum science degree and certification programs.

"I know from personal experience that collaboration is the key to scientific success. Working across disciplines, especially when it comes to the highly complex and multidisciplinary world of quantum science research, will help us more quickly harness the enormous power of this emerging field and deliver real-world results more quickly and efficiently," said Ohio State President Kristina M. Johnson. "As an added bonus, this project enables Ohio State to further part of its core mission, which is to educate the next generation of researchers through educational opportunities that advance diversity and workforce development."

The rapidly evolving field of quantum information science will enable technological breakthroughs and have far-reaching economic and societal impacts, what researchers at the National Institute of Standards and Technology refer to as the second quantum revolution. Ohio State is emerging as a key leader in pushing the field forward, recently joining the Chicago Quantum Exchange, a growing intellectual hub for the research and development of quantum technology, as its first regional partner.

"NSF's Convergence Accelerator is focused on accelerating solutions toward societal impact. Within three years, funded teams are to deliver high-impact results, which is fast for product development," said Douglas Maughan, head of the NSF Convergence Accelerator program. "During Phase II, QuSTEAM and nine other 2020 cohort teams will participate in an Idea-to-Market curriculum to assist them in developing their solution further and to create a sustainability plan to ensure the effort provides a positive impact beyond NSF funding."

"QuSTEAM is a great example of how universities and industry can work together to build the foundation for a strong, diverse workforce," said David Awschalom, the director of the Chicago Quantum Exchange and Liew Family Professor in Molecular Engineering and Physics at the University of Chicago. "Innovations in this field require us to provide broadly accessible quantum education, and QuSTEAM represents an ambitious approach to training in quantum engineering."

Unlocking that potential, however, also requires a foundational shift in teaching and growing a quantum-literate workforce. QuSTEAM brings together scientists and educators from over 20 universities, national laboratories, community colleges, and historically Black colleges and universities (HBCUs) to develop a research-based quantum education curriculum and prepare the next generation of quantum information scientists and engineers. The initiative also has over 14 industrial partners, including GE Research, Honda and JPMorgan Chase, and collaborates with leading national research centers to help provide a holistic portrait of future workforce needs.

"We have leaders in quantum information and STEM education, and both of these groups independently do good work building undergraduate curriculum, but they actually work together surprisingly rarely," said QuSTEAM lead investigator Ezekiel Johnston-Halperin, professor in the Department of Physics at Ohio State. "We are talking to people in industry and academia about what aspects of quantum information are most critical, what skills are needed, what workforce training looks like today and what they expect it to look like a couple years from now."

"We feel strongly about the need for redesigning quantum science education, which is the objective of QuSTEAM," said Marco Pistoia, head of the Future Lab for Applied Research and Engineering (FLARE) at JPMorgan Chase. "The complexity of the quantum computing stack is enabling the creation of many new job opportunities. It is crucial for quantum curricula nationwide to collectively support this multiplicity of needs, but for this to happen, quantum scientists and engineers must have the proper training. We are very excited to see the impact of QuSTEAM's work in the near and long term, considering finance is predicted to be the first industry sector to start realizing significant value from quantum computing."

QuSTEAM is headed by five Midwestern universities: lead institution Ohio State, the University of Chicago, the University of Michigan, Michigan State University and the University of Illinois at Urbana-Champaign, all of which have partnered with local community colleges and regional partners with established transfer pipelines to engage underrepresented student populations.

The group is also collaborating with the IBM-HBCU Quantum Center to recruit faculty from its network of over 20 partner colleges and universities, as well as Argonne National Laboratory. In all, the QuSTEAM team comprises 66 faculty who share expertise in quantum information science and engineering, creative arts and social sciences, and education research.

To best develop a quantum-ready workforce, QuSTEAM identified the establishment of a common template for an undergraduate minor and associate certificate programs as the near-term priority. The team will build curricula consisting of in-person, online and hybrid courses for these degree and certification programs, including initial offerings of the critical classes and modules at the respective universities, while continuing to assess evolving workforce needs.

QuSTEAM plans to begin offering classes in spring 2022, with a full slate of core classes for a minor during the 2022-2023 academic year. The modular QuSTEAM curriculum will provide educational opportunities for two- and four-year institutions, minority-serving institutions and industry, while confronting and dismantling longstanding biases in STEM education.

"If we want to increase diversity in quantum science, we need to really engage meaningfully with community colleges, minority-serving institutions and other small colleges and universities," Johnston-Halperin said. "The traditional STEM model builds a program at an elite, R1 university and then allows the content to diffuse out from there. But historically this means designing it for a specific subset of students, and everything else is going to be a retrofit. That's just never as effective."

QuSTEAM leverages integrated university support from faculty and staff from the Drake Institute for Teaching and Learning, the Institute for Materials Research, the Department of Physics and the Ohio State Office of Research.

Johnston-Halperin is joined at Ohio State by QuSTEAM co-PI Andrew Heckler, professor of physics and physics education research specialist. Other Ohio State faculty included on QuSTEAM are Daniel Gauthier, professor in the Department of Physics; Christopher Porter, postdoctoral researcher in the Department of Physics; David Penneys, associate professor in the Department of Mathematics; Zahra Atiq, assistant professor of practice of computer science and engineering in the College of Engineering; David Delaine and Emily Dringenberg, assistant professors of engineering education in the College of Engineering; and Edward Fletcher, associate professor of educational studies in the College of Education and Human Ecology.

QuSTEAM is one of 10 teams selected for two-year, $5 million Phase II funding as part of the NSF Convergence Accelerator 2020 Cohort, which supports efforts to fast-track transitions from basic research and discovery into practice, and seeks to address national-scale societal challenges. With this funding, QuSTEAM will address the challenge of developing a strong national quantum workforce by instituting high-quality, engaging courses and educational tracks that allow for students of all backgrounds and interests to choose multiple paths of scholarship.

See original here:
Ohio State-led QuSTEAM initiative awarded $5 million from NSF - The Ohio State University News

Atomically-Thin, Twisted Graphene Has Unique Properties That Could Advance Quantum Computing – SciTechDaily

New collaborative research describes how electrons move through two different configurations of bilayer graphene, the atomically-thin form of carbon. These results provide insights that researchers could use to design more powerful and secure quantum computing platforms in the future.

Researchers describe how electrons move through two-dimensional layered graphene, findings that could lead to advances in the design of future quantum computing platforms.

New research published in Physical Review Letters describes how electrons move through two different configurations of bilayer graphene, the atomically-thin form of carbon. This study, the result of a collaboration between Brookhaven National Laboratory, the University of Pennsylvania, the University of New Hampshire, Stony Brook University, and Columbia University, provides insights that researchers could use to design more powerful and secure quantum computing platforms in the future.

"Today's computer chips are based on our knowledge of how electrons move in semiconductors, specifically silicon," says first and co-corresponding author Zhongwei Dai, a postdoc at Brookhaven. "But the physical properties of silicon are reaching a physical limit in terms of how small transistors can be made and how many can fit on a chip. If we can understand how electrons move at the small scale of a few nanometers in the reduced dimensions of 2-D materials, we may be able to unlock another way to utilize electrons for quantum information science."

When a material is designed at these small scales, to the size of a few nanometers, it confines the electrons to a space with dimensions that are the same as the electron's own wavelength, causing the material's overall electronic and optical properties to change in a process called quantum confinement. In this study, the researchers used graphene to study these confinement effects in both electrons and photons, or particles of light.
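
As a rough guide to why confinement at this scale matters, the textbook one-dimensional particle-in-a-box model gives the allowed electron energies (this standard result is included only to make the scale argument concrete; it is not the model used in the paper):

E_n = \frac{n^2 h^2}{8 m L^2}, \qquad n = 1, 2, 3, \ldots

where h is Planck's constant, m is the electron mass, and L is the confinement length. As L shrinks to a few nanometers, comparable to the electron wavelength, the spacing between allowed energy levels grows, which is why the material's electronic and optical response changes.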

The work relied upon two advances developed independently at Penn and Brookhaven. Researchers at Penn, including Zhaoli Gao, a former postdoc in the lab of Charlie Johnson and now at The Chinese University of Hong Kong, used a unique gradient-alloy growth substrate to grow graphene with three different domain structures: single layer, Bernal-stacked bilayer, and twisted bilayer. The graphene material was then transferred onto a special substrate developed at Brookhaven that allowed the researchers to probe both electronic and optical resonances of the system.

"This is a very nice piece of collaborative work," says Johnson. "It brings together exceptional capabilities from Brookhaven and Penn that allow us to make important measurements and discoveries that none of us could do on our own."

The researchers were able to detect both electronic and optical interlayer resonances and found that, in these resonant states, electrons move back and forth at the 2D interface at the same frequency. Their results also suggest that the distance between the two layers increases significantly in the twisted configuration, which influences how electrons move because of interlayer interactions. They also found that twisting one of the graphene layers by 30 degrees shifts the resonance to a lower energy.

"Devices made out of rotated graphene may have very interesting and unexpected properties because of the increased interlayer spacing in which electrons can move," says co-corresponding author Jurek Sadowski from Brookhaven.

In the future, the researchers will fabricate new devices using twisted graphene while also building off the findings from this study to see how adding different materials to the layered graphene structure impacts downstream electronic and optical properties.

"We look forward to continuing to work with our Brookhaven colleagues at the forefront of applications of two-dimensional materials in quantum science," Johnson says.

Reference: "Quantum-Well Bound States in Graphene Heterostructure Interfaces" by Zhongwei Dai, Zhaoli Gao, Sergey S. Pershoguba, Nikhil Tiwale, Ashwanth Subramanian, Qicheng Zhang, Calley Eads, Samuel A. Tenney, Richard M. Osgood, Chang-Yong Nam, Jiadong Zang, A.T. Charlie Johnson and Jerzy T. Sadowski, 20 August 2021, Physical Review Letters. DOI: 10.1103/PhysRevLett.127.086805

The complete list of co-authors includes Zhaoli Gao (now at The Chinese University of Hong Kong), Qicheng Zhang, and Charlie Johnson from Penn; Zhongwei Dai, Nikhil Tiwale, Calley Eads, Samuel A. Tenney, Chang-Yong Nam, and Jerzy T. Sadowski from Brookhaven; Sergey S. Pershoguba and Jiadong Zang from the University of New Hampshire; Ashwanth Subramanian from Stony Brook University; and Richard M. Osgood from Columbia University.

Charlie Johnson is the Rebecca W. Bushnell Professor of Physics and Astronomy in the Department of Physics and Astronomy in the School of Arts & Sciences at the University of Pennsylvania.

This research was supported by National Science Foundation grants MRSEC DMR-1720530 and EAGER 1838412. Brookhaven National Laboratory is supported by the U.S. Department of Energy's Office of Science.

Here is the original post:
Atomically-Thin, Twisted Graphene Has Unique Properties That Could Advance Quantum Computing - SciTechDaily