Archive for the ‘Quantum Computing’ Category

Expert: Now is the time to prepare for the quantum computing revolution – TechRepublic

Though quantum computing is likely five to 10 years away, waiting until it happens will put your organization behind. Don't play catch-up later.

TechRepublic's Karen Roby spoke with Christopher Savoie, CEO and co-founder of Zapata Computing, a quantum application company, about the future of quantum computing. The following is an edited transcript of their conversation.

SEE: The CIO's guide to quantum computing (free PDF) (TechRepublic)

Christopher Savoie: There are two types of quantum-computing algorithms, if you will. There are those that will require what we call a fault-tolerant computing system, one that, for all intents and purposes, doesn't have error, that's corrected for error, which is the way most classical computers are now. They don't make errors in their calculations, or at least we hope they don't, not at any significant rate. And eventually we'll have these fault-tolerant quantum computers. People are working on it. We've proven that it can happen already, so that is down the line. But it's in the five- to 10-year range that it's going to take until we have that hardware available. And that's where a lot of the promise of these exponentially faster algorithms lies. These are the algorithms that will use those fault-tolerant computers to basically look at all the options available in a combinatorial matrix.

So, if you have something like a Monte Carlo simulation, you can try essentially all the different variables that are possible, look at every possible combination, and find the optimal solution. That's practically impossible on today's classical computers: you have to choose which variables you're going to use, reduce things, and take shortcuts. But with these fault-tolerant computers, for essentially all of the possible solutions in the solution space, we can look at all of the combinations. So, you can imagine an almost exponential number of variables that you can try out to see what your best solution is. In things like CCAR [Comprehensive Capital Analysis and Review] and Dodd-Frank [Dodd-Frank Wall Street Reform and Consumer Protection Act] compliance, where you have to do these complex simulations, we rely on Monte Carlo simulation.
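To ground the discussion, here is a minimal classical Monte Carlo sketch in Python of the kind of scenario sampling Savoie describes. The normal return distribution, its parameters, and the scenario count are illustrative assumptions for this sketch, not anything taken from Zapata or a regulator.

```python
import random
import statistics

def monte_carlo_portfolio(n_scenarios=10_000, mean=0.05, vol=0.20, seed=42):
    """Sample hypothetical annual portfolio returns and summarize them.

    `mean` and `vol` are illustrative placeholders, not calibrated
    market parameters.
    """
    rng = random.Random(seed)
    # Each scenario draws one normally distributed annual return.
    returns = [rng.gauss(mean, vol) for _ in range(n_scenarios)]
    returns.sort()
    return {
        "mean": statistics.fmean(returns),               # average outcome
        "stdev": statistics.stdev(returns),              # dispersion of outcomes
        "worst_5pct": returns[int(0.05 * n_scenarios)],  # 5th-percentile return
    }

print(monte_carlo_portfolio())
```

On classical hardware the scenario count, and hence the slice of the combinatorial space explored, is bounded by runtime; the fault-tolerant algorithms discussed above aim to lift that bound rather than change the basic structure of the calculation.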

So, trying all of the possible scenarios is not possible today, but fault tolerance will allow us to try essentially all of the different combinations, which will hopefully give us the ability to predict the future in a much better way, which is important in these financial applications. But we don't have those computers today. They will be available sometime in the future. I hate putting a date on it, but think about it on the decade time horizon. On the other hand, there are nearer-term algorithms that run on these noisy, so not error-corrected, noisy intermediate-scale quantum devices. We call them NISQ for short. And these are more heuristic types of algorithms that are tolerant to noise, much like neural networks are today in classical computing and AI [artificial intelligence]. You can deal a little bit with sparse data, and maybe some error in the data or other areas of your calculation, because it's an approximate type of calculation, like the ones neural networks do. It's not looking at all the exact answers and figuring out which one is definitely the best. This is an approximate algorithm that iterates and tries to get closer and closer to the right answer.
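As a purely classical analogy to the iterative, noise-tolerant loop described above, here is a hedged sketch: a greedy search that minimizes a noisy cost function, with the noise standing in for the imperfect estimates a NISQ device would return. The toy objective, the noise level, and all parameters are invented for illustration.

```python
import random

def noisy_cost(x, rng):
    """Toy objective with its true minimum at x = 3; the added Gaussian
    noise stands in for a noisy device's estimate of the cost."""
    return (x - 3.0) ** 2 + rng.gauss(0, 0.05)

def iterative_minimize(steps=2000, step_size=0.1, seed=0):
    """Heuristic loop in the spirit of NISQ-era variational algorithms:
    propose a small parameter change, keep it if the (noisy) cost improves."""
    rng = random.Random(seed)
    x = 0.0
    best = noisy_cost(x, rng)
    for _ in range(steps):
        candidate = x + rng.uniform(-step_size, step_size)
        cost = noisy_cost(candidate, rng)
        if cost < best:  # accept only apparent improvements
            x, best = candidate, cost
    return x

print(iterative_minimize())
```

The loop never sees an exact answer, yet it drifts toward the minimum anyway; that tolerance of noisy evaluations is the property that makes heuristics of this shape usable on today's uncorrected hardware.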


But we know that neural networks, deep neural networks, work this way. AI, in its current state, uses this type of algorithm, these heuristics. Most of what we do in computation nowadays, and in finance, is heuristic and statistical in nature, and it works well enough to do some really good work. In algorithmic trading, in risk analysis, this is what we use today. And the quantum versions of that will also be able to give us some advantage, and maybe an advantage, as we've been able to show in recent work, over the purely classical version. So, we'll have some quantum-augmented AI, quantum-augmented ML [machine learning]. We call it quantum-enhanced ML, or quantum-enhanced optimization, that we'll be able to do.

So, people think of this as a dichotomy: we have these NISQ machines, and they're faulty, and then one day we'll wake up and have fault tolerance. But it's really not that way. These faulty algorithms, if you will, these heuristics that are approximate, will still work, and they may work better than the fault-tolerant algorithms for some problems and some datasets, so this really is a gradient. You might have a false sense of solace, too: "Oh well, if that's 10 years down the road, we can just wait until we wake up and have fault tolerance." But really, the algorithms are going to keep progressing, and the things that we develop now will still be useful in that fault-tolerant regime. And the patents will all be good for the stuff that we do now.

So, suppose you think, "OK, this is a 10-year time horizon for those fault-tolerant computers. Our organization is just going to wait." Well, if you do, a couple of things happen. You're not going to have the workforce in place to be able to take advantage of this. You're probably not going to have the infrastructure in place to be able to take advantage of this. And meanwhile, all of your competitors and their vendors will have acquired a portfolio of patents on these methodologies that are good for 20 years. If you wait five years and a competitor files a patent four years from now, it's good until 24 years from now. So there really is, I think, an incentive for organizations to start working even in this NISQ, this noisier regime that we're in today.

Karen Roby: You get a little false sense of security, as you mentioned, when you hear that something is 10 years down the line. But really, with this, you don't have the luxury of catching up if you wait too long. This is something that people need to be focused on now for what is down the road.

SEE: Quantum entanglement-as-a-service: "The key technology" for unbreakable networks (TechRepublic)

Christopher Savoie: Yes, absolutely. In finance, if you have a better ability to detect risk than your competitors, you're at a huge advantage in being able to find alpha in the market. If you can do that better than others, you're going to be at a huge advantage. And if you're blocked by other people's patents, or by the fact that your workforce doesn't know how to use these things, you're really behind the eight ball. We've seen this time and time again with different technology evolutions and revolutions: with big data and its infrastructure, and with AI and machine learning. The organizations that waited have generally found themselves behind the eight ball, and it's really hard to catch up because this stuff is changing daily, weekly; new inventions are happening. And if you don't have a workforce that's up and running and an infrastructure ready to accept this, it's really hard to catch up with your competitors.

Karen Roby: You've touched on this a little bit, but really, for the finance industry, this can be transformative; what quantum computing can do is really significant.

Christopher Savoie: Absolutely. At the end of the day, finance is math, and we can do better math, and more accurate math, on large datasets with quantum computing. There is no question about that. It's no longer an "if." Google has, with their experiment, proven that at some point we're going to have a machine that is definitely going to be better at some types of math than classical computers. With that premise, if you're in a field that depends on math, on numbers and statistics, which finance is no matter what side you're on, whether the risk side or the investing side, you're going to need to have the best tools. And that doesn't mean you have to be an algorithmic trader necessarily; even looking at tail risk and creating portfolios depends on being able to quickly ascertain what that risk is, and computing is the only way to do that.
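The tail-risk measurement Savoie mentions can be sketched classically in a few lines. This uses an assumed, simplified normal-loss model purely for illustration; it is not a production risk methodology, and the sample size and confidence level are arbitrary choices.

```python
import random
import statistics

def tail_risk(n=20_000, mean=0.0, vol=1.0, alpha=0.95, seed=7):
    """Estimate Value-at-Risk and expected shortfall (CVaR) from
    simulated losses. The normal loss model is an illustrative
    assumption only."""
    rng = random.Random(seed)
    losses = sorted(rng.gauss(mean, vol) for _ in range(n))
    cutoff = int(alpha * n)
    var = losses[cutoff]                      # loss exceeded (1 - alpha) of the time
    cvar = statistics.fmean(losses[cutoff:])  # average loss beyond VaR
    return var, cvar

var, cvar = tail_risk()
print(var, cvar)
```

Estimates like these sharpen as more scenarios are sampled, which is exactly where the interview's argument for faster sampling hardware comes in.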

SEE: The quantum decade: IBM predicts the 2020s will see quantum begin to solve real problems (TechRepublic)

And on the regulatory side, I mentioned CCAR. I think as these capabilities emerge, it allows the regulators to ask for even more scenarios to be simulated, those things that are a big headache for a lot of companies. But it's important, because our global financial system depends on stability and predictability, and having a computational resource like quantum that allows us to see more variables, more possibilities, or more disaster scenarios can really help. "What is the effect of, say, a COVID-type event on the global financial system?" To be more predictive and more accurate about that is good for everybody. I think all boats rise, and quantum is definitely going to give us that advantage as well.

Karen Roby: Most definitely. And Christopher, before I let you go, if you would just give us a quick snapshot of Zapata Computing and the work that you guys do.

Christopher Savoie: We have two really important components for trying to make this stuff a reality. On the one hand, we've got over 30 of the brightest young minds in algorithms, particularly algorithms for these near-term devices and how to write them. We've written some of the fundamental algorithms that are out there to be used on quantum computers. On the other hand, how do you make those things work? That's a software engineering problem, not really quantum science. How do you make the big data work? That's all the boring but very important stuff of ETL, data transformation, digitalization, cloud, and multicloud. So basically, Zapata is a company that has the best of the algorithms, but also best-of-breed software engineering to make that work in a modern, multicloud environment. Finance companies and banks in particular are regulated companies with a lot of data that is sensitive, private, and proprietary, so you need to be able to work in a safe and secure multicloud environment, and that's what our software engineering side allows us to do. We have the best of both worlds there.



Go here to read the rest:
Expert: Now is the time to prepare for the quantum computing revolution - TechRepublic

IBM partners with the University of Tokyo on quantum computer – Illinoisnewstoday.com

TOKYO: IBM and the University of Tokyo have announced one of Japan's most powerful quantum computers.

According to IBM, IBM Quantum System One is part of the Japan-IBM quantum partnership between the University of Tokyo and IBM, advancing Japan's quest for quantum science, business, and education.

IBM Quantum System One is currently in operation for researchers at both Japanese scientific institutions and companies, and access is controlled by the University of Tokyo.

"IBM is committed to growing the global quantum ecosystem and facilitating collaboration between different research communities," said Dr. Dario Gil, director of IBM Research.

According to IBM, quantum computers combine quantum resources with classical processing to provide users with access to reproducible and predictable performance from high-quality qubits and precision control electronics. Users can safely execute algorithms that require iterative quantum circuits in the cloud.

see next: IBM partners with Atos on contract with Dutch Ministry of Defense

IBM Quantum System One in Japan is IBM's second system built outside the United States. In June, IBM unveiled an IBM Quantum System One managed by the scientific research institute Fraunhofer-Gesellschaft in Munich, Germany.

IBM's commitment to quantum is aimed at advancing quantum computing and fostering a skilled quantum workforce around the world.

"We are thrilled to see Japan's contributions to research by world-class academics, the private sector, and government agencies," Gil said.

"Together, we can take a big step towards accelerating scientific progress in different areas," he added.

Teruo Fujii, president of the University of Tokyo, said, "In the field of rapidly changing quantum technology, it is very important not only to develop elements and systems related to quantum technology, but also to develop the next generation of human resources to achieve a high degree of social implementation."

"Our university has a wide range of research capabilities and has always promoted high-level quantum education from the undergraduate level. Now, with IBM Quantum System One, we will develop and further refine the next generation of quantum-native skill sets," Fujii said.

In 2020, IBM and the University of Tokyo launched the Quantum Innovation Initiative Consortium (QIIC), which aims to strategically accelerate quantum computing research and development in Japan by bringing together academic talent from universities, research groups, and industries nationwide.

Last year, IBM also announced partnerships focused on quantum information science and technology with several organizations: the Cleveland Clinic, the Science and Technology Facilities Council in the United Kingdom, and the University of Illinois Urbana-Champaign.

see next: Public cloud computing provider

Original post:
IBM partners with the University of Tokyo on quantum computer - Illinoisnewstoday.com

Sumitomo Corporation Quantum Transformation (QX) Project Announces Its Vision and Activities at the IEEE Quantum AI Sustainability Symposium – Yahoo…

- New social paradigm shift "QX" is right around the corner -

TOKYO, August 23, 2021--(BUSINESS WIRE)--The Sumitomo Corporation Quantum Transformation (QX) Project will present at the IEEE Quantum AI Sustainability Symposium on September 1st, 2021. The QX Project was launched in March 2021 by Sumitomo Corporation, a global Fortune 500 trading and investment company, with the intent to provide new value to society by applying quantum computing technology to the wide-ranging industries in which the company operates. This is the world's first project that defines "Quantum Transformation (QX)" as the next social paradigm shift, beyond "Digital Transformation (DX)".



The founder and head of the QX Project, Masayoshi Terabe, will present the vision and activities of QX at the IEEE Quantum AI Sustainability Symposium. The organizer, IEEE, is the world's largest technical professional organization for the advancement of technology. In this talk, he will show how quantum computing can contribute to sustainability. For example, he will introduce the Quantum Sky project, a pilot experiment that uses quantum computing to develop flight routes for numerous air mobility vehicles. He will also introduce related concepts such as Quantum Smart City and Quantum Energy Management.

The objective of the QX Project is to create new value for society by combining the vast business fields of Sumitomo Corporation, which span more than 900 consolidated companies, from underground to space, with an extensive number of business partners around the world.

A broad and deep ecosystem is necessary to achieve QX, because it requires combining a wide range of technologies, not limited to quantum, and working across various industries. If you are interested in this project, let's take on the challenge of creating a new business, and a new society, together!



View source version on businesswire.com: https://www.businesswire.com/news/home/20210823005255/en/

Contacts

Contact: Luke Hasumura, responsible for Vision & Ecosystem on Quantum Transformation. qx@sumitomocorp.com, +81-3-6285-7489

View post:
Sumitomo Corporation Quantum Transformation (QX) Project Announces Its Vision and Activities at the IEEE Quantum AI Sustainability Symposium - Yahoo...

Energy Department Sets $61M of Funding to Advance QIS Research – MeriTalk

The U.S. Department of Energy (DOE) has announced $61 million in funding for infrastructure and research projects to advance quantum information science (QIS).

Specifically, the DOE is supplying $25 million in funding for creating quantum internet testbeds, which will advance foundational building blocks including devices, protocols, technology, and techniques for quantum error correction at the internet scale.

The DOE also is providing $6 million in funding for scientists to study and develop new devices to send and receive quantum network traffic and advance a continental-scale quantum internet.

Lastly, the DOE granted $30 million of funding to five DOE Nanoscale Science Research Centers to support cutting-edge infrastructure for nanoscience-based research to strengthen the United States' competitiveness in QIS and enable the development of nanotechnologies.

"Harnessing the quantum world will create new forms of computers and accelerate our ability to process information and tackle complex problems like climate change," said U.S. Secretary of Energy Jennifer M. Granholm in a statement. "DOE and our labs across the country are leading the way on this critical research that will strengthen our global competitiveness and help corner the markets of these growing industries that will deliver the solutions of the future."

The DOE recognized the advantages of QIS back in 2018 when it became an integral partner in the National Quantum Initiative, which became law in December 2018. Since then, the DOE Office of Science has launched a range of multidisciplinary research programs in QIS, including developing quantum computers as testbeds, designing new algorithms for quantum computing, and using quantum computing to model fundamental physics, chemistry, and materials phenomena.

Original post:
Energy Department Sets $61M of Funding to Advance QIS Research - MeriTalk

The Right Way to Structure Cyber Diplomacy – War on the Rocks

The modern State Department was forged in an era of global transformation. In the 1930s, the department had fewer than 2,000 personnel and, as one historian emphasized, it was "a placid place that was comfortable with lethargic diplomacy." World War II revolutionized the department, which readily transformed itself to handle the demands of planning a new international order. Between 1940 and 1945, the department's domestic staff levels tripled and its budget doubled.

Today, the State Department is once again confronting the challenge of how to organize itself to cope with new international challenges: not those of wartime, but ones created by rapid technological change. There are ongoing conversations about how the department should handle cyberspace policy, as well as concerns about emerging technologies like artificial intelligence, quantum computing, next-generation telecommunications, hypersonics, biotechnology, space capabilities, autonomous vehicles, and many others.

As Ferial Ara Saeed recently emphasized, the department is not structured in a way that makes sense for addressing these matters. She is not alone in this view, and others have also offered ideas for reform. Former Secretary of State Mike Pompeo's proposal for a Bureau of Cyberspace Security and Emerging Technologies focused too narrowly on security, as Saeed correctly diagnoses. As an alternative, she proposes consolidating all technology policy issues under a new under secretary, who would report to the deputy secretary of state for management and resources.

The State Department should be restructured so that it can conduct effective cyber diplomacy, but establishing one bureau for all things technology-related is not the way to proceed. Conceptually, the core challenges for cyberspace policy are different from those related to emerging technology issues, and creating one all-encompassing bureau would generate multiple practical problems. Instead, the department should establish a Bureau of International Cyberspace Policy, as proposed in the Cyber Diplomacy Act. Consolidating cyberspace policy issues in a single bureau would provide greater coherence to overarching priorities and day-to-day diplomatic activities. Emerging technology issues should remain the responsibility of the appropriate existing bureaus. If they are provided with greater resourcing and if appropriate connective tissue is created, those bureaus will have greater flexibility in crafting individualized strategies for a very diverse array of technologies. At the same time, the department would be able to prioritize and adopt a strategic approach to technology diplomacy.

Cyberspace Matters Are Different from Other Technology Issues

Through our work as staff of the U.S. Cyberspace Solarium Commission, we have observed how cyberspace policy will have impacts on U.S. foreign policy and international relations that differ fundamentally from those produced by other technology issues. That is why cyberspace policy warrants a distinct foreign policy approach.

Unlike other technologies, cyberspace has created a new environment for international interaction. As Chris Demchak describes, cyberspace is "a substrate that intrudes into, connects at long range, and induces behaviors that transcend boundaries of land, sea, air, institution, nation, and medium." Since the early 2000s, as one brief has put it, "states have recognized cyberspace and its undergirding infrastructure as not only strategic assets, but also a domain of potential influence and conflict." At the same time, a lack of international agreement or clarity on key definitions compounds the difficulties of dealing with cyberspace as a new arena of state-to-state interaction.

A U.N. Group of Governmental Experts produced a consensus report outlining norms of responsible state behavior in cyberspace that was welcomed by the U.N. General Assembly in 2015. However, U.N. members were by no means agreed on how international law applies to cyberspace. Although that issue was addressed more successfully in 2021, diplomats are still negotiating critical questions like what counts as cybercrime, critical infrastructure, espionage, or many of the other foundational concepts in this area. All of these questions, and many others beyond the negotiations of the United Nations, have long-term implications for the future of the internet, as cyberspace policy experts navigate a path between security and surveillance, and between openness and authoritarianism. To be successful in this diplomacy, the State Department should prioritize these issues and provide its diplomats with organizational structures that will support Americas proactive leadership. In short, the State Department should have a dedicated cyberspace policy bureau.

The focus and activities of such a bureau would be functionally very different from what will be involved in addressing other technology issues. A Bureau of International Cyberspace Policy would be responsible for implementing a relatively established policy for cyber diplomacy. The head of the bureau would be working to ensure an open, interoperable, reliable, and secure internet, pushing back on authoritarian leanings in internet governance, and advocating for a multi-stakeholder model for the future of cyberspace. Certain details may change, but the core elements of this policy have been consistent across administrations and Congresses. Accordingly, the real added value of a cyberspace policy bureau is not in defining policy, but rather implementing that policy, which will require extensive engagement with non-aligned countries to help sway the balance of opinion toward an open internet, and international capacity-building efforts to help drive progress toward greater global cyber security.

By contrast, the challenge U.S. policymakers confront on emerging technologies is a question of establishing what America's international policies and diplomatic strategies should be. As the National Security Commission on Artificial Intelligence observed in relation to the State Department, "a lack of clear leadership on emerging technology hinders the Department's ability to make strategic technology policy decisions as part of a larger reorientation toward strategic competition."

Policymakers and officials working on emerging technologies will also face the challenge of adapting overarching policies as technologies emerge, develop, and ideally stabilize over time. Emerging technologies do not remain emerging indefinitely, so an organizational structure that allows the development of cohesive strategies around these technologies should have the flexibility to shift between topics. Of course, cyberspace policy and the strategic considerations that guide it will also need to adapt to changes, but its basic focus is likely to remain more stable. Much of America's work in outlining cyberspace policy has already been done, and thus the missions that remain (for example, working with partners and allies on joint attribution of cyber attacks, rallying votes in the United Nations, and managing capacity-building projects) are unlikely to change dramatically any time soon.

Undoubtedly, there will be many areas of overlap between the work of those handling emerging technology issues and the responsibilities of a cyberspace policy office. But there will also be overlap between efforts on emerging technologies and matters handled by the Bureau of Economics and Business Affairs, the Bureau of East Asian and Pacific Affairs, the Bureau of International Security and Nonproliferation, and many others. The fact that there is overlap between two organizational constructs should not be taken as a justification to merge them, and while technology obviously plays a central role in both cyberspace policy and emerging technologies policy, the actual work required to address them is very different.

It also makes sense to keep some technology issues in their current bureaucratic homes because of their historical legacy and the subsequent development of specialized expertise within those homes. No one would suggest, for example, that emerging issues in nuclear technology should be pulled out of the Bureau of International Security and Nonproliferation and made the responsibility of a new emerging technology bureau. And some technologies might only have globally significant implications for a relatively short period of time. Advanced robotics, for example, might have a major impact on manufacturing and broader economic areas, which could require the sustained attention of policymakers as they grapple with the initial implications of such technology. But once advanced robotics become a routine part of industrial operations, it would make less sense to have brought the issue under a new bureau when the pre-existing functional and regional bureaus might be best poised to address the relevant challenges.

Making every technology policy the responsibility of one under secretary would not solve the State Department's current problems. Instead, it would result in unclear prioritization and strained resources, and it would leave one leader handling two very different mission sets.

The Importance of Avoiding a Security-Focused Approach to Cyberspace

In creating a Bureau of International Cyberspace Policy, the State Department should also avoid limiting the bureau's focus solely to security-related matters. That was one of the flaws in the previous administration's effort to create the Bureau of Cyberspace Security and Emerging Technologies. While that bureau never materialized, the Government Accountability Office roundly criticized the State Department for failing to provide data or evidence to support its plans and for its lack of consultation with other federal agencies. Rep. Gregory Meeks, the chairman of the House Foreign Affairs Committee, emphasized that the proposed office would not have been in a position to coordinate responsibility for the security, economic, and human rights aspects of cyber policy.

Any reorganization of the State Department should ensure that diplomats can take into account all dimensions of cyberspace policy (political, economic, humanitarian, and security) and elevate them within the department. That would allow a new bureau to lead the way in promoting a free and secure internet. Some of the reform proposals that have been put forward reflect this approach. For example, the Cyber Diplomacy Act, which has already passed in the House, would create an ambassador-at-large position, with rank equal to that of an assistant secretary, to lead a new cyber bureau. That person would report to the under secretary for political affairs or an official of higher rank, which leaves open the possibility that the position would report directly to the secretary of state or one of the department's two deputy secretaries. While some have proposed the deputy secretary for management and resources for this reporting chain, that position has a history of going unfilled, and having a new cyberspace bureau report to it is a recipe for undercutting the fledgling bureau before it can even get off the ground. A better alternative would be to allow the State Department some flexibility in determining a new bureau's reporting structure, which might include the more natural choice of reporting to the other deputy secretary.

An overly narrow focus on security is not the only trap to avoid in creating a new cyber bureau. Orienting it around the idea of strategic competition with China would also be a problem. No doubt China will remain a key driver of U.S. policy for years to come, but global threats and opportunities may look very different in future decades than they do now. Cyber diplomacy should not be oriented around one adversary specifically, and the structure and functioning of a new cyberspace policy bureau should stand the test of time.

The Devil Is in the Details, But a Cyberspace Policy Bureau Is the Best Approach

The unfortunate political reality is that reorganizing the State Department is hard. That alone is not a reason to forgo reform, but it does introduce constraints on what may be feasible. Any new office or bureau will need leaders, but current law strictly limits the rank that they can hold. Creating a new under secretary, or even a new assistant secretary, would require significant changes to the State Department Basic Authorities Act, and there is limited political momentum for that particular undertaking. The law currently authorizes the appointment of 24 assistant secretaries and six under secretaries. Although the Cyberspace Solarium Commission initially recommended creating an assistant secretary position to lead a new cyber bureau, and although it has been clear for two decades that the State Department's structure should be overhauled, making such drastic changes to the necessary legislation may be a nonstarter on Capitol Hill for the foreseeable future. The Cyber Diplomacy Act provides the best available work-around by placing an ambassador-at-large at the head of the new bureau, ensuring that the position has the stature necessary for effective leadership.

The new bureau would also have to contend with the challenges of prioritization. The Cyber Diplomacy Act lists a wide variety of issues (including internet access, internet freedom, the digital economy, cybercrime, deterrence, and international responses to cyber threats) that would become a cyberspace bureau's responsibilities. Even without giving it emerging technology topics to handle, consolidating just the cyberspace policy issues will require careful planning to determine which pieces get pulled from existing bureaus. To allow a new bureau to adequately deal with digital economy matters, for example, policymakers would need to decide which aspects of that issue get moved from the purview of the Bureau of Economic and Business Affairs. The new bureau would have a good case for inheriting responsibility for portfolios like investment in information communications technology infrastructure abroad, particularly as it relates to cyber security capacity building, but there is a strong argument for other pieces, like e-commerce, to remain in their existing homes. The more bearing a particular team's work has on preserving an open, interoperable, reliable, and secure internet, the stronger a candidate it is for incorporation into a new bureau.

Moving the responsibility for particular policy matters is not the only tool available, however. The Cyber Diplomacy Act creates an avenue for the new bureaus personnel to engage other State Department experts to ensure that concerns like human rights, economic competitiveness, and security have an influence on the development of U.S. cyber policy. The proposed Cyberspace Policy Coordinating Committee would ensure that officials at the assistant secretary level or higher from across the department can weigh in on matters of concern for their respective portfolios.

With a new cyberspace policy bureau, a coordinating committee, and enhancements to emerging technology capacity in its existing regional and functional bureaus, the State Department would be structured to handle the digital age effectively.

Natalie Thompson is a Ph.D. student in political science at Yale University. Previously, she was a research analyst for the U.S. Cyberspace Solarium Commission and a research assistant and James C. Gaither junior fellow at the Carnegie Endowment for International Peace, working with the Technology and International Affairs Program on projects related to disinformation and cyber security. She tweets at @natalierthom.

Laura Bate is a senior director with the U.S. Cyberspace Solarium Commission and a 2021 Next Generation National Security Fellow with the Center for a New American Security. Previously, she was a policy analyst with New America's Cybersecurity Initiative and remains an International Security Program Fellow. She tweets at @Laura_K_Bate.


Read the original post:
The Right Way to Structure Cyber Diplomacy - War on the Rocks