Archive for the ‘Quantum Computing’ Category

"Once the quantum revolution starts it will be exponential" – CTech

Amir Naveh, CPO & Co-founder of Classiq

Sector: Quantum computing

Founders: Nir Minerbi, Amir Naveh, Dr. Yehuda Naveh

Investors: Entre Capital, Team8, Wing Capital, IN-Venture, Phoenix, HSBC, HPE, NTT, Awz Ventures, OurCrowd

"Classical computers can do amazing things; we've seen that in the past 80 years. But some things they're unable to do," explained Amir Naveh, CPO and Co-founder of Classiq. "From drug discovery to efficient fertilizer development, to high-performance tasks that today's computers are not very good at, quantum computers can do all these things, and they can do them with amazing efficiency. All of this is going to happen in the coming few years, and in the next 20 years we will see a full-blown revolution."

Classiq has developed a technological solution that enables the development of software for quantum computers which is similar to advanced software development for regular computers. The solution developed by the company is protected by more than 20 patents, and the company's software development platform is considered the most advanced in the world of quantum computing.

Classiq has been selected as one of "Tomorrow's Growth Companies" according to Qumra Capital. This year, for the fourth year in a row, Qumra published its list of the most promising growth companies in Israeli high-tech, naming those who are on the path to becoming the next big thing.

"This is, for me, the journey of a lifetime. It is amazing technology and the company is growing really fast. I really hope to see this quantum revolution go from a research phase to practical, real-world applications. Once it starts, it will be exponential. It won't be twice as good; it will be a thousand times better, a million times better. It's hard to imagine how the world is going to change, but I hope in the next decade we will see some amazing things."


Continue reading here:
"Once the quantum revolution starts it will be exponential" - CTech

Cloud Assessment: Clarifying the Vision, Transforming the Organization – CIO

Cloud migration has become a tech buzzword across enterprises worldwide. However, being an effective cloud user means not only getting acquainted with the concept, but also thoroughly evaluating your existing IT infrastructure and processes, identifying their potential for a move to the cloud, and planning your migration strategy effectively. Given the many advantages of migration, businesses are looking to tap into the long-term benefits of cloud computing, which include:

Conducting an objective and accurate assessment of their existing services, applications, security, and network infrastructure has been a challenge for organizations. Numerous discovery tools, including Cloudscape, Cloudamize, Device42, and TSO Logic, can help you understand your on-premise infrastructure.

Though these discovery tools do a good job of mapping the infra estate and capturing basic information like CPU, RAM, disk storage, and OS, they have their own limitations. Often, the assessments prove far from accurate during and after migration, because organizations do not go deep enough in understanding the applications and the business. The most common challenges of cloud migration are:

Broad basing the discovery

The good news, however, is that none of these challenges is insurmountable. To make the migration process as smooth as possible, we need to discover and analyze source code, configurations, applications, and databases as well, rather than focus on infra discovery alone.

This helps to better understand the internal dependencies of applications and the roadblocks in the migration process. Both static and dynamic analyzers should be used together with the infra discovery tool to have a fail-proof migration.

Static analyzers help teams understand the components of an application and its dependencies on third-party software, making it possible to analyze the impact of re-platforming or refactoring the application. This is where AI and ML can be used in conjunction with these mechanisms to gain a better understanding.
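To make the idea concrete, the output of such discovery and static analysis can be combined into a dependency graph and used to flag applications whose direct or transitive dependencies block a straight lift-and-shift. The component names and the blocker rule below are hypothetical illustrations, not the output of any tool mentioned in this article:

```python
# Minimal sketch of combining infra discovery with static-analysis output
# to flag applications whose dependencies complicate a migration.
# All component names and the "blockers" set are hypothetical examples.

# Edges found by static analysis: app -> components it depends on.
dependencies = {
    "billing-app": ["oracle-db", "mq-broker"],
    "web-portal": ["billing-app", "redis-cache"],
    "reports": ["oracle-db"],
}

# Components a (hypothetical) assessment marked as migration blockers,
# e.g. licensed software pinned to on-premise hardware.
blockers = {"oracle-db"}

def find_blocked(deps, blocked_components):
    """Return apps blocked directly or through a transitive dependency."""
    blocked = set()
    changed = True
    while changed:  # propagate blockage until a fixed point is reached
        changed = False
        for app, comps in deps.items():
            if app not in blocked and any(
                c in blocked_components or c in blocked for c in comps
            ):
                blocked.add(app)
                changed = True
    return blocked

print(sorted(find_blocked(dependencies, blockers)))
# flags billing-app and reports directly, and web-portal via billing-app
```

Real assessments operate on far richer metadata, but the transitive propagation shown here is exactly why application-level analysis catches roadblocks that infra-only discovery misses.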

The ML and AI journey to cloud

With Artificial Intelligence and Machine Learning (AI/ML) in the cloud becoming mainstream, organizations are able to overcome these challenges. AI/ML automatically generates insights from data. From predictive maintenance in manufacturing plants and fraud detection in financial services to accelerating scientific discovery, businesses of all types can benefit from this technology.

This has also given rise to applications such as chatbots, virtual assistants, and search engines that rival human interaction capabilities. As the dynamic and complex business environments of modern times require a shift to data-driven decision-making, there is a growing demand for robust data lineage, governance, and risk mitigation tactics.

ID2C: Changing the game of data discovery

ID2C is TCS' proprietary ML-driven tool, which combines the outputs of discovery tools and static analyzers with other available data and intelligently deduces the technology stack and its dependencies to derive more value. This enables accurate identification of a variety of technologies from different vendors, even when they are seemingly disconnected. The TCS AWS business unit conducts assessment projects worth $5M every year while influencing more than $100M in foundation, migration, and operations projects.

AI/ML-driven data discovery combined with anomaly detection is a critical aspect of big data and cloud cost optimization, and it has the potential to save enterprises significant amounts of money. So why did we create an artificial-intelligence-based platform for enhanced data discovery? Benefits include:

As cloud-native transformations are increasingly sought after, the TCS ID2C tool, built on the AWS cloud, helps enterprises in their cloud journey by providing a better understanding of the on-premise environment, thereby deriving the correct strategies to transform their application portfolio now and in the future.
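The cost anomaly detection mentioned above can, in its simplest form, flag spend that deviates strongly from recent history. The sketch below uses a plain z-score over daily cost figures; the numbers are invented for illustration, and a production tool would use robust statistics or learned baselines rather than this minimal rule:

```python
# Minimal sketch of cost anomaly detection: flag days whose spend
# deviates more than two standard deviations from the series mean.
# The daily_costs figures are invented purely for illustration.
from statistics import mean, stdev

daily_costs = [120.0, 118.5, 121.2, 119.8, 122.4, 480.0, 120.9]  # USD per day

def cost_anomalies(costs, threshold=2.0):
    """Return (day_index, cost) pairs whose z-score exceeds the threshold."""
    mu, sigma = mean(costs), stdev(costs)
    return [
        (day, cost)
        for day, cost in enumerate(costs)
        if sigma > 0 and abs(cost - mu) / sigma > threshold
    ]

print(cost_anomalies(daily_costs))  # flags day 5, the 480.0 spike
```

Note that a single large spike inflates both the mean and the standard deviation, which is why the threshold here is 2 rather than the textbook 3; this sensitivity is one reason real tools favor median-based or ML-driven baselines.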

Author Bio

TCS

Ph: +91 9731397076

E-mail: Guruprasad.kambaloor@tcs.com

Guruprasad Kambaloor works as a Chief Architect in the AWSBU division of TCS. Guru has 26+ years of rich experience in the IT industry, spanning domains like Healthcare, Life Sciences, E&R, and Banking, and technologies like Cloud, IoT, Blockchain, and Quantum Computing. He currently heads Platform Engineering for AWSBU, which has built platforms like Cloud Counsel, Cloud Mason, Migration Factory, and Exponence, to name a few. His current interests are AI/ML and Quantum Computing and their relevance and usage in the Cloud.

To learn more, visit us here.

Continue reading here:
Cloud Assessment: Clarifying the Vision, Transforming the Organization - CIO

NTT DATA invests globally in regional centres of R&D to reinforce and expand existing lab network – PCR-online.biz

NTT DATA plans to invest in six global innovation centres this August, extending existing centres of excellence and expanding its network of labs. Each region has specific technological focuses, with the European hubs focusing on quantum computing, cyber security, and the metaverse. The two new centres in Italy and Germany join the already established network of European labs including Epworth House in London, Milan, and its Living Lab in Barcelona.

NTT DATA is increasing its commitment to applied research and innovation in EMEA, China, the Americas, and South-East Asia. It is broadening technological horizons from genomics research in Japan to LiDAR in China and smart city planning in North America. A lateral focus on regions from the United States to China and Japan allows instantaneous sharing of information and an end to asynchronous, siloed scientific efforts.

Aimed at concentrating resources on cutting-edge technologies with the potential to become mainstream in five to ten years, NTT DATA is generating new business through joint R&D projects in strategic locations. It is prioritising areas with a high sensitivity to the latest technologies and thriving, innovative technological systems. Joint programmes with leading companies, universities, and start-ups will enable the Innovation Centres to accumulate information on advanced technologies in their region. Furthermore, NTT DATA aims to expand its network of experts to 300 by the end of the 2025 fiscal year, growing its world-leading capabilities and offering of unique expertise on emerging technologies such as quantum computing and the metaverse. The viability of this investment is validated by the successes of the pre-existing European lab network.

Early successes of these labs have included advancements in digital twin and quantum computing. This involves using the new computational paradigms inherent in quantum machinery to help partners find optimal combinations among millions of possible options and model increasingly complex financial scenarios. Digital twin technology has led to the 3D digitalisation of cities like Las Vegas and Rome, improving public safety and tackling pollution.

Since their launch in 2019, our UK labs in London have continued to invest in emerging technologies. This includes the recent material applications of digital twin computing through the use of shot-tracking technology at the 150th Open in collaboration with The R&A, partnerships with Great Ormond Street Hospital's DRIVE (Digital Research, Informatics and Virtual Environments) lab, and a wide range of start-ups and scale-ups in emerging technologies, from virtual and augmented reality to applications of machine learning.

Tom Winstanley, CTO and Head of New Ventures at NTT DATA UK&I, said: "It's fantastic to see that innovation has no borders. NTT DATA is extending its reach to almost every major landmass on the globe. This signals great things for Europe and the UK as we collaborate on research with our colleagues in Japan, China, and the US, as well as intensifying the European lab network.

"This investment increases our opportunities to work with academia and tap into the innovation ecosystem in this region. We create value for our clients by using or creating the latest technology, and that starts in the lab. Good research deserves to be rewarded, and I'm delighted that NTT DATA is committing to the material applications of these exciting new technologies.

"Early successes in quantum computing and AI show the depth of our experts' knowledge and hands-on capabilities. I look forward to the new wave of globally informed digital solutions and innovation that our lab network can produce going forward."


Continue reading here:
NTT DATA invests globally in regional centres of R&D to reinforce and expand existing lab network – PCR-online.biz

What technology will there be in a hundred years? – Morning Express

IBM's quantum computer, Q System One. HOLGER MUENCH

It is impossible to know for sure what technology will be available in a century, but we can sketch an overview if we look at the advances taking place in the three main areas of ICT (Information and Communications Technology): hardware (devices and machines), communications (wired and mobile networks), and software (services and applications). These three fields go hand in hand, although hardware rules the evolutionary race.

Today, we are still using transistor technology, which imposes limits on processing power and data storage. In fact, in recent years, the rate of increase in the storage volume and processing speed achievable in ever smaller devices has slowed. Quantum computers promise to overcome all these limits. I imagine that in 100 years there will be genuine quantum computers. Then it will be possible to quickly carry out operations that now take years, or that cannot be done at all with a conventional computer. Quantum computing is still a chimera, and we should not believe everything that is said about it. Companies like Google and IBM have their own versions of quantum computers, but their papers are essentially marketing and their experiments are not always verified. What seems indisputable is that this is the future, so universities are taking it seriously: a lot of research is being done and courses are being taught to undergraduates.

Second, we have communications. Here the great challenge is to connect people and objects of all kinds at high speed. The revolution that we have experienced with mobile communications has been spectacular. We have not yet finished deploying 5G and we are already working on 6G. We are close to achieving a response time of less than a millisecond and to further expanding the bandwidth, already close to gigabits per second, which in a hundred years will be on the order of terabits or more. With 6G will come the native integration of artificial intelligence and image processing into the mobile network, which will multiply the massive data transmission capacity, and we will do it in a sustainable way, with lower emissions than now. Some experts believe that in the future people will be implanted with processors of some kind to remotely and wirelessly monitor, for example, our health.

All this enables progress in the third leg, which is software. Because if you manage to connect two objects, or a person with an object, in milliseconds, you will be able to do things like operate remotely with a robot, control any object or robot from your mobile, or improve the interaction of autonomous vehicles with their environment. As for the development of software and applications, when quantum computing becomes a reality, all operating systems and applications will have to be reprogrammed to adapt to the new computers, and human-machine interfaces will change. Through augmented reality we will display a virtual screen on a wall or in the air; that is, a physical support will not be necessary. This means that in 100 years the real world and the virtual world will be one. We will see real objects, but enriched with the extra information that augmented reality gives us, and we will be able to interact with them just by moving our hands in very simple gestures, with glasses or with a cap. That is what people demand: simple solutions to control machines.

As for programming, the current trend is for girls and boys to learn to program from an early age. The seed we are sowing is going to germinate, and these girls and boys will be much more capable of developing applications without having to study for a computer science degree. In other words, the ability to program will be a basic skill for most, or at least many, people. This means that people will be able to program applications to suit themselves, whether for business or for leisure.

With the evolution of these three technological legs, we will have applications that greatly advance certain fields. An obvious one is telemedicine: we will have virtual family doctors who diagnose with the help of artificial intelligence (AI). And speaking of AI, there are many problems that remain intractable because we don't have enough computing power; when quantum computing becomes a fact, it will also take AI beyond its current limits. Home automation will also advance a lot. We will have robots at home, not necessarily with a human appearance, but they will be the ones that take care of us and keep us constantly monitored. This will not be so difficult since, in the end, a person who cares for, say, an elderly person mostly watches over them: sees if they are sick, if they need to eat, helps them if they have a problem, and, if it cannot be solved, calls someone who can. These care robots will do the same, but with greater safety and more skills.

Lydia Fuentes Fernández is a doctor in computer science engineering and a professor of Telematics Engineering at the University of Málaga.

Question sent via email by Hever Galindo Sandoval Chavez

Coordination and drafting: Victoria Toro

We Answer (Nosotras Respondemos) is a weekly scientific consultation, sponsored by the Dr. Antoni Esteve Foundation and the L'Oréal-Unesco For Women in Science programme, which answers readers' questions about science and technology. The questions are answered by scientists and technologists who are members of AMIT (Association of Women Researchers and Technologists). Send your questions to usrespondemos@gmail.com or on Twitter with #nosotrasrespondemos.

You can follow MATERIA on Facebook, Twitter and Instagram, or sign up here to receive our weekly newsletter.

Read the original post:
What technology will there be in a hundred years? - Morning Express

How reality gets in the way of quantum computing hype – VentureBeat


Baidu is the latest entrant in the quantum computing race, which has been ongoing for years among both big tech and startups. Nevertheless, quantum computing may face a trough of disillusionment as practical applications remain far from reality.

Last week, Baidu unveiled its first quantum computer, named Qian Shi, as well as what it claims is the world's first all-platform integration solution, called Liang Xi. The quantum computer is based on superconducting qubits, one of the first qubit types, among the many techniques investigated, to become widely adopted, most notably in the quantum computer Google used to proclaim quantum supremacy.

Qian Shi has a computing power of 10 high-fidelity qubits, where high fidelity refers to low error rates. According to the Department of Energy's Office of Science, once the error rate falls below a certain threshold, about 1%, quantum error correction can, in theory, reduce it even further. Beating this threshold is a milestone for any qubit technology, according to the DOE's report.
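The practical meaning of beating that threshold can be illustrated with the widely used surface-code scaling heuristic, in which the logical error rate falls roughly as A * (p / p_th) ** ((d + 1) / 2) for code distance d. The constants below (A = 0.1, p_th = 0.01) are illustrative assumptions for this sketch, not figures from the article or the DOE report:

```python
# Illustrative sketch of why beating the ~1% threshold matters.
# Uses the common surface-code scaling heuristic
#     p_logical ≈ A * (p / p_th) ** ((d + 1) / 2)
# with assumed constants A = 0.1 and p_th = 0.01 (the ~1% threshold).

def logical_error_rate(p, d, p_th=0.01, A=0.1):
    """Estimated logical error rate for physical error rate p at distance d."""
    return A * (p / p_th) ** ((d + 1) / 2)

# Below threshold (p = 0.1%), raising the code distance suppresses
# the logical error rate by an order of magnitude per step:
for d in (3, 5, 7):
    print(d, logical_error_rate(p=0.001, d=d))

# Above threshold (p = 2%), adding more qubits only makes things worse:
print(logical_error_rate(p=0.02, d=7))
```

Each increase in distance d costs roughly d**2 physical qubits per logical qubit under this code family, which is why sub-threshold error rates, rather than raw qubit counts, are the milestone the DOE report highlights.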

Further, Baidu said it has also completed the design of a 36-qubit chip with couplers, which offer a way to reduce errors. Baidu says its quantum computer integrates hardware, software, and applications; the software-hardware integration allows access to quantum chips via mobile, PC, and the cloud.


Moreover, Liang Xi, Baidu claims, can be plugged into both its own and third-party quantum computers. This may include quantum chips built on other technologies, with Baidu giving a trapped ion device developed by the Chinese Academy of Sciences as an example.

"With Qian Shi and Liang Xi, users can create quantum algorithms and use quantum computing power without developing their own quantum hardware, control systems or programming languages," said Runyao Duan, director of the Institute for Quantum Computing at Baidu Research. "Baidu's innovations make it possible to access quantum computing anytime and anywhere, even via smartphone. Baidu's platform is also instantly compatible with a wide range of quantum chips."

Despite Baidu's claim of being the world's first such solution, the Liang Xi platform is reminiscent of the approach taken by Israel's Innovation Authority, which is also aimed at compatibility with various types of qubits.

Although this is Baidu's first quantum computer, the company has already submitted over 200 patents in the four years since the founding of its quantum computing research institute. The patents span various areas of research, including quantum algorithms and applications, communications and networks, encryption and security, error correction, architecture, measurement and control, and chip design.

Baidu claims its offering paves the way for the industrialization of quantum computing, making it the latest company to make grandiose claims about quantum computing being on the verge of widespread adoption. Some quantum startups have already amassed staggering valuations of over $1 billion.

However, real applications for quantum computers, beyond encryption, have yet to emerge. And even if they do, it is expected that they will require thousands of qubits, which is far beyond what anyone has yet been able to achieve. This scalability concern, for example, led Intel to stop pursuing the popular superconducting qubit approach in favor of the less mature silicon and silicon-germanium qubits, which are based on transistor-like structures that can be manufactured using traditional semiconductor equipment.

Nevertheless, voices are already emerging to warn against overhyping the technology. In the terms of the Gartner Hype Cycle, quantum computing may be approaching its trough of disillusionment.

The other main challenge in quantum computing is that real qubits tend to be too noisy, leading to decoherence. This makes quantum error correction necessary, which increases the number of qubits far above the theoretical minimum for a given application. Noisy intermediate-scale quantum (NISQ) computing has been proposed as a sort of midway solution, but its success has yet to be shown.

The history of classical computers is filled with examples of applications that the technology enabled but that had never been thought of beforehand. This makes it tempting to think that quantum computing may similarly revolutionize civilization. However, most current approaches to qubits rely on near-absolute-zero temperatures. This inherent barrier implies that quantum computing may remain limited to enterprises.

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.

See more here:
How reality gets in the way of quantum computing hype - VentureBeat