‘India wants to be a powerhouse in the world of quantum skills’

In an interview during his recent visit to India, Gil said that he also met some government officials, including Minister of State for Electronics and Information Technology Rajeev Chandrasekhar, with whom he discussed how IBM can help build a national quantum plan in India. Gil also explained how enterprises and governments can benefit from technologies such as hybrid cloud, edge computing and quantum computing, and shared his thoughts on Web3. Edited excerpts:

IBM underscored its commitment to developing a quantum-ready workforce and building an ecosystem to nurture the community in India nearly five years ago. What progress has been made?

We have made tremendous progress and, in fact, this was one of the main aspects of the discussions I had with the Minister (Rajeev Chandrasekhar). He intends to ensure that India is a powerhouse in the world of quantum skills and quantum technologies. In this context, access to technology is important. That’s why we are committed to the open-source environment – the most widely used (quantum software toolkit) worldwide is Qiskit. We are seeing tremendous acceptance in terms of advocates and quantum ambassadors here in India, and we are currently in talks with various IITs (Indian Institutes of Technology) and major training centers to develop a curriculum and certification. The Qiskit textbook (for learning quantum computation) is now also available in Tamil, Bengali and Hindi. We are going to run a lot of workshops and a lot of programs around it. I think there is a tremendous opportunity, and part of our commitment to quantum is to find a way to develop these broad-based skills and talent programs in India.
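For readers curious what working with Qiskit looks like, here is a minimal sketch that prepares the textbook two-qubit Bell state. It assumes a standard Qiskit installation and uses only the long-stable core circuit and statevector APIs; details may vary across versions.

```python
# A minimal Qiskit sketch: build a two-qubit Bell state and inspect
# its ideal measurement probabilities. Assumes `pip install qiskit`.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# A Hadamard on qubit 0 creates superposition; a CNOT entangles it
# with qubit 1, producing the Bell state (|00> + |11>) / sqrt(2).
qc = QuantumCircuit(2)
qc.h(0)
qc.cx(0, 1)

# Simulate the ideal statevector and read out outcome probabilities.
probs = Statevector.from_instruction(qc).probabilities_dict()
print(probs)  # expected: {'00': 0.5, '11': 0.5}
```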

What progress has been made on quantum computers, and how do they compare to supercomputers at present?

Most computation – on CPUs (central processing units) or accelerators such as GPUs (graphics processing units), including AI – will continue to run on classical computers, but there are a number of significant problems that are very well suited to quantum computers. One of them is the dimension of simulating and modeling our world.
It turns out that there are also very important mathematical problems that are well suited to quantum computers, such as cryptography and factoring. Blockchain, crypto and other such technologies will have to adapt and change as quantum computing advances.
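To make the factoring point concrete, here is a hedged classical sketch (the semiprime below is just an illustration): trial division must, in the worst case, scan up to the square root of N, so its cost grows exponentially with the number of bits in N. That asymmetry is what RSA-style cryptography relies on, and what Shor’s algorithm on a quantum computer would remove by factoring in time polynomial in the bit-length.

```python
# Classical trial division: worst-case work grows with sqrt(N), i.e.
# exponentially in the bit-length of N. Shor's quantum algorithm would
# instead factor in time polynomial in the bit-length.
import math

def trial_division(n: int) -> int:
    """Return the smallest nontrivial factor of a composite n."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d
    raise ValueError("n is prime")

n = 3 * 1_000_003           # a small semiprime, for illustration only
print(trial_division(n))    # -> 3
# Doubling the bit-length of n roughly squares the worst-case work;
# that exponential blow-up is what quantum factoring circumvents.
```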

We have over 180 institutions that are part of the IBM Quantum Network, and they include some of the world’s largest corporations from the financial sector, such as Goldman Sachs, JPMorgan Chase, Wells Fargo and Mizuho Bank, others such as Daimler, as well as large energy companies in the oil and gas sector and some materials companies. There is also a great appetite among students at the universities and research laboratories that participate in it.

But when will the world get to see a stable quantum computer that will work around current limitations such as noise leading to high error rates, interference, etc.?

We already have quantum computers but, as you rightly pointed out, they have their limits. We still haven’t crossed the quantum-advantage threshold (quantum advantage, or quantum supremacy, is the point at which quantum systems perform tasks that today’s classical computers cannot), but they are quantum computers nonetheless. We have built over 30 of them over the past four-five years, of which over 20 are currently active, and IBM provides access to them through the IBM Cloud. Every day, we run three and a half billion quantum circuits on real quantum hardware.

The roadmap we shared is that last year we built a quantum computer with more than 100 qubits; this year, we are going to build a 433-qubit machine; and next year, a machine with more than 1,000 qubits. (A quantum computer consists of quantum bits, or qubits, which can simultaneously encode a one and a zero. This property allows them to process much more information than conventional computers, at far greater speeds.)

The error rate of qubits is also improving tremendously (we are getting to error rates of the order of 10^-4, or about one error in every 10,000 operations). And the techniques we use in algorithms and software – error mitigation and error correction – are improving as well. If you combine all of these, and if you want to be conservative, we are going to see quantum advantage within this decade.
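Error mitigation is a software-level technique; one widely used approach (sketched here generically, not necessarily the specific method IBM uses) is zero-noise extrapolation: run the same circuit at deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The numbers below are made up for illustration.

```python
# Toy sketch of zero-noise extrapolation, a common error-mitigation
# technique: measure an observable at several amplified noise levels,
# fit a curve, and extrapolate to zero noise. Data is illustrative.
import numpy as np

noise_scale = np.array([1.0, 1.5, 2.0, 3.0])      # noise amplification
measured    = np.array([0.86, 0.80, 0.74, 0.63])  # noisy <O> at each scale

# A linear fit is the simplest choice; Richardson or exponential
# extrapolations are also used in practice.
slope, intercept = np.polyfit(noise_scale, measured, deg=1)
print(f"mitigated estimate at zero noise: {intercept:.3f}")
# ~0.973 here, closer to the ideal value than any raw measurement.
```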

What is the roadmap for quantum computing?

We’ve seen AI-centric or GPU-centric supercomputers, and we’re definitely going to see quantum-centric supercomputers. This is how it can work: imagine a quantum computer with hundreds or thousands of qubits within a single cryostat (heat creates errors in the qubits, so they need to be cooled to near absolute zero in a device called a cryostat that contains liquid helium), and now imagine a quantum data center with multiple cryostats.
You can build a data center that has thousands or even tens of thousands of qubits, but in the first generation, the connection between these different cryostats is classical. If you are smart enough to take a problem and partition it in such a way that you can run parallel workloads on the quantum machines and then connect and stitch them together classically, you incur an exponential cost in the classical piece but can still get a good answer.
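The trade-off Gil describes resembles what the field calls circuit cutting or knitting; the sketch below is conceptual, not IBM’s actual method, and the variant count and function names are assumptions for illustration. Each classical cut multiplies the number of sub-experiments to run, so the classical cost grows exponentially in the number of cuts even though every individual quantum job stays small.

```python
# Conceptual 'cut and stitch' sketch: each classical cut between
# quantum partitions multiplies the sub-experiments needed, so k cuts
# cost O(c**k) classical work, while each quantum job stays small.
from itertools import product

VARIANTS_PER_CUT = 4  # e.g. a basis of measure/prepare pairs per cut

def stitched_estimate(run_subcircuit, num_cuts: int) -> float:
    """Average the results of every sub-experiment variant classically."""
    total = 0.0
    for variant in product(range(VARIANTS_PER_CUT), repeat=num_cuts):
        total += run_subcircuit(variant)   # one small quantum job each
    return total / VARIANTS_PER_CUT ** num_cuts

# Stub sub-circuit runner, just to show the combinatorics:
print(stitched_estimate(lambda v: 1.0, num_cuts=3))  # 4**3 = 64 sub-runs
print(VARIANTS_PER_CUT ** 10)  # 10 cuts -> 1,048,576 small quantum jobs
```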
The next step is to combine the fields of quantum communication and quantum computing. That is a roadmap for the next 10-20 years, but we will see quantum supercomputers, and they are going to work in tandem with today’s supercomputers.

I would now like to discuss how the hybrid cloud has been adopted in enterprises, and how it has evolved from a market and research perspective.

From a market perspective, if you look at any medium-sized or large-scale business, the hybrid cloud is the reality. Simply put, the question is how to make the hybrid-cloud strategy work and continue to modernize the infrastructure so that workloads and processes can run optimally on it. This is why the open-source component and the acquisition of Red Hat were so important – having an operating system based on Linux and a container architecture based on Kubernetes. This is an annual market opportunity of over $1 trillion for us, through IBM Consulting, to provide the middleware, infrastructure and the right skills to help our clients work and succeed in that environment.

From a computer-science lens, we have seen the immense importance of edge computing and, if you look further ahead, you will also see the heterogeneous nature of future architectures: microprocessor-centric, AI accelerator-centric and quantum-centric. Therefore, it is important to build a very heterogeneous, very distributed computational environment and ensure that it is designed and works properly.

Big data may be important when it comes to AI, but a lot of effort is also going into doing more with less data.

Yes, this is true. One extreme is the story of how you learn from large amounts of data – we’re talking about leveraging advances in self-supervision to be able to train large foundation models, and a good example is NLP (natural language processing). But the challenge our customers face with AI is that the data-science part of it – the data labeling and training pipeline – consumes 90% of the resources and a lot of time. Therefore, whatever we can do to reduce that is extremely important. Then there’s another vector – how do you learn naturally from a minimum of examples, with few-shot learning, and so on? This is an area where we invest a lot.
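As one illustration of learning from a minimum of examples (a generic prototype-style sketch on synthetic data, not IBM’s approach), a classifier can get by with a handful of labeled embedding vectors per class: average them into a class prototype and assign new points to the nearest prototype.

```python
# Toy few-shot classification: with only five labeled embedding
# vectors per class, average them into class 'prototypes' and label
# new points by nearest prototype. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Five labeled 8-dimensional examples per class -- the 'few shots'.
class_a = rng.normal(loc=0.0, scale=0.5, size=(5, 8))
class_b = rng.normal(loc=2.0, scale=0.5, size=(5, 8))
prototypes = {"A": class_a.mean(axis=0), "B": class_b.mean(axis=0)}

def classify(x: np.ndarray) -> str:
    """Assign x to the class whose prototype is nearest (Euclidean)."""
    return min(prototypes, key=lambda c: np.linalg.norm(x - prototypes[c]))

query = rng.normal(loc=2.0, scale=0.5, size=8)  # drawn near class B
print(classify(query))  # expected: 'B'
```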

Semiconductors are another important part of IBM Research. In May 2021, IBM announced that its second generation of nanosheet technology had paved the way for a 2 nm node. Please explain the importance of this development.

The topic of semiconductors has become a national and international priority today. I meet government leaders around the world, and politicians and citizens are now realizing the importance of semiconductors because they are literally in everything – cars, refrigerators, phones and computers. The semiconductor industry is a half-trillion-dollar industry and, by all accounts, it is going to double in size in the coming decade. To enable that growth, innovation and manufacturing capability must go hand in hand.
IBM plays a central role on the innovation side, creating new technology that enables manufacturers to bring that capability to the world at large. As an example, last year’s announcement of 2-nanometer technology is incredibly exciting because almost nothing is more impressive than the next generation of transistors (it allows up to 50 billion transistors to fit on a chip the size of a fingernail). We recently (in December 2021) announced the vertical-transport field-effect transistor, or VTFET – a design that aims to enable smaller, more powerful and more energy-efficient devices. Of course, we also use our expertise in semiconductor technology to build quantum computers.

What is your role as a member of the National Science Board?

The National Science Board is the governing board of the National Science Foundation (NSF) of the United States of America. It is a very important part of all basic-science funding. The characteristic of that funding is that it is driven by curiosity, not by application. It is about pushing the boundaries of mathematics, physics, chemistry and biology, and it is extremely important that we, as a society, defend and support the need for that kind of discovery. Without that investment, discovery takes decades longer.

Before we conclude, I’d love to have your thoughts on Web 3.0 and the Metaverse — two buzzwords that are currently taking the industry by storm.

I like to look at the foundations of these areas. On Web 3.0, it’s back to the story of computer science: it’s about how we actually build the next generation of distributed computational environments. We touched on this from a hybrid-cloud lens, but Web 3.0 complements that because it asks: how do you create a web architecture and a network architecture that is inherently distributed by design? This requires thinking about a lot of fundamentals – from the security dimension to the semantic nature of the relationships around it. The previous version (of the web) was about interactivity. Now it’s also about how we bring sensors together, the fact that we have computers everywhere, and how humans interact with all of it. So, it’s the next-generation architecture that I think is fundamental.

As far as the metaverse is concerned, I’m probably not the most qualified person to talk about it, [but] obviously it will be a hugely important way to expand how we entertain ourselves and collaborate. (But) I would really like the technology to also be oriented towards solving a wider set of problems.
