Quantum computing

Utility-scale quantum computing — advances and future challenges

At IRPS 2024, Dr. Rajeev Malik, Program Director, System Development, Benchmarking & Deployment, IBM Quantum, USA, presented a keynote on utility-scale quantum computing — advances and future challenges. He said the quantum computing timeline started in the 1970s with Charles Henry Bennett, and the field has been advancing and growing ever since. About five years ago, IBM introduced Quantum System One.

Quantum computers are novel hardware that changes the game. Several hard problems, such as factoring, need to be solved. The value of quantum computing becomes apparent as problems scale beyond what classical computers can handle. That's the goal!

Dr. Rajeev Malik.

We are seeing increasing utility in quantum computing. Any quantum computer can have errors, so error correction is being done even as circuit complexity increases. Scale, quality, and speed are the three key metrics for delivering quantum computing performance.

Scale involves the number of qubits. Quality is measured as error per layered gate. Quantum computers are noisy, and the error rates of two-qubit gates need to be lowered (to <0.1 percent). Speed is calculated in circuit layers per second. Individual gate operations need to complete in the <1µs range to have reasonable runtimes for real workloads. IBM is striving to bring useful quantum computing to the world, and make it quantum safe.
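A rough way to see how these three metrics interact is a back-of-the-envelope calculation. The sketch below uses illustrative numbers (they are assumptions, not IBM-published specifications) to estimate the success probability and runtime of a layered circuit from the error per layered gate and the per-layer gate time.

```python
# Back-of-the-envelope model of how the scale/quality/speed metrics interact.
# The numbers are illustrative assumptions, not IBM specifications.

def circuit_success(layers: int, error_per_layered_gate: float) -> float:
    """Rough probability that a circuit of `layers` layers runs error-free,
    assuming independent errors in each layer."""
    return (1.0 - error_per_layered_gate) ** layers

def runtime_seconds(layers: int, layer_time_us: float) -> float:
    """Sequential runtime if each layer completes in `layer_time_us` microseconds."""
    return layers * layer_time_us * 1e-6

# A 100-layer circuit with 0.1 percent error per layered gate, 1 us per layer:
p_ok = circuit_success(100, 0.001)
print(f"success probability ~ {p_ok:.2f}")                    # ~0.90
print(f"runtime ~ {runtime_seconds(100, 1.0) * 1e6:.0f} us")  # 100 us
```

The compounding shows why sub-0.1 percent two-qubit error rates matter: at 1 percent error per layer, the same 100-layer circuit would succeed only about 37 percent of the time.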

IBM quantum platform
He talked about the IBM quantum platform. The work spans from designing the processor to how users are going to use the system. Qiskit is the software development kit, followed by the tools that run the systems and the users' work.

A quantum circuit is the fundamental unit of quantum computation. It is a computational routine consisting of coherent quantum operations on quantum data, such as qubits, and concurrent real-time classical computation. Relevant problems need hundreds to thousands of qubits and a million gates or more to solve.
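To make "quantum circuit" concrete, here is a minimal statevector sketch in plain Python that prepares a two-qubit Bell pair with a Hadamard followed by a CNOT. It is a toy illustration of the concept only, not Qiskit or any part of IBM's stack.

```python
# Toy statevector simulation of a 2-qubit circuit (H on qubit 0, then CNOT).
# Illustrative only -- real workloads use SDKs such as Qiskit, not this sketch.
import math

def apply_h(state, q):
    """Apply a Hadamard gate to qubit q of a statevector (qubit 0 = LSB)."""
    s = 1.0 / math.sqrt(2.0)
    out = [0.0] * len(state)
    for i, amp in enumerate(state):
        if amp == 0.0:
            continue
        i0 = i & ~(1 << q)   # basis index with qubit q cleared
        i1 = i | (1 << q)    # basis index with qubit q set
        out[i0] += s * amp
        out[i1] += s * amp if not (i >> q) & 1 else -s * amp
    return out

def apply_cx(state, control, target):
    """Apply a CNOT: flip `target` wherever the `control` bit is 1."""
    out = [0.0] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << target) if (i >> control) & 1 else i
        out[j] += amp
    return out

state = [1.0, 0.0, 0.0, 0.0]   # |00>
state = apply_h(state, 0)      # (|00> + |01>) / sqrt(2)
state = apply_cx(state, 0, 1)  # Bell pair (|00> + |11>) / sqrt(2)
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)                   # [0.5, 0.0, 0.0, 0.5]
```

Measuring this state yields 00 or 11 with equal probability, the hallmark correlation of entanglement. A full statevector doubles in size with every added qubit, which is why circuits with hundreds of qubits and a million gates are out of reach for classical simulation.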

The IBM Quantum Network is mapping interesting problems to quantum circuits. The IBM Quantum Development Roadmap is to run quantum circuits faster on quantum hardware and software. Different technologies are being used to build qubits, including photons, ions, solid-state defects, nanowires, neutral atoms, and superconducting circuits.

An IBM quantum system has a dilution refrigerator with cryogenic interconnects and components, including the processor. It has custom third-generation room-temperature electronics for control and readout. Classical co-compute servers enable the Qiskit Runtime to execute use-case workloads efficiently.

The IBM quantum roadmap includes demonstrating quantum-centric supercomputing in 2025, scaling quantum computers by 2027, and delivering a fully error-corrected system in 2029. In 2030+, IBM aims to deliver quantum-centric supercomputers with thousands of logical qubits. Beyond 2033, quantum-centric supercomputers will include thousands of qubits capable of running 1 billion gates, unlocking the full power of quantum computing.

From Falcon with 27 qubits in 2019, IBM moved to Osprey with 433 qubits in 2022. QPUs are being scaled, with Osprey using scalable I/O. IBM's advanced device stackup includes an interposer chip, wiring plane, readout plane, and qubit plane. Many technologies are borrowed from semiconductors. In 2023, IBM released Condor, pushing the limits of scale and yield with 1,121 qubits. IBM also released Heron with 133 qubits, with I/O complexity on par with Osprey. IBM aims to extend Heron to Flamingo and Crossbill via modular coupling.

Accelerated timescale
We are now upgrading everything, and development is on an accelerated timescale. Processors, connectors, and control electronics are getting updated every 12-18 months. Density, power, and cost remain key focus areas. There is migration from discrete to integrated on-board components, and of control electronics to 4K temperatures or cryo-CMOS. Cryogenic components are showing longer reliability. We are looking at predictability, availability, and stability of the system. We have achieved system uptime of over 97 percent, with systems over 90 percent available to run jobs.

There are deployments at Fraunhofer, Germany, the University of Tokyo, Cleveland Clinic, and PINQ2, Canada, with more pending at Yonsei University, Seoul, Riken, Japan, Ikerbasque, Spain, etc., plus 40 innovation centers. We need a disruptive change to unlock the potential of quantum computation. Multiple systems were in place by H2-2023, with more to come.

Integration of disruptive technologies, HPC systems needed

ETP4HPC organized a conference on emerging technologies for HPC in Europe. The speakers were Prof. Dr. Estela Suarez, Research Group Leader, Jülich Supercomputing Centre, and Prof. Dr. Kristel Michielsen, Group Leader, Research Group, Quantum Information Processing, Jülich Supercomputing Centre, Germany.

They first looked at HPC system architecture aspects and managing heterogeneity. How are we managing heterogeneity in HPC? Causes for this trend include hardware, where we need larger, more energy-efficient systems, and applications, where we can combine different codes into complex workflows.

Heterogeneity is everywhere! This includes processing, memory, network, and paradigms — be it classical, neuromorphic, or quantum computing. Traditional barriers are dissolving. Computing is no longer only in in-node processors (CPU, GPU). We are computing in the network with DPUs and IPUs. We also have processing in memory, and network-attached memory.

System architecture can enable the integration of new technologies. Resource management offers dynamic orchestration and malleability, and is key to achieving effective resource utilization. The software stack must master the hardware and foster performance portability. We have standardized programming models, smart compilers, runtime and workflow systems, and debugging and performance tools.

Applications face challenges, but heterogeneity also opens new opportunities. There can be novel workflows and novel mathematical formulations. New application features are combining HPC and AI. Heterogeneity must be tackled by a coordinated effort at all levels.

We can build a modular supercomputing architecture (MSA). MSA has been evolving since 2011, with the DEEP projects. We are now moving to the exascale JUPITER system in 2024.

We are looking at heterogeneity at the chip-level. We can integrate the different dies (chiplets) in the same package at lower cost, facilitate diversity with chiplets chosen based on customer, and have short connections with lower latency, better bandwidth, and lower power consumption.

Integration of disruptive technologies needed
System complexity grows with heterogeneity. It is hard to predict the performance of a given application. Understanding dependencies is required for hardware/software co-design. Energy efficiency is as important as performance.

Qualitative and quantitative evaluation of HPC systems is needed. This includes large (Exa-)scale system modelling, end-to-end performance models of applications based on system metrics, and the use of accelerators and techniques like ML/AI. Research is needed in heterogeneous hardware and application modelling.

There is the integration of disruptive technologies. The end of Moore's Law calls for more disruptive solutions, such as ASIC-based solutions, neuromorphic computing, and quantum computing. A sudden, full exchange of technologies is not feasible. Not every application is suitable for these approaches, and even those that are would need to be rewritten.

Integration with ‘traditional’ HPC technologies is also needed. Applications can run on the HPC system and offload suitable functions to disruptive devices. There is gradual, step-by-step adaptation to innovative devices. Integration requires both hardware and software solutions.

Neuromorphic computing presents scenarios such as intelligent edge co-processors for distributed cross-platform edge-cloud-HPC workflows. AI inference and training is at the edge. Also, data movement is minimized.

Another is datacenter co-processors / accelerators for AI / ML training and inference at scale. We can have inference for HPC-AI hybrid workloads, and training for AI algorithm (spiking neural network, back-propagation).

In quantum computing, we can have integration of quantum computers and simulators (QCS) in HPC systems at the system level, with loose and tight models. It can also be at the programming level with a full hardware-software stack, and at the application level with optimization, quantum chemistry, and quantum ML.

There can be application-centric benchmarking, with tests for the algorithm, the software stack, and the technology. We can also emulate QCS with HPC systems. This can lead to ideal and realistic QCS, and to designing, analyzing, and benchmarking QCS and quantum algorithms.
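The limits of emulating QCS on classical HPC are easy to quantify: a full double-precision statevector needs 2^n complex amplitudes. The sketch below is a standard estimate of that memory wall, not a figure tied to any specific system.

```python
# Memory needed to emulate an ideal n-qubit QCS as a full statevector:
# 2**n complex amplitudes at 16 bytes each (double-precision complex).

def statevector_bytes(n_qubits: int) -> int:
    return 16 * (2 ** n_qubits)

for n in (30, 40, 50):
    gib = statevector_bytes(n) / 2 ** 30
    print(f"{n} qubits -> {gib:,.0f} GiB")
# 30 qubits fit on a workstation (16 GiB); 40 need ~16 TiB;
# 50 need ~16 PiB -- beyond the memory of any single HPC system.
```

This exponential growth is exactly why emulation is valuable for designing and benchmarking small, realistic QCS, while larger circuits demand real quantum hardware.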

Heterogeneity in HPC
There are several challenges. It is hard to efficiently share resources, and maximize utilization. It is hard to identify sources of performance loss and optimization opportunities. We also have to maintain performance portability. It is still difficult to understand and predict performance.

We also have several opportunities. There can be better energy efficiency, ability to select ideal hardware for each part of the application, wider range of providers and technologies, away from monopolistic scenarios, and development and integration of disruptive technologies with potentially better performance and energy efficiency. We can also see the real impact of co-design.

Quantum computing: From hype to game changer!

At 2023 Symposium on VLSI Technology and Circuits, Japan, Hiroyuki Mizuno, Distinguished Researcher, Hitachi, Ltd and Fellow, IEEE, presented on quantum computing: From hype to game changer!

Investment in quantum computers is heating up. There is concern about the hype. Quantum computers have experienced a “winter period” in the past. It happens when technological advances failed to keep pace. Strong public interest is a double-edged sword for development.

Markets are very complex. It is not only the consumer market. Investment heating does not necessarily mean that it will immediately deliver value to the consumers or society. The gap between investment and return to consumers results in hype.

Derivative services have been created by the development of quantum computers, such as quantum-inspired annealing. The currently available services include noisy intermediate-scale quantum computers (NISQ), intermediate-scale quantum annealing, and quantum-inspired annealing. The goals are gate-type QC and quantum annealing.

Quantum-inspired annealing (CMOS annealing) is a classical computing method that uses the mechanism used in quantum annealing. Quantum computers can efficiently simulate quantum behavior, while current classical computers are limited in their ability: as the size of the problem increases, computational resource requirements increase exponentially.
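As a concrete sketch of what an annealing-style solver does, the toy below runs classical simulated annealing on a three-spin Ising model. It illustrates the general idea behind quantum-inspired annealing only; it is not Hitachi's CMOS annealing implementation, and all parameters are assumptions.

```python
# Toy classical annealer for a tiny Ising model -- the general idea behind
# quantum-inspired (CMOS) annealing, not Hitachi's actual implementation.
import math
import random

def ising_energy(spins, couplings):
    """E = -sum_{i<j} J_ij * s_i * s_j, with spins in {-1, +1}."""
    return -sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())

def anneal(couplings, n_spins, steps=5000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n_spins)]
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n_spins)
        before = ising_energy(spins, couplings)
        spins[i] *= -1                                     # propose a spin flip
        delta = ising_energy(spins, couplings) - before
        if delta > 0 and rng.random() >= math.exp(-delta / t):
            spins[i] *= -1                                 # reject uphill move
    return spins, ising_energy(spins, couplings)

# Ferromagnetic triangle: the ground states are all-aligned, with energy -3.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
spins, energy = anneal(J, 3)
print(spins, energy)
```

This toy has only 2^3 = 8 configurations; real optimization problems grow exponentially in the number of spins, which is why dedicated annealing hardware, and eventually quantum annealers, are of interest.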

The goal is to build a fault-tolerant quantum computer (FTQC). It needs a large number of qubits, as well as high fidelity of qubit operation and a quantum error correction mechanism. The challenge is that the path to the goal has not yet been drawn. No solid device structure has yet been fixed for which we can say this is the answer. The same is true for architecture.

The path to FTQC includes silicon qubits, which inherently enable a large number of qubits to be implemented on a single chip. At present, QuTech's six-qubit implementation and operation is the champion. Many innovations (new approaches) are needed to achieve the goal in the future.

NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications. However, the 100-qubit quantum computer will not change the world right away. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.

Riken has successfully demonstrated the fundamental properties of three-qubit quantum error correction in silicon. The challenge is to scale this operation in a small number of qubits to work with a large number of qubits.

We are taking a “top-down” approach to maximize the use of silicon IC technology. Firstly, we integrate a large number of qubits, and then improve the “quality” of these qubits with various technologies (big data analysis, digital twin, etc.).

Next milestone
What is the next milestone for silicon quantum computers? The array structure is a specialty of silicon semiconductor technology, where it is commonly used. The next milestone is to form a qubit array structure, and then perform initialization, operation, and readout of the qubits in it.

A QCMOS process has been developed, where the walls of the quantum dots are formed by MOS structures, and the quantum dots form self-aligned between them. Mature semiconductor technology of about 65nm can achieve the proper distance between qubits. Qubit arrays and CMOS circuits can be integrated on a single chip.

Initialization of large numbers of qubits is required. In our approach, we use a more tolerant method based on a single-electron pumping technique that stores electrons one by one. Even if a quantum dot allows multiple electrons to enter, this method allows only one electron into every quantum dot.

Crosstalk between the qubit to be operated and the neighboring qubits is a major concern. Our idea is to separate the electron from the quantum dot, meaning a “qubit” can move (shuttle), or shuttling qubit.

For readout, high-sensitivity high-frequency reflectometers are widely used. The challenge is that they are difficult to scale to large qubit integration. An nA-sensitivity integrated sensor enables qubit-array readout, much like an SRAM/DRAM sense amplifier. We are also using cryo-CMOS circuits, which implement a high-precision control system inside the dilution refrigerator using two types of chips. We are currently evaluating these in detail. In our approach, a quantum computer is a control system that controls the state of a large number of qubits, i.e., a cyber-physical system.

A quantum operating system (OS) has a model (digital twin) of the qubits to be controlled and their operating environment, and provides feedback control. It provides execution and development environments for different kinds of users: system O&M staff, quantum physicists, algorithm researchers, and application developers.

GPT-3 is a prediction machine for the next token. It requires pre-training of the model, which contains about 175 billion parameters, with about 45TB of text data. 3,640 petaflop/s-days of computing power are required, which would occupy Google's 10-PF third-generation TPU Pod for about 364 days. Along with huge amounts of data, outstanding computing power is the key.

More accurate and efficient predictions are made by combining deductive and inductive methods. Software systems suitable for quantum computers are still under development. The key point is that it is a different system from classical computers.

We tend to do a “replacement of the classical by the quantum,” to try to utilize the past as much as possible. We should not only think about improving the assets developed for classical computers; we need to think differently for quantum computers.

European initiative of building federated quantum technology pilot line

European initiative of building a federated quantum technology pilot line in Europe and its strategic importance was presented by Mika Prunnila, Prof., VTT Technical Research Centre of Finland Ltd.

The Qu-Pilot line is now being built across Europe. There is a significant gap between hardware R&D and commercialization, and piloting requires significant investments. A European pilot infrastructure is essential for the future success of European companies in the quantum race.

The Qu-Pilot FPA was formed and accepted by the EC. The SGA1 kick-off will be in April 2023, with an EC grant of €19 million. There are 21 partners: 11 use-case companies, eight RTOs, and two service providers. VTT is the coordinator. Qu-Pilot is connected to the Quantum Flagship, and comprises multiple platforms — superconducting, photonics, semiconducting, and diamond.

There is also the quantum testbed network, or Qu-Test. It will improve facilities for the test and qualification of quantum devices, test devices and sub-systems provided by industry, and harmonize procedures and methodologies among the members of the consortium. There are 25 partners involved: 12 companies, 11 RTOs, and two service providers. TNO is the coordinator, with a total budget of €19 million.

There are expected outcomes of the Qu-Pilot. The superconducting pilot line has 3D-integrated quantum processors, qubit control, and a readout component pilot for up to 50 qubits. The photonics pilot line has hybrid-integrated, quantum application-specific photonic ICs operating in visible to infrared wavelengths. The semiconducting pilot line covers development of pilot fabrication of semiconductor quantum processors in the range of <10 qubits.

The diamond pilot line will increase diamond growth capacity, and develop processes for pilot fabrication of nanofabricated diamond devices.

The European quantum microsystem pilot-line development activity (Qu-Pilot) is starting. It contributes to gaining European sovereignty in deep-tech and to the EU Chips Act. We must invest in quantum infrastructure in order to stay ahead in the global quantum chip race.

Semiconductor industry: Beachheads that drive innovation!

Semiconductor industry: Beachheads that drive the innovation, was presented by Guillaume Girardin, CTO, Yole Group, at ISS Europe 2023.

Semiconductors are in the headlines today! Semiconductor revenues are declining in 2023. Main factors include weak demand, the memory crash, inventory, the Ukraine-Russia war, and uncertainty.

We are witnessing production overcapacity and a lack of qualified resources (temporary and long-term). Logistics is still under pressure. Geopolitics, the reshuffling of worldwide production sites, and supply chain changes are happening. Inflation (raw materials, energy, etc.) is probably stabilizing or declining.

We also have unbalanced demand (EV, data center, etc.). Semiconductor content per system is growing at a slower pace than usual (people are more reluctant due to uncertainty). The cost of semiconductors is rising (Moore's Law is under pressure). Overall, demand for end-systems is decreasing. Will 2023 be the bottom before new growth?

2023 looks like the worst year. Will growth be back in 2024? Yole estimated 2022 semiconductor total market growth of +2 percent. Inflation lasted longer than expected, and consumer demand weakened more than expected (smartphones fell below 1.2 billion units) in H2-2022. Tensions across the supply chain eased better than expected. The demand bottom should be reached by Q3-2023. 2023 is expected to be a tough year.

Yole expects a 2023 semiconductor total market decline of ~9 percent. Increasing trade war/tensions between the US and China, weak GDP growth, and inflation lasting longer than anticipated fuel a moderate global economic recession. There is a weakening of the consumer demand that supports 60 percent of semiconductors.

The memory market has been crashing since Q3-2022, and the slump will be ongoing till Q3-2023. The power market is fighting the semiconductor recession, thanks to electrification. The processing market is surfing on a sluggish consumer market.

Global technology roadmap
Let us look at the global technology roadmap. Moore and beyond is moving from information to interaction and transformation. Beyond Moore is actuating the transformation age. We are witnessing the evolution of compound semiconductors.

There is movement to GaAs VCSELs for 3D sensing, and GaAs microLEDs. We are seeing growth of InP for handset sensing, LiDAR, and InP-based RF components. In GaN, we have GaN handset PAs, and GaN in EVs. We are seeing SiC in 800V EVs, and high-voltage apps. The compound semiconductor substrate market is estimated at $2,349 million by 2027. It includes power, RF, photonics, display and LED, and others.

Compound semiconductors
Among CS substrates, n-type SiC for power, SI for RF GaN, InP, and silicon for power GaN applications are expected to show double-digit CAGRs in the 2021-27 period. While GaAs RF and LEDs represent stable markets, miniLED/microLED bring momentum to the industry.

There is also an opportunity for semiconductor packaging, creating business opportunities for advanced and traditional packaging platforms. The advanced packaging (WLPs, flip-chip, TSV) business opportunity is supported by AI/ML, mobile, AR/VR, 5G, smart automotive, etc. The traditional packaging (wire-bond, lead-frame based) business opportunity is created by IoT, Industry 4.0, smart automotive, etc. Mega trends like mobile, automotive, Industry 4.0, IoT, etc., require a variety of MEMS sensors that create opportunities for traditional and advanced packaging.

Megatrends create business for a variety of ICs — from high-end xPUs to low-end discretes. These include CPUs, GPUs, SoCs, APUs, FPGAs, ASICs, DSPs, and MCUs. There are opportunities in MEMS/sensors, power ICs/discretes, memory, optoelectronics, etc. Front-end scaling slows, but continues! Moore's Law keeps going, at a slower pace.

The high-end fab business continues growing (5nm and below). More heterogeneous integration is required to support functionality, faster time to market, and low cost. More than Moore has seen a resurgence of legacy fabs and increasing business. Demand for 8-inch wafer fabs and related tools will remain strong.

In the last two years, top packaging players have invested over one-third of the market value. In 2022, advanced packaging capex is expected to reach $16 billion, which is 26 percent higher than the previous year. It represents almost 44 percent of total advanced packaging revenues for the year. This represents a low-margin business for OSATs. So, there are a few questions to answer. Will this huge capex growth be sustainable in the next few years? Is there enough equipment to sustain these capex increases?

Silicon photonics
Now, let us look at the global trends and roadmap for Si photonics vs. InP. Pluggables, OBO, and CPO will co-exist for some time. SiPh provides pluggables with advancements via integration. CPO will leverage SiPh to further increase the levels of integration for optical engines. 224G SerDes speed will be challenging for pluggable optics. It is very difficult to continue with pluggable form factor for 3.2T optical module.

The silicon photonics die market is expected to reach $740 million by 2027, from $152 million in 2021. The dies can be used for data center transceivers, long-haul transceivers, 5G transceivers, co-packaged engines, immunoassay, fiber-optic gyroscopes, automotive LiDAR, photonic processing, and optical interconnects.

Photonics and quantum computing
We are witnessing that photonics and quantum computing need each other. Photonics will be of great importance to the development of quantum technologies, as lasers and other photonic devices are used for trapped ion/photon/neutral atom technologies. By 2030, we will have megatrends such as 6G, autonomous vehicles, moon/Mars society, quantum, space, AR/VR, robotics, green technology, etc.

The semiconductor market is unlikely to reach a trillion-dollar market value by 2030. However, consumer, automotive, and data center markets might be the three pillars that support growth toward an $800+ billion milestone by 2030.

While demand is weakening in 2023 due to market correction and economic factors, Yole estimates that opportunities across the whole semiconductor supply chain will arise over the next 10 years. Multiple technologies are supporting this, from emerging to maturing ones, and they will reshape most of the industry within 10 years. From compound semiconductors and silicon photonics, to advanced packaging for future computing, to quantum technologies maturing and seeking hybridization, the dawn of a new technology disruption is on its way.


Europe’s semiconductor growth: Drivers and opportunities
Europe’s semiconductor growth: Drivers and opportunities, was presented by Ms. Rebecca Dobson, Corporate VP, EMEA, Cadence, at ISS Europe 2023.

Generational drivers are pushing semiconductor growth and expanding the market. These are around 5G, autonomous vehicles, hyperscale computing, AI/ML, and IIoT. Data is driving a silicon renaissance. 165ZB of data will be generated this year. There has been a 5,000 percent increase in data interactions over the past decade. This includes >80 percent unstructured data. Also, less than 2 percent of data will be analyzed!

There is growing semiconductor content in systems. Systemic growth drivers are providing long-term tailwinds. These include More than Moore, design complexity, system companies building silicon, More Moore, domain-specific architectures, 3D-ICs, digital transformation, and shift-left paradigm.

There are several opportunities for Europe. Europe's strengths are established and stable global players, an experienced and mature workforce, strengths in automotive, 5G, and Industry 4.0, and the European CHIPS Act.

Europe is now driving semiconductor design growth. We are seeing increasing scale of investment, Europe is taking a market-leading position in autonomous vehicles, and is attracting and retaining top talent.

There are challenges in supporting the European startup ecosystem. Seed and A-round funds are often small. European startups are often slow in scaling. There is a lack of major players investing in startups, mentorship, and money. Many are acquired by non-EU companies. More rigor through legislation on appropriate acquisitions is needed.

Europe is taking a market-leading position in autonomous vehicles. This includes growth in infotainment, automotive Ethernet, ECUs, ADAS, and functional safety. Europe is also investing in and retaining top talent. It is stimulating interest in semiconductors. Government investment in schools and university programs is ongoing. There is ecosystem expansion and collaboration, and diversity in hiring requirements.

Europe needs to be harnessing opportunities for growth. Collaboration is the key!

Superconducting quantum computers and synergies with semiconductor industry

Dr. Kuan Tan, CTO and Co-founder, IQM, presented the superconducting quantum computers and synergies with the semiconductor industry at the Future of Computing session, Semicon Europa 2022.

There is a class of commercial problems that can never be solved with classical computers. Quantum computers can tackle these intractable use cases with speed and efficiency, creating the new industries of tomorrow.

Quantum computing will partly replace the $40 billion HPC market. 76 percent of global HPC centers will use quantum computing by 2023. HPC centers must scale their computing power to stay competitive. Next-generation HPC hardware is too expensive, has too large a footprint, and consumes too much power. Even with the best next-gen HPCs, some problems are still intractable. Quantum computing is the most promising alternative to the CPU and GPU clusters currently used in HPC centers.

Possible sustainability use cases include ammonia-based fertilizer, logistics intelligence improvement, energy grid optimization, and battery simulation. Espoo, Finland-based IQM is the European leader for quantum, with offices in Munich, Madrid, and Paris.

IQM has a business strategy to accelerate innovation. We are building full-stack systems and delivering them to HPC centers. We will re-invest the revenue to critical parts of the value chain.

IQM's strategic value chain includes design of QC chipsets, their production, system and HPC integration, and producing innovative quantum ASICs. We are covering the whole value stack, which allows for unique hardware-software co-design. Over 50 patent families have been filed for strategic QC components and implementations to ensure freedom to operate (FTO). IQM owns key IP for cryogenic control of quantum processors.

There is a strategic advantage to a private quantum foundry in Europe. Among the drivers: quantum processors are too complex for public or university cleanrooms, there is low availability and a chip shortage, and it enables efficient IP generation. As for advantages, it accelerates the design and production cycle, guarantees production stability, provides know-how generation and the possibility of a foundry service, and reinforces the in-house developed quantum design automation.

KQCircuits is IQM's open-source quantum circuit design platform. It provides seamless and automated superconducting circuit design based on primitive elements, along with wafer-scale photomask generation support. The same technology is used by IQM to generate QPUs. There is active development and support from IQM design and software teams.

There is a growing need for quantum computers in HPC centers today. Very soon, we expect upside potential from the disruptive quantum application market.

The global HPC market TAM is estimated at over €45 billion in 2024 and growing. EuroHPC alone is an €8 billion market. In Europe alone, 68 HPC projects were ongoing in 2021. 76 percent of global HPC centers will adopt quantum computing by 2023 (on-prem and cloud), and 71 percent will invest in on-premises solutions by 2026. IDC forecasts the worldwide quantum computing market to grow to $8.6 billion in 2027. This market might expand up to $1 trillion in total by 2035.

Quantum computing is here! Opportunities and synergies for closing the quantum innovation cycle are in quantum design automation, foundry business and services, system integrators and HPC, and disruptive potential in pharmaceuticals, optimization, chemistry, AI, financial applications, etc.

Satellite quantum key distribution moving to industrialization phase!

European Photonics Industry Consortium (EPIC) recently hosted a meeting on satellite quantum communications.

Within the panorama of quantum technologies, quantum communications hold a prominent place. A satellite-based system is one of the most effective strategies to ensure quantum-secure key distribution at very long distances and the realization of a large-scale quantum network. Such technology pushes the photonics industry to its limits by incorporating state-of-the-art laser systems, optics, and photonic detection schemes.

Euro QCI in place
Laurent de Forges de Parny, Quantum System Engineer, Thales Alenia Space, said they are the satellite provider of the Thales Group. It is working on two projects — QKD and QIN. Europe is extremely concerned about the safety of communications, and is the initiator of EuroQCI, the European quantum communications infrastructure. SAGA, the space segment driven by the European Space Agency (ESA), is related to this project. Thales Alenia Space is involved in quantum key distribution (QKD) system design and building-block development, LEO and GEO QKD — both prepare-and-measure and entanglement-based — and quantum information networks (QIN).

Thales is working on quantum cryptography and the quantum Internet. One can use satellites to perform quantum key distribution. For space-based QKD systems, we are encoding quantum information at single-photon levels. We have a classical channel (bi-directional) to correct errors. We are using photonic chips to reduce satellite size. For space-based QKD and QIN systems, we are exploiting entangled photon pairs. We need pump laser sources for the systems, and other components. We are looking for partners to join our projects, including SMEs, especially European providers. QKD chips are an interesting area for us. Thales is collaborating with French and other experts.
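The role of the classical channel alongside the quantum one can be illustrated with a toy BB84-style sifting simulation. This is a generic textbook sketch of prepare-and-measure QKD, not Thales Alenia Space's actual design, and the photon counts are arbitrary.

```python
# Toy BB84-style prepare-and-measure sifting -- illustrates the role of the
# classical channel in QKD; not Thales Alenia Space's actual design.
import random

def bb84_sift(n_photons, seed=42):
    rng = random.Random(seed)
    alice_bits = [rng.randint(0, 1) for _ in range(n_photons)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_photons)]  # 0=rect, 1=diag
    bob_bases = [rng.randint(0, 1) for _ in range(n_photons)]
    # Bob's measurement yields Alice's bit when bases match, random otherwise.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # Over the public classical channel, only basis choices are compared;
    # positions where the bases match form the sifted key.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)
             if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
             if ab == bb]
    return key_a, key_b

key_a, key_b = bb84_sift(1000)
print(len(key_a), key_a == key_b)  # roughly half the photons survive sifting
```

In a real link, a sample of the sifted key is also compared over the classical channel to estimate the error rate (and so detect eavesdropping) before error correction and privacy amplification produce the final secret key.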

Quantum entanglement communications
Dr. Emmanuel Fretel, Project Manager, Aurea Technology, talked about quantum entanglement communications from ground to space. Aurea is an EU-27 and ESA-member deep-tech SME, with a strong industrial and high-level quantum background. It is a leading provider of QKD sub-systems for ground and space ultra-secure QKD communications, and has new entangled photon sources for space apps. In 2021, there was the largest QKD network in the EU 27, paving the way for EuroQCI, among Italy, Slovenia, and Croatia. The as-the-crow-flies distance was up to 100km between cities.

The SAGA mission has a QCS with pan-European reach for ESA missions, with a quantum communication link in the 1550nm C band. We have used a time-energy entangled photon source. TRL4 has been achieved and demonstrated to ESA. A full QKD test bench is available on demand. There are ground and space segments for industrial QKD. For space, there is a new entangled photon source at 1550nm (TRL4). We can improve the TRL, and perform outdoor terrestrial and space demonstrations.

Some other players
Nicephore Nicolas, Business Development Manager, iXblue, noted that there are quantum solutions for satellite communications. In October 2022, ECA Group and iXblue joined forces and became eXail, a global industrial high-tech champion. iXblue has photonics solutions, from components to instruments. It has continuous-variable QKD, where information is encoded in the amplitude and phase of laser pulses. For QKD, there is one specific amplitude modulator per wavelength, DC-coupled versions for RAM and pyroelectric effect mitigation, etc. iXblue has developed a versatile driver, and it has additional optical components for efficient coherent transmission and encryption.
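Continuous-variable QKD, as described here, modulates both quadratures of each laser pulse. A minimal sketch of Gaussian modulation follows; it is a generic textbook illustration, not iXblue's actual modulation format, and the variance value is an arbitrary assumption.

```python
import math
import random

def cv_qkd_symbols(n, var=1.0, seed=1):
    """Toy continuous-variable QKD modulation: each pulse carries a
    coherent state whose quadratures (x, p) are drawn from a
    centered Gaussian of the chosen variance."""
    rng = random.Random(seed)
    sd = math.sqrt(var)
    return [(rng.gauss(0, sd), rng.gauss(0, sd)) for _ in range(n)]

def to_amp_phase(x, p):
    """Convert quadratures into the amplitude and phase that would
    actually be driven onto the optical modulators."""
    return math.hypot(x, p), math.atan2(p, x)

symbols = cv_qkd_symbols(100)
amp, phase = to_amp_phase(*symbols[0])
```

The receiver recovers the quadratures by coherent (homodyne or heterodyne) detection, which is why the amplitude modulator and driver quality matter so much.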

iXblue has flight-proven optical solutions for space apps. We are moving from components to complete systems. Space-grade versions are available as COTS or as custom builds. It has the industrial capability to integrate space-grade systems. It is looking for additional components, such as photodiodes, space-grade electronics, etc.

Malcolm Humphrey, VP and CTO, PLX, said they offer monolithic invariant optical assemblies for laser system apps. It is introducing MOST, or monolithic optical structure technology. AMCS, or alignment monitoring and control system, is an alignment instrument for the advanced topographic laser altimeter system (ATLAS) aboard the ICESat-2 satellite, which is on an active mission. For the Japan Aerospace Exploration Agency (JAXA), in 2020, PLX worked on the laser communication system (LUCAS) to enable data relaying between LEO and GEO satellites via optical communications.

PLX’s novel active beam steering technology delivers complete laser scanning systems for target tracking and metrology apps. It is looking for partners to develop beam steering technology for space apps.

Martin Wölz, Product Manager, Quantum Communications, Jena-Optronik GmbH, said they are focusing on attitude and orbit control system (AOCS) sensors, such as star, rendezvous, and docking sensors, and optoelectronic subsystems, etc. In a satellite QKD payload, the photon source is on the satellite. Single-photon time tagging can be engineered from TRL2 to TRL4. A commercial IC is not available as there is no mass market. A bottom-up IC design was financed by the research project QSource, funded by the German Federal Ministry of Education and Research (BMBF). A breadboard is expected in 2023. European QKD will start from 2026. We are at TRL5, and will move to TRL9 by 2026-27.

Satellite QKD is now moving to the industrialization phase. QKD systems developed by Jena-Optronik include a QRNG, VCSEL FPS, modulator driver, single-photon time tagging, etc. We are now looking for new photonic devices for space qualification, such as active polarization compensation. QKD mission timelines are competitive and require industry expertise in design and verification for space.
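Single-photon time tagging ultimately feeds a coincidence search between detector channels. A simplified software sketch of that matching step is shown below; real time-tagging units do this in hardware at far higher rates, and the window value here is an illustrative assumption.

```python
def coincidences(tags_a, tags_b, window):
    """Count coincident detection events between two sorted
    single-photon time-tag streams (timestamps, e.g. in
    picoseconds) that fall within a coincidence window.
    Classic two-pointer sweep over both streams."""
    i = j = count = 0
    while i < len(tags_a) and j < len(tags_b):
        dt = tags_b[j] - tags_a[i]
        if abs(dt) <= window:     # events close enough -> one coincidence
            count += 1
            i += 1
            j += 1
        elif dt > 0:              # stream A lags -> advance A
            i += 1
        else:                     # stream B lags -> advance B
            j += 1
    return count

# e.g. entangled-pair detections with small relative jitter
a = [100, 250, 400, 900]
b = [102, 260, 401, 700]
print(coincidences(a, b, window=5))  # -> 2 (pairs 100/102 and 400/401)
```

The coincidence rate relative to the singles rates is the basic health metric of an entanglement-based link.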

Costantino Agnesi, Scientist and Product Developer, ThinkQuantum Srl, said it is a spin-off from the University of Padua, Italy. It develops QKD and quantum random number generation systems, through to the design and commissioning of tailored solutions. It offers design capabilities for satellite payloads and the ground segment for quantum communication systems.

ThinkQuantum offers development of optical ground stations for the reception and decoding of satellite-transmitted quantum states. It is a partner for developing QKD solutions for space. Its quantum state generator offers high quality in terms of robustness and compatibility with BB84 and other quantum protocols. It offers the iPognac polarization encoder, a high-performance qubit source for QKD, with a space-qualified version under development. ThinkQuantum also offers QKD and QRNG systems commercially.
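QRNG output is typically post-processed to remove residual bias before use. The textbook von Neumann extractor below illustrates the idea; it is a generic sketch, not ThinkQuantum's actual post-processing pipeline.

```python
def von_neumann_extract(raw_bits):
    """Von Neumann debiasing: consume raw bits in non-overlapping
    pairs, emit 0 for the pair (0,1) and 1 for (1,0), and discard
    (0,0)/(1,1). For independent but biased bits, the output is
    unbiased, at the cost of throughput."""
    out = []
    for i in range(0, len(raw_bits) - 1, 2):
        a, b = raw_bits[i], raw_bits[i + 1]
        if a != b:
            out.append(a)  # (0,1) -> 0, (1,0) -> 1
    return out

print(von_neumann_extract([0, 1, 1, 1, 1, 0, 0, 0]))  # -> [0, 1]
```

Production systems generally use stronger randomness extractors, but the principle of trading raw rate for output quality is the same.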

Dr. Ivan Nikitski, Photonics Technology Manager, EPIC, was the moderator.

A look into future of computing in 2040!

Posted on

SEMI, USA, organized a seminar today on a look into the future of computing in 2040. Ms. Bettina Weiss, Chief of Staff & Corporate Strategy, SEMI, welcomed everyone.

Dr. Alessandro Curioni, IBM Fellow, VP, IBM Research Europe & Africa, and Director, IBM Research Europe – Zurich, IBM, talked about the importance of the future of computing effort for the industry. There is a ‘Future of Computing’ think tank, with scope for a top-down end-user perspective, a bottom-up approach, and considerations for classic computing, computing for AI, and quantum computing.

Jim Sexton.

Progress report
Jim Sexton, IBM Fellow, Department Group Manager, Data Centric Systems, IBM, presented the progress report of the think tank. There are industry trends driving change. Economic and geopolitical trends include increasing dependence on silicon design and fabrication. There was a supply chain crisis following the pandemic. We have had responses, such as the US CHIPS and Science Act, the EU Chips Act, and measures from Japan and Korea.

From the EU chips survey, chip demand is expected to double between 2022 and 2030, with a significant increase in future demand for leading-edge semiconductors. Companies are establishing new chip fabrication facilities. The current supply crisis is expected to last till 2024, forcing companies to adopt costly mitigation measures.
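As a quick sanity check on the doubling figure above, the implied compound annual growth rate can be computed directly:

```python
# Demand doubling between 2022 and 2030 spans 8 years, so the
# implied compound annual growth rate is 2**(1/8) - 1, about 9%.
years = 2030 - 2022
cagr = 2 ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> 9.1%
```

Roughly 9 percent per year, sustained for a decade, is what the fab build-out plans have to absorb.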

The US CHIPS and Science Act, worth $54 billion, supports building, expanding, and modernizing US facilities and equipment for semiconductor fabrication, assembly, testing, advanced packaging, etc. The EU Chips Act, worth $43 billion, looks at boosting the EU's semiconductor self-reliance.

Computing trends include cloud model for computing, edge computing, AI and quantum computing, increasing importance of security, virtualization, compliance, etc. There is pervasive computing, and computer neural/brain interfacing.

Cloud is leading to decentralization. We are moving to disaggregated and decentralized clouds. We are seeing the rise of the sovereign cloud, latency-sensitive edge apps, etc. We are having true multi-cloud apps, with the flexibility to choose and combine best-of-breed technologies, and specialization.

Data, AI, quantum, and hybrid cloud are driving progress and change in every aspect of computing. With AI systems and quantum systems, fundamentally new architectures are opening the door to new insights. Advanced microelectronics is leading to new processors, devices, accelerators, interconnects, etc.

At the top level, complex apps do analysis, modeling and simulation, AI, etc. The container platform approach, based on Kubernetes, looks at the tools to build and compose, manage, secure, automate, and optimize. We now have high-performance compute for the quantum supercomputer, AI supercomputer, modeling and simulation supercomputer, and general-purpose compute.

Future computing foundations lead to complex workflows on unprecedented quantities of data. Hybrid cloud technologies can secure, provision, and deploy, across multiple locations. Computers can now address all elements of a ‘discovery process’. This is applicable across research, enterprise, and government. We are building the foundation for future computing.

Workflow complexity and provisioning are components of a complex discovery workflow. This includes AI- and quantum-enriched analysis, knowledge generation and analysis, etc. We are seeing silicon evolution. We are seeing improved performance per chip. Power per chip remains unchanged, and cost per transistor is increasing.

We are facing the computer energy problem. Data centers are gobbling up a lot of the world's electricity. AI power consumption doubles every 3 months. CEA-Leti noted that 60 elements are now used for silicon, and only 15 percent are recycled. We are looking at the future of sustainable computing.
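The "doubles every 3 months" figure compounds startlingly fast; a quick back-of-the-envelope calculation shows why such a trend cannot hold for long:

```python
# Growth factor after a given number of months for a quantity that
# doubles every 3 months: four doublings per year means 16x per
# year, and 256x over two years.
def growth(months, doubling_period=3):
    return 2 ** (months / doubling_period)

print(growth(12))  # -> 16.0
print(growth(24))  # -> 256.0
```

That exponential is the core of the sustainability argument made here: efficiency gains, not just capacity, have to carry the load.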

Power requirements are growing unsustainably. We are seeing new hardware for specific tasks, new packaging methods, and new memory and fabric technologies. Standards are essential for supporting future integration.

Deep dive into quantum
James Clarke, Director of Quantum Hardware, Intel, took a deeper dive into quantum computing. Quantum concepts include superposition, entanglement, and fragility. Fragility will require error correction and likely millions of qubits. It is not quite clear how many qubits we need for a fault-tolerant system.

James Clarke.

There is broad support within the US Government, and other governments worldwide. There is the National Quantum Initiative Act 2018. Quantum computing could shape everything. All of these are separate from the Chips Act. Quantum TAM is too early to call at this stage. Hardware development must come first. We are likely 10 years away from a commercially relevant QC.
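The "likely millions of qubits" claim can be illustrated with a rough overhead estimate in the style of surface-code accounting. The code distance and logical-qubit count below are illustrative assumptions, not figures from the talk.

```python
# Rough surface-code-style overhead: each logical qubit costs on
# the order of 2 * d**2 physical qubits at code distance d
# (data plus measurement qubits).
def physical_qubits(logical_qubits, distance):
    return logical_qubits * 2 * distance ** 2

# e.g. 1,000 logical qubits at distance 25 already lands in the millions
print(physical_qubits(1000, 25))  # -> 1250000
```

Because the required distance depends on physical error rates, it is indeed "not quite clear" how many qubits a fault-tolerant system will ultimately need.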

There are various physical implementations of qubits. These include superconducting loops, trapped ions, silicon quantum dots, topological qubits, photonic qubits (e.g., PsiQuantum), etc. There are over a dozen qubit types. All of the above qubits represent closer alignment to the microelectronics infrastructure.

Beyond quantum bits, quantum circuits power the computation. A quantum program can be represented by a sequence of quantum circuits and non-concurrent classical computation. There is industry investment in scaling and system development. These are across trapped ions, superconducting, quantum dots, photons, etc. We have a long way to go. Technology ecosystem R&D is happening across process, integration, laser, etc.
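The idea that a quantum program is a sequence of quantum circuits plus classical computation can be made concrete with a minimal, library-free statevector sketch of the canonical two-qubit Bell circuit:

```python
import math

# Two-qubit circuit (H on qubit 0, then CNOT) simulated directly on
# the 4-entry statevector [|00>, |01>, |10>, |11>], qubit 0 as the
# most-significant bit. It prepares the Bell state (|00> + |11>)/sqrt(2).
def bell_state():
    state = [1.0, 0.0, 0.0, 0.0]  # start in |00>
    h = 1 / math.sqrt(2)
    # Hadamard on qubit 0 mixes the (|00>,|10>) and (|01>,|11>) pairs
    state = [h * (state[0] + state[2]),
             h * (state[1] + state[3]),
             h * (state[0] - state[2]),
             h * (state[1] - state[3])]
    # CNOT (control qubit 0, target qubit 1) swaps |10> and |11>
    state[2], state[3] = state[3], state[2]
    return state

probs = [abs(a) ** 2 for a in bell_state()]
print(probs)  # measurement statistics: 50% |00>, 50% |11>
```

The classical part of a quantum program then processes such measurement statistics — here, noticing the perfect correlation between the two qubits.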

We need an ITRS equivalent for QC. We need partnerships between IDMs and industry. The USG must play a large role. We need Quantum Foundry/USG/industry/academia partnerships. Equipment and chemicals suppliers need to look at QC. Transistors drive QC, and not the other way around. We need nascent technologies adjacent to quantum/qubits, such as superconducting digital logic, photonics, etc.

Quantum has the potential to be transformative for several classes of algorithms. QC will augment, and not replace classical compute. Progress is needed for scaling and system development.

Challenges and opportunities for developing superconducting quantum info systems

Posted on Updated on

Challenges and opportunities for developing superconducting quantum info systems, was presented by Ray Simmonds, Physicist, National Institute of Standards & Technology (NIST), at the Scaling and Lithography Tech Talks, Semicon West 2022, USA.

He gave a brief introduction to superconducting qubits. A SQUID is a superconducting quantum interference device.

Ray Simmonds.

There are several outstanding challenges. In quantum fundamentals: high-fidelity quantum gates (single- and multi-qubit), accurate and precise readout of many qubits, quantum error correction, and benchmarking of quantum information processing. In scaling quantum systems: increasing qubit numbers, routing microwave signals, large-scale, low-noise microwave electronics, integrated wideband quantum-limited amplifiers, and circuit isolation, shielding, filtering, and cooling.

In quantum chips, we need modeling tools for circuit design and layout, robust, flexible, well-connected architectures, precise fabrication of Josephson junctions, and fabrication of very-high-coherence devices. In computer science, we need quantum software/programming, integrated ‘classical’ computer processing, and the development of quantum algorithms.

For 3D integration for quantum processors, there is the IARPA quantum enhanced optimization program. A three-chip stack enables high connectivity while maintaining high qubit coherence. The field-programmable Josephson amplifier (FPJA) provides quantum-limited amplification, and increases measurement system efficiency. We learn from you, the industry, to improve the materials and fabrication.

Silicon-based quantum computing as disruptive paradigm

Posted on Updated on

Silicon-based quantum computing as a disruptive paradigm for computing, was presented by Dr. Maud Vinet, Quantum Hardware Program Manager, CEA Leti, at the Scaling and Lithography Tech Talks, Semicon West 2022, USA.

Leti aims to transform a transistor into a quantum bit. In 2016, a qubit was derived out of a transistor. The goal is to control and program arrays of qubits.

What’s needed to build a quantum chip? These are good-quality qubits, 2D array definition, long-distance quantum information transfer for multicore, cryogenic large-scale control, and QCAD tools. Leti has demonstrated high fidelity, fast measurements, high fidelity in arrays, and universal control for a 6-qubit array.

There are materials and challenges to go further. Silicon spin qubits use the spin degree of freedom of an electron. There are gate-defined quantum dots. The formation of quantum dots happens at low temperature, with the formation of a well between two barriers. Carriers have no thermal energy and have to tunnel through. Energy states are quantized in the well. A small VDS is applied to scan the precisely-resolved quantum states.
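The quantized well states described above can be estimated with the textbook infinite-square-well formula, E_n = n²π²ħ²/(2mL²). The 50nm well length below is an illustrative assumption of the order of the gate dimensions, not a Leti device figure, and the free-electron mass is used rather than the effective mass in silicon.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837015e-31  # free electron mass, kg
EV = 1.602176634e-19    # joules per eV

def well_level_meV(n, length_m, mass=M_E):
    """n-th energy level of a 1D infinite square well, in meV:
    E_n = (n * pi * hbar)^2 / (2 * m * L^2)."""
    e_joule = (n * math.pi * HBAR) ** 2 / (2 * mass * length_m ** 2)
    return e_joule / EV * 1e3

for n in (1, 2, 3):
    print(n, round(well_level_meV(n, 50e-9), 3))
```

The sub-meV level spacing this yields is comparable to the thermal energy at a few kelvin (kT is about 0.086 meV at 1 K), which is why the dots only behave as clean quantum systems at cryogenic temperature.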

For qubits, by applying a static magnetic field B, a two-level spin qubit is achieved for the artificial atom. Single-qubit rotation is driven by an alternating magnetic field. Two-qubit operation is due to the spin exchange interaction. We now need a large ensemble of qubits.
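The static-field splitting can be quantified with the Zeeman formula E = gμ_B·B, giving the drive (Larmor) frequency f = E/h. The field value and g ≈ 2 below are illustrative assumptions for an electron spin, not device-specific figures from the talk.

```python
MU_B = 9.2740100783e-24  # Bohr magneton, J/T
H = 6.62607015e-34       # Planck constant, J*s

def larmor_ghz(b_tesla, g=2.0):
    """Spin-qubit drive frequency in GHz: f = g * mu_B * B / h."""
    return g * MU_B * b_tesla / H / 1e9

print(round(larmor_ghz(1.0), 1))  # roughly 28 GHz at 1 T for g ~ 2
```

The resulting microwave-band frequencies are what the cryogenic large-scale control electronics mentioned earlier have to generate and route.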

The gate pitch should be around 50nm and below, with two gate levels. There is a need for excellent interface quality. Thermal SiO2 is preferred as a gate oxide, but becomes a problem for the second gate level. Two methods are used to move away from interfaces: one, the use of a back gate in FD-SOI-like structures, and two, the use of 2DEG heterostructures (rSiGe/sSi/rSiGe). There is a need for reduced gate stack variability, and care is now paid to MGG. There is high-throughput characterization to speed up developments.

Silicon-based quantum computing is a serious contender, and is moving fast. There are still some material and integration challenges. There are chances to harvest mainstream CMOS developments.