Sensors

MEMS and sensors driving innovation: Flex 2021

Posted on

There was a panel discussion on MEMS and sensors driving innovation at the ongoing SEMI Flex 2021. The participants were Matthew Dyson, IDTechEx; Hadi Hosseini, Stanford University; Michael Brothers, UES and AFRL; Ms. Erin Ratcliff, University of Arizona; Michael Crump, University of Washington; and Ms. Moran Amit, University of California, San Diego.

Matthew Dyson said there are a lot of benefits and significant savings over time. There are apps in wearable and stretchable devices, among others. Demand from smartphones is the driver of the MEMS and sensors business. A lot of money is also going into printable electronics.

Michael Brothers added that you have to identify the key parameters within your own scenario. Ms. Erin Ratcliff noted that we need to look at larger areas for sweat, as an example. We are doing architectural design in the virtual space. Michael Crump said that with stretchable field sensors, you can stretch sensitive materials. You can see a baseline shift as you stretch them. We took the approach of a 3D-printed gel paste where the zero point does not change. We also need to look at how the sensor's resistance changes over time.

Ms. Moran Amit added that the baseline is a bit different for them. An example is the thermometer: 36.7°C is normal for everyone, but even a zero baseline may look different from one kid to another. Different sensors would work for different kids. Hadi Hosseini said that people are looking to use wearables to diagnose illnesses. We are looking at changes in blood oxygenation. We are also prototyping. We got a grant last year to develop a device. We are hoping to collect data for children with ADHD. My focus is on mental illness. There are other areas, like mental wellness.

Medical community responding to sensors
It would be interesting to see how the medical community is responding to the use of sensors. Michael Brothers said there is some response. One of the key drivers is cost and benefit. People are interested in wearables. There are factors preventing adoption in the medical community, for now.

Turning to non-imaging techniques, what bio-parameters in a wearable device could help with mental health diagnosis? Hadi Hosseini said that with ADHD, you can use sensors to identify patterns in the child. People have also been looking at cell phones to collect data in the background. Matthew Dyson added that wearables for mental health diagnosis have been developed in Belgium. Monitoring of electrical signals includes muscle and brain activity for mental health diagnosis. Ms. Erin Ratcliff said that when you design a sensor, it has to give information about something new. How do you translate that into a full device study?

Michael Brothers felt that biosystems work in a different way. Sensors should be created to identify changes in the human body. You have to ask the right questions. You also need clinical trials to introduce new sensors. It is also very hard to determine physiological relations. Ms. Erin Ratcliff added that there are teams that design sensors. You may have to guess the range, but that's not a useful detection strategy.

Matthew Dyson said a lot of this is applicable to flexible electronics. There should be specific, designated standards bodies for doing that. Ms. Erin Ratcliff noted that consortium models are beginning to evolve. Companies also hope to listen.

Making the world greener
According to Michael Crump, sustainability is pervasive throughout. They are able to print features for the energy overhead. As for using AI/ML for key markers, Hadi Hosseini said that we don't have enough data yet for specific disorders. It takes time to collect data. There are a lot of ML studies. Generalizing data from 100-200 patients can be challenging. We collect brain imaging data from patients to identify sub-types of illnesses.

Michael Brothers added there can be array sensors, mass-factor patterns, etc. There is a lot of work needed in AI/ML. It is an interesting problem. The issue is: how do you collect all the data? Ms. Moran Amit said that there is stress on waste and sustainability. Our system has the doctor equipped with it, to assess many people. A thermometer can be used over and over again. There may be fewer sales.

Michael Crump felt that there is a need to get to a conductive trace. We don't want to be printing lines and lines, but just one line. We are trying to get to the place where we can print something in a single pass. Ms. Erin Ratcliff added there needs to be more targeted focus on $10-15 type models, rather than $100 and above. Hadi Hosseini said there are many different technologies. Some of them are not yet developed enough. We need to work with the others. There's the application of more advanced techniques, such as printable materials.

Matthew Dyson felt there is room for new technologies. A lot of progress has been made on printed electronics. Sensors are being deployed in cars, wearables, missiles, etc. There will be more apps that make it to commercial reality.

Electronics on the brain: Flex 2021

Posted on

Day 3 of the SEMI Flex 2021 started with George Malliaras, Prince Philip Prof. of Technology, University of Cambridge, presenting the keynote on electronics on the brain.

One of the most important scientific and technological frontiers of our time is the interfacing of electronics with the human brain. This endeavor promises to help understand how the brain works and deliver new tools for the diagnosis and treatment of pathologies including epilepsy and Parkinson’s disease.

George Malliaras.

Current solutions are limited by the materials that are brought into contact with the tissue and transduce signals across the biotic/abiotic interface. Recent advances in electronics have made available materials with a unique combination of attractive properties, including mechanical flexibility, mixed ionic/electronic conduction, enhanced biocompatibility, and capability for drug delivery. He presented examples of novel devices for recording and stimulation of neurons, and showed that organic electronic materials offer tremendous opportunities to study the brain and treat its pathologies.

Bioelectronic medicine is game-changing. There has been the emergence of bioelectronic medicine. We have nerve stimulation for autoimmune diseases, etc. The current technology is, however, limiting. Signals are small and diverse, and the environment is hostile to electronics. It also requires highly invasive and multiple surgeries.

Teaching electronics can sometimes feel like teaching a foreign language. We need to get drugs into the brain. Bioelectronics is the interfacing of biology with electronics. There is sensing and diagnosis. This leads to actuation and treatment of the brain. High-resolution brain mapping is an example. If you use organics, the mixed conductivity leads to novel, state-of-the-art devices. The physics of these materials is still under investigation.

There is volumetric ion transport in PEDOT:PSS microelectrodes. There are recordings of single neurons from the brain surface. Current work is looking at large area and high density. We also have some options for treating epilepsy.

There is localised drug delivery past the blood-brain barrier. These have been used for brain cancers, and there is a large gamut of drugs. We can get spatiotemporal control, as well. However, wafers offer limited cargo and are not always suitable, so we need to develop new technologies. An example is the organic electronic ion pump. In the ion-exchange membrane, the ions flow in only one direction, from source to target.

There is electrophoretic drug delivery, as well. We use GABA delivery in vitro. Also, implantable devices stop or prevent seizures. Another app is chemotherapy delivery to nonresectable brain tumours. Implants often require highly invasive surgery. Paddle-type electrodes are more efficient, but they require a laminectomy.

When you combine bioelectronics with soft robotics, there are expandable implants. There is dynamic control of the device shape. You can deploy them in spinal cords in cadavers.

Implantable electronics hold considerable promise for understanding the brain and addressing its pathologies. Mixed conductors enable high-resolution cortical electrodes that record neurons without penetrating the brain. Electrophoretic devices can deliver the drug without the solvent, with excellent spatiotemporal resolution. They stop/prevent seizures in an animal model. Microfluidics allows expandable implants that minimize the invasiveness of neurosurgery.

Embedded computing with image sensors

Posted on Updated on

Day 3 of SEMI Technology Unites Global Summit 2021 began with two sessions: MEMS and Sensors, and Fab Management.

Speaking at the MEMS and Sensors summit, Gianluigi Casse, Bruno Kessler Foundation (Fondazione Bruno Kessler – FBK), Technology and Knowledge Open Hub, said the FBK has 12 research labs and over 400 researchers, with 51 patents and 23 joint innovation labs. The FBK CMM (Center for Materials and Microsystems) works on several research topics. The microelectronics industry has gone very far since inception, to anticipate new markets in strategic sectors.

New challenges include quantum technologies, big science, the space economy, etc. Infrastructure is being renewed, such as an integrating technology platform and FESR+IPCEI funds. They have also updated the co-operation model.

The FBK internal foundry has four large labs. This takes care of design, fabrication, test and packaging. Some areas are silicon drift detectors, thin silicon sensors, and MEMS and superconducting circuits and systems. The external foundries are focused on CMOS fabrication, such as CMOS SPAD image sensors, quantum random number generators, etc.

With IPCEI, FBK is one of 24 partners and 2 research centers, including CEA-Leti. With FCSR, it has nanotech capabilities enabling QT R&D through submicron and deep-submicron structure definition, and is working on nanotechnologies using EBL and ion beam. With IPCEI, it is working on heterogeneous integration using wafer thinning, through-silicon vias, and wafer bonding.

FBK is also working on devices for the future, such as quantum, space, etc. It is leveraging the open-hub paradigm in areas such as silicon technology, silicon photonics, nanotechnology, heterogeneous integration, and chip stacking. There is R&D being done in quantum technologies. Examples are EPIQUS (chip-scale quantum photonic-electronic platform), QRANGE, and FastGhost (ghost imaging microscope). In big science, it has SiPM (silicon photomultipliers), LGAD (low-gain avalanche diodes), and MAPS (monolithic active pixel sensors for proton sensing). It has also developed 3D flash lidar cameras and a miniaturized StarTracker-on-chip for nano satellites.

In future, FBK will provide an advanced, open, and customizable technology platform. It will also invest more in the devices of the future, such as integrated photonics/detectors and SPAD imaging for quantum and space, and SiPM, LGAD, and MAPS for big science.

Embedded computing for image sensors
Pierre Cambou, Yole Développement, talked about Embedded Computing the Next Paradigm Shift for Image Sensors at the MEMS and Imaging Sensors summit.

The CMOS image sensor (CIS) 2021 revenues represent 5.1 percent of the global semiconductor market. Mobile has been the largest market in 2019, followed by computing, automotive, security, and industrial. Technologies and markets have changed dramatically over the years. The optical fingerprint recognition adoption scenario is changing; it is an alternative to facial recognition. The 3D camera market scenario is also changing as adoption switches to rear cameras.

Wafer shipments by technology have seen 10 percent for sensing. In mobile technology trends, Samsung is now matching Sony's previous technology. Also, triple-stack technology includes a 32nm DRAM wafer. In-pixel hybrid-stack connection pitch allows for 10 µm pixels.

There is always-on, video-based context awareness. There is a shift from voice- to video-based devices, as well as embedded intelligence. Sensing and computing trade-offs are also there for autonomous driving (AD). Computing power increases with the square of data. Embedded computing will avoid cloud compute saturation for real-time and critical apps.

If we look at embedded AI, there is an answer from Sony. There is the innovation path for the CIS industry. Sony, STMicroelectronics, Samsung, etc., are leading the way. In quantum image sensors, there is dynamic low-light imaging with Quanta image sensors. There can also be a combination of image and IMU data for robust SLAM in HDR and high-speed scenarios.

Sensing will be the main driver for the next paradigm shift. AI is the new revolution, in the cloud and embedded. A new generation of image sensors will benefit from the trend.

Event-based vision
Luca Verre, Prophesee, presented on Toward Event-Based Vision Wide-scale Adoption at the MEMS and Imaging Sensors summit. Since the beginning, new opportunity areas have been opening up, such as AR/VR and automotive.

We are revealing the invisible between the frames. Prophesee is capturing motion via static representation. It is focusing on event-based vision. In a video, he explained that there is no gap between the frames, as there are no frames anymore. It has been adding intelligence down to the pixel, as well. This also leads to zero-redundancy sampling, pixel-individual sampling optimization, etc.

Prophesee has also been doing pixel evolution since 2014. With Sony, it announced during ISSCC 2020 that they had developed a stacked event-based vision sensor. This has the smallest pixels, the best HDR performance, and the highest AER event readout.

The product line-up includes Metavision sensors and Metavision evaluation kits for sensing, and Metavision Designer and Kit. Their partners include Imago and Century Arks.

He gave examples of spatter monitoring that tracks small particles with spatter-like motion. Another is high-speed counting without any motion blur. In ML, they have a pre-trained network. An event-based camera detects pedestrians at night, as an example.

Glass and silicon bioMEMS components heart of tomorrow’s medical devices

Posted on Updated on

Yole Développement and Teledyne organized a meeting on glass and silicon bioMEMS components. They looked at how these are the heart of tomorrow’s medical devices.

Opening the discussion, Sébastien Clerc, Technology & Market Analyst, Microfluidics, Sensing & Actuating, Yole Développement, said that the prevalence and cost of chronic diseases are growing. There are diseases such as sleep apnea, diabetes, infertility, Parkinson's disease, epilepsy, cardiovascular events, etc. However, possible solutions do exist.

Sébastien Clerc.

There are many examples of bioMEMS-enabled systems. MEMS and other sensors and actuators are used in many medical devices, whether implantable, wearable, or external. There is a need for more compact and comfortable systems. There can be mainstream vital-sign monitoring, new monitoring systems, pacemakers, etc.

Micro-technologies are everywhere in healthcare apps. There are microfluidics, imaging devices, bioMEMS, and biosensors. The microfluidics market is estimated to grow to $5.3 billion by 2025, from $2.7 billion in 2019. Imaging devices will grow from $4.3 billion in 2019 to $6.6 billion in 2025. BioMEMS and biosensors will grow from $4.9 billion in 2019 to $9.6 billion in 2025. BioMEMS market dynamics include the use of microfluidics, silicon microphones, optical MEMS, etc.
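
Those forecasts imply roughly 7-12 percent compound annual growth. As a quick arithmetic check (a sketch; the dollar figures come from the talk, the CAGR formula is the standard definition):

```python
# Compound annual growth rate: cagr = (end/start)**(1/years) - 1
def cagr(start, end, years):
    return (end / start) ** (1 / years) - 1

# 2019 -> 2025 market forecasts (in $B) quoted in the talk
markets = {
    "Microfluidics": (2.7, 5.3),
    "Imaging devices": (4.3, 6.6),
    "BioMEMS and biosensors": (4.9, 9.6),
}
for name, (start, end) in markets.items():
    print(f"{name}: {cagr(start, end, 6):.1%} CAGR")
```

Microfluidics and bioMEMS both work out to about 12 percent per year, imaging devices to about 7 percent.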

Next-gen DNA sequencing is leveraging glass and silicon technologies. In 5-10 years, there will be DNA sequencing for less than $100. Silicon and glass for microfluidics are estimated to be a huge market. The supply chain for bioMEMS and microfluidic fabs is growing. In glass, there are Caliper, Schott, Philips, Invenios, etc. In silicon, there are X-Fab, Sensera, TSMC, Teledyne Dalsa, STMicroelectronics, etc. In polymer, there are ChipShop, Carville, Axxicon, Hochuan, Weidmann, etc.

MEMS platforms are accelerating the time-to-market. MEMS foundries are key partners to reduce the TTM and reach medical-grade devices.

Glass and silicon bioMEMS
Collin Twanow, Director of Technology, Teledyne MEMS, spoke about glass and silicon bioMEMS. He said that there is a diverse need for medical monitoring, diagnosis, and treatment. There are evolving demographics. There is acceptance of technology advancements by doctors, regulatory bodies, and patients. There is continuing advancement and introduction of MEMS and microfabrication in this field.

Teledyne MEMS foundry services is the no. 1 independent pure-play MEMS foundry. It has the largest portfolio of microfabrication technologies available in the world (non-captive). There are hundreds of unique prototypes built, and technologies for all sensor types and markets.

Collin Twanow.

Teledyne has 150 mm and 200 mm wafer-diameter production lines. Teledyne offers all the advanced processes and fabrication equipment. These include DRIE, metal deposition by sputter and evaporation, glass/quartz wet etching, RIE plasma etching, Si anisotropic wet etching (KOH and TMAH), test and automatic optical inspection (AOI), back-end process, bonding, etc.

Teledyne bioMEMS fabrication includes platinum, gold, and other metals; CMOS post-processing; patterned polyimide; and silicon, glass, and other substrates. The apps served include diagnostics, cell treatment, drug development, antibody ID, disease testing, genetic analysis, etc.

MEMS and microfabrication for biotech includes silicon and glass microfluidics, CMOS post processing, thin-film bioassay substrates, CMUT arrays for medical imaging, other bioMEMS devices, and other medical device considerations.

In MicraFluidics, the silicon microfluidic process platform, there are features such as high-aspect-ratio microchannels in the silicon wafer, input/output ports in glass, consistent inorganic surfaces suitable for functionalized coatings, and customer-specified chip size.

The process includes patterning customized through-wafer ports in a glass wafer, patterning microfluidic channels in a silicon substrate with precise high-aspect-ratio anisotropic etching, and glass-silicon wafer bonding that provides a strong and reliable bond interface.

In CMOS post-processing, MEMS is integrated with CMOS electronics. Extensive and flexible CMOS post-processing capabilities are available for next-generation integrated biochips. There is expertise to handle advanced CMOS wafers from multiple CMOS foundries for post-processing using fully compatible lithography tools, and to work with different polymers for defining microchannels and microfluidic wells. There is capability to deposit thin metal and dielectric layers for integrated electrical detection, as well as polymer wafer bonding with microfluidic features, and CMP. Bioassay apps include consumable test chips, DNA capture and analysis, and genetic testing.

The capacitive micromachined ultrasonic transducer (CMUT) MEMS platform is the emerging transducer/receiver technology for medical imaging and treatment. CMUT technology offers many potential advantages over traditional linear-array piezoelectric transducer technology: the advantages of wafer-scale fabrication, 2D arrays offering higher-resolution waveform shaping, greater sensitivity, superior acoustic impedance matching, the potential to co-integrate with electronics, and an SOI layer that provides the consistency of single-crystal silicon for the top electrode.

Teledyne's phase-gate system ensures thoroughness in the path to manufacturing. It provides a rapid, reliable path to high-yield production. Design for manufacturing involves designing into an established process capability, and ensuring that the expected process variation does not lead to product variation. Benefits include first-run prototype success, a faster path to manufacturing, stable yield and performance, and lower costs.

Teledyne MEMS is the world’s largest pure-play MEMS foundry. It has extensive experience in microfluidics and bioMEMS, with biocompatible materials. The foundry is structured for prototype development and large-volume commercial manufacturing for the medical industry.

Next decade will be remembered for artificial empathy: MSEC 2020

Posted on

The final discussion at the ongoing SEMI MEMS & Sensors Executive Congress (MSEC 2020) looked at next decade of MEMS, and the opportunities and challenges ahead.

Now, 2020 has been a challenging year in many aspects. The consequences of the pandemic are expected to further impact the upcoming years, possibly even the decade. The MEMS industry is also experiencing multiple challenges: development costs for new generations of MEMS sensors are increasing. Wider, more diverse markets are now required to compensate for the growing development expenses, and to lower the risks. A sole focus on hardware is not sufficient anymore.

Jens Fabrowsky.

Jens Fabrowsky, Executive VP, Automotive Electronics, Robert Bosch GmbH, said that MEMS is a fascinating story of innovation. Their technology pull is mainly app-driven. They are prevalent in manufacturing, materials, packaging, and computing. There is significant cost reduction and reliability improvement. Technology areas driving MEMS innovation are 3D MEMS, piezo materials, in-MEMS processing, heterogeneous packaging, etc.

MEMS computing trends are relevant, personalized, and trusted. They are also connected to the end users' needs. MEMS sensors and actuators are important. Raw data can be filtered and processed. Sensor fusion combines data from multiple sources, and we are now beginning to consolidate it. There is the tactile Internet. By adding sensors, you can also understand emotions to make machines more helpful. Digital twins are growing. We are bridging the physical and digital worlds, to have the "phygital" experience.

We are building brain-level AI at Bosch. Self-diagnostics will improve our lives. We are looking at empathic computing in the future. AR will be engaging, immersive, and secure in future. Sensors are able to detect personal signatures, too. There is edge AI and ML inside the sensor. Research and training need to move to new collaboration models in AI. We are improving the accessibility of our products through open-source programs.

From artificial strength (AS), we have moved to AI. Later, we will be moving to artificial empathy (AE). The next decade will be remembered for AE.

For developing industry solutions, there will be a challenge to understand the big systems. We need to overcome that. There are industry-wide efforts going on. You also need to be able to manage data. It all comes down to edge computing. We need to keep the data private, and not let it out into the cloud.

Trouble with innovative sensing applications: MSEC 2020

Posted on

Individualized digital health monitors benefit wearers with chronic illness or those at risk of acute infectious diseases with rapid onsets. As technology leaders make vital-sign sensors convenient and affordable for mobile and fitness applications, the US Centers for Medicare & Medicaid Services and other healthcare policy makers have adopted reimbursement policies to encourage remote patient monitoring practices. The global pandemic has only further accelerated these initiatives.

Ian Chen.

Ian Chen, Executive Director, Maxim Integrated, spoke about the trouble with innovative sensing applications, and how to overcome it, at the ongoing MSEC 2020. Today, healthcare is becoming even more personalized. Global shipments of select wearables hit 210 million units in 2019. The market may grow to 340 million units by 2023.

Today, accelerometer data is used in a TPMS to get tyre location, to monitor blood pressure on an ambulatory patient, to monitor body temperature for disease-onset detection, and for personalized health monitoring. Also, all paths of invention need to pass through data collection.

For example, there is a case for continuous monitoring of cattle health. The industry wants an objective metric for animal wellness. Maxim did cattle monitoring for some time. They need to improve detection via sensor fusion. They also need to improve the motion-artifact detection algorithm.

Another example is the case for chronic obstructive pulmonary disease (COPD) remote patient monitoring. COPD is an incurable disease, so there needs to be patient health monitoring via telehealth. Here, a waveform analysis algorithm and additional sensing modalities are needed.

The call to the sensing community is to accelerate the time-to-data. That's a must! Only then can we see the potential of the new sensing apps.

Sensory motion tracking and future of human-digital interaction: MSEC 2020

Posted on Updated on

Day 6 of the ongoing MEMS & Sensors Executive Congress (MSEC 2020) had a set of interesting presentations on market trends. Sensory motion tracking, enabled by smart sensors, is at the forefront of this fourth industrial age. In healthcare, motion tracking is helping people to monitor heart rates, sleep patterns and fitness levels.

Motion tracking technology is opening many possibilities for the entrepreneurs and innovators to transform their own industries. Physical, biological and technological worlds are merging like never before, and the potential of sensory motion capture is only restricted by the extent of our imagination.

Igor Ikink.

Speaking about this, Igor Ikink, Director of Technology, Xsens, said that the most immediate uses of sensory motion are in entertainment and gaming, sports, clinical, and workplace biomechanics, etc. This has the potential to help companies take work activities to a new level. Sensory motion is captured, tracked, processed, and analyzed. Industry 4.0 is also changing the way we work.

There are advancements in MEMS in size, pitch and roll accuracy, power consumption, and lower costs. Novel post-processing has now become a necessity. There is also a plethora of services available. Wearable technology is becoming the terminology for the digital age. There are thin films being attached to garments; these are smart textiles. So much knowhow is going into shoes. Watches, cell phones, and eyewear are already becoming smart. Wearables are leading to ubiquitous computing, and help make better decisions faster.

Boundaries are constantly being redrawn today. Biomedical sensors offer exciting opportunities. Wearables are also changing. Technology has also become more scalable. There will be motion tracking-as-a-service, with a central database connecting to the cloud. Healthcare will be very different 10 years from now. There will be an IoT-connected society. Covid-19 has also accelerated remote patient monitoring.

The post-pandemic world is full of opportunities. Examples are remote health and wellness monitoring, remote sport monitoring, etc. Another example is high-performance rehabilitation. Sensors are key drivers. They are small, cheap, and wearable, in our quest for digitization.

Semicon-based ultrasound and democratization of medical imaging: MSEC 2020

Posted on

The next presentation at the ongoing MEMS & Sensors Executive Congress (MSEC 2020) was on semiconductor-based ultrasound and democratization of medical imaging. It was presented by Gioel Molinari, President, Butterfly Network.

Butterfly is a complete point-of-care imaging solution. Healthcare is in desperate need of innovation. About two-thirds of diagnostic dilemmas can be solved with simple imaging. Ultrasound is harmless and has universal clinical utility. However, its use and impact have been limited.

There are acoustic limitations of ultrasound sensors. Butterfly replaces three piezo probes with one wideband transducer for whole body imaging.

Semiconductors have the wherewithal to disrupt various industries. Sony reportedly had nearly half the image-sensor market in 2019. Ultrasound is the fifth pillar of the new physical examination.

Gioel Molinari.

Recently, we have seen lung ultrasound being used for screening Covid-19 patients, checking for healthy vs. Covid-19 lungs. There is faster diagnosis and monitoring than CT or CXR, at the patient's point-of-care. An NGO partner is the Bill & Melinda Gates Foundation.

We have built an open imaging ecosystem, comprising devices, content, and SaaS. We have developed a superior medical imaging solution that is also integrated with EMRs.

The Butterfly solution is unique. There are probe components, such as the transducer chip, substrate interposer, and lens shroud housing for the probe head. This is V1.0, so there are no compromises. Covid-19 has rapidly accelerated the concept of imaging. Market expansion is ongoing.

Wearable social distancing with ultrasonic time-of-flight sensors: MSEC 2020

Posted on

At the ongoing MEMS & Sensors Executive Congress (MSEC 2020), session 5 looked at the ultrasonic transducers. The opening session was on the wearable social distancing solution based on ultrasonic time-of-flight sensors, by Prof. David Horsley, Co-Founder, TDK-Chirp Microsystems Inc.

MEMS beats conventional ultrasound on size, weight, and power. There are many apps for MEMS ultrasonic sensors. Some of them are presence detection, smart locks, smart homes and IoT, drones, social distancing, robotics, AR/VR gaming, etc.

Prof. David Horsley.

The CH101 and CH201 ToF sensors are systems-in-package (SiP) with a piezoelectric micromachined ultrasonic transducer (PMUT) and a programmable SoC for all ultrasonic signal processing. These are programmable for different operations, such as pulse-echo and pitch-catch. CH101 is guaranteed to be frequency-matched. Both are available from Digi-Key.

There are wearable tags for social distancing. Wearables promote safe social distancing by alerting people when they are too close. Contact records are stored in the wearable. They also alert the wearer if any contact occurs with an infected tag. The transmission probability model can be combined with contact records to estimate the probability of infection.

How it works!
Now, how does the ultrasonic time-of-flight sensor work? The tags exchange unique IDs. They communicate via radio and ultrasound. Electromagnetic waves pass through non-conductive materials, which may result in false positives. Ultrasound requires a clear air path between people to detect contact. TDK-Chirp provides the reference designs for others to develop their tags.
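
The ranging step behind this can be sketched in a few lines (an illustration of pulse-echo ranging, not TDK's actual firmware; the 2 m threshold and the speed of sound in air are assumptions):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C (assumed)

def range_from_tof(round_trip_s):
    """Pulse-echo ranging: the pulse travels to the target and back,
    so one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2

def too_close(round_trip_s, threshold_m=2.0):
    """Flag a contact when the echo indicates another tag is within
    the social-distancing threshold (2 m assumed here)."""
    return range_from_tof(round_trip_s) < threshold_m

# A 5 ms round trip corresponds to ~0.86 m, close enough to alert
print(range_from_tof(0.005))  # 0.8575 m
print(too_close(0.005))       # True
```

A radio-only tag cannot make this check, which is why the ultrasonic path matters: the echo only arrives if there is clear air between the two wearers.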

Bluetooth Low Energy (BLE) radio and ultrasound can maximize the battery life. Each tag periodically broadcasts its ID. The tag battery life is about 17 days. Using ultrasonic ToF sensors, wearable tags provide safety.
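
That battery life comes down to duty-cycling: the tag sleeps almost all the time and wakes briefly to broadcast. A back-of-the-envelope estimate (a sketch; the coin-cell capacity, currents, and wake interval below are assumed illustrations, not TDK's specification):

```python
def battery_life_days(capacity_mah, sleep_ua, active_ma, active_ms, period_s):
    """Average-current estimate: the tag draws sleep_ua microamps while
    asleep and active_ma milliamps for active_ms every period_s seconds."""
    duty = (active_ms / 1000.0) / period_s
    avg_ma = (sleep_ua / 1000.0) * (1 - duty) + active_ma * duty
    return capacity_mah / avg_ma / 24  # hours of life -> days

# Assumed: 100 mAh cell, 5 uA sleep, 20 mA active for 10 ms every 1 s
print(battery_life_days(100, 5, 20, 10, 1))  # ~20 days
```

With these made-up numbers the estimate lands in the same ballpark as the quoted 17 days; the real figure depends on the actual radio, ultrasonic duty cycle, and battery.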

TinyML — MI meets billions of sensors: MSEC 2020

Posted on

The ongoing MEMS & Sensors Executive Congress (MSEC 2020) continued with a session, "TinyML: Massive Opportunity when MI Meets the Real World of Billions of Sensors."

Evgeni Gousev, Senior Director, Qualcomm AI Research, Qualcomm Technologies, and tinyML Foundation BoD, said there are the physical (analog) world and the digital (AI) world today. There are fundamental issues, such as energy insufficiency, privacy, latency, and reliability. Here, tinyML comes into play! It is, essentially, energy-efficient, with metadata/privacy by design, etc.

Evgeni Gousev.

TinyML ideally needs energy harvesting. Memory has to be less than 1 MB, often less than 100 kB. Cost has to be very low to enable massive deployment. TinyML comprises ML architectures, techniques, tools, and approaches that are capable of performing on-device analytics. Data is the new oil, and ML is a way to produce it! There are massive tinyML opportunities in all verticals, where machine intelligence meets the physical world of billions of sensors.
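
A toy illustration of why such small memory budgets are workable: an int8-quantized dense layer, the kind of kernel a tinyML runtime executes on-device (a sketch; the layer sizes and dequantization scale are made-up assumptions, and real deployments use frameworks such as TensorFlow Lite for Microcontrollers):

```python
import numpy as np

# A 64-input, 8-output dense layer stored as int8 weights:
# 64*8 weights + 8 int32 biases is ~0.5 kB, vs ~2 kB in float32.
rng = np.random.default_rng(0)
w_int8 = rng.integers(-127, 128, size=(8, 64), dtype=np.int8)
bias = rng.integers(-1000, 1000, size=8, dtype=np.int32)
scale = 0.02  # combined input*weight dequantization scale (assumed)

def dense_int8(x_int8):
    # Accumulate in int32 to avoid overflow, then dequantize to float.
    acc = w_int8.astype(np.int32) @ x_int8.astype(np.int32) + bias
    return acc * scale

x = rng.integers(-127, 128, size=64, dtype=np.int8)
scores = dense_int8(x)
print(w_int8.nbytes + bias.nbytes)  # 544 bytes of parameters
```

Quantizing to 8 bits cuts parameter storage roughly fourfold versus float32, which is how useful models fit under the 100 kB budgets mentioned above.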

The growth drivers include more developed, energy-efficient hardware, energy-efficient algorithms/NN, etc. TinyML has also seen global growth. There are ~4k members in 17 countries, within 16 months, so far. TinyML is implemented with a holistic hardware-software system (algorithms/network), etc.

There will be more enhancements coming in the near future, such as compute-in-memory, analog compute, neuromorphic, etc. Over the next 10 years, there will be tinyML everywhere, with trillions of devices.

An example is tinyML for always-on voice. The SAM is 8 billion units by 2023. There will also be tinyML using MEMS sensors. Software is becoming increasingly intelligent, enabling AI inside the sensor itself. Another use case is tinyML for predictive maintenance. There is also tinyML for always-on vision, for example, Qualcomm's always-on computer vision module.

Vision will enhance many use cases across numerous verticals. It will be used across smartphones, smart watches, tablets, VR, etc. Qualcomm has taken a system approach to computer vision. The QCC112 chipset is an example.

ML is moving to the edge of the physical world, at mW power. TinyML brings significant data mining and analytics capabilities, for numerous use cases and verticals.