(Left to right): Joshua Doctor, Dr. Albert Liu and Matthew Manion

Interview with Dr. Albert Liu | How Time-Aware Sensors Are Transforming Chemical Mapping for Industrial and Biomedical Applications

February 19, 2025

Dr. Albert Liu and his team are pioneering time-aware sensors for chemical mapping, an approach that enhances the ability to track chemical concentrations with both spatial and temporal precision. Unlike traditional inert tracers, these sensors integrate “material memory,” allowing them to record reaction times dynamically and provide a more detailed chemical profile.

A link to the specific paper our earlier news report and this interview pertain to can be found here:

Manion, M. L., Doctor, J., & Liu, A. T. (2025). Temporally resolved concentration profiling via computationally limited, distributed sensor nodes. AIChE Journal, 71(2). https://doi.org/10.1002/aic.18691

In this interview, Dr. Liu discusses the core innovation behind time-aware sensors, the advantages they offer over conventional techniques, and the challenges involved in their deployment. He also provides insights into future directions for scaling up this technology and potential collaborations for commercialisation. The following conversation is presented unedited to preserve Dr. Liu’s original responses and offer a deeper look into the significance of his research.

Could you explain the core technological innovation behind time-aware sensors and how they improve upon existing chemical mapping techniques?

Historically, only “inert” tracers could be used to gauge reactor conditions. These tracers are molecules that do not react with the other chemicals in the reactor. The tracers are injected into a reactor, and measuring the relative times at which those molecules leave gives the “residence time” inside the reactor. Because small molecules can follow many paths through the reactor, some molecules exit more quickly than others. Analogously, 100 cars travelling from Point A to Point B may have variable travel times due to stop lights, speed variation, traffic, etc. The only information gained in both situations is how long the molecules (or cars) took to travel from A to B; no chemical concentration information can be elucidated from inert tracers.
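As an illustrative sketch (not from the paper), the inert-tracer idea can be simulated by giving each tracer a random transit time, analogous to the 100 cars with variable travel times; the only recoverable quantity is the distribution of arrival times, with the specific numbers below chosen arbitrarily:

```python
import random
import statistics

random.seed(0)

# Inert tracers: each takes a random path, so transit times vary,
# but all we learn is how long each one took to get from A to B.
N_TRACERS = 100
transit_times = [random.gauss(mu=60.0, sigma=10.0) for _ in range(N_TRACERS)]

mean_residence_time = statistics.mean(transit_times)
spread = statistics.stdev(transit_times)
print(f"mean residence time: {mean_residence_time:.1f} s (spread {spread:.1f} s)")
```

The histogram of these exit times is the classical residence-time distribution; note that nothing in it encodes chemical concentration along the path.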

“Reactive” tracers, like the TAPS presented here, use a similar concept, but with material memory. We release these tracers, each of which keeps an internal clock that runs until it detects enough analyte to shut off its clock. The particles then leave the reactor and we measure when each tracer’s clock shut off, creating a distribution of “reaction” times, or sensing times. We can then reconstruct an axial, spatio-temporal profile of our reactor concentration. In the car analogy, imagine we release 100 Google Earth cars, covered in cameras to document the journey from A to B. We tell each car to take pictures only of gas stations as it travels. Now, when those cars arrive at Point B, we can determine how long it took them to arrive and where gas stations are located along the commute. The cameras on the cars are analogous to the sensor materials on TAPS, and the gas stations are our chemical species of interest.

What specific advantages does time-awareness provide in tracking chemical concentrations over time, and how does this impact applications in biomedicine and industry?

Time awareness is critical for sensing applications. Unless a person manually keeps time and takes measurements, the computer actually performing these tasks needs timekeeping built into its circuitry. Integrated circuits (ICs) typically keep time with quartz oscillators, but the TAPS framework presented here relies on electrodynamic materials. These materials are useful because information is stored as a physical property (i.e. conductivity). If that property changes as a known function of time (e.g. increases linearly with time), then we can determine how much time has passed by measuring the material before and after the experiment. The advantage of TAPS over IC sensors is that TAPS can be scaled down to the micron range (~100 µm) and can be fabricated cheaply en masse. IC sensors are more complicated, limited in how far they can be scaled down (~mm to cm), and expensive to fabricate en masse. This means very cheap and simple sensors can be used to map concentrations in people or industrial equipment without significantly disrupting the fluid flow therein.
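As a minimal sketch of the timekeeping principle described above, assume a hypothetical material whose conductivity drifts linearly at a known rate while its “clock” runs; elapsed time then falls out of a before-and-after measurement (all numbers here are invented for illustration):

```python
# Hypothetical material: conductivity increases linearly at a known,
# calibrated rate while the internal "clock" is running.
DRIFT_RATE = 0.5  # conductivity units per second (assumed, for illustration)

def elapsed_time(sigma_before, sigma_after, rate=DRIFT_RATE):
    """Infer how long the clock ran from before/after measurements."""
    return (sigma_after - sigma_before) / rate

# Measure the particle before release and again after collection:
t = elapsed_time(sigma_before=2.0, sigma_after=17.0)
print(t)  # 30.0 seconds of clock time
```

The same inversion works for any known, monotonic drift law; linearity just makes the readout a single division.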

What are the key materials and fabrication techniques used in these sensors, and how do they enable miniaturisation while maintaining high sensitivity?

It is important to note that our paper focuses on the framework for deploying such sensors, with the specifics of fabrication left to the original papers (which Albert worked on). Here are the links to the original papers:

Liu, A.T., Hempel, M., Yang, J.F. et al. Colloidal robotics. Nat. Mater. 22, 1453–1462 (2023). https://doi.org/10.1038/s41563-023-01589-y
Liu, P., Liu, A.T., Kozawa, D. et al. Autoperforation of 2D materials for generating two-terminal memristive Janus particles. Nature Mater 17, 1005–1012 (2018). https://doi.org/10.1038/s41563-018-0197-z
Koman, V.B., Liu, P., Kozawa, D. et al. Colloidal nanoelectronic state machines based on 2D materials for aerosolizable electronics. Nature Nanotech 13, 819–827 (2018). https://doi.org/10.1038/s41565-018-0194-z

How does the data processing mechanism work within these sensors to synchronise time-stamped measurements across thousands of units in a narrow space?

It doesn’t automatically synchronise measurements across sensors. When the sensors leave their sampling space (i.e. the reactor), their “reaction” times are measured and binned into a histogram. Each sensor only records information relevant to itself, and meaningful information can only be extracted by releasing many of these sensors as an ensemble, to account for noise and sensor loss. This is why, in the paper, we refer to this sensing scheme as “computationally distributed”. No single sensor can determine the concentration profile of your reactor. You need certain assumptions about, or an understanding of, the fluid flow in the system to extract this ensemble information. You release many sensors and assume they behave in an identical fashion with random fluctuations (i.e. as a statistical ensemble); measuring how many of those tracers shut their clocks off along the route tells the scientist how long they were in transit and what the target chemical concentration was along the path. More chemical present causes more tracers to turn their clocks off.
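The ensemble idea above can be sketched as a toy simulation (not the paper's model): each sensor drifts through a one-dimensional reactor at a slightly different speed, stops its clock when the local analyte level crosses a threshold, and only the histogram over many sensors carries useful information. The concentration profile, threshold, and speeds are all assumed values for illustration:

```python
import random
from collections import Counter

random.seed(1)

# Toy 1-D reactor: analyte concentration rises along the axial position x.
def concentration(x):
    return 0.1 + 0.9 * x  # arbitrary profile, for illustration only

THRESHOLD = 0.6   # analyte level at which a sensor shuts off its clock
N_SENSORS = 1000  # meaningful data requires a large ensemble

stop_times = []
for _ in range(N_SENSORS):
    speed = random.uniform(0.8, 1.2)  # random fluctuations between sensors
    t, x = 0.0, 0.0
    while x < 1.0:
        t += 0.01
        x += speed * 0.01
        if concentration(x) >= THRESHOLD:
            stop_times.append(round(t, 1))  # this sensor's clock shuts off
            break

# Bin the ensemble's stop times into a histogram; the peak reveals when
# (and hence roughly where, axially) the threshold was crossed.
histogram = Counter(stop_times)
```

No single entry in `stop_times` says anything about the profile; only the binned distribution, combined with an assumed flow model, lets one invert for concentration.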

What are the biggest challenges in deploying these sensors in real-world scenarios, such as industrial chemical monitoring or biomedical diagnostics?

The biggest challenge is the method of measurement after the sensors are collected. The methods of manufacture utilise existing VLSI semiconductor techniques, and thousands to millions of sensors can be made on a single silicon wafer. They can be collected easily enough by filtration, but the technology to read out the material information in a high-throughput manner has not yet been established. The information must currently be read with conductive AFM or a scanning optical system with electronic sensitivity.

Could you discuss any limitations in terms of signal interference, power consumption, or sensor degradation over extended periods of use?

Signal interference would only occur if there were fabrication issues or if the sensor material were not specific enough to the analyte of interest. The signal is not transmitted during use, so that is not a problem.

Degradation is obviously an issue and would depend on the specific materials and environment. For an application like the human intestines, for example, it’s reasonable that we’d need some capsule to release the particles in the gut so they wouldn’t be lost during swallowing.

Power consumption is an important factor, and this is currently the area in need of improvement for these technologies. A few options currently exist, but a greater variety and demonstrated stability is needed for robust implementation.

Looking ahead, what are the next steps in advancing this technology? Are there collaborations underway to commercialise these sensors or integrate them with existing chemical analysis platforms?

The next steps would involve a bench-scale deployment of these sensors to experimentally validate our simulation findings. We are looking for collaboration and commercialisation opportunities.
