Low-power universal edge tracer architecture using accuracy-controlled resource reallocation for event-driven sensing applications

Daejin Park, Jeonghun Cho

Research output: Contribution to journal › Conference article › Peer-review

2 Scopus citations

Abstract

A low-power memory tracer architecture for sensed-data acquisition is proposed. A hardware pre-processor based on the proposed sense-data tracer enables low-power sense-data analysis in a noisy environment. The sensed signals are tagged with their edge phases, threshold levels, and the elapsed timing distance from the previous signal edge. Analysis of the incoming sense data is deferred, and its raw data is reallocated into the tracer memory. The traced sense data is then analyzed by the allocated event pattern matcher in a silent background mode without any CPU assistance. The proposed method and hardware architecture enable accurate reconstruction of the original sense data at a slow processor clock frequency. The newly designed building blocks are integrated into our previously designed sensor processor for 3DTV active-shutter glasses. Experimental results show an additional power reduction of about 25% relative to our previous work, achieved by allowing a small amount of error in the original sense-data reconstruction. This paper describes the system architecture and details of the proposed memory tracer, and identifies its key concepts and functions.
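The tagging scheme in the abstract (edge phase, threshold level, and elapsed timing distance between edges) can be illustrated with a minimal software model. This is a hypothetical sketch, not the paper's hardware design: the names `EdgeTag` and `trace_edges`, the sample/threshold representation, and the single-threshold example are all illustrative assumptions.

```python
# Hypothetical software model of the edge-tagging tracer described in the
# abstract; names and data layout are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class EdgeTag:
    phase: str       # "rising" or "falling"
    level: int       # index of the threshold that was crossed
    distance: int    # samples elapsed since the previous tagged edge

def trace_edges(samples, thresholds):
    """Tag threshold crossings with edge phase, level, and timing distance.

    Each tag is appended to a trace buffer, mimicking how raw sense data
    could be reallocated into tracer memory for deferred analysis.
    """
    tags = []
    last_edge = 0
    for t in range(1, len(samples)):
        prev, cur = samples[t - 1], samples[t]
        for level, th in enumerate(thresholds):
            if prev < th <= cur:                       # rising crossing
                tags.append(EdgeTag("rising", level, t - last_edge))
                last_edge = t
            elif cur < th <= prev:                     # falling crossing
                tags.append(EdgeTag("falling", level, t - last_edge))
                last_edge = t
    return tags

# Example: a single pulse crossing one threshold produces two tagged edges.
signal = [0, 0, 5, 5, 5, 0, 0]
tags = trace_edges(signal, thresholds=[3])
# tags[0]: rising edge, distance 2; tags[1]: falling edge, distance 3
```

A background pattern matcher, as described in the abstract, could then scan such a compact tag stream instead of the raw samples, which is what allows analysis to proceed without CPU assistance and at a reduced clock frequency.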

Original language: English
Pages (from-to): 67-73
Number of pages: 7
Journal: Procedia Computer Science
Volume: 56
Issue number: 1
DOIs
State: Published - 2015
Event: 29th European Conference on Solid-State Transducers, EUROSENSORS 2015, Freiburg, Germany
Duration: 6 Sep 2015 - 9 Sep 2015

Keywords

  • Event-driven processing
  • Low-power sensor interface
  • Wearable sensor system
