A designer proposing a numerical algorithm for an application is often committed to accuracy as if it were the only constraint to consider. This is reasonable in many applications and, surely, in those where energy and computational complexity are not an issue. However, when focusing on scenarios requiring intelligent embedded systems, i.e., embedded systems with data processing and decision-making ability (e.g., Wireless Sensor Networks, hybrid monitoring technologies, and mission-critical embedded applications), energy, complexity, and cost must also be taken into account. It follows that complexity must be balanced against accuracy directly at design time, recalling that the application resides in an uncertainty-affected world (lack of a priori information, external and electronic noise, processing uncertainty, algorithm alternatives, etc.). The goal is to assess the performance of an algorithm with a cost-effective approach and to select the "right" algorithm within a set of feasible solutions, so as to finally have it running on the chosen embedded system.

The talk will present an overview of probabilistic techniques based on randomized algorithms for solving the "computationally hard" problems associated with performance verification, and will introduce the Probably Approximately Correct Computation (PACC) approach for assessing the accuracy of algorithms running on embedded systems. The introduction of probability and random sampling makes it possible to overcome the fundamental tradeoff between computational complexity and the conservatism associated with a worst-case, rarely quantifiable, deterministic approach. The simplicity of randomized techniques is an advantage, as will be shown on a set of applications, some involving Wireless Sensor Networks.

Host: Brendt Wohlberg
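To make the random-sampling idea concrete, the following is a minimal sketch (not taken from the talk) of the kind of Monte Carlo accuracy assessment that underlies PACC-style verification: inputs are drawn at random, the algorithm's error on each input is checked against a tolerance, and the additive Chernoff/Hoeffding bound fixes how many samples guarantee the estimated satisfaction probability is within epsilon of the true one with confidence at least 1 - delta. The error function, input distribution, and parameter values here are illustrative placeholders, not the speaker's.

    # Sketch of randomized (Monte Carlo) accuracy assessment.
    # All names and parameters below are hypothetical examples.
    import math
    import random

    def chernoff_sample_size(epsilon, delta):
        # Additive Chernoff/Hoeffding bound: with n >= ln(2/delta) / (2*epsilon^2)
        # i.i.d. samples, the empirical probability deviates from the true one
        # by at most epsilon with confidence at least 1 - delta.
        return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

    def estimate_satisfaction(error_fn, sample_input, tolerance, epsilon, delta):
        # Estimate Pr[error_fn(x) <= tolerance] over inputs x drawn by sample_input().
        n = chernoff_sample_size(epsilon, delta)
        hits = sum(1 for _ in range(n) if error_fn(sample_input()) <= tolerance)
        return hits / n, n

    if __name__ == "__main__":
        # Toy stand-in for "algorithm error on input x": |sin(x)| as a dummy error.
        est, n = estimate_satisfaction(
            error_fn=lambda x: abs(math.sin(x)),
            sample_input=lambda: random.uniform(-math.pi, math.pi),
            tolerance=0.5,
            epsilon=0.02,   # accuracy of the probability estimate
            delta=1e-3,     # confidence parameter
        )
        print(f"n = {n} samples, estimated Pr[error <= tol] ~ {est:.3f}")

Note how the sample size depends only on epsilon and delta, not on the complexity of the algorithm under test; this is what trades the conservatism of a worst-case deterministic analysis for a probabilistic, quantifiable guarantee.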