CADA: Collaborative Auditing for Distributed Aggregation

José Valerio, Pascal Felber, Martin Rajman & Etienne Rivière

Abstract The aggregation of distributions, composed of the number of occurrences of each element in a set, is an operation that lies at the heart of several large-scale distributed applications. Examples include popularity tracking, recommendation systems, and trust management mechanisms. These applications typically span multiple administrative domains that do not trust each other and are sensitive to biases in the distribution aggregation: the results can only be trusted if inserted values were neither altered nor forged, and if the nodes collecting the insertions do not arbitrarily modify the aggregation results. In order to increase the level of trust that can be granted to applications, there must be a disincentive for servers to bias the aggregation results. In this paper we present the CADA auditing mechanisms, which let aggregation servers collaboratively and periodically audit one another based on probabilistic tests over server-local state. CADA differs from existing work on accountability in that it leverages the nature of the operation being performed by the node rather than a general, application-oblivious model of the computation. The effectiveness of CADA is conveyed by an experimental evaluation that studies its ability to detect malevolent behaviors using lightweight auditing oracles.
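To make the core operation concrete: a "distribution" here is a map from elements to occurrence counts, and aggregation merges such maps. The sketch below is a hypothetical illustration, not the CADA protocol itself; the `audit` function only hints at the idea of a probabilistic spot-check by re-aggregating a random subset of the inputs, whereas the paper's auditing oracles operate over server-local state.

```python
import random
from collections import Counter

def aggregate(distributions):
    """Merge per-source occurrence counts into one distribution."""
    total = Counter()
    for d in distributions:
        total.update(d)
    return total

def audit(reported, distributions, sample_size=2, rng=random):
    """Hypothetical probabilistic spot-check: re-aggregate a random
    subset of the inputs and verify the reported aggregate never
    under-counts them (a biased server dropping insertions would
    eventually be caught by such sampling)."""
    sample = rng.sample(distributions, min(sample_size, len(distributions)))
    partial = aggregate(sample)
    return all(reported[k] >= v for k, v in partial.items())
```

For example, a server that silently drops insertions of some element reports counts lower than what any honest re-aggregation of the corresponding inputs yields, so repeated random audits detect the bias with growing probability.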
Citation J. Valerio, et al., "CADA: Collaborative Auditing for Distributed Aggregation," in Proceedings of the 9th European Dependable Computing Conference (EDCC'12), Sibiu, Romania, 2012, pp. 1-12.
Type Conference paper (English)
Name of conference Proceedings of the 9th European Dependable Computing Conference (EDCC'12) (Sibiu, Romania)
Date of conference 8 May 2012
Pages 1-12
Related project MistNet: An Experimental Peer-to-peer Platform for the Cloud