A large-scale network such as the Internet is typically controlled by multiple entities, and it is generally not possible to obtain information about its internal characteristics directly. Desired internal characteristics include the network structure and topology, loss-rate and delay distributions, and the origin-to-destination traffic matrix, all of which are important for dynamic routing, optimized service provisioning, service-level verification, and detection of anomalous or malicious behavior. Internet tomography is the study of methods for inferring the unknown internal characteristics of large-scale networks from peripheral information that can be obtained easily. Tomography is made difficult by the heterogeneity and largely unregulated structure of the Internet; furthermore, one cannot rely on the cooperation of individual servers and routers.
The result is a series of inference questions that can be expressed as "inverse problems," with strong parallels to signal processing problems such as tomographic image reconstruction, system identification, and array processing. Tomography efforts are further complicated by the fact that solutions to these problems potentially involve acquiring private or sensitive data. Algorithms that collect and process information should do so in a privacy-preserving way, allowing cooperating users to benefit from results computed over the pooled data without gaining access to the data itself.
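To make the inverse-problem framing concrete, one classical formulation models observed link-level byte counts y as the product of a routing matrix A and the unknown origin-to-destination traffic vector x, i.e. y ≈ Ax, and estimates x by least squares. The sketch below is a minimal, hypothetical instance: the network, the routing matrix, and the "true" traffic are invented for illustration, and real tomography systems use far more sophisticated statistical models.

```python
import numpy as np

# Hypothetical toy network: 4 measured links, 3 origin-destination (OD) flows.
# A[i, j] = 1 if OD flow j traverses link i (routing matrix, assumed known).
A = np.array([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
    [1.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])

# "True" OD traffic, unknown to the observer; used here only to simulate data.
x_true = np.array([10.0, 20.0, 5.0])

# Observed link counts: routing matrix times OD flows, plus measurement noise.
rng = np.random.default_rng(0)
y = A @ x_true + rng.normal(scale=0.1, size=A.shape[0])

# Least-squares estimate of the OD traffic, clipped to non-negative values.
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
x_hat = np.clip(x_hat, 0.0, None)
print(x_hat)  # close to x_true when A has full column rank and noise is small
```

In realistic settings A has far more columns (OD pairs) than rows (links), so the system is under-determined and additional modeling assumptions, such as priors on the traffic distribution, are needed to pick out a solution.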
This workshop will address open research areas including: algorithms for the placement of network monitors; space- and time-efficient online algorithms for processing very high volumes of traffic; machine-learning and statistical algorithms for analyzing traffic patterns and detecting anomalous behavior; and algorithms for computing traffic flows and using the results to distribute network capacity and resources more effectively.
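As an illustration of the kind of space-efficient online algorithm mentioned above, a count-min sketch can maintain approximate per-flow byte or packet counts over a traffic stream in sublinear space; estimates never undercount, and the overestimation error shrinks as the table widens. The flow keys and parameters below are hypothetical, chosen only to show the technique.

```python
import hashlib

class CountMinSketch:
    """Approximate frequency counts over a stream in O(width * depth) space.

    Estimates only ever overcount: each query returns the minimum over
    `depth` independently hashed counters, so collisions inflate but never
    deflate the answer.
    """

    def __init__(self, width=1024, depth=4):
        self.width = width
        self.depth = depth
        self.table = [[0] * width for _ in range(depth)]

    def _buckets(self, key):
        # One hash per row, salted with the row index for independence.
        for row in range(self.depth):
            h = hashlib.blake2b(key.encode(), salt=row.to_bytes(8, "little"))
            yield row, int.from_bytes(h.digest()[:8], "little") % self.width

    def add(self, key, count=1):
        for row, col in self._buckets(key):
            self.table[row][col] += count

    def estimate(self, key):
        return min(self.table[row][col] for row, col in self._buckets(key))

# Hypothetical flow stream keyed by source->destination address pairs.
cms = CountMinSketch()
for _ in range(1000):
    cms.add("10.0.0.1->10.0.0.2")
cms.add("10.0.0.3->10.0.0.4", 7)
print(cms.estimate("10.0.0.1->10.0.0.2"))  # at least 1000; never undercounts
```

Sketches like this are attractive for traffic monitoring because their memory footprint is fixed regardless of the number of distinct flows, which matters when processing line-rate traffic at a router.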