“The Mathematics of Encryption: An Elementary Introduction,” by DIMACS faculty member Midge Cozzens and Steven Miller of Williams College, is part of the Mathematical World series published by AMS. The series features accessible expository works that illustrate the beauty and utility of mathematics. In keeping with the series philosophy, this new book introduces just enough mathematics to explore topics that include classical and modern methods of encryption, ciphers, secret-sharing, error detection and correction, steganography, and quantum cryptography. The book is an outgrowth of introductory cryptography courses for non-math majors taught at both Rutgers and Williams. It is suitable for a wide range of audiences, both inside and outside the classroom.
The second new book, “Graph Partitioning and Graph Clustering,” is part of the AMS Contemporary Mathematics series. The book is a compilation of papers resulting from the 10th DIMACS Implementation Challenge on Graph Partitioning and Graph Clustering, held in February 2012. It is edited by the Challenge organizers, David Bader, Henning Meyerhenke, Peter Sanders, and Dorothea Wagner.
Implementation Challenges
seek to benchmark realistic algorithm performance for important
problem classes when worst-case analysis is overly pessimistic and
probabilistic models are too unrealistic. They use experimentation
to gain insight into practical algorithm performance when analysis
fails. By evaluating different implementations on common instances,
the Challenges create a reproducible picture of the state of the art
in the area under consideration. They often lead to improved
implementation methods and data structures and take a step toward
technology transfer by providing leading-edge implementations of
algorithms for others to use and adapt.
Contributions of the 10th Challenge include: extension of a file format, used by several graph partitioning and graph clustering libraries, to cover graphs and their partitions; an online testbed of input instances and generators; definition of a new combination of measures to assess the quality of a clustering; definition of a measure of the work an implementation performs in a parallel setting, which normalizes sequential and parallel implementations to a common baseline; and a nondiscriminatory way to assign scores to solvers that takes both running time and solution quality into account.