

Title: Computing at Extreme Scale - The Romance of Big Data
Abstract:
Computing at the limits of technology calls for numerous engineering decisions and tradeoffs. General-purpose solutions do not work at the extremes. Traditional HPC has been analyzed for decades, resulting in specialized architectures. Life-critical systems, large-enterprise systems, and tiny devices likewise present their own special requirements.
Data-intensive computing is a newer area, and its computing models are less established. Supporting large numbers (millions) of users doing similar but different computations, who expect access to enormous amounts of information (petabytes, not gigabytes), prompt responses, and global access, calls for different compromises. Different applications present their own requirements and difficulties.
This talk will address some of those needs: different models of storage and data management appropriate for different types of application, networking demands for parallelism and global access, and management of large numbers of fallible processors and storage devices. Supporting such computing also calls for different approaches to software methodology, system management, and deployment.
But massive data also opens new ways to approach science and to get remarkable results, ranging from fast advertising (yes, it's very hard) to language translation to (of course) search.