Dr. Fran Berman, director of the San Diego Supercomputer Center (SDSC), wrote an interesting article for the December 2008 edition of Communications, the monthly magazine of the Association for Computing Machinery (ACM). The article sets out a simple guide for managing what has become known as the “data deluge.”
“The ‘free rider’ solution of ‘let someone else do it’ – whether that someone else is the government, a library, a museum, an archive, Google, Microsoft, the data creator, or the data user – is unrealistic and pushes responsibility to a single company, institution, or sector. What is needed are cross-sector economic partnerships,” says Berman. She adds that the solution is to “take a comprehensive and coordinated approach to data cyberinfrastructure and treat the problem holistically, creating strategies that make sense from a technical, policy, regulatory, economic, security, and community perspective.”
The article closes with a Top 10 list of guidelines to follow when designing and managing your data workflow systems. I won’t steal Fran’s thunder, so check out the condensed summary here, or pick up the latest issue of Communications.