August 27, 2011

IBM Takes On Very, Very, Very Big Data

According to an article published this week in the MIT Technology Review, IBM scientists are building a new 120-petabyte data repository made up of 200,000 conventional hard drives working together. The giant container is expected to hold about one trillion files and should provide the room needed for more powerful simulations of complex systems, such as those used in weather and climate modeling.
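
A quick back-of-the-envelope calculation (mine, not the article's) shows what those figures imply per drive and per file:

    # Rough capacity math derived only from the article's figures:
    # 120 PB total, 200,000 drives, ~1 trillion files.
    PETABYTE = 10**15                      # decimal petabyte, in bytes
    total_capacity = 120 * PETABYTE
    drive_count = 200_000
    file_count = 10**12

    print(f"{total_capacity / drive_count / 10**9:.0f} GB per drive")  # ~600 GB
    print(f"{total_capacity / file_count / 10**3:.0f} KB per file")    # ~120 KB if full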

The new system owes its advantages to a file system known as the General Parallel File System (GPFS), which was developed at IBM Almaden to give supercomputers faster access to data. It spreads individual files across multiple disks, so that many parts of a file can be read or written at the same time.
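
Here is a minimal sketch of that striping idea (my own illustration in Python; GPFS's actual block allocation is far more sophisticated):

    # Deal fixed-size blocks of a file round-robin across disks, so
    # separate disks can serve different parts of the same file at once.
    def stripe(data: bytes, num_disks: int, block_size: int = 4):
        disks = [[] for _ in range(num_disks)]
        for i in range(0, len(data), block_size):
            disks[(i // block_size) % num_disks].append(data[i:i + block_size])
        return disks

    print(stripe(b"abcdefghijklmnop", num_disks=4))
    # [[b'abcd'], [b'efgh'], [b'ijkl'], [b'mnop']]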

GPFS takes advantage of the cluster architecture to speed up access to file data, which is spread across the many storage devices, making optimal use of the available storage for superior performance. It is also the storage engine behind IBM's Watson, which could easily beat me at Jeopardy!
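
To see why that layout pays off, here is a toy parallel read over a striped file (again just an illustration; a real GPFS client talks to storage servers over the cluster network):

    from concurrent.futures import ThreadPoolExecutor
    from itertools import zip_longest

    # Hypothetical striped layout: 16 bytes dealt round-robin over 3 disks.
    disks = [[b"abcd", b"mnop"], [b"efgh"], [b"ijkl"]]

    def read_disk(blocks):
        return blocks  # stands in for a real, slow I/O call

    # Read all disks concurrently, then re-interleave the blocks.
    with ThreadPoolExecutor(max_workers=len(disks)) as pool:
        stripes = list(pool.map(read_disk, disks))

    data = b"".join(b for row in zip_longest(*stripes) for b in row if b)
    print(data)  # b'abcdefghijklmnop'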
