Content-rich websites typically store their images as individual files or as more costly binary database objects. Both methods place heavy demands on storage resources, with commensurately high monetary cost. Inexpensive shared-service accounts come with strict limits on disk and computing resources that ordinarily prevent their use for hosting content-rich websites. To minimize the load on these limited resources, large numbers of images or other records may be concatenated into a single file and delivered with short latency, independent of location within the file, using the Linux utility dd. This solution and its performance are presented for (a) a website housing 8000+ genealogical reference books with more than 3.5 million 250-Kbyte page images, (b) a web-based database of 86 million death records, and (c) a cluster-computing application that uses 800 neuroimaging databases, each with 1.5 million records. These disparate applications demonstrate the efficacy, scalability, and general utility of the solution.
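As a minimal sketch of the retrieval step described above: once records are concatenated into one file, a single record can be served by reading only its byte range with dd, so latency does not depend on where the record sits in the archive. The archive name, the offset/length index, and the GNU-coreutils-specific dd flags below are illustrative assumptions, not details taken from the paper.

```python
import subprocess

def extract_record(archive_path: str, offset: int, length: int) -> bytes:
    """Read one record (e.g. a 250-Kbyte page image) from a concatenated archive.

    dd skips directly to the byte offset and reads only `length` bytes.
    iflag=skip_bytes,count_bytes and status=none are GNU coreutils
    extensions (assumed available on the Linux host); bs=1M keeps the
    read fast regardless of the record's position in the file.
    """
    result = subprocess.run(
        ["dd", f"if={archive_path}", "bs=1M",
         f"skip={offset}", f"count={length}",
         "iflag=skip_bytes,count_bytes", "status=none"],
        check=True, capture_output=True)
    return result.stdout  # raw bytes of the requested record

# Hypothetical usage: offset and length would come from a small index
# built when the images were concatenated.
image_bytes = extract_record("pages.concat", offset=1_234_567_890, length=250_000)
```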