A Data Lake (DL) is a Big Data analysis solution that ingests raw data in its native format and lets users process the data on demand. Data ingestion is not a simple copy-and-paste of data; it is a complex and critical phase that ensures ingested data remain findable, accessible, interoperable, and reusable at all times. Our solution is threefold. First, we propose a metadata model that captures information about external data sources, data ingestion processes, ingested data, dataset veracity, and dataset security. Second, we present the algorithms that carry out the ingestion phase (data storage and metadata instantiation). Third, we introduce a metadata management system we developed, through which users can easily browse the different elements stored in the DL.
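To make the metadata model concrete, the following is a minimal sketch of how the five kinds of information the abstract lists (external sources, ingestion processes, ingested data, veracity, security) might be represented and instantiated at ingestion time. All class and field names here are illustrative assumptions, not the paper's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

# NOTE: illustrative sketch only; these names are assumptions, not the paper's schema.

@dataclass
class ExternalSource:
    """Describes an external data source feeding the lake."""
    name: str
    url: str
    data_format: str  # e.g. "csv", "json"

@dataclass
class IngestionProcess:
    """Records how and when a dataset entered the lake."""
    process_id: str
    source: ExternalSource
    ingested_at: datetime

@dataclass
class DatasetMetadata:
    """Metadata instantiated for each ingested dataset."""
    dataset_id: str
    ingestion: IngestionProcess
    veracity_score: float                                   # e.g. 0.0-1.0 trust estimate
    access_roles: List[str] = field(default_factory=list)   # security: roles allowed to read

def register_dataset(source: ExternalSource, dataset_id: str,
                     veracity: float, roles: List[str]) -> DatasetMetadata:
    """Instantiate metadata at ingestion time so the dataset stays findable."""
    proc = IngestionProcess(process_id=f"ing-{dataset_id}",
                            source=source,
                            ingested_at=datetime.now(timezone.utc))
    return DatasetMetadata(dataset_id=dataset_id, ingestion=proc,
                           veracity_score=veracity, access_roles=roles)
```

A metadata management system like the one the abstract describes could then query these records to let users locate and assess datasets without touching the raw data itself.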