Karpagam JCS ISSN: 2582 – 8525 (Print), 2583 – 3669 (Online)

A STUDY ON DECENTRALIZED LIGHTWEIGHT SECURE AUDITING SCHEMES FOR DYNAMIC OUTSOURCING DATA INTEGRITY IN THE MULTI CLOUD STORAGE ENVIRONMENT

Abstract
The multi-cloud storage environment is an emerging paradigm that provides storage services to various categories of users for outsourcing their data. The multi-tenant nature of cloud data storage raises multiple security concerns, such as data breaches and unauthorized data access, which makes data integrity critically important to users who outsource their data to the cloud. To mitigate these challenges and enhance data integrity in the multi-tenant cloud environment, many secure auditing schemes have been designed and implemented by various researchers. This article therefore reviews the decentralized lightweight secure auditing schemes that researchers have defined to ensure the integrity of outsourced data. Early models incorporated third-party auditors (TPAs) to accomplish the auditing task in the cloud, with a list of security policies for verification and certification of the cloud storage. Later, a decentralized cloud auditing scheme using blockchain with a group of verifiers was introduced, removing the reliance on a single designated third-party auditor to verify data integrity. Finally, a lattice-based data auditing scheme with prior knowledge performs data integrity verification without delegating the audit to a third-party auditor. Experimental analysis of these models using synthetic data on a cloud platform reveals remaining challenges with respect to computation overhead (response time) and managing data integrity verification over streaming outsourced data at runtime. Furthermore, these models are less capable of block-based auditing due to improper data auditing structures. To address these challenges, a decentralized lightweight data integrity verification model for secure auditing of streaming outsourced data can be designed and implemented using an autoencoder-based deep learning model as the research methodology. Such a model is efficient in verifying data integrity and is also capable of eliminating data duplication.
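To make the two mechanisms the abstract names concrete, the sketch below illustrates, in greatly simplified form, block-based integrity auditing (an auditor verifies one stored block against a reference digest via a challenge-response check) and hash-based deduplication (identical blocks collapse to a single content-addressed entry). This is a minimal illustration using plain SHA-256, not the reviewed schemes themselves: real TPA, blockchain, or lattice-based protocols use homomorphic authenticators or lattice signatures rather than bare hashes, and all function names here are hypothetical.

```python
import hashlib


def block_digests(data: bytes, block_size: int = 4) -> list:
    """Split outsourced data into fixed-size blocks and hash each one.

    Per-block digests let an auditor challenge individual blocks
    without downloading the whole file (the core idea behind
    block-based auditing, simplified here to plain hashing).
    """
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    return [hashlib.sha256(b).hexdigest() for b in blocks]


def verify_block(stored_block: bytes, expected_digest: str) -> bool:
    """Challenge-response step: recompute the digest of one stored
    block and compare it with the owner's reference digest."""
    return hashlib.sha256(stored_block).hexdigest() == expected_digest


def dedup_store(blocks: list) -> dict:
    """Content-addressed store: identical blocks share one entry,
    illustrating hash-based elimination of duplicate data."""
    store = {}
    for b in blocks:
        store[hashlib.sha256(b).hexdigest()] = b
    return store
```

For example, auditing the first 4-byte block of `b"abcdabcdxyz!"` succeeds only if the stored bytes are unmodified, and storing its three blocks deduplicates the repeated `b"abcd"` down to two entries. An autoencoder-based verifier, as proposed in the abstract, would replace the exact-hash comparison with a learned reconstruction check over streaming blocks.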
