Open Access Issue
Towards Privacy-Aware and Trustworthy Data Sharing Using Blockchain for Edge Intelligence
Big Data Mining and Analytics 2023, 6 (4): 443-464
Published: 29 August 2023

The popularization of intelligent healthcare devices and big data analytics significantly boosts the development of Smart Healthcare Networks (SHNs). To enhance the precision of diagnosis, different participants in SHNs share health data that contain sensitive information. The data exchange process therefore raises privacy concerns, especially when the integration of health data from multiple sources (a linkage attack) results in further leakage. The linkage attack is a dominant attack in the privacy domain, as it can leverage various data sources for private data mining. Furthermore, adversaries launch poisoning attacks to falsify health data, which leads to misdiagnosis or even physical harm. To protect private health data, we propose a personalized differential privacy model based on the trust levels among users. Trust is evaluated by a defined community density, and the corresponding privacy protection level is mapped to controllable randomized noise constrained by differential privacy. To avoid linkage attacks against personalized differential privacy, we design a noise correlation decoupling mechanism using a Markov stochastic process. In addition, we build the community model on a blockchain, which mitigates the risk of poisoning attacks during differentially private data transmission over SHNs. Extensive experiments and analysis on real-world datasets validate the proposed model, which achieves better performance than existing approaches in terms of both privacy protection and effectiveness.

Regular Paper Issue
Improving Data Utility Through Game Theory in Personalized Differential Privacy
Journal of Computer Science and Technology 2019, 34 (2): 272-286
Published: 22 March 2019

Due to the dramatically increasing amount of information published in social networks, privacy issues have given rise to public concerns. Although differential privacy provides privacy protection with theoretical foundations, the trade-off between privacy and data utility still demands further improvement. Most existing studies, however, do not consider the quantitative impact of the adversary when measuring data utility. In this paper, we first propose a personalized differential privacy method based on social distance. We then analyze the maximum data utility when users and adversaries are blind to each other's strategy sets. We formalize all the payoff functions in the differential privacy sense and, on that basis, establish a static Bayesian game. The trade-off is calculated by deriving the Bayesian Nash equilibrium with a modified reinforcement learning algorithm. The proposed method achieves fast convergence by reducing the cardinality from n to 2. In addition, the in-place trade-off can maximize the user's data utility if the action sets of the user and the adversary are public while the strategy sets remain unrevealed. Our extensive experiments on a real-world dataset demonstrate that the proposed model is effective and feasible.
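The user-versus-adversary interaction described above can be illustrated with a toy 2x2 game solved by fictitious play, a simple learning dynamic standing in here for the paper's modified reinforcement learning algorithm. The payoff numbers and action labels are assumptions for demonstration only; the game has no pure equilibrium, so the empirical action frequencies converge toward a mixed Nash equilibrium:

```python
# User actions: 0 = strong noise, 1 = weak noise (better utility, riskier).
# Adversary actions: 0 = attack, 1 = abstain.
# PAYOFF[own_action][opponent_action]; values are illustrative assumptions.
USER_PAYOFF = [[2.0, 2.0],   # strong noise: safe payoff either way
               [0.0, 5.0]]   # weak noise: high utility unless attacked
ADV_PAYOFF = [[-1.0, 2.0],   # attack: wasted vs strong noise, pays vs weak
              [0.0, 0.0]]    # abstain: neutral

def best_response(payoff, opp_freq):
    # Pick the action maximizing expected payoff vs the opponent's mix.
    exp = [sum(payoff[a][b] * opp_freq[b] for b in range(2)) for a in range(2)]
    return 0 if exp[0] >= exp[1] else 1

def fictitious_play(rounds=100_000):
    # Each player best-responds to the other's empirical action frequencies.
    user_counts, adv_counts = [1, 1], [1, 1]  # smoothed initial counts
    for _ in range(rounds):
        user_freq = [c / sum(user_counts) for c in user_counts]
        adv_freq = [c / sum(adv_counts) for c in adv_counts]
        user_counts[best_response(USER_PAYOFF, adv_freq)] += 1
        adv_counts[best_response(ADV_PAYOFF, user_freq)] += 1
    return ([c / sum(user_counts) for c in user_counts],
            [c / sum(adv_counts) for c in adv_counts])
```

For these payoffs the indifference conditions give a mixed equilibrium in which the user plays weak noise with probability 1/3 and the adversary attacks with probability 3/5, which the learned frequencies approximate after enough rounds.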
