
E-book: Guide to Data Privacy: Models, Technologies, Solutions

  • Format: PDF+DRM
  • Price: 43,21 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital Rights Management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you need to install special software to read it. You also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

Data privacy technologies are essential for implementing information systems with privacy by design. Privacy technologies are clearly needed to ensure that data does not lead to disclosure, but also that statistics or even data-driven machine learning models do not. For example, can a deep-learning model be attacked to discover that sensitive data has been used for its training? This accessible textbook presents privacy models, computational definitions of privacy, and methods to implement them. Additionally, the book explains and gives plentiful examples of how to implement, among other models, differential privacy, k-anonymity, and secure multiparty computation.
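To give a flavor of the kind of mechanism the book covers, the following is a minimal sketch (not taken from the book) of the Laplace mechanism for epsilon-differential privacy, applied to a counting query; the data and function names are illustrative only.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one record
    # changes the count by at most 1, so Laplace noise with scale
    # 1/epsilon gives epsilon-differential privacy for this query.
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Illustrative data: how many individuals are aged 40 or over?
ages = [23, 35, 41, 29, 52, 61, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

Smaller epsilon means more noise and stronger privacy; the book's Chapter 5 treats such mechanisms, their sensitivity analysis, and composition in detail.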

Topics and features:

  • Provides an integrated presentation of data privacy (including tools from statistical disclosure control, privacy-preserving data mining, and privacy for communications)
  • Discusses privacy requirements and tools for different types of scenarios, including privacy for data, for computations, and for users
  • Offers a characterization of privacy models, comparing their differences, advantages, and disadvantages
  • Describes some of the most relevant algorithms to implement privacy models
  • Includes examples of data protection mechanisms
This unique textbook/guide contains numerous examples and succinctly and comprehensively gathers the relevant information. As such, it will be eminently suitable for undergraduate and graduate students interested in data privacy, as well as professionals wanting a concise overview.
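As a second flavor of the book's subject matter, here is a minimal sketch (again illustrative, not from the book) of k-anonymity and of generalization as a masking method: a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    # The k-anonymity level of a table is the size of the smallest
    # group of records sharing the same quasi-identifier values.
    groups = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return min(groups.values())

def generalize_age(record):
    # Generalization: replace the exact age with a 10-year band.
    lo = (record["age"] // 10) * 10
    r = dict(record)
    r["age"] = f"{lo}-{lo + 9}"
    return r

# Illustrative table with quasi-identifiers (age, zip) and a
# sensitive attribute (disease).
table = [
    {"age": 23, "zip": "901", "disease": "flu"},
    {"age": 27, "zip": "901", "disease": "cold"},
    {"age": 25, "zip": "901", "disease": "asthma"},
]
masked = [generalize_age(r) for r in table]
# Raw table: each record is unique on (age, zip), so k = 1.
# After generalizing age, all three records share ("20-29", "901"), so k = 3.
```

Chapter 6 of the book covers generalization, microaggregation, and other masking methods for achieving such privacy models.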

Vicenç Torra is Professor with the Department of Computing Science at Umeå University, Umeå, Sweden.
Table of Contents

1 Introduction
  1.1 Motivations for Data Privacy
    1.1.1 Privacy, Security and Inference
  1.2 Two Motivating Examples
    1.2.1 Sharing a Database
    1.2.2 Sharing a Computation
    1.2.3 Privacy Leakages and Risk
  1.3 Privacy and Society
  1.4 Terminology
    1.4.1 The Framework
    1.4.2 Anonymity and Unlinkability
    1.4.3 Disclosure
    1.4.4 Dalenius' Definitions for Attribute and Identity Disclosure
    1.4.5 Plausible Deniability
    1.4.6 Undetectability and Unobservability
    1.4.7 Pseudonyms and Identity
    1.4.8 Transparency
  1.5 Privacy and Disclosure
  1.6 Privacy by Design
  1.7 Bibliographical Notes
2 Machine and Statistical Learning, and Cryptography
  2.1 Machine and Statistical Learning
  2.2 Classification of Techniques
  2.3 Supervised Learning
    2.3.1 Classification
    2.3.2 Regression
    2.3.3 Validation of Results: k-fold Cross-validation
  2.4 Unsupervised Learning
    2.4.1 Clustering
    2.4.2 Association Rules Mining
    2.4.3 Expectation-Maximization Algorithm
  2.5 Cryptography
    2.5.1 Symmetric Cryptography
    2.5.2 Public-key Cryptography
    2.5.3 Homomorphic Encryption
  2.6 Bibliographical Notes
3 Disclosure, Privacy Models, and Privacy Mechanisms
  3.1 Disclosure: Definition and Controversies
    3.1.1 A Boolean or Measurable Condition
    3.1.2 Identity Disclosure
    3.1.3 Attribute Disclosure
    3.1.4 Attribute Disclosure in Clusters and Cells
    3.1.5 Discussion
  3.2 Measures for Attribute Disclosure
    3.2.1 Attribute Disclosure for Numerical Data Releases
    3.2.2 Attribute Disclosure for Categorical Data Releases
    3.2.3 Model-Based Attribute Disclosure
    3.2.4 Attribute Disclosure for Absent Attributes
    3.2.5 Discussion on Attribute Disclosure
    3.2.6 Attribute Disclosure Through Membership Inference Attacks
  3.3 Measures for Identity Disclosure
    3.3.1 Uniqueness
    3.3.2 Re-Identification for Identity Disclosure
  3.4 Privacy Models
    3.4.1 Privacy from Re-Identification
    3.4.2 K-Anonymity
    3.4.3 K-Anonymity and Anonymity Sets: k-Confusion
    3.4.4 K-Anonymity and Attribute Disclosure: Attacks and Privacy Models
    3.4.5 K-Anonymity and Computational Anonymity
    3.4.6 Differential Privacy
    3.4.7 Local Differential Privacy
    3.4.8 Integral Privacy
    3.4.9 Homomorphic Encryption
    3.4.10 Secure Multiparty Computation
    3.4.11 Result Privacy
    3.4.12 Privacy Models for Clusters and Cells
    3.4.13 Discussion
  3.5 Classification of Privacy Mechanisms
    3.5.1 On Whose Privacy Is Being Sought
    3.5.2 On the Computations to Be Done
    3.5.3 On the Number of Databases
    3.5.4 Knowledge Intensive Data Privacy
    3.5.5 Other Dimensions and Discussion
  3.6 Summary
  3.7 Bibliographical Notes
4 Privacy for Users
  4.1 User's Privacy in Communications
    4.1.1 Protecting the Identity of the User
    4.1.2 Protecting the Data of the User
  4.2 User's Privacy in Information Retrieval
    4.2.1 Protecting the Identity of the User
    4.2.2 Protecting the Query of the User
    4.2.3 Private Information Retrieval
  4.3 Other Contexts
  4.4 Bibliographical Notes
5 Privacy for Computations, Functions, and Queries
  5.1 Differential Privacy Mechanisms
    5.1.1 Differential Privacy Mechanisms for Numerical Data
    5.1.2 Composition Theorems
    5.1.3 Differential Privacy Mechanisms for Categorical Data
    5.1.4 Properties of Differential Privacy
    5.1.5 Machine Learning
    5.1.6 Concluding Remarks
  5.2 Secure Multiparty Computation Protocols
    5.2.1 Assumptions on Data and on Adversaries
    5.2.2 Computing a Distributed Sum
    5.2.3 Secure Multiparty Computation and Inferences
    5.2.4 Computing the Exclusive OR Function
    5.2.5 Secure Multiparty Computation for Other Functions
  5.3 Bibliographical Notes
6 Privacy for Data: Masking Methods
  6.1 Perturbative Methods
    6.1.1 Data and Rank Swapping
    6.1.2 Microaggregation
    6.1.3 Additive and Multiplicative Noise
    6.1.4 PRAM: Post-Randomization Method
    6.1.5 Lossy Compression and Other Transform-Based Methods: De-Noising Data
  6.2 Non-perturbative Methods
    6.2.1 Generalization and Recoding
    6.2.2 Suppression
  6.3 Synthetic Data Generators
    6.3.1 Synthetic Data Generators and Generative Adversarial Networks
    6.3.2 Table-GANs
  6.4 Masking Methods and k-Anonymity
    6.4.1 Mondrian
    6.4.2 Microaggregation and Generalization
    6.4.3 Algorithms for k-Anonymity: Variants and Big Data
  6.5 Data Protection Procedures for Constrained Data
    6.5.1 Types of Constraints
  6.6 Masking Methods and Big Data
  6.7 Bibliographical Notes
7 Selection of a Data Protection Mechanism: Information Loss and Risk
  7.1 Information Loss: Evaluation and Measures
    7.1.1 Generic Versus Specific Information Loss
    7.1.2 Information Loss Measures
    7.1.3 Generic Information Loss Measures
    7.1.4 Specific Information Loss
    7.1.5 Information Loss and Big Data
  7.2 Selection of Masking Methods
    7.2.1 Aggregation: A Score
    7.2.2 Visualization: R-U Maps
    7.2.3 Optimization and Post-Masking
  7.3 Machine Learning
  7.4 Privacy in Federated Learning
  7.5 Bibliographical Notes
8 Other Data-Driven Mechanisms
  8.1 Result-driven Approaches
  8.2 Tabular Data
    8.2.1 Sensitivity Rules
    8.2.2 Tabular Data Protection
    8.2.3 Cell Suppression
    8.2.4 Controlled Tabular Adjustment
  8.3 Bibliographical Notes
9 Conclusions
  9.1 Guidelines
Appendix A Matching and Integration: Record Linkage for Identity Disclosure Risk
References
Index
He is the Wallenberg Chair on AI at Umeå University, as well as a fellow of IEEE and EurAI.