├── About-page-text.md
├── Home.md
├── June.6.md
├── Landing-page.md
├── Location-privacy-principles.md
├── Main-page-blurb.md
├── Notes.md
├── Pattern-template.md
├── Patterns-to-write.md
├── Patterns.md
├── Principles.md
├── Resources.md
├── Specs.md
├── media
│   └── images
│       ├── Fire_Eagle_granularity.png
│       ├── Google_Dashboard_Latitude.png
│       ├── Privacy-Label.png
│       ├── chrome_location_icon.png
│       ├── landing_page.jpg
│       ├── lion_location_icon.png
│       └── privacy-aware-wording.jpg
├── notes
│   ├── TPL-presentation-feedback.md
│   ├── antipatterns.md
│   └── june.1.md
├── patterns
│   ├── Abridged-Terms-and-Conditions.md
│   ├── Active-broadcast-of-presence.md
│   ├── Added-noise-measurement-obfuscation.md
│   ├── Aggregation-gateway.md
│   ├── Ambient-notice.md
│   ├── Anonymity-set.md
│   ├── Anonymous-reputation-based-blacklisting.md
│   ├── Appropriate-Privacy-Feedback.md
│   ├── Appropriate-Privacy-Icons.md
│   ├── Asynchronous-notice.md
│   ├── Attribute-based-credentials.md
│   ├── Awareness-Feed.md
│   ├── Buddy-List.md
│   ├── Data-breach-notification-pattern.md
│   ├── Decoupling-[content]-and-location-information-visibility.md
│   ├── Discouraging-blanket-strategies.md
│   ├── Dynamic-Privacy-Policy-Display.md
│   ├── Enable-Disable-Functions.md
│   ├── Encryption-user-managed-keys.md
│   ├── Federated-privacy-impact-assessment.md
│   ├── Icons-for-Privacy-Policies.md
│   ├── Identity-federation-do-not-track-pattern.md
│   ├── Impactful-Information-and-Feedback.md
│   ├── Incentivized-Participation.md
│   ├── Increasing-Awareness-of-Information-Aggregation.md
│   ├── Informed-Consent-for-Web-based-Transactions.md
│   ├── Informed-Credential-Selection.md
│   ├── Informed-Implicit-Consent.md
│   ├── Informed-Secure-Passwords.md
│   ├── Lawful-Consent.md
│   ├── Layered-policy-design.md
│   ├── Location-granularity.md
│   ├── Masquerade.md
│   ├── Minimal-Information-Asymmetry.md
│   ├── Negotiation-of-Privacy-Policy.md
│   ├── Obligation-management.md
│   ├── Obtaining-Explicit-Consent.md
│   ├── Onion-routing.md
│   ├── Outsourcing-[with-consent].md
│   ├── Pay-Back.md
│   ├── Personal-Data-Table.md
│   ├── Personal-data-store.md
│   ├── Platform-for-Privacy-Preferences.md
│   ├── Policy-matching-display.md
│   ├── Preventing-Mistakes-or-Reducing-Their-Impact.md
│   ├── Privacy-Aware-Wording.md
│   ├── Privacy-Awareness-Panel.md
│   ├── Privacy-Labels.md
│   ├── Privacy-Mirrors.md
│   ├── Privacy-Policy-Display.md
│   ├── Privacy-aware-network-client.md
│   ├── Privacy-color-coding.md
│   ├── Privacy-dashboard.md
│   ├── Private-link.md
│   ├── Protection-against-tracking.md
│   ├── Pseudonymous-identity.md
│   ├── Pseudonymous-messaging.md
│   ├── Reasonable-Level-of-Control.md
│   ├── Reciprocity.md
│   ├── Selective-Access-Control.md
│   ├── Sign-an-Agreement-to-Solve-Lack-of-Trust-on-the-Use-of-Private-Data-Context.md
│   ├── Single-Point-of-Contact.md
│   ├── Sticky-policy.md
│   ├── Strip-invisible-metadata.md
│   ├── Support-Selective-Disclosure.md
│   ├── Trust-Evaluation-of-Services-Sides.md
│   ├── Trustworthy-privacy-plugin.md
│   ├── Unusual-activities.md
│   ├── Use-of-dummies.md
│   ├── User-data-confinement-pattern.md
│   └── Whos-Listening.md
├── principles
│   ├── Minimization.md
│   └── location.md
├── resources
│   ├── privacypatterns-sxsw.key
│   └── privacypatterns.key
└── specs
    ├── patterns-schemas.md
    ├── requirements.md
    └── website.md

--------------------------------------------------------------------------------
/About-page-text.md:
--------------------------------------------------------------------------------
## About the project ##

Privacy represents a broad variety of concerns — subjective, contextual, hard-to-define — that real people have about the flows of personal information. Location-based services provide a key example: a growing field that uses potentially sensitive data, where adoption has been held back by privacy concerns.

Translating these concerns (as well as corporate and legal liability) into technical artifacts — a process known generally as "privacy-by-design" — has proven difficult. How can we best convert lawyer speak into engineering speak? How can problems be elegantly anticipated early in the development process?

Drawing inspiration from Christopher Alexander and the success of software design patterns in improving communication about tried-and-true practices, we hope privacy patterns will:

* standardize language for privacy-preserving technologies
* document common solutions to privacy problems
* help LBS designers identify and address privacy concerns

We're currently compiling a first draft of some patterns to get things started, but our goal is for this to be a living document constructed by the community of engineers, designers, lawyers and regulators involved in this topic.

## Contributors ##

[Nick Doty](http://npdoty.name) is a PhD student at the [UC Berkeley, School of Information](http://ischool.berkeley.edu) and works on privacy at the [World Wide Web Consortium](http://www.w3.org).

[Mohit Gupta](http://m0hit.name) develops mobile software at [Location Labs](http://location-labs.com) with a focus on building privacy into products.

[Jeff Zych](http://jlzych.com) and [Rowyn McDonald](http://www.rowyn.com) are Master's students at the [School of Information](http://ischool.berkeley.edu) and built this site!

## Contact us ##

*If you're interested in privacy patterns — because you'd like to contribute your own content, support the project in some way or suggest an improvement — please contact us.*

--------------------------------------------------------------------------------
/Home.md:
--------------------------------------------------------------------------------
# Privacy Patterns Wiki

The privacy patterns wiki is a space for discussion about

* The writing of draft patterns
* Project goals and plans
* [[Specs]]
* [[Notes]]
* [[Resources]]
* [[Patterns]]
* [[Principles]]

and other miscellaneous items.

Note: The wiki is *public* for all to see, but can be modified only by project members.

--------------------------------------------------------------------------------
/June.6.md:
--------------------------------------------------------------------------------
## June 6th meeting with

### Our goals:

1. Standardize terminology
2. Document common practices (i.e. patterns)
3. Be a space for discussion about patterns and practices related to privacy. Code samples, living document, comments and feedback.

###

- looking at patterns that relate to privacy and intimacy and pulling learnings or translations (expectations) or conflicts into location information privacy patterns.

-

### identifying conflicts in location privacy and then looking for opportunities for patterns

- this might provide

####

--------------------------------------------------------------------------------
/Landing-page.md:
--------------------------------------------------------------------------------
## Specs for the landing page for privacypatterns.org

![first sketch for the privacypatterns landing page by @npdoty](media/images/landing_page.jpg)

--------------------------------------------------------------------------------
/Location-privacy-principles.md:
--------------------------------------------------------------------------------
There are several sources of principles, issues or factors to support or consider in designing a privacy-supportive system. These include the OECD guidelines and the FTC's Fair Information Practice Principles (FIPPs) (see also versions from the DHS and others). There are even location-specific sets, such as the CTIA guidelines (which are organized around a few principles).

[Mulligan and Doty](http://escholarship.org/uc/item/0rp834wf) present a list of ten factors, drawn from FIPPs, academic theories of privacy (including informational self-determination and contextual integrity) and industry experience.

* Appropriateness
* Minimization
* User Control
* Notice
* Consent
* Secondary Use
* Distribution
* Retention
* Transparency and Feedback
* Aggregation

For each of these factors, we've listed titles for potential privacy patterns that support privacy for that factor. (Some patterns may be listed for more than one factor.) Patterns may also be organized by layer/level (front-end/back-end/privacy policy) or even role (UI designer/developer/lawyer).

## Appropriateness ##
* ...
* ''something process-oriented: user research, evaluating existing norms, etc.''

## Minimization ##
* [[Fuzzing]] / [[Location granularity]]
* Summarization

## User Control ##
* User privacy preferences (''GeoPriv'')
* Privacy by default

## Notice ##
* Asynchronous notice
* Ambient notice
* Privacy policy

## Consent ##
* Opt (''contra opt out vs. opt in'')

## Secondary use ##
* Data use preferences (''GeoPriv'')

## Distribution ##
* Distribution preferences (''GeoPriv'')

## Retention ##
* Trail/history anonymization
* Encrypting location data

## Transparency and Feedback ##
* Privacy dashboard
* Per-recipient and per-datum transparency (''How do others see me?'')

## Aggregation ##
* Trail/history anonymization
* k-anonymity

--------------------------------------------------------------------------------
/Main-page-blurb.md:
--------------------------------------------------------------------------------
Privacy patterns are design solutions to common privacy problems — a way to translate "privacy-by-design" into practical advice for software engineering. We believe design patterns can help **document common practices** and **standardize terminology**. While we're starting with a set of patterns for location-based services, we hope to build **a living, community space** where all can contribute their own patterns.
--------------------------------------------------------------------------------
/Notes.md:
--------------------------------------------------------------------------------
# Meeting Notes

* [[june.1]]
* [[antipatterns]]
* [[tpl presentation feedback]]

_meeting notes are saved within the folder `notes/` in the git repository. If you are only using web based editing **(this)**, you can safely ignore this note_

--------------------------------------------------------------------------------
/Pattern-template.md:
--------------------------------------------------------------------------------
{%hyde _metadata blocks must begin with this_

title: Title of the pattern _variable names must be lower case_

type: pattern _values allowed: pattern, principle, page_

excerpt: This short description is a 1-2 line abstract to appear on the homepage and other shortened forms.

categories:
- location
- notice
- smartphone
- ui
_tags that classify the subject matter, the purpose of the pattern, and the applicable audience_

status: draft _values allowed: stub, draft, published_

address: http://npdoty.name, @m0hit _comma-separated list of authors, or more importantly, the people to contact or blame about the content, to be displayed in an <address> HTML element on the page. In the future, URLs may be de-referenced and hCard identities extracted; Twitter/github handles and email addresses are also possible._

%} _end the metadata block with this_

[TOC]

## Summary

## Context

## Problem

## Solution

## Consequences

## Examples

## See Also

## General Comments

--------------------------------------------------------------------------------
/Patterns-to-write.md:
--------------------------------------------------------------------------------
_List the title (and potentially brief description or informative links) of a possible pattern, even one you're not sure about, here, so that we can remember it and potentially write it up later._

* [WRITTEN] **Unlisted or unguessable URLs** -- rather than requiring a log in, just make a long URL that isn't listed anywhere and doesn't show up in search results; people can informally get privacy by sharing the URL only with some people, with the drawback that the URL can be forwarded around

* **Summarization** -- on the server side store just the processed conclusions/inferences/summaries of some data, rather than all the raw data itself

* **Privacy / data usage activity stream** -- the same way we get activity streams about our friends, why not see an activity stream of how other services are accessing and processing your data? (Also, could re-use/take advantage of standards and conventions around activity streams.)
  * Facebook ["Apps You Use"](https://www.facebook.com/settings/?tab=applications) access log includes day-by-day logs of which information was accessed by which Facebook app

* [WRITTEN] **Stripping invisible metadata** -- Remove invisible and potentially sensitive metadata (like geolocation in EXIF) from uploaded user-generated content.

  * See [Twitter image uploading](https://support.twitter.com/groups/31-twitter-basics/topics/111-features/articles/20156423-about-image-uploading-on-twitter)

* **Show examples of data** -- rather than abstract categories, show examples (pictures, addresses, specific sample pieces of data)

* **View-As** -- Facebook and some other social networks provide a feature where you can see how your info is seen by the public or by a particular other person.

* **Privacy policy** -- whether it's a pattern or an anti-pattern, it's certainly a common practice for addressing Notice.

* **Usage notification** -- an application lets a user know why it's asking for location at the time that it's asking. See, for example, Lion's new `purpose` parameter, the Camera app on iPhone, maybe old versions of Google Gears.

* **Privacy Grids** -- For a service that shares information captured during usage with different groups/applications, it is useful to represent the practices as a grid with one axis being the "kind" of data and the other being "shared with". E.g. the [Foursquare privacy grid](https://foursquare.com/privacy/grid).

--------------------------------------------------------------------------------
/Patterns.md:
--------------------------------------------------------------------------------
## List of Patterns

* [[Asynchronous notice]]
* [[Ambient notice]]
* [[Location granularity]]
* [[Privacy dashboard]]
* [[Strip invisible metadata]]

[[Patterns to Write]]

[[Location Privacy Principles]]

--------------------------------------------------------------------------------
/Principles.md:
--------------------------------------------------------------------------------
## List of Principles

* [[Location]]

--------------------------------------------------------------------------------
/Resources.md:
--------------------------------------------------------------------------------
## Resources

### Site copy
* [[Main page blurb]]
* [[About page text]]

### Presentations

* [[Presentation for policy audiences|resources/privacypatterns.key]]
* [[Draft presentation for engineers (SXSW)|resources/privacypatterns-sxsw.key]]

--------------------------------------------------------------------------------
/Specs.md:
--------------------------------------------------------------------------------
# Specifications

List of specifications for possible designs and reference specs for various parts of the privacy patterns project.
* [[landing page]]
* [[website]]
* [[requirements]]
* [[patterns schemas]]

--------------------------------------------------------------------------------
/media/images/Fire_Eagle_granularity.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/Fire_Eagle_granularity.png

--------------------------------------------------------------------------------
/media/images/Google_Dashboard_Latitude.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/Google_Dashboard_Latitude.png

--------------------------------------------------------------------------------
/media/images/Privacy-Label.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/Privacy-Label.png

--------------------------------------------------------------------------------
/media/images/chrome_location_icon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/chrome_location_icon.png

--------------------------------------------------------------------------------
/media/images/landing_page.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/landing_page.jpg

--------------------------------------------------------------------------------
/media/images/lion_location_icon.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/lion_location_icon.png

--------------------------------------------------------------------------------
/media/images/privacy-aware-wording.jpg:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/media/images/privacy-aware-wording.jpg

--------------------------------------------------------------------------------
/notes/TPL-presentation-feedback.md:
--------------------------------------------------------------------------------
* patterns
  * Alex: there are categories of software design patterns
    * is location one category?
    * anonymization might be one broad category

* Don: how do you know the patterns are the effective ones?
  * as a vocabulary, it's actually good to have even if the patterns are bad (anti-patterns), or to be able to discuss when or why you should use X
    * as opposed to just saying, "I think this is bad"

* understands immediately when we mention Christopher Alexander

* Alex: several different ways to categorize this
  * scoping down to location might just be artificially narrowing yourself,

* Leslie: "an architecture program" the functions that a client would want for a building
  series of pages that describe what the client wants from various points of view

* once you have patterns that you can share, you can improve each particular one
  * the client wants This, doesn't want This
  * e.g. transparency at the bottom of a door to avoid opening a door into a child

* the "program" is broken down into particular patterns

--------------------------------------------------------------------------------
/notes/antipatterns.md:
--------------------------------------------------------------------------------
# Patterns and Anti-patterns

Author: @npdoty

One supporting point for using "patterns" as the output is that we can also compare to "anti-patterns". I most often hear this with the "password anti-pattern" but it's a pretty widely used term. I can't find the link right now, but I think there was also a list of "dark patterns" for violating privacy or user expectations.

It might be useful in giving example best practices to also show examples of anti-patterns or bad practices; here's what you should avoid and why. Of course in many cases this will be kind of hard to show -- many privacy anti-patterns are problems because of their invisibility -- but may occasionally be a great help.

--------------------------------------------------------------------------------
/notes/june.1.md:
--------------------------------------------------------------------------------
# Meeting Notes, June 1 2011

Members: @npdoty @mohitgupta

- writing patterns.

- presentation for policy people
- presentation for eng/design people.
  - we need to find some venues for this presentation.

- basic spec for the website to showcase / comment on patterns.

- tools to use
  - what are the features we need.

- have a centralized repository for notes and stuff.

-

- set up repositories. content repository.

  - for patterns
  - for notes
  - for specs
  - actual code.

- The first spec:

  - in what format do the raw patterns

@mohit
- set up git repo.

- write an initial spec outline and structure
  - make space for notes.

- set up a task list (as bugs)

  - on github, so that we can track things.
  - maybe Trac (hosted on ischool)


@nick
- set up meeting with @nick, @rowyn, @jeff, @mohit and maybe @deirdre

- needs the spec for the site.
- some kind of plan for the summer.
- @mohit needs some repository that everyone can easily work with.
  maybe github + a private git repo with some mac scripts.
--------------------------------------------------------------------------------
/patterns/Active-broadcast-of-presence.md:
--------------------------------------------------------------------------------
{%hyde

title: "Active broadcast of presence"
type: pattern
excerpt: "Users may actively choose to automatically provide updates when they want to share presence information, to increase both the relevance of, and control over, their sharing."
categories:
- location
- mobile
- control
- update
status: pattern
use: Lawful-Consent
com: Reasonable-Level-of-Control Private-link Masquerade
address:

%}

[TOC]

## Context

Controllers provide an interface for acquiring information about the user. When one such user wants to share or broadcast their information, such as location or other presence data, that user may want to constrain the information. In this way, they may wish to prioritize data that is contextually relevant, or avoid a full stream of data which may be either noisy or intrusive. The controller wants the user to be able to provide this data at will, to maximize the applicability of their services. However, they do not want the user to regret providing too much data, nor to bother the user with constant requests.

## Problem

A service aims to acquire or broadcast a user's real-time data, particularly presence or location information, to a platform (e.g. a social network). They wish to do so without revealing sensitive data (e.g. private locations, histories, or health information), and without overwhelming recipients with noisy data or users with constant requests.

#### Forces/Concerns
- The controller wants to use the user's current data to provide more relevant information to the users of their service, but without violating the user's privacy.
- The user wants to participate in the service and provide useful information, but not all information, as they consider some aspects more sensitive than others.
- Users who intend to use the service do not want to have the service flooded with irrelevant data.

## Solution

Allow the user to actively choose when to share information, whether to _broadcast_ it, and when not to. Assume that sharing settings do not apply holistically to all situations and seek clarification when in doubt.

### [Structure]

The service may present distinct contexts in which to honor explicit settings, but in the absence of this context assume that further consent is required. The user may choose not to be asked again, but must make this decision explicit.

### [Implementation]

In addition to privacy settings with appropriate defaults, allow the user the option to be asked again every time the context changes.

By default, users should actively choose to broadcast, rather than the service deciding based on general settings which may not apply to the present context. Various contexts may be provided distinct settings.

In these situations users need only be reminded prior to setting the values themselves. After this, they may choose to be notified about broadcasting, but not about sharing with the service itself. In this way, the user may decide later.
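A minimal sketch of this consent flow is shown below. All names here are illustrative, not from any real API; the point is that unknown contexts always fall back to an active choice by the user, and blanket defaults are never silently assumed.

```python
# Hypothetical sketch of per-context broadcast consent. Unknown contexts
# always fall back to asking the user; "always allow/deny" applies only
# where the user has made that decision explicit for that context.
from enum import Enum


class Consent(Enum):
    ASK_EVERY_TIME = "ask"    # default: the user actively chooses each time
    ALWAYS_ALLOW = "allow"    # explicit, user-made opt-in for one context
    ALWAYS_DENY = "deny"      # explicit, user-made opt-out for one context


class BroadcastGate:
    def __init__(self, prompt_user):
        self._prompt_user = prompt_user   # UI callback returning True/False
        self._settings = {}               # context -> Consent

    def remember(self, context, consent):
        """Store an explicit decision for one context only, never globally."""
        self._settings[context] = consent

    def may_broadcast(self, context):
        # Settings never apply holistically: an unknown context means "ask".
        consent = self._settings.get(context, Consent.ASK_EVERY_TIME)
        if consent is Consent.ALWAYS_ALLOW:
            return True
        if consent is Consent.ALWAYS_DENY:
            return False
        return self._prompt_user(context)  # active choice by default


gate = BroadcastGate(lambda ctx: input(f"Broadcast in {ctx}? [y/N] ") == "y")
gate.remember("work", Consent.ALWAYS_DENY)  # explicit per-context opt-out
gate.may_broadcast("work")                  # False, without prompting
gate.may_broadcast("checked-in venue")      # unknown context -> prompt user
```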
## Examples

### [Known Uses]

- Foursquare check-in model prior to Pilgrim
- Google services

### [Related Patterns]

[Active broadcast of presence](Active-broadcast-of-presence) _complements_ [Reasonable Level of Control](Reasonable-Level-of-Control), [Masquerade](Masquerade), and [Private link](Private-link). With [Reasonable Level of Control](Reasonable-Level-of-Control), it can consider a larger audience with granular sharing choices. With [Masquerade](Masquerade), it may make the audience more specific. Finally, with [Private link](Private-link), the specific audience may be determined by who is provided with the link. As such, it may not be as private.

Like many patterns which affect user data, this pattern _must use_ [Lawful Consent](Lawful-Consent).

### [Sources]

Based on:

Chung, E. S., Hong, J. I., Lin, J., Prabaker, M. K., Landay, J. A., & Liu, A. L. (2004). Development and Evaluation of Emerging Design Patterns for Ubiquitous Computing. DIS ’04 Proceedings of the 5th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, 233–242. http://doi.org/10.1145/1013115.1013148

Bier, C., & Krempel, E. (2012). Common Privacy Patterns in Video Surveillance and Smart Energy. In ICCCT-2012 (pp. 610–615). Karlsruhe, Germany: IEEE.

Doty, N., Gupta, M., & Zych, J. (n.d.). PrivacyPatterns.org. Retrieved February 26, 2015, from http://privacypatterns.org/

--------------------------------------------------------------------------------
/patterns/Added-noise-measurement-obfuscation.md:
--------------------------------------------------------------------------------
{%hyde

title: "Added-noise measurement obfuscation"
type: pattern
excerpt: "Add some noise to service operation measurements, but make it cancel itself out in the long term."
categories:
- minimize
- hide
- obfuscate
status: draft
%}

[TOC]

## Summary

Add some noise to service operation measurements, but make it cancel itself out in the long term.

## Context

A service provider gets continuous measurements of a service attribute linked to an individual service user.

## Problem

The provision of a service may require repeated, detailed measurements of a service attribute linked to a data subject, e.g. to properly bill them for the service usage, or to adapt the service according to the demand load. However, these measurements may reveal further information (e.g. personal habits, etc.) when repeated over time.

## Solution

A noise value is added to the true, measured value before it is transmitted to the service provider, so as to obfuscate it. The noise abides by a previously known distribution, so that the best estimation for the result of adding several measurements can be computed, while an adversary would not be able to infer the real value of any individual measurement. Note that the noise need not be either additive or Gaussian. In fact, these may not be useful for privacy-oriented obfuscation. Scaling noise and additive Laplacian noise have proved more useful for privacy preservation.
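As a rough illustration (a toy sketch, not taken from the cited schemes), the following adds zero-mean Laplacian noise to individual meter readings: any single reported value is obfuscated, while the aggregate over many readings stays close to the truth because the noise cancels out on average.

```python
# Toy sketch of additive Laplacian noise on meter readings. The noise
# scale here is an assumed parameter; see the cited papers for schemes
# with formal (e.g. differential-privacy) guarantees.
import numpy as np

rng = np.random.default_rng(seed=42)


def obfuscate(reading_kw: float, scale: float = 0.5) -> float:
    """Report reading + Laplace(0, scale): zero-mean noise hides any
    single measurement but cancels out in long-run aggregates."""
    return reading_kw + rng.laplace(loc=0.0, scale=scale)


true_readings = rng.uniform(0.2, 3.0, size=10_000)  # simulated loads (kW)
reported = np.array([obfuscate(r) for r in true_readings])

# A single reported value reveals little about the true one...
print(true_readings[0], reported[0])
# ...but the aggregate remains reliable for billing and load estimation:
print(true_readings.mean(), reported.mean())
```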
A service provider can get reliable measurements of service attributes to fulfil its operating requirements; however, no additional personal information can be inferred from the aggregation of several measurements coming from the same user.

## Consequences

The pattern applies to any scenario where the use of a resource over time is being monitored (e.g. smart grid, cloud computing). The device providing the measurement must be trustworthy, in order to ensure that it abides by the established noise pattern.

Some information is lost due to the noise added. This loss of information may prevent the information from being exploited for other purposes. This is partly an intended consequence (e.g. avoid discovering user habits), but it may also preclude other legitimate uses.

In order for the information to be useful after noise addition, the number of data points over which measurements are aggregated (i.e. the size of the aggregated user base) needs to be high; otherwise, either the confidence interval would be too broad or differential privacy could not be effectively achieved.

## Examples

An electric utility operates a smart grid network with smart meters that provide measurements of the instantaneous power consumption of each user. The utility employs that information both to adapt the power distribution in a dynamic fashion, according to user demand at each moment, and to bill each client periodically, according to their aggregated consumption over the billing period. However, this information can also be exploited to infer sensitive user information (e.g. at what time they leave and come back home, etc.)

### [Known Uses]

- Bohli, J.-M.; Sorge, C.; Ugus, O., "A Privacy Model for Smart Metering," Communications Workshops (ICC), 2010 IEEE International Conference on, pp. 1-5, 23-27 May 2010
- Xuebin Ren; Xinyu Yang; Jie Lin; Qingyu Yang; Wei Yu, "On Scaling Perturbation Based Privacy-Preserving Schemes in Smart Metering Systems," Computer Communications and Networks (ICCCN), 2013 22nd International Conference on, pp. 1-7, July 30 2013-Aug. 2 2013
- Mivule, K. (2013). Utilizing noise addition for data privacy, an overview. arXiv preprint arXiv:1309.3958.
--------------------------------------------------------------------------------
/patterns/Aggregation-gateway.md:
--------------------------------------------------------------------------------
{%hyde

title: "Aggregation Gateway"
type: pattern
excerpt: "Encrypt, aggregate and decrypt at different places."
categories:
- aggregate
- hide
- restrict
status: draft
%}

[TOC]

## Summary

Encrypt, aggregate and decrypt at different places.

## Context

A service provider gets continuous measurements of a service attribute linked to a set of individual service users.

## Problem

The provision of a service may require detailed measurements of a service attribute linked to a data subject, to adapt the service operation at each moment according to the demand load. However, these measurements may reveal further information (e.g. personal habits, etc.) when repeated over time.

## Solution

A homomorphic encryption scheme (e.g. Paillier) is applied at the metering system, using a secret shared with the service provider (generated by applying e.g. Shamir's Secret Sharing Scheme).

The encrypted measurements from a group of users are transmitted to an independent yet trusted third party. This third party cannot know the content of each measurement (as it is encrypted), but it can still operate on that data in encrypted form (as the encryption system is homomorphic). There are different trusted third parties for each group of users. In order to improve the privacy resilience, each user may belong to several groups at the same time.

The trusted third party aggregates the measurements from all the users in the same group, without accessing the data in the clear at any time.

The service provider receives the encrypted, aggregated measurement and decrypts it with the shared secret.

A feeder metering system can be added as a measuring rod which introduces a comparison for each group of meters.

Let the service provider have reliable access to the aggregated load at every moment, so as to fulfil its operating requirements, without letting it access the individual load required from each specific service user.

## Consequences

There is a need to deploy trusted third parties that compute the aggregations over each group of users. Note that they need to be honest (i.e., they cannot collude with the other parties involved), but they need not respect confidentiality (as they only have access to encrypted contents). Smart meters are needed that have the computational resources to apply secret generation and homomorphic encryption procedures (note that this is trivial when dealing with cloud computational resources, but it is not always available in the case of e.g. smart grid systems). The potential range of measured values must be large enough to avoid brute force attacks. Robust homomorphic encryption schemes introduce a large computational load.

## Examples

An electric utility operates a smart grid network with smart meters that provide measurements of the instantaneous power consumption of each user.
The utility employs that information to adapt the power distribution in a dynamic fashion, according to the user demand at each moment.

### [Known Uses]

- Lu, R., Liang, X., Li, X., Lin, X., & Shen, X. (2012). EPPA: An efficient and privacy-preserving aggregation scheme for secure smart grid communications. Parallel and Distributed Systems, IEEE Transactions on, 23(9), 1621-1631.
- Rottondi, C., Verticale, G., & Capone, A. (2013). Privacy-preserving smart metering with multiple data consumers. Computer Networks, 57(7), 1699-1713.
- Kursawe, K., Danezis, G., & Kohlweiss, M. (2011, January). Privacy-friendly aggregation for the smart-grid. In Privacy Enhancing Technologies (pp. 175-191). Springer Berlin Heidelberg.

--------------------------------------------------------------------------------
/patterns/Anonymity-set.md:
--------------------------------------------------------------------------------
{%hyde

title: "Anonymity Set"
type: pattern
excerpt: "This pattern aggregates multiple entities into a set, such that they cannot be distinguished anymore."
categories:
- anonymity
- mix-networks
- obfuscation
- hide
- mix
status: draft
%}

[TOC]

## Summary

This pattern aggregates multiple entities into a set, such that they cannot be distinguished anymore.

## Context

This pattern is applicable in a messaging scenario, where an attacker can track routing information. Another possible scenario would be the storage of personal information in a database.

## Problem

In a system with different users, we have the problem that we can often distinguish between them. This enables location tracking, analysis of users' behaviour, and other privacy-infringing practices.

## Solution

There are multiple ways to apply this pattern. One possibility is to strip away any distinguishing features from the entities. If we do not have enough entities, such that the anonymity set would be too small, then we could even insert fake identities.

The goal of this pattern is to aggregate different entities into a set, such that distinguishing between them becomes infeasible.

## Consequences

One factor to keep in mind is that this pattern is useless if there are not many entities, such that the set of probable suspects is too small. What "too small" means depends on the exact scenario. Another factor is a possible loss of functionality.

## Examples

Assume that there are two companies: one is a treatment clinic for cancer and the other one a laboratory for research. The clinic releases its Protected Health Information (PHI) about cancer victims to the laboratory. The PHI consists of the patients' name, birth date, sex, zip code and diagnostics record. The clinic releases the datasets without the names of the patients, to protect their privacy. A malicious worker at the research laboratory wants to make use of this information and recovers the names of the patients.
The worker goes to the city council of a certain area to get a voter list from them. The two lists are matched for age, sex and location. The worker finds the name and address information in the voter registration data and the health information in the patient health data.

### [Known Uses]

Anonymity sets are in use in various routing obfuscation mechanisms like Onion Routing. Hordes is a multicast-based protocol that makes use of multicast routing like point-to-multipoint delivery, so that anonymity is provided. Mix Zone is a location-aware application that anonymizes user identity by limiting the positions where users can be located.

--------------------------------------------------------------------------------
/patterns/Anonymous-reputation-based-blacklisting.md:
--------------------------------------------------------------------------------
{%hyde

title: "Anonymous Reputation-based Blacklisting"
type: pattern
excerpt: "Get rid of troublemakers without even knowing who they are."
categories:
- separate
- hide
- restrict
status: draft
%}

[TOC]

## Summary

Get rid of troublemakers without even knowing who they are.

## Context

A service provider provides a service to users who access it anonymously, and who may make bad use of the service.

## Problem

Anonymity is a desirable property from the perspective of privacy. However, anonymity may foster misbehaviour, as users lack any fear of retribution.

A service provider can assign a reputation score to its users, based on their interactions with the service. Those who misbehave earn a bad reputation, and they are eventually added to a blacklist and banned from using the service. However, these scoring systems traditionally require the user identity to be disclosed and linked to their reputation score, hence they conflict with anonymity. This has led, for instance, Wikipedia administrators to decide to ban edit requests coming from the Tor network, as they cannot properly identify users who misbehave.

A Trusted Third Party (TTP) might be introduced in between the user and the service provider. The TTP can receive reputation scores from the service provider so as to enforce reputation-based access policies, while keeping the identity hidden from the service provider. However, this would require the user to trust the TTP not to be a whistle-blower.

How can we make users accountable for their actions while keeping them anonymous?

## Solution

First, the service provider provides their users with credentials for anonymous authentication.

Then, every time an authenticated user holds a session at the service, the service provider assigns and records a reputation value for that session, depending on the user's behaviour during the session. Note that these reputation values can only be linked to a specific session, but not to a specific user (as they have authenticated anonymously).
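A minimal sketch of the server-side bookkeeping this implies follows. The names are hypothetical, and the zero-knowledge machinery by which a returning user proves non-ownership of offending sessions (see the cited protocols) is deliberately elided; the sketch only shows that scores attach to session identifiers, never to user identities.

```python
# Sketch: the provider scores anonymous sessions, never user identities.
# The zero-knowledge step (proving non-membership in the blacklist) is
# represented only by the set it operates over; see the cited sources
# for real protocols such as BLACR and PERM.
import uuid

session_reputation: dict[str, int] = {}  # session_id -> reputation score


def open_session() -> str:
    sid = str(uuid.uuid4())              # no user identifier is involved
    session_reputation[sid] = 0
    return sid


def rate_session(sid: str, score: int) -> None:
    session_reputation[sid] = score      # e.g. negative if the user misbehaved


def blacklisted_sessions(threshold: int = 0) -> set[str]:
    """Offending sessions a returning user must prove they do not own."""
    return {sid for sid, s in session_reputation.items() if s < threshold}
```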
When the user comes back and starts a new session at the service, the service provider challenges the user to prove in zero-knowledge that they are not linked to any of the offending sessions (those that have a negative reputation associated). Zero-knowledge proofs allow the user to prove this without revealing their identity to the service provider. Different, alternative proofs have been proposed, e.g. prove that the user is not linked to any of the sessions in a set of session IDs, prove that the last K sessions of the user have good reputation, etc.

In practice, more complex blacklisting rules can be applied as well. For instance, several reputation scores can be assigned to the same session, each regarding a different facet of the user's behaviour. Then, the blacklisting thresholds may take the form of a Boolean combination or a linear combination over individual session and facet reputation values.

A service provider wants to prevent users who misbehave from accessing the service, without gaining access to their identity.

## Consequences

Different implementations may only be practical for services with a reduced number of users, require intense computations, limit the scope of the reputation to a constrained time frame, be vulnerable to Sybil attacks, etc. Nonetheless, protocols are being improved to overcome these and other issues. See the cited sources below for the specific discussion.

## Examples

A wiki allows any visitor to modify its contents, even without having been authenticated. Some malicious visitors may vandalize the contents. This fact is signalled by the wiki administrators. If a visitor coming from the same IP address keeps vandalizing the site, they will earn a bad reputation, and their IP will be banned from modifying the contents. However, users accessing through a Tor anonymity network proxy cannot be identified by their IPs, and thus their reputation cannot be tracked.

### [Known Uses]

- Au, M. H., Kapadia, A., & Susilo, W. (2012). BLACR: TTP-free blacklistable anonymous credentials with reputation.
- Au, M. H., & Kapadia, A. (2012, October). PERM: Practical reputation-based blacklisting without TTPs. In Proceedings of the 2012 ACM conference on Computer and communications security (pp. 929-940). ACM.

--------------------------------------------------------------------------------
/patterns/Attribute-based-credentials.md:
--------------------------------------------------------------------------------
{%hyde

title: "Attribute Based Credentials"
type: pattern
excerpt: "Attribute Based Credentials (ABC) are a form of authentication mechanism that allows flexible and selective authentication of different attributes about an entity without revealing additional information about the entity (zero-knowledge property)."
categories:
- anonymity
- authentication
- zero-knowledge
- minimize
- hide
- restrict
status: draft
%}

[TOC]

## Summary

Attribute Based Credentials (ABC) are a form of authentication mechanism that allows flexible and selective authentication of different attributes about an entity without revealing additional information about the entity (zero-knowledge property).

## Context

ABC can be used in a variety of systems, including the Internet and smart cards.

## Problem

Authentication of attributes classically requires full and unique authentication of an entity. For example, attributes (like age) could be put into a certificate together with the name of the user, email address, public key, and other data about that entity. To corroborate an attribute (for example, that the user is an adult) the certificate has to be presented and all information has to be revealed. This is not considered a privacy-preserving solution.

## Solution

There are multiple schemes to realize ABCs, and implementations are also available. They typically all include a managing entity that entitles issuers to issue credentials to entities, which can then act as provers of certain facts about the credentials towards verifiers.

A formal model can be found here.

To allow a user to selectively prove specific attributes, like age > 18, to a verifying party without revealing any additional information.

## Consequences

ABC schemes require substantial compute power or optimization, so implementation may not be straightforward. Some projects, like IRMA developed at Radboud University Nijmegen, have shown that even resource-restricted devices like smartcards can implement ABCs.

## Examples

You want to issue an ID card that holds a user's birthdate bd and can be used to prove that the card holder is old enough to view age-restricted movies in a cinema. Depending on the rating of the movie (minimum age x), the card holder can run a proof that:

"today - bd > x"

Multiple uses of the card at the same cinema should not be linkable.

### [Known Uses]

The most popular implementations include:

- IBM's IDEMIX, developed as part of the PRIME/PRIMELIFE project.
- Microsoft's U-Prove.
- Radboud University Nijmegen's IRMA project.

--------------------------------------------------------------------------------
/patterns/Discouraging-blanket-strategies.md:
--------------------------------------------------------------------------------
{%hyde

title: "Discouraging blanket strategies"
type: pattern
excerpt: "Give users the possibility to define a privacy level from a range of options each time they share content."
categories:
- control
- choose
status: pattern
use: Buddy-List Lawful-Consent Private-link
com: Selective-Access-Control Decoupling-[content]-and-location-information-visibility Negotiation-of-Privacy-Policy Reasonable-Level-of-Control Masquerade
address:

%}

[TOC]

## Context

Socially oriented services on the Internet allow their often diverse userbase to share content. These masses of users and shared content are also varied enough to discourage individual attention. Controllers prefer to protect themselves from additional complexity and investment into features which provide them with less data. Users, however, feel in need of privacy settings to distinguish their personal risk appetite from that of the norm. They each have their own ideas about the sensitivities of their information, which makes sufficient controls difficult to implement.

## Problem

Overly simplified privacy settings following all-or-nothing strategies could result in over-exposure, self-censoring, and unsatisfied users.

These all-or-nothing strategies could refer to privacy settings which apply holistically to all content, or to binary (or otherwise deficient) choices for public visibility.

#### Forces/Concerns
- The level of invasiveness depends upon the context of the content being shared.
- Users have differing levels of sensitivity attributed to contexts, and thus different levels at which to provide their content.
- They trust the controller to different extents.
- Controllers do not want to violate user privacy.
- They want to protect themselves from blame if a user's privacy is violated.
- They want users to produce content which is at least somewhat valuable. (For instance, when nobody can see the content there won’t be an impact in collaborative environments.)

## Solution

Provide users with the possibility to define a privacy level for content being shared with the controller, or with other users. Give them a range of visibilities, so that they can decide the access level of the content being shared according to different users, or service-defined groups.

### [Implementation]

Provide users with easily recognizable visual elements to define the privacy level for each content submission. Use controls such as (drop-down) lists, combo boxes, etc. to provide a range of possible privacy levels.

The privacy levels could be defined in terms of the social relationship of the users in question to the user who is sharing content. For instance: family, closest friends, colleagues, acquaintances, everybody. This is in line with [Reasonable Level of Control](Reasonable-Level-of-Control) and [Selective Access Control](Selective-Access-Control), where a user might be given the opportunity to define their own groups or set individual privacy levels.

The privacy controls themselves also need to be designed in such a way that it is very clear to users what each setting does and what it means for their privacy.

## Consequences

#### Benefits
- Grants users complete control over the privacy of the content being shared, which may lower the bar for them to share certain data they otherwise would not.

#### Liabilities
- Users could find having to set privacy settings every time they share content to be tedious.
  It would be necessary to define reasonable defaults for privacy settings (least effort for minimal sharing).

## Examples

- Facebook.
- Google Plus.

### [Related Patterns]

This pattern may be used by [Support Selective Disclosure](Support-Selective-Disclosure), as it makes up a foundation for a compound pattern. [Discouraging blanket strategies](Discouraging-blanket-strategies) may also be complemented by [Selective Access Control](Selective-Access-Control) and by [Decoupling [content] and location information visibility](Decoupling-[content]-and-location-information-visibility). These patterns support each other to provide more flexible privacy setting management to users.

[Discouraging blanket strategies](Discouraging-blanket-strategies) _complements_ [Reasonable Level of Control](Reasonable-Level-of-Control), which features selective and granular sharing and may consider a range of options. These methods complement each other when considering this range.

This pattern additionally _complements_ the [Negotiation of Privacy Policy](Negotiation-of-Privacy-Policy) pattern. While [Negotiation of Privacy Policy](Negotiation-of-Privacy-Policy) aims to let users opt in or opt out, this pattern handles the way in which users configure their privacy settings. If users were not able to opt in or out, then a blanket strategy would probably be in place.

Finally, this pattern also _may use_ [Buddy List](Buddy-List). Content the user decides to share could be selectively shown only to certain users on the [Buddy List](Buddy-List).

### [Sources]

Based on:

S. Ahern, D. Eckles, N. Good, S. King, M. Naaman, and R. Nair, “Over-Exposed? Privacy Patterns and Considerations in Online and Mobile Photo Sharing,” CHI ’07, pp. 357–366, 2007.

--------------------------------------------------------------------------------
/patterns/Dynamic-Privacy-Policy-Display.md:
--------------------------------------------------------------------------------
{%hyde

title: "Dynamic Privacy Policy Display"
type: pattern
excerpt: "Provide standardized contextual policy information on the nature and risks of disclosure through tooltips."
categories:
- inform
- notify
use: Privacy-Policy-Display
com: Policy-matching-display Platform-for-Privacy-Preferences
status: pattern
address:

%}

[TOC]

## Context

Controllers are mandated by various laws and regulations to ensure that users, their data subjects, are adequately informed before requesting consent. Failing this, the consent loses legitimacy and the controller may face repercussions. However, the main mechanism for supplying this information resides with privacy policies, which must also conform to legislative norms. The language this necessitates is long and complex, which jeopardizes the chances of users understanding it. This information can be summarized, and otherwise reworded to make the content more accessible to users, though typically the length of such summaries is still quite long.
## Problem

Not all contexts are suitable for extensive privacy policy information, yet users often still need to be able to obtain additional data without breaking those contexts.

#### Forces and Concerns

- Controllers need informed consent for collection, sometimes with limited screen space available, yet users do not typically read privacy policies on their own
- Users do not want to spend time and effort reading through privacy policies
- Controllers want users to actually use their service (or product), but users will not do so if the cost of doing so entails disproportionate effort
- Users want to be able to get to using the service quickly, without needing to visit multiple policy pages

## Solution

Provide the user with additional relevant policy information on hover or tap, by way of 'tooltips', to best inform them given contextual limitations. In a mobile setting these tooltips may unobtrusively become available to tap when the relevant control is most in focus (i.e. selected, centered, or occupying most of the screen).

This information may highlight the nature and potential consequences of the disclosure, and should be displayed consistently.

### [Implementation]

_The information should be provided to the user where it is needed. Therefore the tooltip should appear on demand (i.e., on need of information). This could be, for example, in a login dialog as soon as the user navigates the mouse into the concerning part of the interface. The tooltip should then be made visible to the user and contain all necessary information for making an informed decision._

## Consequences

_By displaying the relevant [information pertaining to] privacy policies whenever they apply to what the user is currently doing or about to do, the user's awareness of what will happen with the information they're about to share is increased._

However, users may also happen not to trigger the tooltip, especially when using a mobile device. As such it is important that they are aware of its existence, and its importance, given the current context.

## Examples

When a user needs to log in and is given numerous options, with limited space provided, each option can have an assigned tooltip. These can appear on hover, tap, or scroll, where necessary appearing with less opacity until the user taps the tooltip itself. It can also lead to further detail through '(see more)' in a recognizable blue underlined hyperlink format. To encourage use of this, a variant may either scroll through detail or show a visible scroll bar. Not needing the user to leave the application or webpage will require less effort on their part.

### [Known Uses]

The PrimeLife HCI Pattern Collection (V2) features a prototype using tooltips to convey policy information on hover.

### [Related Patterns]

This pattern _complements_ [Policy Matching Display](Policy-matching-display) and [Platform for Privacy Preferences](Platform-for-Privacy-Preferences). In both cases the solutions are different, but share the same context.

With [Policy Matching Display](Policy-matching-display), this pattern may add mismatches between preferences and the standardized policies to 'tooltips' therein.
With the [Platform for Privacy Preferences](Platform-for-Privacy-Preferences) pattern, likewise, the two may work together to show the user how the privacy policy compares to user preferences. 93 | 94 | This pattern _may use_ [Privacy Policy Display](Privacy-Policy-Display). While this pattern adds additional elements to the solution presented by [Privacy Policy Display](Privacy-Policy-Display), it is a _uses_ relationship rather than _extends_, as both patterns approach different problems. 95 | 96 | ### [Sources] 97 | 98 | 99 | S. Fischer-Hübner, C. Köffel, J.-S. Pettersson, P. Wolkerstorfer, C. Graf, L. E. Holtz, U. König, H. Hedbom, and B. Kellermann, “HCI Pattern Collection - Version 2,” 2010. 100 | 101 | C. Graf, P. Wolkerstorfer, A. Geven, and M. Tscheligi, “A Pattern Collection for Privacy Enhancing Technology,” The Second International Conferences of Pervasive Patterns and Applications (Patterns 2010), vol. 2, no. 1, pp. 72–77, 2010. 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | -------------------------------------------------------------------------------- /patterns/Encryption-user-managed-keys.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Encryption with user-managed keys" 4 | excerpt: "Use encryption in such a way that the service provider cannot decrypt the user's information because the user manages the keys." 5 | status: draft 6 | display_in_list: True 7 | type: pattern 8 | categories: 9 | - encryption 10 | - control 11 | - mobile 12 | - cloud 13 | - hide 14 | - restrict 15 | address: David Houlding 16 | %} 17 | 18 | [TOC] 19 | 20 | 21 | 22 | 23 | 24 | 25 | ## Summary 26 | 27 | 28 | Use encryption in such a way that the service provider cannot decrypt the user's information because the user manages the keys. 29 | 30 | 31 | Enable encryption, with user-managed encryption keys, to protect the confidentiality of personal information that may be transferred or stored by an untrusted 3rd party. 32 | 33 | Supports [user control](User-control), [cloud computing](Cloud Computing) and [mobile](Mobile). 34 | 35 | ## Context 36 | 37 | 38 | A user wants to store or transfer their personal data through an online service while protecting their privacy, specifically the confidentiality of their personal information. Sources of unauthorized access may include the online service provider itself, third parties such as its partners (for example, for backup), or government surveillance, depending on the geographies the data is stored in or transferred through. 39 | 40 | ## Problem 41 | 42 | 43 | How can a user store or transfer their personal information through an online service while ensuring their privacy and specifically preventing unauthorized access to their personal information? 44 | 45 | 46 | Requiring the user to do encryption key management may annoy or confuse them, and they may revert to no encryption, to encryption with the online service provider managing the encryption key (affording no protection from that provider), or to an encryption key that is weak, reused, written down, and so forth. 47 | 48 | Some metadata may need to remain unencrypted to support online service provider or third-party functions, for example file names for cloud storage, or routing information for transfer applications, exposing that metadata to risks of unauthorized access, server-side indexing for searching, or de-duplication.
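To make this concrete, here is a minimal sketch of such client-side encryption with a user-managed key, assuming Python's third-party `cryptography` package; the names, parameters, and the `upload` call are illustrative assumptions, not a vetted implementation. The key is derived on the client from a user passphrase, so the provider only ever receives ciphertext:

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passphrase: str, salt: bytes) -> bytes:
    # A slow key-derivation function mitigates weak passphrases somewhat;
    # the passphrase itself never leaves the client.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))


salt = os.urandom(16)                 # stored alongside the ciphertext
f = Fernet(derive_key("correct horse battery staple", salt))

ciphertext = f.encrypt(b"personal information")
# upload(ciphertext, salt)            # hypothetical call: the provider
#                                     # receives only ciphertext and salt
assert f.decrypt(ciphertext) == b"personal information"
```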
49 | 50 | If the service provider has written the client side software that does the client side encryption with a user-managed encryption key, there can be additional concerns regarding whether the client software is secure or tampered with in ways that can compromise privacy. 51 | 52 | ## Solution 53 | 54 | 55 | Encrypt the user's personal information prior to storing it with, or transferring it through, an online service. In this solution the user shall generate a strong encryption key and manage it themselves, specifically keeping it private and unknown to the untrusted online service or third parties. 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | ## Examples 78 | 79 | 80 | * [Spider Oak](https://spideroak.com/): online backup, sync and sharing enabling user-managed personal information in a zero-knowledge privacy environment. 81 | * [Least Authority](https://leastauthority.com/): secure off-site backup system with client-side encryption. 82 | * [LastPass](https://lastpass.com/): encrypted credentials and personal information database with user-managed encryption keys. 83 | 84 | [Some](http://zeroknowledgeprivacy.org/) have used the term "zero-knowledge" to describe this pattern; however, "zero-knowledge proof" is a cryptographic term with a [distinct meaning](https://en.wikipedia.org/wiki/Zero-knowledge_proof). 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | -------------------------------------------------------------------------------- /patterns/Federated-privacy-impact-assessment.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Federated Privacy Impact Assessment" 4 | type: pattern 5 | excerpt: "The impact of personal information in a federation is 6 | more than the impact in the federated parties" 7 | categories: 8 | - risk-management 9 | - procedure 10 | - enforce 11 | - create 12 | status: draft 13 | %} 14 | 15 | [TOC] 16 | 17 | 18 | 19 | 20 | 21 | 22 | ## Summary 23 | 24 | 25 | The impact of personal information in a federation is more than the 26 | impact in the federated parties 27 | 28 | ## Context 29 | 30 | 31 | Identity Management scenarios (that is, when the roles of the Identity 32 | Provider and the Service Provider are separated). 33 | 34 | ## Problem 35 | 36 | 37 | Identity Management solutions were introduced to decouple the 38 | functions related to authentication, authorization, and management of 39 | user attributes, on the one hand, and service provision on the other 40 | hand. Federated Identity Management allows storing a data subject's 41 | identity across different systems. Together, these form a 42 | Federation that involves complex data flows. 43 | 44 | Federated Management solutions can be used to improve privacy (e.g. by 45 | allowing service providers to offer their services without knowing the 46 | identity of their users). However, the complexity of data flows and 47 | the possibility of collusion between different parties entail new 48 | risks and threats regarding personal data. 49 | 50 | ## Solution 51 | 52 | 53 | A Privacy Impact Assessment is conducted by all the members of the 54 | federation, both individually and in conjunction, so as to define 55 | shared privacy policies, prove they are met, and demonstrate the 56 | suitability of the architecture, to the benefit of all the members.
57 | 58 | 59 | Deal with privacy risks associated with the federation of different 60 | parties in an Identity Management solution. 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | ## Consequences 73 | 74 | 75 | 76 | 77 | 78 | The consequences depend on the results of the privacy-impact analysis. 79 | 80 | 81 | 82 | 83 | 84 | 85 | ## Examples 86 | 87 | 88 | An Identity Provider issues pseudonyms to authenticate users at 89 | third-party Service Providers, which can in turn check the 90 | authenticity of these pseudonyms at the Identity Provider, without 91 | getting to know the real user identity. However, the Identity Provider 92 | knows all the services requested by the users, which discloses 93 | personal information to the Identity Provider and allows it to profile 94 | the users. 95 | 96 | ### [Known Uses] 97 | 98 | 99 | The New Federated Privacy Impact Assessment (F-PIA). Building Privacy 100 | and Trust-enabled Federation. Information and Privacy Commissioner of 101 | Ontario & Liberty Alliance Project, January 2009. 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | -------------------------------------------------------------------------------- /patterns/Identity-federation-do-not-track-pattern.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Identity Federation Do Not Track Pattern" 4 | type: pattern 5 | excerpt: "All information has been extracted from 6 | http://blog.beejones.net/the-identity-federation-do-not-track-pattern 7 | 8 | The Do Not Track Pattern makes sure that neither the Identity 9 | Provider nor the Identity Broker can learn the relationship 10 | between the user and the Service Providers the user uses." 11 | categories: 12 | - identity-management 13 | - enforce 14 | - uphold 15 | status: draft 16 | %} 17 | 18 | [TOC] 19 | 20 | 21 | 22 | 23 | 24 | 25 | ## Summary 26 | 27 | 28 | All information has been extracted from 29 | http://blog.beejones.net/the-identity-federation-do-not-track-pattern 30 | 31 | The Do Not Track Pattern makes sure that neither the Identity Provider 32 | nor the Identity Broker can learn the relationship between the user 33 | and the Service Providers the user uses. 34 | 35 | ## Context 36 | 37 | 38 | This pattern is focused on identity federation models. 39 | 40 | ## Problem 41 | 42 | 43 | When an identity system provides identifying information about a user 44 | and passes this to a third party service, different parties can 45 | correlate the data and derive additional information. 46 | 47 | ## Solution 48 | 49 | 50 | Include an orchestrator component that must act on behalf of, and be 51 | controlled by, the user. The orchestrator makes sure that the identity 52 | broker can’t correlate the original request from the service provider 53 | with the assertions that are returned from the identity provider. The 54 | correlation can only be done within the orchestrator, but that’s no 55 | issue because the orchestrator acts on behalf of the user, possibly on 56 | the user's device.
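As a toy illustration of where that correlation boundary sits, consider the following Python sketch; the classes, pseudonym scheme, and calls are hypothetical stand-ins for real federation protocols with signed assertions:

```python
import secrets


class IdentityProvider:
    """Issues assertions about a user; it is never told which service asks."""

    def assert_attributes(self, pseudonym: str, attrs: list[str]) -> dict:
        # In a real deployment this would be a signed assertion.
        return {"subject": pseudonym, "attrs": {a: "..." for a in attrs}}


class Orchestrator:
    """Runs under the user's control (e.g. on their device); the only
    place where a service provider's request meets the IdP's answer."""

    def __init__(self, idp: IdentityProvider):
        self.idp = idp

    def handle_request(self, service_provider: str, attrs: list[str]) -> dict:
        pseudonym = secrets.token_hex(16)      # fresh per request
        assertion = self.idp.assert_attributes(pseudonym, attrs)
        # Only here are service_provider and assertion seen together, so
        # neither the IdP nor a broker can build a service-usage profile.
        return {"audience": service_provider, "assertion": assertion}
```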
57 | 58 | 59 | Avoid the correlation of end-user and service-provider data. 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | ## Consequences 72 | 73 | 74 | 75 | 76 | 77 | In practice, the orchestrator could run in the user's browser as 78 | a JavaScript program or as an app on their device. 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | ### [Known Uses] 91 | 92 | 93 | Identity federations and ecosystems 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | -------------------------------------------------------------------------------- /patterns/Informed-Credential-Selection.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Informed Credential Selection" 4 | type: pattern 5 | excerpt: "Ensure users are informed of the potential privacy consequences of sharing various authenticating data." 6 | categories: 7 | - inform 8 | - explain 9 | com: Informed-Secure-Passwords 10 | status: pattern 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | ### [Also Known As] 18 | 19 | 20 | Credential Selection 21 | 22 | ## Context 23 | 24 | 25 | 26 | Controllers offering services (or products) often provide a means to authenticate users in order to permit them access. This access can be to a secure function, such as fulfilling a transaction. Since this action may have difficult-to-reverse consequences, the controller needs to be certain of the user's identity and informed consent. In order for consent to be valid, the controller must ensure that it is informed, as well as freely given, specific, and unambiguous. In order to determine identity, however, personally identifying information is needed. Some methods of authentication are also more invasive than others, allowing users to provide more information than necessary. 27 | 28 | ## Problem 29 | 30 | 31 | Credentials which users supply may be more invasive than necessary; supplying them is a kind of consent which legally must be informed. 32 | 33 | #### Forces and Concerns 34 | 35 | 36 | 37 | - Users want to authenticate so that they know only they can obtain access 38 | - Users do not want to provide more information than they feel comfortable with, or than is necessary 39 | - Controllers want to prevent unauthorized access to user actions, as this can seriously affect the user's experience 40 | - Controllers do not want to process user data for which they do not have valid consent 41 | 42 | ## Solution 43 | 44 | 45 | Allow granular credential selection which explains to users the various ways in which personal data can be used, including who may access it, and how it may be used to derive further information. 46 | 47 | 48 | 49 | 50 | 51 | 52 | ### [Implementation] 53 | 54 | 55 | _Present the user with a selection mechanism that shows the user what possible choices are available and then show a summary page that contains the data to be sent._ The explanation of consequences must be shown as the user investigates the available credentials. It should be clear to the user which information authenticates them with the least privacy impact. 56 | 57 | One mental model for this could be the use of a credit card for identification. See the HCI Pattern Collection for further information on this example. 58 | 59 | _Independent of a mental model, the credential selection UI should contain two steps, namely, selection and summary.
During the first step, all graphical elements of the selection mechanism should be based on the mental model. Thus, if working with the card-based metaphor, this should be apparent from the UI. During the second step, the invoked mental model is not as important, as the key issue is to clearly convey which selected data and which meta-data is being sent._ 60 | 61 | ## Consequences 62 | 63 | 64 | _Allows a user to identify themselves in a granular way, controlling how much information they reveal by doing so._ 65 | 66 | _This approach should be used to make it easy for users to select the appropriate credentials. It also should inform them about which (personal) data and meta-data the recipient of the information will have after the transaction._ 67 | 68 | 69 | 70 | 71 | 72 | 73 | ## Examples 74 | 75 | 76 | Jiang et al. (2010). "A Classified Credential Selection Scheme with Disclosure-minimizing Privacy". International Journal of Digital Content Technology and its Applications, 4 (9), December 2010. 201 - 211. 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | ### [Related Patterns] 89 | 90 | 91 | This pattern _complements_ [Informed Secure Passwords](Informed-Secure-Passwords). This pattern focuses on informing users about the data released for authentication in certain contexts, elaborating on how such data might be used. [Informed Secure Passwords](Informed-Secure-Passwords) focuses on encouraging better use of password-based authentication. With overlapping contexts, these patterns both provide assistance around password-based authentication, together enhancing awareness and usage. [Preventing Mistakes or Reducing Their Impact](Preventing-Mistakes-or-Reducing-Their-Impact) may also benefit from the use of this pattern. It seeks to prevent users from unintentionally having their personal data accessed, particularly from automatic disclosure. However, due to this contextual distinction, the benefit is less clearly complementary. 92 | 93 | This pattern has an implicit relationship with [Unusual Activities](Unusual-activities). This _complements_ relationship stems from [Informed Secure Passwords](Informed-Secure-Passwords). While not as strongly connected, this pattern may also benefit from detection and response to compromised authentication methods. 94 | 95 | ### [Sources] 96 | 97 | 98 | S. Fischer-Hübner, C. Köffel, J.-S. Pettersson, P. Wolkerstorfer, C. Graf, L. E. Holtz, U. König, H. Hedbom, and B. Kellermann, “HCI Pattern Collection - Version 2,” 2010. 99 | 100 | C. Graf, P. Wolkerstorfer, A. Geven, and M. Tscheligi, “A Pattern Collection for Privacy Enhancing Technology,” The Second International Conferences of Pervasive Patterns and Applications, vol. 2, no. 1, pp. 72–77, 2010. 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | -------------------------------------------------------------------------------- /patterns/Informed-Implicit-Consent.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Informed Implicit Consent" 4 | type: pattern 5 | excerpt: "Controllers must provide unavoidable notice of a user's implicit consent to the processing of their data, where reasonable to do so."
6 | categories: 7 | - inform 8 | - notify 9 | com: Ambient-notice 10 | status: pattern 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | ### [Also Known As] 18 | 19 | 20 | Permission to Use Non-sensitive Personal Data - Implicit Consent 21 | 22 | ## Context 23 | 24 | 25 | 26 | Processing of user (data subject) information, particularly that which potentially identifies a user or group, requires their explicit informed consent. Inaction is not considered valid consent. However, not all instances make this feasible. As such there are circumstances in which legitimate interests of the controller may justify collection without first obtaining a clear statement of permission to do so. Security footage around a controller's premises, or fraud detection, for example, cannot reasonably be made optional to users of the service (or product). What constitutes legitimate interests in these contexts depends on the relationship and reasonable expectations between the controller and user. As such, sensitive data, or special categories of data, are more difficult to justify. 27 | 28 | ## Problem 29 | 30 | 31 | A controller needs to collect and otherwise process reasonable information to fulfill their legitimate interests regarding a user, but cannot feasibly acquire each user's explicit consent. 32 | 33 | #### Forces and Concerns 34 | 35 | 36 | 37 | - Users should not have to frequently and explicitly consent for regular, everyday, ubiquitous services which are expected and acceptable for legitimate interests 38 | - Users do not want to have certain data processed, and need a way to avoid implicitly consenting to it 39 | - Controllers do not want to have to obtain explicit consent in real-time bulk for expected and acceptable legitimate interests 40 | - Controllers want to ensure that legitimate consent exists before processing 41 | 42 | ## Solution 43 | 44 | 45 | Provide clear and concise notice that by using the service, the user implicitly consents to the processing necessary to fulfill legitimate interests. Ensure that this notice is perceived prior to the application of the effects it describes. 46 | 47 | 48 | 49 | 50 | 51 | 52 | ### [Implementation] 53 | 54 | 55 | Ensure that users are informed sufficiently prior to any processing with clear and concise notice, the complete detail of which should also be accessible. In digital media, this is straightforward, working similarly to Cookie Walls on websites. Users should be given the opportunity to choose not to use the service and therefore not be subject to the processing it requires. 56 | 57 | In physical instances it is more difficult to be sure that users take note of this. On devices, lights have often been used to convey a recording state. This, while clear once a person is already subject to processing, is not sufficient on its own. Instead, large signs are commonplace to indicate the use of data collection. The most familiar example would be "Smile, you're on camera". Of course, this is less clear than "Our premises are recorded for security purposes; by entering, you consent to this processing. See more info at [address]". These signs should be posted, visible prior to recording, at all entrances or otherwise where applicable. 58 | 59 | ## Consequences 60 | 61 | 62 | Users will be informed before implicitly providing consent to reasonable processing for legitimate interests of the controller.
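For the digital case, a minimal sketch of such a notice gate could look as follows; it is framework-free Python, and the handler and flag names are hypothetical. Nothing is processed until the notice has been acknowledged, and the user can always walk away:

```python
NOTICE = ("This service processes connection metadata for fraud detection. "
          "By continuing, you consent to this processing. Details: /privacy")


def serve(request: dict) -> str:
    if not request.get("notice_acknowledged"):
        # Show the notice first and offer a way out; no personal data
        # has been processed at this point.
        return f"303 See Other: /notice ({NOTICE})"
    return process_with_legitimate_interests(request)  # hypothetical handler
```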
63 | 64 | 65 | 66 | 67 | 68 | 69 | ## Examples 70 | 71 | 72 | Given a Sensor Network, Provider, and Controller, the collection of data is delegated by the Controller through the Provider to the Sensor Network. The Sensor Network collects some data with explicit consent, but this data may also be personal for a user who has not given such consent. This data may be potentially identifying, and thus the user should be informed prior to its processing. The Controller must ensure that the Provider of the Sensor Network provides any potential users with unambiguous warning of the collection, given that individual consent is infeasible. This may make use of a clear and legible warning sign. The Sensor Network itself should also be visible and obvious, clearly indicating when collection is ongoing. Societal norms may moderate this; for example, security cameras in some contexts (commercial areas where valuables may be stolen) need little warning. 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | ### [Related Patterns] 85 | 86 | 87 | This pattern _complements_ [Ambient Notice](Ambient-notice). By extension, it may also be implicitly _complemented by_ [Asynchronous Notice](Asynchronous-notice) as an alternative approach. Furthermore, this implicit connection has ties to [Preventing Mistakes or Reducing Their Impact](Preventing-Mistakes-or-Reducing-Their-Impact). 88 | 89 | In all these cases, a user needs to be made aware of the implicit nature of their consent. [Ambient Notice](Ambient-notice) can complement this intention with continuous, but unobtrusive notice. Its alternative can do so in a modal, continual manner, where context changes might justify it. This is less complementary depending on how it is applied. In either case, users need to be prevented from mistakenly sharing personal data. 90 | 91 | ### [Sources] 92 | 93 | 94 | Y. Asnar, V. Bryl, R. Bonato, L. Campagna, D. Donnan, P. El Khoury, P. Giorgini, S. Holtmanns, M. Martinez-Juste, F. Massacci, J. Porekar, C. Riccucci, A. Saidane, M. Seguran, R. Thomas, and A. Yautsiukhin, “Initial Set of Security and Privacy Patterns at Organizational Level,” 2007. 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | -------------------------------------------------------------------------------- /patterns/Location-granularity.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Location Granularity" 4 | excerpt: "Support minimization of data collection and distribution. Important when a service is collecting location data from or about a user, or transmitting location data about a user to a third party." 5 | status: draft 6 | type: pattern 7 | categories: 8 | - location 9 | - minimization 10 | - abstract 11 | - group 12 | %} 13 | 14 | [TOC] 15 | 16 | 17 | 18 | 19 | 20 | 21 | ## Summary 22 | 23 | 24 | Support minimization of data collection and distribution. Important when a service is collecting location data from or about a user, or transmitting location data about a user to a third party. 25 | 26 | 27 | Support [minimization](Minimization) of data collection and distribution. 28 | 29 | ## Context 30 | 31 | 32 | When a service is collecting location data from or about a user, or transmitting location data about a user to a third party. 33 | 34 | ## Problem 35 | 36 | 37 | Many location-based services collect current or ongoing location information from a user in order to provide some contextual service (nearest coffee shop; local weather; etc.).
Collecting more information than is necessary can harm the user's privacy and increase the risk for the service (in the case of a security breach, for example), but location data may still need to be collected to provide the service. Similarly, users may want the advantages of sharing their location from your service to friends or to some other service, but sharing very precise information provides a much greater risk to users (of re-identification, stalking, physical intrusion, etc.). 38 | 39 | 40 | Accepting or transmitting location data at different levels of granularity generally requires a location hierarchy or geographic ontology agreed upon by both services and a more complex data storage model than simple digital coordinates. 41 | 42 | Truncating latitude and longitude coordinates to a certain number of decimal places may decrease precision, but is generally not considered a good fuzzing algorithm. (For example, if a user is moving in a straight line and regularly updating their location, truncated location information will occasionally reveal precise location when the user crosses a lat/lon boundary.) Similarly, using "town" rather than lat/lon may occasionally reveal more precise data than expected when the user crosses a border between two towns. 43 | 44 | ## Solution 45 | 46 | 47 | Since much geographic data inherently has different levels of precision (see [geographic ontologies](Geographic ontologies), for example) -- like street, city, county, state, country -- there may be natural divisions in the precision of location data. By collecting or distributing only the necessary level of granularity, a service may be able to maintain the same functionality without requesting or distributing potentially sensitive data. A local weather site can access only the user's zip code to provide relevant weather without ever accessing precise (and therefore sensitive) location information. 48 | 49 | A similar pattern is [location fuzzing](Fuzzing) which uses an algorithm to decrease the accuracy of location data without changing its lat/lon precision. This may be useful if the application only functions on latitude/longitude data, but can be vulnerable to attack. 50 | 51 | In some cases, less granular data may also better capture the intent of a user (that tweet was about Sproul Plaza in general, not that particular corner) or be more meaningful to a recipient ("Nick is in Berkeley, CA" means more to my DC relatives than the particular intersection). For more along these lines, see, for example, [the Meaningful Location Project](http://www.meloproject.com/team). 52 | 53 | 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | ## Examples 74 | 75 | 76 | 1. _Fire Eagle location hierarchy_ 77 | 78 | ![Fire Eagle granularity screenshot](media/images/Fire_Eagle_granularity.png) 79 | 80 | Yahoo! Fire Eagle allows user to provide location information to applications using eight different "levels" of granularity in their [hierarchy](http://fireeagle.yahoo.net/developer/documentation/location): 81 | 82 | * No information 83 | * As precise as possible 84 | * Postal code 85 | * Neighborhood 86 | * Town 87 | * Region 88 | * State 89 | * Country 90 | 91 | Fire Eagle specifically requires that recipient applications be written to handle data at any of the levels, and allows updating the user's location at any level of granularity. 92 | 93 | 2. _Twitter "place" vs. 
"exact location"_ 94 | 95 | [Twitter](https://support.twitter.com/articles/78525-about-the-tweet-location-feature) allows users to tag a tweet with either exact coordinates, a Twitter "place" (a town, neighborhood or venue) or both. 96 | 97 | 3. _Geode_ 98 | 99 | One of the fore-runners to the W3C Geolocation API, Firefox's experimental Geode feature allowed JavaScript access to the current location at four different levels of granularity.{{fact}} 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | -------------------------------------------------------------------------------- /patterns/Negotiation-of-Privacy-Policy.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Negotiation of Privacy Policy" 4 | type: pattern 5 | excerpt: "Over time, build user preferences from a privacy-preserving default semi-automatically, through opt-in/opt-out, semantics, and informed solicitations." 6 | categories: 7 | - control 8 | - choose 9 | status: pattern 10 | use: Lawful-Consent Private-link 11 | sim: Enable-Disable-Functions 12 | ref: Reasonable-Level-of-Control 13 | com: Discouraging-blanket-strategies Decoupling-[content]-and-location-information-visibility Masquerade 14 | address: 15 | 16 | %} 17 | 18 | [TOC] 19 | 20 | 21 | 22 | 23 | 24 | 25 | ## Context 26 | 27 | 28 | Often when users find a service (or product) they would like to use, and begin signing-up, they are immediately exposed to assumptions which may not hold for them. As users have differing privacy priorities, a controller cannot guess as to what settings best accommodate them. Since these preferences may be intricate, users cannot be expected to specify them in detail all at once or before using the service. 29 | 30 | ## Problem 31 | 32 | 33 | Users have sometimes wildly different priorities regarding their privacy, though a controller does not know these details when a user first joins a service. There is a temptation to provide these users the settings the average user uses. 34 | 35 | #### Forces/Concerns 36 | - Users are different and do not all fall under one universal setting without some being unsatisfied. 37 | - The controller wants to cater to user individuality. 38 | - Getting users to specify all of their individual tastes before using a service will make some users abandon the process. Some settings may be missed, and many users will be upset. 39 | 40 | ## Solution 41 | 42 | 43 | As users begin to use a service, determine their individual privacy sensitivities by allowing them to opt-in/opt-out of account details, targeted services, and telemetry. When a user's preference is not known, assume the most privacy-preserving settings. It should always take more effort to over-share than to under-share. 44 | 45 | 46 | 47 | 48 | 49 | 50 | ### [Implementation] 51 | 52 | 53 | Unauthenticated users should enjoy the most privacy-preserving defaults. When a user joins the service, they may be presented with [excerpts or summaries of] a privacy policy, which they can use to inform their choices. Using simple, recognizable controls, users can be asked to opt-in (for explained benefits) or opt-out (at explained costs) before any of their data is used. They can then be asked for additional consents further down the line as they become contextually relevant. 
54 | 55 | In this way, only the needed consent is asked for as the controller's understanding of the user's preferences improves. This can allow the service to determine which solicitations users are individually likely to consider, and which ones will only waste their time or upset them. 56 | 57 | ## Consequences 58 | 59 | 60 | Private defaults will often not be the appropriate settings for a user, as most users may be less privacy-concerned. The additional effort taken to share more, with users or the controller, will reduce the valuable data collected. However, providing users with invasive defaults would risk public outrage by the vocal few, who may affect opinions holistically. 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | ### [Related Patterns] 83 | 84 | 85 | This pattern is used by [Support Selective Disclosure](Support-Selective-Disclosure), a compound pattern. It is also complemented by [Discouraging blanket strategies](Discouraging-blanket-strategies) (avoiding all or nothing strategies for privacy settings) and [Decoupling [content] and location information visibility](Decoupling-[content]-and-location-information-visibility) (change the privacy every time content is being shared). 86 | 87 | [Negotiation of Privacy Policy](Negotiation-of-Privacy-Policy) _refines_ [Reasonable Level of Control](Reasonable-Level-of-Control), which describes methods which allow users to share their information (selectively and granularly), while this pattern targets the beginning of the service use. 88 | 89 | It is _similar to_ [Enable/Disable Functions](Enable-Disable-Functions), where users may switch between functionalities which affect their privacy. This solution is quite similar to opting in or out of those features. Specifically a similar problem is addressed from non-functional as opposed to functional perspectives. 90 | 91 | This pattern _must use_ [Lawful Consent](Lawful-Consent) in order to be implemented correctly. Additionally, due to the same need for notified and informed users in this pattern, the following patterns also complement this pattern: 92 | 93 | - [Ambient](Ambient-Notice)/[Asynchronous Notice](Asynchronous-Notice), 94 | - [Preventing mistakes or reducing their impact](Preventing-mistakes-or-reducing-their impact), 95 | - [Awareness Feed](Awareness-Feed) (and components), and 96 | - [Privacy Dashboard](Privacy-Dashboard) (and components). 97 | 98 | 99 | ### [Sources] 100 | 101 | 102 | Based on: 103 | 104 | J. Porekar, A. Jerman-Blažič, and T. Klobučar, “Towards organizational privacy patterns,” Proceedings - The 2nd International Conference on the Digital Society, ICDS 2008, 2008. 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | -------------------------------------------------------------------------------- /patterns/Obligation-management.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Obligation Management" 4 | type: pattern 5 | excerpt: "The pattern allows obligations relating to data sharing, 6 | storing and processing to be transferred and managed when the data 7 | is shared between multiple parties." 8 | categories: 9 | - enforce 10 | - uphold 11 | status: draft 12 | %} 13 | 14 | [TOC] 15 | 16 | 17 | 18 | 19 | 20 | 21 | ## Summary 22 | 23 | 24 | The pattern allows obligations relating to data sharing, storing and 25 | processing to be transferred and managed when the data is shared 26 | between multiple parties. 
27 | 28 | ## Context 29 | 30 | 31 | The developer aims to make sure that multiple parties are aware of and 32 | comply with required user/organisational policies as personal and 33 | sensitive data are successively shared between a series of parties who 34 | store or process that data. 35 | 36 | ## Problem 37 | 38 | 39 | Data may be accessed or handled by multiple parties that share data 40 | with an organisation in ways that may not be approved by the data 41 | subject. 42 | 43 | ## Solution 44 | 45 | 46 | Service providers use an obligation management system. Obligation 47 | management handles information lifecycle management based on 48 | individual preferences and organisational policies. The obligation 49 | management system manipulates data over time, ensuring data 50 | minimization, deletion and notifications to data subjects. 51 | 52 | 53 | 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | ## Consequences 63 | 64 | 65 | 66 | 67 | 68 | Benefits: privacy preferences and policies are communicated and adhered 69 | to among organisations sharing data. Liabilities: additional effort to 70 | set obligations. 71 | 72 | 73 | 74 | 75 | 76 | 77 | ## Examples 78 | 79 | 80 | A service provider subcontracts services, but requires that the data 81 | be deleted after a certain time and that it be notified if there is 82 | further subcontracting. 83 | 84 | ### [Known Uses] 85 | 86 | 87 | Pretschner et al. (2009) provide a framework for evaluating whether a 88 | supplier is meeting customer data protection obligations in 89 | distributed systems. Researchers at IBM propose Enterprise Privacy 90 | Authorization Language (EPAL) (2004) to govern data handling practices 91 | according to fine-grained access control. Casassa Mont (2004) discusses 92 | various important aspects and technical approaches to deal with 93 | privacy obligations. Pretschner, A., Schütz, F., Schaefer, C., and 94 | Walter, T.: Policy Evolution in Distributed Usage Control. Electron. 95 | Notes Theor. Comput. Sci. 244, 2009. IBM: The Enterprise Privacy 96 | Authorization Language (EPAL), EPAL specification, 97 | http://www.zurich.ibm.com/security/enterprise-privacy/epal/, 2004. 98 | Mont, M. C.: Dealing with Privacy Obligations, Important Aspects and 99 | Technical Approaches, TrustBus, 2004. 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | -------------------------------------------------------------------------------- /patterns/Obtaining-Explicit-Consent.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Obtaining Explicit Consent" 4 | type: pattern 5 | excerpt: "Controllers require consent to be given willingly and specifically when in any way processing the personal data of their users." 6 | categories: 7 | - control 8 | - consent 9 | status: pattern 10 | sim: Sign-an-Agreement-to-Solve-Lack-of-Trust-on-the-Use-of-Private-Data-Context 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | ### [Also Known As] 18 | 19 | 20 | Explicit Consent / Obtaining Explicit consent via privacy agreement / Permission to Use Sensitive Data via Privacy Agreement - Explicit Consent 21 | 22 | ## Context 23 | 24 | 25 | In order to offer services (or products) to users (data subjects), controllers often need to collect (process) user data.
Sometimes this is sensitive, identifying, or just metadata or other information which may be correlated to become more invasive. This nonetheless enables them to offer competitive features and functionality. 26 | 27 | However, controllers are required to obtain unambiguous consent from their users in order to process their personal data in any way. Depending on the legal jurisdiction, there are additional considerations to take into account based on the type of data in question. Typically, sensitive data requires especially rigorous care. 28 | 29 | ## Problem 30 | 31 | 32 | Controllers which aim to make use of user data, especially that which can be used to identify the user or sensitive aspects about the user, may not do so without a legally binding and sound acquisition of the user's consent. 33 | 34 | #### Forces/Concerns 35 | - Users want to use services without having to invest an inordinate amount of effort into discovering privacy risks. 36 | - Controllers need to be sure that users do not consent out of impatience or intimidation. 37 | - Users do not want to consent many times to the same service under the same privacy policy for each and every purpose. 38 | - Controllers need to be able to prove that users consented. 39 | 40 | ## Solution 41 | 42 | 43 | Provide a clear and concise notification of all pertinent information the service could derive provided it had all the data it asks for. Indicate what this means for features and functionality. Then ask the user whether this tradeoff is something they consent to. If they do, digitally sign and timestamp their response, or use [Contractual Consent](Sign-an-Agreement-to-Solve-Lack-of-Trust-on-the-Use-of-Private-Data-Context). 44 | 45 | 46 | 47 | 48 | 49 | 50 | ### [Implementation] 51 | 52 | 53 | The controller must ensure each user's sufficient understanding of the potential consequences. Otherwise the consent might not be informed. They must verify their users' willingness, despite those consequences, to provide their data for the specific purposes needed. If they do not, the consent might not be freely given. 54 | 55 | Ensuring that users do not consent based on time constraints, or the intimidation of the information provided, may require testing with a sample. If the sample is representative, it will give the controller a defense against any claims of coercion. 56 | 57 | The mechanism used for users to signify their consent should be clear. For example, if it is a button, it could read "I consent." 58 | 59 | ## Consequences 60 | 61 | 62 | #### Benefits 63 | Controllers can derive clearer potential consequences when the data collected is the same for every consenting user. Users therefore can look over these risks and spend less time making a valid decision. This reduces the chances of users consenting without informing themselves due to the difficult or verbose content presented. 64 | 65 | #### Liabilities 66 | Users do not necessarily want to consent to all purposes, however, since these might not all be compatible with what they feel comfortable sharing. In these cases users can be presented with a type of [Selective Disclosure](Selective-Disclosure). 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | ### [Related Patterns] 90 | 91 | 92 | This pattern may be used by [Lawful Consent](Lawful-Consent), as it is one of the patterns featured within it as a compound pattern.
92 | 93 | [Obtaining Explicit Consent](Obtaining-Explicit-Consent) is _similar to_ [Sign an Agreement to Solve Lack of Trust on the Use of Private Data Context](Sign-an-Agreement-to-Solve-Lack-of-Trust-on-the-Use-of-Private-Data-Context). This Contractual Consent focuses more on the need for non-repudiation than [Obtaining Explicit Consent](Obtaining-Explicit-Consent) does. Their perspectives are different while their overall solution is more or less the same. 94 | 95 | It is also refined by [Informed Consent for Web-based Transactions](Informed-Consent-for-Web-based-Transactions). It is a refinement of, and is used by, [Lawful Consent](Lawful-Consent). In both cases the controller aims to inform users prior to obtaining their consent, though in the other pattern's case it is applied specifically for Web-based transactions. 96 | 97 | Porekar et al. (2008) also highlight this pattern's complementing Enforce pattern relationships: 98 | 99 | - [Constructing Privacy Policy](Creating-Privacy-Policy); 100 | - [Maintaining Privacy Policy](Maintaining-Privacy-Policy); 101 | - [Privacy Policy Negotiation](Negotiation-of-Privacy-Policy); 102 | - Maintaining agreements over longer periods of time 103 | 104 | ### [Sources] 105 | 106 | 107 | Based on: 108 | 109 | J. Porekar, A. Jerman-Blažič, and T. Klobučar, “Towards organizational privacy patterns,” Proceedings - The 2nd International Conference on the Digital Society, ICDS 2008, 2008. 110 | 111 | C. Bier and E. Krempel, “Common Privacy Patterns in Video Surveillance and Smart Energy,” in ICCCT-2012, 2012, pp. 610–615. 112 | 113 | Y. Asnar et al., “Initial Set of Security and Privacy Patterns at Organizational Level,” no. December 2006, 2007. 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | -------------------------------------------------------------------------------- /patterns/Onion-routing.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Onion Routing" 4 | type: pattern 5 | excerpt: "This pattern provides unlinkability between senders and 6 | receivers by encapsulating the data in different layers of 7 | encryption, limiting the knowledge of each node along the delivery 8 | path." 9 | categories: 10 | - routing 11 | - anonymous-communication 12 | - hide 13 | - mix 14 | status: draft 15 | %} 16 | 17 | [TOC] 18 | 19 | 20 | 21 | 22 | 23 | 24 | ## Summary 25 | 26 | 27 | This pattern provides unlinkability between senders and receivers by 28 | encapsulating the data in different layers of encryption, limiting the 29 | knowledge of each node along the delivery path. 30 | 31 | ## Context 32 | 33 | 34 | A system in which data is routed between different nodes. 35 | 36 | ## Problem 37 | 38 | 39 | When delivering data, the receiver has to be known. If the system 40 | provides the functionality that the receiver of data should be able to 41 | answer, then the receiver should also know the address of the sender. 42 | When forwarding information over multiple stations, in a naive 43 | implementation, each station on the delivery path knows the sender and 44 | the final destination. 45 | 46 | ## Solution 47 | 48 | 49 | The solution is to encrypt the data in layers such that every station 50 | on the way can remove one layer of encryption and thus get to know the 51 | immediate next station. This way, every party on the path from the 52 | sender to the receiver only gets to know the immediate successor and 53 | predecessor on the delivery path.
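The following toy Python sketch illustrates the layering, using symmetric keys from the third-party `cryptography` package purely for demonstration; real onion routing such as Tor uses per-hop key agreement and fixed-size cells, so every name and format here is an illustrative assumption:

```python
from cryptography.fernet import Fernet

relays = ["alpha", "bravo", "charlie"]                 # chosen delivery path
keys = {name: Fernet(Fernet.generate_key()) for name in relays}


def build_onion(message: bytes, path: list[str]) -> bytes:
    # Encrypt innermost layer first, so the first relay peels the outermost.
    onion, next_hop = message, b"DELIVER"
    for name in reversed(path):
        onion = keys[name].encrypt(next_hop + b"|" + onion)
        next_hop = name.encode()
    return onion


def peel(name: str, onion: bytes) -> tuple[bytes, bytes]:
    # Each relay removes exactly one layer and learns only the next hop.
    next_hop, _, payload = keys[name].decrypt(onion).partition(b"|")
    return next_hop, payload


onion = build_onion(b"hello Bob", relays)
hop, onion = peel("alpha", onion)     # hop == b"bravo"; sender/receiver unknown
hop, onion = peel("bravo", onion)     # hop == b"charlie"
hop, onion = peel("charlie", onion)   # hop == b"DELIVER"; onion == b"hello Bob"
```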
54 | 55 | 56 | The goal of this pattern is to achieve unlinkability between senders 57 | and receivers. 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | 68 | 69 | ## Consequences 70 | 71 | 72 | 73 | 74 | 75 | If there are too few hops, the anonymity set is not big enough and the 76 | unlinkability between sender and receiver is at risk. The same problem 77 | occurs when there is too little communication going on in the network. 78 | The multiple layers of encryption will bloat the data and consume 79 | bandwidth. If all nodes on the delivery path collaborate in deducing 80 | the sender and the receiver, the pattern becomes useless. 81 | 82 | 83 | 84 | 85 | 86 | 87 | ## Examples 88 | 89 | 90 | Alice is a whistle-blower and tries to forward data to Bob who works for 91 | the press. She sends the corresponding documents as an 92 | e-mail attachment. Eve monitors the traffic and can see who sent this 93 | mail to whom. The next day, the police raid Alice's apartment and send 94 | her to jail. Bob's mail account gets seized. 95 | 96 | ### [Known Uses] 97 | 98 | 99 | The Tor Browser, a web browser specifically designed to ensure 100 | anonymity, makes heavy use of onion routing. 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | -------------------------------------------------------------------------------- /patterns/Outsourcing-[with-consent].md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Outsourcing [with consent]" 4 | type: pattern 5 | excerpt: "The controller has to obtain additional specific, informed, explicit, and freely given consent before outsourcing data processing to a third party." 6 | categories: 7 | - control 8 | - consent 9 | status: pattern 10 | use: Lawful-Consent 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | 18 | 19 | 20 | 21 | ## Context 22 | 23 | 24 | Controllers often do not have the means to feasibly or sufficiently process the data they oversee to the extent they desire. In these cases they seek an external processor or third party to handle the process. This typically conflicts with their already obtained consent from their users (their data subjects), as further processing by a third party is not necessarily compatible with the agreed upon purposes. In these situations the controller does not have legally obtained consent for this processing and will be liable if they carry it out. 25 | 26 | ## Problem 27 | 28 | 29 | Third party processors do not inherit user consent granted to a controller, but need each user's consent before they may process their information. The processor cannot contact the necessary users as they have no lawful access to any means to identify them. 30 | 31 | #### Forces/Concerns 32 | - Controllers wish to outsource processing when it is not feasible or viable to do so themselves. 33 | - Third party processors want to process information efficiently without needing to address other considerations. 34 | - The controller does not want to be liable, or damage their reputation. 35 | - Outsourcing _has a strong impact on the security and privacy requirements of organizations_. A contract will bind both parties. 36 | - _The outsourced third party will be obliged by all data protection principles_ to which the controller is, in addition to stricter measures imposed on processors.
37 | 38 | ## Solution 39 | 40 | 41 | Obtain additional [Lawful Consent](Lawful-Consent) for the specific purposes needed from each user before allowing the third party to process their data. Do not process the data of users who do not consent. 42 | 43 | _The consent can be seen as a contract establishing what and how data will be processed by the [third party]. The [controller] must also ensure, preferably by a written agreement, that the [third party] strictly follows all conditions relating to data processing that were imposed on [them]._ 44 | 45 | 46 | 47 | 48 | 49 | ### [Implementation] 50 | 51 | Before outsourcing data processing, it is necessary to obtain consent from the user and create an agreement between the controller and the third party. The consent itself needs to be freely given, informed, specific, and explicit. It should indicate purposes and means (physical or informational) regarding the controller and the third party. There is also an execution dependency between the controller and the user. 52 | 53 | Figure 2(b) shows an SI* model explaining the solution of Compagna et al. (2007). 54 | 55 | ## Consequences 56 | 57 | 58 | #### Benefits 59 | _The pattern solves the problem of granting [the consent] necessary to perform out-sourced data processing by assuring [users that their information is] processed according to the contract._ 60 | 61 | #### Liabilities 62 | _The [controller] may want assurance that the [third party] does not repudiate the data processing agreement and the [user] does not repudiate the consent._ As such the controller may decide to use the [Non-repudiation](Non-repudiation) pattern. 63 | 64 | 65 | 66 | 67 | 68 | 69 | ## Examples 70 | 71 | 72 | The scenario described by Compagna et al. (2007) features a Health Care Centre (data controller) and a user (data subject), Bob, who needs constant supervision. The subcontractor, a Sensor Network Provider (third party supplier), installs and maintains the network responsible for automated monitoring of Bob's health. This subcontractor needs additional specific, informed, explicit, and freely given consent from Bob. 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | ### [Related Patterns] 85 | 86 | This pattern _must use_ [Lawful Consent](Lawful-Consent) in order to be implemented. It also benefits from using the [Non-repudiation](Non-repudiation) pattern. 87 | 88 | ### [Sources] 89 | 90 | 91 | L. Compagna, P. El Khoury, F. Massacci, R. Thomas, and N. Zannone, “How to capture, model, and verify the knowledge of legal, security, and privacy experts: a pattern-based approach,” ICAIL ’07, 2007. 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | -------------------------------------------------------------------------------- /patterns/Pay-Back.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Pay Back" 4 | type: pattern 5 | excerpt: "Give users some benefits in exchange for providing information or content." 6 | categories: 7 | - control 8 | - choose 9 | status: pattern 10 | com: Reciprocity 11 | use: Lawful-Consent 12 | address: 13 | 14 | %} 15 | 16 | [TOC] 17 | 18 | 19 | 20 | 21 | 22 | 23 | ## Context 24 | 25 | 26 | In services where users may contribute content, or provide the system with account or profile information, the information is only valuable if relevant and accurate. For controllers providing this service (or product), worthless information does not typically generate income or future participation.
Without consistent usage, a service becomes less popular and eventually may run at a loss. This is particularly true in socially oriented services. To keep the service working, it is crucial that its users maintain content. Users, however, might not feel inclined to do so. Keeping content up to date, or adding it in the first place, requires effort, and in some cases an acceptance of privacy risk. 27 | 28 | ## Problem 29 | 30 | 31 | Users do not necessarily want to provide and maintain content; they need a motivation to do so. Without this, a service will not flourish. 32 | 33 | Not all users will be equally motivated, so the service may not receive contribution at the level required to keep the service competitive. Furthermore, some users might not contribute at all. _Thus, it is difficult to maintain [Reciprocity](Reciprocity)_. 34 | 35 | #### Forces/Concerns 36 | - A service that depends on information flow requires a continuous feed of user activity. 37 | - If users are not motivated they likely will not continue to contribute content. 38 | - Some users do not require much motivation, and the use of the service could be enough for them to contribute. But this alone is insufficient for most services to run. 39 | 40 | ## Solution 41 | 42 | 43 | Provide users with different kinds of benefits when they contribute or maintain content for the service and make sure they do so consensually. 44 | 45 | 46 | 47 | 48 | 49 | 50 | ### [Implementation] 51 | 52 | 53 | Depending on the kind of service that is provided, different benefits could be considered: virtual or real currency, use of services, social benefits, and so on. 54 | 55 | When using virtual or real currency, the controller should first define how much value users would receive for their contributions. In the case of virtual currency, the places where the currency could be used should be defined. 56 | 57 | Regarding use of the service, some criteria could be considered (non-exhaustively): feedback on content, frequency of contributions, the use of service for a minimum duration, access to a service earlier than others, or use of special features within the service. 58 | 59 | When users reach a threshold, they could additionally attend virtual or physical events for learning, meeting people, etc. In virtual scenarios, users could receive attention (feedback) from one another. 60 | 61 | ## Consequences 62 | 63 | 64 | #### Benefits 65 | - More users will be motivated to provide information, so the service could continue to be competitive. 66 | - It could help to maintain [Reciprocity](Reciprocity). 67 | 68 | #### Liabilities 69 | - It could be necessary to monitor the quality of the contributions before giving the user benefits. 70 | - Consent will not be genuine if users are coerced into providing their personal data. An example of this is the sunk cost fallacy. As the user builds emotional investment, the controller has more power over them. A service which was once available freely, or anonymously, can push users into accepting terms they do not truly consent to. When using this pattern it is important to make sure that [Lawful Consent](Lawful-Consent) is also used. 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | * YouTube financial retribution. 82 | * Dropbox programs for increasing storage. 83 | * Local guides for Google Maps. 84 | * Likes, comments, and followers on Facebook and Instagram.
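A minimal sketch of such a reward step, in Python with an illustrative reward table, might look like this; the quality check, consent flag, and credit amounts are assumptions, not prescriptions:

```python
CREDIT = {"photo": 5, "review": 10, "correction": 2}   # illustrative rewards


def reward(user, kind: str, passed_quality_check: bool) -> int:
    # Reward only consensual, quality contributions; earned benefits must
    # never become leverage that pressures users into sharing more data.
    if not user.has_consented or not passed_quality_check:
        return 0
    user.credits += CREDIT[kind]
    return CREDIT[kind]
```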
85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | ### [Related Patterns] 98 | 99 | 100 | [Pay Back](Pay-Back) _complements_ [Reciprocity](Reciprocity). These two patterns may work together to incentivize users to provide data. It may be used by [Incentivized Participation](Incentivized-Participation), which builds on this relationship. 101 | 102 | Additionally, as [Pay Back](Pay-Back) encourages users to provide additional data, it must be sure to obtain [Lawful Consent](Lawful-Consent). This is a _must use_ relationship. 103 | 104 | ### [Sources] 105 | 106 | 107 | Based on: 108 | 109 | T. Schümmer, “The Public Privacy – Patterns for Filtering Personal Information in Collaborative Systems,” in Proceedings of CHI workshop on Human-Computer-Human-Interaction Patterns, 2004. 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | -------------------------------------------------------------------------------- /patterns/Personal-data-store.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Personal Data Store" 4 | type: pattern 5 | excerpt: "Subjects keep control over their personal data that are 6 | stored on a personal device." 7 | categories: 8 | - control 9 | - separate 10 | - isolate 11 | status: draft 12 | %} 13 | 14 | [TOC] 15 | 16 | 17 | 18 | 19 | 20 | 21 | ## Summary 22 | 23 | 24 | Subjects keep control over their personal data that are stored on a 25 | personal device. 26 | 27 | ## Context 28 | 29 | 30 | The pattern is applicable to any data produced by the data subject (or 31 | originally under their control) as opposed to data about them produced by 32 | third parties. 33 | 34 | ## Problem 35 | 36 | 37 | Data subjects actually lose control over their data when those data are 38 | stored on a server operated by a third party. 39 | 40 | ## Solution 41 | 42 | 43 | A solution consists of combining a central server and secure personal 44 | tokens. Personal tokens, which can take the form of USB keys, embed a 45 | database system, a local web server and a certificate for their 46 | authentication by the central server. Data subjects can decide on the 47 | status of their data and, depending on their level of sensitivity, 48 | choose to record them exclusively on their personal token or to have 49 | them replicated on the central server. Replication on the central 50 | server is useful to enhance sustainability and to allow designated 51 | third parties (e.g. health professionals) to get access to the data. 52 | 53 | 54 | Enhance the control of subjects over their personal data. 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | ## Consequences 67 | 68 | 69 | 70 | 71 | 72 | Data subjects need to be equipped with a personal data store. 73 | 74 | 75 | 76 | 77 | 78 | 79 | ## Examples 80 | 81 | 82 | Patients want to keep control over their health data but also to grant 83 | specific access to some health professionals. 84 | 85 | ### [Known Uses] 86 | 87 | 88 | It has even been deployed for certain types of services, in 89 | particular, in the health sector.
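A minimal Python sketch of the sensitivity-based replication decision described in the Solution above; the storage interfaces and sensitivity levels are hypothetical:

```python
from enum import Enum, auto


class Sensitivity(Enum):
    HIGH = auto()      # recorded exclusively on the personal token
    NORMAL = auto()    # may also be replicated to the central server


def store(record: bytes, sensitivity: Sensitivity, token_db, central_db) -> None:
    token_db.save(record)                  # the subject always keeps the original
    if sensitivity is Sensitivity.NORMAL:
        central_db.save(record)            # replication is the subject's decision
```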
90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | -------------------------------------------------------------------------------- /patterns/Preventing-Mistakes-or-Reducing-Their-Impact.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Preventing mistakes or reducing their impact" 4 | type: pattern 5 | excerpt: "Prevent accidental automatic disclosure of personal information." 6 | categories: 7 | - inform 8 | - notify 9 | com: Impactful-Information-and-Feedback 10 | status: pattern 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | 18 | 19 | 20 | 21 | 22 | ## Context 23 | 24 | 25 | 26 | Numerous services (or products) are designed for sharing content with the public or with a specific subset of users. In content sharing implementations, it is commonplace to streamline disclosure so that users do not need to publish manually. Content they generate is often automatically shared with the controller, even if not immediately made available to other users or the public. This of course requires the prior consent of users, though it is also possible for users to forget about that consent, or change their mind. If that choice lies in a simple setting, it may not be apparent to the user that it is still in effect. 27 | 28 | ## Problem 29 | 30 | 31 | Immediate and automatic content publication without notification or confirmation of consent leads to unintentional disclosure and may invalidate prior consent. 32 | 33 | #### Forces and Concerns 34 | 35 | 36 | 37 | - Users of the service want to share content with others, but not all of the content they generate is fit for sharing 38 | - Most users do not want to manually upload content case by case, sometimes long after creation 39 | - Controllers want to make it easy for users to contribute content 40 | - Controllers do not want users to disclose content which they later regret, as this potentially ruins the user's experience 41 | 42 | ## Solution 43 | 44 | 45 | Use contextual measures to predict whether content should be processed, re-establishing consent where necessary, to prevent accidental disclosure. 46 | 47 | 48 | 49 | 50 | 51 | 52 | ### [Implementation] 53 | 54 | 55 | _Through the study of patterns in disclosure behavior, systems may be able to helpfully warn users when disclosing following potentially significant change in context, perhaps reducing potential for mistakes. [These] privacy decisions are often correlated with the context of capture and the [content] as indicated [by the user. It] could be feasible to use these patterns for prediction or recommendation of privacy settings. In addition, providing an optional “staging area” before disclosure actually takes place and an easy way to review recent disclosures may reduce the immediate consequences of quickly regretted or accidental disclosure decisions._ 56 | 57 | ## Consequences 58 | 59 | 60 | Clearing up mistakenly shared data adds additional overhead, especially if the service does not offer simple modification or removal of information. As sharing more than actually intended may result in potential damage for users, they will benefit from services which reduce these risks.
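The optional "staging area" mentioned in the Implementation could be sketched as follows; the `context_changed` predictor and the `publish` step are illustrative assumptions rather than part of the pattern's source:

```python
from datetime import datetime, timedelta

REVIEW_WINDOW = timedelta(minutes=30)

def publish(item) -> None:
    """Stand-in for the service's actual disclosure step."""
    ...

class StagingArea:
    """Hold auto-captured items briefly so users can review them before disclosure."""

    def __init__(self, context_changed):
        self.context_changed = context_changed  # predicate: significant context change?
        self.pending = []  # (item, capture_time) pairs awaiting review or timeout

    def submit(self, item, captured_at: datetime) -> None:
        if self.context_changed(item):
            # Warn and hold instead of publishing silently after a
            # potentially significant change in context.
            self.pending.append((item, captured_at))
        else:
            publish(item)

    def flush(self, now: datetime) -> None:
        for entry in list(self.pending):
            item, captured_at = entry
            if now - captured_at >= REVIEW_WINDOW:
                publish(item)  # the user did not object within the review window
                self.pending.remove(entry)
```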
61 | 62 | 63 | 64 | 65 | 66 | 67 | ## Examples 68 | 69 | 70 | _Through the study of [trends] in disclosure behavior, systems may be able to helpfully warn users when disclosing following potentially significant change in context, perhaps reducing potential for mistakes. As [Ahern et al.] found that privacy decisions are often correlated with the context of capture and the content of the photo as indicated by user-specified tags, it could be feasible to use these patterns for prediction or recommendation of privacy settings. In addition, providing an optional “staging area” before disclosure actually takes place and an easy way to review recent disclosures may reduce the immediate consequences of quickly regretted or accidental disclosure decisions._ 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | ### [Related Patterns] 83 | 84 | 85 | This pattern _complements_ [Impactful Information and Feedback](Impactful-Information-and-Feedback), and [[Informed] Credential Selection]([Informed]-Credential-Selection). As such the patterns which refine it do so as well where their context permits. For [Impactful Information and Feedback](Impactful-Information-and-Feedback) this is the case. It provides feedback about disclosure under certain privacy settings before it takes place, and can be notified of, and reviewed, before causing an impact. [[Informed] Credential Selection]([Informed]-Credential-Selection), however, can only reasonably _complement_ [Asynchronous Notice](Asynchronous-notice), as it deals with instant authentication. Modal notification of the kind this pattern might provide can inform users in a timely manner. 86 | 87 | It is also _refined_ by both [Ambient Notice](Ambient-notice) and [Asynchronous Notice](Asynchronous-notice). These variants of notice, which are themselves _alternatives_, both address essentially the same problem, for which they give more specific contexts and solutions. The former is unobtrusive and persistently informative, while the latter is unavoidably informative when context demands. 88 | 89 | ### [Sources] 90 | 91 | 92 | S. Ahern, D. Eckles, N. Good, S. King, M. Naaman, and R. Nair, “Over-Exposed? Privacy Patterns and Considerations in Online and Mobile Photo Sharing,” CHI ’07, pp. 357–366, 2007. 93 | 94 | C. Bier and E. Krempel, “Common Privacy Patterns in Video Surveillance and Smart Energy,” in ICCCT-2012, 2012, pp. 610–615. 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | -------------------------------------------------------------------------------- /patterns/Privacy-Policy-Display.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Privacy Policy Display" 4 | type: pattern 5 | excerpt: "The goal of this display is to inform the user about what information is requested, by whom, and for what purpose. It should be used whenever personal data is required from the user." 6 | categories: 7 | - inform 8 | - provide 9 | com: Layered-policy-design Privacy-Aware-Wording 10 | status: pattern 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | 18 | 19 | 20 | 21 | 22 | ## Context 23 | 24 | 25 | 26 | Privacy policies are an important element in the processing activities of a controller. They not only relay to data subjects, the users, crucial aspects about the processing in question, but also adhere to the laws which mandate those policies. Balancing the accessibility of these policies with the legal comprehensiveness they require, however, is nontrivial.
As such, users do not naturally familiarize themselves with privacy policies, which need to be verbose, and often complex, to comply with the law. It is therefore necessary that controllers ensure that users are indeed informed before soliciting their consent. 27 | 28 | ## Problem 29 | 30 | 31 | Whenever the user's information is requested, it must be clear to them exactly what information is needed, who requests it, and what will be done with it. 32 | 33 | #### Forces and Concerns 34 | 35 | 36 | 37 | - Users do not want to read extensive policies, but they do want to understand any relevant risks 38 | - Controllers need users to understand specific policy elements in order to legally process their data 39 | - Users would rather be provided with relevant and ideally concise information than all of it at once 40 | - Controllers want users to trust that they are not trying to hide the risks of using the system 41 | 42 | ## Solution 43 | 44 | 45 | As requests for personal data are made, state clearly what information is needed by whom, for which purposes, and by what means, prior to soliciting consent. 46 | 47 | 48 | 49 | 50 | 51 | 52 | ### [Implementation] 53 | 54 | 55 | The Article 29 Working Party of the Data Protection Directive of the European Union has set out recommendations regarding the distribution of policy into a layered format. It suggests three tiers, each providing additional detail. Users should have clearly visible access to successive levels of detail whenever the controller requests the related personal data. 56 | 57 | The first tier, 'short notice', shall offer core information necessary for users to understand the purposes and means of processing. It should provide a clear mechanism to obtain further detail. This tier is aimed towards maximum user understanding. 58 | 59 | The second tier, 'condensed notice', includes a summary of pertinent information as required by Article 13 of the General Data Protection Regulation (GDPR), the successor to the Directive. This non-exhaustively includes additional information regarding contact details of applicable entities, legal basis or obligation, legitimate interests, recipients, retention, data subject rights, and whether automated decision making is in use. 60 | 61 | The third tier, 'full notice', presents all remaining information required by the GDPR in addition to the previous information. This is the variation which expresses the full detail of the policy and thus best satisfies the legislative requirements. 62 | 63 | ## Consequences 64 | 65 | 66 | Constantly reminding users what it really means to share their information helps them better consider the personal data they choose to provide. However, users may also become fatigued or otherwise desensitized by frequent reminders and begin to overlook privacy policies. As such it is important to balance the levels of visibility and implicit severities of the information conveyed. 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | ### [Related Patterns] 89 | 90 | 91 | This pattern _may be used_ by [Dynamic Privacy Policy Display](Dynamic-Privacy-Policy-Display) and [Policy Matching Display](Policy-matching-display). The first of these uses it to add standardization and 'tooltip' functionality, while the second adds the display of mismatches between user preferences and the policy.
92 | 93 | This pattern _complements_ [Privacy Aware Wording](Privacy-Aware-Wording), [Layered Policy Design](Layered-policy-design), and [Platform for Privacy Preferences](Platform-for-Privacy-Preferences). 94 | 95 | The complementary relationship with [Platform for Privacy Preferences](Platform-for-Privacy-Preferences), however, is implicit. It is made so by the _use_ of this pattern by [Dynamic Privacy Policy Display](Dynamic-Privacy-Policy-Display), which has a _complements_ relationship with [Platform for Privacy Preferences](Platform-for-Privacy-Preferences). 96 | 97 | The complementary relationships with [Privacy Aware Wording](Privacy-Aware-Wording) and [Layered Policy Design](Layered-policy-design) stem from their role in making policies accessible. As those patterns seek to make privacy policies more easily interpreted by users, they are a natural aid to the display of information requests and explanations within this pattern. 98 | 99 | 100 | ### [Sources] 101 | 102 | 103 | S. Fischer-Hübner, C. Köffel, J.-S. Pettersson, P. Wolkerstorfer, C. Graf, L. E. Holtz, U. König, H. Hedbom, and B. Kellermann, “HCI Pattern Collection - Version 2,” 2010. 104 | 105 | C. Graf, P. Wolkerstorfer, A. Geven, and M. Tscheligi, “A Pattern Collection for Privacy Enhancing Technology,” The Second International Conferences of Pervasive Patterns and Applications (Patterns 2010), vol. 2, no. 1, pp. 72–77, 2010. 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | -------------------------------------------------------------------------------- /patterns/Private-link.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Private link" 4 | type: pattern 5 | excerpt: "Enable sharing and re-sharing without wide public visibility or cumbersome authenticated access control." 6 | categories: 7 | - distribution 8 | - media 9 | - access 10 | - control 11 | - choose 12 | status: pattern 13 | use: Lawful-Consent 14 | com: Reasonable-Level-of-Control Decoupling-[content]-and-location-information-visibility Selective-Access-Control Active-broadcast-of-presence Masquerade 15 | address: 16 | 17 | %} 18 | 19 | [TOC] 20 | 21 | 22 | 23 | 24 | 25 | 26 | ## Context 27 | 28 | 29 | The controller provides a service which hosts resources, potentially constituting personal data. When users want to share (and enable re-sharing of) these resources, they may wish to do so privately using existing communication mechanisms. This is particularly relevant when users are sharing with contacts who would rather not, or cannot, simply authenticate. 30 | 31 | ## Problem 32 | 33 | 34 | Users want to share a private resource with unauthenticated users in a way that respects the sensitivity of that resource. 35 | The solution must not allow users to access resources that weren't intended to be shared, nor publicize the location of the intended resource to unintended recipients. 36 | 37 | #### Forces/Concerns 38 | - The controller should keep the links confidential in order to honor the user's privacy expectations. 39 | - The link should not be guessable (e.g. by convention or brute force) to prevent unintended recipients from accessing unlisted links. 40 | - The user should be able to limit access to a resource by version or time restriction. 41 | - The recipient should be able to forward the link to another trusted recipient, so long as the link is valid. 42 | - The recipient should be able to access the link again at a later date, unless the resource is version or time restricted.
43 | 44 | Note that the URL will be retained in recipients' browser history and could easily be inadvertently shared with others. Services should help users understand these limitations. 45 | 46 | ## Solution 47 | 48 | 49 | Provide the user a _private link_ or _unguessable URL_ for a particular resource, such as a set of their personal information (e.g. their current location, an album of photos). Anyone who has the link may access the information, but the link is not posted publicly or guessable by an unintended recipient. The user can share the private link with friends, family or other trusted contacts who can in turn forward the link to others who will be able to access it, without any account authentication or access control lists. 50 | 51 | Services may allow users to revoke existing private links or change the URL to effectively re-set who can access the resource. Additionally, users may set a time-limit for the resource's validity, or have it invalidated upon modification. 52 | 53 | 54 | 55 | 56 | 57 | 58 | ### [Implementation] 59 | 60 | 61 | The controller allows their users' online resources to be shared by publishing an unlisted URL with a complex, long, and randomly generated string. This can be part of a query string as opposed to an on-disk location. In this case, the preprocessor intercepts the query and redirects the user to the correct resource. This may be an actual file on disk (probably not served by direct link), generated on the fly, or extracted from a database or compressed file. The preprocessor can verify validity dynamically before serving the resource. 62 | 63 | The situation in which the user has a direct link to the resource location is not ideal, however, as it will need to change in the event of a time or version restriction since access to the file is not controlled by the preprocessor. 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | ## Examples 76 | 77 | 78 | 1. Flickr "Guest Pass" 79 | 2. Google "anyone with the link" sharing 80 | 3. Tripit "Get a link" 81 | 4. Dropbox "Share Link" 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | ### [Related Patterns] 94 | 95 | 96 | [Private link](Private-link) may be complemented by [Active broadcast of presence](Active-broadcast-of-presence), which deals with sharing with all people, while this pattern considers a specific audience. They could together broadcast a more public kind of link. [Private link](Private-link) may also be used by [Support Selective Disclosure](Support-Selective-Disclosure) as a means to provide anonymous access in a resource sharing context. 97 | 98 | This pattern _complements_ a number of patterns, namely [Masquerade](Masquerade), [Decoupling content and location information visibility](Decoupling-[content]-and-location-information-visibility), [Selective Access Control](Selective-Access-Control), and [Reasonable Level of Control](Reasonable-Level-of-Control). 99 | 100 | [Private link](Private-link) could facilitate both [Masquerade](Masquerade)'s selective identifiability and the provision of decoupled location information to specific individuals, after the fact, and while being open to modification. This private sharing with anonymous users may be achieved along with [Selective Access Control](Selective-Access-Control)'s definition of audience for a contribution. [Reasonable Level of Control](Reasonable-Level-of-Control) aims to define the granularity of shared content and its specific audience.
These can work together for similar reasons, as [Private link](Private-link) again is able to facilitate. In each case it could provide a more complete sharing (or withholding) solution. 101 | 102 | Finally, [Private link](Private-link) _must use_ [Lawful Consent](Lawful-Consent), as the generation of even unpublished links constitutes processing. 103 | 104 | ### [Sources] 105 | 106 | 107 | Based on: 108 | 109 | Doty, N., Gupta, M., & Zych, J. (n.d.). PrivacyPatterns.org. Retrieved February 26, 2015, from http://privacypatterns.org/ 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | -------------------------------------------------------------------------------- /patterns/Protection-against-tracking.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Protection against Tracking" 4 | type: pattern 5 | excerpt: "This pattern avoids the tracking of visitors of websites 6 | via cookies. It does this by deleting them at regular intervals or 7 | by disabling cookies completely." 8 | categories: 9 | - anonymity 10 | - unlinkability 11 | - tracking 12 | - cookies 13 | - minimize 14 | - exclude 15 | status: draft 16 | %} 17 | 18 | [TOC] 19 | 20 | 21 | 22 | 23 | 24 | 25 | ## Summary 26 | 27 | 28 | This pattern avoids the tracking of visitors of websites via cookies. 29 | It does this by deleting them at regular intervals or by disabling 30 | cookies completely. 31 | 32 | ## Context 33 | 34 | 35 | This pattern is applicable when personally identifiable information is 36 | tracked through software tools, protocols or mechanisms such as 37 | cookies and the like. 38 | 39 | ## Problem 40 | 41 | 42 | With every single interaction on the web you leave footprints and clues 43 | about yourself. Cookies, for example, enable web servers to gather 44 | information about web users, which affects their privacy and 45 | anonymity. Web service providers trace user behavior, which can lead 46 | to user profiling. Providers can also sell the gathered data about 47 | users visiting their pages to other companies. 48 | 49 | ## Solution 50 | 51 | 52 | Restrict the usage of cookies on the client side by deleting cookies on 53 | a regular basis (e.g. at every start-up of the operating system), or 54 | enable them case by case: decide whether the visited website is 55 | trustworthy and accept a cookie only for the current 56 | session. At the highest level of privacy protection cookies are 57 | disabled, but as a consequence web services are restricted. Another 58 | solution could be that cookies are exchanged between clients, so that 59 | the resulting user profiles no longer correspond to a single person. 60 | 61 | 62 | Prevent a website from tracking any of the user's 63 | personally identifiable information. 64 | 65 | 66 | 67 | 68 | 69 | 70 | 71 | 72 | 73 | 74 | 75 | ## Consequences 76 | 77 | 78 | 79 | 80 | 81 | With cookies disabled there is no access to sites that require enabled 82 | cookies for logging in. Other tracking mechanisms for user 83 | fingerprinting may still work even when cookies are disabled. 84 | 85 | 86 | 87 | 88 | 89 | 90 | ## Examples 91 | 92 | 93 | Alice wants to buy shoes and she wants to shop online. She heads to an 94 | online shop and searches for shoes but can’t decide which ones she 95 | wants, so she buys none of them. The next day she finds a couple of 96 | emails in her inbox, giving her suggestions for other shoes and 97 | alerting her that the viewed shoes are now on sale.
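A client-side sketch of the case-by-case and session-only approaches from the Solution, using Python's standard `http.cookiejar`; the domain names are placeholders:

```python
import http.cookiejar
import urllib.request

# Case-by-case decision: accept cookies only from explicitly trusted sites.
policy = http.cookiejar.DefaultCookiePolicy(
    allowed_domains=["trusted-shop.example"]
)
jar = http.cookiejar.CookieJar(policy=policy)
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

opener.open("https://trusted-shop.example/")  # cookies accepted
opener.open("https://ad-network.example/")    # cookies silently rejected

# Session-only storage: discarding the jar at shutdown is the programmatic
# equivalent of deleting cookies at every start-up.
jar.clear()
```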
98 | 99 | ### [Known Uses] 100 | 101 | 102 | Junkbuster is an old proxy filtering between web server and browser to 103 | block ads and cookies, but it is no longer maintained. A program named 104 | CookieCooker (http://www.cookiecooker.de/) protects against the 105 | monitoring of user behaviour and interests by exchanging cookies with 106 | other users or using a random selection of identities. Unfortunately 107 | this project also seems to be no longer maintained. There is also 108 | the Firefox Add-on Self-Destructing Cookies which deletes cookies of 109 | tabs as soon as they are closed. 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | -------------------------------------------------------------------------------- /patterns/Pseudonymous-identity.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Pseudonymous Identity" 4 | type: pattern 5 | excerpt: "Hide the identity by using a pseudonym and ensure a 6 | pseudonymous identity that cannot be linked with a real identity 7 | during online interactions." 8 | categories: 9 | - anonymity 10 | - pseudonymity 11 | - hide 12 | - dissociate 13 | status: draft 14 | %} 15 | 16 | [TOC] 17 | 18 | 19 | 20 | 21 | 22 | 23 | ## Summary 24 | 25 | 26 | Hide the identity by using a pseudonym and ensure a pseudonymous 27 | identity that cannot be linked with a real identity during online 28 | interactions. 29 | 30 | ## Context 31 | 32 | 33 | This pattern can be used for systems in which users are identified by 34 | public identities. 35 | 36 | ## Problem 37 | 38 | 39 | Many kinds of sensitive information are released through web 40 | interactions, email, data sharing or location-based systems; this 41 | information can contain a user's name or packet header data. Another 42 | goal could be to interact anonymously in a forum. However, extensive 43 | interaction in a forum under one anonymous identity can be dangerous in 44 | the sense that the relation between the original identity and the 45 | pseudonymous identity can be exposed. 46 | 47 | ## Solution 48 | 49 | 50 | Initiate a random pseudonym that cannot be related to the original identity, 51 | so that the identity is hidden. Furthermore, a pseudonym depends on 52 | concealment, so the pseudonym allocation needs protection. 53 | 54 | 55 | Hide the identity of the participants. 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | ## Consequences 68 | 69 | 70 | 71 | 72 | 73 | The real identity of a user is hidden. In certain scenarios there is a 74 | need for additional space to store the pseudonym-identity mapping. 75 | Extensive usage of the same pseudonym can weaken it. 76 | 77 | 78 | 79 | 80 | 81 | 82 | ## Examples 83 | 84 | 85 | Suppose some students are taking an exam and they have to fill out a 86 | form about their identity, where there is an optional field for a 87 | chosen pseudonym. This way the results can be released under the chosen 88 | pseudonyms and the identity of each student is hidden. But by being 89 | observant, some students might be able to figure out which identity 90 | belongs to which pseudonym, and so the confidentiality of the identity 91 | is compromised. 92 | 93 | ### [Known Uses] 94 | 95 | 96 | Anonymizers are well-known tools for anonymous web interactions.
They 97 | work, for example, by using a proxy between a request sender and a 98 | recipient to strip header fields like HTTP_USER_AGENT from packet 99 | headers, because such fields contain metadata about the sender. 100 | Mixmaster is an anonymous remailer that hides sender and recipient 101 | identities by stripping names and assigning pseudonyms. Some data 102 | sharing systems with a privacy-preserving focus make use of pseudonyms 103 | so that identifying information such as names and social security 104 | numbers is hidden. For example, various electronic healthcare systems 105 | are using pseudonyms for the storage of e-health records. 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | -------------------------------------------------------------------------------- /patterns/Pseudonymous-messaging.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Pseudonymous Messaging" 4 | type: pattern 5 | excerpt: "A messaging service is enhanced by using a trusted third 6 | party to exchange the identifiers of the communication partners for 7 | pseudonyms." 8 | categories: 9 | - messaging 10 | - email 11 | - pseudonymity 12 | - hide 13 | - dissociate 14 | status: draft 15 | %} 16 | 17 | [TOC] 18 | 19 | 20 | 21 | 22 | 23 | 24 | ## Summary 25 | 26 | 27 | A messaging service is enhanced by using a trusted third party to 28 | exchange the identifiers of the communication partners for pseudonyms. 29 | 30 | ## Context 31 | 32 | 33 | This pattern can be used for online communications by email, through 34 | message boards, and newsgroups. 35 | 36 | ## Problem 37 | 38 | 39 | Messaging includes all forms of communication through emails, 40 | articles, message boards, newsgroups etc. This information could be 41 | stored and used to build sophisticated user profiles. Sometimes it can 42 | also be used to prosecute people. 43 | 44 | ## Solution 45 | 46 | 47 | A message is sent by a user to the server, which replaces the 48 | sender's address with a pseudonym. Replies are sent back to 49 | the pseudonymous address, which will then be swapped back to the 50 | original. 51 | 52 | 53 | The goal of this pattern is to prevent unforeseen ramifications of the 54 | use of online messaging services. 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | ## Consequences 67 | 68 | 69 | 70 | 71 | 72 | Users can communicate more freely. Pseudonym servers can be misused to 73 | send offensive messages, for spam mails or by criminals for illegal 74 | activities. Under those circumstances it could be necessary to revoke 75 | the pseudonymity of the corresponding parties. 76 | 77 | 78 | 79 | 80 | 81 | 82 | ## Examples 83 | 84 | 85 | Alice is a political activist and tries to organize a political 86 | demonstration. Since her government does not like free speech, her 87 | communication channels are intensely monitored and one day, she simply 88 | disappears into a labor camp and is never seen again. 89 | 90 | ### [Known Uses] 91 | 92 | 93 | nym.alias.net is a pseudonymous email system with the goal of providing 94 | secure concealment of the user's identity. A Type I Anonymous Remailer 95 | forwards emails by modifying the message header and removing 96 | sender-related information.
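The address-swapping step of the Solution can be sketched as below; the in-memory mappings and the `remailer.example` domain are illustrative assumptions, and a real remailer would have to guard the pseudonym-identity mapping very carefully:

```python
import secrets

pseudonym_to_address = {}  # alias -> real address (highly sensitive mapping)
address_to_pseudonym = {}  # real address -> alias

def pseudonymize(sender: str) -> str:
    """Return a stable pseudonymous address for a real sender address."""
    if sender not in address_to_pseudonym:
        alias = "anon-" + secrets.token_hex(8) + "@remailer.example"
        address_to_pseudonym[sender] = alias
        pseudonym_to_address[alias] = sender
    return address_to_pseudonym[sender]

def forward(message: dict) -> dict:
    """Outgoing mail: replace the real sender with their pseudonym."""
    return {**message, "From": pseudonymize(message["From"])}

def deliver_reply(message: dict) -> dict:
    """Incoming reply: swap the pseudonymous recipient back to the original."""
    return {**message, "To": pseudonym_to_address[message["To"]]}
```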
97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | -------------------------------------------------------------------------------- /patterns/Selective-Access-Control.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Selective access control" 4 | type: pattern 5 | excerpt: "Allow users to specify who may access the content they generate, both during and after submission." 6 | categories: 7 | - control 8 | - choose 9 | status: pattern 10 | use: Lawful-Consent 11 | ref: Reasonable-Level-of-Control 12 | com: Decoupling-[content]-and-location-information-visibility Discouraging-blanket-strategies Private-link Masquerade 13 | address: 14 | 15 | %} 16 | 17 | [TOC] 18 | 19 | ### [Also Known As] 20 | 21 | 22 | Selective Access Control in Forum Software; 23 | Privacy Options in Social Networks 24 | 25 | 26 | ## Context 27 | 28 | 29 | Users enjoy social reaction when posting content in socially oriented services on the Internet. Sometimes, though, the reactions are not ideal. Some content is inappropriate for some audiences, and some users would rather keep some content mostly private. While users are capable of sharing content privately, perhaps through [Private Link](Private-link), they may wish to have better control over whom they share with in their service of choice. The controller providing this service may also want its users to share more specifically. 30 | 31 | ## Problem 32 | 33 | 34 | Users want to control the visibility of the content being shared, because it may not currently be appropriate for all users. 35 | 36 | #### Forces/Concerns 37 | - Users share content aimed at different kinds of users because their contacts have varying social proximities (friends, family, colleagues, etc.). 38 | - Users want to share content with certain users based specifically on its relevance to them. 39 | - Users could over-share, expose content to the wrong audience, or under-share as a precaution. 40 | 41 | ## Solution 42 | 43 | 44 | _Provide users with the option to define the audience of their contributions by specifying the access rules to their [content]._ 45 | 46 | 47 | 48 | 49 | 50 | ### [Implementation] 51 | 52 | 53 | Implement visual controls to help users define access control rules when they create or modify content. 54 | 55 | These rules could be defined based on users, groups of users, or based on context-aiding attributes like age or location. For groups, it should be possible to directly define who may view the post being published (e.g. a post with personal data aimed only at a group of close friends). Contextually, it may be possible to define an attribute constraint based on whom in general the post is intended for (e.g. a post aimed at people in a specific town or region). 56 | 57 | ## Consequences 58 | 59 | 60 | #### Benefits 61 | - Users can control access as they wish for every submission, with configuration based on kinds of users or the content's context. 62 | 63 | #### Liabilities 64 | - Users could find granular configurations time consuming or tedious, so a default configuration for new submissions would be helpful.
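One possible shape for the rules described in the Implementation; the fields and group names are illustrative assumptions, not part of the pattern's source:

```python
from dataclasses import dataclass, field
from typing import Optional, Set

@dataclass
class AccessRule:
    """Audience definition for one post: explicit users, groups, or attributes."""
    allowed_users: Set[str] = field(default_factory=set)
    allowed_groups: Set[str] = field(default_factory=set)
    required_region: Optional[str] = None  # contextual attribute constraint

def may_view(rule: AccessRule, viewer_id: str,
             viewer_groups: Set[str], viewer_region: str) -> bool:
    if viewer_id in rule.allowed_users:
        return True
    if rule.allowed_groups & viewer_groups:
        return True
    if rule.required_region is not None:
        return viewer_region == rule.required_region
    return False

# A post with personal data aimed only at a group of close friends:
rule = AccessRule(allowed_groups={"close-friends"})
assert may_view(rule, "bob", {"close-friends"}, "Vienna")
assert not may_view(rule, "eve", {"colleagues"}, "Vienna")
```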
65 | 66 | 67 | 68 | 69 | 70 | 71 | ## Examples 72 | 73 | 74 | - Facebook 75 | - YouTube 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | ### [Related Patterns] 88 | 89 | 90 | [Selective Access Control](Selective-Access-Control) 91 | is complemented by [Private link](Private-link), which focuses on private sharing with anonymous users while this pattern defines the audience for a contribution. It is a part of the [Support Selective Disclosure](Support-Selective-Disclosure) compound pattern, and thus may be used by it. 92 | 93 | This pattern _refines_ [Reasonable Level of Control](Reasonable-Level-of-Control) in a socially oriented service context. It _complements_ both [Discouraging Blanket Strategies](Discouraging-blanket-strategies) (more flexible privacy setting management) and [Decoupling [content] and location information visibility](Decoupling-[content]-and-location-information-visibility) (selectively providing location access). 94 | 95 | Although this pattern focuses on choices within access control, the decisions users make should still be informed and explicit, with the consent involved uncoerced. Therefore, this pattern also _must use_ [Lawful Consent](Lawful-Consent). 96 | 97 | ### [Sources] 98 | 99 | 100 | Based on: 101 | 102 | Drozd, O. (n.d.). Privacy Patterns Catalog. Retrieved January 25, 2017, from http://privacypatterns.wu.ac.at:8080/catalog/ 103 | 104 | Fischer-Hübner, S., Köffel, C., Pettersson, J.-S., Wolkerstorfer, P., Graf, C., Holtz, L. E., … Kellermann, B. (2010). HCI Pattern Collection – Version 2. 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | 116 | 117 | 118 | -------------------------------------------------------------------------------- /patterns/Sign-an-Agreement-to-Solve-Lack-of-Trust-on-the-Use-of-Private-Data-Context.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Sign an Agreement to Solve Lack of Trust on the Use of Private Data Context" 4 | type: pattern 5 | excerpt: "Services of a controller may require users to sign contracts that stipulate their obligations and processing purposes for which users must consent to use the service. This ensures that users can trust the controller as it is bound to the contract it signs." 6 | categories: 7 | - control 8 | - consent 9 | status: pattern 10 | sim: Obtaining-Explicit-Consent 11 | address: 12 | 13 | %} 14 | 15 | [TOC] 16 | 17 | ### [Also Known As] 18 | 19 | 20 | Suggested: Contractual Consent 21 | 22 | ## Context 23 | 24 | 25 | Users do not inherently trust controllers who provide services (or products), as they do not have assurances as to what the controller's or their processor's true intentions are. Controllers and processors typically aim to make profit, but this might be at the expense of users if those users' privacy needs are not considered. The controller might have reasonable defaults or [levels of control](Reasonable-Level-of-Control), but users also need to feel reassured that their choices are being honored. This is especially true of what they do or do not provide [Lawful Consent](Lawful-Consent) for. 26 | 27 | ## Problem 28 | 29 | 30 | The controller does not necessarily have the trust of its users, and needs this trust for its services to process their data. 31 | 32 | #### Forces/Concerns 33 | - The controller wishes to provide services to the user, but needs their trust and consent to do so.
34 | - Processors want to manipulate data without having to worry about whether the data contains consented information or not. 35 | - Users want to use services, but not at the risk of their own personal privacy requirements being undermined. 36 | - Users want to know what they can safely provide to the controller and what information might be revealed about them if they use the service. 37 | - Users need to feel that the controller will honor any decision taken about their personal data. 38 | 39 | ## Solution 40 | 41 | 42 | The service should provide the user with a contractual agreement (featuring a privacy policy) which binds the controller to its word, provided that the user consents to the processing of data needed for specific purposes. The agreement should also bind any representative of the controller. It should be straightforward and clear enough for the user to comprehend. 43 | 44 | 45 | 46 | 47 | 48 | 49 | ### [Implementation] 50 | 51 | 52 | The service should feature a mechanism (e.g. landing page or unavoidable introduction) prior to collection, which stipulates the need for user consent. There should be a reasonable effort to prevent users from bypassing this mechanism. 53 | 54 | The specific purposes for which their data will be processed should be made clear. The service should, at the same time, outline the contractual obligations it will be held against should the user consent. The user should be able to seek further detail about these obligations without first needing to consent. 55 | 56 | If users decide to consent, they can make this clear by interacting with a mechanism (e.g. button) which clearly represents their agreement to the contract. 57 | 58 | A further implementation could additionally allow the user access to a subset of the service which does not require any data, in order to help justify their consent. This would also alleviate the user's potential apprehension about the time taken to review and inform themselves about their decision. 59 | 60 | ## Consequences 61 | 62 | 63 | #### Benefits 64 | The controller, any of their representatives, and their users are tied to the terms of the contract and the legal implications it holds. Any disputes will involve both contract law and privacy law. 65 | 66 | #### Liabilities 67 | Users may be discouraged from using a service if they are made aware of the risks to their privacy, or introduced to the ways in which their data can be used to reveal information. 68 | 69 | They may also be tempted to consent without reading about the contract or how their data may be used. Therefore it is useful not to force an immediate decision, as a rushed choice can render the consent uninformed or not freely given. 70 | 71 | 72 | 73 | 74 | 75 | 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | ### [Related Patterns] 92 | 93 | [Sign an Agreement to Solve Lack of Trust on the Use of Private Data Context](Sign-an-Agreement-to-Solve-Lack-of-Trust-on-the-Use-of-Private-Data-Context) is one of the components of the compound pattern, [Lawful Consent](Lawful-Consent). As such, they share a _may use_ relationship. 94 | 95 | This pattern is _similar to_ [Obtaining Explicit Consent](Obtaining-Explicit-Consent). It aims to get the user's trust, while the second pattern focuses on collecting consent in a manner which cannot be second-guessed. Although they may seem to be complementary at first, the solutions are actually quite similar.
Each pattern points out advantages for the user or the controller from their own perspective. 96 | 97 | ### [Sources] 98 | 99 | 100 | Based on: 101 | 102 | Asnar, Y., Bryl, V., Bonato, R., Campagna, L., Donnan, D., Khoury, P. El, Giorgini, P., Holtmanns, S., Martinez-Juste, M., Massacci, F., Porekar, J., Riccucci, C., Saidane, A., Seguran, M., Thomas, R., & Yautsiukhin, A. (2007). Initial Set of Security and Privacy Patterns at Organizational Level, (December 2006). Retrieved from http://www.serenity-forum.org/IMG/pdf/A1.D3.1_patterns_at_organizational_level_v1.3_final.pdf 103 | 104 | 105 | 106 | 107 | 108 | 109 | 110 | 111 | 112 | 113 | 114 | 115 | -------------------------------------------------------------------------------- /patterns/Single-Point-of-Contact.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Single Point of Contact" 4 | type: pattern 5 | excerpt: "The Single Point of Contact is a security authority that protects the privacy and security of sensitive data stored online by validating the authority of requests and ensuring secure communication channels." 6 | categories: 7 | - control 8 | - consent 9 | - choose 10 | status: pattern 11 | use: Lawful-Consent 12 | address: 13 | 14 | %} 15 | 16 | [TOC] 17 | 18 | ### [Also Known As] 19 | 20 | 21 | Personal Agent 22 | 23 | ## Context 24 | 25 | 26 | Many controllers make use of a storage platform (i.e. 'cloud' facilities); e-Health services, for example, keep their sensitive patient data in distributed online storage. The sensitivity of this information raises concern and garners a need for special care. The storage medium in this case rules out typical security approaches. 27 | 28 | ## Problem 29 | 30 | 31 | Effective distributed storage services require specialized privacy management. The deficiencies of traditional means may be expressed through the following: 32 | 33 | - traditional security mechanisms are platform dependent; 34 | - typically they are difficult to federate or distribute; 35 | - compliance with protocol can be cumbersome; and 36 | - as such they are often inflexible. 37 | 38 | #### Forces/Concerns 39 | - Controllers wish to protect the sensitive or otherwise personal data they are charged with 40 | - They want to acquire genuine [Lawful Consent](Lawful-Consent) in a streamlined fashion 41 | - They need this process to be facilitated, supervised, and provably sound 42 | 43 | ## Solution 44 | 45 | 46 | Single Point of Contact adopts a claim-based approach for both authentication and authorization similar to a super-peer design, also acting as a (Resource) Security Token Service, an Identity and Attribute Provider, and a Relying Party. It features a tried and proven expressive e-consent language, and can communicate with other SPoCs in a Circle of Trust. 47 | 48 | #### Rationale 49 | The inflexibility of traditional security mechanisms is partly overcome by claim-based identity, which _provides a platform-independent way of presenting identity information_. 50 | 51 | ### [Structure] 52 | 53 | 54 | _A SPoC is essentially a security authority, which protects patients' privacy in e-Health applications by providing a claim-based authentication and authorisation functionality (Baier et al.
2010), and facilitating secure communication between an e-Health service and its clients._ 55 | 56 | SPoC shares characteristics with a Central Medical Registry (CMReg), which performs authentication and manages identifying access to anonymised medical documents in a central repository. SPoC additionally facilitates secure e-Health service development and integration. It is able to share Electronic Health Records (EHRs) through a peer-to-peer network as an overarching, claim-based, super-peer-like representative of the e-Health community. Multiple SPoCs may also communicate, constituting a Circle of Trust. 57 | 58 | See Fan et al. (2012) Figure 1 for a visual depiction. 59 | 60 | The SPoC features a *Domain Ontology* for providing vocabulary towards claims and policies, a *Policy Engine* for consent syntax using natural language and pseudonym storage, and an *Interface Service*. The interfaces provided include Authentication, Authorisation, and Pseudonym Resolution. 61 | 62 | ### [Implementation] 63 | 64 | 65 | A SPoC is able to issue security tokens as a Security Token Service (STS), authenticate local domain users as an Identity Provider, certify attributes as an Attribute Provider, and accept external claims as a Relying Party. When in a Circle of Trust, the SPoC can also translate the claims of other SPoCs as a Resource STS. 66 | 67 | SPoCs' implementation of e-consent features the following levels, based on Coiera et al. (2004): 68 | 69 | - general consent [with or without specific exclusions]; 70 | - general denial [with or without specific consents]; 71 | - service authorisation; 72 | - service subscription; and 73 | - investigation. 74 | 75 | As with Pruski's (2010) e-CRL, SPoCs' e-consent also considers _specific grantees, operations, purposes and period of validity_. 76 | 77 | For more information see Fan et al. (2012). 78 | 79 | ## Consequences 80 | 81 | 82 | The SPoC ensures that the privacy of sensitive medical data is protected, and that it is distributed securely and only to the people who are allowed to access the data. However, it requires a reliable credential-based authentication system to be able to validate requests. 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | 103 | 104 | ### [Related Patterns] 105 | 106 | 107 | This pattern _must use_ [Lawful Consent](Lawful-Consent), as it features an expressive e-consent language which aims to provide sufficient means to acquire permission for processing that is valid. 108 | 109 | ### [Sources] 110 | 111 | 112 | - Fan, L., Buchanan, W. J., Lo, O., Thuemmler, C., Lawson, A., Uthmani, O., Ekonomou, E., & Khedim, A. S. (2012). SPoC: Protecting Patient Privacy for e-Health Services in the Cloud. Retrieved from http://researchrepository.napier.ac.uk/4992/ 113 | - D. Baier, V. Bertocci, K. Brown, E. Pace, and M. Woloski, A Guide to Claims-based Identity and Access Control, Patterns & Practices. ISBN: 9780735640597, Microsoft Corp., Jan. 2010. 114 | - E. Coiera and R. Clarke, “e-Consent: the Design and Implementation of Consumer Consent Mechanism in an Electronic Environment,” JAMIA, vol. 11, no. 2, pp. 129–140, 2004. 115 | - C. Pruski, “e-CRL: A Rule-Based Language for Expressing Patient Electronic Consent,” in Proc. of eTELEMED. IEEE, 2010, pp. 141–146. 116 | 117 | - C. Bier and E. Krempel, “Common Privacy Patterns in Video Surveillance and Smart Energy,” in ICCCT-2012, 2012, pp. 610–615. 
118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | -------------------------------------------------------------------------------- /patterns/Sticky-policy.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Sticky Policies" 4 | type: pattern 5 | excerpt: "Machine-readable policies are attached to data to define 6 | allowed usage and obligations as the data travels across multiple 7 | parties, enabling users to improve control over their personal 8 | information." 9 | categories: 10 | - privacy-policy 11 | - enforce 12 | - uphold 13 | status: draft 14 | %} 15 | 16 | [TOC] 17 | 18 | 19 | 20 | 21 | 22 | 23 | ## Summary 24 | 25 | 26 | Machine-readable policies are attached to data to define allowed usage 27 | and obligations as the data travels across multiple parties, enabling users 28 | to improve control over their personal information. 29 | 30 | ## Context 31 | 32 | 33 | Multiple parties are aware of and act according to a certain policy 34 | when privacy-sensitive data is passed along the multiple successive 35 | parties storing, processing and sharing that data. 36 | 37 | ## Problem 38 | 39 | 40 | Data may be accessed or handled by multiple parties, who may share it 41 | onward in ways that are not approved by the data 42 | subject. 43 | 44 | ## Solution 45 | 46 | 47 | Service providers use an obligation management system. Obligation 48 | management handles information lifecycle management based on 49 | individual preferences and organisational policies. The obligation 50 | management system manipulates data over time, ensuring data 51 | minimization, deletion and notifications to data subjects. 52 | 53 | 54 | The goal of the pattern is to enable users to control 55 | access to their personal information. 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | 65 | 66 | 67 | ## Consequences 68 | 69 | 70 | 71 | 72 | 73 | Benefits: policies can be propagated throughout the cloud to trusted 74 | organisations; enforcement of the policies is strong; actions are traceable. 75 | Liabilities: scalability (policies increase the size of data); the approach 76 | may not be compatible with existing systems; it may be difficult to 77 | update the policy after the data has been shared and multiple 78 | copies of it exist; and it requires ensuring data is handled according to 79 | policy, e.g. using auditing. 80 | 81 | 82 | 83 | 84 | 85 | 86 | ## Examples 87 | 88 | 89 | When an organisation shares data, it can use a privacy-preserving 90 | policy to oblige the third-party organisations 91 | that use, process and store the data to respect user privacy. For example, a hospital may 92 | share data with third party organisations while requiring adherence to 93 | specific privacy policies associated with the data. 94 | 95 | ### [Known Uses] 96 | 97 | 98 | Examples of policy specification languages include EPAL, OASIS XACML 99 | and W3C P3P. Tracing of services can use Identifier-Based Encryption 100 | and trusted technologies. An alternative approach using Merkle hash 101 | trees has been proposed by Pöhls (2008). A Platform for Enterprise 102 | Privacy Practices (E-P3P) (2003) distinguishes the enterprise-specific 103 | deployment policy from the privacy policy and facilitates the 104 | privacy-enabled management and exchange of customer data. References: 105 | Pearson, S., Sander, T.
and Sharma, R., Privacy Management for Global 106 | Organisations, Data Privacy Management and Autonomous Spontaneous 107 | Security, LNCS 5939, Springer, pp. 9-17, 2009. Pöhls, H.G., Verifiable 108 | and Revocable Expression of Consent to Processing of Aggregated 109 | Personal Data, ICICS, pp. 279-293, 2008. Karjoth, G., Schunter, M., & 110 | Waidner, M., Platform for enterprise privacy practices: 111 | Privacy-enabled management of customer data. In Privacy Enhancing 112 | Technologies, pp. 69-84, Springer Berlin Heidelberg, 2003. 113 | 114 | 115 | 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | 137 | 138 | -------------------------------------------------------------------------------- /patterns/Strip-invisible-metadata.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Strip Invisible Metadata" 4 | excerpt: "Strip potentially sensitive metadata that isn't directly visible to the end user." 5 | status: draft 6 | display_in_list: True 7 | type: pattern 8 | categories: 9 | - metadata 10 | - minimization 11 | - exif 12 | - location 13 | - media 14 | - minimize 15 | - strip 16 | %} 17 | 18 | [TOC] 19 | 20 | 21 | 22 | 23 | 24 | 25 | ## Summary 26 | 27 | 28 | Strip potentially sensitive metadata that isn't directly visible to the end user. 29 | 30 | 31 | 32 | 33 | ## Context 34 | 35 | 36 | When a service requires a user to import data from external sources (e.g. 37 | pictures, tweets, documents), different types of metadata may be 38 | transmitted. Users may not be aware of the metadata as it can be 39 | automatically generated or not directly visible. Services might be 40 | inadvertently responsible for exposing private metadata, or going 41 | against users' expectations. 42 | 43 | ## Problem 44 | 45 | 46 | Users are not always fully aware of the various kinds of metadata 47 | attached to files and web resources they share with online services. 48 | Much of this data is automatically generated, or not directly visible to 49 | users during their interactions. This can create situations where, even 50 | though users share information explicitly with services, they may be 51 | surprised to find this data being revealed. In certain cases where the 52 | data is legally protected, the service could be held responsible for any 53 | leakage of sensitive information. 54 | 55 | How should services that need users to share data and upload files 56 | treat additional metadata attached to files? In the case of uploading 57 | documents and images, which parts of the metadata can be treated as 58 | explicitly shared information? 59 | 60 | ## Solution 61 | 62 | 63 | Stripping all metadata that is not directly visible at upload time, 64 | or during the use of the service, can help protect services from 65 | leaks and liabilities. Even in cases where the information is not 66 | legally protected, services can protect themselves from surprising 67 | their users and thus alienating them. 68 | 69 | Additionally, when users share data with services, they can be presented 70 | with a preview of the data obtained by the service, including any 71 | metadata ``[[Preview Shared Data]]``. This allows users to be more aware 72 | of information that they are sharing with the services, and in many 73 | cases with other entities on the Internet.
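As a concrete illustration of the stripping step (assuming the Pillow library; file names are placeholders), copying only the pixel data into a fresh image drops EXIF and similar invisible blocks:

```python
from PIL import Image  # Pillow

def strip_image_metadata(src_path: str, dst_path: str) -> None:
    """Re-encode an image so that only pixels, not metadata, are kept."""
    with Image.open(src_path) as img:
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))  # copy pixel data only
        clean.save(dst_path)  # saved without the original EXIF/XMP blocks

strip_image_metadata("upload.jpg", "upload-clean.jpg")
```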
74 | 75 | To summarize: metadata that cannot be made clearly visible to users 76 | should be stripped to avoid overstepping the users' expectations. 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | ## Examples 99 | 100 | 101 | 1. Uploading images to twitter.com 102 | 103 | Twitter.com removes EXIF data from images uploaded to their image 104 | sharing service. There have previously been many breaches of personal 105 | location through EXIF data shared by image-sharing services. 106 | 107 | 2. Hiding EXIF data on Flickr.com 108 | 109 | In certain cases services might build features based on 110 | metadata, or the metadata sharing could be an important part of the 111 | community of users. Flickr.com allows users to hide their EXIF data from 112 | public display, and also provides an interface for users to easily see 113 | whether they are sharing location as part of uploading their images. 114 | 115 | _TODO: add screenshots_ 116 | 117 | 118 | 119 | 120 | 121 | 122 | 123 | 124 | 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | 137 | 138 | 139 | 140 | 141 | 142 | 143 | 144 | 145 | 146 | -------------------------------------------------------------------------------- /patterns/Trustworthy-privacy-plugin.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Trustworthy Privacy Plug-in" 4 | type: pattern 5 | excerpt: "Aggregate usage records at the user side in a 6 | trustworthy manner." 7 | categories: 8 | - aggregate 9 | - hide 10 | - restrict 11 | status: draft 12 | %} 13 | 14 | [TOC] 15 | 16 | 17 | 18 | 19 | 20 | 21 | ## Summary 22 | 23 | 24 | Aggregate usage records at the user side in a trustworthy manner. 25 | 26 | ## Context 27 | 28 | 29 | A service provider gets continuous measurements of a service attribute linked to an individual user. Applicable service tariffs may vary over time. 30 | 31 | ## Problem 32 | 33 | 34 | The provision of a service may require repeated, detailed measurements of a service attribute linked to a data subject to e.g. properly bill them for the service usage. However, these measurements may reveal further information (e.g. personal habits, etc.) when repeated over time. 35 | 36 | ## Solution 37 | 38 | 39 | Host a Privacy Plugin on a consumer-trusted device, in between the metering system and the service provider in charge of billing for the service usage. This privacy plugin, under the consumer’s control, computes the aggregated invoice and sends it to the service provider (or to its billing subsystem), which no longer needs any fine-grained consumption records. Cryptographic techniques (homomorphic commitments, zero-knowledge proofs of knowledge, digital signatures) are used to ensure trustworthiness of the generated invoices without requiring tamper-proof hardware. 40 | 41 | 42 | A service provider can get a trustworthy measurement of service usage over a period to issue a bill for the service usage; however, the detailed consumption for finer intervals cannot be obtained. 43 | 44 | 45 | 46 | 47 | 48 | 49 | 50 | 51 | 52 | 53 | 54 | ## Consequences 55 | 56 | 57 | 58 | 59 | 60 | The service provider no longer needs to access detailed consumption data in order to issue reliable bills.
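A simplified sketch of the plugin's local aggregation, leaving out the cryptographic commitments and proofs that make the result verifiable; the readings and tariffs are illustrative:

```python
from decimal import Decimal

# Fine-grained readings never leave the consumer-trusted device.
readings = [  # (hour, kWh consumed in that hour)
    (0, Decimal("0.30")), (1, Decimal("0.25")), (2, Decimal("1.10")),
]
tariffs = {0: Decimal("0.10"), 1: Decimal("0.10"), 2: Decimal("0.25")}  # price per kWh

def compute_invoice(readings, tariffs) -> Decimal:
    """Aggregate locally; only the total is sent to the provider."""
    return sum((kwh * tariffs[hour] for hour, kwh in readings), Decimal("0"))

# In the full pattern the plugin would also attach homomorphic commitments
# to each reading plus a zero-knowledge proof that the total is consistent
# with them, so the provider can verify the bill without the raw data.
invoice = {"period": "2024-01", "total_due": str(compute_invoice(readings, tariffs))}
```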
61 | 62 | 63 | 64 | 65 | 66 | 67 | ## Examples 68 | 69 | 70 | An electric utility operates a smart grid network with smart meters that provide measurements of the instantaneous power consumption of each user. Depending on the power demand, dynamic tariffs are applied. The utility employs that information to bill each client periodically, according to their aggregated consumption over the billing period and the respective tariffs at each moment. However, this information can also be exploited to infer sensitive user information (e.g. at what times they leave and return home). 71 | 72 | ### [Known Uses] 73 | 74 | 75 | - Alfredo Rial and George Danezis. 2011. Privacy-preserving smart metering. In Proceedings of the 10th annual ACM workshop on Privacy in the electronic society (WPES '11). ACM, New York, NY, USA, 49-60. 76 | 77 | 78 | 79 | 80 | 81 | 82 | 83 | 84 | 85 | 86 | 87 | 88 | 89 | 90 | 91 | 92 | 93 | 94 | 95 | 96 | 97 | 98 | 99 | 100 | 101 | 102 | -------------------------------------------------------------------------------- /patterns/Use-of-dummies.md: -------------------------------------------------------------------------------- 1 | {%hyde 2 | 3 | title: "Use of dummies" 4 | type: pattern 5 | excerpt: "This pattern hides the actions taken by a user by adding 6 | fake actions that are indistinguishable from real ones." 7 | categories: 8 | - hide 9 | - obfuscate 10 | status: draft 11 | %} 12 | 13 | [TOC] 14 | 15 | 16 | 17 | 18 | 19 | 20 | ## Summary 21 | 22 | 23 | This pattern hides the actions taken by a user by adding fake actions 24 | that are indistinguishable from real ones. 25 | 26 | ## Context 27 | 28 | 29 | This pattern is applicable when an action cannot be avoided or 30 | delayed, and its content cannot be obfuscated. 31 | 32 | ## Problem 33 | 34 | 35 | When users interact with ICT systems their actions reveal a lot of 36 | information about themselves. An option would be for users to not 37 | perform such actions to protect their privacy. However, this is not 38 | possible since users cannot completely avoid executing these actions 39 | because they need to perform them to achieve a goal (e.g., search for 40 | a word on the Internet, send an email, search for a location). 41 | 42 | ## Solution 43 | 44 | 45 | Since the action must be accurately performed, an option to provide 46 | privacy is to simultaneously perform other actions in such a way that 47 | the adversary cannot distinguish real and fake (often called dummy) 48 | actions. 49 | 50 | 51 | To hinder the adversary’s ability to infer the user behavior, as well 52 | as her preferences. 53 | 54 | 55 | 56 | 57 | 58 | 59 | 60 | 61 | 62 | 63 | 64 | ## Consequences 65 | 66 | 67 | 68 | 69 | 70 | This pattern entails the need for extra resources to perform the dummy 71 | actions, both on the side of the user, who must perform the additional actions, and 72 | on the server side, which must process several actions. Sometimes it may 73 | degrade the quality of service since the service provider cannot 74 | personalize services. It has been demonstrated that generating dummies 75 | that are perfectly indistinguishable from real actions (in terms of 76 | content, timing, size, etc.) is very difficult.
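A toy sketch of the dummy-query idea for web search; as the Consequences note, real decoys must match genuine behaviour in content and timing, which this naive pool does not attempt:

```python
import random

DECOY_QUERIES = [
    "weather tomorrow", "pasta recipe", "train schedule",
    "football results", "battery life tips",
]

def search_with_dummies(real_query: str, send, k: int = 4) -> None:
    """Issue the real query hidden among k dummy queries, in random order."""
    queries = random.sample(DECOY_QUERIES, k) + [real_query]
    random.shuffle(queries)  # the observer sees k + 1 equally plausible queries
    for query in queries:
        send(query)

# The eavesdropper cannot tell which of the five queries was intended.
search_with_dummies("clinics near me", send=print)
```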
## Examples

Alice wants to search for an abortion clinic on Google, but she does not want to reveal her intention to an adversary who may be eavesdropping on this search (e.g., her ISP, or the system administrator at her workplace).

### [Known Uses]

The use of this pattern has been proposed to protect privacy in location-based services (the user reveals several locations to the service provider so that her real location is hidden), anonymous communications (the user sends fake messages to fake recipients to hide her profile), and web search (the user searches for fake terms to hide her real preferences).

-------------------------------------------------------------------------------- /patterns/User-data-confinement-pattern.md: --------------------------------------------------------------------------------

{%hyde

title: "User data confinement pattern"
type: pattern
excerpt: "Avoid the central collection of personal data by shifting some of the processing of personal data to user-trusted environments (e.g. users' own devices). Allow users to control exactly which data they share with service providers."
categories:
- data-minimization
- separate
- isolate
status: draft
%}

[TOC]

## Summary

Avoid the central collection of personal data by shifting some of the processing of personal data to user-trusted environments (e.g. users' own devices). Allow users to control exactly which data they share with service providers.

## Context

This pattern may be used whenever the collection of personal data for one specific and legitimate purpose still poses a relevant level of threat to users' privacy.

## Problem

The engineering process is biased towards system-centric architectures in which data is collected and processed by a single central entity, forcing users to trust that entity and share potentially sensitive personal data with it.

## Solution

The solution is to shift the trust relationship: instead of the customer trusting the service provider to protect their personal data, the service provider now has to trust the customer's processing.

In the smart meter example, the smart meter would receive the monthly tariff and calculate the customer's bill, which is then sent to the energy provider for processing. The main benefit is that at no moment does the personal data leave the user's trusted environment.

This avoids both the need to trust service providers and the central collection of personal data.

## Consequences

Depending on the type of processing (e.g. calculating the bill for the monthly energy consumption, or deriving an age from a birth date), the service provider will require some guarantees from the processor (the end user). This may involve the use of Trusted Platform Modules or cryptographic schemes (e.g. ABC4Trust).
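A minimal sketch of the smart-meter case discussed in the example below: the bill is computed on the consumer's device from the tariff schedule, and only the total leaves the trusted environment. Function and field names are illustrative assumptions, and the provider-side guarantees mentioned above (e.g. signed or committed readings) are out of scope here.

```python
def local_bill(hourly_readings, tariff_per_hour):
    """Runs on the user's device: fine-grained readings never leave it."""
    return sum(tariff_per_hour[hour] * kwh
               for hour, kwh in hourly_readings)

# (hour-of-day, kWh) measurements stay on the meter...
readings = [(0, 0.4), (8, 1.2), (19, 2.5)]
tariff = {hour: (5 if 18 <= hour <= 21 else 2) for hour in range(24)}

# ...and only this single number is reported to the energy provider.
monthly_total = local_bill(readings, tariff)
```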
## Examples

The smart grid is a domain with a clear example: having smart meters deliver customers' hourly energy consumption to the energy provider poses a serious threat to the customers' privacy. If the only purpose of collecting these data is to bill the customer, why can this calculation not be done by the customer, based on pre-established tariffs?

Similar examples in other domains are "pay as you drive" insurance policies, where the insurance price is calculated from the driver's behaviour, and electronic toll pricing.

### [Known Uses]

Smart meters, privacy-enhanced attribute-based credentials, "pay as you drive" insurance, electronic toll pricing.

-------------------------------------------------------------------------------- /principles/Minimization.md: --------------------------------------------------------------------------------

## Definition

Data minimization is the principle of collecting less personal data, or less specific data, and retaining specific data for a shorter period of time, in order to decrease privacy risks.

Data minimization techniques can decrease the risk of unexpected later access — through a security breach or lawful access request, for example — provide demonstrable clarity of data handling practices, and remove the burden of safeguarding sensitive data.

## See also

* [Data Minimization in Web APIs](http://www.w3.org/2001/tag/doc/APIMinimization.html), World Wide Web Consortium Technical Architecture Group

-------------------------------------------------------------------------------- /principles/location.md: --------------------------------------------------------------------------------

## Definition

> As the location-tracking capabilities of mobile devices increase, problems related to user privacy arise, since a user's position and preferences constitute personal information, and improper use of them violates the user's privacy. Several methods to protect user privacy when using location-based services have been proposed, including the use of anonymizing servers and the blurring of information, among others. Methods to quantify privacy have also been proposed, to calculate the equilibrium between the benefit of providing accurate location information and the drawbacks of risking personal privacy. Users of such services may also choose to display more generic location information (i.e. "In the City", "Philadelphia", or "Work") to some of their more casual acquaintances, while displaying specific location information, such as their exact address, only to closer contacts like a spouse, relatives, and good friends.

Source: [Wikipedia](http://en.wikipedia.org/wiki/Privacy#Privacy_and_location-based_services)
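A toy sketch of the audience-dependent granularity idea in the quoted definition; the audience tiers, labels, and data layout are illustrative assumptions.

```python
# Location is disclosed at a different granularity per audience tier.
GRANULARITY = {
    "public": "city",
    "acquaintance": "neighborhood",
    "close_contact": "exact",
}

def displayed_location(location, audience):
    """Return the view of `location` appropriate for `audience`."""
    level = GRANULARITY.get(audience, "city")   # default to the coarsest tier
    return location[level]

location = {
    "exact": "123 Main St, Philadelphia",
    "neighborhood": "Fishtown",
    "city": "Philadelphia",
}
displayed_location(location, "public")          # -> "Philadelphia"
displayed_location(location, "close_contact")   # -> "123 Main St, Philadelphia"
```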
-------------------------------------------------------------------------------- /resources/privacypatterns-sxsw.key: --------------------------------------------------------------------------------

https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/resources/privacypatterns-sxsw.key

-------------------------------------------------------------------------------- /resources/privacypatterns.key: --------------------------------------------------------------------------------

https://raw.githubusercontent.com/privacypatterns/patterns/3fe5821aeb207904946990b76527054d8a92d67b/resources/privacypatterns.key

-------------------------------------------------------------------------------- /specs/patterns-schemas.md: --------------------------------------------------------------------------------

## Patterns: schema

It would be in line with modern web practice to ensure that the patterns published on the website conform to the relevant web standards. Some of these are mentioned below:

* Search schema support (see [[schema.org|http://schema.org/Thing]]): patterns published on the website **SHOULD** follow microformat specifications. A custom subset of *Thing::Article* is probably something we could look at.

* RSS/Atom feeds of new patterns: eventually it would be a good idea to expose feeds of patterns by topic and/or category.

* Template for writing patterns: we should decide on an underlying pattern template that can be used to write new patterns. This might evolve over time.

## Hyde variables

We use the following hyde variables in a pattern to supply metadata to our pre-processor.

{%hyde _metadata blocks must begin with this_

title: Title of the pattern _variable names must be lower case_

type: pattern _values allowed: pattern, principle, page_

excerpt: This short description is a 1-2 line abstract to appear on the homepage and other shortened forms.

categories: location, notice, smartphone, ui _tags that classify the subject matter, the purpose of the pattern, and the applicable audience_

status: draft _values allowed: stub, draft, published_

address: http://npdoty.name, @m0hit _comma-separated list of authors, or more importantly, the people to contact or blame about the content, to be displayed in an <address> HTML element on the page. in future, URLs may be de-referenced and hCard identities extracted; Twitter/github handles and email addresses are also possible._

%} _end the metadata block with this_
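For illustration, a hypothetical filled-in metadata block following the template above; the excerpt here is invented for the example, while the title and addresses are borrowed from elsewhere in the repository.

{%hyde

title: Location granularity

type: pattern

excerpt: Support minimal disclosure by letting users share their location at the granularity they choose.

categories: location, data-minimization, ui

status: draft

address: http://npdoty.name, @m0hit

%}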
-------------------------------------------------------------------------------- /specs/requirements.md: --------------------------------------------------------------------------------

## Tools for communication and tracking

### Project Requirements

#### Website

* source control for the website source
* dynamic web hosting and a domain name for the website
* bug and task tracking for the website

#### Patterns

* collaboration tool for writing patterns
* multiple authors (initially Nick and Mohit, but potentially others who are less technical)
* preferably authoring in each author's preferred format (plain-text Markdown for Mohit, MediaWiki for Nick, web-based for Deirdre, etc.)
* allow links between patterns and to principles / other pages
* revision history for patterns
* faceted and/or hierarchical organization of patterns
* public sharing of patterns (the privacypatterns website)
* publicly shared patterns should be kept separate from patterns currently being drafted
* it should be fairly easy to publish a pattern from the collaborative writing tool once it is ready

#### Task tracking, meeting notes and misc items

* task tracking for the project
* a repository for meeting notes
* a repository for presentations, image resources and other media

#### Interviews

* people to interview
* set up meetings and track them
* some of this data needs to be private

-------------------------------------------------------------------------------- /specs/website.md: --------------------------------------------------------------------------------

## Privacy Patterns public website outline

The privacy patterns project **REQUIRES** a public-facing website as a space where people can:

* learn about the project
    * the people and organizations participating in the project
    * the goals and plans of the project
    * the motivation for the project

* browse available patterns. The site:
    * **MUST** allow comments on existing patterns
    * **MAY** allow users to suggest new patterns, or a new organization of patterns
    * **MAY** allow users to add example uses of patterns

* express interest, as individuals or organizations, in participating in the project, e.g. to:
    * submit new patterns
    * help with the development of the site
    * participate in interviews about design for privacy