US8578501B1 - Anonymous social networking with community-based privacy reviews obtained by members - Google Patents

Anonymous social networking with community-based privacy reviews obtained by members

Info

Publication number
US8578501B1
US8578501B1 (application US11/870,475)
Authority
US
United States
Prior art keywords
privacy
review
consenting
online community
online
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US11/870,475
Inventor
John W. Ogilvie
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/870,475
Application granted
Publication of US8578501B1
Status: Expired - Fee Related
Adjusted expiration

Abstract

A member of an online community has an online identity published in the online community, and owns an offline identity which is not published there. The member manifests consent to a privacy review of an electronic communication involving the member, to help prevent disclosure of the member's offline identity within the online community. The electronic communication is reviewed by a human privacy reviewer and/or by automatically scanning for privacy concern triggers. Review results are provided to the member, who provides an opinion of the review that is then reflected in a summary of the reviewer's online reputation.

Description

RELATED APPLICATIONS
The present application incorporates and claims priority to each of the following: U.S. provisional patent application Ser. No. 60/865,757 filed Nov. 14, 2006; U.S. provisional patent application Ser. No. 60/866,418 filed Nov. 18, 2006; and U.S. provisional patent application Ser. No. 60/868,619 filed Dec. 5, 2006.
BACKGROUND
Social network services are provided online for communities of people who share interests. Social network services provide ways for members of an online community to learn about each other, such as directories, profiles, personal pages, and search facilities. Social networks also build on or provide ways for members of an online community to communicate electronically with each other, such as chat, email, instant messaging, blogs, forums, video transmissions, and discussion groups.
Contacts made online through a social network using online identities may be pursued offline. People who first met online may decide to meet in person offline for dating, friendship, business, or philanthropic activities, for example. Even if a member of an online community chooses not to meet other members in person offline, the member's offline identity may become known to others, through a communication from the member or otherwise.
SUMMARY
In connection with some embodiments, a member of an online community has an online identity which is published within the online community, and the member owns an offline identity which is not published within the online community. The member electronically manifests consent to a privacy review of an electronic communication (possibly not yet created) for which the member is a sender and/or an intended receiver. A goal of the privacy review is a lowered risk of disclosure of the member's offline identity within the online community. A service provider obtains the electronic communication and determines that the electronic communication should be submitted to a privacy review to assess the extent to which the electronic communication discloses the member's offline identity. A human privacy reviewer reviews the electronic communication. Results of the review are provided to the member.
The examples given are merely illustrative. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Rather, this Summary is provided to introduce—in a simplified form—some concepts that are further described below in the Detailed Description. The innovation is defined with claims, and to the extent this Summary conflicts with the claims, the claims should prevail.
DESCRIPTION OF THE DRAWINGS
A more particular description will be given with reference to the attached drawings. These drawings only illustrate selected aspects and thus do not fully determine coverage or scope.
FIG. 1 is a block diagram illustrating an operating environment, some roles, some data structures, and some system and configured storage medium embodiments;
FIG. 2 is a flow chart illustrating steps of some method and configured storage medium embodiments from a point of view of a member of an online community;
FIG. 3 is a flow chart illustrating steps of some method and configured storage medium embodiments from a point of view of a service provider who facilitates an online community; and
FIG. 4 is a flow chart illustrating steps of some method and configured storage medium embodiments from a point of view of a human privacy reviewer who reviews an electronic communication for an online community.
DETAILED DESCRIPTION
Overview
Reference will now be made to exemplary embodiments such as those illustrated in the drawings, and specific language will be used herein to describe the same. But alterations and further modifications of the features illustrated herein, and additional applications of the principles illustrated herein, which would occur to one skilled in the relevant art(s) and having possession of this disclosure, should be considered within the scope of the claims.
The meaning of terms is clarified in this disclosure, so the claims should be read with careful attention to these clarifications. Specific examples are given, but those of skill in the relevant art(s) will understand that other examples may also fall within the meaning of the terms used, and within the scope of one or more claims. Terms do not necessarily have the same meaning here that they have in general usage, in the usage of a particular industry, or in a particular dictionary or set of dictionaries. Reference numerals may be used with various phrasings, to help show the breadth of a term. Omission of a reference numeral from a given piece of text does not necessarily mean the content of a Figure is not being discussed by the text. The inventor asserts and exercises his right to his own lexicography. Terms may be defined, either explicitly or implicitly, here in the Detailed Description and/or elsewhere in the application file.
As used herein, a “computer system” may include, for example, one or more personal computers (portable or not), servers, personal digital assistants, cell or mobile phones, and/or device(s) having a processor controlled at least in part by instructions. The instructions may be in the form of software in memory and/or specialized circuitry. In particular, although it may occur that many embodiments run on personal computers and/or on servers, other embodiments may run on other computing devices, and any one or more such devices may be part of a given embodiment. Terms such as “computerized” refer to devices having a microprocessor and memory, not merely to personal computers or servers.
“Electronic” refers to digital and/or analog electronic circuitry.
“Automatic” means without requiring ongoing real-time human input or guidance to perform the immediately contemplated operation.
Operating Environment
With reference to FIG. 1, roles within an operating environment for an embodiment may include a member 100 of an online community 102, a service provider 104, a privacy reviewer 106, and an intended receiver 108. In a given configuration, the service provider 104, the privacy reviewer 106, and/or the intended receiver 108 may also be members of the online community. An online community may have more than one service provider, e.g., it may have both an internet service provider (ISP) and an online community services provider (OCSP), with service provider 104 services being provided by either of these or by both the ISP and the OCSP, depending on the configuration. Commercial embodiments may operate on an ad-revenues business model, on a user-fee model (e.g., with anonymous payments), and/or on other business models.
In some configurations, the service provider 104 provides general-purpose services such as email, web page hosting, and message forum hosting, which have been adapted by members 100 for uses specific to the online community. In some configurations, the service provider 104 provides services that are specific to the online community, such as profile-editing software. In some configurations, the service provider 104 provides both general-purpose services and specific services to support the online community.
In some configurations, a human person serves as a privacy reviewer 106. In some configurations, the role of privacy reviewer 106 is filled in part or in full by a special-purpose software process that automatically scans electronic communications 110 for specified keywords, specified data types, and/or other specified privacy concern triggers. In some configurations, the role of privacy reviewer is filled by a human person assisted by such special-purpose software.
The intended receiver 108 may be a member of the online community, but need not be a member in every configuration. The intended receiver may also be an actual receiver; that is, an electronic communication 110 intended for the receiver 108 may have been actually delivered to the receiver 108. On the other hand, the intended receiver may be simply someone to whom the member can address an electronic communication 110, and it is not necessary in every configuration for the electronic communication 110 to exist in order for the role of intended receiver to be filled. A member may give consent, for example, to privacy review of all electronic communications 110 made or yet to be made by the member within the online community, thereby placing every current and future member of the online community in the role of intended receiver for electronic communications 110 that have not yet been created.
Because of space limitations, FIG. 1 shows only one member 100, one service provider 104, one privacy reviewer 106, and one intended receiver 108. However, a given configuration may include zero or more members, service providers, privacy reviewers, and receivers, depending on the requirements of the embodiment being discussed. Each of these entities may also belong to or facilitate one or more online communities 102 in a given configuration.
An operating environment for an embodiment may include, for instance, a member computer system 112, a service provider computer system 114, a privacy reviewer computer system 116, and a receiver computer system 118. Each computer system 112-118 has a processor 120 and a memory 122 which operate together to provide functionality discussed herein. But the various computer systems 112-118 need not be identical with each other. For example, the service provider system 114 may include privacy reviewer selection software 124 and/or privacy reviewer contact management software 126 that is not present on the member/sender computer system 112 or on the member/receiver computer system 118. Similarly, the reviewer computer system 116 may include privacy review software 128 that is not present on the other computer systems 112, 114, 118.
Such software 124-128, like other software discussed herein, includes instructions that are executable by a processor 120 and also includes data which is created, modified, referenced, structured, and/or otherwise used by the instructions. The software's instructions and data configure the memory(ies) 122 in which they reside. For example, the software may configure a removable memory device 130 such as a DVD or a flash memory even when that memory device is not plugged into a computer system. The software may also configure a memory 122 that is a functional part of a given computer system, such as RAM or a plugged-in removable memory 130, in which case the software instructions and data also configure the given computer system.
In some configurations, peripheral equipment 134 such as human user I/O devices (screen, keyboard, mouse, microphone, speaker, motion sensor, etc.) will be present in operable communication with the processor 120 and the memory 122. However, a software embodiment may also be deeply embedded, on a service provider server 114 for example, such that the software in the embodiment has no human user interaction through human user I/O devices during normal operation.
In some configurations, networking interface equipment 134 such as a packet-switched network interface card, a wireless transceiver, or a telephone network interface, for example, will be present in a computer system 112-118. However, a computer system may also communicate through direct memory access, removable nonvolatile media, or other information storage-retrieval and/or transmission approaches.
An operating environment for an embodiment may include a single member computer system 112, a single service provider computer system 114, a single privacy reviewer computer system 116, and/or a single receiver computer system 118. A given embodiment may also include two or more computer systems 112-118, which may be linked to one another for networked communication.
Each computer system 112-118 may run any network and operating system software 132, and may use any network interface equipment and other peripheral equipment 134, now known or hereafter formed. The operating environment may include computer systems 112-118 that are client-server networked and/or peer-to-peer networked. Some operating environments include a stand-alone (non-networked) computer system, such as a privacy reviewer computer system 116 configured for use in reviewing electronic communications 110 which are accessed from a removable storage medium 130 such as a magnetic tape.
Systems
In some embodiments, a computer system 112 configured for use by a member 100 of an online community 102 includes a memory 122 configured with computer-executable instructions, and a processor 120, coupled with the memory, that executes instructions. The instructions are part of software with which the member 100 electronically manifests a consent 136 to a privacy review of an electronic communication 110 by a human privacy reviewer 106.
For example, the member may press a user interface button labeled “Yes, I consent” after being shown an appropriate notice on a screen 134, thereby generating a consent 136 in the form of a cookie, a certificate, a bit flag, or another data structure. Privacy review consents 136 manifested on a member computer system 112 may be logged into a database of consents 136 and/or otherwise tracked on a service provider computer system 114 in client-server configurations. In peer-to-peer configurations, consents 136 generated on a member computer system 112 may be transmitted to a peer computer system such as a privacy reviewer computer system 116 and/or a receiver computer system 118.
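One minimal way such a consent 136 could be represented and checked is sketched below. The patent does not prescribe any particular data structure, so the field names and scope values here are illustrative assumptions only.

```python
import time

def manifest_consent(member_online_id, scope="all-communications"):
    """Build a consent record after the member clicks "Yes, I consent".

    Hypothetical fields: the patent only requires that consent be
    electronically manifested and trackable (e.g., logged or transmitted).
    """
    return {
        "member": member_online_id,  # the online identity 138, never the offline one
        "scope": scope,              # one communication id, or all current and future ones
        "timestamp": time.time(),    # when consent was manifested
        "revoked": False,
    }

def consent_covers(consent, communication_id):
    """Check whether a logged consent applies to a given communication."""
    if consent["revoked"]:
        return False
    return consent["scope"] in ("all-communications", communication_id)
```

A service provider could log such records in a consent database and consult `consent_covers` before routing a communication to review.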
The electronic communication 110 has not necessarily been created yet when the consent 136 is manifested. The member 100 is confirmed as a sender and/or an intended receiver of the electronic communication 110 after the electronic communication is created. Some examples of electronic communications 110 include an email, an instant message, a blog entry, a blog comment, a forum posting, a video file or stream, and a VoIP communication.
The member is identified by an online identity 138 in the electronic communication. A goal of the privacy review is a lowered risk of disclosure of an offline identity 140 of the member within the online community 102.
Some examples of online identities 138 are usernames, email addresses, web page addresses, and avatars. Some examples of offline identities 140 are legal names, residential addresses, employer names, and information of the type found on driver's licenses, passports, and other government-issued identification documents.
Space limitations prevent showing every item in FIG. 1 at every possible location of the item. For example, in some embodiments a notice 142 is generated on a service provider computer system 114, as shown, and then transmitted to a member computer system 112, despite the fact that FIG. 1 does not expressly illustrate a notice 142 on the member computer system 112 shown. The content of a given notice 142 may vary, as discussed elsewhere herein. As another example of how FIG. 1 merely helps illustrate possible configurations, each of the computer systems 112-118 has one or more processors 120 and at least one memory 122, even though these two items are shown expressly only for system 112.
A privacy review may decrease in various ways the risk of disclosure of a member's offline identity within the online community. At a basic level, simply knowing that privacy reviews are used routinely, or perhaps even merely knowing that they are an option available to members, may be enough in some cases to reduce disclosure risk because members' knowledge of privacy reviews leads members to proactively draft and review their electronic communications with offline privacy protection in mind. The members check at least some of their communications themselves to look for offline identity information, instead of simply submitting the communications to privacy review by someone else.
At a more advanced level, a member communication 110 undergoes a privacy review beyond whatever personal review was done by the member 100 who wrote the communication. The privacy review may be performed automatically and/or by a human privacy reviewer 106.
Automatic privacy review may be performed by scanning software 144 that scans the communication for specified privacy concern triggers 146. Some examples of possible triggers 146 based on content include: personal names, family names, phone numbers, offline addresses, online addresses, geographic names, landmark names, questions seeking geographic information, statements containing geographic information, questions seeking employment information, statements containing employment information, indications of gender, indications of race, indications of ethnicity, indications of age, indications of title, and indications of profession.
Some of these triggers can be readily recognized automatically by searching for keywords. For example, a list of states, provinces, and countries can be searched automatically to check offline addresses. An automatic search can be made for geographic terms like “river”, “lake”, “mountains”, “border”, and the like. An automatic search can be made for landmark names like “Eiffel”, “Parliament”, “National Park”, and the like. In some cases, the set of keywords automatically searched may be taken from a database 148. Personal and family names, and employer names, for instance, can be found in numerous searchable public databases.
Some automatic searches may scan for particular data formats. Some example data formats include formats used for addresses 166 or images 168. Some examples of addresses 166 include alphanumeric strings that match telephone number syntax, alphanumeric strings that match email address or website address syntax, alphanumeric or numeric (depending on location) strings that match zip code or other postal code syntax, and so on. Some examples of images 168 include communication 110 files having extensions associated with images (.jpg, .mpeg, .pdf, etc.), inline images in the body of a communication 110, and hyperlinks in a communication to a data stream or website that contains an image; links may be followed and destination pages parsed.
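Such scanning software 144, combining a keyword list with data-format patterns, could be sketched roughly as follows. The specific keyword set and regular expressions here are illustrative assumptions; a deployed scanner would draw its keywords from a database 148 and would need locale-aware address patterns.

```python
import re

# Illustrative trigger keywords (geographic terms and landmark names).
KEYWORD_TRIGGERS = {"river", "lake", "mountains", "border",
                    "eiffel", "parliament", "national park"}

# Illustrative data-format patterns for addresses 166.
FORMAT_TRIGGERS = {
    "phone number": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us zip code": re.compile(r"\b\d{5}(?:-\d{4})?\b"),
}

def scan_for_triggers(text):
    """Return privacy concern triggers 146 found in a communication 110."""
    found = []
    lowered = text.lower()
    for keyword in KEYWORD_TRIGGERS:
        if keyword in lowered:
            found.append(("keyword", keyword))
    for label, pattern in FORMAT_TRIGGERS.items():
        if pattern.search(text):
            found.append(("format", label))
    return found
```

A communication that yields any triggers could then be routed to a human privacy reviewer 106, or the hits could be highlighted for the member.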
Privacy review may be performed by a human privacy reviewer 106 who controls execution of scanning software 144. In some embodiments, a processor coupled with memory executes instructions for a member 100 to electronically receive a notice 142 that review can be performed by a human privacy reviewer if the member so desires (opt-in approach). In some cases, a notice 142 is given that the human privacy review will be made unless the member chooses otherwise (opt-out approach).
In some cases, a notice 142 is given that the human privacy reviewer belongs to the online community. When privacy reviewers 106 are also members 100 of an online community 102, their reviews may better reflect the standards of that online community. The diligence of privacy reviewers 106 who belong to an online community 102 may also be boosted by their concern for their reputation (even under a username) within that online community.
The human privacy reviewer 106 may use privacy review software 128 for highlighting, annotating, and/or editing a communication 110 to point out and/or suggest a change in a part of the communication that the reviewer believes presents a noteworthy risk of offline identity disclosure. Annotated communications, changes, proposed changes, comments, assessments, and/or other privacy review results 150 pertaining to the disclosure risk posed by the communication are provided electronically to the member 100 by email or otherwise.
The privacy review software 128 may also provide a human privacy reviewer 106 with access to a privacy review history 152. The privacy review history 152 may contain information specific to individual privacy reviews, may contain aggregated statistical information based on multiple privacy reviews, or both. It may track items such as particular privacy concern trigger keywords (e.g., “Texas”), particular types of privacy concern triggers (e.g., US state names), particular members (e.g., a member identified to the reviewer only as number 31415926 has not previously sent any communications that triggered a privacy review), particular communications (e.g., this email message was edited to its current form by the originating member after an automatic privacy review found an email address in the message body), privacy reviewers (e.g., communications from this member have been reviewed by seven different human privacy reviewers), and other pertinent information regarding privacy reviews.
In some embodiments, a member 100 has electronic access to a portion of the privacy review history 152, by email, web page access, voicemail, text message, or otherwise. The privacy review history 152 provided to the member may be considered part or all of the privacy review results 150. In addition to suggestions for changing a communication to make it less risky, the privacy review history 152 provided to the member may contain other data. For example, a read-only copy of a privacy review history given to a member could display the relative frequency with which human privacy reviewers as a group have treated a given item as a risk to the privacy of offline identities.
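The kinds of entries described above could be accumulated in a simple aggregating structure. This sketch of a history 152 is an assumption about representation, not the patent's design; it records per-trigger counts so a read-only view can show how often a given item has been treated as a risk.

```python
from collections import defaultdict

class PrivacyReviewHistory:
    """Illustrative aggregate of privacy review events (history 152)."""

    def __init__(self):
        self.trigger_counts = defaultdict(int)    # e.g. "Texas" -> 3
        self.reviews_by_member = defaultdict(int) # pseudonymous member id -> count

    def record_review(self, member_number, triggers):
        """Log one review; member_number is the id shown to reviewers, not an offline identity."""
        self.reviews_by_member[member_number] += 1
        for trigger in triggers:
            self.trigger_counts[trigger] += 1

    def trigger_frequency(self, trigger):
        """Fraction of all recorded reviews in which this trigger appeared."""
        total = sum(self.reviews_by_member.values())
        return self.trigger_counts[trigger] / total if total else 0.0
```

A read-only copy of such a structure could back the member-facing view mentioned above without exposing reviewer or member identities.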
In some embodiments, a processor coupled with memory executes instructions for the member 100 to electronically provide an authority within the online community 102 with an opinion 154 about the privacy review performed by the human privacy reviewer 106. The authority may be the online community services provider 104 or software for which that services provider is responsible, or the authority may be the human privacy reviewer, for example. The opinion may be requested and obtained without disclosing the reviewer's identity (online and/or offline) to the member. Likewise, in connection with privacy review results 150 or otherwise, in some embodiments a processor coupled with memory executes instructions for a member 100 to electronically receive a notice 142 that an identity (online and/or offline) of the member is kept hidden from a human privacy reviewer 106.
The opinion 154 may include a free-form comments section and/or a multiple-choice section, for example. A purpose of the opinion may be to reflect the extent to which the reviewed member (the member whose communication was subject to privacy review) believes that the suggestions and/or changes from the human privacy reviewer 106 actually reduce the risk of disclosure of the reviewed member's offline identity within the online community. Another purpose of the opinion may be to reflect ways in which the reviewed member believes that different edits to the communication would have been better.
Another purpose of the opinion 154 may be to provide a basis for confirming or modifying a privacy reviewer reputation summary 156. In some embodiments a processor coupled with memory executes instructions for a member to electronically receive a reputation summary of the human privacy reviewer, the reputation summary being a response to opinions 154 of online community members about privacy reviews performed by the human privacy reviewer. More generally, reputation software 164 solicits and receives opinions 154 and uses opinions 154 to calculate reputation summaries 156, which the reputation software 164 then makes available for use by members 100 and reviewers 106.
A reputation summary for a given human privacy reviewer may include one or more ratings, based on opinions 154 from reviewed members, indicating for example the speed, helpfulness, clarity, respectfulness, and/or credibility of privacy review results 150 generated by that human privacy reviewer. Comments from reviewed members may be included in some reputation summary embodiments.
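Reputation software 164 could reduce collected opinions to per-dimension average ratings along the lines of this sketch. The rating dimensions and the 1-to-5 scale are assumptions; the patent leaves the calculation method open.

```python
def summarize_reputation(opinions):
    """Average member opinions 154 into a reputation summary 156.

    Each opinion is assumed to be a dict mapping a rating dimension
    (e.g. "speed", "helpfulness") to a rating on a 1-5 scale.
    """
    totals, counts = {}, {}
    for opinion in opinions:
        for dimension, rating in opinion.items():
            totals[dimension] = totals.get(dimension, 0) + rating
            counts[dimension] = counts.get(dimension, 0) + 1
    # Per-dimension mean; dimensions a member skipped simply contribute nothing.
    return {d: totals[d] / counts[d] for d in totals}
```

Free-form comments could be carried alongside the numeric summary rather than folded into it.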
Minimum acceptable privacy reviewer ratings may be specified by a member in privacy reviewer selection criteria 158. In addition, or without specifying acceptable ratings, a member may specify other privacy reviewer selection criteria 158, such as a minimum level of experience as a privacy reviewer, (no) previous experience reviewing the member's communications 110, and/or (non)membership in the online community 102 or some identified group therein. In some embodiments, a particular privacy reviewer can be selected by a member, using an online identity of the privacy reviewer.
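Selection software 124 could match a member's criteria 158 against candidate reviewers roughly as follows. The candidate fields and criteria keys are hypothetical names introduced for illustration.

```python
def select_reviewers(candidates, criteria):
    """Filter candidate reviewers against selection criteria 158.

    Each candidate is assumed to be a dict with fields: 'rating',
    'reviews_done', 'reviewed_this_member_before', 'community_member'.
    """
    matches = []
    for c in candidates:
        if c["rating"] < criteria.get("min_rating", 0):
            continue  # below the member's minimum acceptable rating
        if c["reviews_done"] < criteria.get("min_experience", 0):
            continue  # not enough experience as a reviewer
        if criteria.get("exclude_prior_reviewers") and c["reviewed_this_member_before"]:
            continue  # member asked for reviewers new to their communications
        if criteria.get("must_be_community_member") and not c["community_member"]:
            continue  # member requires a reviewer from the online community
        matches.append(c)
    return matches
```

Contact management software 126 could then notify one of the matching reviewers, or the member could pick a specific reviewer by online identity.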
Some configurations include general protection software 160 such as encryption software, anti-phishing software, firewall software, anti-virus software, anti-adware software, and the like. General protection software may be used to further raise awareness of identity crimes and unwanted imposition on privacy. However, general protection software 160 is not specifically designed to help maintain the privacy of offline identities within an online community as described herein.
Some member systems 112, privacy reviewer systems 116, and/or receiver systems 118 will be configured with application software 162 such as word processors, email and instant messaging programs, and/or other applications that can be used to create, modify, transmit, store, retrieve, and/or otherwise manage electronic communications. However, application software 162 is not specifically designed to help maintain the privacy of offline identities within an online community as described herein.
Some service provider systems 114 are configured with identity and access control software 170, which manages online community member profiles 172 and usernames 174. For example, access control software 170 may require a password from a member 100 before allowing the member to read members-only postings or allowing the member to make a change in a profile of the member that is available through the online community.
Some embodiments include a configured computer-readable storage medium 130. In a computer system 112-118, disks (magnetic, optical, or otherwise), RAM, EEPROMs or other ROMs, and/or other configured storage media can be provided as part of working memory 122, and/or in addition to working memory 122. A general-purpose storage medium, which may be removable or not, and may be volatile or not, is configured with data structures and instructions to thereby form a configured medium which is capable of causing a system with a processor to perform steps and provide functionality disclosed herein.
For example, a system may be configured with data such as privacy review results 150, privacy reviewer reputation summaries 156, privacy review histories 152, privacy reviewer selection criteria 158, and member opinions 154 of privacy reviews.
Also, a system may be configured with instructions capable of performing functions such as selecting a privacy reviewer (e.g., with software 124), contacting a privacy reviewer with pertinent information such as the electronic communication to be reviewed (e.g., with software 126), and generating privacy reviewer reputation summaries (e.g., with software 164).
Configuring a system with such data and/or such instructions creates a special-purpose system which accepts input representing items outside the system, and transforms that input to provide useful and concrete results that help reduce the offline identity disclosure risk. FIG. 1 helps illustrate configured storage media embodiments, as well as system embodiments, process product embodiments, and method embodiments. A configured medium may also be considered an article of manufacture and/or a process product, produced using for example steps shown in FIGS. 2-4.
Some embodiments include a computer system such as system 114 which is configured for use by a service provider such as an online community service provider 104 to help maintain the privacy of offline identities 140 of members 100 of an online community 102. The system includes a memory 122 configured with computer-executable instructions. A processor 120 in the system is coupled with the memory and executes the instructions.
For example, in some embodiments the system 114 obtains an electronic communication 110 which involves a member 100 of the online community 102. The member in question is involved in that the member is a sender and/or an intended receiver of the electronic communication. The communication may be obtained, for example, by making a copy in memory after giving appropriate notice to the member 100.
The system 114 determines that the electronic communication 110 should be submitted to a privacy review to assess the extent to which the electronic communication discloses the member's offline identity 140. A determination may be made by scanning for privacy concern triggers, by noting a standing instruction from the user, by noting an alert placed by the service provider 104 or by an authorized privacy reviewer, and/or in some other manner. Some embodiments detail the determination criteria in a notice 142 to the member. Some embodiments indicate the basis of the determination in a privacy review history or a comment attached to the communication.
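The determination step could combine those signals roughly as in the sketch below. The signal names and the precedence among them are assumptions made for illustration; the patent only requires that a determination be made in some manner and that its basis can be recorded.

```python
def should_submit_for_review(communication, member_settings, alerts, triggers):
    """Decide whether a communication 110 goes to privacy review.

    Returns (decision, basis) so the basis can be recorded in a privacy
    review history or stated in a notice 142 to the member.
    """
    if member_settings.get("standing_review_instruction"):
        return True, "standing instruction from the member"
    if communication["id"] in alerts:
        return True, "alert placed by service provider or authorized reviewer"
    if triggers:  # e.g. the output of automatic trigger scanning
        return True, "privacy concern triggers found: " + ", ".join(triggers)
    return False, "no review basis"
```

Keeping the basis string alongside the boolean makes it straightforward to attach the explanation to the communication as a comment.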
Not every item shown in FIG. 1 need be present in every system embodiment or in every configured medium embodiment. Although implementation possibilities are illustrated here in text and drawings by specific examples, other embodiments may depart from these examples. For instance, specific features of an example may be omitted, renamed, grouped differently, repeated, instantiated in hardware and/or software differently, or be a mix of features appearing in two or more of the examples.
Methods
FIGS. 2-4 illustrate some method embodiments. In a given embodiment zero or more illustrated steps of a method may be repeated, perhaps with different parameters or data to operate on. Steps in an embodiment may also be done in a different order than the top-to-bottom order that is laid out in the Figures. Steps may also be omitted, combined, or otherwise depart from the illustrated flow, provided that the method performed is operable and conforms with at least one claim.
FIG. 2 shows a flow chart 200 illustrating steps of some method and configured storage medium embodiments from a point of view of a member 100 of an online community 102, for example.
Actions by a member discussed herein may equivalently be considered actions by software and hardware for which the member is responsible, e.g., by a system over which the member has control, and vice versa. The same holds true of actions by a service provider, by a human privacy reviewer, or by a human receiver of a communication. That is, a system of hardware and software, a system of hardware, and a system of software, may each be deemed an agent or alter ego of a human who controls that system.
As indicated by steps 202 and 204, the member has one or more online identities 138 and one offline identity 140, respectively. In particular, a member may have 202 an online identity which is published within an online community 102, and may also own 204 an offline identity which is not published within the online community. The member 100 may have online identities 138 in the form of usernames, avatars, personal web pages, and other online data which reflects aspects of the member's activities and preferences.
Online identity is generally under at least partial control of the member, and in many cases is under complete, or nearly complete, control of the member, e.g., by setting profile information and choosing email addresses. Indeed, a member may choose to have more than one online identity within a given online community.
By contrast, the offline identity of a given member can be considered unique. However, this is a definitional preference, not a requirement in every embodiment. One could also define offline identities 140 according to time periods in the member's life, for example, or roles played by the member in the offline world, e.g., at home versus at work. Online identities can, however, provide some anonymity which is rarely if ever provided by offline identities.
During one or more notice receiving steps, a member receives electronically a notice 142. A given notice 142 may be triggered by an event such as admission to membership in an online community, creation of an electronic communication 110 by a member, updates sent to the online community membership generally, or receipt of a privacy review result 150 by a particular member. Several notice steps 206-212 are illustrated in FIG. 2; zero or more of these steps and/or other notice steps may be part of a given method.
During an automatic scanning notice receiving step 206, a member 100 receives a notice 142 indicating that a privacy review is based at least in part on automatically scanning an electronic communication 110 before delivery of the electronic communication to an intended receiver. For example, an electronic communication involving the member can be (or in some cases, will be) automatically scanned for privacy concern triggers 146. The notice given may include examples of privacy concern triggers that will be detected automatically.
During a privacy review basis notice receiving step 208, a member 100 receives a notice 142 listing one or more grounds on which a privacy review will be made, automatically and/or by a human privacy reviewer. The notice given may include examples of privacy concern triggers 146, excerpts from privacy reviewer selection criteria 158, and/or excerpts from a privacy review history 152. The notice may also include a statement from a human privacy reviewer 106 explaining the reviewer's goals and approach during privacy reviews.
In particular, a member may electronically receive a notice 142 that a privacy review is based at least in part on scanning an electronic communication 110 for at least one of the following: personal name, family name, phone number, offline address, online address, geographic name, landmark name, a question seeking geographic information, a statement containing geographic information, a question seeking employment information, a statement containing employment information, and/or an indication of gender, race, ethnicity, age, title, or profession.
During a privacy reviewer notice receiving step 210, a member 100 receives a notice 142 about a human privacy reviewer 106. The notice given may include an indication of whether the human privacy reviewer is a member of the online community 102 to which the member also belongs, a reputation summary 156 pertaining to the privacy reviewer, and/or any of the information discussed in connection with privacy review basis notice receiving step 208.
During an identity secrecy notice receiving step 212, a member 100 receives a notice 142 about identity secrecy. The notice given may include an indication that the member's offline and/or online identity(ies) will not be disclosed to a human privacy reviewer 106. Likewise, the notice given may include an indication that the human privacy reviewer's offline and/or online identity(ies) will not be disclosed to the member.
During a privacy reviewer selection criteria specifying step 214, a member 100 specifies, through a user interface or by accepting default values, criteria for selecting a human privacy reviewer 106 to perform a privacy review of an electronic communication 110 involving the member. In general, a member consents to a privacy review conditioned on specified privacy reviewer selection criteria 158 being met.
At one extreme, in some embodiments the member may specify that only automatic scanning may be used to perform privacy reviews, in which event no human privacy reviewer meets the selection criteria. In other embodiments and/or other circumstances, the member may specify a particular human privacy reviewer 106, either by that reviewer's username or by selecting a reputation summary 156 of that reviewer in a context that requires or invites selection of a human privacy reviewer.
In some cases, a member electronically specifies 214 at least one geographic region as a privacy reviewer selection criterion, for example. The member electronically manifests consent to privacy review by a human privacy reviewer who resides outside the specified geographic region(s), thereby reducing the chance that the privacy reviewer and the member will ever meet in person offline.
During a privacy review trigger specifying step 216, a member specifies conditions that will trigger a privacy review. For example, triggers may be set such that privacy review may be performed on all outgoing communications 110, on all incoming communications 110, or both. Privacy review may be performed on communications sent by the member to, and/or those received by the member from, specified populations. Whitelists, blacklists, histories, and/or usernames, for example, may be used to define the populations that trigger privacy review. For instance, a member may specify 216 that all communications 110 sent by the member to anyone never previously written to by the member, or who has not written the member in the past six months, or whose username is Fubar, should be subject to privacy review. In general, communications that are subject to privacy review are also subject to possible editing (including cancellation of the communication) by the member before being delivered to the destination(s) chosen by the member.
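Member-specified triggers of the kind just described (new correspondents, long-dormant contacts, specific usernames) can be sketched as a simple predicate. This is an illustrative sketch, not code from the specification; the parameter names, the six-month window expressed as 182 days, and the datetime-keyed history structure are all assumptions.

```python
from datetime import datetime, timedelta

def needs_privacy_review(recipient, history, blacklist, now=None):
    """Return True if a communication to `recipient` should trigger review.

    `history` maps usernames to the datetime of the member's last exchange
    with that user; `blacklist` is a set of usernames (e.g. "Fubar") that
    always trigger review.
    """
    now = now or datetime.utcnow()
    if recipient in blacklist:            # explicitly listed username
        return True
    last = history.get(recipient)
    if last is None:                      # never previously written to
        return True
    if now - last > timedelta(days=182):  # no contact in roughly six months
        return True
    return False
```

A provider could evaluate this predicate for each outgoing communication and route matching ones into the review pipeline.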
During a consent manifesting step 218, a member electronically manifests consent to at least one privacy review, to be performed automatically and/or by a human privacy reviewer, of a communication involving the member. When a member electronically manifests a consent 136 to a privacy review of an electronic communication 110, that electronic communication has not necessarily been created yet. The member is a sender and/or an intended receiver of the electronic communication after the electronic communication is created.
In some cases, a stated goal of the privacy review is a lowered risk of disclosure of a member's offline identity within the online community 102. In other cases, goals of a privacy review are not expressly stated in connection with a corresponding consent 136.
Consent to an action, as used herein, includes consent prior to the action, ratification of the action after it was performed, or both. Consent may be manifested during initial registration of a member 100 with an online community, and/or consent may be sought from the member on a communication-by-communication basis when privacy review is triggered automatically, for example. Consent for privacy reviews may not be legally required in every jurisdiction, but consent may be sought nonetheless out of respect for members and as a good business practice. A button press, an email, and/or any other tool used to form a contractual agreement or to obtain permission in other contexts may be adaptable for use in the context of obtaining a member consent 136 to a privacy review.
An appropriate notice 142 may be given to expressly inform a member about the privacy review activity for which consent is being sought. In some cases, for example, the member electronically manifests consent 218 specifying that website addresses and email addresses in electronic communications are subject to privacy review. In some cases, the member electronically manifests consent specifying that offline addresses in electronic communications are subject to privacy review, and/or specifying that images in electronic communications are subject to privacy review.
During a consent manifesting step 220, a member electronically manifests consent to at least one privacy review to be performed by a human privacy reviewer. Accordingly, consent manifesting step 220 is a special case of consent manifesting step 218, because step 218 contemplates consent to automatic and/or human privacy review. In some cases, the member electronically receives a notice 142 that privacy review includes submission of the electronic communication to a human privacy reviewer if a privacy concern trigger is found by automatic scanning of an electronic communication, and the consent is manifested 220 based on the notice.
During a reputation summary receiving step 222, a member receives electronically (e.g., by email, web page viewing, voicemail, text message, etc.) information from a reputation summary 156 of a privacy reviewer. The privacy reviewer whose reputation is summarized and presented to the member may be a human privacy reviewer or an automatic privacy reviewer.
During a privacy review result obtaining step 224, a member obtains electronically a result 150 of a privacy review of an electronic communication. Privacy review results 150 may include, for example, a copy of a reviewed communication with portions highlighted and annotated to explain privacy risks found in the communication 110 during the review.
As an example, an incoming message and a draft response submitted to privacy review might be annotated as shown below. In this example, which is merely one of many possible different examples, privacy reviewer comments are enclosed in braces { }.
To: Pumpkin Farmer
From: Market Manager
Subject: Great Pumpkins!
Hi! I saw your pumpkin photos on the www waytoomanythingsforsale com website {RISK: website address named} and got your email address from there. Do you live anywhere near West Smallishton? {RISK: city or town named} I would like to discuss some ways to help you increase your sales. We could meet by the Seven Squash Fountain downtown, Friday evening. I look forward to meeting you! Anxiously awaiting your reply, MM.
From: Pumpkin Farmer
To: Market Manager
Subject: RE: Great Pumpkins!
Sorry, but I only sell pix. My actual pumpkins themselves are not for sale. You might be a nice person, but if you are you should stay away from the Seven Squash Fountain because it has been a dangerous place ever since they built the Flack Pit next door. Sincerely, PF.
In this particular example, only a town and a website were flagged as privacy concerns by hypothetical automatic scanning software 144. In other examples, particularly if a privacy review were performed by a human privacy reviewer 106, an additional risk could be identified in that the first message above asks for information about where the recipient lives. The references to Seven Squash Fountain and to the Flack Pit could also be identified as risky, because the reply message indicates familiarity with those locations and hence discloses offline identity information about the user who is named Pumpkin Farmer. Specifically, Pumpkin Farmer's evident knowledge of the Seven Squash Fountain and the Flack Pit suggests that Pumpkin Farmer lives somewhere in their general vicinity, or at least has visited that vicinity not long ago. Thus, offline identity information about Pumpkin Farmer is at risk of being disclosed.
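A hypothetical version of the automatic scanning software 144 that produces reviewer-style {RISK: ...} annotations like those in the example could be sketched as follows. The regular expressions and risk labels here are assumptions chosen only to reproduce that example; a real scanner would draw its patterns from a trigger database 148.

```python
import re

# Each pattern is paired with the risk label to append after a match.
# These two patterns are illustrative placeholders, not a complete trigger set.
RISK_PATTERNS = [
    (re.compile(r"\bwww\s+\S+\s+com\b"), "website address named"),
    (re.compile(r"\bWest Smallishton\b"), "city or town named"),
]

def annotate(text):
    """Return `text` with a {RISK: ...} comment inserted after each trigger."""
    for pattern, label in RISK_PATTERNS:
        # Default argument binds the current label for use inside the lambda.
        text = pattern.sub(
            lambda m, l=label: m.group(0) + " {RISK: " + l + "}", text)
    return text
```

The annotated text could then be returned to the member as part of the privacy review results 150.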
During an editing step 226, a member edits a communication 110 in response to a privacy review result 150. This may be performed using the same tools 162 (email program, instant messaging program, voice command interface, etc.) that were used to initially create the communication. Risk is generally lowest if the member edits the electronic communication after obtaining the privacy review and before delivery of the electronic communication to any intended receiver, but editing in other circumstances may still be desirable.
As an example, the draft reply from Pumpkin Farmer shown above might be edited in a variety of different ways. An extreme case of editing is complete deletion; the reply could be canceled and never sent to Market Manager. Alternately, the reply could be sent, but without any geographic reference made, e.g., after editing 226 the draft reply to delete the sentence that refers to Seven Squash Fountain and the Flack Pit.
During a communication submitting step 228, a member submits a communication 110 to a privacy review. In some cases a member edits 226 an electronic communication 110 after obtaining the privacy review results 150 and then electronically submits 228 the edited electronic communication 110 for privacy review. In some cases, the submission 228 occurs without a previous privacy review. In some cases, a privacy review has been done, but no edits 226 were made before a re-submission 228 of the unchanged communication 110 because the member accepts the risks of identity disclosure posed by the communication and/or because the member disagrees with the privacy review's assessment of disclosure risks.
Submission 228 (including resubmission) may be done by expressly asking for a privacy review of a given communication 110, or by expressly consenting to a dialog box that asks whether a privacy review can be done on the communication in question. Submitting a communication for privacy review may also be implicit in submitting it to communications transmittal software in an online community. That is, asking a service provider 104 to send a communication 110 may include, based on a prior consent 136, an inherent request that the communication also be submitted 228 to a privacy review and that review results 150 and an opportunity to edit 226 the communication be given to the member who wrote the communication before any version of the communication is actually delivered to the intended receiver.
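The implicit-submission idea, where asking the provider to send a communication first routes it through privacy review based on a prior consent 136, can be sketched as a small send pipeline. All function and field names below are hypothetical.

```python
def send(communication, member, scan, deliver, return_results):
    """Deliver `communication`, first submitting it to review if consented.

    `scan` performs automatic scanning and returns a list of risks found;
    `deliver` transmits to the intended receiver; `return_results` hands
    review results back to the member for possible editing.
    """
    if member.get("consented_to_review"):
        risks = scan(communication)
        if risks:
            # Hold the communication: the member gets the results and an
            # opportunity to edit before anything reaches the receiver.
            return_results(member, risks)
            return "held_for_member"
    deliver(communication)
    return "delivered"
```

Here delivery happens only when no consent is on file or the scan finds nothing, matching the idea that review and editing precede delivery.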
During an opinion providing step 230, a member 100 provides an opinion 154 about a privacy review result 150 and/or about a privacy reviewer 106. The opinion 154 may be in the form of an email, a survey response, a dialog box response, or another electronic form, for example.
During a participating step 232, a member participates in an online community 102 by submitting electronic posts, sending/receiving other electronic communications, and so on. For example, a social network 102 may be organized into groups based on shared interest in a given topic and/or based on questions of the form “Looking for advice on——————” or “Has anyone ever——————?”. Participation 232 is limited to members of the online community 102.
During an identity secrecy protecting step 234, a member 100 takes steps to keep secret the member's offline identity. For example, the member may submit 228 communications that involve the member for privacy review, may seek or otherwise consent 218 to privacy review, may select a username that does not share any semantic content with the member's offline name, may scrutinize steps taken by the online community to protect member offline identities, and so on.
Some of the steps shown in FIG. 2 may be performed during registration of new members, or even earlier during marketing of an online community 102. Some examples include steps such as receiving 206-212 a notice, manifesting 218, 220 a consent, owning 204 an offline identity, ratifying data, and so on. The term “member” as used herein with respect to such steps should be understood to include not only current members 100 of an online community 102 but also prospective members 100 who express interest in joining the online community 102, and in-process-of-registration members 100 who are in the process of joining the online community 102.
FIG. 3 shows a flow chart 300 illustrating steps of some method and configured storage medium embodiments from a point of view of a service provider who facilitates an online community. Methods illustrated in FIG. 3 may help service providers 104 and others maintain the privacy of offline identities of members of an online community.
During a notice providing step 302, a service provider 104 provides one or more notices 142 to one or more members 100 of an online community that is being served by the service provider. Notice providing step 302 in FIG. 3 corresponds generally to notice receiving steps 206-212 and other notice receiving steps not detailed in FIG. 1, except that notice providing step 302 is performed by a service provider whereas notice receiving steps are performed by a member.
During a concerns identifying step 304, a service provider 104 identifies one or more privacy concern triggers 146. The privacy concern triggers may be set using values specified 216 by one or more members 100 and/or by using additional or alternate values. The privacy concern triggers may be identified expressly by listing them and/or they may be identified implicitly by identifying a database 148 which lists them, such as a database of personal names, place names, email addresses, domain names, business names, and so on. Although a database 148 is shown on a member computer system 112 in FIG. 1, the database(s) used to identify privacy concern triggers may reside anywhere they are accessible to the scanning software 144.
During a consent obtaining step 306, a service provider 104 obtains at least one consent from at least one member for at least one privacy review of at least one electronic communication 110. Consent obtaining step 306 corresponds generally to consent manifesting steps 218, 220, except that consent obtaining step 306 is performed by a service provider whereas consent manifesting steps 218, 220 are performed by a member.
During a communication obtaining step 308, a service provider 104 obtains at least one electronic communication 110 involving at least one member 100. As indicated in FIG. 3, the electronic communication 110 may have been edited 226 in response to a privacy review result 150. In some cases, a system 114 obtains 308 an electronic communication before that communication has been received by at least one intended receiver 108 of the electronic communication; in some cases, the communication is obtained 308 before it has been received by any of several intended receivers. Communication obtaining step 308 corresponds generally to communication submitting step 228, except that communication obtaining step 308 is performed by a service provider whereas communication submitting step 228 is performed by a member.
During a determining step 310, a service provider 104 determines that an electronic communication 110 should be submitted to a privacy review. The privacy review may be done to assess the extent to which the electronic communication discloses or seeks disclosure of a member's offline identity. For a given communication 110, this determination 310 may correspond generally to a consent manifesting step 218, for example, if all communications meeting specified criteria (e.g., all outgoing emails) are subject to privacy review and the communication 110 in question satisfies the specified criteria. Alternately, or in addition, a system may determine 310 that a communication 110 is subject to a privacy review if the communication 110 is randomly selected, e.g., under a spot-checking approach. Alternately, or in addition, a system may determine 310 that a communication 110 is subject to a privacy review if automatic scanning of the communication 110 identifies a specified set of one or more privacy concern triggers 146 in the communication 110.
During a delivering step 312, a service provider 104 delivers an electronic communication 110 to an intended receiver 108. For example, an email may be placed in the receiver's in-box, a web page may be displayed on the receiver's screen, or a voicemail or text message may be placed in memory and the receiver notified that a message is waiting to be heard or seen. An instance of delivering step 312 may occur regardless of a privacy review or lack thereof, and regardless of editing or lack thereof in response to a privacy review.
During a scanning step 314, software and/or hardware 144 under the custody or control of a service provider 104 automatically scans an electronic communication to detect one or more privacy concern triggers 146. In some systems, a processor 120 coupled with memory 122 executes instructions for automatically scanning 314 an electronic communication 110 to detect whether the communication contains any of the following privacy concern triggers 146: personal name, family name, phone number, offline address 166, online address 166, geographic name, landmark name, indication of gender, indication of race, indication of ethnicity, indication of age, title, profession, an image 168, a link to an image 168.
As with other steps discussed herein, results of this scanning step configure a memory 122 and may subsequently influence the execution of other steps. For instance, scanning results may be sent to a member 100 in a notice of privacy review basis or as part of privacy review results.
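One minimal way to implement the scanning step 314 is to map each trigger category to a pattern and report which categories match. The three patterns below are simplistic placeholders standing in for a few of the listed trigger types; a production scanner would use a names database 148 and far richer matching.

```python
import re

# Illustrative trigger categories; real deployments would cover the full
# list (personal names, geographic names, images, etc.) with better patterns.
TRIGGERS = {
    "phone number": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "online address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "age": re.compile(r"\b\d{1,2} years old\b"),
}

def detect_triggers(text):
    """Return the sorted list of trigger categories found in `text`."""
    return sorted(name for name, rx in TRIGGERS.items() if rx.search(text))
```

A non-empty result could feed the determining step 310 or the decision to involve a human privacy reviewer 106.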
During a human privacy reviewer selecting step 316, a service provider 104 selects a human privacy reviewer 106 to perform a privacy review of an electronic communication 110. In some systems, a processor 120 coupled with memory 122 executes instructions for automatically selecting a human privacy reviewer 106 to perform a privacy review of the electronic communication 110. In some cases, the selecting step 316 selects a human privacy reviewer from among members of the online community 102.
Selection 316 may be random, round-robin, or according to some other approach within a pool of available human privacy reviewers. In some embodiments candidates may be placed in the pool only if they meet privacy reviewer selection criteria 158 specified by a member who is involved with the communication, e.g., as the creator of the communication. For example, privacy reviewer selection criteria 158 provided by the member may include one or more of the following: a geographic criterion pertaining to human privacy reviewer offline geographic location, a profile criterion pertaining to human privacy reviewer online profile content, an activity criterion pertaining to human privacy reviewer online activity.
In some cases, the selecting step 316 selects the human privacy reviewer 106 based at least in part on instances in which the human privacy reviewer has previously reviewed an electronic communication 110 (whether it be the communication presently in question, or another communication) that involves the member.
In some cases, the selecting step 316 selects the human privacy reviewer 106 based at least in part on a reputation summary 156 of the human privacy reviewer, the reputation summary being a response to opinions 154 of online community members about privacy reviews performed by the human privacy reviewer 106.
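Selection 316 under member-supplied criteria 158 might be sketched as filtering a reviewer pool and then choosing at random. The record fields (`region`, `reputation`) stand in for the geographic and reputation-based criteria discussed above and are assumptions, not fields named in the specification.

```python
import random

def select_reviewer(pool, excluded_regions, min_reputation, rng=random):
    """Pick a reviewer outside the member's excluded regions whose
    reputation summary meets the member's threshold, or None if the
    filtered pool is empty."""
    candidates = [
        r for r in pool
        if r["region"] not in excluded_regions
        and r["reputation"] >= min_reputation
    ]
    return rng.choice(candidates) if candidates else None
```

Round-robin or history-aware selection would replace the final random choice with a different picking rule over the same filtered pool.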
During a human privacy reviewer contacting step 318, a service provider 104 contacts a human privacy reviewer 106, e.g., by sending the privacy reviewer a copy of an electronic communication 110 to be privacy reviewed, or by a message asking whether the privacy reviewer is available to perform a privacy review. Contact with the reviewer 106 may be made through email, voicemail, text message, instant message, or other electronic communication. In some systems 114, a processor 120 coupled with memory 122 executes instructions for automatically contacting 318 a human privacy reviewer 106 to request a privacy review of an electronic communication 110 by the human privacy reviewer.
During a reviewer selection criteria using step 320, a service provider 104 uses privacy reviewer selection criteria 158. Such criteria may be used, for example, when determining 310 whether a communication 110 should be submitted for a privacy review, or when selecting 316 a human privacy reviewer. Privacy reviewer selection criteria 158 may also be used as examples when documenting or otherwise describing privacy protection services and related aspects such as notices 142, consents 136, privacy concern triggers 146, member opinions 154, privacy reviewer reputation summaries 156, privacy review results 150, and privacy review histories 152.
During a communication submitting step 322, a service provider 104 submits an electronic communication 110 to a human privacy reviewer for a privacy review. The entire body of the communication 110 may be submitted 322. In some cases a smaller portion of the communication may be submitted, e.g., any sentence containing a trigger found by automatic scanning 314 could be submitted 322 without giving the reviewer 106 the rest of the communication 110.
During an identity withholding step 324, a service provider 104 withholds identity information. For example, the service provider 104 may withhold 324 online identity information 138 of the member 100 whose communication 110 is being reviewed from the human privacy reviewer 106, may withhold offline identity information 140 of the member 100 from the human privacy reviewer 106, may withhold online identity information 138 of the human privacy reviewer 106 from the member 100, and/or may withhold offline identity information 140 of the human privacy reviewer 106 from the member 100. Withholding 324 information may be implemented in some cases by not supplying the information. Withholding 324 information may also include securing the information by file system access controls, password requirements, encryption, separate storage requiring physical action by a human for access, and/or other security tools and techniques.
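Withholding 324 by simply not supplying identity information can be sketched as a provider-side mapping from opaque case tokens to members, so the reviewer sees only a token and the member never sees the reviewer. This is one illustrative design, not one mandated by the text; the class and method names are hypothetical.

```python
import secrets

class IdentityWithholder:
    """Provider-side mapping that keeps member identity out of review traffic."""

    def __init__(self):
        self._token_to_member = {}

    def submit_for_review(self, member_username, communication):
        """Return a (token, communication) pair safe to hand to a reviewer."""
        token = "case-" + secrets.token_hex(4)
        self._token_to_member[token] = member_username
        return token, communication

    def route_results(self, token):
        """Resolve a review token back to the member, provider-side only."""
        return self._token_to_member[token]
```

Encryption and access controls, as mentioned above, would protect the token-to-member table itself.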
During an opinion requesting step 326, a service provider 104 electronically requests from a member 100 an opinion 154 about a particular privacy review, about a set of privacy reviews, about a particular privacy reviewer (human or automated), and/or about a set of privacy reviewers. Opinion requesting step 326 corresponds generally to opinion providing step 230, except that opinion requesting step 326 is performed by a service provider whereas opinion providing step 230 is performed by a member.
During a using/updating step 328, a service provider 104 electronically updates and/or otherwise uses a privacy reviewer reputation summary 156. For example, the service provider 104 may update the reputation summary in response to an opinion 154 received from a member 100; may display the reputation summary to a member 100 or to a human privacy reviewer 106; or may compare the reputation summary to privacy reviewer selection criteria 158.
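Updating a reputation summary 156 in response to each member opinion 154 could be as simple as maintaining a running average of opinion scores. The numeric score scale and the summary fields below are assumptions made for illustration.

```python
def update_reputation(summary, score):
    """Return a new summary dict with `score` folded into the running average.

    `summary` holds the opinion count and current average; the incremental
    formula avoids storing every individual opinion.
    """
    count = summary["count"] + 1
    average = summary["average"] + (score - summary["average"]) / count
    return {"count": count, "average": average}
```

The resulting summary could then be displayed to members, shown to the reviewer, or compared against selection criteria 158 during the selecting step 316.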
During a communication reducing step 330, a service provider 104 electronically reduces access to communication with a member 100 in an online community 102. Reduction may be done at least partially in response to repeated determinations 310 that electronic communications 110 which involve the member 100 in question should be submitted to a privacy review. Reduction may also be done in response to an express request by the member 100, e.g., the member may request automatic filtering out of any privacy concern triggers 146 detected by scanning software 144. Reduction 330 may reduce deliverable communication 110 content and/or reduce the set of persons involved in a delivered communication as sender or receiver. Eliminating communication is within the scope of reducing it.
During a username changing step 332, a service provider 104 electronically changes a username of a member 100 in an online community 102. The new username may be automatically sent to some but not all of the persons with whom the member 100 has previously communicated electronically, with the exception, for example, of persons who have previously requested offline identity information 140 from the member in scanned 314 and/or human privacy reviewer-reviewed communications 110. The username may be changed 332 at least partially in response to repeated determinations 310 that electronic communications which involve the member should be submitted to a privacy review.
FIG. 4 shows a flow chart 400 illustrating steps of some method and configured storage medium embodiments from a point of view of a human privacy reviewer who reviews an electronic communication for an online community.
During an electronic communication receiving step 402, a human privacy reviewer 106 receives an electronic communication 110 to be reviewed for risks of offline identity privacy disclosure. The points of view differ, but electronic communication receiving step 402 corresponds generally with electronic communication submitting steps 228 and 322.
During a privacy review performing step 404, a human privacy reviewer 106 performs a privacy review of an electronic communication 110. A privacy review may include scanning the electronic communication for privacy concern triggers specified by the privacy reviewer and/or by others, using privacy review software 128. Additionally, or instead, a privacy review may include reading the electronic communication 110 and bringing to bear on its content as understood by the reviewer 106 the reviewer's knowledge of natural languages, human culture, and human behavior, to identify privacy concerns while guided by a desire to reduce or prevent disclosure of offline identity information 140.
During a reduced concern version creating step 406, a human privacy reviewer 106 creates a version of an electronic communication 110 designed to reduce concerns of offline identity disclosure. The reduced concern version created 406 may be an annotated, redacted, and/or otherwise edited version of the electronic communication 110 in question. The reduced concern version may be created 406 using a word processor, macros that search for and highlight privacy concern triggers, comparison software such as diff, and/or other privacy review software 128. The reduced concern version may be subsequently obtained 224 as privacy review results 150 by the member 100 whose electronic communication 110 was used as a basis for the reduced concern version.
During a privacy review comment writing step 408, a human privacy reviewer 106 writes comments directed toward a member 100 whose electronic communication 110 is being privacy reviewed. The comments may explain specific suggestions or edits made 406 by the human privacy reviewer 106. The comments may be concatenated onto, embedded within, or otherwise associated with the electronic communication 110 being discussed by the comments, and may be subsequently obtained 224 as privacy review results 150 by the member 100 whose electronic communication 110 is being privacy reviewed.
During an opinion getting step 410, a human privacy reviewer 106 gets an opinion 154 from a member 100 regarding a privacy review performed by the privacy reviewer 106 and/or regarding the privacy reviewer 106. The points of view differ, but opinion getting step 410 corresponds generally with opinion requesting step 326 and opinion providing step 230, except that in some embodiments human privacy reviewers do not get individual opinions but instead see only reputation summaries 156 that are based on multiple opinions.
During a reputation scrutinizing step 412, a human privacy reviewer 106 scrutinizes a reputation summary 156. Scrutinizing opinions 154 and/or reputation summaries 156 may help human privacy reviewers 106 gain better understanding of how their privacy review efforts are perceived by reviewed members 100.
During a reputation appealing step 414, a human privacy reviewer 106 appeals to a service provider 104 seeking correction or clarification of some aspect of a reputation summary 156. An appeal process may be supported by privacy review software 128.
During a participating step 416, a human privacy reviewer 106 participates in an online community 102 as a member 100 of that online community; this corresponds with step 232. Human privacy reviewers 106 may or may not be members of online communities for which they perform 404 privacy reviews.
Additional Examples
Some possible embodiments provide new social networking tools and techniques, and in particular, new tools and techniques for facilitating social networks in which members meet online but face little or no risk of ever meeting offline. Some of these possible embodiments include features beyond the privacy review features discussed above. Privacy review features and other features are discussed below in connection with various “embodiments” but it will be understood that a claim defines what actually constitutes an embodiment of that claim, so features discussed in examples should not necessarily be read into a given claim.
Some embodiments may help encourage and support online communities which have an ethos of members providing other members with anonymous help based on candid disclosure of opinions and social facts online, with little risk that the disclosures will lead to unwanted or complicated offline interaction. Embodiments may operate online communities through websites under domains containing marks such as “NeverMeet”, “NoFaces”, “FriendlyStrangers”, “SmallWorld”, or the like, depending on the legal availability of such domains and marks.
Some approaches described herein run counter to an assumption that social networking sites should help people meet each other in person. Instead, some embodiments take the approach that an online version of a “strangers in a bar” conversation can be worthwhile. People may be more candid in seeking—and giving—life advice, for instance, if they know they'll never meet in person. Other interactions may also be less inhibited. It may also be helpful for conventional matchmaking sites to offer subscribers a practice forum in which they converse with people whose actual identity they will almost certainly never learn, who will almost certainly never learn their identity, and whom they will almost certainly never meet in person (intentionally or even by accident).
In some embodiments, social network member geographic locations are obtained or approximated, and that geographic information is used to limit online interaction in order to reduce the risk that members who interact online will meet (accidentally and/or intentionally) offline.
For example, in some embodiments, a member can specify one or more geographic areas to be avoided by the system when the system is determining which other members should be able to contact this member. In one simple case, a member who lives in city F can tell the system to avoid allowing that member contact with other members who also live in F. Depending on the implementation, the territories to avoid may be landmarks (Eiffel Tower, . . . ), cities, counties, provinces, states, regions, nations, and/or continents, for instance. A time zone is another example of a geographic region. Territories may be predefined, and accessed through a menu.
In some embodiments, a social networking system may help reduce or prevent online contact between members whose avoidance areas overlap. Thus, if member A says to avoid areas X, Y, Z, and member B says to avoid areas R, S, X, and member C says to avoid areas R, S, T, and member D says to avoid area W, then the social network operates to reduce or eliminate/prevent online interaction (within the social network's virtual community(ies)) between A and B, and between B and C, and it operates to allow (or even encourage) online interaction between A and C, A and D, and B and D. As another example, if Bob lives in California and travels (or plans to travel) to Canada, and Pat lives in Oregon and does not travel, then Bob could list avoidance areas California and Canada, and Pat could list avoidance area Oregon. The system would then allow (or encourage) online interaction between Bob and Pat, because—based on the avoidance areas they specified—there is little risk they will ever be in the same geographic area, and hence little risk they will ever meet offline. By contrast, if Pat listed California in addition to listing Oregon, then the system would take steps to limit or prevent online interaction between Pat and Bob, because their avoidance areas (a.k.a., their personal territories, or their safety zones) overlap.
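A minimal sketch of this overlap rule, using Python sets; the member labels and territory names are the hypothetical ones from the example above, and the function name is illustrative rather than part of any described system:

```python
def may_interact(avoid_a, avoid_b):
    """Return True when two members' avoidance areas share no territory.

    avoid_a, avoid_b: sets of territory names each member asked to avoid.
    Overlapping avoidance areas mean the members might be in the same
    place offline, so online interaction between them is limited.
    """
    return not (set(avoid_a) & set(avoid_b))

# The example members from the text:
avoidance = {
    "A": {"X", "Y", "Z"},
    "B": {"R", "S", "X"},
    "C": {"R", "S", "T"},
    "D": {"W"},
}
```

With these lists, A and B share territory X and B and C share R and S, so those pairs are kept apart, while A-C, A-D, and B-D may be introduced.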
Some embodiments require that a member specify at least N personal territories, and/or that the member specify a combination of personal territories that satisfies some geographic size requirement. For instance, a member might be required in one implementation to specify at least three personal territories, or to specify at least two territories which are each at least the size of Switzerland, or which meet some minimum combined population total, e.g., territories containing at least 50 million people.
In some embodiments, virtual community cultural pressure, community website contractual terms of use, and/or other similar tools are used to encourage or legally require members to specify a personal territory that includes their current residence. In some embodiments, as an alternative or in addition, tools such as geolocation software or correlation with a payment database are used to identify the apparent approximate geographic location of the computer or other device being used by a member to access the online community, and that geographic region is included (visibly to the member in some cases, invisibly in others) among the member's personal territories. In some embodiments, a member's list of personal territories is private to the member—it is used by the system internally, but is not made visible to other members.
A geographic territory is normally a characteristic of the member who specifies it, at least as to the territory in which the member resides. Other avoidance criteria, however, need not apply to the member who specifies them. A member can ask to avoid communication with members who have a particular profession, for instance, without also being a member of that profession.
In some embodiments, a member can specify avoidance criteria that are not geographic in addition to, or instead of, specifying the geographic territories to avoid. For example, a physician who is an expert in some medical field may tell the system to help her avoid communications online with other physicians generally, or perhaps only with other physicians in her medical field. Another physician may similarly tell the system to avoid communications with attorneys. More generally, avoidance criteria may be any of a wide variety of criteria, e.g., geographic location, profession, certain topics of discussion, and so on. Avoidance criteria may be specified in a profile.
The avoidance criteria may have an effect in a system in various ways, depending on the system embodiment.
First, when the system is making or offering a random or semi-random (e.g., based on shared interest in a topic) introduction between two members, it may operate to avoid introducing two members whose personal territories overlap.
Second, when the system is selecting a privacy quality control reviewer 106 of a communication 110, it may operate to avoid selecting 316 a reviewer whose territory overlaps with either the source member 100 of the communication 110 or the intended destination member 100 of the communication 110.
Third, when the system is preparing to display a blog posting, forum posting, comment, or other quasi-public posting by one member, it may limit what is seen by other member(s) so that the posting is not seen by member(s) whose personal territory(ies) overlap the personal territory of the poster. As a result, not every member who looks at (or tries to look at) a blog at a given point in time will necessarily see the same content as the other member(s). Rather, postings may be filtered to prevent viewing by members whose personal territories overlap those of the original poster and/or those of a subsequent commenter. In some implementations, overlap between a potential viewer's territory and any poster's (original, later commenter) territory makes the entire blog (comments and all) unavailable to the potential viewer. In other implementations, redactions are made based on individuals' territories, so that the potential viewer sees at least some of the blog but does not see portions posted by members whose territory overlaps the viewer's territory. More generally, a system may filter access to postings to satisfy member avoidance criteria, geographic or otherwise, to reduce the risk that members who communicate online might meet offline.
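The per-posting filtering variant described above can be sketched as follows; the posting representation and function name are assumptions for illustration:

```python
def visible_postings(postings, viewer_territories):
    """Filter a list of postings for one viewer.

    postings: list of (author_territories, text) pairs. A posting is hidden
    when the author's territories overlap the viewer's. This sketches the
    per-posting redaction variant; the stricter variant would instead hide
    the entire blog on any overlap.
    """
    return [text for author_terr, text in postings
            if not (set(author_terr) & set(viewer_territories))]
```

A viewer in Oregon would thus see postings by a Utah author but not postings by another Oregon author, so two viewers of the same blog may see different content.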
Some embodiments do not ask members for personally identifying information 140 when they register to obtain a username 174. Other embodiments do ask, e.g., to receive a one-time registration fee, but do not correlate usernames 174 to that personal information 140.
In some embodiments, at least some social network member communications 110 are reviewed 404 for potential disclosure of personally identifying information 140, and review results 150 are used to discourage and/or limit online communications 110 that apparently increase the risk that members 100 who interact 232 online will meet (accidentally and/or intentionally) offline. Such privacy reviews 404 may be automated 314, by people 106, or both.
For example, in some embodiments, member communications 110 (postings, email, IM, chat, etc.) are scanned 314 for key words and phrases 146 that may indicate increased risk of disclosing a member's offline identity 140; online, members are identified by usernames that are not reminiscent of their offline names. Such privacy concern triggers 146 may include, e.g., personal or family names, phone numbers, addresses (postal, email, web), account numbers, gender, race, ethnicity, age, title, profession, geographic names, landmark names, employer names, phrases such as "where do you live?", "I live in . . . ", "How old are you?", "What school do you go to?", etc.
Various steps may be taken when scanning 314 detects such a privacy concern trigger 146. The communication sender 100 may be told 224, and given a chance to edit 226 the communication 110 before it is sent to any other member 108. The communication may be sent 322 to a randomly selected 316 (or an expertise-and-trust-proven-selected 316) member who serves as a privacy quality control reviewer 106. The trigger 146 may be modified (through learning, e.g., much as spam detectors learn, but tuned to detect privacy concerns better, not to detect spam). The communication 110 may be sent to its intended member destination(s) 108, with or without some modification 226 by the sender 100 and/or by the system 114 to enhance sender privacy.
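Automated scanning 314 for privacy concern triggers 146 might look roughly like this sketch; the trigger list is a tiny illustrative sample, and the routing function collapses the several possible follow-up steps into a single hold/deliver decision:

```python
import re

# Illustrative trigger phrases and patterns drawn from the examples above;
# a deployed list would be far larger and tuned over time.
TRIGGER_PHRASES = [
    r"where do you live",
    r"how old are you",
    r"what school do you go to",
    r"i live in",
]
PHONE_PATTERN = r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"
EMAIL_PATTERN = r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"

def find_triggers(communication):
    """Return the privacy concern triggers found in a communication."""
    hits = []
    lowered = communication.lower()
    for phrase in TRIGGER_PHRASES:
        if re.search(phrase, lowered):
            hits.append(phrase)
    for pattern in (PHONE_PATTERN, EMAIL_PATTERN):
        hits.extend(re.findall(pattern, communication))
    return hits

def route(communication):
    """Decide the next step: hold for sender edit/review, or deliver."""
    return "hold_for_review" if find_triggers(communication) else "deliver"
```

In a fuller implementation the "hold" branch would fan out to the options listed above: notifying the sender 224, forwarding to a reviewer 106, and so on.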
In some embodiments, a privacy quality control reviewer 106 receives a communication 110 snippet without receiving any indication of who 100 is sending it, reviews 404 it, and makes a judgment about whether it reveals offline-identity-revealing information 140. Reviewer comments 150 are sent back to the sender 100. The sender may make changes 226, after which the edited communication 110 is sent to another randomly selected 316 (but again with non-overlapping personal territory) privacy quality control reviewer 106, and so on. Thus, the community 102 helps protect the privacy of its members 100. Individual members may build up, over time, expertise in judging the risk of disclosure, and that expertise may in turn be rated 230 anonymously by the members 100 whose communications 110 are reviewed 404.
Members 106 who prove to be expert and trustworthy at assessing privacy disclosure risks—as judged 230 by those 100 whose privacy they seek to protect—may be rewarded in ways that do not risk disclosure of their own privacy. For example, reviewers 106 may take pride in private recognition 156 by the system of their relative rank among all privacy reviewers 106. Reviewers 106 may enjoy being trusted 402 with review of messages 110 which are more likely than other reviewed messages to disclose a member's offline identity 140.
In some embodiments no privacy reviewer 106 is sent 322 more than some small predetermined number of communications 110 from a given member 100 to review 404. For example, a reviewer 106 might be sent no more than five communications 110 over the course of one year from a given member 100.
In some embodiments, a system goal is to strike a balance that favors online interaction 232 without unacceptable risk of disclosing offline identities 140. In some embodiments, the system cannot prevent intentional disclosure of a member's offline identity 140 by that member 100. But it can often prevent, or at least reduce, the risk of accidental disclosure of a member's offline identity 140 by that member 100.
In some embodiments, social network member computing characteristics are reviewed for potential disclosure of offline geographic location or offline identity revealing information. Computing characteristics may then be hidden and/or altered to reduce or eliminate the risk that members who interact online will meet (accidentally and/or intentionally) offline. Familiar technical means of promoting anonymity by hiding and/or altering computing characteristics can be used, such as not tracking IP addresses (except possibly to initially assign a personal territory as discussed herein), using anonymizing servers or proxies, and so on.
Usernames can be compared to lists of personal and family names, cities, etc., to reduce the risk that a username containing those or other privacy concern triggers will be accepted for use in the system. Dictionary search tools used to find passwords, for instance, could be adapted for use in scanning usernames for personal names, cities, family names, professions, etc.
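Dictionary-style screening of proposed usernames could be sketched as follows; the word lists here are tiny placeholders for the full dictionaries of names, cities, and professions such a system would use:

```python
# Illustrative word lists; a real deployment would use full dictionaries
# of personal names, family names, city names, professions, etc.
PERSONAL_NAMES = {"john", "maria", "wei"}
CITY_NAMES = {"denver", "paris", "osaka"}
PROFESSIONS = {"doctor", "lawyer", "teacher"}

def username_acceptable(username):
    """Reject a proposed username containing any listed trigger word.

    Substring matching catches embedded forms such as 'JohnFromDenver'.
    """
    lowered = username.lower()
    for wordlist in (PERSONAL_NAMES, CITY_NAMES, PROFESSIONS):
        if any(word in lowered for word in wordlist):
            return False
    return True
```

Rejected candidates would be bounced back to the registrant with a request for a username that does not echo offline identity.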
In some embodiments, posting or other communication of pictures (jpg, gif, tiff, pdf, etc.) is not supported by the system. In other embodiments,pictures168 may be allowed, but every picture is subject to privacyquality control review404. For example, cartoon images, avatars, animations, and other images that do not readily reveal the type of identifyingcharacteristics140 shown in an identification photograph may be allowed.
In some embodiments, links to outside websites are not supported by the system. In other embodiments, links may be allowed, but every link is subject to privacy quality control review 404. At least some disguised links, such as "goo g le dot co m" (note spacing, use of "dot"), may be detected and treated as links.
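One possible way to detect disguised links of the kind shown is to normalize away the disguises and then look for a domain-like token; the normalization rules and TLD list below are illustrative assumptions:

```python
import re

def normalize(text):
    """Undo common link disguises: spelled-out 'dot' and inserted spaces."""
    # Replace a standalone 'dot' (optionally parenthesized) with a period.
    t = re.sub(r"\s*\(?\bdot\b\)?\s*", ".", text, flags=re.IGNORECASE)
    # Collapse all remaining whitespace, so 'goo g le' becomes 'google'.
    return re.sub(r"\s+", "", t)

def contains_disguised_link(text):
    """Flag text whose normalized form contains a domain-like token."""
    # Illustrative TLD list; a real scanner would use a maintained registry.
    return re.search(r"\.(com|net|org|edu|gov)\b", normalize(text),
                     re.IGNORECASE) is not None
```

Under these rules "goo g le dot co m" normalizes to "google.com" and is treated as a link, while ordinary prose containing words like "dots" passes through, since the word-boundary match requires a standalone "dot".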
In some embodiments, each user has two usernames. One (internal username) is seen by the user, while the other (external username) is seen by other people in the system. Messages can be scanned automatically for either type of username; internal usernames in particular can be privacy concern triggers. The user does not necessarily know its own external username; in some embodiments, external usernames are kept secret from their users. Postings of a user which include the user's external username are modified to show the user's internal username instead, at least when the user is logged on. Another person logging on nearby, e.g., a friend of the user, should not see those messages anyway, since the friends' personal territories will overlap. Likewise, if the user logs in under a different account, but is still in the same territory, the original account's messages should be filtered out and thus not displayed to the user.
In some embodiments, the external username associated with a given internal username (via a table or other data structure) is changed on occasion. The user is not normally notified that a change in external username has occurred, but may infer such a change from a loss of contact with some other user that occurs when the old username is disabled. An external username may be changed 332 or otherwise disabled (e.g., user 100 evicted from system) on a regular schedule, e.g., every month, on a randomized schedule, in response to a request from the user 100 ("I'm uncomfortable—please move me to a new virtual bar with a fresh face and new people to meet online"), and/or in response to heightened risk of privacy loss as indicated 310 by automated review of messages to/from the user for privacy concern triggers 146 and/or by actions by privacy quality control reviewers 106 (especially if the system notes a history 152 of privacy concerns). The new external username 174 normally bears little or no resemblance to the previous external username.
In some embodiments, a given internal username is associated with more than one external username, e.g., a different external username may be used in each of several different countries or other territories. This may reduce the risk that when users A and B communicate, A and C communicate, and B and C communicate, B and C will together learn more than desired about A's identity. B and C will know A under different external usernames of A, and hence be less likely to correlate information about A.
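A sketch of the internal/external username table with per-territory names and rotation 332; the class and its naming scheme are illustrative, and a real system would generate unpredictable external names rather than the sequential placeholders used here:

```python
import itertools

class UsernameMap:
    """Map each internal username to rotating, per-territory external names.

    The internal name is what the user sees; other members see only an
    external name, which can differ by territory and be replaced at any
    time without the internal name changing.
    """

    _counter = itertools.count(1)

    def __init__(self):
        self._external = {}  # (internal username, territory) -> external name

    def external_for(self, internal, territory):
        key = (internal, territory)
        if key not in self._external:
            self._external[key] = self._fresh_name()
        return self._external[key]

    def rotate(self, internal, territory):
        # Issue a new external name bearing no resemblance to the old one.
        self._external[(internal, territory)] = self._fresh_name()

    @classmethod
    def _fresh_name(cls):
        # Sequential placeholder names for the sketch; a real system would
        # use unpredictable names so old and new cannot be correlated.
        return "member%04d" % next(cls._counter)
```

Because the table is keyed by (internal username, territory), users B and C in different territories automatically know A under different external names.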
It will be apparent that preserving one's anonymity is one way to help reduce the risk of ever meeting in person offline someone one has met online. But it is not the only way. Embodiments can also help prevent unwanted offline meetings by limiting online interaction to members whose personal territories (as stated by the members and/or determined automatically by the system from geolocation) do not overlap.
Traditional profile elements, which contain personally identifying information such as age, gender, race, profession, and geographic location, will likely be used rarely if at all in some embodiments. However, topics of interest might be specified in a profile that is accessible to other members (at least, to those whose personal territories do not overlap one's own).
Tools and techniques presented herein may be embodied in various ways, e.g., processes and/or hardware on a server computer, on a client or peer, or on a standalone computer, software (data instructions) in RAM or permanent storage for performing a process, general purpose computer hardware configured by software, special-purpose computer hardware, data produced by a process, and so on. Computers, PDAs, cell phones, and any device having user interface and some network transmission capabilities may be part of a given embodiment. Touch screens, keyboards, other buttons, levers, microphones, speakers, light pens, sensors, scanners, and other I/O devices may be configured to facilitate or perform operations to achieve the methods and systems, and method results, which are described here. Combinations of these may also form a given embodiment.
In view of the foregoing, it will be understood that the present disclosure describes features which can be used independently of one another in embodiments that focus on different approaches. Many features described here could be provided in a given commercial product or services package, but may nonetheless be patentably distinct. Determinations of patentable distinctness are made after a disclosure is filed, and are made by patent examination authorities.
It may be helpful, however, to note here that one of the various ways in which features disclosed herein can be grouped is according to which entity acts. Some steps are unique to a role. A member 100 does steps that are not done by a service provider 104, by a human privacy reviewer 106, or by a receiver 108, for example. The same is true of each of the roles; a service provider 104 does steps not done in any of the other three roles, and so does a human privacy reviewer 106, and so does a receiver 108.
It may also be helpful to note that another way to group features disclosed herein is according to the steps/structures employed.
For example, some embodiments employ avoidance criteria and/or take steps to limit offline interaction based on information from online community members about their offline identity. Thus, some embodiments include accepting an avoidance criterion from a member (current or prospective) of a social network; and limiting (reducing and/or preventing between those with overlapping avoidance criteria, and/or favoring and/or requiring between those with non-overlapping avoidance criteria) online interaction between the member and at least one other member of the social network based at least in part on the members' avoidance criteria. In some, the social network accepts avoidance criteria including a list of personal territories from the member, and limits online interaction based on the personal territories of the members.
As another example, some embodiments employ privacy concern triggers and/or take steps to alert online community members when their offline identity information might be disclosed by a communication. Thus, some embodiments include automatically scanning a communication from a member (current or prospective) of a social network for at least one privacy concern trigger; and submitting the communication to a privacy quality control reviewer after finding at least one privacy concern trigger. In some, the privacy quality control reviewer anonymously reviews the communication and indicates an extent to which the reviewer has concluded that the communication is likely to disclose offline identity information of the member.
As another example, some embodiments employ username mapping and/or take steps to hide/change usernames to make an online community member's online identity a moving target or otherwise difficult to permanently pin down. Thus, some embodiments include accepting a user-visible (internal) username from a user of a website, phone, PDA, or other networked service; and displaying a different (external) username for that same user to other users of the service. Some also include dynamically changing the external username while maintaining the associated internal username; the change may be on an automated schedule, and/or at specific request of the user, and/or in response to some indication (detected automatically or manually) that the privacy of the user may be compromised or near compromise.
As another example, some embodiments provide privacy protection through username restrictions that limit username content to reduce or avoid use of offline identity information in usernames.
Features disclosed herein may also be categorized into patentably distinct embodiments in other ways. Regardless, we now turn to more detailed examples of ways in which features may be organized.
In the following examples particular attention is paid to anonymous social networking with community-based privacy reviews, from a reviewed person's perspective.
Some embodiments include a method for use by a first person belonging to an online community, the first person having an online identity published within the online community, the first person also having an offline identity which the first person has asserted should not be published in the online community, the method including the first person: consenting to a privacy review of a communication between the first person and a second person who also belongs to the online community; and receiving a result of the privacy review, the result indicating the extent to which the communication was considered to pose a risk of disclosing at least part of the first person's offline identity in the online community.
In some cases, the first person receives notice that the privacy review is based at least in part on scanning communications before they are delivered to their identified destination(s) in the online community.
In some cases, the first person receives notice that the privacy review is based at least in part on manually and/or automatically scanning communications for at least one of the following: personal name, family name, phone number, offline address, online address, geographic name, landmark name, questions seeking geographic information, statements containing geographic information, questions seeking employment information, statements containing employment information, gender, race, ethnicity, age, title, profession.
In some cases, the first person consents to privacy review of the communication by a third person who also belongs to the online community, and the third person has an offline identity which is not disclosed to the first person. In some cases, the first person consents to privacy review of the communication by a third person, and the method further includes the first person providing an opinion about the third person's privacy review. In some cases, the first person consents to privacy review of the communication by a third person, and the method further includes the first person receiving a reputation summary indicative of the third person's reputation for privacy reviews, based on multiple privacy reviews performed by the third person. In some cases, the first person consents to privacy review of the communication by a third person who also belongs to the online community, and the third person is at an offline location which lies outside a list of territories specified by the first person.
Some methods further include the first person editing the communication, in response to the privacy review, before the communication reaches the second person. Some include the first person submitting the edited communication to another privacy review.
In some embodiments, the first person receives notice that their offline identity is hidden from any person who performs a privacy review on their communication. In some, the first person receives notice that their online identity is hidden from any person who performs a privacy review on their communication.
In some embodiments, the first person consents to privacy review of some images in communications from the first person to another person in the online community. In some, the first person consents to privacy review of all images in communications from the first person to another person in the online community. In some, the first person consents to privacy review of some online addresses in communications from the first person to another person in the online community, and online addresses include at least website addresses and email addresses. In some, the first person consents to privacy review of all online addresses in communications from the first person to another person in the online community. In some, the first person consents to privacy review of offline addresses in communications from the first person to another person in the online community.
In some embodiments, the first person receives notice that the privacy review includes automatically scanning a communication and then submitting the communication to a person for privacy review if a privacy concern trigger is found by the automatic scanning.
In the following examples particular attention is paid to anonymous social networking with offline encounter avoidance criteria, from a service provider's perspective.
Some embodiments include a method to help reduce the risk of offline encounters between members of an online community, the method including: obtaining a first avoidance criterion from a first member of the online community, the first avoidance criterion specifying an aspect of the first member's offline life that is designated by the first member to be shielded from the first member's online life; obtaining a second avoidance criterion from a second member of the online community, the second avoidance criterion specifying an aspect of the second member's offline life that is designated by the second member to be shielded from the second member's online life; and using the avoidance criteria to determine an offline encounter risk level of the two members, namely, a value which is based at least in part on the extent of overlap, if any, between their avoidance criteria.
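The offline encounter risk level could, for example, be scored as the fraction of shared avoidance criteria; this particular scoring is an assumption for illustration, since the text requires only that the value reflect the extent of overlap:

```python
def offline_encounter_risk(criteria_a, criteria_b):
    """Score the risk that two members meet offline.

    criteria_a, criteria_b: sets of avoidance criteria (territories,
    professions, etc.). Returns 0.0 when no criteria are shared and 1.0
    when the criteria are identical. The fraction-of-union scoring is
    illustrative; an embodiment could weight criteria types differently.
    """
    a, b = set(criteria_a), set(criteria_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

The earlier Bob/Pat example scores 0.0 (no shared territories), so their interaction would be allowed; adding California to Pat's list raises the score above zero and triggers regulation.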
In some embodiments, a method includes securely storing the obtained avoidance criteria such that a member's choice of avoidance criteria is not published in the online community to other members. Some embodiments include at least one of the following: securely storing offline identity information about members so that it is not published in the online community to other members; informing members that their offline name is not required and then allowing them to post communications in the online community without first providing their offline names.
Some embodiments include displaying to one of the members an indication of the number of members of the online community whose avoidance criteria overlap at least one avoidance criterion of that member, thereby allowing that member to estimate the reduction in online community access which would result from retaining the at least one avoidance criterion. An indication of the number of members may be numeric or visual (e.g., partially filled bar or map), and may be an exact count or an estimate.
Some embodiments include regulating communication between the first member and the second member in accordance with their offline encounter risk level, with a goal of reducing the risk that they will encounter each other offline as a result of communications in the online community.
In some embodiments, the obtaining steps obtain geographic territory designations, and communication between the two members is regulated in at least one of the following ways: direct communication between the two members is not supported by online community services when their respective geographic territory designations overlap; direct communication between the two members is suggested by an online community service when their respective geographic territory designations do not overlap.
In some embodiments, the obtaining steps obtain geographic territory designations, and the method further includes submitting a communication to the second member for privacy review when the respective geographic territory designations of the two members do not overlap, the communication being from the first member and also being not addressed to the second member by the first member.
In some embodiments, the obtaining steps obtain geographic territory designations, and the method further includes informing a member of at least one of the following: the territory in which the member resides should be designated, the territory in which the member resides must be designated, the territory in which the member resides will be automatically designated, at least one territory in which a member does not reside may be designated, a territory in which the member plans to travel should be designated, a territory in which the member plans to travel must be designated, a territory in which the member plans to travel may be designated.
In some embodiments, the obtaining steps obtain geographic territory designations, and the method further includes informing a member of at least one of the following: at least one territory should be designated, at least one territory must be designated, at least N territories should be designated (N being a stated value greater than one), at least N territories must be designated, territories which together have at least a specified total area should be designated, territories which together have at least a specified total area must be designated, territories which together have at least a specified total population should be designated, territories which together have at least a specified total population must be designated.
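The territory-count and combined-population requirements could be enforced with a check like the following; the thresholds mirror the examples given earlier (three territories, 50 million people) but are policy choices rather than fixed values, and the territory data is hypothetical:

```python
def territories_satisfy_policy(territories, min_count=3,
                               min_total_population=50_000_000):
    """Check a member's territory list against minimum-size requirements.

    territories: list of (name, population) pairs. A larger combined
    population makes the member's residence harder to infer from the
    avoidance list.
    """
    if len(territories) < min_count:
        return False
    total_population = sum(pop for _name, pop in territories)
    return total_population >= min_total_population
```

A member designating France, Benelux, and Switzerland would pass; a member designating only two territories, or three sparsely populated ones, would be asked to designate more.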
In some embodiments, the obtaining steps obtain geographic territory designations, and the method further includes automatically determining a member's likely residence and then including within that member's avoidance criteria at least one covering geographic territory, that is, a territory which includes the member's likely residence. In some, the steps of automatically determining the member's likely residence and including a covering geographic territory are performed transparently to the member.
In some embodiments, the obtaining steps obtain designations of at least one of the following: geographic territory, profession, a discussion topic listed as appropriate for an existing forum in the online community, gender, marital status, ethnicity, race, age, offline family name, offline personal name, organization membership, religious affiliation, membership in one or more specified online communities, thereby allowing members to designate characteristics of other members whom they wish to avoid encountering.
A specified aspect of offline life may be past, present, and/or contemplated in the future; it need not come about to be specified as an avoidance criterion. It may even be intended solely as a buffer, e.g., specifying an entire state instead of merely specifying a county within the state even if there are no plans to travel outside the county.
In some embodiments, the first obtaining step obtains designations of multiple avoidance criteria from the first member, and the offline encounter risk level depends on at least two of those multiple avoidance criteria. In some, the first obtaining step obtains designations of multiple avoidance criteria from the first member in a Boolean expression. Boolean expression operators may be implicit, e.g., a blank space could be used to denote a logical AND operator.
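The implicit-AND convention mentioned above can be illustrated with a small evaluator; this is a sketch only, and the single-word term syntax and the literal "OR" keyword are assumptions, not a format the patent prescribes:

```python
def matches(expression, profile):
    """Evaluate an avoidance expression against a member profile (a set of
    attribute strings).  Blank space denotes an implicit logical AND, and
    the keyword 'OR' separates alternatives, e.g. 'Utah nurse OR Idaho'."""
    alternatives = expression.split(" OR ")
    for alt in alternatives:
        terms = alt.split()          # implicit AND between terms
        if all(term in profile for term in terms):
            return True
    return False
```

With this evaluator, an avoidance criterion matches another member only when every term of some alternative appears among that member's attributes.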
Some embodiments include obtaining from the first member an indication of a first acceptable level of risk, which represents the first member's willingness to risk encountering offline some other member of the online community, and obtaining from the second member an indication of a second acceptable level of risk, which represents the second member's willingness to risk encountering offline some other member of the online community. Some include at least one of the following steps: hiding online community contact information of each of the two members from the other member after determining that the offline encounter risk level of the two members exceeds a level corresponding to the level of acceptable risk indicated by at least one of the two members; displaying to at least one of the two members an online community username of the other member after determining that the offline encounter risk level of the two members is less than the level(s) of acceptable risk indicated by the two members; introducing the first member and the second member online using their respective usernames, after determining that the offline encounter risk level of the two members is less than the level(s) of acceptable risk indicated by the two members.
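The gating logic these embodiments describe reduces to comparing the computed risk level against the stricter of the two members' tolerances; a minimal sketch, assuming risk and tolerance are expressed as comparable numeric values:

```python
def communication_policy(risk_level, acceptable_a, acceptable_b):
    """Decide how two members may see each other, given the computed offline
    encounter risk level and each member's stated acceptable risk level.
    Contact information is hidden whenever the risk exceeds the tolerance
    of at least one of the two members."""
    if risk_level > min(acceptable_a, acceptable_b):
        return "hide contact info"
    return "display usernames / introduce"
```

Taking the minimum of the two tolerances enforces that either member's stricter preference controls the outcome.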
In some embodiments, the offline encounter risk level of the two members exceeds a predetermined value, and the method further includes hiding from each of the two members communications which are posted in the online community by the other of the two members. In some, the hiding step hides at least one of the following: a blog posting, a forum posting, a member profile, a member username, an electronic communication.
In some embodiments, the step of obtaining a first avoidance criterion occurs during registration of the first member, and that registration must be completed before the first member can post any communication to other members in the online community. In some, the step of obtaining a first avoidance criterion occurs after registration of the first member and modifies a previously obtained set containing at least one first member avoidance criterion, and the method further includes re-determining the offline encounter risk level of the two members in view of the modified avoidance criterion. In some embodiments, securely storing data does not preclude data access by authorized administrative personnel.
In the following examples, particular attention is paid to anonymous social networking with offline encounter avoidance criteria, from an online community member's perspective.
Some embodiments include a method for an online community member to use to help reduce the risk of an offline encounter with another member of the online community, the method including the online community member: receiving notice that communications in an online community will be regulated in order to reduce online communication between community members who have overlapping avoidance criteria; and ratifying an avoidance set which includes at least one avoidance criterion.
In some embodiments, the ratifying step includes at least one of the following: the online community member selecting at least one avoidance criterion from displayed selectable avoidance criteria, the online community member communicating in the online community while subject to an avoidance criterion which is automatically included in the avoidance set. Some embodiments include the online community member reviewing selectable avoidance criteria displayed by an online community service provider. Some include the online community member receiving notice that a given member's avoidance set is not published in the online community to other members. Some include the online community member receiving an indication of the number of other members of the online community whose avoidance criteria overlap that member's avoidance set.
In some embodiments, the receiving notice step includes receiving notice that communications will be regulated with the goal of preventing any direct communication in the online community between community members who have overlapping avoidance criteria. In some, the receiving notice step includes receiving notice that communications in the online community will be regulated with the goal of hiding, from each of two members who have overlapping avoidance criteria, the online presence of the other of the two members.
Some embodiments include the online community member modifying the avoidance set by at least one of the following: selecting an avoidance criterion to include in the avoidance set, selecting an avoidance criterion to exclude from the avoidance set. In some, the avoidance set includes at least one geographic territory designation, thereby indicating that the online community member will have reduced communication in the online community with other members who may be physically located in the designated geographic territory(ies).
In some embodiments, the online community member is a first member, and the method further includes the first member consenting to allow privacy review of one of its online communications by a privacy reviewer if the privacy reviewer is also a member of the online community who has designated at least one geographic territory in a privacy reviewer avoidance set, and if the privacy reviewer avoidance set does not overlap the geographic territory(ies) designated in the first member's avoidance set. In some, the avoidance set includes at least the territory in which the online community member resides. In some, the avoidance set includes at least one territory in which the online community member does not reside but plans to travel.
In some embodiments, the avoidance set ratifying step includes selecting a geographic territory using a Boolean combination of constituent territories. In some, the ratifying step includes selecting designations of at least one of the following: geographic territory, profession, a discussion topic listed as appropriate for an existing forum in the online community, gender, marital status, ethnicity, race, age, offline family name, offline personal name, organization membership, religious affiliation, specified online community membership, thereby allowing the online community member to designate characteristics of other members whom the online community member wishes to avoid encountering. In some, the ratifying step includes selecting designations of multiple avoidance criteria combined in a Boolean expression.
Some embodiments include the online community member accessing the online community through a username which has been subjected to privacy review to reduce the risk that it will disclose information about the online community member's offline identity.
Some embodiments include the online community member specifying an acceptable level of risk, which represents the member's willingness to risk encountering offline some other member of the online community.
In some embodiments, the online community member is a first member, and the method includes the first member receiving an introduction to another online community member whose avoidance criteria do not overlap the first member's avoidance set. In some, the online community member is a first member, and the method includes the first member attempting unsuccessfully to communicate directly in the online community with another online community member whose avoidance criteria overlap the first member's avoidance set.
In the following examples, particular attention is paid to privacy protection through username restrictions.
Some embodiments include a method for use by an online service provider to help maintain the privacy of offline identities of online users, the method including: testing a proposed username by comparing at least a portion of its content to a set of personal identification information tokens; and accepting the proposed username if it satisfies a predetermined privacy criterion, the privacy criterion being defined in terms of matches to personal identification information tokens.
In some embodiments, the testing step compares proposed username content to tokens using at least one of the following: an interactive question-and-answer session; an automatic string operation. In some embodiments, the testing step compares proposed username content to tokens obtained from personal information supplied by an online user, and the privacy criterion is defined in terms of avoiding matches to those tokens. In some, the testing step compares proposed username content to tokens obtained from at least one of: a directory of offline addresses, a directory of online addresses, a directory of names, a directory of phone numbers, and the privacy criterion is defined in terms of avoiding matches to those tokens. In some, the testing step compares proposed username content to tokens obtained from at least one of: a database of registrations, a database of licenses, a database of grants, a database of government records, and the privacy criterion is defined in terms of avoiding matches to those tokens. In some, the testing step compares proposed username content to tokens obtained from a collection of fictional names, and the privacy criterion is defined in terms of matching those tokens. In some, the testing step compares proposed username content to a result of an online search engine search.
Some embodiments include accepting the proposed username from an online user before testing the proposed username. Some include automatically generating the proposed username before testing the proposed username.
Some embodiments include a method for username selection which reduces the risk that a username will disclose information about an online user's offline identity, the method including: receiving a question regarding a proposed username and its relation, if any, to the online user's offline identity; and answering the question. Some include receiving additional questions regarding the proposed username and its relation, if any, to the online user's offline identity, and answering the additional questions.
Some embodiments include proposing a username. Some include specifying an acceptable level of risk that the proposed username will disclose information about the online user's offline identity. Some include stating that the proposed username is a fictional name.
Some embodiments include receiving and answering at least one of the following questions: whether the proposed username contains any part of your name, whether the proposed username contains any part of the name of anyone in your family, whether the proposed username contains any part of the name of anyone you have met, whether the proposed username contains the name of a pet, whether the proposed username contains a nickname, whether the proposed username contains the name of your employer, whether the proposed username contains the name of a business you are connected with, whether the proposed username refers to your religious or spiritual beliefs, whether the proposed username refers to your political beliefs, whether the proposed username refers to any organization to which you belong or which you support, whether the proposed username contains any part of any of your email addresses, whether the proposed username contains any part of a website address, whether the proposed username contains any part of any of your offline addresses, whether the proposed username contains any part of any of your phone numbers, whether the proposed username refers to any of your physical characteristics (e.g., height, weight, gender, race, hair color, eye color, tattoos, disabilities), whether the proposed username refers to your ethnicity. Some embodiments include reading part of a search engine search result and being asked whether it pertains to you or anyone you know.
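The question-and-answer session can be driven by a loop that rejects the proposed username on any affirmative answer; the abbreviated question list below paraphrases a few of the questions above, and the callback-style `answer` function is an illustrative assumption (an interactive version would prompt the user instead):

```python
QUESTIONS = [
    "Does the proposed username contain any part of your name?",
    "Does it contain any part of the name of anyone in your family?",
    "Does it contain the name of a pet or a nickname?",
    "Does it contain the name of your employer?",
    "Does it refer to your physical characteristics or ethnicity?",
]

def vet_username(answer):
    """Run the question-and-answer session; answer(question) returns True
    for an affirmative answer.  Any affirmative answer means the username
    risks disclosing offline identity, so it is rejected."""
    for q in QUESTIONS:
        if answer(q):
            return False    # rejected
    return True             # accepted
```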
In the following examples, particular attention is paid to privacy protection through username mapping.
Some embodiments include a method for use by an online service provider to help maintain the privacy of offline identities of online users, the method including: assigning a first user of an online service a private username which is not kept hidden from the first user but is kept hidden from other users of the online service; and assigning the first user at least one public username which is kept hidden from the first user but is not kept hidden from at least some other users of the online service.
Some embodiments include receiving from the first user content directed to at least one other user; and displaying the content together with an attribution which depends on the online service account used, namely, showing an attribution to the private username when the first user is logged in and showing an attribution to a public username when another user is logged in.
Some embodiments include associating geographic territories with users of the online service; the content is hidden from display to other users whose associated geographic territory overlaps the first user's associated geographic territory. Some include associating geographic territories with users of the online service, and all public usernames assigned to the first user are kept hidden from other users whose associated geographic territory overlaps the first user's associated geographic territory.
In some embodiments, at least two different public usernames of a given user are in active use and displayed in each of at least two respective geographic territories at one time.
In some embodiments, the private username is chosen by the online user, in some it is assigned by the system, and in some it is chosen by the user subject to approval by the system. The public names are generated by the system. Different public (aka external) usernames of a given user may be used in different geographic regions and/or in different forums. The system may autogenerate usernames by combining root words, numeric values, and in some cases associated images (which do not contain realistic user likenesses). Hiding a username does not necessarily preclude revealing it to an authorized administrator, but in some embodiments the correlation between users and usernames is not readily determined even by such administrators.
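A sketch of the private/public username mapping, with public names autogenerated by combining root words and numeric values as described; the root word list, the seeding, and the class shape are illustrative assumptions, not the patented design:

```python
import random

ROOTS = ["falcon", "river", "cedar", "ember"]   # hypothetical root words

def generate_public_username(rng):
    """Autogenerate a public username from a root word plus a numeric value."""
    return f"{rng.choice(ROOTS)}{rng.randrange(100, 1000)}"

class UsernameMap:
    """Maps one private username to per-territory public usernames.  The
    private name is shown only to its owner; viewers in each territory see
    that territory's public name instead, so different regions can be shown
    different identities for the same user."""
    def __init__(self, private_name, seed=0):
        self.private_name = private_name
        self.rng = random.Random(seed)
        self.public_by_territory = {}

    def display_name(self, viewer, viewer_territory):
        if viewer == self.private_name:           # owner sees private name
            return self.private_name
        if viewer_territory not in self.public_by_territory:
            self.public_by_territory[viewer_territory] = \
                generate_public_username(self.rng)
        return self.public_by_territory[viewer_territory]
```

Because the map is keyed by territory, two viewers in the same region see a consistent public name, while viewers in different regions may see different ones.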
Conclusion
Although particular embodiments are expressly illustrated and described herein as methods or systems, it will be appreciated that discussion of one type of embodiment also generally extends to other embodiment types. For instance, the descriptions of methods in connection with FIGS. 2-4 also help describe systems like those described in connection with FIG. 1, and vice versa. Likewise, example method embodiments help describe system embodiments that operate according to those methods, product embodiments produced by those methods (such as a communication excerpt 150 with privacy concerns highlighted, or a listing showing personal territories to avoid meeting members from), and configured media embodiments in which a medium is configured by data and instructions to perform those methods. It does not follow that all limitations from a given embodiment are necessarily read into another.
Components, steps, and other aspects of different examples given herein may be combined to form a given embodiment.
Reference has been made to the figures throughout by reference numerals. Any apparent inconsistencies in the phrasing associated with a given reference numeral, in the figures or in the text, should be understood as simply broadening the scope of what is referenced by that numeral.
As used herein, terms such as “a” and “the” are inclusive of one or more of the indicated item or step. In particular, in the claims a reference to an item generally means at least one such item is present and a reference to a step means at least one instance of the step is performed.
Headings are for convenience only; information on a given topic may be found outside the section whose heading indicates that topic.
All claims as filed are part of the specification. Repeated claim language may be inserted outside the claims as needed.
While exemplary embodiments have been shown in the drawings and described above, it will be apparent to those of ordinary skill in the art that numerous modifications can be made without departing from the principles and concepts set forth in the claims. Although the subject matter is described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above the claims. It is not necessary for every means or aspect identified in a given definition or example to be present or to be utilized in every embodiment. Rather, the specific features and acts described are disclosed as examples for consideration when implementing the claims.
All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope to the full extent permitted by law.

Claims (21)

What is claimed is:
1. A method for use by a consenting member of an online community, the method comprising the consenting member:
having an online identity which is published within the online community;
owning an offline identity which is not published within the online community, and which includes characteristics of the consenting member's offline identity which are designated by electronically stored data that is not published within the online community, the characteristics comprising at least two of the following: geographic territory, profession, a discussion topic listed for an existing forum in the online community, gender, marital status, ethnicity, race, age, organization membership, or religious affiliation; and
electronically manifesting consent to regulated review of electronic communications between the consenting member and at least one other member of the online community, the consent including consent to reducing offline identity disclosure risk by comparing electronic communications with at least two characteristics of the consenting member's offline identity which are designated by electronically stored data that is not published within the online community; and
electronically obtaining a result of regulated review of the electronic communications.
2. The method of claim 1, further comprising the consenting member receiving a notice that the regulated review is based at least in part on automatically scanning the electronic communications.
3. The method of claim 1, further comprising the consenting member receiving a notice that the regulated review is based at least in part on scanning the electronic communications before delivery of the electronic communications to an intended receiver.
4. The method of claim 1, further comprising the consenting member receiving a notice that the regulated review is also based at least in part on scanning the electronic communications for at least one of the following: personal name, family name, phone number, offline address, online address, geographic name, or landmark name.
5. The method of claim 1, further comprising the consenting member receiving a notice that the regulated review is also based at least in part on scanning the electronic communications for at least one of the following: a question seeking geographic information, or a question seeking employment information.
6. The method of claim 1, further comprising the consenting member receiving a notice that the regulated review is also based at least in part on scanning the electronic communications for an indication of at least one of the following: title, or profession.
7. The method of claim 1, further comprising the consenting member editing at least one of the electronic communications after obtaining the regulated review and before delivery of the edited electronic communication(s) to an intended receiver.
8. The method of claim 1, further comprising the consenting member editing at least one of the electronic communications after obtaining the regulated review and then electronically submitting the edited electronic communication(s) for regulated review.
9. A computer system configured for use by a consenting member of an online community to execute computer-executable instructions, the consenting member having an online identity in the online community, the system comprising:
a memory configured with the computer-executable instructions; and
a processor, coupled with the memory, that executes the computer-executable instructions for performing a method comprising the following steps:
the member electronically manifesting consent to regulated review of electronic communications between the consenting member and at least one other member of the online community, the consent including consent to reduce offline identity disclosure risk by comparing electronic communications with at least two characteristics of an offline identity of the consenting member which are designated by electronically stored data that is not published within the online community, the offline identity having at least one characteristic which is not present in the online identity, the offline identity characteristics comprising at least two of the following: geographic territory, profession, a discussion topic listed for an existing forum in the online community, gender, marital status, ethnicity, race, age, organization membership, or religious affiliation; and
the consenting member electronically obtaining a result of regulated review of the electronic communications.
10. The system of claim 9, wherein the processor coupled with the memory executes computer-executable instructions for the consenting member to electronically receive a notice that the regulated review is performed at least in part by a human reviewer who also belongs to the online community.
11. The system of claim 9, wherein the processor coupled with the memory executes computer-executable instructions for the consenting member to electronically provide an authority within the online community with an opinion in response to regulated review which was performed at least in part by a human reviewer who is not the consenting member.
12. The system of claim 9, wherein the processor coupled with the memory executes computer-executable instructions for the consenting member to electronically receive a reputation summary of a human reviewer, the reputation summary being a response to opinions of online community members about regulated reviews performed by the human reviewer.
13. The system of claim 9, wherein the processor coupled with the memory executes computer-executable instructions for the consenting member to electronically receive at least one of the following: a notice that the offline identity of the consenting member is kept hidden from a human reviewer who performs regulated review, or a notice that the online identity of the consenting member is kept hidden from a human reviewer who performs regulated review.
14. A configured storage device configured with data and instructions to cause a processor and a memory to perform steps of a method for use by a consenting member of an online community, the member having an online identity which is published within the online community, the member owning an offline identity which is not published within the online community, the method steps comprising:
the member electronically manifesting consent to regulated review of electronic communications between the consenting member and at least one other member of the online community, the consent including consent to reducing offline identity disclosure risk by comparing electronic communications with at least two characteristics of the consenting member's offline identity which are designated by electronically stored data that is not published within the online community, the characteristics comprising at least three of the following: geographic territory, profession, a discussion topic listed for an existing forum in the online community, gender, marital status, ethnicity, race, age, organization membership, or religious affiliation; and
the consenting member electronically obtaining a result of the privacy regulated review of the electronic communications by the privacy reviewer.
15. The configured storage device of claim 14, wherein the consenting member electronically manifests consent specifying that website addresses and email addresses in electronic communications are subject to regulated review.
16. The configured storage device of claim 14, wherein the consenting member electronically manifests consent specifying that offline addresses in electronic communications are subject to regulated review.
17. The configured storage device of claim 14, wherein the consenting member electronically manifests consent specifying that images in electronic communications are subject to regulated review.
18. The configured storage device of claim 14, wherein the consenting member is notified that the regulated review is performed at least in part by a human reviewer.
19. The configured storage device of claim 14, wherein the consenting member electronically receives notice that regulated review includes submission of the electronic communication to a human reviewer if a privacy concern trigger is found by automatic scanning of an electronic communication.
20. The configured storage device of claim 14, wherein the method further comprises the consenting member electronically specifying at least one geographic region, and the consenting member electronically manifests consent to regulated review of the electronic communication by a human reviewer who resides outside the at least one geographic region.
21. The configured storage medium of claim 14, wherein the method further comprises the consenting member electronically specifying at least one regulated reviewer selection criterion.
US11/870,475 (US8578501B1) Anonymous social networking with community-based privacy reviews obtained by members; priority date 2006-11-14; filing date 2007-10-11; status Expired - Fee Related.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
US11/870,475 (US8578501B1) | 2006-11-14 | 2007-10-11 | Anonymous social networking with community-based privacy reviews obtained by members

Applications Claiming Priority (4)

Application Number | Priority Date | Filing Date | Title
US86575706P | 2006-11-14 | 2006-11-14 |
US86641806P | 2006-11-18 | 2006-11-18 |
US86861906P | 2006-12-05 | 2006-12-05 |
US11/870,475 (US8578501B1) | 2006-11-14 | 2007-10-11 | Anonymous social networking with community-based privacy reviews obtained by members

Publications (1)

Publication Number | Publication Date
US8578501B1 | 2013-11-05

Family

ID=44994434

Family Applications (3)

Application Number | Title | Priority Date | Filing Date
US11/870,475 (US8578501B1, Expired - Fee Related) | Anonymous social networking with community-based privacy reviews obtained by members | 2006-11-14 | 2007-10-11
US11/924,845 (US8069467B1, Expired - Fee Related) | Privacy protection through restrictions on usernames and other online identifiers | 2006-11-14 | 2007-10-26
US14/201,826 (US20140195613A1, Abandoned) | Offline Names Addressing Online Community Web Pages | 2006-11-14 | 2014-03-08

Family Applications After (2)

Application Number | Title | Priority Date | Filing Date
US11/924,845 (US8069467B1, Expired - Fee Related) | Privacy protection through restrictions on usernames and other online identifiers | 2006-11-14 | 2007-10-26
US14/201,826 (US20140195613A1, Abandoned) | Offline Names Addressing Online Community Web Pages | 2006-11-14 | 2014-03-08

Country Status (1)

Country | Link
US (3) | US8578501B1 (en)

US10692033B2 (en)2016-06-102020-06-23OneTrust, LLCData processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US10706176B2 (en)2016-06-102020-07-07OneTrust, LLCData-processing consent refresh, re-prompt, and recapture systems and related methods
US10706447B2 (en)2016-04-012020-07-07OneTrust, LLCData processing systems and communication systems and methods for the efficient generation of privacy risk assessments
US10706379B2 (en)2016-06-102020-07-07OneTrust, LLCData processing systems for automatic preparation for remediation and related methods
US10708305B2 (en)2016-06-102020-07-07OneTrust, LLCAutomated data processing systems and methods for automatically processing requests for privacy-related information
US10706131B2 (en)2016-06-102020-07-07OneTrust, LLCData processing systems and methods for efficiently assessing the risk of privacy campaigns
US10706174B2 (en)2016-06-102020-07-07OneTrust, LLCData processing systems for prioritizing data subject access requests for fulfillment and related methods
US10726158B2 (en)2016-06-102020-07-28OneTrust, LLCConsent receipt management and automated process blocking systems and related methods
US10740487B2 (en)2016-06-102020-08-11OneTrust, LLCData processing systems and methods for populating and maintaining a centralized database of personal data
US10762236B2 (en)2016-06-102020-09-01OneTrust, LLCData processing user interface monitoring systems and related methods
US10769302B2 (en)2016-06-102020-09-08OneTrust, LLCConsent receipt management systems and related methods
US10769301B2 (en)2016-06-102020-09-08OneTrust, LLCData processing systems for webform crawling to map processing activities and related methods
US10769298B1 (en)2017-09-012020-09-08Workday, Inc.Security system for benchmark access
US10776517B2 (en)2016-06-102020-09-15OneTrust, LLCData processing systems for calculating and communicating cost of fulfilling data subject access requests and related methods
US10776514B2 (en)2016-06-102020-09-15OneTrust, LLCData processing systems for the identification and deletion of personal data in computer systems
US10776518B2 (en)2016-06-102020-09-15OneTrust, LLCConsent receipt management systems and related methods
US10776515B2 (en)2016-06-102020-09-15OneTrust, LLCData processing systems for fulfilling data subject access requests and related methods
US10783256B2 (en)2016-06-102020-09-22OneTrust, LLCData processing systems for data transfer risk identification and related methods
US10796260B2 (en)2016-06-102020-10-06OneTrust, LLCPrivacy management systems and methods
US10798133B2 (en)2016-06-102020-10-06OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10803198B2 (en)2016-06-102020-10-13OneTrust, LLCData processing systems for use in automatically generating, populating, and submitting data subject access requests
US10803202B2 (en)2018-09-072020-10-13OneTrust, LLCData processing systems for orphaned data identification and deletion and related methods
US10803199B2 (en)2016-06-102020-10-13OneTrust, LLCData processing and communications systems and methods for the efficient implementation of privacy by design
US10803200B2 (en)2016-06-102020-10-13OneTrust, LLCData processing systems for processing and managing data subject access in a distributed environment
US10839102B2 (en)2016-06-102020-11-17OneTrust, LLCData processing systems for identifying and modifying processes that are subject to data subject access requests
US10848523B2 (en)2016-06-102020-11-24OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10846433B2 (en)2016-06-102020-11-24OneTrust, LLCData processing consent management systems and related methods
US10853501B2 (en)2016-06-102020-12-01OneTrust, LLCData processing and scanning systems for assessing vendor risk
US10873606B2 (en)2016-06-102020-12-22OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10878127B2 (en)2016-06-102020-12-29OneTrust, LLCData subject access request processing systems and related methods
US10885485B2 (en)2016-06-102021-01-05OneTrust, LLCPrivacy management systems and methods
US10896394B2 (en)2016-06-102021-01-19OneTrust, LLCPrivacy management systems and methods
US10909488B2 (en)2016-06-102021-02-02OneTrust, LLCData processing systems for assessing readiness for responding to privacy-related incidents
US10909265B2 (en)2016-06-102021-02-02OneTrust, LLCApplication privacy scanning systems and related methods
US10944725B2 (en)2016-06-102021-03-09OneTrust, LLCData processing systems and methods for using a data model to select a target data asset in a data migration
US10949565B2 (en)2016-06-102021-03-16OneTrust, LLCData processing systems for generating and populating a data inventory
US10949170B2 (en)2016-06-102021-03-16OneTrust, LLCData processing systems for integration of consumer feedback with data subject access requests and related methods
US10970675B2 (en)2016-06-102021-04-06OneTrust, LLCData processing systems for generating and populating a data inventory
US10970417B1 (en)2017-09-012021-04-06Workday, Inc.Differential privacy security for benchmarking
US10997315B2 (en)2016-06-102021-05-04OneTrust, LLCData processing systems for fulfilling data subject access requests and related methods
US10997318B2 (en)2016-06-102021-05-04OneTrust, LLCData processing systems for generating and populating a data inventory for processing data access requests
US11004125B2 (en)2016-04-012021-05-11OneTrust, LLCData processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US11023842B2 (en)2016-06-102021-06-01OneTrust, LLCData processing systems and methods for bundled privacy policies
US11025675B2 (en)2016-06-102021-06-01OneTrust, LLCData processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US11038925B2 (en)2016-06-102021-06-15OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11057356B2 (en)2016-06-102021-07-06OneTrust, LLCAutomated data processing systems and methods for automatically processing data subject access requests using a chatbot
US11076038B2 (en)2019-12-312021-07-27Bye! Accident LlcReviewing message-based communications via a keyboard application
US11074367B2 (en)2016-06-102021-07-27OneTrust, LLCData processing systems for identity validation for consumer rights requests and related methods
US11087260B2 (en)2016-06-102021-08-10OneTrust, LLCData processing systems and methods for customizing privacy training
US11100444B2 (en)2016-06-102021-08-24OneTrust, LLCData processing systems and methods for providing training in a vendor procurement process
US11134086B2 (en)*2016-06-102021-09-28OneTrust, LLCConsent conversion optimization systems and related methods
US11138242B2 (en)2016-06-102021-10-05OneTrust, LLCData processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11138299B2 (en)2016-06-102021-10-05OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11144675B2 (en)2018-09-072021-10-12OneTrust, LLCData processing systems and methods for automatically protecting sensitive data within privacy management systems
US11146566B2 (en)2016-06-102021-10-12OneTrust, LLCData processing systems for fulfilling data subject access requests and related methods
US11144622B2 (en)2016-06-102021-10-12OneTrust, LLCPrivacy management systems and methods
US11151233B2 (en)2016-06-102021-10-19OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11157600B2 (en)2016-06-102021-10-26OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11188615B2 (en)2016-06-102021-11-30OneTrust, LLCData processing consent capture systems and related methods
US11188862B2 (en)2016-06-102021-11-30OneTrust, LLCPrivacy management systems and methods
US11200341B2 (en)2016-06-102021-12-14OneTrust, LLCConsent receipt management systems and related methods
US11210420B2 (en)2016-06-102021-12-28OneTrust, LLCData subject access request processing systems and related methods
US11222139B2 (en)2016-06-102022-01-11OneTrust, LLCData processing systems and methods for automatic discovery and assessment of mobile software development kits
US11222142B2 (en)2016-06-102022-01-11OneTrust, LLCData processing systems for validating authorization for personal data collection, storage, and processing
US11222309B2 (en)2016-06-102022-01-11OneTrust, LLCData processing systems for generating and populating a data inventory
US11227247B2 (en)2016-06-102022-01-18OneTrust, LLCData processing systems and methods for bundled privacy policies
US11228620B2 (en)2016-06-102022-01-18OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11238390B2 (en)2016-06-102022-02-01OneTrust, LLCPrivacy management systems and methods
US11244367B2 (en)2016-04-012022-02-08OneTrust, LLCData processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US11277448B2 (en)2016-06-102022-03-15OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11295316B2 (en)2016-06-102022-04-05OneTrust, LLCData processing systems for identity validation for consumer rights requests and related methods
US11294939B2 (en)2016-06-102022-04-05OneTrust, LLCData processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11301796B2 (en)2016-06-102022-04-12OneTrust, LLCData processing systems and methods for customizing privacy training
US11328092B2 (en)2016-06-102022-05-10OneTrust, LLCData processing systems for processing and managing data subject access in a distributed environment
US11336697B2 (en)2016-06-102022-05-17OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11343284B2 (en)2016-06-102022-05-24OneTrust, LLCData processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US11341447B2 (en)2016-06-102022-05-24OneTrust, LLCPrivacy management systems and methods
US11354435B2 (en)2016-06-102022-06-07OneTrust, LLCData processing systems for data testing to confirm data deletion and related methods
US11354434B2 (en)2016-06-102022-06-07OneTrust, LLCData processing systems for verification of consent and notice processing and related methods
US11366909B2 (en)2016-06-102022-06-21OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11366786B2 (en)2016-06-102022-06-21OneTrust, LLCData processing systems for processing data subject access requests
US11373007B2 (en)2017-06-162022-06-28OneTrust, LLCData processing systems for identifying whether cookies contain personally identifying information
US11392720B2 (en)2016-06-102022-07-19OneTrust, LLCData processing systems for verification of consent and notice processing and related methods
US11397819B2 (en)2020-11-062022-07-26OneTrust, LLCSystems and methods for identifying data processing activities based on data discovery results
US11403377B2 (en)2016-06-102022-08-02OneTrust, LLCPrivacy management systems and methods
US11410106B2 (en)2016-06-102022-08-09OneTrust, LLCPrivacy management systems and methods
US11416589B2 (en)2016-06-102022-08-16OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11416798B2 (en)2016-06-102022-08-16OneTrust, LLCData processing systems and methods for providing training in a vendor procurement process
US11416590B2 (en)2016-06-102022-08-16OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11416109B2 (en)2016-06-102022-08-16OneTrust, LLCAutomated data processing systems and methods for automatically processing data subject access requests using a chatbot
US11418492B2 (en)2016-06-102022-08-16OneTrust, LLCData processing systems and methods for using a data model to select a target data asset in a data migration
US11438386B2 (en)2016-06-102022-09-06OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11436373B2 (en)2020-09-152022-09-06OneTrust, LLCData processing systems and methods for detecting tools for the automatic blocking of consent requests
US11444976B2 (en)2020-07-282022-09-13OneTrust, LLCSystems and methods for automatically blocking the use of tracking tools
US11442906B2 (en)2021-02-042022-09-13OneTrust, LLCManaging custom attributes for domain objects defined within microservices
US11461500B2 (en)2016-06-102022-10-04OneTrust, LLCData processing systems for cookie compliance testing with website scanning and related methods
US11475165B2 (en)2020-08-062022-10-18OneTrust, LLCData processing systems and methods for automatically redacting unstructured data from a data subject access request
US11475136B2 (en)2016-06-102022-10-18OneTrust, LLCData processing systems for data transfer risk identification and related methods
US11481710B2 (en)2016-06-102022-10-25OneTrust, LLCPrivacy management systems and methods
US11494515B2 (en)2021-02-082022-11-08OneTrust, LLCData processing systems and methods for anonymizing data samples in classification analysis
US11520928B2 (en)2016-06-102022-12-06OneTrust, LLCData processing systems for generating personal data receipts and related methods
US11526624B2 (en)2020-09-212022-12-13OneTrust, LLCData processing systems and methods for automatically detecting target data transfers and target data processing
US11533315B2 (en)2021-03-082022-12-20OneTrust, LLCData transfer discovery and analysis systems and related methods
US11546661B2 (en)2021-02-182023-01-03OneTrust, LLCSelective redaction of media content
US11544409B2 (en)2018-09-072023-01-03OneTrust, LLCData processing systems and methods for automatically protecting sensitive data within privacy management systems
US11544667B2 (en)2016-06-102023-01-03OneTrust, LLCData processing systems for generating and populating a data inventory
US11562078B2 (en)2021-04-162023-01-24OneTrust, LLCAssessing and managing computational risk involved with integrating third party computing functionality within a computing system
US11562097B2 (en)2016-06-102023-01-24OneTrust, LLCData processing systems for central consent repository and related methods
US11586700B2 (en)2016-06-102023-02-21OneTrust, LLCData processing systems and methods for automatically blocking the use of tracking tools
US11601464B2 (en)2021-02-102023-03-07OneTrust, LLCSystems and methods for mitigating risks of third-party computing system functionality integration into a first-party computing system
US11620142B1 (en)2022-06-032023-04-04OneTrust, LLCGenerating and customizing user interfaces for demonstrating functions of interactive user environments
US11625502B2 (en)2016-06-102023-04-11OneTrust, LLCData processing systems for identifying and modifying processes that are subject to data subject access requests
US11636171B2 (en)2016-06-102023-04-25OneTrust, LLCData processing user interface monitoring systems and related methods
US11651402B2 (en)2016-04-012023-05-16OneTrust, LLCData processing systems and communication systems and methods for the efficient generation of risk assessments
US11651104B2 (en)2016-06-102023-05-16OneTrust, LLCConsent receipt management systems and related methods
US11651106B2 (en)2016-06-102023-05-16OneTrust, LLCData processing systems for fulfilling data subject access requests and related methods
US11675929B2 (en)2016-06-102023-06-13OneTrust, LLCData processing consent sharing systems and related methods
US11687528B2 (en)2021-01-252023-06-27OneTrust, LLCSystems and methods for discovery, classification, and indexing of data in a native computing system
US11727141B2 (en)2016-06-102023-08-15OneTrust, LLCData processing systems and methods for synching privacy-related user consent across multiple computing devices
US11775348B2 (en)2021-02-172023-10-03OneTrust, LLCManaging custom workflows for domain objects defined within microservices
US11797528B2 (en)2020-07-082023-10-24OneTrust, LLCSystems and methods for targeted data discovery
US12045266B2 (en)2016-06-102024-07-23OneTrust, LLCData processing systems for generating and populating a data inventory
US12052289B2 (en)2016-06-102024-07-30OneTrust, LLCData processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US12118121B2 (en)2016-06-102024-10-15OneTrust, LLCData subject access request processing systems and related methods
US12136055B2 (en)2016-06-102024-11-05OneTrust, LLCData processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US12153704B2 (en)2021-08-052024-11-26OneTrust, LLCComputing platform for facilitating data exchange among computing environments
US12265896B2 (en)2020-10-052025-04-01OneTrust, LLCSystems and methods for detecting prejudice bias in machine-learning models
US12299065B2 (en)2016-06-102025-05-13OneTrust, LLCData processing systems and methods for dynamically determining data processing consent configurations
US12381915B2 (en)2016-06-102025-08-05OneTrust, LLCData processing systems and methods for performing assessments and monitoring of new versions of computer code for compliance

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US7667871B1 (en)*2004-01-302010-02-23Roskind James AVisual cryptography and voting technology using a pair of enhanced contrast glyphs in overlay
US8893287B2 (en)*2012-03-122014-11-18Microsoft CorporationMonitoring and managing user privacy levels
US9288217B2 (en)*2013-12-022016-03-15Airbnb, Inc.Identity and trustworthiness verification using online and offline components
JP5943356B2 (en)*2014-01-312016-07-05International Business Machines Corporation Information processing apparatus, information processing method, and program
US20150231506A1 (en)*2014-02-192015-08-20Sony Computer Entertainment America LlcCelebrity video gaming network
CN104166731B (en)*2014-08-292017-11-17河海大学常州校区A kind of overlapping community discovery system and method for social networks
US9112931B1 (en)*2014-10-272015-08-18Rushline, LLCSystems and methods for enabling dialog amongst different participant groups
US10475043B2 (en)2015-01-282019-11-12Intuit Inc.Method and system for pro-active detection and correction of low quality questions in a question and answer based customer support system
US10755294B1 (en)2015-04-282020-08-25Intuit Inc.Method and system for increasing use of mobile devices to provide answer content in a question and answer based customer support system
US10447777B1 (en)*2015-06-302019-10-15Intuit Inc.Method and system for providing a dynamically updated expertise and context based peer-to-peer customer support system within a software application
US10475044B1 (en)2015-07-292019-11-12Intuit Inc.Method and system for question prioritization based on analysis of the question content and predicted asker engagement before answer content is generated
US10394804B1 (en)2015-10-082019-08-27Intuit Inc.Method and system for increasing internet traffic to a question and answer customer support system
US10599699B1 (en)2016-04-082020-03-24Intuit, Inc.Processing unstructured voice of customer feedback for improving content rankings in customer support systems
US10460398B1 (en)2016-07-272019-10-29Intuit Inc.Method and system for crowdsourcing the detection of usability issues in a tax return preparation system
US10467541B2 (en)2016-07-272019-11-05Intuit Inc.Method and system for improving content searching in a question and answer customer support system by using a crowd-machine learning hybrid predictive model
US10445332B2 (en)2016-09-282019-10-15Intuit Inc.Method and system for providing domain-specific incremental search results with a customer self-service system for a financial management system
US10572954B2 (en)2016-10-142020-02-25Intuit Inc.Method and system for searching for and navigating to user content and other user experience pages in a financial management system with a customer self-service system for the financial management system
US10733677B2 (en)2016-10-182020-08-04Intuit Inc.Method and system for providing domain-specific and dynamic type ahead suggestions for search query terms with a customer self-service system for a tax return preparation system
US10552843B1 (en)2016-12-052020-02-04Intuit Inc.Method and system for improving search results by recency boosting customer support content for a customer self-help system associated with one or more financial management systems
US10748157B1 (en)2017-01-122020-08-18Intuit Inc.Method and system for determining levels of search sophistication for users of a customer self-help system to personalize a content search user experience provided to the users and to increase a likelihood of user satisfaction with the search experience
US10796015B2 (en)*2017-03-292020-10-06Mybitchbook, Inc.Method and system for anonymous user data storage and controlled data access
US10938586B2 (en)*2017-05-062021-03-02Servicenow, Inc.Systems for peer-to-peer knowledge sharing platform
US10719811B2 (en)*2017-06-292020-07-21Salesforce.Com, Inc.Method and system for retroactive removal of content from an organization activity timeline
US10686741B2 (en)2017-06-292020-06-16Salesforce.Com, Inc.Method and system for real-time blocking of content from an organization activity timeline
US10922367B2 (en)2017-07-142021-02-16Intuit Inc.Method and system for providing real time search preview personalization in data management systems
US11093951B1 (en)2017-09-252021-08-17Intuit Inc.System and method for responding to search queries using customer self-help systems associated with a plurality of data management systems
US11436642B1 (en)2018-01-292022-09-06Intuit Inc.Method and system for generating real-time personalized advertisements in data management self-help systems
US11269665B1 (en)2018-03-282022-03-08Intuit Inc.Method and system for user experience personalization in data management systems using machine learning
US20230169070A1 (en)*2021-11-292023-06-01International Business Machines CorporationData Transformations for Mapping Enterprise Applications
US12190070B2 (en)*2022-06-222025-01-07International Business Machines CorporationDynamic meeting attendee introduction generation and presentation

Citations (38)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6073142A (en)*1997-06-232000-06-06Park City GroupAutomated post office based rule analysis of e-mail messages and other data objects for controlled distribution in network environments
US6209100B1 (en)*1998-03-272001-03-27International Business Machines Corp.Moderated forums with anonymous but traceable contributions
US20010034723A1 (en)*2000-02-112001-10-25Subramaniam Arun K.System and method for providing anonymous internet transactions
US20020023230A1 (en)2000-04-112002-02-21Bolnick David A.System, method and computer program product for gathering and delivering personalized user information
US6438632B1 (en)*1998-03-102002-08-20Gala IncorporatedElectronic bulletin board system
US6681108B1 (en)2000-08-162004-01-20Mitsubishi Electric Research Laboratories, Inc.Network and method for identifying entities sharing a common network location
US20040181434A1 (en)*2001-09-282004-09-16Olympus CorporationDistribution method of medical information
US20040215793A1 (en)2001-09-302004-10-28Ryan Grant JamesPersonal contact network
US20050038876A1 (en)2003-08-152005-02-17Aloke ChaudhuriSystem and method for instant match based on location, presence, personalization and communication
US20050165623A1 (en)*2003-03-122005-07-28Landi William A.Systems and methods for encryption-based de-identification of protected health information
US20050192999A1 (en)2003-11-212005-09-01Cook Scott J.System and method of virtualizing physical locations
US20050236474A1 (en)*2004-03-262005-10-27Convergence Ct, Inc.System and method for controlling access and use of patient medical data records
US20050256866A1 (en)2004-03-152005-11-17Yahoo! Inc.Search system and methods with integration of user annotations from a trust network
US20060004590A1 (en)2004-07-022006-01-05Denis KhooTravel planning for social networks
US20060041543A1 (en)2003-01-292006-02-23Microsoft CorporationSystem and method for employing social networks for information discovery
US20060048059A1 (en)2004-08-262006-03-02Henry EtkinSystem and method for dynamically generating, maintaining, and growing an online social network
US20060042483A1 (en)*2004-09-022006-03-02Work James DMethod and system for reputation evaluation of online users in a social networking scheme
US20060074863A1 (en)2004-09-202006-04-06Microsoft CorporationMethod, system, and apparatus for maintaining user privacy in a knowledge interchange system
US20060075228A1 (en)*2004-06-222006-04-06Black Alistair DMethod and apparatus for recognition and real time protection from view of sensitive terms in documents
US20060089857A1 (en)*2004-10-212006-04-27Zimmerman Roger STranscription data security
US20060121987A1 (en)2004-12-072006-06-08Microsoft CorporationUser-centric method of aggregating information sources to reinforce digital identity
US7069308B2 (en)2003-06-162006-06-27Friendster, Inc.System, method and apparatus for connecting users in an online computer system based on their relationships within social networks
US20060143067A1 (en)2004-12-232006-06-29Hermann CalabriaVendor-driven, social-network enabled review system with flexible syndication
US7080139B1 (en)2001-04-242006-07-18Fatbubble, IncMethod and apparatus for selectively sharing and passively tracking communication device experiences
US20060178910A1 (en)*2005-01-102006-08-10George EisenbergerPublisher gateway systems for collaborative data exchange, collection, monitoring and/or alerting
US20060190281A1 (en)2005-02-222006-08-24Microsoft CorporationSystems and methods to facilitate self regulation of social networks through trading and gift exchange
US20060195441A1 (en)*2005-01-032006-08-31Luc JuliaSystem and method for delivering content to users on a network
US20060242558A1 (en)*2005-04-252006-10-26Microsoft CorporationEnabling users to redact portions of a document
US20060238380A1 (en)2005-04-212006-10-26Microsoft CorporationMaintaining user privacy in a virtual earth environment
US20060259957A1 (en)2004-11-042006-11-16Tam Chung MSystem and method for creating a secure trusted social network
US20070027715A1 (en)*2005-06-132007-02-01Medcommons, Inc.Private health information interchange and related systems, methods, and devices
US20070038437A1 (en)*2005-08-122007-02-15Xerox CorporationDocument anonymization apparatus and method
US20070067405A1 (en)*2005-09-202007-03-22Eliovson Joshua MModerated anonymous forum
US20070218997A1 (en)*2006-03-172007-09-20Wing ChoSystems, methods and techniques for safely and effectively coordinating video game play and other activities among multiple remote networked friends and rivals
US20070271336A1 (en)*2006-05-212007-11-22Venkat RamaswamyA system and method of spreading messages in a social network
US20080162202A1 (en)*2006-12-292008-07-03Richendra KhannaDetecting inappropriate activity by analysis of user interactions
US20080172745A1 (en)*2007-01-122008-07-17Sap AgSystems and methods for protecting sensitive data
US20110209168A1 (en)*2002-08-162011-08-25Media Ip, Inc.Method and apparatus for interactive programming using captioning

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US20030158888A1 (en)*2001-12-282003-08-21Magnus BjorklundSafe communication
US20050125408A1 (en)*2003-11-202005-06-09Beena SomarooListing service tracking system and method for tracking a user's interaction with a listing service
US8560860B2 (en)*2005-07-012013-10-15Red Hat, Inc.Strong password entry
WO2007019699A1 (en)*2005-08-172007-02-22Canada Post CorporationElectronic content management systems and methods

Patent Citations (38)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US6073142A (en)* | 1997-06-23 | 2000-06-06 | Park City Group | Automated post office based rule analysis of e-mail messages and other data objects for controlled distribution in network environments
US6438632B1 (en)* | 1998-03-10 | 2002-08-20 | Gala Incorporated | Electronic bulletin board system
US6209100B1 (en)* | 1998-03-27 | 2001-03-27 | International Business Machines Corp. | Moderated forums with anonymous but traceable contributions
US20010034723A1 (en)* | 2000-02-11 | 2001-10-25 | Subramaniam Arun K. | System and method for providing anonymous internet transactions
US20020023230A1 (en) | 2000-04-11 | 2002-02-21 | Bolnick David A. | System, method and computer program product for gathering and delivering personalized user information
US6681108B1 (en) | 2000-08-16 | 2004-01-20 | Mitsubishi Electric Research Laboratories, Inc. | Network and method for identifying entities sharing a common network location
US7080139B1 (en) | 2001-04-24 | 2006-07-18 | Fatbubble, Inc | Method and apparatus for selectively sharing and passively tracking communication device experiences
US20040181434A1 (en)* | 2001-09-28 | 2004-09-16 | Olympus Corporation | Distribution method of medical information
US20040215793A1 (en) | 2001-09-30 | 2004-10-28 | Ryan Grant James | Personal contact network
US20110209168A1 (en)* | 2002-08-16 | 2011-08-25 | Media Ip, Inc. | Method and apparatus for interactive programming using captioning
US20060041543A1 (en) | 2003-01-29 | 2006-02-23 | Microsoft Corporation | System and method for employing social networks for information discovery
US20050165623A1 (en)* | 2003-03-12 | 2005-07-28 | Landi William A. | Systems and methods for encryption-based de-identification of protected health information
US7069308B2 (en) | 2003-06-16 | 2006-06-27 | Friendster, Inc. | System, method and apparatus for connecting users in an online computer system based on their relationships within social networks
US20050038876A1 (en) | 2003-08-15 | 2005-02-17 | Aloke Chaudhuri | System and method for instant match based on location, presence, personalization and communication
US20050192999A1 (en) | 2003-11-21 | 2005-09-01 | Cook Scott J. | System and method of virtualizing physical locations
US20050256866A1 (en) | 2004-03-15 | 2005-11-17 | Yahoo! Inc. | Search system and methods with integration of user annotations from a trust network
US20050236474A1 (en)* | 2004-03-26 | 2005-10-27 | Convergence Ct, Inc. | System and method for controlling access and use of patient medical data records
US20060075228A1 (en)* | 2004-06-22 | 2006-04-06 | Black Alistair D | Method and apparatus for recognition and real time protection from view of sensitive terms in documents
US20060004590A1 (en) | 2004-07-02 | 2006-01-05 | Denis Khoo | Travel planning for social networks
US20060048059A1 (en) | 2004-08-26 | 2006-03-02 | Henry Etkin | System and method for dynamically generating, maintaining, and growing an online social network
US20060042483A1 (en)* | 2004-09-02 | 2006-03-02 | Work James D | Method and system for reputation evaluation of online users in a social networking scheme
US20060074863A1 (en) | 2004-09-20 | 2006-04-06 | Microsoft Corporation | Method, system, and apparatus for maintaining user privacy in a knowledge interchange system
US20060089857A1 (en)* | 2004-10-21 | 2006-04-27 | Zimmerman Roger S | Transcription data security
US20060259957A1 (en) | 2004-11-04 | 2006-11-16 | Tam Chung M | System and method for creating a secure trusted social network
US20060121987A1 (en) | 2004-12-07 | 2006-06-08 | Microsoft Corporation | User-centric method of aggregating information sources to reinforce digital identity
US20060143067A1 (en) | 2004-12-23 | 2006-06-29 | Hermann Calabria | Vendor-driven, social-network enabled review system with flexible syndication
US20060195441A1 (en)* | 2005-01-03 | 2006-08-31 | Luc Julia | System and method for delivering content to users on a network
US20060178910A1 (en)* | 2005-01-10 | 2006-08-10 | George Eisenberger | Publisher gateway systems for collaborative data exchange, collection, monitoring and/or alerting
US20060190281A1 (en) | 2005-02-22 | 2006-08-24 | Microsoft Corporation | Systems and methods to facilitate self regulation of social networks through trading and gift exchange
US20060238380A1 (en) | 2005-04-21 | 2006-10-26 | Microsoft Corporation | Maintaining user privacy in a virtual earth environment
US20060242558A1 (en)* | 2005-04-25 | 2006-10-26 | Microsoft Corporation | Enabling users to redact portions of a document
US20070027715A1 (en)* | 2005-06-13 | 2007-02-01 | Medcommons, Inc. | Private health information interchange and related systems, methods, and devices
US20070038437A1 (en)* | 2005-08-12 | 2007-02-15 | Xerox Corporation | Document anonymization apparatus and method
US20070067405A1 (en)* | 2005-09-20 | 2007-03-22 | Eliovson Joshua M | Moderated anonymous forum
US20070218997A1 (en)* | 2006-03-17 | 2007-09-20 | Wing Cho | Systems, methods and techniques for safely and effectively coordinating video game play and other activities among multiple remote networked friends and rivals
US20070271336A1 (en)* | 2006-05-21 | 2007-11-22 | Venkat Ramaswamy | A system and method of spreading messages in a social network
US20080162202A1 (en)* | 2006-12-29 | 2008-07-03 | Richendra Khanna | Detecting inappropriate activity by analysis of user interactions
US20080172745A1 (en)* | 2007-01-12 | 2008-07-17 | Sap Ag | Systems and methods for protecting sensitive data

Non-Patent Citations (29)

* Cited by examiner, † Cited by third party
Title
"Amidst Growing Internet Privacy Concerns, Online Blog Community SoulCast.com One-Ups MySpace and LiveJournal on Privacy", PRWeb, May 24, 2006.
"Anonymity Preserved for Online Embroidery Fans", from www.eff.org, Sep. 14, 2006.
"Anonymous Communication", exitthematrix.dod.net, no later than Nov. 13, 2006.
"Category:Social networking", en.wikipedia.org, Nov. 14, 2006.
"EFF Fighting to Protect Anonymity of Video Publishers", from www.podcastingnews.com, Oct. 31, 2006.
"Experience project is social networking by what you do, not who you know", blog.wired.com, Oct. 6, 2006.
"Have you ever met anyone from here?", postings from community.channel4.com, Oct. 13, 2006.
"Have you ever met anyone from here?", postings from community.channel4.com, Sep. 4, 2006.
"Hot-P2P '06 Program", web-minds.consorzio-cini.it, Apr. 29, 2006.
"How to . . . Go Meet someone that you Met online?", postings from forum.literotica.com:81, Oct. 8, 2003.
"Meetup.com", en.wikipedia.org, Nov. 11, 2006.
"Re: [GNU-net developers] question on how to enforce anonymity", from copilotconsulting.com, Apr. 13, 2004.
"Share a Beer with a Complete Stranger", from www.liquorsnob.com, Feb. 10, 2006.
"Social network", en.wikipedia.org, Nov. 14, 2006.
"Social Networking using Google Maps mashups", googlemapsmania.blogspot.com, Sep. 1, 2006.
"Staying online and staying healthy", from www.mysocialnetwork.net, Sep. 29, 2006.
"TAS Administration Manual" excerpts, from www.engenio.com on Nov. 20, 2006.
"Tell the World Something about Yourself", pages from www.experienceproject.com, Nov. 14, 2006.
"The Experience Project", postings from www.asexuality.org, Sep. 14, 2006.
"Virtual community", en.wikipedia.org, Nov. 14, 2006.
"Welcome to the Experience Project", pages from www.experienceproject.com/learnmore.php, Nov. 14, 2006.
"BulletinBoards.com", Privacy Policy - Terms of Use, http://www.bulletinboards.com/TermsOfUse.cfm, Aug. 1, 2003.*
Joon Koh et al., "Encouraging Participation in Virtual Communities", Communications of the ACM, Feb. 2007, vol. 50 No. 2, pp. 69-73.
K.Y. Chan and S.W. Kwok, "Information Seeking Behavior in Peer-to-Peer Networks: An Exploratory Study", www.kc.tsukuba.ac.jp/dlkc/e-proceedings/papers/dlkc04pp40.pdf, 2003.
Norman Makoto Su et al., "A Bosom Buddy Afar Brings a Distant Land Near: Are Bloggers a Global Community?", www.ics.uci.edu/~normsu/papers/Su-Bosom-CT05.pdf, 2005.
Pages from www.diggwire.com, Nov. 14, 2006.
Pages from www.soulcast.com, Nov. 13, 2006.
Steven Cherry, "Virtually Private: A new Swedish network helps you hide online", IEEE Spectrum, Dec. 2006, pp. 52 ff.

Cited By (277)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US9736134B2 (en) | 2005-03-18 | 2017-08-15 | Leapfrog Enterprises, Inc. | Child-oriented computing system
US9064288B2 (en) | 2006-03-17 | 2015-06-23 | Fatdoor, Inc. | Government structures and neighborhood leads in a geo-spatial environment
US9300675B2 (en) | 2008-03-03 | 2016-03-29 | Leapfrog Enterprises, Inc. | Method and apparatus for custodial monitoring, filtering, and approving of content
US20090254656A1 (en)* | 2008-03-03 | 2009-10-08 | Kidzui, Inc | Method and apparatus for custodial monitoring, filtering, and approving of content
US8868741B2 (en)* | 2008-03-03 | 2014-10-21 | Leapfrog Enterprises, Inc. | Method and apparatus for custodial monitoring, filtering, and approving of content
US10121206B1 (en) | 2011-02-24 | 2018-11-06 | Allstate Insurance Company | Social network risk analysis
US9483795B1 (en) | 2011-02-24 | 2016-11-01 | Allstate Insurance Company | Social network risk analysis
US10861103B1 (en) | 2011-02-24 | 2020-12-08 | Allstate Insurance Company | Social network risk analysis
US8799028B1 (en)* | 2011-02-24 | 2014-08-05 | Allstate Insurance Company | Social network risk analysis
US11727496B1 (en) | 2011-02-24 | 2023-08-15 | Allstate Insurance Company | Social network risk analysis
WO2015074030A3 (en)* | 2013-11-18 | 2015-11-12 | Antoine Toffa | Enabling pseudonymous lifelike social media interactions
US9591097B2 (en) | 2013-11-18 | 2017-03-07 | Antoine Toffa | System and method for enabling pseudonymous lifelike social media interactions without using or linking to any uniquely identifiable user data and fully protecting users' privacy
US8862679B1 (en)* | 2014-04-18 | 2014-10-14 | Secret, Inc. | Displaying comments on a secret in an anonymous social networking application
US10380657B2 (en) | 2015-03-04 | 2019-08-13 | International Business Machines Corporation | Rapid cognitive mobile application review
US10373213B2 (en) | 2015-03-04 | 2019-08-06 | International Business Machines Corporation | Rapid cognitive mobile application review
CN106302092A (en)* | 2015-05-26 | 2017-01-04 | Tencent Technology (Shenzhen) Co., Ltd. | A kind of information interacting method and device
US20160352805A1 (en)* | 2015-05-28 | 2016-12-01 | Bizhive, Llc | Online reputation monitoring and intelligence gathering
US10127506B2 (en) | 2015-08-07 | 2018-11-13 | International Business Machines Corporation | Determining users for limited product deployment based on review histories
US10607233B2 (en) | 2016-01-06 | 2020-03-31 | International Business Machines Corporation | Automated review validator
US12288233B2 (en) | 2016-04-01 | 2025-04-29 | OneTrust, LLC | Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US11651402B2 (en) | 2016-04-01 | 2023-05-16 | OneTrust, LLC | Data processing systems and communication systems and methods for the efficient generation of risk assessments
US11244367B2 (en) | 2016-04-01 | 2022-02-08 | OneTrust, LLC | Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US11004125B2 (en) | 2016-04-01 | 2021-05-11 | OneTrust, LLC | Data processing systems and methods for integrating privacy information management systems with data loss prevention tools or other tools for privacy design
US10956952B2 (en) | 2016-04-01 | 2021-03-23 | OneTrust, LLC | Data processing systems and communication systems and methods for the efficient generation of privacy risk assessments
US10853859B2 (en) | 2016-04-01 | 2020-12-01 | OneTrust, LLC | Data processing systems and methods for operationalizing privacy compliance and assessing the risk of various respective privacy campaigns
US10706447B2 (en) | 2016-04-01 | 2020-07-07 | OneTrust, LLC | Data processing systems and communication systems and methods for the efficient generation of privacy risk assessments
US11146566B2 (en) | 2016-06-10 | 2021-10-12 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10565236B1 (en) | 2016-06-10 | 2020-02-18 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US10586075B2 (en) | 2016-06-10 | 2020-03-10 | OneTrust, LLC | Data processing systems for orphaned data identification and deletion and related methods
US10586072B2 (en) | 2016-06-10 | 2020-03-10 | OneTrust, LLC | Data processing systems for measuring privacy maturity within an organization
US10585968B2 (en) | 2016-06-10 | 2020-03-10 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10592692B2 (en) | 2016-06-10 | 2020-03-17 | OneTrust, LLC | Data processing systems for central consent repository and related methods
US10592648B2 (en) | 2016-06-10 | 2020-03-17 | OneTrust, LLC | Consent receipt management systems and related methods
US10594740B2 (en) | 2016-06-10 | 2020-03-17 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10599870B2 (en) | 2016-06-10 | 2020-03-24 | OneTrust, LLC | Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US10574705B2 (en) | 2016-06-10 | 2020-02-25 | OneTrust, LLC | Data processing and scanning systems for generating and populating a data inventory
US10606916B2 (en) | 2016-06-10 | 2020-03-31 | OneTrust, LLC | Data processing user interface monitoring systems and related methods
US10607028B2 (en) | 2016-06-10 | 2020-03-31 | OneTrust, LLC | Data processing systems for data testing to confirm data deletion and related methods
US12412140B2 (en) | 2016-06-10 | 2025-09-09 | OneTrust, LLC | Data processing systems and methods for bundled privacy policies
US10614246B2 (en) | 2016-06-10 | 2020-04-07 | OneTrust, LLC | Data processing systems and methods for auditing data request compliance
US10614247B2 (en) | 2016-06-10 | 2020-04-07 | OneTrust, LLC | Data processing systems for automated classification of personal information from documents and related methods
US10642870B2 (en) | 2016-06-10 | 2020-05-05 | OneTrust, LLC | Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US12381915B2 (en) | 2016-06-10 | 2025-08-05 | OneTrust, LLC | Data processing systems and methods for performing assessments and monitoring of new versions of computer code for compliance
US10678945B2 (en) | 2016-06-10 | 2020-06-09 | OneTrust, LLC | Consent receipt management systems and related methods
US10685140B2 (en) | 2016-06-10 | 2020-06-16 | OneTrust, LLC | Consent receipt management systems and related methods
US10692033B2 (en) | 2016-06-10 | 2020-06-23 | OneTrust, LLC | Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US10706176B2 (en) | 2016-06-10 | 2020-07-07 | OneTrust, LLC | Data-processing consent refresh, re-prompt, and recapture systems and related methods
US10565161B2 (en) | 2016-06-10 | 2020-02-18 | OneTrust, LLC | Data processing systems for processing data subject access requests
US10706379B2 (en) | 2016-06-10 | 2020-07-07 | OneTrust, LLC | Data processing systems for automatic preparation for remediation and related methods
US10708305B2 (en) | 2016-06-10 | 2020-07-07 | OneTrust, LLC | Automated data processing systems and methods for automatically processing requests for privacy-related information
US10705801B2 (en) | 2016-06-10 | 2020-07-07 | OneTrust, LLC | Data processing systems for identity validation of data subject access requests and related methods
US10706131B2 (en) | 2016-06-10 | 2020-07-07 | OneTrust, LLC | Data processing systems and methods for efficiently assessing the risk of privacy campaigns
US10706174B2 (en) | 2016-06-10 | 2020-07-07 | OneTrust, LLC | Data processing systems for prioritizing data subject access requests for fulfillment and related methods
US10713387B2 (en)* | 2016-06-10 | 2020-07-14 | OneTrust, LLC | Consent conversion optimization systems and related methods
US10726158B2 (en) | 2016-06-10 | 2020-07-28 | OneTrust, LLC | Consent receipt management and automated process blocking systems and related methods
US10740487B2 (en) | 2016-06-10 | 2020-08-11 | OneTrust, LLC | Data processing systems and methods for populating and maintaining a centralized database of personal data
US10754981B2 (en) | 2016-06-10 | 2020-08-25 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10762236B2 (en) | 2016-06-10 | 2020-09-01 | OneTrust, LLC | Data processing user interface monitoring systems and related methods
US10769302B2 (en) | 2016-06-10 | 2020-09-08 | OneTrust, LLC | Consent receipt management systems and related methods
US10769303B2 (en) | 2016-06-10 | 2020-09-08 | OneTrust, LLC | Data processing systems for central consent repository and related methods
US10769301B2 (en) | 2016-06-10 | 2020-09-08 | OneTrust, LLC | Data processing systems for webform crawling to map processing activities and related methods
US12299065B2 (en) | 2016-06-10 | 2025-05-13 | OneTrust, LLC | Data processing systems and methods for dynamically determining data processing consent configurations
US11222142B2 (en) | 2016-06-10 | 2022-01-11 | OneTrust, LLC | Data processing systems for validating authorization for personal data collection, storage, and processing
US10776514B2 (en) | 2016-06-10 | 2020-09-15 | OneTrust, LLC | Data processing systems for the identification and deletion of personal data in computer systems
US10776518B2 (en) | 2016-06-10 | 2020-09-15 | OneTrust, LLC | Consent receipt management systems and related methods
US10776515B2 (en) | 2016-06-10 | 2020-09-15 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10783256B2 (en) | 2016-06-10 | 2020-09-22 | OneTrust, LLC | Data processing systems for data transfer risk identification and related methods
US10791150B2 (en) | 2016-06-10 | 2020-09-29 | OneTrust, LLC | Data processing and scanning systems for generating and populating a data inventory
US10796020B2 (en) | 2016-06-10 | 2020-10-06 | OneTrust, LLC | Consent receipt management systems and related methods
US10796260B2 (en) | 2016-06-10 | 2020-10-06 | OneTrust, LLC | Privacy management systems and methods
US10798133B2 (en) | 2016-06-10 | 2020-10-06 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10803198B2 (en) | 2016-06-10 | 2020-10-13 | OneTrust, LLC | Data processing systems for use in automatically generating, populating, and submitting data subject access requests
US12216794B2 (en) | 2016-06-10 | 2025-02-04 | OneTrust, LLC | Data processing systems and methods for synching privacy-related user consent across multiple computing devices
US10805354B2 (en) | 2016-06-10 | 2020-10-13 | OneTrust, LLC | Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US10803199B2 (en) | 2016-06-10 | 2020-10-13 | OneTrust, LLC | Data processing and communications systems and methods for the efficient implementation of privacy by design
US10803200B2 (en) | 2016-06-10 | 2020-10-13 | OneTrust, LLC | Data processing systems for processing and managing data subject access in a distributed environment
US10803097B2 (en) | 2016-06-10 | 2020-10-13 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US10839102B2 (en) | 2016-06-10 | 2020-11-17 | OneTrust, LLC | Data processing systems for identifying and modifying processes that are subject to data subject access requests
US10846261B2 (en) | 2016-06-10 | 2020-11-24 | OneTrust, LLC | Data processing systems for processing data subject access requests
US10848523B2 (en) | 2016-06-10 | 2020-11-24 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10846433B2 (en) | 2016-06-10 | 2020-11-24 | OneTrust, LLC | Data processing consent management systems and related methods
US10567439B2 (en) | 2016-06-10 | 2020-02-18 | OneTrust, LLC | Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US10853501B2 (en) | 2016-06-10 | 2020-12-01 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US10564936B2 (en) | 2016-06-10 | 2020-02-18 | OneTrust, LLC | Data processing systems for identity validation of data subject access requests and related methods
US10867007B2 (en) | 2016-06-10 | 2020-12-15 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10867072B2 (en) | 2016-06-10 | 2020-12-15 | OneTrust, LLC | Data processing systems for measuring privacy maturity within an organization
US10873606B2 (en) | 2016-06-10 | 2020-12-22 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US10878127B2 (en) | 2016-06-10 | 2020-12-29 | OneTrust, LLC | Data subject access request processing systems and related methods
US10885485B2 (en) | 2016-06-10 | 2021-01-05 | OneTrust, LLC | Privacy management systems and methods
US10896394B2 (en) | 2016-06-10 | 2021-01-19 | OneTrust, LLC | Privacy management systems and methods
US10909488B2 (en) | 2016-06-10 | 2021-02-02 | OneTrust, LLC | Data processing systems for assessing readiness for responding to privacy-related incidents
US10909265B2 (en) | 2016-06-10 | 2021-02-02 | OneTrust, LLC | Application privacy scanning systems and related methods
US10929559B2 (en) | 2016-06-10 | 2021-02-23 | OneTrust, LLC | Data processing systems for data testing to confirm data deletion and related methods
US10944725B2 (en) | 2016-06-10 | 2021-03-09 | OneTrust, LLC | Data processing systems and methods for using a data model to select a target data asset in a data migration
US10949567B2 (en) | 2016-06-10 | 2021-03-16 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10949544B2 (en) | 2016-06-10 | 2021-03-16 | OneTrust, LLC | Data processing systems for data transfer risk identification and related methods
US10949565B2 (en) | 2016-06-10 | 2021-03-16 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US10949170B2 (en) | 2016-06-10 | 2021-03-16 | OneTrust, LLC | Data processing systems for integration of consumer feedback with data subject access requests and related methods
US11222139B2 (en) | 2016-06-10 | 2022-01-11 | OneTrust, LLC | Data processing systems and methods for automatic discovery and assessment of mobile software development kits
US12204564B2 (en) | 2016-06-10 | 2025-01-21 | OneTrust, LLC | Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US10970675B2 (en) | 2016-06-10 | 2021-04-06 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US12190330B2 (en) | 2016-06-10 | 2025-01-07 | OneTrust, LLC | Data processing systems for identity validation for consumer rights requests and related methods
US10970371B2 (en) | 2016-06-10 | 2021-04-06 | OneTrust, LLC | Consent receipt management systems and related methods
US10972509B2 (en) | 2016-06-10 | 2021-04-06 | OneTrust, LLC | Data processing and scanning systems for generating and populating a data inventory
US10984132B2 (en) | 2016-06-10 | 2021-04-20 | OneTrust, LLC | Data processing systems and methods for populating and maintaining a centralized database of personal data
US10997315B2 (en) | 2016-06-10 | 2021-05-04 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US10997318B2 (en) | 2016-06-10 | 2021-05-04 | OneTrust, LLC | Data processing systems for generating and populating a data inventory for processing data access requests
US10997542B2 (en) | 2016-06-10 | 2021-05-04 | OneTrust, LLC | Privacy management systems and methods
US10565397B1 (en) | 2016-06-10 | 2020-02-18 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US11023616B2 (en) | 2016-06-10 | 2021-06-01 | OneTrust, LLC | Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11023842B2 (en) | 2016-06-10 | 2021-06-01 | OneTrust, LLC | Data processing systems and methods for bundled privacy policies
US11025675B2 (en) | 2016-06-10 | 2021-06-01 | OneTrust, LLC | Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US11030563B2 (en) | 2016-06-10 | 2021-06-08 | OneTrust, LLC | Privacy management systems and methods
US11030274B2 (en) | 2016-06-10 | 2021-06-08 | OneTrust, LLC | Data processing user interface monitoring systems and related methods
US11030327B2 (en) | 2016-06-10 | 2021-06-08 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US11038925B2 (en) | 2016-06-10 | 2021-06-15 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11036771B2 (en) | 2016-06-10 | 2021-06-15 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US11036674B2 (en) | 2016-06-10 | 2021-06-15 | OneTrust, LLC | Data processing systems for processing data subject access requests
US11036882B2 (en) | 2016-06-10 | 2021-06-15 | OneTrust, LLC | Data processing systems for processing and managing data subject access in a distributed environment
US11057356B2 (en) | 2016-06-10 | 2021-07-06 | OneTrust, LLC | Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US11062051B2 (en) | 2016-06-10 | 2021-07-13 | OneTrust, LLC | Consent receipt management systems and related methods
US11068618B2 (en) | 2016-06-10 | 2021-07-20 | OneTrust, LLC | Data processing systems for central consent repository and related methods
US11070593B2 (en) | 2016-06-10 | 2021-07-20 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US12164667B2 (en) | 2016-06-10 | 2024-12-10 | OneTrust, LLC | Application privacy scanning systems and related methods
US11074367B2 (en) | 2016-06-10 | 2021-07-27 | OneTrust, LLC | Data processing systems for identity validation for consumer rights requests and related methods
US11087260B2 (en) | 2016-06-10 | 2021-08-10 | OneTrust, LLC | Data processing systems and methods for customizing privacy training
US11100444B2 (en) | 2016-06-10 | 2021-08-24 | OneTrust, LLC | Data processing systems and methods for providing training in a vendor procurement process
US11100445B2 (en) | 2016-06-10 | 2021-08-24 | OneTrust, LLC | Data processing systems for assessing readiness for responding to privacy-related incidents
US11113416B2 (en) | 2016-06-10 | 2021-09-07 | OneTrust, LLC | Application privacy scanning systems and related methods
US11122011B2 (en) | 2016-06-10 | 2021-09-14 | OneTrust, LLC | Data processing systems and methods for using a data model to select a target data asset in a data migration
US11120162B2 (en) | 2016-06-10 | 2021-09-14 | OneTrust, LLC | Data processing systems for data testing to confirm data deletion and related methods
US11120161B2 (en) | 2016-06-10 | 2021-09-14 | OneTrust, LLC | Data subject access request processing systems and related methods
US11126748B2 (en) | 2016-06-10 | 2021-09-21 | OneTrust, LLC | Data processing consent management systems and related methods
US11134086B2 (en)* | 2016-06-10 | 2021-09-28 | OneTrust, LLC | Consent conversion optimization systems and related methods
US11138242B2 (en) | 2016-06-10 | 2021-10-05 | OneTrust, LLC | Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11138299B2 (en) | 2016-06-10 | 2021-10-05 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US11138336B2 (en) | 2016-06-10 | 2021-10-05 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US11138318B2 (en) | 2016-06-10 | 2021-10-05 | OneTrust, LLC | Data processing systems for data transfer risk identification and related methods
US12158975B2 (en) | 2016-06-10 | 2024-12-03 | OneTrust, LLC | Data processing consent sharing systems and related methods
US10564935B2 (en) | 2016-06-10 | 2020-02-18 | OneTrust, LLC | Data processing systems for integration of consumer feedback with data subject access requests and related methods
US11144622B2 (en) | 2016-06-10 | 2021-10-12 | OneTrust, LLC | Privacy management systems and methods
US11144670B2 (en) | 2016-06-10 | 2021-10-12 | OneTrust, LLC | Data processing systems for identifying and modifying processes that are subject to data subject access requests
US11151233B2 (en) | 2016-06-10 | 2021-10-19 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US12147578B2 (en) | 2016-06-10 | 2024-11-19 | OneTrust, LLC | Consent receipt management systems and related methods
US11157600B2 (en) | 2016-06-10 | 2021-10-26 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US11182501B2 (en) | 2016-06-10 | 2021-11-23 | OneTrust, LLC | Data processing systems for fulfilling data subject access requests and related methods
US11188615B2 (en) | 2016-06-10 | 2021-11-30 | OneTrust, LLC | Data processing consent capture systems and related methods
US11188862B2 (en) | 2016-06-10 | 2021-11-30 | OneTrust, LLC | Privacy management systems and methods
US11195134B2 (en) | 2016-06-10 | 2021-12-07 | OneTrust, LLC | Privacy management systems and methods
US11200341B2 (en) | 2016-06-10 | 2021-12-14 | OneTrust, LLC | Consent receipt management systems and related methods
US11210420B2 (en) | 2016-06-10 | 2021-12-28 | OneTrust, LLC | Data subject access request processing systems and related methods
US10572686B2 (en) | 2016-06-10 | 2020-02-25 | OneTrust, LLC | Consent receipt management systems and related methods
US10776517B2 (en) | 2016-06-10 | 2020-09-15 | OneTrust, LLC | Data processing systems for calculating and communicating cost of fulfilling data subject access requests and related methods
US11222309B2 (en) | 2016-06-10 | 2022-01-11 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US11227247B2 (en) | 2016-06-10 | 2022-01-18 | OneTrust, LLC | Data processing systems and methods for bundled privacy policies
US11228620B2 (en) | 2016-06-10 | 2022-01-18 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11238390B2 (en) | 2016-06-10 | 2022-02-01 | OneTrust, LLC | Privacy management systems and methods
US11240273B2 (en) | 2016-06-10 | 2022-02-01 | OneTrust, LLC | Data processing and scanning systems for generating and populating a data inventory
US11244072B2 (en) | 2016-06-10 | 2022-02-08 | OneTrust, LLC | Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11244071B2 (en) | 2016-06-10 | 2022-02-08 | OneTrust, LLC | Data processing systems for use in automatically generating, populating, and submitting data subject access requests
US20200004986A1 (en)* | 2016-06-10 | 2020-01-02 | OneTrust, LLC | Consent conversion optimization systems and related methods
US11256777B2 (en) | 2016-06-10 | 2022-02-22 | OneTrust, LLC | Data processing user interface monitoring systems and related methods
US11277448B2 (en) | 2016-06-10 | 2022-03-15 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11295316B2 (en) | 2016-06-10 | 2022-04-05 | OneTrust, LLC | Data processing systems for identity validation for consumer rights requests and related methods
US11294939B2 (en) | 2016-06-10 | 2022-04-05 | OneTrust, LLC | Data processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11301796B2 (en) | 2016-06-10 | 2022-04-12 | OneTrust, LLC | Data processing systems and methods for customizing privacy training
US11301589B2 (en) | 2016-06-10 | 2022-04-12 | OneTrust, LLC | Consent receipt management systems and related methods
US11308435B2 (en) | 2016-06-10 | 2022-04-19 | OneTrust, LLC | Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11328240B2 (en) | 2016-06-10 | 2022-05-10 | OneTrust, LLC | Data processing systems for assessing readiness for responding to privacy-related incidents
US11328092B2 (en) | 2016-06-10 | 2022-05-10 | OneTrust, LLC | Data processing systems for processing and managing data subject access in a distributed environment
US11336697B2 (en) | 2016-06-10 | 2022-05-17 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US11334682B2 (en) | 2016-06-10 | 2022-05-17 | OneTrust, LLC | Data subject access request processing systems and related methods
US11334681B2 (en) | 2016-06-10 | 2022-05-17 | OneTrust, LLC | Application privacy scanning systems and related methods
US11343284B2 (en) | 2016-06-10 | 2022-05-24 | OneTrust, LLC | Data processing systems and methods for performing privacy assessments and monitoring of new versions of computer code for privacy compliance
US11341447B2 (en) | 2016-06-10 | 2022-05-24 | OneTrust, LLC | Privacy management systems and methods
US11347889B2 (en) | 2016-06-10 | 2022-05-31 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US11354435B2 (en) | 2016-06-10 | 2022-06-07 | OneTrust, LLC | Data processing systems for data testing to confirm data deletion and related methods
US11354434B2 (en) | 2016-06-10 | 2022-06-07 | OneTrust, LLC | Data processing systems for verification of consent and notice processing and related methods
US11361057B2 (en) | 2016-06-10 | 2022-06-14 | OneTrust, LLC | Consent receipt management systems and related methods
US11366909B2 (en) | 2016-06-10 | 2022-06-21 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US11366786B2 (en) | 2016-06-10 | 2022-06-21 | OneTrust, LLC | Data processing systems for processing data subject access requests
US12136055B2 (en) | 2016-06-10 | 2024-11-05 | OneTrust, LLC | Data processing systems for identifying, assessing, and remediating data processing risks using data modeling techniques
US11392720B2 (en) | 2016-06-10 | 2022-07-19 | OneTrust, LLC | Data processing systems for verification of consent and notice processing and related methods
US12118121B2 (en) | 2016-06-10 | 2024-10-15 | OneTrust, LLC | Data subject access request processing systems and related methods
US11403377B2 (en) | 2016-06-10 | 2022-08-02 | OneTrust, LLC | Privacy management systems and methods
US12086748B2 (en) | 2016-06-10 | 2024-09-10 | OneTrust, LLC | Data processing systems for assessing readiness for responding to privacy-related incidents
US11409908B2 (en) | 2016-06-10 | 2022-08-09 | OneTrust, LLC | Data processing systems and methods for populating and maintaining a centralized database of personal data
US11410106B2 (en) | 2016-06-10 | 2022-08-09 | OneTrust, LLC | Privacy management systems and methods
US11416576B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Data processing consent capture systems and related methods
US11416636B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Data processing consent management systems and related methods
US11416589B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US11416634B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Consent receipt management systems and related methods
US11416798B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Data processing systems and methods for providing training in a vendor procurement process
US11416590B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Data processing and scanning systems for assessing vendor risk
US11416109B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Automated data processing systems and methods for automatically processing data subject access requests using a chatbot
US11418492B2 (en) | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Data processing systems and methods for using a data model to select a target data asset in a data migration
US11418516B2 (en)* | 2016-06-10 | 2022-08-16 | OneTrust, LLC | Consent conversion optimization systems and related methods
US11438386B2 (en) | 2016-06-10 | 2022-09-06 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US12052289B2 (en) | 2016-06-10 | 2024-07-30 | OneTrust, LLC | Data processing systems for data-transfer risk identification, cross-border visualization generation, and related methods
US12045266B2 (en) | 2016-06-10 | 2024-07-23 | OneTrust, LLC | Data processing systems for generating and populating a data inventory
US12026651B2 (en) | 2016-06-10 | 2024-07-02 | OneTrust, LLC | Data processing systems and methods for providing training in a vendor procurement process
US11449633B2 (en) | 2016-06-10 | 2022-09-20 | OneTrust, LLC | Data processing systems and methods for automatic discovery and assessment of mobile software development kits
US11461722B2 (en) | 2016-06-10 | 2022-10-04 | OneTrust, LLC | Questionnaire response automation for compliance management
US11461500B2 (en)2016-06-102022-10-04OneTrust, LLCData processing systems for cookie compliance testing with website scanning and related methods
US11468386B2 (en)2016-06-102022-10-11OneTrust, LLCData processing systems and methods for bundled privacy policies
US11468196B2 (en)2016-06-102022-10-11OneTrust, LLCData processing systems for validating authorization for personal data collection, storage, and processing
US11960564B2 (en)2016-06-102024-04-16OneTrust, LLCData processing systems and methods for automatically blocking the use of tracking tools
US11475136B2 (en)2016-06-102022-10-18OneTrust, LLCData processing systems for data transfer risk identification and related methods
US11481710B2 (en)2016-06-102022-10-25OneTrust, LLCPrivacy management systems and methods
US11488085B2 (en)2016-06-102022-11-01OneTrust, LLCQuestionnaire response automation for compliance management
US11921894B2 (en)2016-06-102024-03-05OneTrust, LLCData processing systems for generating and populating a data inventory for processing data access requests
US20220360590A1 (en)*2016-06-102022-11-10OneTrust, LLCConsent conversion optimization systems and related methods
US11520928B2 (en)2016-06-102022-12-06OneTrust, LLCData processing systems for generating personal data receipts and related methods
US11868507B2 (en)2016-06-102024-01-09OneTrust, LLCData processing systems for cookie compliance testing with website scanning and related methods
US11847182B2 (en)2016-06-102023-12-19OneTrust, LLCData processing consent capture systems and related methods
US11544405B2 (en)2016-06-102023-01-03OneTrust, LLCData processing systems for verification of consent and notice processing and related methods
US11727141B2 (en)2016-06-102023-08-15OneTrust, LLCData processing systems and methods for synching privacy-related user consent across multiple computing devices
US11675929B2 (en)2016-06-102023-06-13OneTrust, LLCData processing consent sharing systems and related methods
US11544667B2 (en)2016-06-102023-01-03OneTrust, LLCData processing systems for generating and populating a data inventory
US11550897B2 (en)2016-06-102023-01-10OneTrust, LLCData processing and scanning systems for assessing vendor risk
US11551174B2 (en)2016-06-102023-01-10OneTrust, LLCPrivacy management systems and methods
US11556672B2 (en)2016-06-102023-01-17OneTrust, LLCData processing systems for verification of consent and notice processing and related methods
US11558429B2 (en)2016-06-102023-01-17OneTrust, LLCData processing and scanning systems for generating and populating a data inventory
US11651106B2 (en)2016-06-102023-05-16OneTrust, LLCData processing systems for fulfilling data subject access requests and related methods
US11562097B2 (en)2016-06-102023-01-24OneTrust, LLCData processing systems for central consent repository and related methods
US11586762B2 (en)2016-06-102023-02-21OneTrust, LLCData processing systems and methods for auditing data request compliance
US11586700B2 (en)2016-06-102023-02-21OneTrust, LLCData processing systems and methods for automatically blocking the use of tracking tools
US11651104B2 (en)2016-06-102023-05-16OneTrust, LLCConsent receipt management systems and related methods
US11645418B2 (en)2016-06-102023-05-09OneTrust, LLCData processing systems for data testing to confirm data deletion and related methods
US11609939B2 (en)2016-06-102023-03-21OneTrust, LLCData processing systems and methods for automatically detecting and documenting privacy-related aspects of computer software
US11645353B2 (en)2016-06-102023-05-09OneTrust, LLCData processing consent capture systems and related methods
US11636171B2 (en)2016-06-102023-04-25OneTrust, LLCData processing user interface monitoring systems and related methods
US11625502B2 (en)2016-06-102023-04-11OneTrust, LLCData processing systems for identifying and modifying processes that are subject to data subject access requests
US10402630B2 (en)*2017-03-102019-09-03Sony Interactive Entertainment LLCMaintaining privacy for multiple users when serving media to a group
US10331436B2 (en)2017-03-202019-06-25International Business Machines CorporationSmart reviews for applications in application stores
US11663359B2 (en)2017-06-162023-05-30OneTrust, LLCData processing systems for identifying whether cookies contain personally identifying information
US11373007B2 (en)2017-06-162022-06-28OneTrust, LLCData processing systems for identifying whether cookies contain personally identifying information
US10970417B1 (en)2017-09-012021-04-06Workday, Inc.Differential privacy security for benchmarking
US11403421B2 (en)2017-09-012022-08-02Workday, Inc.Security system for benchmark access
US10606906B1 (en)*2017-09-012020-03-31Workday, Inc.Summary based privacy security for benchmarking
US10769298B1 (en)2017-09-012020-09-08Workday, Inc.Security system for benchmark access
US11853461B2 (en)2017-09-012023-12-26Workday, Inc.Differential privacy security for benchmarking
US10416993B2 (en)2017-10-062019-09-17International Business Machines CorporationMobile application update manager
US11593523B2 (en)2018-09-072023-02-28OneTrust, LLCData processing systems for orphaned data identification and deletion and related methods
US11544409B2 (en)2018-09-072023-01-03OneTrust, LLCData processing systems and methods for automatically protecting sensitive data within privacy management systems
US10803202B2 (en)2018-09-072020-10-13OneTrust, LLCData processing systems for orphaned data identification and deletion and related methods
US11144675B2 (en)2018-09-072021-10-12OneTrust, LLCData processing systems and methods for automatically protecting sensitive data within privacy management systems
US10963591B2 (en)2018-09-072021-03-30OneTrust, LLCData processing systems for orphaned data identification and deletion and related methods
US11157654B2 (en)2018-09-072021-10-26OneTrust, LLCData processing systems for orphaned data identification and deletion and related methods
US11947708B2 (en)2018-09-072024-04-02OneTrust, LLCData processing systems and methods for automatically protecting sensitive data within privacy management systems
US12153636B2 (en)*2018-11-132024-11-26Bizhive, LlcOnline reputation monitoring and intelligence gathering
US20200151278A1 (en)*2018-11-132020-05-14Bizhive, LlcOnline reputation monitoring and intelligence gathering
US11076038B2 (en)2019-12-312021-07-27Bye! Accident LlcReviewing message-based communications via a keyboard application
US12353405B2 (en)2020-07-082025-07-08OneTrust, LLCSystems and methods for targeted data discovery
US11797528B2 (en)2020-07-082023-10-24OneTrust, LLCSystems and methods for targeted data discovery
US11444976B2 (en)2020-07-282022-09-13OneTrust, LLCSystems and methods for automatically blocking the use of tracking tools
US11968229B2 (en)2020-07-282024-04-23OneTrust, LLCSystems and methods for automatically blocking the use of tracking tools
US11475165B2 (en)2020-08-062022-10-18OneTrust, LLCData processing systems and methods for automatically redacting unstructured data from a data subject access request
US11436373B2 (en)2020-09-152022-09-06OneTrust, LLCData processing systems and methods for detecting tools for the automatic blocking of consent requests
US11704440B2 (en)2020-09-152023-07-18OneTrust, LLCData processing systems and methods for preventing execution of an action documenting a consent rejection
US11526624B2 (en)2020-09-212022-12-13OneTrust, LLCData processing systems and methods for automatically detecting target data transfers and target data processing
US12265896B2 (en)2020-10-052025-04-01OneTrust, LLCSystems and methods for detecting prejudice bias in machine-learning models
US12277232B2 (en)2020-11-062025-04-15OneTrust, LLCSystems and methods for identifying data processing activities based on data discovery results
US11397819B2 (en)2020-11-062022-07-26OneTrust, LLCSystems and methods for identifying data processing activities based on data discovery results
US11615192B2 (en)2020-11-062023-03-28OneTrust, LLCSystems and methods for identifying data processing activities based on data discovery results
US12259882B2 (en)2021-01-252025-03-25OneTrust, LLCSystems and methods for discovery, classification, and indexing of data in a native computing system
US11687528B2 (en)2021-01-252023-06-27OneTrust, LLCSystems and methods for discovery, classification, and indexing of data in a native computing system
US11442906B2 (en)2021-02-042022-09-13OneTrust, LLCManaging custom attributes for domain objects defined within microservices
US11494515B2 (en)2021-02-082022-11-08OneTrust, LLCData processing systems and methods for anonymizing data samples in classification analysis
US11601464B2 (en)2021-02-102023-03-07OneTrust, LLCSystems and methods for mitigating risks of third-party computing system functionality integration into a first-party computing system
US11775348B2 (en)2021-02-172023-10-03OneTrust, LLCManaging custom workflows for domain objects defined within microservices
US11546661B2 (en)2021-02-182023-01-03OneTrust, LLCSelective redaction of media content
US11533315B2 (en)2021-03-082022-12-20OneTrust, LLCData transfer discovery and analysis systems and related methods
US11562078B2 (en)2021-04-162023-01-24OneTrust, LLCAssessing and managing computational risk involved with integrating third party computing functionality within a computing system
US11816224B2 (en)2021-04-162023-11-14OneTrust, LLCAssessing and managing computational risk involved with integrating third party computing functionality within a computing system
US12153704B2 (en)2021-08-052024-11-26OneTrust, LLCComputing platform for facilitating data exchange among computing environments
US11620142B1 (en)2022-06-032023-04-04OneTrust, LLCGenerating and customizing user interfaces for demonstrating functions of interactive user environments

Also Published As

Publication numberPublication date
US20140195613A1 (en)2014-07-10
US8069467B1 (en)2011-11-29

Similar Documents

PublicationPublication DateTitle
US8578501B1 (en)Anonymous social networking with community-based privacy reviews obtained by members
US10491558B2 (en)Systems and methods for enabling dialog amongst different participant groups with variable and association-based privacy
Eynon et al.The ethics of internet research
US8387122B2 (en)Access control by testing for shared knowledge
KR101527476B1 (en)Evaluating claims in a social networking system
VinsonThe blurred boundaries of social networking in the legal field: Just face it
McPeakThe Facebook digital footprint: Paving fair and consistent pathways to civil discovery of social media data
US20130097261A1 (en)Safe and monitored virtual world
Reichel et al.'I have too much respect for my elders': Understanding South African Mobile Users' Perceptions of Privacy and Current Behaviors on Facebook and {WhatsApp}
Shields et al.Comparing the social media in the United States and BRIC nations, and the challenges faced in international selection
Wagner et al.Hide and seek: location sharing practices with social media
Tominaga et al.How self-disclosure in Twitter profiles relate to anonymity consciousness and usage objectives: a cross-cultural study
HartzogSocial Data
HerbertWorkplace consequences of electronic exhibitionism and voyeurism
DevPrivacy-preserving conversational interfaces
Lou et al.Behind the Same Mask: Understanding the Practice of Spontaneous Collective Anonymity on Chinese Social Platforms
GosseMore barriers than solutions: Women’s experiences of support with online abuse
Wang et al." Is Reporting Worth the Sacrifice of Revealing What I've Sent?": Privacy Considerations When Reporting on {End-to-End} Encrypted Platforms
Wang et al.From Inquisitorial to Adversarial: Using Legal Theory to Redesign Online Reporting Systems
SessionsExploring Personal Protection Strategies of Cybersecurity Specialists in Social Media
Kent IIIUser Perceptions of the Impact of Anonymity on Collaboration Using Enterprise Social Media
Broughton et al.Workplaces and social networking
Metzger-RiftkinOpen, Out, and Online: Privacy Management and Queer Relating in Consensually Nonmonogamous Relationships
ROSLI et al.INFORMATION REVELATION AND INTERNET PRIVACY ON MOBILE SOCIAL NETWORK SITE (FACEBOOK): A CASE OF UNDERGRADUATE STUDENTS IN SCHOOL OF BUSINESS MANAGEMENT, UUM
WildermuthEvaluation of Hennepin County's Community Productive Day Construction Pathway Program

Legal Events

DateCodeTitleDescription
REMIMaintenance fee reminder mailed
LAPSLapse for failure to pay maintenance fees

Free format text:PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.)

STCHInformation on status: patent discontinuation

Free format text:PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FPLapsed due to failure to pay maintenance fee

Effective date:20171105


