Is It Harmful? Re-examining Privacy Concerns


Abstract

The increasing popularity of the interconnected devices that we rely on for day-to-day activities exposes people to various privacy harms. This paper presents findings from an empirical investigation of privacy concerns. The study revealed that people, regardless of their diversity, perceive privacy harms through generic and simplified models, not individually as suggested in Solove’s framework. Additionally, the results identified differences in privacy concerns related to information disclosure, protection behavior, and demographics. The findings may benefit privacy and system designers, helping to ensure that policies and digital systems match people’s privacy expectations, decreasing risks and harms.


1 Introduction

Widespread Internet availability and access to various devices, from PCs through mobile phones to smart devices, have enabled an ecosystem of interconnected applications. People adopt these technologies and feed them with large amounts of data. Such applications assist people with most of their daily activities, including socializing, healthcare, financial transactions, work, and more. People voluntarily, and sometimes unknowingly, contribute data to Internet-based applications, which may expose them to privacy risks, violations, and harms.

Due to the increasing number of security breaches, digital privacy has become a subject of public debate. News about data leaks and their potential effects frequently appears in the media, informing the audience about potential privacy risks. Since privacy violations are at the center of public interest, governments and policymakers have introduced legal guidelines and regulations aiming to protect personal data, such as the General Data Protection Regulation (GDPR) in Europe [42] or the FTC requirements in the USA [49]. Simultaneously, academic research has produced multiple studies on online privacy, demonstrating that people are concerned about their data but nevertheless trade it for the potential benefits offered by applications [4,16,55]. Despite the efforts of researchers and policymakers, as well as the increased privacy awareness raised in the media, people’s attitudes and behaviors remain unchanged. Regardless of their concerns, people provide personal information to online companies to use their services, maintain social interactions, improve well-being, and more.

The aim of this study is to investigate privacy perceptions and to re-examine some privacy behaviors. The primary contribution of this research is a novel instrument to measure privacy attitudes, the Privacy Harms Concerns (PHC) scale. Following the recommendation of past research [28], we used the privacy harms identified by Daniel Solove as a foundation for the scale’s development [48] (Table 1). The goal of this research was to identify how people perceive privacy concerns related to harms (labeled privacy concerns throughout the article for consistency). The results confirmed that people, in spite of their diversity, tend to have a rather comprehensive and simplified view of privacy concerns, perceiving their severity and importance in a similar manner. Despite these general tendencies, we identified differences in privacy perceptions, information disclosure, and protection behaviors. Additionally, the findings demonstrate a potential for demographic differences in privacy concerns. Overall, the results contribute to a further understanding of people’s privacy attitudes and behaviors.

Table 1. Typology of privacy harms according to Solove’s framework.

2 Related Work

2.1 Privacy Attitudes: Concerns and Harms

According to Westin, privacy concern is the intention to protect personal information from others [10]. Thus, it carries a negative weight and should result in preventive or protective actions. As defined by Campbell, information privacy concern is a subjective notion centered on the input, use, and control of data [32]. Therefore, information concern relates to the flow of data between the user and the involved data processors. Online privacy research has recognized various antecedents of privacy concerns, such as trust, risk perception, previous privacy experience, privacy awareness, personality traits, and demographic differences [8,21,26,31,46,52]. Some of these studies investigated the influence of concerns on privacy behaviors, but the results are inconsistent. Some studies show that despite concerns people disclose information, suggesting that it is a natural consequence of being part of a community [31]. On the other hand, a large volume of research illustrates that, regardless of privacy concerns, people tend to share their information and base their decisions on a cost-benefit trade-off [2,18,44]. This so-called privacy paradox is frequently explained by factors such as information asymmetry [1,2,5] or psychological biases and heuristics [9,12,20,25,29].

To the best of our knowledge, privacy concerns have not been investigated from the perspective of privacy harms. Similarly to the notion of privacy itself, there is no clear definition of privacy harm. However, legal scholars have tried to provide a coherent explanation of the term. For instance, Solove identified a privacy problem as a result of harm, claiming that harms do not have to be physical or emotional; they can occur by chilling socially beneficial behavior (for example, free speech and association) or by leading to power imbalances that adversely affect social structure (for example, excessive executive power) [50]. Similarly, Calo defines harm as a conceptualized negative consequence of privacy violation [11]. Nevertheless, the most comprehensive definition of privacy harms we found was provided by researchers investigating smart grid privacy, De & Le Métayer. They defined harm as the negative impact on a data subject, or a group of data subjects, or the society as a whole, from the standpoint of physical, mental, or financial well-being or reputation, dignity, freedom, acceptance in society, self-actualization, domestic life, freedom of expression, or any fundamental right, resulting from one or more feared events [15]. In this research, we follow this definition and consider harms as a multidimensional notion.

Previous research has produced multiple scales measuring privacy concerns. Such scales are constructed in various ways, for example by asking people directly about their concerns, or by treating concerns as latent variables or as moderators [38]. For instance, Smith et al. [47] developed the Concern for Information Privacy (CFIP) scale to explore the structure of concerns. The study identified four dimensions of privacy concerns: improper access, unauthorized secondary use, error, and collection. Malhotra et al. developed the Internet Users’ Information Privacy Concerns (IUIPC) scale, identifying three dimensions: collection, control, and awareness of privacy practices [32]. According to their research, consumers perceive awareness of, and control over, the data stored by online companies as the most important. The IUIPC scale can be applied to privacy research in various contexts. Despite the coherent nature of this scale, it appears organization- and consumer-oriented; as the authors put it, IUIPC is a representation of online consumers’ concerns for information privacy. Buchanan et al. developed another privacy concerns scale, measuring individual privacy issues by asking directly about concerns, for instance regarding identity theft, access to medical records, etc. [10].

Considering the definitions of privacy harms and the past research, we want to improve the understanding of attitudes and re-examine the dimensionality of privacy concerns. Hence, our first research question:

  • RQ 1 How do people perceive privacy harms concerns?

  • RQ 1.1 What are the main dimensions of privacy concerns?

  • RQ 1.2 Are some concerns perceived as more severe than others?

2.2 Privacy Behaviors

To cross-validate findings about privacy concerns, some studies have examined their relationship with other attitudinal or behavioral factors, such as information disclosure or protection behavior.

According to the literature, information disclosure behavior varies depending on psychological states [40], risk perceptions [14,56], trust, and more. Several studies have explored the relationship between privacy concerns and information disclosure. For example, research showed significant effects of privacy concerns on information disclosure, influenced by psychological biases such as optimism bias, over-disclosure, and others [29,39,52]. Similarly, researchers found evidence of irrational behavior, with people disclosing data despite knowing about the potential risks [17].

In this research, we do not examine factors influencing privacy concerns or the direction of the relationship between attitude and behavior. Instead, we focus on the variation of privacy concerns between people who do and do not disclose sensitive or non-sensitive information. Hence, our next research question:

  • RQ 2 Is there a relationship between privacy concerns and privacy behavior?

  • RQ 2.1 Do privacy concerns vary among people disclosing and not-disclosing non-sensitive information?

  • RQ 2.2 Do privacy concerns vary among people disclosing and not-disclosing sensitive information?

Past privacy research identified control as an important factor influencing privacy behaviors [5,19]. To achieve control over online information disclosure, people apply different protection measures. Some use technical protections, such as anti-malware or anti-virus software, ad blockers, or other privacy-enhancing technologies. Others may be more careful about their physical privacy (hiding PINs, shredding documentation), limit the information provided to social networks (e.g., reducing the audience of posts, limiting profile visibility), decrease the number of online profiles, or even withdraw from online presence entirely. The relationship between privacy concerns and protection is unclear. Some research claims that such a relationship exists; however, the correlations are low, and people less concerned about privacy use more protective measures [3,37].

Considering the past research demonstrating that a relationship between concerns and behavior exists, we ask the following questions:

  • RQ 2.3 Is there a relationship between people’s privacy concerns and general privacy caution?

  • RQ 2.4 Is there a relationship between people’s privacy concerns and technical privacy protection?

2.3 Demographics

To assess individual differences in privacy concerns, some researchers have used demographics, such as geographic/cultural differences, age, education, or gender [13,41]. However, the results of studies investigating demographic dependencies are inconclusive. For instance, some studies claim that gender affects privacy perceptions and that females are more concerned about their data than males. However, some of these findings show that the impact of gender on privacy attitudes and behaviors is indirect or insignificant [6,23,36]. Regarding age, there seems to be a general tendency for older generations to be more concerned about their privacy than younger ones [34,51]. Nevertheless, this does not mean that younger people ignore privacy. On the contrary, research demonstrates that younger people use technical protection measures to better manage their privacy [33].

Previous research has associated privacy concerns with geographic/cultural background [7,54]. The geographic divide was confirmed in a qualitative study of seven European countries, identifying the main privacy concerns influencing information disclosure and a variety of privacy fears among different nationalities [34]. Similarly, other studies showed differences between respondents from North America and Europe [45], and from France and Hong Kong [22]. Such differences were attributed to cultural dimensions, for instance, assertiveness or gender egalitarianism [41,53].

Considering the findings of previous research, we aim to examine whether there are significant demographic differences in privacy concerns and behaviors among the participants of our study. Hence, our last research question:

  • RQ 3 Do privacy concerns differ depending on the demographic background?

3 Method

An online survey was created to answer the research questions. It contained 80 questions divided into thematic sections, such as participants’ demographics, opinions related to data collection and processing, security, identity, and personal questions. To measure the responses, we used a mixed design, including questions collecting responses on a scale ranging from 0 to 100 (strongly disagree/strongly agree; never/always) and multiple-choice questions.

Before participating in the survey, respondents were presented with an informed consent form explaining what type of information would be requested during the survey, what the purpose of the study was, and whom to contact in case of any questions. Each participant had to agree to the informed consent and confirm that they were over 18 years old.

3.1 Instrument

The online survey consisted of three major sections: the new scale to measure privacy concerns, and two scales acquired from past research measuring privacy behaviors. Due to the thematic division of the survey, and to ensure the instrument’s consistency, some of the questions from the PHC were mixed with questions from the scale measuring protection behavior.

To create the new scale, we applied the privacy harms framework defined by Solove [48]. We developed a 48-item scale derived from Solove’s 16 privacy harms, which he categorized into four groups, presented in Table 1. Solove’s work addresses privacy harms from a legal perspective; however, it has previously been used in information privacy research [27]. Additionally, we believe that privacy harms may be recognizable and meaningful, since the framework originates from court cases and real-life examples. Originally, we aimed to measure each individual privacy harm; hence, we used three items for each of them. The instrument collected continuous data, with scores ranging from 0 to 100 (strongly disagree/strongly agree). After all data were collected, some of the items were recoded to ensure score consistency.

The scale measuring information disclosure was acquired from Joinson et al. [24]. It consisted of 11 items asking respondents questions of a personal nature. To ensure consistency, the information disclosure scale was modified and did not include two questions requiring respondents to type answers in text boxes. The scale aimed to measure the disclosure of sensitive and non-sensitive information. The sensitive items were measured by asking intimate questions, such as ‘How many different sexual partners have you had?’. The non-sensitive items contained less invasive questions, for instance ‘Are you right or left handed?’. The disclosure level was measured by providing respondents with the option ‘I prefer not to say’, which, if chosen, was coded as 1 (do not disclose); all other responses were coded as 0. As a result, participants who did not disclose scored 5 on the sensitive and 4 on the non-sensitive items. All other participants were treated as the disclosing group. This divided respondents into two pairs of groups: disclosing sensitive information (\(N=273\)) and non-disclosing sensitive information (\(N=109\)), and disclosing non-sensitive information (\(N=325\)) and non-disclosing non-sensitive information (\(N=57\)).
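For illustration, the grouping logic described above can be sketched in a few lines of Python/pandas. The item column names are hypothetical placeholders (the original items come from Joinson et al. and are only paraphrased in the text); the coding rule follows the description above.

```python
import pandas as pd

# Hypothetical column names standing in for the Joinson et al. items.
SENSITIVE = ["sexual_partners", "health_q", "finance_q", "relationship_q", "habits_q"]  # 5 items
NON_SENSITIVE = ["handedness", "birth_month", "favourite_colour", "coffee_or_tea"]      # 4 items

def disclosure_groups(responses: pd.DataFrame) -> pd.DataFrame:
    """Code 'I prefer not to say' as 1 (non-disclosure) and everything else as 0;
    a respondent is placed in the non-disclosing group only if all items of a
    category were withheld (score 5 for sensitive, 4 for non-sensitive)."""
    withheld = responses[SENSITIVE + NON_SENSITIVE].apply(
        lambda col: (col == "I prefer not to say").astype(int)
    )
    out = responses.copy()
    out["nondisclose_sensitive"] = withheld[SENSITIVE].sum(axis=1) == len(SENSITIVE)
    out["nondisclose_nonsensitive"] = withheld[NON_SENSITIVE].sum(axis=1) == len(NON_SENSITIVE)
    return out
```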

The second scale acquired from previous research aimed to measure protection behavior [10]. It consisted of 12 items, 6 measuring general privacy caution and 6 measuring technical protection [10]. To ensure consistency, we modified the scale and, instead of a Likert scale, applied range scores. As a result, we collected continuous data with scores ranging from 0 to 100 (never/always).

3.2 Data Collection

The online survey was distributed on two platforms, Microworkers and CallForParticipants (CFP). Participation in the survey was voluntary. Microworkers’ participants received financial compensation of $1–$1.50 per response, while CFP respondents did not receive any compensation. The total number of participants reached 437 (375 from Microworkers, 62 from CallForParticipants); however, only 382 responses were valid. On Microworkers, response validity was checked automatically. Additionally, all responses were monitored manually, one by one. Furthermore, any surveys completed in less than five minutes or in more than four hours were removed. Participants had to respond to all questions, and as a result there was no missing data; the survey allowed respondents to backtrack and amend responses. Each respondent could participate in the survey only once.

Furthermore, to decrease the possibility of statistical bias, the data set was scanned for outliers. As recommended in the literature, instead of using a standard method for detecting extreme cases, such as the mean plus/minus two or three standard deviations [30], we applied a cut-off of three times the interquartile range (3×IQR). All responses that contained outliers were removed from the analysis, which left a sample of 382 responses.
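A minimal sketch of this screening step, assuming the concern items are numeric columns of a pandas data frame, could look as follows; the 3×IQR rule matches the description above.

```python
import pandas as pd

def remove_iqr_outliers(df: pd.DataFrame, cols, k: float = 3.0) -> pd.DataFrame:
    """Drop every response containing at least one value outside
    [Q1 - k*IQR, Q3 + k*IQR] on the listed columns (k = 3, as in the text)."""
    q1 = df[cols].quantile(0.25)
    q3 = df[cols].quantile(0.75)
    iqr = q3 - q1
    is_outlier = ((df[cols] < q1 - k * iqr) | (df[cols] > q3 + k * iqr)).any(axis=1)
    return df.loc[~is_outlier]
```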

To reach the desired demographics, we used geographic cluster sampling, with each cluster aiming to reach 100 respondents. The choice of geographic areas was based on the results of the Data Protection Eurobarometer [35]. We focused on four geographic areas: the UK, the USA, Italy, and the Nordic countries (Sweden, Norway, Finland, Denmark, and Germany). Among the respondents, \(57.9\%\) (\(N\,=\,221\)) were males and \(42.1\%\) (\(N\,=\,161\)) females; the average age was 32 years (\(Min\,=\,18; Max\,=\,70\)). The full demographic breakdown is presented in Table 2.

Table 2. Participants’ demographics

4 Results

4.1 Dimensions of Privacy Concerns

To answer RQ 1, we began by investigating its sub-question: What are the dimensions of privacy concerns? (RQ 1.1). We created the PHC scale and used Exploratory Factor Analysis (EFA) to assess the dimensions of privacy concerns.

EFA was used because it allows ascertaining factors that may explain correlations between variables without requiring an underlying theoretical structure [43]. The Kaiser-Meyer-Olkin measure (.903) and Bartlett’s test of sphericity (significant at the level \(p <.001\)) confirmed the suitability of EFA. We used an orthogonal rotation, varimax, presuming that the correlations between the factors are weak.

To extract factors, we used principal axis factoring (PAF), which allows measuring the latent structure of variables and their relationships [43]. Of the original 48 items, 30 remained after removing items with communalities \(<.3\), item loadings \(<.3\), and factors consisting of fewer than three loaded items.

After applying this solution and a scree plot analysis, we extracted seven factors identifying people’s perceptions of privacy concerns: unauthorized access, misuse of data, secondary use of data, insecurity, exposure, interrogation, and distortion. When computing the internal consistency of the scale based on these factors, the Cronbach’s alpha scores for all identified factors were above .7 (Table 3).
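For readers who want to reproduce this type of analysis, the sketch below shows one possible pipeline in Python using the factor_analyzer package: the KMO and Bartlett suitability checks, principal axis factoring with varimax rotation, and a per-factor Cronbach’s alpha. This is not the authors’ original code; the item-removal loop is only summarized in a comment.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

def run_efa(items: pd.DataFrame, n_factors: int = 7):
    """Suitability checks, then principal axis factoring with varimax rotation."""
    chi_square, p_value = calculate_bartlett_sphericity(items)   # Bartlett's test of sphericity
    _, kmo_model = calculate_kmo(items)                          # Kaiser-Meyer-Olkin measure
    print(f"Bartlett chi2 = {chi_square:.1f} (p = {p_value:.4f}), KMO = {kmo_model:.3f}")

    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="varimax")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    communalities = pd.Series(fa.get_communalities(), index=items.columns)
    # Items with communalities < .3 or maximum |loading| < .3 would be dropped
    # and the analysis re-run, as described in the text.
    return loadings, communalities

def cronbach_alpha(factor_items: pd.DataFrame) -> float:
    """Cronbach's alpha for the items loading on a single factor."""
    k = factor_items.shape[1]
    item_variances = factor_items.var(axis=0, ddof=1).sum()
    total_variance = factor_items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```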

Additionally, we computed the means for each dimension of privacy concerns, as shown in Table 4. We used these means in further analyses to assess the relationship with behavior and to investigate demographics.

Table 3. The results of the Exploratory Factor Analysis; \(N=382\).
Table 4. Means of the privacy concern dimensions; \(N=382\).

4.2 Information Disclosure

To assess the differences in concerns between respondents who do and do not disclose sensitive/non-sensitive items (RQ 2.1 and RQ 2.2), we performed independent-samples t-tests. The outcomes of Levene’s test were significant at the \(<.05\) level; hence, we report the results with equal variances not assumed.
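A minimal sketch of this comparison with SciPy is shown below; the rule of switching to Welch’s t-test (equal variances not assumed) when Levene’s test is significant mirrors the procedure described above, and the variable names are illustrative.

```python
from scipy import stats

def compare_disclosure_groups(disclosing, non_disclosing, alpha: float = 0.05):
    """Levene's test for equality of variances, then an independent-samples t-test;
    Welch's correction (equal_var=False) is used when Levene's test is significant."""
    _, levene_p = stats.levene(disclosing, non_disclosing)
    equal_var = levene_p >= alpha
    t, p = stats.ttest_ind(disclosing, non_disclosing, equal_var=equal_var)
    return {"levene_p": levene_p, "t": t, "p": p, "welch_used": not equal_var}
```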

We found a significant difference between respondents who disclose (M = 70.5, SD = 20.6) and those who do not disclose (M = 77.1, SD = 17.5) sensitive information regarding the secondary use of data, \(t(380)\,=\,-2.9\), \(p\,=\,.002\); and interrogation (M = 42.9, SD = 21.4; M = 53.3, SD = 18.8, respectively), \(t(380)\,=\,-4.4\), \(p<.001\).

We identified the same type of concerns among participants disclosing non-sensitive information. The respondents who did not disclose information (M = 77.6, SD = 19) were significantly more concerned about the secondary use of data than those who disclosed it (M = 71.5, SD = 20.1), \(t(380)\,=\,-2.1\), \(p\,=\,.029\); the same pattern was observed regarding interrogation (M = 52.9, SD = 21.9; M = 44.6, SD = 20.9, respectively), \(t(380)\,=\,-2.7\), \(p\,=\,.010\).

4.3 Protection Behavior

To determine the relationship between privacy concerns and protection behaviors (RQ 2.3 and RQ 2.4), we performed Pearson correlation tests and examined scatter plots for the correlated variables (Table 5).

Table 5. Correlations between privacy concerns and protection behaviors; \(N=382\).

We identified significant correlations between general caution and technical protection behavior on the one hand and privacy concerns on the other, ranging between \(r=.184\) and \(r=.404\) (Table 5). The results demonstrate positive correlations between general caution and concerns about unauthorized access, misuse of data, insecurity, exposure, and distortion, and a negative correlation with interrogation. Similarly, positive correlations were found between technical protection behavior and unauthorized access, misuse of data, secondary use of data, insecurity, exposure, and distortion. However, we did not identify a relationship between general caution and secondary use, or between technical protection and interrogation.
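A sketch of this step in Python, assuming a data frame with one column per concern dimension and a single behavior score (e.g., general caution or technical protection), might look as follows.

```python
import pandas as pd
from scipy import stats

def concern_behavior_correlations(concerns: pd.DataFrame, behavior: pd.Series) -> pd.DataFrame:
    """Pearson r and p-value between each concern dimension and one behavior score."""
    rows = []
    for dim in concerns.columns:
        r, p = stats.pearsonr(concerns[dim], behavior)
        rows.append({"dimension": dim, "r": r, "p": p})
    return pd.DataFrame(rows)
```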

4.4 Demographics

We conducted one-way analysis of variance (ANOVA), t-tests, and chi-square tests to analyze whether there are significant differences in privacy concerns among people from various demographic groups (RQ 3).

First, we analyzed the responses of participants from different geographic locations (Table 6). There were significant effects for secondary use of data (\(F(3, 381) = 5.010; p =.002\)), interrogation (\(F(3,381)= 3.241; p =.022\)), and distortion (\(F(3,381) = 2.885; p =.036\)). The post-hoc Tukey test results confirmed significant differences (\(p =.001\)) between Italy (M = 77.2; SD = 19.9) and the UK (M = 77.2; SD = 17.9) regarding the secondary use of data. Similarly, there was a significant difference (\(p = .038\)) between Italy (M = 40.3; SD = 20.5) and the Nordic countries (M = 49.8; SD = 19.7), and between Italy and the UK (M = 48.3; SD = 20.7) (\(p =.034\)), in concerns related to interrogation. Additionally, we found a significant difference (\(p =.017\)) between the USA (M = 68.4; SD = 20.1) and the Nordic countries (M = 60.6; SD = 19.7), and between the USA and Italy (M = 60.4; SD = 23.0) (\(p =.010\)), regarding distortion.

Table 6. Differences in privacy concerns among participants from different geographic areas; \(N=382\), \(p<.05\) (one-way ANOVA).
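The ANOVA-plus-Tukey procedure used for these group comparisons can be sketched with SciPy and statsmodels as shown below; the column names ('score', 'country') are illustrative, not taken from the original data set.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def anova_with_tukey(df: pd.DataFrame, score: str = "score", group: str = "country"):
    """One-way ANOVA across groups, followed by Tukey's HSD post-hoc comparisons."""
    samples = [g[score].to_numpy() for _, g in df.groupby(group)]
    f_stat, p_value = stats.f_oneway(*samples)
    tukey = pairwise_tukeyhsd(endog=df[score], groups=df[group], alpha=0.05)
    return f_stat, p_value, tukey
```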

We performed one-way ANOVA and the post-hoc Tukey test to assess whether there are differences in privacy concerns, protection behavior, and information disclosure among participants from different age groups. For this purpose, we divided our sample into four age groups: 18–24, 25–34, 35–44, and over 45 years old. We found a significant effect of age on concerns about unauthorized access (\(F(3,378)=4.860, p=.002\)), misuse of data (\(F(3,378)=3.094, p=.027\)), secondary use of data (\(F(3,378)=3.162, p=.013\)), insecurity (\(F(3,378)=4.710, p=.003\)), and exposure (\(F(3,378)=3.759, p=.011\)). Participants in the 35–44 and 18–24 year-old groups differed in their perceptions of unauthorized access and misuse of data; the over-45 and 18–24 groups differed in perceptions of exposure; the over-45 group differed from the 18–24 and 25–34 year-olds in the perception of secondary use of data. Lastly, participants in the 35–44 and over-45 groups differed from the 18–24 year-olds in concerns about insecurity. We did not find any significant differences among participants from different age groups in relation to protection behavior and information disclosure.

Lastly, we used independent t-tests to see whether there are significant gender differences in privacy concerns, but found none at \(p<.05\). Similarly, we did not identify any gender dependencies with regard to either general caution or technical protection. Furthermore, we used a chi-square test to determine whether sensitive and non-sensitive information disclosure differed between males and females; however, once again, the results were not significant.
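The gender/disclosure check can be sketched as a chi-square test of independence on a contingency table; the column names below are the hypothetical ones introduced in the earlier disclosure-coding sketch.

```python
import pandas as pd
from scipy import stats

def gender_disclosure_test(df: pd.DataFrame):
    """Chi-square test of independence between gender and (non-)disclosure of sensitive items."""
    table = pd.crosstab(df["gender"], df["nondisclose_sensitive"])
    chi2, p, dof, _ = stats.chi2_contingency(table)
    return chi2, p, dof
```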

5 Discussion

To improve the understanding of privacy perceptions, we investigated privacy harms by creating a new scale measuring privacy concerns (RQ 1). As we wanted to achieve a greater understanding of people’s attitudes, we used the legal framework as a basis for the study design. The results demonstrated that privacy perceptions differ from the harms identified by Solove, although there are some resemblances. While Solove proposed to consider harms at the individual level, the results showed that people express privacy concerns differently: they tend to perceive concerns as comprehensive and simplified models. Possibly, such perception is related to cognitive information processing, aiming to decrease cognitive effort and relying on affect heuristics.

We identified seven dimensions of privacy concerns: insecurity, exposure, unauthorized access, secondary use of data, misuse of data, distortion, and interrogation. The analysis of the means suggests that people express high concerns about security. They want to be informed about data security breaches and, in general, expect that online services will guarantee safety. According to the findings, people worry about exposure, which may suggest that they care about online presence and information visibility. They want to be in control of personal information, ensuring that none of it is used without their knowledge or permission. The findings show general worries about the secondary use of data, such as selling or sharing data with external organizations, and about the misuse of data, such as blackmail or the malicious use of information by strangers to reach their own goals. Distortion seems to be less important, and interrogation is perceived as the least severe. Considering interrogation, paradoxically, respondents expressing concerns about secondary use or misuse of information did not find information probing important. Overall, the new dimensions show similarities to Solove’s findings. The results show that almost all of the harms defined by Solove are a subject of concern, however not at the individual level and not according to the process of information flow. Additionally, it seems that invasions are the one group of harms perceived as less severe than the others.

The identified dimensions of privacy concerns relate to findings from past research. For instance, improper access and secondary use of data are two of the four dimensions defined by CFIP [47]. Similarly, our findings relate to the factors identified by IUIPC: collection (interrogation and insecurity) and control (exposure, distortion) [32]. The seven dimensions of the PHC add to the previous scales by identifying a wider range of concerns. Our findings originate from participants with broad demographics, whereas CFIP was based on students and professionals from a business environment and IUIPC was consumer-oriented. Furthermore, the PHC uncovers issues related to the self (me as a person and as a part of society), such as distortion or exposure, showing that personal image, online reputation, and fear of the damage that could be caused by disclosed data are important factors contributing to privacy concerns.

Additionally, we investigated whether privacy perceptions differ between people who disclose and do not disclose sensitive and non-sensitive information (RQ 2). The findings demonstrate that the privacy concerns of participants who do not disclose sensitive or non-sensitive information differ from those of participants who disclose it. Respondents who do not disclose information expressed concerns about their data being sold to third parties and about providing feedback related to their online activities. This result suggests that people concerned about the ownership of their data use preventive methods, such as non-disclosure, to ensure that none of their information, whether sensitive or not, is provided to online companies. Additionally, the results show that the same concerns distinguish disclosers from non-disclosers for both sensitive and non-sensitive information. Presumably, people who worry about their privacy behave in the same way regardless of information sensitivity.

Further, the study identified relationships between protection behaviors and privacy concerns (RQ 2). Despite the low correlations between protection behaviors and privacy concerns, the analysis of scatter plots confirmed the relationships. Respondents with higher technical protection behavior seemed to have high concerns about unauthorized access, misuse and secondary use of data, insecurity, exposure, and distortion. The same applies to general caution, except that there is no correlation with secondary use of data; instead, the higher the general caution, the higher the interrogation concerns. Interestingly, our results did not find any correlation between technical protection and interrogation. This may suggest that people using various technical protections feel confident that, because of their preventive measures, their data will not be sold or transferred to unknown organizations. On the other hand, it may be related to the fact that people do not perceive interrogation as a very severe concern.

The demographic results indicate possible differences in privacy perceptions among respondents from different geographic locations, education and age groups (RQ 3). We identified differences between respondents from different countries. This could imply the role of cultural diversity in shaping people’s concerns. However, due to the small sample size, our findings are only an indication of possible cultural dependencies, which require further studies.

Considering other demographics, our results show that people from older generations express more concerns about privacy than younger generations, confirming findings from previous research [51]. The age divide may be explained by the fact that older people have more experience, awareness, and knowledge related to privacy violations. Also, the younger population may use the internet as a tool for communication, to develop social relationships, or as a source of leisure activities, while older people may use it to cope with day-to-day activities, such as work, financial transactions, or finding information. For that reason, the older generation may attach more value to their online information and, as a result, express stronger privacy concerns. On the other hand, as demonstrated in past research, the younger generation may express fewer concerns because of their protection behaviors.

Limitations. This study has a number of limitations. The method, a self-reported survey, may decrease the validity and reliability of the results. However, as the study was designed to reach international respondents within a short time, this method was the most effective. Similarly, a larger sample size could improve the results, especially the demographic analysis. Furthermore, the research explored general privacy concerns and did not investigate whether they would change in a specific context, for instance with different technologies. The collected data did not allow us to model causal relationships between concerns and behaviors. An investigation of causal relations could provide a better overview of the role of privacy concerns in decision making.

6 Conclusion

This study contributes a new measurement instrument for privacy concerns. To differentiate it from the existing privacy scales, we shifted the focus of privacy concerns to privacy harms, based on the framework developed by Solove. By analyzing self-reported behavior and demographics, we demonstrated that the identified privacy concerns vary among individuals. The new instrument can be used in future studies assessing privacy attitudes.

Additionally, the results suggest that there are some general tendencies in privacy concerns. The findings show that people create simplified models of privacy harms, such as worries about security, unlawful use of data, disclosure or exposure. All of these concerns can be addressed by developers and designers to ensure privacy. Due to the similarities among people from different demographics, we can assume that there is a potential to build systems with ‘privacy for all’ or ‘privacy with no borders’.

Future Work. Our privacy scale requires further validation in qualitative and quantitative studies. For instance, to improve the scale, it could be implemented in experiments on actual privacy behavior, using the PHC as a pre- and/or post-questionnaire. Our results can serve as a foundation for developing instruments that influence people’s behavior, nudge privacy choices, and improve privacy risk awareness. Similarly, further studies of the PHC could result in a set of guidelines for developers and designers of privacy-enhancing technologies (PETs). Such guidelines could enable easier assessment of people’s privacy needs, improving the usability of PETs and, as a result, increasing users’ satisfaction.

References

  1. Acquisti, A., Brandimarte, L., Loewenstein, G.: Privacy and human behavior in the age of information. Science347(6221), 509–514 (2015)

  2. Acquisti, A., Grossklags, J.: Privacy attitudes and privacy behavior. In: Economics of Information, Security, pp. 1–15 (2004)

  3. Acquisti, A., Grossklags, J.: Privacy and rationality in individual decision making. challenges in privacy decision making. the survey. IEEE Secur. Priv.3(1), 26–33 (2005)

  4. Acquisti, A., Taylor, C.: The Economics of privacy. J. Econ. Lit.52, 1–64 (2016)

  5. Adjerid, I., Acquisti, A., Brandimarte, L., Loewenstein, G.: Sleights of privacy: framing, disclosures, and the limits of transparency. In: Symposium on Usable Privacy and Security (SOUPS), p. 17 (2013)

  6. Sheehan, K.B.: An investigation of gender differences in on-line privacy concerns and resultant behaviors. J. Interact. Market.13(4), 24–38 (1999)

  7. Bellman, S., Johnson, E.J., Kobrin, S.J., Lohse, G.L.: International differences in information privacy concerns: a global survey of consumers. Inf. Soc.20(5), 313–324 (2004)

  8. Bergström, A.: Online privacy concerns: a broad approach to understanding the concerns of different groups for different uses. Comput. Hum. Behav.53, 419–426 (2015)

  9. Brandimarte, L., Acquisti, A., Loewenstein, G.: Misplaced confidences: privacy and the control paradox. Soc. Psychol. Personal. Sci.4(3), 340–347 (2013)

  10. Buchanan, T., Paine, C., Joinson, A.N., Reips, U.-D.: Development of measures of online privacy concern and protection for use on the internet. J. Assoc. Inf. Sci. Technol.58(2), 157–165 (2007)

  11. Calo, R.: The boundaries of privacy harm. Indiana Law J.86(3), 1131 (2011)

  12. Camp, L.J.: Mental models of privacy and security. IEEE Technol. Soc. Mag.28(3), 37–46 (2009)

  13. Cho, H., Rivera-Sánchez, M., Lim, S.S.: A multinational study on online privacy: global concerns and local responses. New Media Soc.11(3), 395–416 (2009)

  14. Coventry, L., Jeske, D., Briggs, P.: Perceptions and actions: combining privacy and risk perceptions to better understand user behaviour. In: Symposium on Usable Privacy and Security (SOUPS) (2014)

  15. De, S.J., Le Métayer, D.: Privacy harm analysis: a case study on smart grids. In: Security and Privacy Workshops (SPW), pp. 58–65. IEEE (2016)

  16. Dinev, T., Hart, P.: Internet privacy concerns and their antecedents - measurement validity and a regression model. Behav. Inf. Technol.23(6), 413–422 (2004)

  17. Egelman, S.: “My profile is my password, verify me!”: the privacy/convenience tradeoff of facebook connect. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2369–2378. ACM (2013)

  18. Fagan, M., Khan, M.M.H.: “Why do they do what they do?": a study of what motivates users to (not) follow computer security advice. In: Proceedings of the Symposium On Usable Privacy and Security (SOUPS) (2016)

  19. Fogel, J., Nehmad, E.: Internet social network communities: risk taking, trust, and privacy concerns. Comput. Hum. Behav.25(1), 153–160 (2009)

  20. Gambino, A., Kim, J., Sundar, S.S., Ge, J., Rosson, M.B.: User disbelief in privacy paradox: heuristics that determine disclosure. In: Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2837–2843. ACM (2016)

  21. Harbach, M., Hettig, M., Weber, S., Smith, M.: Using personal examples to improve risk communication for security & privacy decisions. In: Proceedings of the 32nd Annual ACM Conference on Human factors in Computing Systems, CHI 2014, pp. 2647–2656 (2014)

  22. Ho, K.T., Li, C.: From privacy concern to uses of social network sites: a cultural comparison via user survey. In: Proceedings - 2011 IEEE International Conference on Privacy, Security, Risk and Trust and IEEE International Conference on Social Computing, PASSAT/SocialCom, pp. 457–464 (2011)

  23. Hoy, M.G., Milne, G.R.: Gender differences in privacy-related measures for young adult facebook users. J. Interact. Advert.10(2), 28–45 (2010)

  24. Joinson, A.N., Paine, C., Buchanan, T., Reips, U.D.: Measuring self-disclosure online: blurring and non-response to sensitive items in web-based surveys. Comput. Hum. Behav.24(5), 2158–2171 (2008)

  25. Kehr, F., Wentzel, D., Kowatsch, T.: Privacy paradox revised: pre-existing attitudes, psychological ownership, and actual disclosure. In: IS Security and Privacy, pp. 1–12 (2014)

  26. Kehr, F., Wentzel, D., Kowatsch, T., Fleisch, E.: Rethinking privacy decisions: pre-existing attitudes, pre-existing emotional states, and a situational privacy calculus. In: ECIS 2015 Completed Research Papers (2015)

  27. Knijnenburg, B.P., Kobsa, A.: Making decisions about privacy: information disclosure in context-aware recommender systems. ACM Trans. Interact. Intell. Syst.3(23), 1–23 (2013)

  28. Kokolakis, S.: Privacy attitudes and privacy behaviour: a review of current research on the privacy paradox phenomenon. Comput. Secur.7(2), 1–29 (2015)

  29. Krasnova, H., Kolesnikova, E., Guenther, O.: “It won’t happen to me!”: self-disclosure in online social networks. In: AMCIS 2009 Proceedings, p. 343 (2009)

  30. Leys, C., Ley, C., Klein, O., Bernard, P., Licata, L.: Detecting outliers: do not use standard deviation around the mean, use absolute deviation around the median. J. Exp. Soc. Psychol.49(4), 764–766 (2013)

  31. Lutz, C., Strathoff, P.: Privacy concerns and online behavior - Not so paradoxical after all? Multinationale Unternehmen und Institutionen im Wandel Herausforderungen für Wirtschaft, Recht und Gesellschaft, pp. 81–99 (2013)

  32. Malhotra, N.K., Kim, S.S., Agarwal, J.: Internet users’ information privacy concerns (iuipc): The construct, the scale, and a causal model. Inf. Syst. Res.15(4), 336–355 (2004)

  33. Marwick, A.E., Boyd, D.: Networked privacy: how teenagers negotiate context in social media. New Media Soc.16(7), 1051–1067 (2014)

  34. Miltgen, C.L., Peyrat-guillard, D.: Cultural and generational influences on privacy concerns: a qualitative study in seven European countries. Eur. J. Inf. Syst.23(2), 103–125 (2014)

  35. TNS Opinion & Social: Special Eurobarometer 431 ‘Data Protection’. Technical report, European Union (2015)

  36. Park, Y.J.: Do men and women differ in privacy? gendered privacy and (in)equality in the internet. Comput. Hum. Behav.50, 252–258 (2015)

  37. Park, Y.J., Campbell, S.W., Kwak, N.: Affect, cognition and reward: predictors of privacy protection online. Comput. Hum. Behav.28(3), 1019–1027 (2012)

  38. Preibusch, S.: Guide to measuring privacy concern: review of survey and observational instruments. Int. J. Hum. Comput. Stud.71(12), 1133–1143 (2013)

  39. Preibusch, S., Krol, K., Beresford, A.R.: The privacy economics of voluntary over-disclosure in web forms. In: Böhme, R. (ed.) The Economics of Information Security and Privacy, pp. 183–209. Springer, Heidelberg (2013).https://doi.org/10.1007/978-3-642-39498-0_9

  40. Raij, A., Ghosh, A., Kumar, S., Srivastava, M.: Privacy risks emerging from the adoption of innocuous wearable sensors in the mobile environment. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 11–20. ACM (2011)

  41. Reed, P.J., Spiro, E.S., Butts, C.T.: “Thumbs up for privacy?": Differences in online self-disclosure behavior across national cultures. Social Science Research (2016)

  42. G.D.P. Regulation: Regulation (eu) 2016/679 of the European parliament and of the council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing directive 95/46. Off. J. Eur. Union (OJ) vol. 59, pp. 1–88 (2016)

  43. Reio, T.G., Shuck, B.: Exploratory factor analysis: implications for theory, research, and practice. Adv. Dev. Hum. Res.17(1), 12–25 (2015)

  44. Roback, D., Wakefield, R.L.: Privacy risk versus socialness in the decision to use mobile location-based applications. ACM SIGMIS Database44(2), 19 (2013)

  45. Sheth, S., Kaiser, G., Maalej, W.: Us and them: a study of privacy requirements across North America, Asia, and Europe. In: Proceedings of the 36th International Conference on Software Engineering, pp. 859–870 (2014)

  46. Slyke, C.V., Shim, J.T., Johnson, R., Jiang, J.: Concern for information privacy and online consumer purchasing. J. Assoc. Inf. Syst.7(6), 415–444 (2006)

  47. Smith, H., Milberg, S., Burke, S.: Information privacy: measuring individuals’ concerns about organizational practices. MIS Q.20(2), 167–196 (1996)

  48. Solove, D.: A taxonomy of privacy. Univ. Pa. Law Rev.154(477), 477–560 (2006)

  49. Solove, D., Hartzog, W.: The FTC and the new common law of privacy. Columbia Law Rev.114(3), 583–676 (2014)

  50. Solove, D.J.: I’ve got nothing to hide and other misunderstandings of privacy. S.Diego Law Rev.44, 745 (2007)

  51. Steijn, W.M., Schouten, A.P., Vedder, A.H.: Why concern regarding privacy differs: the influence of age and (non-)participation on Facebook. Cyberpsychology: J. Psychosoc. Res. Cyberspace 10(1) (2016)

  52. Stutzman, F., Capra, R., Thompson, J.: Factors mediating disclosure in social network sites. Comput. Hum. Behav.27(1), 590–598 (2011)

  53. Sun, Y., Wang, N., Shen, X.L., Zhang, J.X.: Location information disclosure in location-based social network services: privacy calculus, benefit structure, and gender differences. Comput. Hum. Behav.52, 278–292 (2015)

  54. Lim, S.S., Cho, H., Rivera-Sanchez, M.: A multinational study on online privacy: global concerns and local responses. New Media Soc.11(3), 395–416 (2009)

  55. Taddicken, M.: The ‘Privacy Paradox’ in the social web: the impact of Privacy concerns, individual characteristics, and the perceived social relevance on different forms of self-disclosure. J. Comput.-Mediated Commun.19(2), 248–273 (2014)

  56. Trepte, S., Dienlin, T., Reinecke, L.: Risky behaviors: How online experiences influence privacy behaviors. Von Der Gutenberg-Galaxis Zur Google-Galaxis, From the Gutenberg Galaxy to the Google Galaxy (2014)

Acknowledgment

This work has received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No 675730.

To obtain more information about the study or to gain access to the original questionnaire, please contact the corresponding author.

Author information

Authors and Affiliations

  1. Karlstad University, Karlstad, Sweden

    Agnieszka Kitkowska, Erik Wästlund & Leonardo A. Martucci

  2. Tel Aviv University, Tel Aviv, Israel

    Joachim Meyer

Authors
  1. Agnieszka Kitkowska

  2. Erik Wästlund

  3. Joachim Meyer

  4. Leonardo A. Martucci

Corresponding author

Correspondence toAgnieszka Kitkowska.

Editor information

Editors and Affiliations

  1. Unabhängiges Landeszentrum für Datenschutz (ULD), Kiel, Germany

    Marit Hansen

  2. Tilburg Institute for Law, Technology and Society (TILT), Tilburg University, Tilburg, The Netherlands

    Eleni Kosta

  3. European Commission - Joint Research Centre, Ispra, Italy

    Igor Nai-Fovino

  4. Karlstad University, Karlstad, Sweden

    Simone Fischer-Hübner

Copyright information

© 2018 IFIP International Federation for Information Processing

About this chapter

Cite this chapter

Kitkowska, A., Wästlund, E., Meyer, J., Martucci, L.A. (2018). Is It Harmful? Re-examining Privacy Concerns. In: Hansen, M., Kosta, E., Nai-Fovino, I., Fischer-Hübner, S. (eds) Privacy and Identity Management. The Smart Revolution. Privacy and Identity 2017. IFIP Advances in Information and Communication Technology, vol 526. Springer, Cham. https://doi.org/10.1007/978-3-319-92925-5_5
