The term Web 2.0 was originally coined by Darcy DiNucci, an information architecture consultant who wrote the 1999 article “Fragmented Future” (O’Reilly, 2009). However, the concept was popularised by Tim O’Reilly and MediaLive International after a 2004 conference (O’Reilly, 2009). Web 2.0 can be described as the second phase of the World Wide Web: the move from static web pages to dynamic, user-generated content, which eventually led to the rise of social networks such as Facebook, MySpace and LinkedIn (Grandison, 2014, pp. 41-42). “The success of these websites depend to a large extent on the disclosure of personal data by their users, therefore concerns about privacy issues have been raised” (Custers, Van der Hof, & Schermer, 2014). This paper will analyse the rights of social networkers and whether the notion of privacy is dead. Social networkers place a high value on privacy; the problem, however, lies in their disinterest in reading and understanding privacy policies (Custers, Van der Hof, & Schermer, 2014).
Social network sites such as Facebook and Google+, and user-generated content sites such as YouTube and Wikipedia, by their very definition collect data and content created by their users and are only economically feasible because of this data harvesting. Studies have shown that companies use “algorithms to sort their behaviour and user-generated content for economic benefits derived from big data” (Heyman, De Wolf, & Pierson, 2014), and that “in the modern digital era, the quantity of information; the ability to collect, assemble, and analyse it; the ability to store it inexpensively; and the magnitude and rapidity of all aspects of how we approach, use, characterise, and manipulate information are continuously evolving” (Sexton, 2015, p. 2). Coté and Pybus (2007) have illustrated how readily users disclose personal information to maintain their online identities, information that can later be exploited for economic benefit (Heyman, De Wolf, & Pierson, 2014). To increase trust between parties, platforms typically publish written privacy policies that outline to users how their data is collected, controlled and used. The storage of information is consensual, as users must agree to the policies during registration; however, many subjects automatically consent when greeted with a consent request (Böhme & Köpsell, 2010), and only a very small number of people actually read privacy policies, even when they are kept short and written in simple English (Malaga, 2014). There is a clear disconnect between users’ privacy concerns and their actual willingness to peruse website privacy policies. Users generally assume privacy policies are written in complex legal terms and do not attempt to click on their links; perhaps a new method of informing users of privacy practices is required (Malaga, 2014).
“The right to privacy is our right to keep a domain around us, which includes all those things that are part of us, such as our body, home, property, thoughts, feelings, secrets and identity. The right to privacy gives us the ability to choose which parts in this domain can be accessed by others, and to control the extent, manner and timing of the use of those parts we choose to disclose.” (Druckman, Maron, Onn, & Timor, 2005, p. 12). The freedom of information within the digital world has produced a number of platforms that may require a redefinition of users’ right to privacy (Druckman et al., 2005). This is because crossing another’s boundaries of privacy is only one click away, and technology is often misused by authorities and individuals to exploit this (Druckman et al., 2005). Privacy violations are serious due to the “use of new technology, enabling aggregation of an enormous amount of information on an unlimited number of people” (Druckman et al., 2005, pp. 22-23), which can then be reproduced at “minimal costs, transmitted and traded in a manner that does not involve costs of storage or transport” (Druckman et al., 2005, pp. 22-23). Data mining programs ensure easy categorisation to serve the economic purposes of social networking companies (Druckman et al., 2005). Due to the rapid development of technologies, it is almost impossible for users to stay aware and informed of their rights to privacy.
The internet belongs to “everyone” globally, which makes it very difficult to apply a single law to protect the rights of users; however, private bodies have developed technologies to protect their users’ privacy (Druckman et al., 2005). For example, LinkedIn allows for “predefined groups (colleagues, family) and then treat these groups differently” (Misra & Such, 2016, p. 97). Facebook provides “default groups but also allow users to manage their own circles and lists per individual, more closely reflecting real-life relationships” (Misra & Such, 2016, p. 97). Most social networking sites are developing socially aware privacy controls, which demonstrates the importance of privacy in the digital world (Misra & Such, 2016, p. 98). Users’ right to privacy must be protected at all times, as privacy “encourages us to express ourselves freely” (Druckman et al., 2005). Users must be allowed to create their own content without fear of their information being disclosed. To ensure users are aware and informed, companies must fulfil two responsibilities: the first is to make users aware of data collection, and the second is to keep them informed as to how their data will be used and by whom (Malaga, 2014).
Privacy, as a basic human right, has been and will remain important to users into the future, despite their unwillingness to read and understand privacy policies. What is required is a more flexible way for users to manage their privacy across different platforms, one that makes the effects of their actions abundantly clear. Privacy is not “officially” dead; rather, it is “effectively” dead. Users unwittingly engage and interact on social media platforms without considering the privacy, or lack thereof, of their data. Very few users invest the time to read and understand the policies governing their data, effectively rendering those policies useless. The policy in turn protects only the platform, not the user, who has agreed to all conditions by default. A great deal of responsibility has been placed on the user. As future generations of technologies develop new privacy controls, it is important to maintain a balance between institutional and social privacy so that users are protected at both levels.
Reference List
Böhme, Rainer, Köpsell, Stefan, 2010, “Trained to accept? A field experiment on consent dialogs”, ACM, p. 2403.
Coté, Mark, Pybus, Jennifer, 2007, “Learning to immaterial labour 2.0: MySpace and social networks”, Ephemera: Theory and Politics in Organization, vol. 7, no. 1, pp. 88-106.
Custers, Bart, Van der Hof, Simone, Schermer, Bart, 2014, “Privacy Expectations of Social Media Users: The Role of Informed Consent in Privacy Policies: Privacy Expectations of Social Media Users”, Policy & Internet, vol. 6, no. 3, pp. 268-295.
Druckman, Yaniv, Maron, Tamar, Onn, Yael, Timor, Rom, 2005, Privacy in the Digital Environment, 7th ed., ebook, Haifa: The Haifa Centre of Law & Technology, pp. 12-35. Available at: https://books.google.com.au/books, Accessed 8 Sep. 2016.
Grandison, Tyrone, 2014, “Security and Privacy in Web 2.0 [Guest editor’s introduction]”, IEEE Internet Computing, vol. 18, no. 6, pp. 41-42.
Heyman, Rob, De Wolf, Ralf, Pierson, Jo, 2014, “Evaluating social media privacy settings for personal and advertising purposes”, info, vol. 16, no. 4, pp. 18-32.
Malaga, Ross, 2014, “Do Web Privacy Policies Still Matter?”, Academy of Information and Management Sciences Journal, vol. 17, no. 1, p. 95.
Misra, Gaurav, Such, Jose, 2016, “How Socially Aware Are Social Media Privacy Controls?”, Computer, vol. 49, no. 3, pp. 96-99.
O’Reilly, Tim, 2009, “What is Web 2.0”, Google Books. Available at: https://books.google.com.au/books?hl=en&lr=&id=NpEk_WFCMdIC&oi=fnd&pg=PT3&dq=web+2.0&ots=OYSCT8jCEW&sig=tcUQ8XSR7ts-HKadi-n43IffB_o#v=onepage&q=web%202.0&f=false, Accessed 9 Sep. 2016.
Sexton, Christopher, 2015, “The interweaving web of privacy and technology in the digital age: Perceived serious invasions of privacy in Australia”, Intellectual Property Forum: journal of the Intellectual and Industrial Property Society of Australia and New Zealand, no. 103, pp. 2-7.