
FEED: Blog post 2

The world of data is fast evolving, with big companies competing to both store and sell your data. Alongside this vast, thriving data industry sit issues of surveillance, data protection, and governance that are not always positive in the eyes of the user. When creating FEED, extensive research was dedicated to data methods, storage, and usage, to make the app as ethical as possible in how it handles data.


FEED combines the elements of a food-ordering service with the added benefit of donating a meal to those who need it most. These donated meals are funded by selling the data that users optionally generate: the more data a user “donates”, the more meals are donated. FEED aims to keep all aspects of data collection as transparent as possible, in compliance with the Data Protection Act (GOV.UK, 2018). The app also has a clear page with transparent data settings where users select the data they wish to share, allowing them to feel in control of the data being donated. A platform that is profitable alongside its charitable element should give users the confidence to donate their data with trust.
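
To make this concrete, below is a minimal sketch of how FEED’s opt-in data settings and meal-donation count could be modelled. The data categories and the one-meal-per-category rate are hypothetical illustrations; the post only establishes that sharing is opt-in, per category, and that more shared data means more donated meals.

```python
from dataclasses import dataclass

@dataclass
class DataSharingSettings:
    """Per-user, per-category consent; everything defaults to not shared."""
    order_history: bool = False
    location: bool = False
    dietary_preferences: bool = False

    def shared_categories(self) -> list[str]:
        # Only categories the user has explicitly opted in to are ever sold.
        return [name for name, opted_in in vars(self).items() if opted_in]

MEALS_PER_CATEGORY = 1  # hypothetical rate: one donated meal per shared category

def meals_donated(settings: DataSharingSettings) -> int:
    # The more data a user "donates", the more meals are donated.
    return MEALS_PER_CATEGORY * len(settings.shared_categories())

user = DataSharingSettings(order_history=True, dietary_preferences=True)
print(user.shared_categories())  # ['order_history', 'dietary_preferences']
print(meals_donated(user))       # 2
```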


Tufekci (2014, p. 3) states that data has always been a ‘political process involving questions of power, transparency, and surveillance.’ Living in a world of datafication has produced an excessive amount of data, available for anyone and everyone to access. This has become a systematic process in which ‘government surveillance, corporate profiling, algorithmic discrimination, and platform capitalism’ (Beraldo & Milan, 2019) all operate within the data world, raising privacy concerns for users.


One of the primary concerns regarding data is surveillance. Once limited to video cameras, the capability to surveil us digitally has grown alongside technology, creating a surveillance culture (Bigo, Isin, & Ruppert, 2019). Zuckerberg’s Facebook is a clear example of surveillance capitalism, with a business model based on data collection and usage (Dulong de Rosnay, 2020) and a mission to “make the world more open and connected” (Desharnais, 2010). Its settings default to sharing information, such as making friend lists publicly available, unless amended directly by the user (Desharnais, Jagadeesh, & O’Rourke, 2018). Facebook is evidently ‘open’ with data sharing and use; it is therefore no surprise that it is repeatedly under fire for its data practices (Forrest, 2019). Lyon (2017) argues surveillance has become so pervasive in our day-to-day lives that the majority comply without question. This is apparent on social media platforms like Facebook and Twitter, which use data to generate profits that go unquestioned by the masses. As data surveillance continues, there are fears of moving into a post-Orwellian disaster (Ngungumbane, 2020), with big data able to control and scaremonger the masses through the manipulation of our own data.
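
The power of defaults described above can be illustrated with a small sketch. The settings below are invented and do not reflect Facebook’s actual configuration; the point is only that when most users never amend their settings, whatever the platform ships as the default decides what gets shared.

```python
# Hypothetical defaults; not Facebook's real configuration.
OPT_OUT_DEFAULTS = {"friend_list_public": True}   # shared unless the user amends it
OPT_IN_DEFAULTS  = {"friend_list_public": False}  # shared only with explicit consent

def effective_setting(defaults: dict, user_changes: dict, key: str) -> bool:
    # Most users comply without question (Lyon, 2017), so user_changes is
    # usually empty and the platform's default decides what is shared.
    return user_changes.get(key, defaults[key])

print(effective_setting(OPT_OUT_DEFAULTS, {}, "friend_list_public"))  # True
print(effective_setting(OPT_IN_DEFAULTS, {}, "friend_list_public"))   # False
```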


Accompanying these privacy issues is algorithmic discrimination. Algorithms are an extension of data culture, going unquestioned and unchallenged. Computer-generated algorithms do not consider race, gender, or the disadvantaged in their collection process (Kleinberg, Ludwig, Mullainathan, & Sunstein, 2019). Arguably, algorithms can specify a target market, enabling a discriminatory process (Grewal & Levy, 2019); for example, search results can be flawed with bias rooted in gender stereotypes (Zou, 2016), such as manual jobs being presented only to men and not women when searching the internet. When a computer collects data, it cannot consider a user’s personal attributes that are not directly given; only data inputted into a system can be analysed, and this creates unintentional discrimination.
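
A simple audit makes this visible: even when gender is never an input to the targeting decision, comparing outcomes by group can expose the bias. The records below are invented for illustration, and the selection-rate comparison is a standard fairness check, not a method attributed to any of the cited platforms.

```python
from collections import defaultdict

# Invented outcome log: who was shown a manual-job advert.
records = [
    {"gender": "male",   "shown_manual_job_ad": True},
    {"gender": "male",   "shown_manual_job_ad": True},
    {"gender": "male",   "shown_manual_job_ad": False},
    {"gender": "female", "shown_manual_job_ad": True},
    {"gender": "female", "shown_manual_job_ad": False},
    {"gender": "female", "shown_manual_job_ad": False},
]

shown, total = defaultdict(int), defaultdict(int)
for r in records:
    total[r["gender"]] += 1
    shown[r["gender"]] += r["shown_manual_job_ad"]  # True counts as 1

# Selection rate per group, and the ratio between the worst- and best-served
# groups ("disparate impact"); a ratio well below 1.0 signals possible bias.
rates = {g: shown[g] / total[g] for g in total}
ratio = min(rates.values()) / max(rates.values())
print(rates)           # {'male': 0.67, 'female': 0.33} (approx.)
print(f"{ratio:.2f}")  # 0.50
```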


Data security, and how and where our data is stored, is a large area of discussion. “The Cloud” is marketed as a digital space accessible from anywhere and on the move (Microsoft Azure, 2020), while “data farms” are marketed with imagery of the countryside rather than of an industrial factory (Reading, 2014). The names of these data servers are not accidental: giving these mass industrial infrastructures idealistic names lends them connotations of being “green” and draws attention away from the fact that these global memory factories require mass amounts of non-renewable energy, electricity, and water to function (Reading, 2014). It is apparent that storing mass data comes at an environmental cost.


Due to the conflicts that arise within data storage, choosing an “ethical” way to secure FEED’s data proved difficult. Ecosia, arguably one of the most transparent companies in its digital activities (Ecosia, 2020), secures its service with certificates from DigiCert, ‘one of the largest and most trusted SSL certificate authorities working towards better ways to secure the internet’ (DigiCert, 2020). It therefore seemed appropriate and trustworthy for FEED to do the same.
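
For readers who want to verify this kind of claim themselves, the standard-library sketch below connects to a site over TLS and prints who issued its certificate. The hostname is just an example, and the issuer observed in practice may differ from what any blog post reports.

```python
import socket
import ssl

hostname = "www.ecosia.org"  # example host; any HTTPS site works
context = ssl.create_default_context()  # validates the chain against trusted CAs

with socket.create_connection((hostname, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()

# 'issuer' is a tuple of RDN tuples, e.g. ((('organizationName', 'DigiCert Inc'),), ...)
issuer = dict(rdn[0] for rdn in cert["issuer"])
print(issuer.get("organizationName"))
```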


It is clear that data must be taken into consideration when developing an app: as more data privacy scandals arise, the masses will increasingly question the integrity of the companies that store their data. Storage and transparency must be carefully considered and protected in an ethically viable manner.




References:

Beraldo, D., & Milan, S. (2019). From data politics to the contentious politics of data. Big Data & Society, 6(2). https://doi.org/10.1177/2053951719885967


Bigo, D., Isin, E., & Ruppert, E. (2019). Data politics: Worlds, subjects, rights. Routledge.


Desharnais, J. (2010). Facebook: The evolution of privacy? The Eugene D. Fanning Center for Business Communication, Mendoza College of Business, University of Notre Dame. https://doi.org/10.4135/9781526404602

Desharnais, Y., Jagadeesh, N., & O’Rourke, J. (2018). Facebook: The evolution of privacy? SAGE Publications Ltd.


DigiCert. (2020). SSL Certificates to Meet Any Need. Retrieved from https://www.digicert.com/ssl-certificate/


Dulong de Rosnay, M. (2020). Alternatives for the Internet: A Journey into Decentralised Network Architectures and Information Commons. TripleC, 18(2), 622–629. https://doi.org/10.31269/triplec.v18i2.1201


Ecosia. (2020). Financial Reports. Retrieved from https://blog.ecosia.org/ecosia-financial-reports-tree-planting-receipts/


Forrest, A. (2019). Facebook data scandal: Social network fined $5bn over 'inappropriate' sharing of users' personal information. Retrieved from https://www.independent.co.uk/news/world/americas/facebook-data-privacy-scandal-settlement-cambridge-analytica-court-a9003106.html


GOV.UK. (2018). Data Protection Act 2018. Retrieved from https://www.gov.uk/data-protection


Grewal, D., & Levy, M. (2019). Are algorithms discriminatory? Questions about how Facebook targets advertising. Retrieved from https://grewallevymarketing.com/2019/06/26/are-algorithms-discriminatory-questions-about-how-facebook-targets-advertising/


Kleinberg, J., Ludwig, J., Mullainathan, S., & Sunstein, C. R. (2019). Discrimination in the age of algorithms. Journal of Legal Analysis, 10, 113–174. https://doi.org/10.1093/jla/laz001


Lyon, D. (2017). Surveillance culture: Engagement, exposure, and ethics in digital modernity. International Journal of Communication, 11, 824–842.


Microsoft Azure. (2020). What is The Cloud? Retrieved from https://azure.microsoft.com/en-gb/overview/what-is-the-cloud/


Ngungumbane. (2020). We are moving into a post-Orwellian disaster. Retrieved from https://ngungumbane.wordpress.com/2020/04/30/we-are-moving-into-a-post-orwellian-disaster/


Reading, A. (2014). Seeing red: a political economy of digital memory. Media, Culture & Society, 36(6), 748–760. https://doi.org/10.1177/0163443714532980


Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational politics. First Monday, 19(7). https://doi.org/10.5210/fm.v19i7.4901


Zou, J. (2016). Removing gender bias from algorithms. Retrieved from https://theconversation.com/removing-gender-bias-from-algorithms-64721
