User:Michaela/thesis


Useful links:
A Guide to Essay Writing

Thesis Guidelines

Draft thesis plan:

ABSTRACT

QUESTIONS I WANT TO ADDRESS IN MY ESSAY

#problem_aspect #1

THEME: OWNERSHIP/ DIGITAL PROPERTY/ PROPERTY RIGHTS

#problem_aspect #2
SUB_THEME: PRIVACY IN THE DIGITAL REALM
The ethical issue I want to address is: who has the right to withdraw someone's data, and how can this data be used, reused or misused?

#problem_aspect #3
SUB_THEME: AUTHORSHIP / CO-AUTHORSHIP / MULTIPLE AUTHORS
The problematic aspect of the recovered data: who is the actual author of the final work?

<ownership

The term 'data erasure' - or, the only way to erase data permanently is to physically destroy the hard disk drive. In the past, shredders were used to destroy confidential and secret papers; now they are replaced by data shredders. Questions about e-waste and digital recycling, digital trash and data recycling. Beyond the physical fate of data storage - digital trash camps outrageously shipped to remote areas of Africa - there remains the ethical issue of who has the right to intervene with someone's data.


<<The story of the Ghana digital trash camps <<< e-waste, data remanence, data theft, or how data can be recovered and serve as evidence

comment: find more examples and stories

<memory

This chapter will be about the technical aspects of the data recovery process, more specifically: clusters, partitions, the Gutmann method, and modern methods of data recovery.
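
To make the overwriting idea concrete, here is a minimal sketch of a software "data shredder" (my own illustration in Python; the function name, file name and number of passes are invented, not taken from any real tool): it overwrites a file's bytes with random data a few times and only then unlinks it. On SSDs and journaling filesystems such in-place overwriting gives no real guarantee the old data is gone, which is exactly why physical destruction keeps being recommended.

import os

# Minimal sketch of a software "data shredder": overwrite the file's contents
# with random bytes a few times, then remove it. Illustrative only; a real
# tool would write in chunks and handle errors.
def shred(path, passes=3):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # one overwrite pass with random bytes
            f.flush()
            os.fsync(f.fileno())        # ask the OS to push this pass to the disk
    os.remove(path)                     # only now remove the directory entry

# usage (hypothetical file name): shred("confidential.txt")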

links: the legend of multiple overwriting passes:

http://grot.com/wordpress/?p=154

http://www.pcworld.com/article/209418/how_do_i_permanently_delete_files_from_my_hard_disk.html

https://ssd.eff.org/tech/deletion

http://security.stackexchange.com/questions/10464/why-is-writing-zeros-or-random-data-over-a-hard-drive-multiple-times-better-th

---> moreover: there is a well-known reference article by Peter Gutmann on the subject. However, that article is a bit old (15 years), and newer hard disks might not operate as described there. Some data may fail to be totally obliterated by a single write due to two phenomena:

We want to write a bit (0 or 1), but the physical signal is analog. Data is stored by manipulating the orientation of groups of atoms within the ferromagnetic medium; when read back, the head yields an analog signal, which is then decoded with a threshold: e.g., if the signal goes above 3.2 (fictitious unit), it is a 1; otherwise, it is a 0. But the medium may have some remanence: possibly, writing a 1 over what was previously a 0 yields 4.5, while writing a 1 over what was already a 1 pumps the signal up to 4.8. By opening the disk and using a more precise sensor, it is conceivable that the difference could be measured with enough reliability to recover the old data.

Data is organized in tracks on the disk. When writing over existing data, the head is roughly positioned over the previous track, but almost never exactly over it. Each write operation may have a bit of "lateral jitter". Hence, part of the previous data could possibly still be readable "on the side".
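
To see the remanence argument with numbers, here is a small toy model (my own sketch in Python; it reuses the fictitious levels 3.2 / 4.5 / 4.8 quoted above and invents the corresponding "0" levels): a normal threshold read returns only the newly written bit, while a finer comparison of the analog level could, in principle, also reveal the previous bit.

# Toy model of magnetic remanence: the analog readback level depends slightly
# on the previously stored bit. Numbers are the fictitious units from the
# quoted explanation, plus made-up values for the "0" case.
THRESHOLD = 3.2

def analog_level(new_bit, old_bit):
    # writing a 1 over a 0 reads back ~4.5, a 1 over a 1 ~4.8;
    # the "0" levels (0.3 / 0.6) are invented for symmetry
    base = 4.5 if new_bit == 1 else 0.3
    return base + 0.3 * old_bit

def normal_read(level):
    # what the drive electronics do: compare against the threshold
    return 1 if level > THRESHOLD else 0

def recover_old_bit(level, new_bit):
    # with a more precise sensor: compare the measured level with the level
    # expected if the old bit equalled the new one, and infer the old bit
    expected_same = analog_level(new_bit, new_bit)
    return new_bit if abs(level - expected_same) < 0.15 else 1 - new_bit

for old in (0, 1):
    for new in (0, 1):
        level = analog_level(new, old)
        print(f"old={old} new={new} level={level:.1f} "
              f"normal read={normal_read(level)} recovered old={recover_old_bit(level, new)}")

The point of the toy model is only that a single overwrite may leave a measurable trace; it says nothing about whether real, modern drives still behave this way.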

<code

This chapter is about the performative aspect of code: Code as Language.


<space

This chapter will be about the final work: description, set-up, etc.