User:Francg/expub/thesis/bibliography
5 sources
19.10.17
Mining the Social Web, Matthew A. Russell
Future Shock
Future Shock is a book written by the futurist Alvin Toffler in 1970. In the book, Toffler defines the term "future shock" as a certain psychological state of individuals and entire societies. His shortest definition for the term is a personal perception of "too much change in too short a period of time". The book grew out of an article "The Future as a Way of Life" in Horizon magazine, Summer 1965 issue.
The Tao of Open Source Cyber Intelligence
The Internet has become the defining medium for information exchange in the modern world, and the unprecedented success of new web publishing platforms such as those associated with social media has confirmed its dominance as the main information exchange platform for the foreseeable future. But how do you conduct an online investigation when so much of the Internet isn't even indexed by search engines? Accessing and using the information that's freely available online is about more than just relying on the first page of Google results. Open source intelligence (OSINT) is intelligence gathered from publicly available sources, and is the key to unlocking this domain for the purposes of investigation.
It catalogues and explains the tools and investigative approaches that are required when conducting research within the surface, deep and dark webs.
It explains how to scrutinise criminal activity without compromising your anonymity - and your investigation.
It examines the relevance of cyber geography and how to get round its limitations.
It describes useful add-ons for common search engines, as well as considering metasearch engines (including Dogpile, Zuula, PolyMeta, iSeek, Cluuz, and Carrot2) that collate search data from single-source intelligence platforms such as Google.
It considers deep web social media platforms and platform-specific search tools, detailing such concepts as concept mapping, entity extraction tools, and specialist search syntax (Google Kung-Fu).
It gives comprehensive guidance on Internet security for the smart investigator, and how to strike a balance between security, ease of use and functionality, giving tips on counterintelligence, safe practices, and debunking myths about online privacy.
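The metasearch engines mentioned above collate results from several single-source platforms into one ranking. As a rough, self-contained illustration of that collation step, the sketch below merges simulated per-engine result lists using reciprocal-rank fusion; the URLs and engine names are made-up placeholders, not output from any real service.

```python
# Illustrative sketch of metasearch collation: merge ranked result lists
# from several (simulated) single-source engines into one ranking using
# reciprocal-rank fusion. All URLs below are placeholders.

def fuse(rankings, k=60):
    """Combine ranked URL lists; earlier ranks contribute larger scores."""
    scores = {}
    for ranking in rankings:
        for rank, url in enumerate(ranking, start=1):
            scores[url] = scores.get(url, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Simulated per-engine results for a single query.
engine_one = ["http://example.org/a", "http://example.org/b", "http://example.org/c"]
engine_two = ["http://example.org/b", "http://example.org/d", "http://example.org/a"]

merged = fuse([engine_one, engine_two])
# "b" is ranked highly by both engines, so it comes out on top.
print(merged)
```

Pages that appear near the top of several source lists accumulate the largest fused score, which is the intuition behind collating single-source results.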
OSINT is a rapidly evolving approach to intelligence collection, and its wide application makes it a useful methodology for numerous practices, including within the criminal investigative community. The Tao of Open Source Cyber Intelligence is your guide to the cutting edge of this information collection capability.
About the author: Stewart K. Bertram is a career intelligence analyst who has spent over a decade working across the fields of counterterrorism, cyber security, corporate investigations and geopolitical analysis. The holder of a Master's degree in Computing and a Master of Letters in Terrorism studies, Stewart is uniquely placed at the cutting edge of intelligence and investigation, where technology and established tradecraft combine. Stewart fuses his academic knowledge with significant professional experience, having used open source intelligence on such diverse real-world topics as the terrorist use of social media in Sub-Saharan Africa and threat assessment at the London Olympic Games. Stewart teaches courses on open source intelligence as well as practising what he preaches in his role as a cyber threat intelligence manager for some of the world's leading private-sector intelligence and security agencies.
Reality Mining: Using Big Data to Engineer a Better World
Big Data is made up of lots of little data: numbers entered into cell phones, addresses entered into GPS devices, visits to websites, online purchases, ATM transactions, and any other activity that leaves a digital trail. Although the abuse of Big Data -- surveillance, spying, hacking -- has made headlines, it shouldn't overshadow the abundant positive applications of Big Data. In Reality Mining, Nathan Eagle and Kate Greene cut through the hype and the headlines to explore the positive potential of Big Data, showing the ways in which the analysis of Big Data ("Reality Mining") can be used to improve human systems as varied as political polling and disease tracking, while considering user privacy. Eagle, a recognized expert in the field, and Greene, an experienced technology journalist, describe Reality Mining at five different levels: the individual, the neighborhood and organization, the city, the nation, and the world. For each level, they first offer a nontechnical explanation of data collection methods and then describe applications and systems that have been or could be built. These include a mobile app that helps smokers quit smoking; a workplace "knowledge system"; the use of GPS, Wi-Fi, and mobile phone data to manage and predict traffic flows; and the analysis of social media to track the spread of disease. Eagle and Greene argue that Big Data, used respectfully and responsibly, can help people live better, healthier, and happier lives.
Data Mining for the Social Sciences: An Introduction
We live in a world of big data: the amount of information collected on human behavior each day is staggering, and exponentially greater than at any time in the past. Additionally, powerful algorithms are capable of churning through seas of data to uncover patterns. Providing a simple and accessible introduction to data mining, Paul Attewell and David B. Monaghan discuss how data mining substantially differs from conventional statistical modeling familiar to most social scientists. The authors also empower social scientists to tap into these new resources and incorporate data mining methodologies in their analytical toolkits. Data Mining for the Social Sciences demystifies the process by describing the diverse set of techniques available, discussing the strengths and weaknesses of various approaches, and giving practical demonstrations of how to carry out analyses using tools in various statistical software packages.
Digital Methods
Digital Methods, by Richard Rogers, is not a toolkit for Internet research, or operating instructions for a software package; it deals with broader questions. How can we study social media to learn something about society rather than about social media use? How can hyperlinks reveal not just the value of a Web site but the politics of association? Rogers proposes repurposing Web-native techniques for research into cultural change and societal conditions. We can learn to reapply such "methods of the medium" as crawling and crowdsourcing, PageRank and similar algorithms, tag clouds and other visualizations; we can learn how they handle hits, likes, tags, date stamps, and other Web-native objects. By "thinking along" with devices and the objects they handle, digital research methods can follow the evolving methods of the medium. Rogers uses this new methodological outlook to examine the findings of inquiries into 9/11 search results, the recognition of climate change skeptics by climate-change-related Web sites, the events surrounding the Srebrenica massacre according to Dutch, Serbian, Bosnian, and Croatian Wikipedias, presidential candidates' social media "friends," and the censorship of the Iranian Web. With Digital Methods, Rogers introduces a new vision and method for Internet research and at the same time applies them to the Web's objects of study, from tiny particles (hyperlinks) to large masses (social media).
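Among the "methods of the medium" Rogers proposes repurposing is PageRank. As a rough illustration of the underlying idea only (the toy hyperlink graph and damping factor below are made up, not taken from the book), a minimal power-iteration sketch looks like this:

```python
# Minimal PageRank sketch: power iteration on a toy hyperlink graph.
# The pages, links, and damping factor are illustrative stand-ins.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: spread its rank evenly over all pages
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(toy_web)
# "c" is linked to by both "a" and "b", so it accumulates the highest rank.
```

The point Rogers draws on is visible even at this scale: a page's standing is conferred by the link structure around it, not by its content alone.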
The second wave of digital-era governance: a quasi-paradigm for government on the Web
Widespread use of the Internet and the Web has transformed the public management 'quasi-paradigm' in advanced industrial countries. The toolkit for public management reform has shifted away from a 'new public management' (NPM) approach stressing fragmentation, competition and incentivization and towards a 'digital-era governance' (DEG) one, focusing on reintegrating services, providing holistic services for citizens and implementing thoroughgoing digital changes in administration. We review the current status of NPM and DEG approaches, showing how the development of the social Web has already helped trigger a 'second wave' of DEG2 changes. Web science and organizational studies are converging swiftly in public management and public services, opening up an extensive agenda for future redesign of state organization and interventions. So far, DEG changes have survived austerity pressures well, whereas key NPM elements have been rolled back.
Web as History: Using Web Archives to Understand the Past and the Present (Chapter 2 - Live versus archive: Comparing a web archive to a population of web pages)
The World Wide Web has now been in use for more than 20 years. From early browsers to today’s principal source of information, entertainment and much else, the Web is an integral part of our daily lives, to the extent that some people believe ‘if it’s not online, it doesn’t exist.’ While this statement is not entirely true, it is becoming increasingly accurate, and reflects the Web’s role as an indispensable treasure trove. It is curious, therefore, that historians and social scientists have thus far made little use of the Web to investigate historical patterns of culture and society, despite making good use of letters, novels, newspapers, radio and television programmes, and other pre-digital artefacts. This volume argues that now is the time to question what we have learnt from the Web so far. The 12 chapters explore this topic from a number of interdisciplinary angles – through histories of national web spaces and case studies of different government and media domains – as well as an introduction that provides an overview of this exciting new area of research.
Mining the Social Web: Data Mining Facebook, Twitter, LinkedIn, Google+, GitHub, and More 2nd Edition
International Journal of Information Management, 2000
[https://web.archive.org/web/20141006092529/http://cpe.njit.edu/dlnotes/CIS/CIS735/StructuringComputerMediated.pdf Structuring Computer-Mediated Communication Systems to Avoid Information Overload - Article, 1985]
[https://web.archive.org/web/20051025082125/http://www.ravid.org/gilad/isr.pdf Information Overload and the Message Dynamics of Online Interaction Spaces: A Theoretical Model and Empirical Exploration, 2004]
Web Scraping with Python: Collecting Data from the Modern Web 1st Edition
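Web Scraping with Python covers collecting data from live sites. As a self-contained taste of the mechanics involved, the sketch below extracts link targets from an HTML document using only the Python standard library; the HTML string is a made-up stand-in for a page that would normally be fetched over the network.

```python
# Minimal scraping sketch using only the standard library: parse an HTML
# document and collect the href of every <a> tag. The HTML below stands in
# for a page that would normally be fetched with urllib.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = """
<html><body>
  <p>Sources: <a href="http://example.org/osint">OSINT</a> and
  <a href="http://example.org/mining">data mining</a>.</p>
</body></html>
"""

collector = LinkCollector()
collector.feed(page)
print(collector.links)
# → ['http://example.org/osint', 'http://example.org/mining']
```

Real scraping adds fetching, politeness, and error handling on top of this parsing step, which is the territory the book covers.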