Abstract: This paper explores some key ways in which the scale and form of information today challenge some of sociology's core methods and practices. Information has shaped sociology in two key ways. First, it has become an object of study, largely in the form of accounts of the epochal shift to 'the information society'. This paper examines interactivity as a key element of such changes, especially in relation to the mass media. The second way in which sociology is being transformed by the growth of information is that, with the growth of huge volumes of commercial transactional information, social data is no longer the preserve of sociologists. Moreover, new tools have emerged to challenge the research methods that lie at the heart of sociology. This paper explores this argument, originally developed by Savage and Burrows, in relation to the BBC World Service's use of social media monitoring tools. It examines some implications of the growth of interactivity, and of the associated new forms of data and new research tools, for the practice of sociology today. It concludes that the vast amount of available information affords new possibilities for sociologists as well as for the organisations that collect it, but that realising these possibilities requires sociologists to develop new tools and practices.
Scanning the Internet and collecting feedback from various sites is only the first step in generating value from social media monitoring. The form of the feedback collected, whether raw data or information, and how the MNC integrates that feedback back into the company are also vital to generating value from the process. Corner et al. (1997) explain that data or information only becomes knowledge when it is interpreted by the receiver. The MNC's use of the information generated from social media monitoring is therefore as important as the content itself. In Dixon's (1999) four steps to organizational learning, she describes how a company can participate in collective learning and convert information into working knowledge for the organization. These steps begin with the generation of widespread information (here, social media monitoring) and end with measuring the results. It is also important for the MNC to know the purpose of its social media monitoring efforts (Fresh Networks Social Media Influences Report, 2010). For example, some companies use social media monitoring tools for brand management, others for public relations, and others for marketing (Newlands, 2011). Depending on the desired end result, different social media monitoring tools can be used. As Newlands (2011) states, both monitoring the information and listening to it are important for the MNC. Listening involves turning the information into collective knowledge within the MNC and applying it to useful ends.
Basically, monitoring is defined as the process of keeping track of, and responding to, online customer reviews, requests, and related information about companies and brands. As online security has become more crucial for companies' commerce and communications over the past few decades, computer viruses have posed the biggest threat to individuals and companies, spreading much like an easily transmitted disease. Since the behaviour of viruses is irregular and unexpected, it is worth learning from the models of the world's main security software companies, such as Symantec, McAfee, and IBM. To a certain degree, these companies have achieved results because they recognise social media as a tool to engage with customers, to evaluate their own or competitors' products, and even to assess the structure of the whole competitive market; specifically, they use social media monitoring tools as a logical workflow to follow online comments and requests and to respond to them in turn.
1. Ongoing analytics – Monitoring that tracks activity over time. These are necessary for keeping up with the overall conversation about your brand or organization and should be aligned with the brand or organization’s business goals. Are we winning or losing?
Social media has emerged as a crucial resource for obtaining population-based signals for various public health monitoring and surveillance tasks, such as pharmacovigilance. There is an abundance of knowledge hidden within social media data, and the volume is growing. Drug-related chatter on social media can include user-generated information that provides insights into public health problems such as abuse, adverse reactions, long-term effects, and multi-drug interactions. Our objective in this paper is to present to the biomedical natural language processing, data science, and public health communities the data sets (annotated and unannotated), tools, and resources that we have collected and created from social media. The data we present were collected from Twitter using the generic and brand names of drugs as keywords, along with their common misspellings. Following the collection of the data, annotation guidelines were created over several iterations; these detail important aspects of social media data annotation and can be used by future researchers for developing similar data sets. The annotation guidelines were followed to prepare data sets for text classification, information extraction, and normalization. In this paper, we discuss the preparation of these guidelines, outline the data sets prepared, and present an overview of our state-of-the-art systems for data collection, supervised classification, and information extraction. In addition to the development of supervised systems for classification and extraction, we developed and released unlabeled data and language models. We discuss the potential uses of these language models in data mining and the large volumes of unlabeled data from which they were generated.
We believe that the summaries and repositories we present here of our data, annotation guidelines, models, and tools will be beneficial to the research community as a single point of entry for all these resources, and will promote further research in this area.
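As a concrete illustration of the keyword-based collection step described above, the sketch below filters tweets against a drug name and simple misspelling variants. The deletion-based variant generator and the sample tweets are illustrative assumptions, not the authors' curated misspelling lists or actual pipeline.

```python
import re

def keyword_variants(name):
    # Illustrative stand-in for a curated misspelling list: the drug name
    # itself plus every one-character-deletion variant of it.
    name = name.lower()
    variants = {name}
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:])
    return variants

def matches_drug(tweet, drug_names):
    # A tweet matches if any of its word tokens equals the drug name or
    # one of the generated misspelling variants.
    tokens = set(re.findall(r"[a-z]+", tweet.lower()))
    return any(tokens & keyword_variants(name) for name in drug_names)

tweets = [
    "started taking metformin last week, feeling dizzy",
    "took metfrmin again today",      # misspelling: one character dropped
    "great weather today",
]
hits = [t for t in tweets if matches_drug(t, ["metformin"])]
```

A real collection would also include brand names and richer variant lists; the point here is only that misspelling matching widens recall on noisy social media text.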
All the surveyed approaches use local streams in their implementations, i.e., streams are not retrieved and consumed online, and vocabularies are known beforehand (c.f. Req. 1). These approaches require new hand-crafted queries to be registered if vocabularies change. The (pre-)processing of streams using external tools (c.f. Req. 2) can be achieved if streams can be created and added to the engine on the fly from outside the engine environment, and if the system can handle several parallel streams. The engines discussed in this paper take two different approaches to handling streams: either data from streams are added to a repository but the streams are never referenced directly, or streams are stored in a repository but must be referenced explicitly in queries. The first approach enables simple collaboration between queries, but keeping the contents of different streams apart can be difficult. The second approach enables referencing specific streams directly, but makes communication between queries complicated.
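The trade-off between the two stream-handling approaches can be made concrete with a toy engine; the class and method names below are illustrative assumptions, not the API of any surveyed engine.

```python
from collections import defaultdict

class ToyStreamEngine:
    """Illustrates the two approaches described above: a shared repository
    that merges all streams, versus explicitly named per-stream storage."""

    def __init__(self):
        self.repository = []            # approach 1: merged repository
        self.named = defaultdict(list)  # approach 2: per-stream storage

    def push(self, stream_id, item):
        self.repository.append(item)        # stream identity is lost here
        self.named[stream_id].append(item)  # stream identity is kept here

    def query_repository(self, predicate):
        # Approach 1: a query sees every stream at once, so queries can
        # collaborate easily, but telling streams apart is difficult.
        return [x for x in self.repository if predicate(x)]

    def query_stream(self, stream_id, predicate):
        # Approach 2: a query must name its stream, so isolating a stream
        # is easy, but combining data across queries is more complicated.
        return [x for x in self.named[stream_id] if predicate(x)]

engine = ToyStreamEngine()
engine.push("sensor-a", "a1")
engine.push("sensor-b", "b1")
engine.push("sensor-a", "a2")
```

A repository query such as `engine.query_repository(lambda x: x.startswith("a"))` returns items from both pushes to `sensor-a` without knowing which stream they came from, while `engine.query_stream("sensor-b", ...)` sees only that one stream.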
Meltwater Buzz is the social media analysis tool that complements its parent product, the more traditional media-monitoring tool, Meltwater. Because of this, Meltwater Buzz only harvests content that it defines as 'social'. It does not look at general online information, such as newspaper articles or other web pages, as this is done by the original Meltwater product. Rather, it looks at what people are saying about certain keywords. The product accesses a comprehensive range of social data sources including Twitter, Facebook, blogs, comments boards, message boards, Wikipedia, YouTube and 'Others'. While it does not harvest complete web pages (unless they are blog posts), as it regards these as non-social media, Meltwater Buzz does harvest individual comments posted on web pages (such as news articles), which is particularly important for those interested in what local people are saying about specific issues. Meltwater Buzz allows the user to tailor the data sources accessed through the ability to block content so that particular data sources do not appear in your results (which is useful for getting rid of irrelevant data). It is not possible for the general user to target searches at specific data sources (such as particular local forums) but it may be possible for the tool administrators to do this. Meltwater is relatively easy to use, though the Boolean search system requires training for those who have never used it before. This search interface provides a flexible and powerful way to filter results.
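The kind of Boolean filtering such a search interface provides can be sketched as follows; the function and its keyword arguments are hypothetical illustrations, not Meltwater Buzz's actual query syntax.

```python
def boolean_match(text, all_of=(), any_of=(), none_of=()):
    # Minimal Boolean keyword filter: every all_of term must appear,
    # at least one any_of term (if any are given) must appear, and no
    # none_of term may appear. Case-insensitive substring matching.
    t = text.lower()
    return (all(k.lower() in t for k in all_of)
            and (not any_of or any(k.lower() in t for k in any_of))
            and not any(k.lower() in t for k in none_of))

comments = [
    "The council's new cycle lane is great for the high street",
    "Bus fares went up again, typical council",
    "Lovely weather at the beach today",
]
# Comments mentioning the council, excluding bus-related chatter:
relevant = [c for c in comments
            if boolean_match(c, all_of=["council"], none_of=["bus"])]
```

Blocking a data source, as described above, is the same idea applied one level up: a `none_of`-style exclusion over sources rather than over comment text.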
These days, the impact of Web 2.0 on both consumers and businesses is rising. Web 2.0 encompasses the large number of interactive and user-controlled online applications through which the power of the customer increases and consumer behavior is affected (Constantinides & Fountain, 2007). After the global recession in 2008, social media marketing appeared as an interesting opportunity for firms because of its notable advantage of reaching a substantial number of people quickly at minimal cost (Kirtis & Karahan, 2011). In 2014, more than 80 percent of the inhabitants of the Netherlands aged 12 and over used social media (Centraal Bureau voor de Statistiek, 2015). Moreover, 10.2 million people in the Netherlands use Facebook frequently (Statista, 2016). As a consequence, social media has become one of the main sources for collecting detailed customer information based on customers' online activities. More and more companies use social media as a new marketing channel (Constantinides, 2009). The use of social media has also resulted in increased customer empowerment, by now a foundational element of the Web 2.0 landscape (Constantinides, Romero & Boria, 2008). While several studies use the terms 'Web 2.0' and 'social media' interchangeably, their meanings differ (Kaplan & Haenlein, 2010). As Constantinides (2014) states, social media are the Web 2.0 platforms (e.g. Facebook, Pinterest and YouTube). Since social media provide new possibilities and make extensive information available to the customer, the customer is more difficult to convince and to engage with your company (Constantinides, 2014).
However, traditional media news organizations have also been skeptical of new technology. As stated previously, even though many technological innovations had potential advantages, some news-organization decision makers have been hesitant to adopt them (Arceneaux & Weiss, 2010; Garrison, 2001b; Nguyen, 2008). Nguyen (2008) noted that several news managers, fearing the Web might cannibalize revenues generated from traditional-media products, relied on a defensive strategy to protect their organizations' profits. In Garrison's (2001b) survey about newspaper newsrooms and Internet adoption, managers cited several specific reasons for their reluctance to adopt the new medium as a news-reporting tool: "lack of resources to invest in new technology, lack of expertise, fear of lost time required to learn and not enough time in the work schedule" (p. 232). Several analysts presume that this unwillingness to utilize the Internet for newsgathering, news distribution, and advertising purposes is a primary factor in television stations and newspapers ending up in the crosshairs of creative destruction (Hamilton, 2004; Kaye, 2010; McChesney, 2010; Meyer, 2009).
virtual level. Only the activity of each organizational member will separate one from another. Active users can turn this to their advantage: they have the tools, the opportunity, and the access to discuss various topics with users at different organizational levels. Having more information available will enhance the quality of acquired knowledge, as the understanding of what is important to the organization increases. Therefore, H2 is supported. However, the data do not support a significant difference in knowledge assimilation between active and conservative users, and therefore neither H3 nor H4 is supported. Together with the earlier regression analysis, this implies that organizational-level measures are required for addressing assimilation. Even though none of the case organizations included scenarios in which online collaboration tools were used for direct communication with external parties at the time of data collection, those organizational members who are also active users consider that OCT will also reflect positively on communication with external parties. This understanding is mainly based on expectations, but it opens interesting avenues for discussion. Active users have noticed the benefits for internal communication and collaboration, and they may be eager to apply the same tools to enhance communication and collaboration with external parties. Naturally, there will be different challenges; for example, the level of trust is quite different, and innovative, open communication and collaboration between organizational members and external parties might be extremely difficult to achieve (Lohikoski & Haapasalo, 2013; Martins et al., 2004; Von Krogh, 2002). A summary of the statistically significant differences between conservative and active users is presented in Table 4.8.
We will now apply the notions of identity formation, oscillation, mimicry, being stuck and threshold concepts to our experience of using social media tools for a doctoral study. Our discussion relates our experiences to these constructs and identifies the benefits and challenges of using social media practices to support doctoral studies. We experienced social media tools both as supporting academic identity formation and as inhibiting it. When we felt confident, we participated in and contributed to the online community through use of #phdchat,1 and then we (mostly) reaped the benefits of being connected to and challenged by the community of PhD students and of developing our academic identity and profile. The social media tools had particular potential to connect us to a vast and 'always connected' audience whom we experienced as willing to engage in discussions about our academic area, and to overcome the isolation that is typically felt by PhD students. Our experience is supported by the findings of Mewburn (2011), who has identified the process of talking about the challenges of doctoral study, a notion that she calls 'troubles talk', as an essential part of doctoral student identity formation, and argues that it is through 'troubles talk' that the doctoral student experiences the PhD community. She argues that online spaces, such as blogs and Twitter, provide a place for the important aspects of 'performing, managing, displaying and controlling emotional reactions to the exigencies of doctoral study' (p. 330).
There are other limitations to our methodology. As has been noted, only a small proportion of the participants actually completed the course: since our methodology involved analysing blog posts, those who did not finish the course will be under-represented within our findings. Although some of them did use their blogs to chart their struggles and eventual decision to give up, others simply disappeared without trace: we will not have fully captured their experience. The small scale of the project would also be a significant limitation if any attempts were made to claim that it can fully represent a wider population. We make no such claims: rather, we offer it as an example of one intervention which made some progress in engaging academics with social media tools, and consider the success of this specific project. Participants were not informed that their blogs would be used as part of the project evaluation. We considered the ethical implications of this, but concluded that, since the blogs are in the public domain, it was an acceptable approach. Informing the participants that their blogs would form part of the evaluation might have affected what and how they chose to post, and it was important that the blogs represented an honest reflection of their experience. Of course, a risk remained that they would shape their posts to reflect their perceptions of our 'desired' outcomes, particularly given that we were active commenters on their blogs: as with the 'dropout' bloggers, it is likely that our analysis over-emphasises the positive experiences and under-represents the negative ones.
systems are inevitable on every computer system, especially security-related ones, while others are useful in particular environments, depending on user requirements and limitations as well as system-specific features. When planning a cluster monitoring system, and in order to utilize both the cluster's inherent facilities and self-contained monitoring systems to obtain the maximal amount of crucial information with minimal impact on system performance, one should think carefully about the type of cluster to be used and its specific features.
Global trends in the patterns of online behavior (using Web 2.0 as a potential eWOM facility) are also starting to be established in the academic literature. Using data from a global panel study, Jobs (2011) and Jobs and Gilfoil (2011) found that developing countries, when compared to technologically and economically developed countries, are adopting micro-blogging services at a significantly greater relative rate than social networking services. The latter study attempted to explain these divergent usage patterns using Hofstede's cultural dimensions model. In a follow-up study, Gilfoil and Jobs (2011) found that there are currently more sellers than buyers using four social media broadcast tools across the sixteen individual countries analyzed (with the exception of China), and that the size of the gaps between sell and buy activities varied widely by country. While these individual-country sell-biased gaps were interesting, the authors suggested that additional research be initiated to investigate sell versus buy motivational usage of a broader set of social media tools, with an eye towards mapping these tools (as appropriate sell or buy platforms for marketers or consumers) in emerging and/or developed economies around the globe.
Ganglia, LiveRAC and ENaVis are cluster monitoring tools that collect per-node as well as system-wide data for several high-level variables (e.g., CPU utilization, I/O throughput, free disk space for each monitored node). However, they mainly focus on high-level variables and track only system-wide totals. They are usually used to help flag misbehaviors (e.g., "a node went down"). Our system correlates OS metrics with the high-level MapReduce abstraction, and is thus more powerful for debugging MapReduce performance problems. Artemis is a pluggable framework for distributed log collection, analysis and visualization. Our system collected and analyzed MapReduce-level information from Hadoop logs as well as OS-level metrics on cluster machines. We could build our system as Artemis plugins for online monitoring.
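Why per-node data matter beyond system-wide totals is easy to see in a small sketch; the metric values and the flagging threshold below are illustrative assumptions, not the behavior of any of the tools named above.

```python
def flag_misbehaving(node_cpu, factor=0.5):
    # Flag nodes whose CPU utilization falls far below the cluster mean --
    # the "a node went down" style of check described above. The factor
    # threshold is an arbitrary illustrative choice.
    mean = sum(node_cpu.values()) / len(node_cpu)
    return sorted(n for n, v in node_cpu.items() if v < factor * mean)

nodes = {"node1": 0.80, "node2": 0.75, "node3": 0.00}  # node3 is down
cluster_mean = sum(nodes.values()) / len(nodes)  # ~0.52: total looks fine
down = flag_misbehaving(nodes)                   # per-node view finds node3
```

The system-wide mean alone looks unremarkable here; only the per-node breakdown exposes the failed node, which is the gap a MapReduce-aware, per-node correlation aims to close.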
Matrix tests are a specific case of the complex tests, testing a specific set of services but across the whole Grid. The matrix tests, while useful, impose too high a load to be run on a regular basis. We consider them a complement to the passive tests derived from middleware instrumentation. As already pointed out in Sec. 3 and more thoroughly discussed in , status monitoring based on (active) tests is suboptimal since it places additional load on the infrastructure under test. A less intrusive strategy is passive monitoring, which relies on information sent by applications and instrumented middleware running on the grid. This is not yet included in the current set of monitoring tools, but we are working with the GAT developers to add new service interfaces to the middleware. These interfaces will provide information that will be collected by the specific version of the worm, processed by it, and sent via shepherds to the central monitoring service. In this way the use of the instrumented middleware will have no negative effect on the applications (no direct interaction with a remote service will be necessary). Worms can also start specific local tests if the information collected by the instrumented middleware is insufficient (e.g., because of low application load).
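The passive-first strategy described above (rely on records pushed by the instrumented middleware, and start a local test only when too little information has arrived) can be sketched as follows; the class, the record format, and the threshold are hypothetical illustrations, not the project's actual worm or shepherd interfaces.

```python
from collections import defaultdict

class PassiveMonitor:
    """Prefers passively collected records; falls back to an active
    local test only when passive data are insufficient."""

    def __init__(self, min_records=2):
        self.records = defaultdict(list)
        self.min_records = min_records

    def ingest(self, service, record):
        # A record pushed by instrumented middleware: no extra load on
        # the infrastructure under test.
        self.records[service].append(record)

    def status(self, service, run_local_test):
        recs = self.records[service]
        if len(recs) >= self.min_records:
            return "ok" if all(r == "ok" for r in recs) else "degraded"
        # Insufficient passive information (e.g., low application load):
        # start an active local test as a fallback.
        return run_local_test(service)

active_calls = []

def local_test(service):
    active_calls.append(service)  # stand-in for a worm-launched local test
    return "ok"

mon = PassiveMonitor(min_records=2)
mon.ingest("gridftp", "ok")
first = mon.status("gridftp", local_test)   # one record: active fallback
mon.ingest("gridftp", "ok")
second = mon.status("gridftp", local_test)  # enough passive data now
```

Only the first status query triggers an active test; once enough passive records have arrived, the monitor answers without touching the service, which is exactly the load-reduction argument made above.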