The data collected from this study yielded three findings. First, the pattern of Web 2.0 application use on academic library websites is almost the same in public and private universities. In particular, the order of popularity of Web 2.0 applications implemented on library websites is as follows: Facebook, blogs, Twitter, RSS, live chat, streaming media and wikis. Thus, according to the classification developed by Chua and Goh (Table 2), it can be concluded that most library websites use Web 2.0 applications for “Information sharing”. Second, it was observed that links to the Web 2.0 applications were commonly placed on the main page, although some library websites placed the links on another, less visible page. Third, the present study found that Web 2.0 applications, with the exception of Facebook, have not yet been widely used on academic library websites in Malaysia. Most of the libraries use email or an online form, rather than live reference chat, to communicate with their users. Most of them also use forms in .pdf format, rather than online forms, for material requests.
Some time ago I set up a Delicious.com subject resource for the Centre of Translation and Comparative Cultural Studies postgraduate students. The idea was to enable users to search for useful websites using tags and to promote collaboration between students. Potentially useful web pages relating to Translation and Comparative Cultural Studies were saved to http://delicious.com/CTCCS, and, using simple HTML code, a tag cloud was generated on the Library subject web page. This project was designed to replace the traditional linear list of useful website links with a more interactive, searchable web page. In addition, users are able to recommend and add resources to the Delicious account at any time.
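The tag-cloud idea can be sketched as follows: scale each tag's font size by its bookmark count and emit links back to the shared account. This is a minimal illustration, not the actual code used on the subject page; the tag names and counts are invented, and only the http://delicious.com/CTCCS URL comes from the text above.

```python
# Minimal sketch: render a tag cloud as HTML from tag frequencies.
# Tag names and counts here are illustrative, not the actual CTCCS data.
def tag_cloud_html(tag_counts, min_px=12, max_px=32):
    lo, hi = min(tag_counts.values()), max(tag_counts.values())
    spread = (hi - lo) or 1  # avoid division by zero when all counts match
    links = []
    for tag, count in sorted(tag_counts.items()):
        # linearly interpolate the font size between min_px and max_px
        size = min_px + (count - lo) * (max_px - min_px) // spread
        links.append(
            f'<a href="http://delicious.com/CTCCS/{tag}" '
            f'style="font-size:{size}px">{tag}</a>'
        )
    return " ".join(links)

cloud = tag_cloud_html({"translation": 12, "culture": 5, "theory": 2})
```

The most frequent tag renders at the largest size, giving users a visual cue to the collection's emphasis while every link remains searchable by tag.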
The study was mainly based on primary data. In order to achieve the objectives of the study, a structured questionnaire was constructed to collect data. The study was conducted among 47 librarians randomly selected from different college libraries in India. In the present study, the main purpose of the questionnaire was to collect data on the use of web information services by library professionals as research scholars. The data were analyzed using the percentage method. The study was designed not only to capture current attitudes and patterns of adoption but also to identify researchers’ needs and aspirations, and the problems they encounter. The study began with a survey, which collected information about researchers’ information-gathering and dissemination habits and their attitudes towards Web 2.0.
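The “percentage method” referred to above simply reports each response category as a share of the total respondents. A minimal sketch, using hypothetical answer categories and counts (the 47 respondents matches the study; the rest is invented):

```python
# Percentage method: express each answer category as a share of respondents.
# The categories and counts below are hypothetical example data.
def percentages(counts, total):
    return {k: round(100.0 * v / total, 1) for k, v in counts.items()}

result = percentages({"daily": 20, "weekly": 17, "rarely": 10}, total=47)
```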
own. Use of open standards improves interoperability so that we are not locked into a specific implementation. Delegating some components to third-party services makes agile development possible and improves sustainability. We have chosen to investigate REST, Web 2.0, Gadget and OpenSocial. REST makes our gadgets able to access backend services easily without incurring complex message manipulation operations (compared with SOAP). The gadget specification is used to build modular web components (gadgets) that can be deployed to any standard-compliant gadget container. Numerous existing gadgets in iGoogle and Orkut, developed by programmers across the world, can be used by our gateway easily. This is a notable improvement over the older portlet component model, which required server-side (rather than client-side) integration and which never developed the extensive library of standard, portable components that was anticipated. OpenSocial also makes it possible for our portal to access users’ social data in existing social networking services. The third-party services used in our current implementation include Google Picasa, Google Calendar and Twitter. They are used to save filter execution results, filter events and filter execution status notifications.
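The point about REST avoiding SOAP's message-manipulation overhead can be made concrete: a REST call is just a URL with query parameters, and the reply is plain JSON, with no envelope to assemble or parse. The endpoint, resource path and parameters below are hypothetical, not the gateway's actual API.

```python
import json
from urllib.parse import urlencode

# Sketch of REST's light weight for gadgets: the request is a plain URL
# and the response is plain JSON -- no SOAP envelope on either side.
# The base URL, path and "status" field are assumed for illustration.
def build_status_request(base_url, filter_id):
    return f"{base_url}/filters/{filter_id}/status?" + urlencode({"format": "json"})

def parse_status(body):
    return json.loads(body)["status"]

url = build_status_request("http://gateway.example.org/api", 42)
status = parse_status('{"status": "running"}')
```

A client-side gadget can issue such a request directly from the browser, which is what makes the client-side integration model workable where server-side portlets were not.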
Other contributors to Warwick’s Facebook presence include our politics and international studies librarian, who has started to experiment with Facebook as a way of fielding subject-specific enquiries; a librarian at Wolverhampton University learning centres, who provided us with the basic code from which our library links box was created; and one of our teaching grid advisors, with whom I spent a fun afternoon trying to get our blogs to feed into Facebook pages after the recent interface changes.
Marcel Linnenfelser et al. published a work that aims to define a system of qualified criteria for comparing RIA platforms. The evaluated technologies were AJAX, Microsoft Silverlight, Adobe Flex and JavaFX. They claim that AJAX and Adobe Flex are the most dominant; AJAX in particular is supported by the most popular browsers and does not require the installation of additional plug-ins or runtime environments. Adobe Flex generates a graphically rich Flash file that demands more client-side performance and higher bandwidth, which is the major drawback of this technology.
own fieldwork, for example, necessarily portrays traits of Internet use specific to young students in the privileged economic context (at least relative to global wealth distribution) of a western global city), and the relevance of specific technologies and narratives is debated in ethnographic studies of Internet users, an important common element of distinction between the beginning and the end of the 2000s is the availability of technical affordances designed to support (in various ways and to different extents) the practices of appropriation and reconfiguration that were already visible in earlier times. These affordances constitute what I grouped under the label of Read/Write Internet in Chapter 2, whereas the commonly known label for these is Web 2.0. This narrative was articulated within the hegemonic discourse through which ‘the Web’ has been experienced by users during most of the 2000s, and it has also informed the actual software engineering practices of successive versions of Web 2.0 sites and applications, as well as shaping mainstream media coverage: for these reasons, it is useful to start from the Web 2.0 discourse when tracing the historical trajectory of lifeworld and Read/Write Internet. Both within academic and non-academic literature over the past decade, the exact traits and scope of Web 2.0 have been discussed in countless different ways, most of them informed by the broad definition proposed by O’Reilly (2005b) and his successive simplified versions (O’Reilly 2005a, 2006), while still providing a puzzling multitude of often irreconcilable focuses: the main point of agreement seems to be, indeed, that there is no substantial agreement (Floridi 2014, ch. 7).
My aim here, therefore, is not to try to find a common ground between the innumerable existing interpretations nor to propose yet another one: rather, it is to highlight how the computational foundation of the core traits of O’Reilly’s Web 2.0 manifesto relates to the analysis of computational agency at the core of my own research, as well as to the materiality of user experiences as described by research participants throughout my fieldwork.
Web 2.0 may be considered a back-to-basics web, with its social and community features perfectly illustrated by the unprecedented development of Wikipedia, which is generally agreed to be as reliable as, if not more reliable than, the best printed encyclopaedias. It is only a "partial" return to the internet's roots, though, because the commercial web is and will remain a linchpin of Web 2.0. Web 2.0 is admittedly a collaborative and social web, but it remains a commercial web, whether directly or indirectly. Many Web 2.0 services are commercial activities, either directly (sale of services or payment of a fee) or indirectly (sale of advertising space and/or customer data).
To gain an understanding of those tools, we tested twenty Web 2.0 collaborative tools. The idea partially came from a previous study by , in which seven of the tools were adopted, while we reviewed the remaining tools based on our experience: eXo Platform, Basecamp, Zoho Project, Wrike, Asana, Huddle, Mavenlink, Trello, ProWorkflow, Skype, Google Hangout, Zimbra, Groupware, WebEx, PHProject, Bluetie, Microsoft SharePoint, Kune, and Microsoft Office Groove. The researchers tried to cover a wide range of tools with differing features: from freeware to paid, from client-server to hybrid architectures, and from charting to project management and document management.
Whenever a new form of communication appears on the scene, it immediately becomes the object of discussion. This has been going on since the first penny press edition in 1834, whereas today discussions are carried out with reference to the Internet. The stability with which the mass media have faced various criticisms can be well understood thanks to functionalist analysis, which considers the media as a social system working within an external system made up of a set of cultural and social conditions. In spite of its complexity, any set of repetitive actions contributes to maintaining or to weakening the stability of the system. We can say that globalization would not have been possible without the media, and Web 2.0 may be of remarkable interest for its role in influencing cultural identity. All past technologies, from electric light to the airplane, took a whole generation to gain ground among people, but the Internet has not required such a long time. The inability to digest the new modalities of communication offered by the net creates the risk of unexpected contamination. Geographical magazines often show pictures of native Amazonians dressed in their traditional costumes while using computers and mobile phones. Educational uses of Web 2.0 and mobile learning tools have expanded rapidly over the last few years, and a great number of projects have been planned for teaching languages. Mobile learning includes many areas: handheld computers, MP3 players, notebooks and mobile phones. In this paper we shall outline the methodology, including the selection of web tools, task design, implementation and intercultural communication. The study carried out at the University of Florence shows that learners develop their communicative competence while performing entertaining activities which enable them to achieve the desired goals.
The second support team was the analysis and research team, in charge of analyzing information received on the platform and providing situation-room reports in the form of data visualizations. The team was to release reports twice daily that aggregated the reports received, depicting trends and collated statistics. The largest challenge for the analysis team was the “dirty” raw data that had not been properly coded or cleaned. This made analysis on a tight timeline difficult, since some reports had not been well categorized, and the data team therefore had to check their content and categorize them manually before analysis. During the election period, Kenyans mobilized each other primarily through SMS and social media (Facebook and Twitter). Politicians mobilized citizens through political rallies and both traditional media (TV, radio, billboards, etc.) and social media (Twitter, Facebook). The Uchaguzi platform utilized SMS, social media, email and a web form in order to offer citizens a variety of options through which to report. SMS was the most utilized channel due to its ease of access for most citizens. Many Kenyan media houses have also embraced the rising role of new media in coverage of Kenya’s elections and the mobilization of public opinion around such tools. Agencies such as Nation Media and the Standard Group harnessed social media by using it to broadcast their news and engage in dialogue with citizens. There were also various Twitter hashtags that were used during the elections, such as #KenyaDecides, #kenyaelections13, #uchaguzi, #choice2013, #Ke2013 and
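The categorization bottleneck described above is essentially a labelling step: each incoming report must be assigned to a category before it can feed the aggregated statistics. A minimal sketch of such a step, with entirely hypothetical categories and keywords (the actual Uchaguzi workflow and taxonomy are not specified in the text):

```python
# Hypothetical sketch of the categorisation step the data team performed:
# assign an incoming report to a category by keyword match, otherwise
# flag it for manual review. Categories and keywords are invented.
CATEGORIES = {
    "violence": ["fight", "clash", "attack"],
    "logistics": ["ballot", "queue", "delay"],
}

def categorise(report_text):
    text = report_text.lower()
    for category, keywords in CATEGORIES.items():
        if any(word in text for word in keywords):
            return category
    return "uncategorised"  # left for manual review

label = categorise("Long queue and delay at the polling station")
```

Even a crude keyword pass like this reduces the volume of reports requiring manual inspection, which matters when reports must be aggregated twice daily.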
Figure 1 shows the overall architecture of our proposed integration model. This system consists of six components: (a) Tools, external web tools that provide services to clients; (b) Integration Manager, which holds service information, provides communication between tools and clients, and is responsible for integration operations in the system; (c) Filter, which performs two-way data filtering; (d) Permission Handler, which checks an existing Digital Entity’s (DE) permissions or builds a new permission token for new DEs; (e) Data Manager, which provides a mechanism to extract data from and insert data into a repository; and (f) Storage, which maintains user data and permissions in the database.
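The division of responsibilities among the components can be illustrated with a skeletal sketch. The class names follow the component names in the architecture, but the method names and bodies are placeholders invented for illustration, not the paper's implementation.

```python
# Skeletal sketch of the six-component model; component names follow the
# text, while methods and data shapes are assumed for illustration only.
class Storage:
    """Maintains user data and permissions."""
    def __init__(self):
        self.data, self.permissions = {}, {}

class DataManager:
    """Extracts data from and inserts data into the repository."""
    def __init__(self, storage):
        self.storage = storage
    def insert(self, key, value):
        self.storage.data[key] = value
    def extract(self, key):
        return self.storage.data.get(key)

class PermissionHandler:
    """Builds a permission token for a new Digital Entity (DE)."""
    def token_for(self, entity_id, permissions):
        return {"entity": entity_id, "perms": permissions}

class Filter:
    """Two-way data filtering (stubbed as a pass-through here)."""
    def apply(self, payload):
        return payload

class IntegrationManager:
    """Routes traffic between external tools, clients, and storage."""
    def __init__(self, data_manager, permission_handler, data_filter):
        self.dm, self.ph, self.f = data_manager, permission_handler, data_filter
    def store_from_tool(self, entity_id, payload):
        token = self.ph.token_for(entity_id, ["read", "write"])
        self.dm.insert(entity_id, self.f.apply(payload))
        return token

mgr = IntegrationManager(DataManager(Storage()), PermissionHandler(), Filter())
token = mgr.store_from_tool("de-1", {"doc": "draft"})
```

The key design point the sketch preserves is that external tools never touch Storage directly: everything passes through the Integration Manager, which applies filtering and permission checks on the way in and out.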
2.2 Linkage comparisons: BIaaS vs. Supply Chain

Rungtusanatham et al.  introduced the concept of linkages for the supply chain, defining it as “explicit and/or implicit connections that a firm creates with critical entities of its supply chain in order to manage the flow and/or quality of inputs from suppliers into the firm and of outputs from the firm to customers.” There is another type of information-based linkage that can improve the visibility of customers’ and suppliers’ operational activities . Barratt and Barratt  present their external and internal supply chain linkages and use a coffee case study to demonstrate linkages in relationships and business activities between different roles and companies. Although they show a workflow diagram, data analysis and three propositions, their presentation remains a conceptual framework without any implementations or services in place.
Product design course lecturers were invited to form an intentional COP (Langelier, 2005) to investigate the use of Web 2.0 tools within their teaching. This first attempt at establishing a lecturer COP was short-lived, although one lecturer was motivated to explore these ideas further in 2007. While no formal changes were made to the traditional paper-based implementation of the major project in 2006, reflections on these experiences merged to form the foundational concepts underpinning subsequent implementation and research into mobile learning. The 2006 trials were also used to develop and test the research questions and data collection instruments.
This year students will be required to undertake a regular 'nomadic' session where they work away from the studio but continue collaborative and learning conversations via mobile Web 2.0 connectivity. Social software tools can be effectively integrated into both face-to-face and online environments; the most promising settings for a pedagogy that capitalizes on the capabilities of these tools are fully online or blended, so that students can engage with peers, instructors, and the community in creating and sharing ideas (McLoughlin & Lee, 2008, p. 3). Throughout the duration of the final year of Product Design, students will be required to integrate Web 2.0 into their studio practice. To this end, the programme will be providing smartphones (Nokia N95) and a weekly community of practice meeting that will focus on understanding and experimenting with Web 2.0 tools and technologies. Throughout ShaC09, data sharing will be enabled through a range of software applications. Staff and students will make project work and resources available to the rest of the world online, via blogs, wikis and other Web 2.0 applications. Moving further away from the Atelier Method environment and building upon the work carried out in 2008, our research focus for 2009 is the seamless integration of Web 2.0 into the Bachelor of Product Design, as well as augmenting the level of flexibility for students to allow them to choose to work in virtually any context on and off campus.
Underlying i-SSIS is the raw data on the Web, which is crawled and processed by the modules in the Data Collection Layer. The crawlers in i-SSIS fetch pages from websites to gather data involving human resource and social security information. The crawlers are embedded with inner analyzers to filter out unnecessary information during crawling, which means that only documents involving social security information, such as the person, the organization, the role and the insurance, are collected. The raw data then pass through three procedures to become entities managed by the database system. Firstly, the information on i-SSIS entities, e.g., people and organizations, is extracted with information preprocessing tools such as natural language processing tools. In this procedure, Web documents are initially summarized with a statistics tool to find their themes and then are processed by information
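The crawler's inner analyzer amounts to a relevance gate: a fetched page is kept only if it mentions the entity types of interest. A minimal sketch under that reading, with the entity terms taken from the text above but the matching logic (bag-of-words overlap) assumed for illustration:

```python
# Sketch of the crawler's inner analyzer: keep only pages mentioning the
# entity types of interest. The simple word-overlap test is an assumption;
# the actual analyzers use NLP preprocessing tools.
ENTITY_TERMS = {"person", "organization", "role", "insurance"}

def relevant(document_text):
    words = set(document_text.lower().split())
    return bool(words & ENTITY_TERMS)

pages = [
    "insurance claim filed by the person with the organization",
    "weather report for tuesday",
]
kept = [p for p in pages if relevant(p)]
```

Filtering during the crawl, rather than after it, keeps irrelevant documents from ever entering the three downstream processing procedures.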
In this paper, we present our initial design for constructing the E-learner’s Collective Intelligence System Framework. Our methodology is based on novel web usage mining techniques; it introduces a novel approach to collective intelligence using mashup and Web 2.0 technologies and incorporates an artificial neural network to predict potential e-learners’ future patterns and needs. The remainder of this paper is organized as follows. Section 2 describes the background and development of E-Learning and E-Learning 2.0, collective intelligence, classifying intelligence, the E-Learning 2.0 ecosystem and artificial neural networks. Section 3 explains the collaboration of collective intelligence and Web 2.0 in detail. Section 4 provides an overview of the latest Web 2.0 technologies which can be used to integrate the process. Section 5 describes the collaborative web service for CI. Section 6 describes the e-learner knowledge discovery layer, which comprises different layers discussed in detail in non-linear order, and also describes the E-learners Collective Intelligence System (ECIS) Framework. Section 7 discusses the performance evaluation. Finally, Section 8 presents the conclusion and future work.
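The idea of using a neural network to predict e-learner patterns can be illustrated at its smallest scale: a single trained neuron mapping usage features to a yes/no prediction. The features, data and training scheme below are entirely invented for illustration; the paper's actual network architecture and mined features are not specified here.

```python
# Minimal single-neuron sketch of the ANN idea: predict whether a learner
# will revisit a resource from two hypothetical usage features
# (normalised visit count, normalised time-on-page). All data is invented.
def train_perceptron(samples, epochs=50, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred  # perceptron update rule
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0.9, 0.8), 1), ((0.1, 0.2), 0), ((0.8, 0.9), 1), ((0.2, 0.1), 0)]
w, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

In the full framework, the features would come from the web usage mining stage rather than being hand-supplied, and a multi-layer network would replace the single neuron.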