
Evaluating Smart City Learning

Penelope J Lister

Abstract – Measurement and analysis of individually interpreted learning experiences can build a knowledge picture of how learners perceive immersive technology-mediated learning in smart cities. Comparison of these learning experiences, with theoretical factors derived from relevant literature, may then shed light on the usefulness of theory in practical learning design and approaches to the evaluation of immersive learning environments analysed from a theoretical basis. In turn, this may contribute to current approaches of urban smart city environment planning for citizen engaged ‘human smart cities’ [14]. Mobile learning location-based prototypes will be developed with subject experts and implemented in open (urban) spaces located at Upper Barrakka Gardens, Valletta for history and Argotti Gardens, Floriana for botany. This paper discusses potential methodologies for designing a measurement of the effectiveness of these learning experiences and associated learning design for immersive urban learning environments mediated by mobile and networked technologies.

Acknowledging the hybrid nature [9] of smart city learning, interactions between digital tools, content and community, measuring both intra- and inter-learner experiences is anticipated. Identifying and quantifying these dimensions of interactions will help us understand more about how urban smart learning activities create immersive experiences for each learner, engaging them in a variety of internal cognitive and social processes. To clarify mutual interaction between theoretical and empirical factors, a system of theoretical factors of significance is proposed to be developed, and then correlated, with learning experience analysis factors.

A brief review of hybrid learning environment research, including ubiquitous learning [4] manifested in hybrid [9], mobile [8] and smart city [2] environments, provides context for how analytical methodology might be applied to an interactive learning system in smart cities. Phenomenographic techniques of variation and outcome space are investigated, together with the Dialogic Space concept [30] of conversation interaction for analysing dialogues.

Keywords—smart city learning; mobile learning; networked learning; interactions; evaluation system; connectivism; dialogism; community; social media

I. INTRODUCTION

An evaluation system is being proposed to measure effective learning, considering the learner, the underlying design and the authentic immersive environment [24], using pedagogical theory as a basis for measurement approaches. The design of the evaluation


system must be versatile in order to measure and analyse the proposed learning experiences, and then to make measurable connections with theoretical factors derived from an analysis of relevant theory and research discourse.

A. The context of the learning design

The context of the learning design is defined here as incorporating the pedagogical approach taken in the (explicit or implicit) design, the affordance of digital tool(s), the interface design in relation to the learning design [1], the ‘target audiences’ of the learning design, and the authentic space in which the learning is designed for promoting participation ([8], [27], [7], [4]).

B. The context of the authentic environment

The context of the authentic environment is defined here as learning experiences located in geo-responsive physical environments that mediate interactions between persons, technology and the ubiquitous learning [4] space around them. These experiences may involve synchronous and asynchronous individual interactions with content and a community of learners in the network of participants of the learning experience [28], with digital tools mediating those experiences and facilitating the storage of constructed knowledge in the system ([29], [5]).

C. Mobile learning, WAY-Cyberparks and Smart Data

Mobile learning (ML) prototypes will be developed with subject experts and implemented in open (urban) spaces. At the Upper Barrakka Gardens, Valletta the ML activity is about an identified historical event and in the Argotti Gardens, Floriana about the history and architecture of the place and the potential learning experiences in botany that can be developed at this site. Plans for using similar mobile learning location-based prototypes for other information rich spaces related to different curricular areas such as visual and performing arts will be developed and evaluated as the project progresses. These mobile learning experiences will be mediated by the Way-Cyberparks application (an EU COST funded project research initiative). “CyberParks’ main objective is to create a research platform on the relationship between Information and Communication Technologies (ICT) and the production of public open spaces, and their relevance to sustainable urban development. The impact of this relationship will be explored from social, ecological and urban design perspectives.” [30]. Augmented reality (mobile) learning may form a potentially significant part of this research.

Smart data is gathered by the WAY-Cyberparks application, in that users running the mobile application on their phones (and actively logged in) can walk through a public space and interact with it through the application. The mobile app persistently collects data about their ‘itinerary’ that provides researchers with information to develop knowledge on the interactions between users and that space over time. This data can then be used to enhance user experiences when visiting public spaces. This means that over time, a hybrid


immersive technology mediated learning experience can utilise what the community of learners has constructed as knowledge to enhance the overall personal experience for each learner. For clarity of interpretation, the term ‘smart city learning’ is used here to refer generally to the use of these types of large evolving data sets, which can inform design, content or interaction, sometimes instantaneously.
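As one illustration of how such itinerary data might be analysed, the sketch below aggregates dwell time per point of interest from location pings. The data model (user, PoI, timestamp tuples) is an assumption for illustration only, not the actual WAY-Cyberparks schema.

```python
from collections import defaultdict

# Hypothetical data model: each ping is (user_id, poi_id, timestamp_seconds);
# a poi_id of None means the user is outside any point-of-interest geofence.

def dwell_time_per_poi(pings):
    """Sum the time each user spends inside each point of interest.

    Consecutive pings from the same user inside the same PoI contribute
    the interval between them to that PoI's dwell time.
    """
    dwell = defaultdict(float)          # (user_id, poi_id) -> seconds
    last = {}                           # user_id -> (poi_id, timestamp)
    for user, poi, ts in sorted(pings, key=lambda p: (p[0], p[2])):
        prev = last.get(user)
        if prev is not None and poi is not None and prev[0] == poi:
            dwell[(user, poi)] += ts - prev[1]
        last[user] = (poi, ts)
    return dict(dwell)

pings = [
    ("u1", "barrakka_hotpoint", 0),
    ("u1", "barrakka_hotpoint", 60),
    ("u1", None, 120),                  # user leaves the geofence
    ("u1", "barrakka_hotpoint", 300),
    ("u1", "barrakka_hotpoint", 360),
]
print(dwell_time_per_poi(pings))        # {('u1', 'barrakka_hotpoint'): 120.0}
```

Aggregated over many user-learners, such dwell times could be one input to the ‘place and space’ questions discussed later in this paper.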

II. LITERATURE

A. The Smart Learning City

Buchem & Pérez-Sanagustín [7] offer useful definitions of smart city learning, ‘… as “open libraries” containing a huge number of resources, such as buildings or artworks, that can be used for learning…’ [14], and ‘… encompasses formal, informal and mixed learning experiences in urban spaces […] with embedded technologies, supporting new kinds of learning, especially constructing contextual knowledge by moving and operating in an authentic environment’. The authentic environment that learners inhabit impacts on their perceptions of a learning experience, as ‘the location from which the individual participant accesses (the) online environment is an integral element in the participant’s learning experience’ [16]. This has the potential of ‘transforming learners into active citizens’ [2], in a ‘participatory urbanism’ [7] of smart city living. Buchem & Pérez-Sanagustín provide some inspiration for measuring the impact of an authentic environment on a smart city learning experience with their discussion of blended spaces in the ‘movements of everyday life’, moving between localness and virtuality, allowing learners to play active roles using digital tools of choice and compiling their own learning experiences ([8], [4]).

B. The Interactive System

The interactive system manifested in smart city learning can be considered as a context that provides interactions with subject content in a particular area of knowledge, through a digital environment or tool and involving interpersonal interaction within a community. In this context, evaluation of learning experiences is fundamentally about interactions mediated by technology between learners, content and other learners in a networked community. These interactions create a ‘seamless’ [27] and ‘glocal’ (Certeau, 1988 in [7], [24]) learning experience that is enriched by augmented reality [7] through which learner citizens progress in their awareness, knowledge and competence development. Also described as ‘geo-learning’ [27], smart city learning experiences are (predominantly) accessed via smartphones that use location-based technology. These technologies mediate new ways of learning, but also pose challenges. Questions around privacy [13], user accessibility [26] and technology device provision are apparent. Though smartphone ownership continues to increase, especially in Europe [11], participation may still remain problematic. Historically, participation rates have been low for technology mediated learning experiences [18], and the Internet culture ‘Rule of 1%’ appears to often still be true [9]. While use of social media technologies may facilitate easier access [17], participation and engagement of learners may not increase or improve quality of learning [15]


without active moderation [9] and social presence of facilitators [19]. Learning design, therefore, might need to address these shortcomings.

C. Measuring Interactions

Methods of data capture and analysis in evaluating smart city learning are complex, as the interactions themselves are multi-modal (face-to-face, virtual, networked) as well as multi-voiced, indicating a move ‘toward more dynamic, social alternatives that recognise the situated and intersubjective nature of meaning-making’, ([12] in [3]). Literature provides useful contexts and inspiration, with particular importance given to phenomenography [22] and phenomenography-based approaches [33], networked learning research [6] and dialogism for concepts around dialogic space ([32], [31]), and the self identities of individual learners ([32], [3]). Mamaghani et al.’s [21] analysis of children’s drawn images outlines an approach to iterative content analysis using phenomenographic variation and outcome space categories which could be applied to smart city learner-generated content experiences iteratively over time or activities. Edwards’ [10] study of experiences of web-based information retrieval illustrates an approach to creating phenomenographic outcome spaces relevant to this project, demonstrating multiple layers of experience of the same event, dependent on perspective, prior knowledge and purpose.

Considering interactions with the community, aside from dialogic space and the multi-voiced self and ‘other’, Pask’s [23] notion of ‘the limits of togetherness’ might inform some of the analysis of comments amongst groups. This may help to establish and measure conversation (defined by Pask as ‘concept-sharing’) between members of the learning community, as opposed to ‘communication which looks like conversation but is not at all conversational […]’, [23]. This may be distinct from whether or not knowledge is constructed by the networked community [29], and Ravenscroft’s work [25], with the Interloc application, might offer an alternate way of facilitating knowledge construction, if this is considered a desired outcome of ‘effectiveness’ for smart city learning. Laurillard’s [20] warnings about learner conversation centring on navigation of the digital tool (pp. 111-112) rather than on the learning content itself may indicate another layer to measure, as “the material [learners] found was highly relevant […] yet appears to have afforded no productive response of any kind”.

III. DEVELOPING THE FRAMEWORK

A. CyberParks Learning at Argotti Gardens, Floriana & Upper Barrakka Gardens, Valletta

Mobile learning located at Argotti Gardens in Floriana will consist of various mobile learning activities (Points of Interest) linked to ‘hotpoints’ within and in the vicinity of the gardens. A similar procedure will be applied at the Upper Barrakka Gardens, including several Points of Interest for the piloting phase through a single hotpoint. Activated by GNSS (Global Navigation Satellite System) via the CyberParks Android mobile application, a user is offered a selection of PoI, which provide predetermined learning content and functionality to


contribute user-learner-generated content and commentary. The learning design will offer four learning pathways with associated activities: ‘History’ (the history of the location), ‘Structures’ (important structures in the location), ‘Processes’ (industry, manufacturing or social behaviour and traditions at the location) and finally ‘Reflect’ (follow-up activities and additional learning opportunities) on completing the hotpoint(s) journey. These pathways provide learning for novice level acquisition of facts and concepts, participatory support and guidance level (for additional problem solving), ‘metacognition’, and for contributory learning. Evaluation of learning is therefore required to establish the process of learning throughout the experience: for ‘what’ and ‘where’ domain content is being learned or engaged with, and then also ‘how’ it is being learned and to what level. ‘Who’ and ‘why’ factors also contribute both to domain content processing and to additional emotional processing of knowledge and engagement. Learning might be evidenced through the creation of user-learner content or in conversations taking place externally from the CyberParks application, for example using Facebook or Instagram, as well as internally within the CyberParks mobile app.
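A minimal sketch of how the hotpoint, PoI and pathway structure described above might be modelled. All names and fields here, including the example PoI, are hypothetical illustrations, not the CyberParks data model.

```python
# Hypothetical catalogue: each hotpoint maps to points of interest, and each
# PoI carries an activity description for each of the four learning pathways.
POI_CATALOGUE = {
    "upper_barrakka_hotpoint": [
        {
            "poi": "saluting_battery",
            "pathways": {
                "History": "The battery's role in the identified historical event",
                "Structures": "Construction of the battery platform",
                "Processes": "Gun-firing traditions at the location",
                "Reflect": "Follow-up activity after completing the journey",
            },
        },
    ],
}

def pathway_activities(hotpoint, pathway):
    """List the (PoI, activity) pairs a user-learner is offered for one pathway."""
    return [(p["poi"], p["pathways"][pathway])
            for p in POI_CATALOGUE.get(hotpoint, [])
            if pathway in p["pathways"]]

print(pathway_activities("upper_barrakka_hotpoint", "History"))
```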

Technical and learner analytics data, such as the number of connections between learners, frequency of shared content and sentiment of comments, will be measured against the stage of learning and learning pathway. Analytics will be available within the CyberParks app and externally using social network analysis techniques. Knowledge construction, concept sharing and dialogue concept expansion in learning experience pathways can be measured using learning outcome criteria developed in conjunction with learning designers, to recognise and record evidence of learning, at which cognitive level, learning stage and pathway. Pérez-Sanagustín et al [24] describe multi-channel, multi-context, multiple-objective ‘glue’ services for smart city learning. By measuring interactions in relation to geo-cached learning hotpoints in AR learning locations, more might be learned about how ‘place and space’ affect and impact learning quality and engagement in relation to conceptualising the glue that Pérez-Sanagustín’s paper discusses. By noting how learner networks form, and the (multiple) roles that learners may adopt, and by evaluating the knowledge being constructed ‘in the system’, it is potentially possible to evidence how ‘connectivist’ learning in a smart city hybrid technology mediated environment takes place. This may help to develop useful relationships between learning design and learning experience practice and other stakeholders involved in smart city design and planning, such as technical infrastructure specialists, architects and urban community planners.

B. Anticipated learning experiences at Argotti Gardens and Upper Barrakka Gardens

This paper focuses on user-learner interactions and on the prediction and gathering of data for evaluation of smart city learning, specifically from user-learner sample groups, though other stakeholder sample groups are also involved in smart city learning implementations (such as learning designers, content creators, subject area specialists and technical application designers and developers). Focusing on mobile learning location-based prototypes being developed and implemented in open/urban spaces located at Upper Barrakka Gardens,


Valletta for history and Argotti Gardens, Floriana for botany, learning experiences anticipated will include playful learning, citizen enquiry, seamless learning, geo-learning and crowd learning. The structure of data gathering and analysis would be iterative (over time) and in addition be used to investigate a direct or indirect relationship to relevant pedagogical theory and discourse, with a special focus on Connectivism.
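The per-stage analytics described earlier (connections between learners, frequency of shared content, sentiment of comments) could be sketched roughly as follows. The comment fields used here are assumptions about how interactions might be logged, and sentiment scores are taken as given by some upstream sentiment-analysis step.

```python
from collections import Counter, defaultdict

# Illustrative sketch only: field names (author, addressee, stage, sentiment)
# are assumed, not the actual CyberParks logging schema.

def stage_metrics(comments):
    """Per learning stage: unique learner-to-learner connections,
    comment frequency, and mean sentiment score in [-1, 1]."""
    edges = defaultdict(set)      # stage -> set of undirected learner pairs
    counts = Counter()            # stage -> number of comments
    sentiment = defaultdict(list) # stage -> sentiment scores
    for c in comments:
        counts[c["stage"]] += 1
        sentiment[c["stage"]].append(c["sentiment"])
        if c.get("addressee"):
            edges[c["stage"]].add(frozenset((c["author"], c["addressee"])))
    return {
        stage: {
            "connections": len(edges[stage]),
            "comments": counts[stage],
            "mean_sentiment": sum(sentiment[stage]) / len(sentiment[stage]),
        }
        for stage in counts
    }

comments = [
    {"author": "a", "addressee": "b", "stage": "History", "sentiment": 0.8},
    {"author": "b", "addressee": "a", "stage": "History", "sentiment": 0.5},
    {"author": "a", "addressee": None, "stage": "Reflect", "sentiment": -0.2},
]
print(stage_metrics(comments))
```

Undirected pairs (frozensets) are used so that a reply and its counter-reply count as one connection, which is one plausible reading of ‘connections between learners’; a directed-graph variant would be equally defensible.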

C. Evaluating learning in an interactive system - interactions with Content, Digital Tools and Community

In the context of phenomenographical category layers and iteration, and using a dialogic space concept analysis, factors relevant to measuring effectiveness of smart city learning may be derived from data to discover what might be of significance to user-learners. Assessing this learning effectiveness from a variety of user-learner perspectives and analysing relationships with appropriate pedagogy might then be attempted. A first concept of practical techniques using phenomenography is presented here, with ideas for measurement of dialogic space, concept sharing, multi-voiced self and knowledge construction. The proposed system for evaluation of smart city learning at Upper Barrakka and Argotti Gardens is intended to evaluate experiences for user-learners in relation to principal category interaction variables, in a context of theoretical factors of significance derived from appropriate literature. These category variables (content, digital tools and community) are distinct from one another, though all are interactions. Consequently, the principal category analysis system needs sufficient commonality for correlation of interactions so as to establish meaningful relationships between them. The system proposed here is an iterative approach to gathering sets of data for each principal category that bear relation and connection to each other.

It is anticipated that there will be layers of analysis for these interaction categories, both for factors of interest and for measurement factors, in order to accommodate all layers of interaction. Principal factors of interest would include factors determining learning, Human-Computer Interaction, the impact of the authentic space on the augmented reality learning experience, and community and social network presence and activity. Factors determining learning would evaluate evidence of facts, concepts, problem solving and metacognition in interaction behaviour, dialogue and content. The impact of the authentic space evaluates evidence of immersive smart urban space experiences (diverse agents for providing, collecting, creating and sharing information), measurement of seamless learning (blending learning with everyday life) and of ‘glocality’ (where local and global co-exist). Community and social network presence and activity evaluate the sharing, identity building, community role and collective memory building in any learning communities which may form around the experiences. Interface design, functionality affordance, perceived usefulness, perceived ease of use and frictionless journeys (user friendly journeys and navigational design) would be evaluated as Human-Computer Interaction factors. Layers of analysis also need to take into account multiple literacy modalities to evaluate these factors of interest for the impact of types of content on learners: multimedia content (audio, video,


text, images), domain prescribed content, learner-generated content, and comment interaction content.

By utilising ideas drawn from prior research and discourse, the system proposes to analyse these factors. The examples provided here draw from Mamaghani et al [21] for content analysis features, Pask’s concept-sharing [23] and Wegerif’s Dialogic Space (of addressee, superaddressee, infinite other) [31] evaluation for conceptual presence and relevance to establish depth and scope of factors determining learning, for example novice (acquisition), participatory and contributory [4]. ‘Multi-voiced self’ concepts [3] could evaluate identity variation and role in the network and community. These measurement factors could be applied iteratively into variation categories for evaluating the content, comments and direction of interactions within the principal category variables.

D. Examples of Interaction Analysis

The following examples of interaction analysis, including all Tables illustrating some potential variation categories and outcome spaces, are developed by the author as the basis of a proposed system of smart city learning evaluation (with other work cited where relevant). These examples demonstrate how a system of Interaction Category Variables Analysis can be used to analyse smart city learning interactions for key factors of interest. Examples given here are firstly for learner-generated content analysis: the increase or decrease over learning activity progression demonstrating conceptual assimilation and processing (e.g. [21]); and secondly for community interactions: the increase or decrease over learning activity progression demonstrating identity (perhaps with alternate ‘self voices’ [3]), confidence, dialogic space expansion [31] and ‘concept-sharing’ [23]. A third example of digital tools interactions is provided, to measure growing technical efficacy and engagement with digital tool affordances, which could be evaluated for surface and deep interaction functionality efficacy and network participation throughout the learning experience. Looking at social channel engagement can further investigate processes of knowledge construction, concept sharing and roles, and the consequent evaluation of the significance of social network interactions and functionality at stages of learning and as a whole. Authentic environment relevance and engagement with content detail might be evaluated through the increase or decrease over learning activity progression, as evidenced in comment interactions, sharing and learner-generated content.

1) Example 1 – Interaction with learner-generated content

Example 1 (Table I) looks at how learner-generated content interactions may be analysed, either within the CyberParks app or externally in social channels such as Facebook, Twitter or Instagram. Content analysis follows a concept of phenomenographic context in iterative learning stages.

An example of learner-generated content analysis: A study on analysis of children’s drawn images with themes of waste recycling [21] outlines an approach to iterative content analysis using phenomenographic variation and outcome space categories. This approach


of multi-stage analysis lends itself to the analysis of learner-generated content in smart city learning, as learning experiences may have stages of learning or multiple tasks or activities which progress the learners’ understanding of the concepts being discussed. If tasks were designed to request learners to upload content at intervals related to specific activity stages, attempts might be made to understand and measure their levels of cognitive processing, engagement, social learning and dialogic space interaction. Example 1 may include more granular variation categories for emotion of content and relevance of content to topic, and go on to be developed for analysis of content at stages of learning activity.
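A minimal sketch of this multi-stage tallying, assuming uploads have already been hand-coded into Table I-style variation categories; the category labels and field names are illustrative, not a fixed coding scheme.

```python
from collections import Counter

def category_shift(coded_uploads, category):
    """Tally one variation category's codes at each learning stage, so an
    increase or decrease across stages can be read off directly."""
    stages = {}
    for upload in coded_uploads:
        stage = stages.setdefault(upload["stage"], Counter())
        stage[upload["codes"][category]] += 1
    return stages

coded_uploads = [
    {"stage": 1, "codes": {"what": "vista", "emotion": "happy"}},
    {"stage": 1, "codes": {"what": "vista", "emotion": "peaceful"}},
    {"stage": 2, "codes": {"what": "close-up detail", "emotion": "happy"}},
]
print(category_shift(coded_uploads, "what"))
# Stage 1 is dominated by vistas; stage 2 shifts toward close-up detail,
# which might be read as deepening engagement with content detail.
```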

2) Example 2 – Interaction with the community

Example 2 (Table II) looks at how community comment interactions may be analysed, either within the CyberParks app or externally in social channels such as Facebook, Twitter or Instagram. Comment analysis follows a concept of dialogic space in a phenomenographic context.

An example of dialogic space analysis: If interactions in the community were grouped into types of statements, association could be recognised and grouped with addressee (direct), superaddressee (the ‘third perspective’), and infinite other (infinite perspectives appearing from those previously referenced by self or group). These could then be counted and analysed iteratively to establish when and where expansion of dialogic space was being evidenced in relation to learning task, activity or stage in pathway.
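The counting step described above might be sketched as follows, assuming each comment has already been hand-coded with its addressivity type; the ‘expansion ratio’ used here is an illustrative measure of the author's proposal, not a published metric.

```python
# Comments are assumed to be pre-coded as 'addressee', 'superaddressee',
# or 'infinite other', following the Wegerif-derived scheme cited above.

def expansion_by_stage(coded_comments):
    """Fraction of comments per stage that reach beyond the direct addressee,
    taken as a rough signal of dialogic space expansion."""
    totals, expanded = {}, {}
    for c in coded_comments:
        totals[c["stage"]] = totals.get(c["stage"], 0) + 1
        if c["type"] in ("superaddressee", "infinite other"):
            expanded[c["stage"]] = expanded.get(c["stage"], 0) + 1
    return {s: expanded.get(s, 0) / totals[s] for s in totals}

coded = [
    {"stage": "task1", "type": "addressee"},
    {"stage": "task1", "type": "addressee"},
    {"stage": "task2", "type": "superaddressee"},
    {"stage": "task2", "type": "infinite other"},
    {"stage": "task2", "type": "addressee"},
]
print(expansion_by_stage(coded))
# task1 shows no expansion; two thirds of task2 comments expand the space.
```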

Wegerif & Ferreira [31] indicate a system of dialogic space that could be developed and implemented, with “Students unpack(ing) opportunities collaboratively looking for attributes and relationships among concepts and new ideas, […] to organize the information”. Categories can then trace the development of the dialogic space for evidence of expansion and reflection.

Example 2 would also include practical ‘when and where’ variation categories to evaluate stages of learning in relation to the authentic environment. Affective (emotion) categories here are more defined than in Example 1, as emotion may be expected to be more evident in relation to learning experience perceptions.

3) Example 3 – Interaction with a digital tool

Example 3 (Table III) looks at how user-learner interactions may be analysed for the technology mediation of learning interactions, predominantly within the CyberParks app though also externally in social channels such as Facebook, Twitter or Instagram. User-learner behaviour analysis follows a concept of usability techniques, and also phenomenographic context in relation to learning experience interactions.

An example of digital tool interaction analysis: This involves looking at a number of factors, both those integral to learning interaction affordance and those of human-computer interaction and interaction (interface) design. With a mixed approach to analysis using pedagogical factors and usability heuristics, some understanding might be derived as to the role of technology mediation and affordance in relation to learning experiences at surface and deep level.

TABLE I. EXAMPLE 1 - INTERACTION WITH LEARNER-GENERATED CONTENT PREDICTED OUTCOME SPACES

External reflector: upload photograph to learning activity.

Variation Category 1: When it was taken. Predicted outcome spaces: I took it before I started (the activity); I took it during the activity but before I finished; I took it after I finished the whole thing; I took it on task number or task name; time of day.

Variation Category 2: Where it was taken. Predicted outcome spaces: the location in general; the location, at the learning ‘stage’ or activity area; somewhere else related; somewhere else not related.

Variation Category 3: What is in the image and relevance. Predicted outcome spaces: building, tree, flower, art, person, statue, animal; type of shot (vista, close up, detail); on or off topic.

Variation Category 4: Who is in the image. Predicted outcome spaces: friends; family; strangers; classmates; myself; no one.

Variation Category 5: Emotion of content. Predicted outcome spaces: violent; angry; peaceful; happy; beautiful.

Variation Category 6: Why it was taken. Predicted outcome spaces: I felt like it; I wanted to show I was there; my friend looked cool; I was into it; I wanted to remember; my mum asked me to; it looked really old; it was pretty.

Issues/factors to consider: participation, confidence in sharing, technical efficacy.

Theory/pedagogy general factors (theoretical discourse that might be found and matched): student as producer; student centered; participatory based activities; mobile ‘web 2.0’ pedagogies (creative, self-directed).

Theory/pedagogy specific factors (across categories): authentic environment and relevance; knowledge construction; engagement; identity, community identity, multi-voiced identities, role and self-efficacy; emotion of engagement; group identity; role (positive and negative); learning authenticity; creative approach.


TABLE II. EXAMPLE 2 - INTERACTION WITH THE COMMUNITY PREDICTED OUTCOME SPACES

External reflector: individual posts comment (e.g. about image).

Variation Category 1: Who is being addressed (or referenced). Predicted outcome spaces: named individual; inferred individual; the specific group on that thread; a generality of assumption; summoning a larger perspective.

Variation Category 2: Comment content. Predicted outcome spaces: concrete concepts; questioned knowledge; trivia; opinions; shared facts.

Variation Category 3: Active contributions or questions to discussion. Predicted outcome spaces: ‘What if we…’; ‘What are you saying about…’; ‘What makes you say that?’; ‘If such and such was the case…’; ‘In class we did…’; ‘I remember another similar…’.

Variation Category 4: Tone/emotion positive or constructive. Predicted outcome spaces: ‘That’s so true’; ‘Hahahaha’; ‘It’s amazing’; ‘Gorgeous/lovely idea/work/skill’; ‘Imagine if…’.

Variation Category 5: Tone/emotion negative or destructive. Predicted outcome spaces: ‘That’s rubbish’; ‘I don’t believe that’; ‘You just made that up’; negative memes; ‘I have no clue what you’re talking about’.

Variation Category 6: Tone/emotion neutral. Predicted outcome spaces: ‘No idea’; off topic.

Issues/factors to consider: community, communication confidence, identity, self and other efficacy awareness, critical thinking and awareness, willingness to share knowledge, risk.

Theory/pedagogy general factors (theoretical discourse that might be found and matched): dialogic space (addressee, superaddressee, infinite other); multiple identities (p-individuals, multi-voiced selves); community and communication; concept-sharing; personal learning networks; collaborative learning; communities of practice; social presence of experts.

Theory/pedagogy specific factors (across categories): identity; role; dialogic space; knowledge construction; roles; experts; self-efficacy; concept sharing; multi-voiced self; p-individual; emotion of engagement (sentiment); empathy; conceptual assimilation; authentic learning; confidence and sociability; purpose/understanding.


TABLE III. EXAMPLE 3 - INTERACTION WITH DIGITAL TOOL: PREDICTED OUTCOME SPACES

OUTCOME SPACES (PREDICTED) - EXTERNAL REFLECTOR: REGISTER ON THE WAY-CYBERPARKS APPLICATION

Variation categories (with example indicators):
1. Negative registration experiences - "I hate doing this kind of thing"; "It was too fussy"; "I couldn't use Facebook"; "I don't use social media anyway"; "It didn't work"; "I don't give my email to anyone"; other negatives
2. Positive registration experiences - "It was ok"; "I had no problem"; "Mum said it was easy"; "I think it's fun"; "I used a mad username"; "I thought I might use it again so it was worth the hassle"; other positives
3. Neutral registration experiences - "Not sure"; "Don't know"; "Didn't think about it"; *shrugs shoulders*; "Mum did it"; other neutrals

Issues/factors to consider: personal identity, privacy, confidence, trust, sociability, consent, purpose, engagement.

Theoretical discourse that might be found and matched: identity; trust; perceived usefulness; curiosity; discovery; sociability online.

Theory/pedagogy factors (broadly common to all three categories): sociability; self efficacy; digital literacy; perceived usefulness; perceived ease of use; privacy; confidence; curiosity.
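For illustration only, a first-pass sort of free-text registration responses into the three variation categories (negative, positive, neutral) might be sketched with keyword cues. The cue lists below are hypothetical, not drawn from the study; a real analysis would code responses manually or validate any automated coding against researcher judgements.

```python
# Hypothetical keyword heuristic for a first pass over registration
# responses; the cue lists are illustrative assumptions only.
NEGATIVE_CUES = ("hate", "fussy", "didn't work", "don't give", "couldn't")
POSITIVE_CUES = ("ok", "no problem", "easy", "fun", "worth")

def code_response(text):
    """Assign a response to a variation category by simple cue matching."""
    t = text.lower()
    if any(cue in t for cue in NEGATIVE_CUES):
        return "negative"
    if any(cue in t for cue in POSITIVE_CUES):
        return "positive"
    return "neutral"

responses = ["It was too fussy", "Mum said it was easy", "Didn't think about it"]
print([code_response(r) for r in responses])  # → ['negative', 'positive', 'neutral']
```

Such a heuristic would only be a screening aid; ambiguous or sarcastic responses ("I used a mad username") still require human interpretation.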

Example 3 might be developed to include further categories for technical self-efficacy (covering the surface and deep structure of the tool, for information design and for pedagogical features) and for emotions about technology. Surface structure interactions refer to interface functional activity, navigation of content, and system understanding or technical manipulation of content (creating, editing or sharing content). Deep structure technical interactions might be measured by, for example, how many interactions a learner makes with asynchronous community members, or how often a learner connects and interacts with an external expert about domain content or query problem solving.
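As a sketch of how this surface/deep distinction might be operationalised, the tally below counts both interaction types per learner from an event log. The event labels and log format are hypothetical illustrations, not part of any actual WAY-Cyberparks logging.

```python
from collections import Counter

# Hypothetical event labels: "surface" = interface/content manipulation;
# "deep" = interactions directed at community members or external experts.
SURFACE_EVENTS = {"navigate", "open_content", "create_content",
                  "edit_content", "share_content"}
DEEP_EVENTS = {"comment_to_member", "reply_from_expert", "question_to_expert"}

def tally_interactions(log):
    """Count surface vs deep structure interactions per learner.

    `log` is an iterable of (learner_id, event_type) pairs.
    Returns {learner_id: {"surface": n, "deep": m}} (keys present only
    when that interaction type occurred).
    """
    totals = {}
    for learner, event in log:
        bucket = totals.setdefault(learner, Counter())
        if event in SURFACE_EVENTS:
            bucket["surface"] += 1
        elif event in DEEP_EVENTS:
            bucket["deep"] += 1
    return {learner: dict(counts) for learner, counts in totals.items()}

log = [("anna", "navigate"), ("anna", "comment_to_member"),
       ("ben", "edit_content"), ("anna", "question_to_expert")]
print(tally_interactions(log))
# → {'anna': {'surface': 1, 'deep': 2}, 'ben': {'surface': 1}}
```

The ratio of deep to surface counts could then feed into the technical self-efficacy categories suggested above, though such counts alone say nothing about the quality of each interaction.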

IV. PARTICIPANT SECOND ANALYSIS

A type of analysis conceptualised by the author, known here as 'Participant Second Analysis', might be utilised, where it may be possible to see how participants themselves analyse and interpret interactions. Discussions and category analysis using card-sorting techniques might be particularly enlightening for learner-generated content interactions and community interactions, and could be carried out after a learning activity or during the event. This would elicit think-aloud or focus group data from participant groups or from individuals.

1) Participant Second Analysis for learner-generated content

(Table IV.) Using potential questions for learner-generated content, and looking at content shared in social media channels or in the WAY-Cyberparks app, individuals or groups could be asked to talk about the content.

TABLE IV. EXAMINING LEARNER-GENERATED CONTENT INTERACTIONS IN PARTICIPANT SECOND ANALYSIS

Potential questions about photograph or video content generated by the learner (learner-generated content):
- Where was it taken? Describe it to me in your own words. Location and stage in learning activity (factual). What does it represent? Is this image important to you? In what ways?
- What is in the photo? Describe the scene in your own words (a building, view, landscape, close-up detail, atmosphere). Do you like it? If so, what made you like it? If not, why not?
- People you know - who are they? Is it important they are included? Why?
- People you don't know - why did you take it with them in it?
- Yourself - why did you take a selfie? What does it represent or mean to you?
- Why was it taken, what inspired the action?
- Did you share it? Where, with whom? Why did you share it?

Theory/pedagogy general factors: student directed learning; student participation; creative pedagogy; personal learning; learner agency and autonomy.

Theory/pedagogy specific factors: knowledge construction; authentic environment situated learning; meaning making; concept-sharing; concept assimilation; multiple intelligences.

These questions, and similar ones in semi-scripted interview or focus group discussion, can expand a dialogic space for the learner(s) to tell us about what they experience in a learner-generated content interaction. We are then able to deduce more about levels of concept construction and assimilation, identity development and critical analysis skills.

2) Participant Second Analysis for community interactions

(Table V.) Using potential questions about comments made by learners in networked community scenarios, and looking at comment threads made in social media channels or in the WAY-Cyberparks app, individuals or groups could be asked to talk about what was going on in the thread. These and similar probing questions could shed light on how learners feel when interacting in comment threads, how they might be developing conceptual understanding, and how the process promotes or hinders this and expands and develops dialogic space; responses can perhaps be measured to create variation categories using some of the criteria discussed in [31].

TABLE V. EXAMINING INTERACTIONS IN THE COMMUNITY (COMMENTS) IN PARTICIPANT SECOND ANALYSIS

Potential questions about comments made by learners in networked community scenarios:
- Who are you talking to there? Why did you say that at that point?
- Did you mean you agree with that statement, or disagree?
- Did you get the feeling people liked you in the group? Did you get the feeling people disliked you in the group?
- Did you feel that comment was bossy or aggressive?
- Did you want to say more there, and held back?
- Did you think that some of the people chatting were very knowledgeable?
- Did you feel shy? Why?
- Did you feel like it was fun or interesting? Why? Did you think this was a boring thread?
- Did anyone talk about (insert factual or relevant info on topic)?
- Was anyone trolling or being annoying?
- Why did you start posting in the thread?

Theory/pedagogy general factors: student directed learning; student participation; creative pedagogy; personal learning; learner agency and autonomy.

Theory/pedagogy specific factors: multi-voiced self; identity making; roles in community and network (novice/expert); confidence; self efficacy; meaning making; concept sharing; dialogic space expansion.
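As an illustrative sketch (not part of the study design), once a researcher has coded each comment in a thread against a set of variation categories, a simple tally can summarise the thread as a distribution over the predicted outcome spaces. The category labels here are hypothetical shorthand for whichever coding scheme is adopted.

```python
from collections import Counter

def outcome_space_distribution(coded_comments):
    """Proportion of comments falling in each variation category.

    `coded_comments` is a list of (comment_text, category) pairs, where
    the category labels have already been assigned by a human coder.
    """
    counts = Counter(category for _, category in coded_comments)
    total = sum(counts.values())
    return {category: round(n / total, 2) for category, n in counts.items()}

# Example thread coded with hypothetical shorthand labels.
thread = [("That's so true", "positive"), ("No idea", "neutral"),
          ("What if we...", "contribution"), ("It's amazing", "positive")]
print(outcome_space_distribution(thread))
# → {'positive': 0.5, 'neutral': 0.25, 'contribution': 0.25}
```

Comparing such distributions across threads, or before and after an intervention, is one way the variation categories could be turned into a quantitative measure while keeping the qualitative coding step in researchers' hands.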

V. CONCLUSIONS

Measuring the effectiveness of learning without resorting to assessment is a challenge in any conventional classroom. Attempting it amid the additional challenges and variables posed by physical space and the impact of technology mediation further complicates the analysis methodology. However, by looking first at the interactions, for authentic space context, community concept sharing and human-computer interaction factors, insight can be gained. Through diligent analysis of the findings, a contribution can potentially be made to urban planning as well as to technical application and learning design. A question persists: is interactivity engagement a reliable measure of learning effectiveness? The rate of active learner participation may not reflect levels of engagement or cognitive processing [15]. Provided this question is acknowledged, data gathered from interactive geo-learning experiences located in Valletta may yield findings that shed further light on, and contribute to greater understanding of, this particular discourse.

Overall, creating effective learning design pedagogy for smart city learning, with its multiple stakeholder strands, considerations and analytical layers, is an evolving process to be established through ongoing research, discourse and interpretation. Many ethical considerations - not discussed in this paper - are potentially problematic for smart city learning: data privacy, data anonymity, intellectual property rights, legal aspects of terms of use, accessibility and digital literacy, amongst others. By gaining insight into levels of usefulness, engagement and learning quality, work on these separate challenges might draw on a wider knowledge base from which to form new approaches in some of these areas.


REFERENCES

[1] Amershi, S, Arksey, N, Carenini, G, Conati, C, Mackworth, A, Maclaren, H & Poole, D, 2005, 'Designing CIspace: Pedagogy and Usability in a Learning Environment for AI', ITiCSE '05, June 27-29, 2005, Monte de Caparica, Portugal. Available from: ACM Digital Library [1 March 2016]

[2] Andone, D, Holotescu, C & Grosseck, G, 2014, 'Learning Communities in Smart Cities. Case Studies'. Available from: IEEE 978-1-4799-5739-2/14 [12 March 2016]

[3] Aveling, E, Gillespie, A & Cornish, F, 2014, 'A qualitative method for analysing multivoicedness', Qualitative Research, pp 1-18. Available from: Sage Publications, DOI: 10.1177/1468794114557991 [3 March 2016]

[4] Bonanno, P, 2011, 'A Process-oriented pedagogy for Ubiquitous Learning', in Kidd, T & Chan, I (eds), Ubiquitous Learning: Strategies for Pedagogy, Course Design, and Technology, Information Age Publishing

[5] Bonanno, P, 2014, 'Designing Learning in Social On-line Learning Environments: A Process-oriented Approach', in Mallia, G (ed.), The Social Classroom: Integrating Social Network Use in Education, IGI Global Publishing, pp 40-61

[6] Booth, S, 2008, 'Researching Learning in Networked Learning - Phenomenography and Variation theory as empirical and theoretical approaches', Proceedings of the Sixth International Conference on Networked Learning, Networked Learning 2008, Greece. Available from: http://www.networkedlearningconference.org.uk/past/nlc2008/ [22 April 2016]

[7] Buchem, I & Pérez-Sanagustín, M, 2013, 'Personal Learning Environments in Smart Cities: Current Approaches and Future Scenarios', eLearning Papers no. 35, ISSN 1887-1542. Available from: http://www.openeducationeuropa.eu/en/article/Personal-Learning-Environments-in-Smart-Cities%3A—Current-Approaches-and-Future-Scenarios?paper=133343 [22 February 2016]

[8] Cochrane, T, 2014, 'Mobile Social Media as a Catalyst for Pedagogical Change', in Viteli, J & Leikomaa, M (eds), Proceedings of EdMedia: World Conference on Educational Media and Technology 2014, pp 2187-2200, Association for the Advancement of Computing in Education (AACE). Preview version available from: https://www.researchgate.net/publication/269992938_Mobile_Social_Media_as_a_Catalyst_for_Pedagogical_Change [27 October 2015]

[9] Cook, J, Lander, R & Flaxton, T, 2015, 'The zone of possibility in citizen led hybrid cities', Workshop on Smart Learning Ecosystems in Smart Regions and Cities, Toledo, Spain, 15 September 2015. Available from: http://eprints.uwe.ac.uk/26208 [12 March 2016]

[10] Edwards, S, 2005, 'Panning for Gold: Influencing the experience of web-based information searching', PhD thesis, School of Information Systems, Faculty of Information Technology, Queensland University of Technology. Available from: http://eprints.qut.edu.au/view/person/Edwards,_Sylvia.html [24 October 2015]

[11] Ericsson (Europe) Mobility Report, Appendix, 2014, Ericsson SE-126 25 Stockholm, Sweden, EAB 14:018052. Available from: https://www.ericsson.com/res/docs/2014/emr-november2014-regional-appendices-europe.pdf [27 October 2015]

[12] Gillespie, A & Cornish, F, 2010, 'Intersubjectivity: towards a dialogical analysis', Journal for the Theory of Social Behaviour 40(1), pp 19-46. Available from: http://onlinelibrary.wiley.com/doi/10.1111/j.1468-5914.2009.00419.x/abstract [26 July 2016]

[13] Giovannella, C, Carcone, S & Camusi, A, 2011, 'What and how to monitor complex educative experiences. Toward the definition of a general framework', IxD&A, N. 11-12, pp 7-23, in 'International Observatory on Smart City Learning, Problems Addressed'. Available from: http://www.mifav.uniroma2.it/inevent/events/sclo/index.php?s=166&a=300 [12 February 2016]

[14] Giovannella, C, Martens, A & Zualkernan, I, 2016, 'Grand Challenge Problem 1: People Centered Smart "Cities" Through Smart City Learning', in Eberle, J et al (eds), Grand Challenge Problems in Technology-Enhanced Learning II: MOOCs and Beyond, SpringerBriefs in Education, DOI 10.1007/978-3-319-12562-6_2

[15] Hubble, D, 2009, COLMSCT Final Report 'Improving student participation in e-learning activities', UK Open University. Available from: http://www.open.ac.uk/opencetl/resources/colmsct-resources/hubble-d-2009-colmsct-final-report-%E2%80%98improving-student-participation-e-learning-activities%E2%80%99 [22 February 2016]

[16] Jamieson, P, et al, 2000, 'Place and Space in the Design of New Learning Environments', HERDSA (Higher Education Research and Development), Vol 19, No 2, July 2000, pp 221-237. Available from: http://www.oecd.org/edu/innovation-education/2675768.pdf [25 July 2016]

[17] Kent, M, 2013, 'Using social media dialogically: Public relations role in reviving democracy', Public Relations Review, Vol 39, pp 337-345. Available from: Science Direct http://www.sciencedirect.com/science/article/pii/S0363811113001136 [26 July 2016]

[18] Kreijns, K, Kirschner, P & Jochems, W, 2002, 'The Sociability of Computer Supported Collaborative Learning Environments', Educational Technology & Society 5(1). Available from: http://www.ifets.info/journals/5_1/kreijns.html [12 March 2016]

[19] Kyei-Blankson, L, Ntuli, E & Donnelly, H, 2016, 'Establishing the Importance of Interaction and Presence to Student Learning in Online Environments', World Journal of Educational Research, Vol 3, No 1. Available from: http://www.scholink.org/ojs/index.php/wjer/article/view/428 [10 February 2016]

[20] Laurillard, D, 2002, Rethinking University Teaching: a conversational framework for the effective use of learning technologies, 2nd edition, RoutledgeFalmer

[21] Mamaghani, N, Mostowfi, S & Khorram, M, 2015, 'Using DAST-C and Phenomenography as a Tool for Evaluating Children's Experience', American Journal of Educational Research, Vol 3, No 11, pp 1337-1345. Available from: http://pubs.sciepub.com/education/3/11/1/index.html [6 February 2016]

[22] Marton, F, 1981, 'Phenomenography - Describing Conceptions of the World around us', Instructional Science 10, pp 177-200, Elsevier Publishing

[23] Pask, G, 1980, 'The Limits of Togetherness', invited paper, Information Processing 80, Lavington, SH (ed.), North-Holland Publishing Company

[24] Pérez-Sanagustín, M, Buchem, I & Kloos, CD, 2013, 'Multi-channel, multi-objective, multi-context services: The glue of the smart cities learning ecosystem', Interaction Design and Architecture(s) Journal - IxD&A, N. 17, pp 43-52

[25] Ravenscroft, A, 2011, 'Dialogue and Connectivism: A New Approach to Understanding and Promoting Dialogue-Rich Networked Learning', International Review of Research in Open and Distance Learning, Vol 12.3. Available from: http://www.irrodl.org/index.php/irrodl/article/view/934/1676 [12 March 2016]

[26] Seale, J & Cooper, M, 2010, 'E-learning and accessibility: An exploration of the potential role of generic pedagogical tools', Computers & Education 54, pp 1107-1116. Available from: Science Direct http://www.sciencedirect.com/science/article/pii/S0360131509003078 [28 February 2016]

[27] Sharples, M, McAndrew, P, Weller, M, Ferguson, R, FitzGerald, E, Hirst, T & Gaved, M, 2013, Innovating Pedagogy 2013: Open University Innovation Report 2, Milton Keynes: The Open University. Available from: http://www.open.ac.uk/iet/main/sites/www.open.ac.uk.iet.main/files/files/ecms/web-content/Innovating_Pedagogy_report_2013.pdf [1 March 2016]

[28] Siemens, G, 2005, 'Connectivism: Learning as Network Creation', ElearnSpace. Available from: http://www.elearnspace.org/Articles/networks.htm [28 January 2016]

[29] Siemens, G, 2004, 'Connectivism: A Learning Theory for the Digital Age', ElearnSpace. Available from: http://www.elearnspace.org/Articles/connectivism.htm [28 January 2016]

[30] WAY-Cyberparks, COST European Cooperation in Science and Technology, supported by EU Framework Programme Horizon 2020. Available from: http://cyberparks-project.eu [10 March 2016]

[31] Wegerif, R & Ferreira, DJ, 2011, 'Dialogic Framework for Creative and Collaborative Problem-solving', Computer-Supported Collaborative Learning Conference, July 4-8, 2011, Hong Kong, China, Vol 2, Short Papers & Posters, p 888. Available from: https://www.isls.org/conferences/cscl [3 October 2015]

[32] Wegerif, R & Yang, Y, 2011, 'Technology and Dialogic Space: Lessons from History and from the "Argunaut" and "Metafora" Projects', 9th International Computer-Supported Collaborative Learning Conference, July 4-8, 2011, Hong Kong, China, Vol 2, Short Papers & Posters, p 312. Available from: https://www.isls.org/conferences/cscl [3 October 2016]

[33] Yates, C, Partridge, H & Bruce, C, 2012, 'Exploring information experiences through phenomenography', Library and Information Research, Vol 36, No 11. Available from: http://www.lirgjournal.org.uk/lir/ojs/index.php/lir/article/viewFile/496/552 [14 November 2015]

Penelope J Lister, MA MSc MBCS FHEA, Ph.D. student

Department of Leadership for Learning Innovation University of Malta

Msida, Malta

penelope.lister.16@um.edu.mt
