influenced by Kant’s philosophy, albeit only partially. In particular, Berlin rejects Kantian universalism and rationalism, but he is happy to incorporate into his own definition of human nature Kant’s view of individuals as intrinsically valuable – of human beings as ends in themselves, never as means, and as self-creating creatures. This presents a considerable innovation in comparison with most theoretical analyses of Berlin, which either assume a normative argument within value pluralism,9 or declare Berlin’s theory incomplete in this respect and try to elaborate a normative argument for liberalism by drawing conclusions directly from value pluralism.10 Berlin’s ethical viewpoint is articulated in PIRA more explicitly than anywhere else in his oeuvre, but it is a view that shines through many of his works. The notion of humanistic ethics is nowhere extensively developed or systematically detailed; instead it emerges in the shape of a hermeneutical discourse that narrates the ideological changes that shaped Europe from the Enlightenment onwards, and their implications and consequences. Ultimately, Berlin’s story of Europe is one of the establishment of human beings as autonomous creatures, and of the world and humankind as devoid of great teleological endings, against what the Christian heritage of Europe had determined until the French Revolution. Berlin’s understanding of human nature is then not strictly metaphysical, insofar as it acknowledges that the view of human beings as fundamentally autonomous and of values as plural is the result of a historical development. This view may change over time, but it will stay as it is until some revolution changes it. His philosophy is a call to become aware that we have moved past the teleological and ideal theories of our Greco-Roman heritage.
To this extent, Berlin can be seen as a theorist of modernity, and even as a nihilist with his denial of the great narratives of humankind that he locates at the heart of the totalitarianisms he challenges with his work. This
The reception of the theory of descriptions has a long history. When Russell sent the paper to Mind, G. F. Stout, the then editor of the journal, begged him to reconsider its publication. Russell, however, refused. G. E. Moore is reported to have admitted that he could not understand the theory until it was given a clearer formulation in Principia, a claim that we have good reasons to doubt. In the decades following its publication, many philosophers looked upon the theory for inspiration in philosophical methodology. Acknowledging Russell’s distinction between the real and apparent logical form of a proposition, they drew from the theory methodological conclusions which were apparently quite far from anything ever advocated by Russell himself. Thus Gilbert Ryle wrote, though not without sadness, that the task of philosophy was ‘the detection of the sources in linguistic idioms of recurrent misconstructions and absurd theories’ (1931–2, 170). To be sure, Russell himself thought that his theory was essential for logical hygiene as well as for retaining a ‘robust sense of reality’; but he also thought that one could secure such desiderata without indulging in the excesses of linguistic philosophy.
On the contrary, in general our instincts are to affirm the rights of the minority, even over and against the rights of the majority – an affirmation that conceals a deeper assumption about the ontology of human conflict. This ontology of human subjectivity, which sees human beings as isolated, atomistic entities, is the legacy of political liberalism, which itself has its roots in the modern subjectivity of Cartesian philosophy – a legacy that philosophers such as Martin Heidegger, Paul Ricoeur, and Charles Taylor, all of whom draw on ancient Greek philosophy to articulate their senses of political community, have sought to draw attention to. But this philosophical challenge to individualist subjectivity has not yet percolated into the mainstream discourse of educational theory. Evidence of this claim is the present ubiquity of the view that there must exist a conflict of rights or interests – that there is a natural necessity to balance the rights of some against the rights of others. The inability to regard human relations in anything other than conflictual terms is part and parcel of the concealment of what has been called ‘social ontology’ (Lewin 2011, 215-220).
‘new musicology’ of the 1990s persistently mistook an American institutional dispute for a global rearrangement of scholarly priorities. The long-term consequence has been the imposition of a postmodern musicological mainstream, which imitates, at a distance, similar institutionalizations in literary theory and philosophy, and which projects geographically circumscribed institutional structures as if they were scholarly universals. Despite protestations of pluralism, Taruskin’s hugely impressive history is likewise ultimately and pointedly world-historical, charting the rise and fall of the ‘literate’ Western tradition over a millennium, and occasionally rearranging its furniture to accommodate revised geographical and style-historical priorities. And even the casual observer could not fail to spot the persistent North American tilt of the later chapters: Charles Ives receives 47 pages of sustained attention, whilst Elgar goes unmentioned except for a passing reference in an estimation of Ives’ influences, Sibelius becomes a mere adjunct to Roy Harris and the American symphonists, and Vaughan Williams only makes it as far as the introduction, and then as part of an apology of omission. This is not quite Grout and Palisca’s preposterously overstated ‘American twentieth century’, but the inclination is the same.61 To put the case bluntly:
reasoning for considering grounded theory methods as appropriate for this qualitative study, noting some of the advantages that I saw in this approach. Next I turn to exploring the research methodology employed, with Section 4 largely serving as the heart of the article and its examination of philosophy of technology using grounded theory methods. Section 4 will describe the study's participants, the sampling methods used, and the data collection methods that combined a semi-structured interview protocol with a written questionnaire. I will also cover the data analysis methods I used that included memo writing, open and axial coding, constant comparative analysis, and theoretical saturation of categories. Section 5 will summarize the research findings and describe the core category that emerged from the data analysis. That section will present an abridged treatment of findings, but not an exhaustive coverage of the results. Finally, Section 6 will present conclusions including a substantive theory, with a graphical figure to show how the core category is given the greater weight in technology decision making. A substantive theory in the grounded theory tradition is a theory generated from empirical data and qualitative analysis that is derived from the substantive area (CORBIN & STRAUSS, 2008), and applies to the data while being independent of it (URQUHART, LEHMANN & MYERS, 2010).

1.2 Research Purpose, Questions, and Design
The next phase of theory-building research is described as operationalization. This phase involves converting the model’s transactions, proposed and elaborated in the conceptual development phase, into questions, hypotheses, and propositions that can be examined through research. This phase parallels what Cook, Bordage, and Schmidt (2008) referred to as clarification studies and what Pierce (2012) referred to as relational studies. Clarification studies take the concepts and propositions derived and explained from philosophy, history, and theory one step further. Here, researchers propose and study impacts on learning and teaching. The findings, though, rather than only reporting the observed effects, are used to clarify the “processes that underlie the observed effects” (p. 30) and thus refine the emerging theoretical framework. Clarification studies, though rare in occupational therapy education, tend to look toward the future by constructing a theoretical infrastructure operationalized for future empirical work. Take again the LELQ model as an example. Having established the model’s premises from philosophy, history, and theory, Wood operationalized the concepts to establish a research program that would assess them further (1998a, 1998b, 2002). The research focus was to examine the premises of the emerging model, thus continuing philosophical and theoretical inquiry at an empirical level.
Section 2 briefly summarized the consequences of scientific, technical and economic development in producing entropy in the world. Section 3 illustrates the limitations of scientific methodology arising from conservation laws, Gödel’s incompleteness theorems, Darwin’s principle of evolution versus the Jain principle of “Live and Let Live”, and many others; together they imply a need to develop an abstract concept of consciousness or soul. Section 4 elaborates the limitations of science as applied to living systems and the need to use General Systems Theory. Section 5 explains that there is a need for a concept of consciousness and its evolution in modern science if we look at these issues from a systems perspective. Section 6 highlights some factors, attributes and systems which imply a need to recognize the concept of consciousness within modern science. Section 7 discusses the concept of soul and knowledge through soul in Jain philosophy, along with soul–matter interaction and the concept of evolution in Jainism. Section 8 argues that we must treat scientific knowledge as only a small subset of the total knowledge which can be perceived through consciousness. Section 9 gives three examples of higher stages of consciousness and spiritual order, with quantitative evidence about the sharp memory of Swamy Vivekanand. Two examples of shatavdhanies are also given which clearly illustrate higher stages of consciousness that may involve extraordinary capabilities of the human mind. An example is also given which shows that ancient Jain acharyas might have directly perceived the smallest particles of matter and even tried to estimate their sizes during these higher stages of pure consciousness.
In Section 10, a hypothesis is put forward that spiritual processes may be defined as the set of processes in which the rate of entropy production and the total entropy decrease, along with a reduction in the consumption of natural resources, accompanied by the emergence of a new type of order. In Section 11, some examples of order in nature are given. Section 12 describes five different types of bodies for living beings, which can exist in nature and have some mysterious structures and functions, including the movement of the soul from one birth to another, even after death. In Section 13, we provide an exploratory view of consciousness, information layers and an optimal strategy, with ideas coming from different sources, to understand the soul. The discussion and conclusions are given in Section 14.
To address this aspect of missing feedback, this study introduces a new theory for understanding our use of technology: mediation theory. The technologies that we use influence how we perceive and act upon the world around us. Binoculars enlarge what you see, but reduce the breadth of your sight. A hammer amplifies the force you exert on a nail, but reduces your striking precision. Technology thus mediates our perception by amplifying some aspects and reducing others; likewise, it mediates our actions by amplifying some parts and reducing others. This provides a new interpretation of the cell phone. Talking on the phone can reduce the experienced distance and enlarge our communication capabilities. As the phone mediates our communication, it influences how we relate to the world and to others. In similar fashion, the use of social media amplifies our connection to the people we know and reduces the difficulty of reaching a broad audience. The use of social media thus has an impact on how we perceive and act in relation to other users of the technology. In this way, the technology we use co-constitutes how we relate to the world around us. Being an iPhone user influences how we relate to the people we interact with. In co-constitution, not only is our concept of technology dynamic; our understanding of the human becomes dynamic too. To bridge the two domains of IS literature and Philosophy of Technology, a new understanding of appropriation is developed that follows the line of mediation theory. It is essentially an extension of mediation theory, focusing on how we come to be a smartphone user: we have to appropriate the smartphone.
A way has been found out of this predicament, however. In the first place, it was argued that the exponents of CHAT came to the formulation of their theory on the basis of their particular and unique examination of reality, a reality created and sustained by the God of the Bible. While their examination did not yield results that are in all respects Biblical – consider, for instance, its bias towards the social, cultural and historical aspects of reality – this can be counterbalanced with a Biblical view of reality. Second, CHAT tenets and the presuppositions in which they are rooted have been found to be rarely in conflict with Biblical views. This will become clear in the discussion below, where CHAT views are employed in the context of a Biblical view of reality and of education. Third, Christian thinkers are compelled in terms of 2 Corinthians 10:5 to “take captive every thought to make it obedient to Christ” (New International Version). This “making obedient” of the secular thoughts encapsulated in CHAT entails a process of life-view and/or philosophical and/or religious transformation, as Klapwijk (1989: 48) argued: Christian philosophy (of education) often finds itself challenged by ideas that are in themselves not strictly Biblical in origin, meaning or impact. When this occurs, Christian philosophy (of education) is called to take a critical stance regarding such cultural goods and societal achievements. Within the all-encompassing framework of a secular worldview, these achievements are often objectionable,
But Feyerabend neglects the next vital step entirely. The theorist must demonstrate that his postulates can indeed be used to compress observed data, and compress it better than other ideas, either with greater accuracy or with a broader range of applicability, for example. And the observers and experimenters must demonstrate that their new techniques provide reliable data that can be repeated and verified by skeptical observers under carefully controlled conditions. Novel paradigms and observing techniques that fail these tests can reasonably be rejected. This is essentially a Darwinian “survival of the fittest” process for a given set of theories or compression algorithms and observing techniques. In ignoring this essential feature of scientific methods Feyerabend is, in effect, proposing a philosophy of science akin to a Darwinian evolution theory that lacks the process of natural selection. And any biologist will tell you that such an omission is fatal, that evolution with only random variation and not natural selection will not work. Neither will a philosophy of science that neglects the essential weeding out of observational techniques that fail to provide reproducible results or of theoretical ideas that fail to produce an improved compression of information.
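The compression criterion invoked here can be made concrete with a toy sketch (the data, the rival "theories", and the use of zlib are all illustrative assumptions, not anything drawn from Feyerabend or his critics): a theory that predicts the observations well leaves residuals that compress to almost nothing, while a worse theory leaves structure behind that costs extra bytes to store.

```python
import zlib

# Observed "data": a long deterministic signal (a period-7 pattern).
data = bytes((3 * i) % 7 for i in range(10_000))

# Two rival "theories": A predicts the generating pattern exactly;
# B predicts a constant value for every observation.
pred_a = bytes((3 * i) % 7 for i in range(10_000))
pred_b = bytes(3 for _ in range(10_000))

def residual_size(pred):
    """Bytes needed to store the data given a theory's predictions:
    compress the residual (observation minus prediction, mod 256)."""
    residual = bytes((d - p) % 256 for d, p in zip(data, pred))
    return len(zlib.compress(residual, 9))

size_a = residual_size(pred_a)  # all-zero residual: compresses to almost nothing
size_b = residual_size(pred_b)  # leftover structure costs more bytes
```

Comparing `size_a` with `size_b` is the "survival of the fittest" test in miniature: theory A achieves the better compression and survives, theory B does not.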
It is emphasized that the philosophy of the stem cell might be derived from the philosophy of the “J-Cell”. The J-Cell is taken to be composed of three-in-one fundamental neutrino particles existing in a sound-free environment having a creation effect. It is emphasized that the fertilization of the egg should be considered as taking place in an absolutely sound-free environment for a period of up to 15 to 25 days. The first heartbeat sound is considered to be produced by the fetus after about 25 days. It is emphasized that life should be considered as originating from an absolutely sound-free environment, which is identified as “AMEN”.
John Dewey is the theorist who first examined the importance of experience in the context of learning. Although Dewey had no specific theory of adult education, his thinking has had a profound influence on the entire field, and most adult educators have emphasized the fundamental role experience plays in learning in adulthood (e.g. Finger & Asun, 2001; Brookfield, 1987; Knowles, 1980; Lindeman, 1961). The paper argues that Dewey’s theory is inadequate to fully explain the central importance of social context. The problem is that Dewey’s ideas cannot overcome the dualism between the individual and society. The limit of Dewey’s theory is that he does not suggest what mediates between the individual and society, even though he emphasizes the interaction between the two. Thus, as Engeström & Miettinen (1999) have pointed out, the problem with Dewey’s theory is the almost total absence of the process of cultural mediation. We contend that activity theory, specifically cultural historical activity theory (CHAT), has explored the concepts of mediation, tools, signs, and appropriation as a means to overcome the inherent dualism in Dewey. Thus CHAT has made it possible for us to understand the internal and essential connection between the individual and society.
These four models of the relationship between theory, intention and practice are drawn from what I see as the implications of both Longchenpa’s critical reading of his tradition and contemporary philosophies of practice. They condense the primary interpretative stances available to us from within a causal metaphysics of practice. Each gives an account of what appear to be spontaneous actions, and each presumes a fundamentally causal perspective in that it seeks the causes, reasons or grounds for those actions. I ultimately find that each configuration obscures how Longchenpa understands and implements spontaneous action. In fact, spontaneity is possible only when these causal configurations fall away. Over the course of the second part of the dissertation, I will lead us through Longchenpa’s explicit or implied critique of all four of these interpretations of practice. In each case, I will show how the configuration in question cannot adequately account for the kind of practice that Longchenpa envisions. At the same time this critique will also become the occasion for me to explore how I see Longchenpa creatively refashioning the
Prediction problems have also intrigued information theorists since the early days of the information theory field. For example, Shannon estimated the entropy of the English language by letting humans predict the next symbol in English texts. Motivated by applications of data compression, Ziv and Lempel proposed an online universal coding system for arbitrary individual sequences. In the compression setting, the learner is not committed to a single prediction but rather assigns a probability over the set of possible outcomes. The success of the coding system is measured by the total likelihood of the entire sequence of symbols. Feder, Merhav, and Gutman applied universal coding systems to prediction problems, where the goal is to minimize the number of prediction errors. Their basic idea is to use an estimation of the conditional probabilities of the outcomes given previous symbols, as calculated by the Lempel-Ziv coding system, and then to randomly guess the next symbol based on this conditional probability.
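The flavor of predicting from estimated conditional probabilities can be sketched with a much simpler stand-in for the Lempel-Ziv estimator: a fixed-order context model that counts symbol frequencies and then guesses a continuation. This toy version guesses the most frequent continuation deterministically, unlike the randomized rule of Feder, Merhav, and Gutman; all names and parameters here are illustrative.

```python
from collections import Counter, defaultdict

def sequential_predict(sequence, context_len=2):
    """Sequentially predict each symbol from frequency counts of its
    preceding context (a crude stand-in for LZ-based conditional
    probability estimates); return the number of prediction errors."""
    counts = defaultdict(Counter)  # context string -> symbol frequencies
    errors = 0
    for i, symbol in enumerate(sequence):
        ctx = sequence[max(0, i - context_len):i]
        if counts[ctx]:
            # Deterministic variant: guess the most frequent continuation.
            guess = counts[ctx].most_common(1)[0][0]
        else:
            guess = sequence[0]  # no statistics yet: arbitrary fallback
        if guess != symbol:
            errors += 1
        counts[ctx][symbol] += 1  # update counts after predicting
    return errors

errors = sequential_predict("abababababab")  # 2 early errors, then perfect
```

On a periodic sequence the predictor errs only while the context counts are empty; once every length-2 context has been seen, its predictions are always correct, which is the sense in which such schemes "learn" an individual sequence online.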
One basic problem discussed in statistical learning theory is the pattern recognition problem: “How can data be used to find good rules for classifying new cases on the basis of the values of certain features of those cases?” As we have indicated, the simplest version of the problem presupposes that there is an unknown statistical probability distribution that specifies probabilistic relations between feature values of each possible case and its correct classification and also specifies how likely various cases are to come up either as data or as new cases to be classified. In this simplest version, these probabilities are assumed to be identically distributed and are independent of each other, but no other assumption about the probability distribution is made.
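The i.i.d. setup described above can be made concrete with a toy sketch (the distribution, noise level, and learning rule are invented purely for illustration): cases are drawn independently from one fixed distribution unknown to the learner, a classification rule is fitted to training draws, and fresh draws from the same distribution estimate its error.

```python
import random

random.seed(0)

def draw_case():
    """One i.i.d. draw from a fixed distribution (unknown to the
    learner): a binary feature whose correct label equals the
    feature value 80% of the time."""
    x = random.randint(0, 1)
    y = x if random.random() < 0.8 else 1 - x
    return x, y

# Learn a rule from i.i.d. training data: for each feature value,
# predict the label most often observed with it.
train = [draw_case() for _ in range(1000)]
rule = {}
for x_val in (0, 1):
    labels = [y for x, y in train if x == x_val]
    rule[x_val] = max(set(labels), key=labels.count)

# New cases come from the same distribution, so the empirical error
# on fresh draws estimates the rule's true error (about 0.2 here,
# the unavoidable label noise).
test = [draw_case() for _ in range(1000)]
err = sum(rule[x] != y for x, y in test) / len(test)
```

The identically-distributed and independence assumptions are doing the work in the last step: they are what license treating the empirical error on new cases as an estimate of the rule's true error.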