Collected papers on neutrosophics [such as: "neutrosophy" – a new branch of philosophy, "neutrosophic logic" – a generalization of fuzzy logic, "neutrosophic set" – a generalization of the fuzzy set, and "neutrosophic probability" – a generalization of classical probability and imprecise probability] by Florentin Smarandache, Jean Dezert, Andrzej Buller, Mohammad Khoshnevisan, Sarjinder Singh, Sukanto Bhattacharya, Feng Liu, Gh. C. Dinulescu-Campina, Chris Lucas, and Carlos Gershenson. Neutrosophic logic provided the foundation of the Dezert-Smarandache Theory of Plausible and Paradoxical Reasoning, which takes into account the combination of uncertain and contradictory information and is now used in artificial intelligence.
Main headings: Part I. Contributions to the idealizational theory of science. Part II. The nature of scientific cognition. Part III. The development of science. Part IV. Problems of verification of knowledge. Part V. Philosophy of physics and cosmology. Part VI. Some problems of the theory of reality.
Models and theories are of central importance in science, and scientists spend substantial amounts of time building, testing, comparing and revising models and theories. It is therefore not surprising that the nature of scientific models and theories has been a widely debated topic within the philosophy of science for many years. The product of two decades of research, this book provides an accessible yet critical introduction to the debates about models and theories within analytical philosophy of science since the 1920s. Roman Frigg surveys and discusses key topics and questions, including: What are theories? What are models? And how do models and theories relate to each other? The book covers: the linguistic view of theories (also known as the syntactic view of theories), including different articulations of the view, its use of models, the theory-observation divide and the theory-ladenness of observation, and the meaning of theoretical terms; the model-theoretical view of theories (also known as the semantic view of theories), including its analysis of the model-world relationship, the internal structure of a theory, and the ontology of models; scientific representation, discussing analogy, idealisation and different accounts of representation; and modelling in scientific practice, examining how models relate to theories and what models are, classifying different kinds of models, and investigating how robustness analysis, perspectivism, and approaches committed to uncertainty-management deal with multi-model situations. Models and Theories is the first comprehensive book-length treatment of the topic, making it essential reading for advanced undergraduates, researchers, and professional philosophers working in philosophy of science and philosophy of technology. It will also be of interest to philosophically minded readers working in physics, computer sciences and STEM fields more broadly.
Science is highly dependent on technologies to observe scientific objects. For example, astronomers need telescopes to observe planetary movements, and cognitive neuroscience depends on brain imaging technologies to investigate human cognition. But how do such technologies shape scientific practice, and how do new scientific objects come into being when new technologies are used in science? In How Scientific Instruments Speak, Bas de Boer develops a philosophical account of how technologies shape the reality that scientists study, arguing that we should understand scientific instruments as mediating technologies. Rather than mute tools serving pre-existing human goals, scientific instruments play an active role in shaping scientific work. De Boer uses this account to discuss how brain imaging and stimulation technologies mediate the way in which cognitive neuroscientists investigate human cognitive functions. The development of cognitive neuroscience runs parallel with the development of advanced brain imaging technologies, drawing a lot of public attention—sometimes called “neurohype”—because of its alleged capacity to demystify the human mind. By analyzing how the objects that cognitive neuroscientists study are mediated by brain imaging technologies, de Boer explicates the processes by which human cognition is investigated.
This book deals with applications of quantum mechanical techniques to areas outside of quantum mechanics, so-called quantum-like modeling. Research in this area has grown over the last 15 years, but its roots reach back more than 50 years: in the 1950s, the physics Nobel laureate Wolfgang Pauli and the psychologist Carl Jung sought analogous uses in psychology of the complementarity principle from quantum mechanics. This book does NOT claim that society is quantum mechanical; the macroscopic world is manifestly not. But this does not rule out using concepts and the mathematical apparatus of quantum physics in a macroscopic environment. A mainstay ingredient of quantum mechanics is 'quantum probability', and this tool has proven useful in the mathematical modelling of decision making. In the most basic experiment of quantum physics, the double-slit experiment, it is known (from the works of A. Khrennikov) that the law of total probability is violated. It is now well documented that several decision-making paradoxes in psychology and economics (such as the Ellsberg paradox) also exhibit this violation of the law of total probability. When data is collected from experiments that test 'non-rational' decision-making behaviour, such data often exhibits a complex non-commutative structure, which may be even more complex than the structure associated with the basic two-slit experiment. The community exploring quantum-like models has tried to address how quantum probability can help to better explain those paradoxes, and research on resolving some of them with the mathematics of quantum physics has now been published in very high-standing journals. The aim of this book is to collect the contributions of the world's leading experts in quantum-like modeling in decision making, psychology, cognition, economics, and finance.
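The violation of the law of total probability mentioned in this description can be sketched in a standard textbook form (this formulation is a general illustration, not taken from the book itself). Classically, if a particle must pass through slit $A_1$ or slit $A_2$, the probability of detection at a point $x$ decomposes as

\[
P(x) = P(A_1)\,P(x \mid A_1) + P(A_2)\,P(x \mid A_2),
\]

whereas in the quantum double-slit experiment the observed distribution carries an additional interference term,

\[
P(x) = P(A_1)\,P(x \mid A_1) + P(A_2)\,P(x \mid A_2)
     + 2\sqrt{P(A_1)\,P(x \mid A_1)\,P(A_2)\,P(x \mid A_2)}\,\cos\theta ,
\]

where $\theta$ is a phase angle. Quantum-like models of decision making exploit the same structure: the nonzero interference term measures the departure of empirical choice data from the classical law of total probability.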
This book argues for evolutionary epistemology and for distinguishing functionality from physicality in the social sciences, and it explores the implications of this approach for understanding in biology, economics, psychology and political science. Presenting a comprehensive overview of philosophical topics in the social sciences, the book emphasizes that all human cognition and behavior is characterized by functionality and complexity, and thus cannot be explained by the point predictions and exact laws found in the physical sciences. Realms of functional complexity – such as the market order in economics, the social rules of conduct, and the human CNS – require a focus on explaining the principles involved rather than predicting exact outcomes, which in turn requires study of the historical context in order to understand behavior and cognition. This approach notes that functional complexity is central to classical liberal ideas such as the division of labour and knowledge, and that this is a far more powerful and adequate account of social organization than central planning. Through comparison of these approaches, as well as its interdisciplinary scope, this book will interest both academics and students in philosophy, biology, economics, psychology and all other social sciences.
The social sciences study knowing subjects and their interactions. A "cognitive turn", based on cognitive science, has the potential to enrich these sciences considerably. Cognitive economics belongs within this movement of the social sciences. It aims to take into account the cognitive processes of individuals in economic theory, both on the level of the agent and on the level of their dynamic interactions and the resulting collective phenomena. This is an ambitious research programme that aims to link two levels of complexity: the level of cognitive phenomena as studied and tested by cognitive science, and the level of collective phenomena produced by the economic interactions between agents. Such an objective requires cooperation, not only between economists and cognitive scientists but also with mathematicians, physicists and computer scientists, in order to renew, study and simulate models of dynamical systems involving economic agents and their cognitive mechanisms. The hard core of classical economics is the General Equilibrium Theory, based on the optimising rationality of the agent and on static concepts of equilibrium, following a point of view systemised in the framework of Game Theory. The agent is considered "rational" if everything takes place as if he were maximising a function representing his preferences, his utility function.
Filling the gap for an up-to-date textbook in this relatively new interdisciplinary research field, this volume provides readers with a thorough and comprehensive introduction. Based on extensive teaching experience, it includes numerous worked examples and highlights in special biographical boxes some of the most outstanding personalities and their contributions to both physics and economics. The whole is rounded off by several appendices containing important background material.
Why do we volunteer time? Why do we contribute money? Why, even, do we vote, if the effect of a single vote is negligible? Rationality-based microeconomic models are hard-pressed to explain such social behavior, but Howard Margolis proposes a solution. He suggests that within each person there are two selves, one selfish and the other group-oriented, and that the individual follows a Darwinian rule for allocating resources between those two selves. "Howard Margolis's intriguing ideas . . . provide an alternative to the crude models of rational choice that have dominated economics and political science for too long."—Times Literary Supplement
This book offers a thorough introduction to the highly promising complex agent-based approach to economics, in which agent-based models (ABMs) are used to represent economic systems as complex and evolving systems composed of heterogeneous agents of limited rationality who interact with each other, generating the system’s emergent properties in the process. This approach represents a response to the limitations of the dominant theory in economics, which does not consider the possibility of a major crisis, and to the inability of dynamic stochastic general equilibrium theory to generate empirically falsifiable propositions. In the new perspective, the focus is on identifying the elements of instability rather than the triggering event. As the theory of complexity demonstrates, the interactions of heterogeneous agents produce non-linearity: this puts an end to the age of certainties. With ABMs, the methodology is “from the bottom up”. The individual parameters and their distribution are estimated, and then evaluated to verify whether aggregate regularities emerge on the whole. In short, not only micro, but also meso and macro empirical validation are employed. Moreover, it shows that the mantra of growth should be supplanted by the concept of a-growth. Given its depth of coverage, the book will enable students at the undergraduate and Master’s level to gain a firm grasp of this important emerging approach. “This book is flower blossomed by one of the two greatest Italian economists.” Bruce Greenwald, Columbia University “The author’s - the ABM prophet’s - thoughts on economics have been at the forefront of the world. Without a firm belief in and dedication to human society, it is impossible to write such a book. This is a work of high academic value, which can help readers quickly understand the history and current situation of complex economic theory. In particular, we can understand the basic viewpoints, academic status, advantages and shortcomings of various schools of economic theory.” Jie Wu, Guangzhou Milestone Software Co., China