See below for a selection of the latest books in the Cognitive science category. Books presented with a red border have been lovingly read and reviewed by the experts at Lovereading. With expert reading recommendations made by people with a passion for books, and some unique features, Lovereading will help you find great Cognitive science books — and books from many more genres — to keep you inspired and entertained. And it's all free!
Experts from a range of disciplines explore how humans and artificial agents can quickly learn completely new tasks through natural interactions with each other. Humans are not limited to a fixed set of innate or preprogrammed tasks. We learn quickly through language and other forms of natural interaction, and we improve our performance and teach others what we have learned. Understanding the mechanisms that underlie the acquisition of new tasks through natural interaction is an ongoing challenge. Advances in artificial intelligence, cognitive science, and robotics are leading us to future systems with human-like capabilities. A huge gap exists, however, between the highly specialized niche capabilities of current machine learning systems and the generality, flexibility, and in situ robustness of human instruction and learning. Drawing on expertise from multiple disciplines, this Strüngmann Forum Report explores how humans and artificial agents can quickly learn completely new tasks through natural interactions with each other. The contributors consider functional knowledge requirements, the ontology of interactive task learning, and the representation of task knowledge at multiple levels of abstraction. They explore natural forms of interactions among humans as well as the use of interaction to teach robots and software agents new tasks in complex, dynamic environments. They discuss research challenges and opportunities, including ethical considerations, and make proposals to further understanding of interactive task learning and create new capabilities in assistive robotics, healthcare, education, training, and gaming.
Contributors: Tony Belpaeme, Katrien Beuls, Maya Cakmak, Joyce Y. Chai, Franklin Chang, Ropafadzo Denga, Marc Destefano, Mark d'Inverno, Kenneth D. Forbus, Simon Garrod, Kevin A. Gluck, Wayne D. Gray, James Kirk, Kenneth R. Koedinger, Parisa Kordjamshidi, John E. Laird, Christian Lebiere, Stephen C. Levinson, Elena Lieven, John K. Lindstedt, Aaron Mininger, Tom Mitchell, Shiwali Mohan, Ana Paiva, Katerina Pastra, Peter Pirolli, Roussell Rahman, Charles Rich, Katharina J. Rohlfing, Paul S. Rosenbloom, Nele Russwinkel, Dario D. Salvucci, Matthew-Donald D. Sangster, Matthias Scheutz, Julie A. Shah, Candace L. Sidner, Catherine Sibert, Michael Spranger, Luc Steels, Suzanne Stevenson, Terrence C. Stewart, Arthur Still, Andrea Stocco, Niels Taatgen, Andrea L. Thomaz, J. Gregory Trafton, Han L. J. van der Maas, Paul Van Eecke, Kurt VanLehn, Anna-Lisa Vollmer, Janet Wiles, Robert E. Wray III, Matthew Yee-King
In the wake of fresh allegations that the personal data of Facebook users was illegally used to influence the outcome of the US general election and the Brexit vote, the debate over the manipulation of social big data continues to gain momentum. Cyber Influence and Cognitive Threats addresses various emerging challenges in cyber security, examining cognitive applications in decision making, behaviour, and basic human interaction. The book examines the role of psychology in cyber security by addressing each factor involved in the process: hackers, targets, cyber-security practitioners, and the wider social context in which these groups operate. Cyber Influence and Cognitive Threats covers a variety of topics, including information systems, psychology, sociology, human resources, leadership, strategy, innovation, law, finance, and others.
Andy Clark is a leading philosopher of cognitive science, whose work has had an extraordinary impact throughout philosophy, psychology, neuroscience, and robotics. His monographs have led the way for new research programs in the philosophy of mind and cognition: Microcognition (1989) and Associative Engines (1993) introduced the philosophical community to connectionist research and the novel issues it raised; Being There (1997) showed the relevance of embodiment, dynamical systems theory, and minimal computation frameworks for the study of the mind; Natural Born Cyborgs (OUP 2003) presented an accessible development of embodied and embedded approaches to understanding human nature and cognition; Supersizing the Mind (OUP 2008) developed this yet further, along with the famous Extended Mind hypothesis; and Surfing Uncertainty (OUP 2017) presents a framework for uniting perception, action, and the embodied mind. In Andy Clark and His Critics, a range of high-profile researchers in philosophy of mind, philosophy of cognitive science, and empirical cognitive science critically engage with Clark's work across the themes of: Extended, Embodied, Embedded, Enactive, and Affective Minds; Natural Born Cyborgs; and Perception, Action, and Prediction. Daniel Dennett provides a foreword on the significance of Clark's work, and Clark replies to each section of the book, thus advancing the current literature with original contributions that will form the basis for new discussions, debates, and directions in the discipline.
Modern populations are superficially aware of media potentials and paraphernalia, but recent events have emphasized the general ignorance of the sentient media. Advertising has long been suspected of cognitive manipulation, but emergent issues of political hacking, false news, disinformation campaigns, lies, neuromarketing, misuse of social media, pervasive surveillance, and cyber warfare are presently challenging the world as we know it. Media Models to Foster Collective Human Coherence in the PSYCHecology is an assemblage of pioneering research on the methods and applications of video games designed as a new genre of dream analogs. Highlighting topics including virtual reality, personality profiling, and dream structure, this book is ideally designed for professionals, researchers, academicians, psychologists, psychiatrists, sociologists, media specialists, game designers, and students hoping for the creation of sustainable social patterns in the emergent reality of energy and information.
With contributions from founders of the field, including Justin Barrett, E. Thomas Lawson, Robert N. McCauley, Pascal Boyer, Armin Geertz, and Harvey Whitehouse, as well as from younger scholars from successive stages in the field's development, this is an important survey of the first twenty-five years of the cognitive science of religion. Each chapter provides the author's views on the contributions the cognitive science of religion has made to the academic study of religion, as well as any shortcomings in the field and challenges for the future. Religion Explained? The Cognitive Science of Religion after Twenty-five Years calls attention to the field whilst providing an accessible and diverse survey of approaches from key voices, as well as offering suggestions for further research within the field. This book is essential reading for anyone in religious studies, anthropology, and the scientific study of religion.
Robert N. McCauley and E. Thomas Lawson are considered the founders of the field of the cognitive science of religion. Since its inception over twenty years ago, the cognitive science of religion has raised questions about the philosophical foundations and implications of such a scientific approach. This volume from McCauley, including chapters co-authored by Lawson, is the first book-length project to focus on such questions, resulting in a compelling volume that addresses fundamental questions that any scholar of religion should ask. The essays collected in this volume are those that initially defined this scientific field for the study of religion. These essays deal with issues of methodology, reductionism, resistance to the scientific study of religion, and other criticisms that have been lodged against the cognitive science of religion. The new final chapter sees McCauley reflect on developments in this field since its founding. Tackling these debates head on and in one place for the first time, this volume belongs on the shelf of every researcher interested in this now established approach to the study of religion within a range of disciplines, including religious studies, philosophy, anthropology and the psychology of religion.
4E cognition (embodied, embedded, enactive, and extended) is a relatively young and thriving field of interdisciplinary research. It assumes that cognition is shaped and structured by dynamic interactions between the brain, body, and both the physical and social environments. With essays from leading scholars and researchers, The Oxford Handbook of 4E Cognition investigates this recent paradigm. It addresses the central issues of embodied cognition by focusing on recent trends, such as Bayesian inference and predictive coding, and presenting new insights, such as the development of false belief understanding. The Oxford Handbook of 4E Cognition also introduces new theoretical paradigms for understanding emotion and conceptualizing the interactions between cognition, language, and culture. With an entire section dedicated to the application of 4E cognition in disciplines such as psychiatry and robotics, and critical notes aimed at stimulating discussion, this Oxford handbook is the definitive guide to 4E cognition. Aimed at neuroscientists, psychologists, psychiatrists, and philosophers, The Oxford Handbook of 4E Cognition will be essential reading for anyone with an interest in this young and thriving field.
According to a leading cognitive scientist, we've been teaching reading wrong. The latest science reveals how we can do it right. In 2011, when an international survey reported that students in Shanghai dramatically outperformed American students in reading, math, and science, President Obama declared it a "Sputnik moment": a wake-up call about the dismal state of American education. Little has changed, however, since then: over half of our children still read at a basic level, and few become highly proficient. Many American children and adults are not functionally literate, with serious consequences. Poor readers are more likely to drop out of the educational system and, as adults, are unable to fully participate in the workforce, adequately manage their own health care, or advance their children's education. In Language at the Speed of Sight, internationally renowned cognitive scientist Mark Seidenberg reveals the underexplored science of reading, which spans cognitive science, neurobiology, and linguistics. As Seidenberg shows, the disconnect between science and education is a major factor in America's chronic underachievement. How we teach reading places many children at risk of failure, discriminates against poorer kids, and discourages even those who could have become more successful readers. Children aren't taught basic print skills because educators cling to the disproved theory that good readers guess the words in texts, a strategy that encourages skimming instead of close reading. Interventions for children with reading disabilities are delayed because parents are mistakenly told their kids will catch up if they work harder. Learning to read is more difficult for children who speak a minority dialect in the home, but that is not reflected in classroom practices. By building on science's insights, we can improve how our children read, and take real steps toward solving the inequality that illiteracy breeds.
Both an expert look at our relationship with the written word and a rousing call to action, Language at the Speed of Sight is essential for parents, educators, policy makers, and all others who want to understand why so many fail to read, and how to change that.
Neuroscientific evidence has educated us in the ways in which the brain mediates our thought and behavior and, therefore, forced us to critically examine how we conceive of free will. This volume, featuring contributions from an international and interdisciplinary group of distinguished researchers and scholars, explores how our increasing knowledge of the brain can elucidate the concept of the will and whether or to what extent it is free. It also examines how brain science can inform our normative judgments of moral and criminal responsibility for our actions. Some chapters point out the different respects in which mental disorders can compromise the will and others show how different forms of neuromodulation can reveal the neural underpinning of the mental capacities associated with the will and can restore or enhance them when they are impaired.
A work that reveals the profound links between the evolution, acquisition, and processing of language, and proposes a new integrative framework for the language sciences. Language is a hallmark of the human species; the flexibility and unbounded expressivity of our linguistic abilities are unique in the biological world. In this book, Morten Christiansen and Nick Chater argue that to understand this astonishing phenomenon, we must consider how language is created: moment by moment, in the generation and understanding of individual utterances; year by year, as new language learners acquire language skills; and generation by generation, as languages change, split, and fuse through the processes of cultural evolution. Christiansen and Chater propose a revolutionary new framework for understanding the evolution, acquisition, and processing of language, offering an integrated theory of how language creation is intertwined across these multiple timescales. Christiansen and Chater argue that mainstream generative approaches to language do not provide compelling accounts of language evolution, acquisition, and processing. Their own account draws on important developments from across the language sciences, including statistical natural language processing, learnability theory, computational modeling, and psycholinguistic experiments with children and adults. Christiansen and Chater also consider some of the major implications of their theoretical approach for our understanding of how language works, offering alternative accounts of specific aspects of language, including the structure of the vocabulary, the importance of experience in language processing, and the nature of recursive linguistic structure.