
Alas, not fiction.
I would like to answer this question from the standpoint of experientialism, the cognitive paradigm of human thinking and behavior; I would call it the scientific grounding of "thought as action".
So, a person learns and constructs reality starting from the basic level of categorization: the child learns to distinguish simple objects (coins, chairs, cats, flowers, people, water, etc.) and to name them. This is the natural level (dog), contrasted with the levels created by the imagination above it (animal) and below it (German Shepherd). Things at the basic level are perceived holistically, as a single gestalt, unlike things at the other levels, which are identified by more subtle distinctive properties.

The basic level is the pre-conceptual experience a person gains by manipulating, using, and exploring the objects of their environment. On the foundation of this level, basic concepts are built, and from these the ability to think abstractly grows: phenomena, events, and entities that are not directly perceived arise through the metaphorical and metonymic mapping of basic concepts. Thus, the richer, more complex, and more diverse a person's pre-conceptual experience, the more freedom and flexibility they have in cognition and understanding, the higher their intelligence and adaptability, and the more successful they are in physical and social reality. This is exactly what the article I linked to at the beginning of the answer says:
Gadgets do not allow a full-fledged accumulation of pre-conceptual experience, since they do not engage the body's motor skills; they only imitate physical reality. A person (not only a child, but also an adult) therefore cannot use such a surrogate to build an effective model of the world, which results in weak cognitive abilities. This is what digital dementia is all about.
Good day to everyone reading 🙂
Alas and alack, yes!!!
Why is it not customary to talk or write about this? What are we hiding? Are we burying our heads in the sand, like ostriches, so as not to see reality?
It affects everyone, from children, who are active Internet users from the age of two, to elderly retirees who cannot imagine their lives without the Internet!
The bottom line: students at entrance exams don't know the multiplication table? And after graduating from (and paying for) university, they still don't!!! Right now!
They are already working, treating us, and "teaching" our children… Aren't you afraid? I am, very much. It is both painful and embarrassing. Sad, all of it.
Whether you agree or disagree, please respond! The question is more than topical. For Christ's sake, don't pass me by; say your piece…
Just as real as dementia from any other cause. The individual features specific to digital dementia are not decisive, since the basis is precisely the underdevelopment of the individual's cognitive functions.
Dementia proper is a pathology with physiological causes. In other words, the brain, as the instrument of consciousness and reason on the physical plane, simply does not function normally.
Digital "dementia" is a metaphor: it refers not to problems with the instrument (the brain), but to problems with the individual's psyche.
In that sense, the definition should be taken allegorically.