4 Answers

  1. Neither one nor the other.
    And the problem is not even that the machine does this or that; the problem is that the machine is, so to speak, an “extension of the developers' intelligence.”
    In other words, yes, algorithms can be very complex, even self-tuning, and this creates the illusion that the machine is doing something on its own. But it isn't. It just follows the instructions that people put into it. Even when a program writes another program, both the first and the second are extensions of the developers' intelligence.

    It's like asking whether a puppet can dance like a human. The answer is: it depends on the skill of the puppeteer, but without a puppeteer the puppet cannot do anything at all. Something similar holds for machines.

    Maybe someday people will figure out how to create a full-fledged AI, and then the situation will change, but not before.
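    The point about one program writing another can be illustrated with a minimal sketch (a hypothetical example, not anything from the answer itself): the generated code does only what the generating code was written to produce, so both remain products of the developer's intent.

    ```python
    # A "generator" program that writes and then runs another program.
    # The generated function exists only because the developer specified
    # its source here — both layers are extensions of the developer's intelligence.

    source = "def square(n):\n    return n * n\n"  # code authored by the generator

    namespace = {}
    exec(source, namespace)   # execute the generated program
    print(namespace["square"](7))  # -> 49
    ```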

  2. A machine can't become human, and it's not just that the human brain has far more neurons than the most advanced computer; a human also has compassion and sensitivity. You could argue that all this is mind games, and that by bringing robots to electromechanical perfection we will be able to make them “read” emotions from a face and pick up smells. No, human sensitivity is not just about that. The machine cannot reproduce its own neurons; it is dependent. These are different waves and different fields. It is a crude imitation, something entirely different. A log will remain a log, no matter how it is upgraded.

  3. I have always been struck by the naive belief with which people treat collective abstract concepts as having meaning. Animality, humanity, Europeanness, and so on are absolutely meaningless words. What is humanity, and how should it manifest itself? Tell us about that.

  4. How different is a human being from a machine? Our entire system of reactions is built on the primitive “carrot and stick” method, running on our hormones. Will a machine ever be able to recognize itself as a person? We'll never know. Can you give a one-hundred-percent guarantee that ALL the people around you are sufficiently self-aware in this way? I don't think so. Can we be sure that our mind is not a product of matter? To be honest, I don't think so. This is more a question of philosophy: whether we are a “rational body,” or some kind of consciousness that observes our body.
