
The Next Generation Of Brain Mimicking AI

“New Mind”

Visit to get a 30-day free trial + 20% off your annual subscription. The tech industry’s obsession with AI …

21 Comments

  1. An iPhone needs about 4 Ah at 4 volts, so roughly 16 Wh per full charge; 300 Wh is therefore less than 20 times the energy required to charge an iPhone, not 60. The same goes for other phones. Still a lot. Fast charging on an iPhone 12 Pro (with a 4,082 mAh battery) typically draws around 20-30 watts to go from 0-50% in about 30 minutes, so accounting for losses during charging it may be only 10-15 times (see the quick check below).
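    A quick sanity check of that arithmetic in Python (a sketch: the 300 Wh budget is the figure being compared against, and the 80% wall-to-battery efficiency is an assumed value):

    ```python
    # Back-of-the-envelope check of the iPhone-charging comparison.
    battery_ah = 4.0   # assumed iPhone battery capacity, ~4 Ah
    battery_v = 4.0    # assumed nominal cell voltage, ~4 V
    charge_wh = battery_ah * battery_v  # ~16 Wh per full charge

    budget_wh = 300.0  # energy figure quoted in the video

    ideal = budget_wh / charge_wh       # ~18.8, i.e. "less than 20 times"
    efficiency = 0.80                   # assumed wall-to-battery efficiency
    realistic = budget_wh * efficiency / charge_wh  # ~15 charges

    print(f"ideal: {ideal:.1f} charges, with losses: {realistic:.1f}")
    ```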

  2. Consciousness is non-computable. Thus sentience awaits models that are non-computable. Perhaps this is getting closer. I believe that only a model in which wave functions collapse can produce true intelligence. Our brains are now believed to utilize quantum effects, for which decoherence is critically important. It is known that the only non-computable phenomena in all of nature involve the collapse of a wave function. We must push toward this end or we will just have a very different type of computer, but a Turing machine nonetheless.

  3. This is an insane techno version of the cargo cults. Nobody knows how the human brain works to produce the human mind. You may as well be making B-29 bombers out of bamboo leaves.

  4. Is it possible to combine Spiking Neural Networks (SNNs) and Artificial Neural Networks (ANNs)? For example, CNNs can be combined with Transformers where the CNN handles initial feature extraction and then passes the processed information to the Transformer. Similarly, SNNs could be integrated with ANNs, where SNNs preprocess and extract features from visual data before passing it to an ANN for further processing.

    Could we extend this concept to create a hybrid model with SNN layers inside a Transformer architecture? The SNN layers would handle early feature extraction, leveraging their efficiency and temporal processing capabilities, and then feed these extracted features into the Transformer for high-level sequence modeling. This approach could combine the strengths of both networks, resulting in a more powerful and efficient system.

    I see this as a more viable approach. There are also chips coming that are purpose-built for ANNs and will be far more efficient. (A rough sketch of the hybrid idea follows below.)
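    A minimal sketch of what such a hybrid could look like, assuming PyTorch and a hand-rolled leaky integrate-and-fire (LIF) layer; the layer sizes, threshold, and the rate-coded handoff into the Transformer are illustrative assumptions, not a reference design:

    ```python
    import torch
    import torch.nn as nn

    class LIFLayer(nn.Module):
        """Leaky integrate-and-fire neurons with a hard spike threshold."""
        def __init__(self, in_features, out_features, beta=0.9, threshold=1.0):
            super().__init__()
            self.fc = nn.Linear(in_features, out_features)
            self.beta = beta            # membrane leak factor per timestep
            self.threshold = threshold  # firing threshold

        def forward(self, x_seq):       # x_seq: (batch, time, in_features)
            batch, steps, _ = x_seq.shape
            mem = torch.zeros(batch, self.fc.out_features, device=x_seq.device)
            spikes = []
            for t in range(steps):
                mem = self.beta * mem + self.fc(x_seq[:, t])  # leak + integrate
                spk = (mem >= self.threshold).float()         # fire
                mem = mem - spk * self.threshold              # soft reset
                spikes.append(spk)
            return torch.stack(spikes, dim=1)  # spike trains, (batch, time, out)

    class HybridSNNTransformer(nn.Module):
        """SNN front end for feature extraction, Transformer for sequence modeling."""
        def __init__(self, in_features=64, d_model=128, nhead=4, layers=2):
            super().__init__()
            self.snn = LIFLayer(in_features, d_model)
            enc = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.transformer = nn.TransformerEncoder(enc, num_layers=layers)

        def forward(self, x_seq):
            spk = self.snn(x_seq)         # early, event-style feature extraction
            return self.transformer(spk)  # high-level sequence modeling on top

    model = HybridSNNTransformer()
    out = model(torch.rand(8, 16, 64))    # batch of 8, 16 timesteps, 64 features
    print(out.shape)                      # torch.Size([8, 16, 128])
    ```

    One caveat on the design: the hard spike threshold is non-differentiable, so training this stack end-to-end would in practice require surrogate gradients (or a pre-trained, frozen SNN front end).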

  5. 2.6 kW with 1/24th of the neural-network size of a high-end consumer GPU? It's interesting, but we're far from even matching the efficiency of the matrix-based gradient-descent approach. Multiple order-of-magnitude improvements are required just to catch up, and additional improvements would be needed to supersede what we have now and be worth the cost of adopting a new and very different technology.

    I want to see this succeed and think it would be cool, but there also seem to be a lot of obstacles. It seems like this may take quite some time. (A rough back-of-the-envelope comparison follows below.)
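    For a rough sense of the gap, here are the numbers from this comment plus one assumption (a high-end consumer GPU drawing about 450 W):

    ```python
    # Naive linear scaling: if 2.6 kW runs 1/24th of the network a GPU
    # hosts, extrapolating to the full network gives the power below.
    neuro_power_w = 2600.0      # figure from the comment
    network_fraction = 1 / 24   # fraction of the GPU-scale network it runs
    gpu_power_w = 450.0         # assumed high-end consumer GPU board power

    scaled_power_w = neuro_power_w / network_fraction  # 62,400 W
    gap = scaled_power_w / gpu_power_w                 # ~139x
    print(f"~{gap:.0f}x more power per unit of network, i.e. >2 orders of magnitude")
    ```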

  6. I have a question for everyone who watches this video. Let's say you avoided such a huge energy expenditure: you have developed a processor and an artificial-intelligence model that consume much less energy and run much faster. How will you solve the security problems this creates? For example, cracking passwords and bypassing digital security at such speed is not even a challenge. Developments in the field of security are not as rapid as those in the field of artificial intelligence. Such terrible power always poses a serious danger in the hands of those with evil intentions. In this case, would you prefer a very advanced processor (and hence artificial intelligence), or security?

  7. During dark-room meditation I have observed similar images gently moving, traveling, and swirling through my mind, such as those shown from 14:32 to 14:57. They all appeared more like the final images at 14:54 to 14:57 in this video. Even though I had non-specific thoughts during my meditations, the greyish-blue images moved continually as I observed them.

  8. Put it in the brain's perspective: it uses roughly 20% of your entire body's energy despite being only about 2% of its mass. So if AI can be seen as our collective mind, then the whole idea that it is using too much energy is nonsense. Surely it can be further optimized, but nature shows that intelligent thinking is more important than any other physical process in our body, or at least equally important!

  9. Your energy computations are way off. But of course you're right about your conclusion anyway: power consumption will be a limiting factor on the road to AGI. So we do have to find neural-network hardware that is more energy-efficient than vector processors like GPUs.

  10. But if it can remember the answer and store it, then the next time it gets asked that question it can simply retrieve the answer. There are only so many questions, and only so many answers.
    What is the capital of Hawaii?
    How hot is the sun?
    I get that those are simple questions, and it's much more complicated when you ask it to write a program that adds a user to O365 with specific details it asks the user for along the way, such as position, location, name, picture…
    but still… (a minimal cache sketch follows below)
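    A minimal sketch of that retrieval idea; `call_model` is a hypothetical stand-in for whatever model actually answers cache misses:

    ```python
    def call_model(question: str) -> str:
        # Hypothetical placeholder for an expensive LLM call.
        return f"<model answer to: {question}>"

    cache = {}  # normalized question -> stored answer

    def answer(question: str) -> str:
        key = " ".join(question.lower().split())  # crude normalization
        if key not in cache:
            cache[key] = call_model(question)     # miss: compute once, store
        return cache[key]                         # hit: free retrieval

    print(answer("What is the capital of Hawaii?"))
    print(answer("what is  the capital of Hawaii?"))  # normalized: cache hit
    ```

    Exact-match caching like this only covers verbatim repeats; the parameterized requests above (position, location, name, picture) make each prompt effectively unique, which is where plain lookup breaks down.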
