NPUs: the most overhyped new chip?

“TechAltar”


22 Comments

  1. NPUs are there to snitch on you, seeing your messages before they are even encrypted by messaging apps like Signal. Rob Braxmann covered this issue.

  2. AI is definitely here to stay for some applications and some industries. But my God, we don't need or want AI in everything. I've tried some of these new AI features but don't use any of them on a regular basis. In fact, after messing around a little, I almost never use AI.

  3. I've been working with NPUs for the past 2 years on embedded devices and cameras. I definitely think they are not overhyped at all. But AI in general is in a bubble right now. It seems the crypto bros have found their new hobby.

  4. NPUs are like RTX cards. A good idea in principle; it'll just take 5-10 years until they are actually useful and mature.

    I would absolutely love open-source on-device AI:
    1) spell check/rewriting,
    2) image upscaling,
    3) image style transfer (e.g. to color sketch art I like),
    4) text-image embedding search & clustering/tagging,
    5) live speech-to-speech translation (e.g. automatic English anime dubs),
    6) plant & mushroom identification,
    7) shopping assistant (e.g. to sift through all the unrelated garbage that Amazon search gives you and automatically find the closest product to your requirements),
    8) search assistant in general that automatically researches what I'm asking for and ACTUALLY gives me relevant results (unlike any modern search engine)
    9) deep fake & bot detection
    10) health analytics (e.g. EEG analysis)

    All of these are already possible and useful in some cases, as long as you keep the limitations in mind. They aren't mature and reliable enough for most people, but it strongly depends on the use case.
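    Item 4 in the list above (text-image embedding search) reduces to ranking precomputed embedding vectors by cosine similarity. A minimal sketch, assuming an encoder (e.g. a CLIP-style model running on the NPU) has already produced the vectors; the toy arrays here are made up for illustration:

    ```python
    import numpy as np

    def cosine_similarity(a, b):
        # Cosine similarity between two embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def search(query_vec, image_vecs):
        # Return image indices ranked by similarity to the query embedding.
        scores = [cosine_similarity(query_vec, v) for v in image_vecs]
        return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)

    # Toy embeddings standing in for real encoder output.
    query = np.array([1.0, 0.0, 0.0])
    images = [np.array([0.9, 0.1, 0.0]),   # close to the query
              np.array([0.0, 1.0, 0.0])]   # unrelated
    print(search(query, images))  # [0, 1]: the first image ranks highest
    ```

    The same ranking loop covers clustering and tagging (item 4) once the embeddings exist; in practice the heavy lifting is the encoder, which is exactly the part an NPU would accelerate.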

  5. Then why is Apple being stingy with RAM, telling people their 8GB of RAM is magically equal to 16GB of regular RAM? Even so, it's still not enough. Personally, I need a minimum of 32GB, and 64GB to feel nice and comfy, but I'm a special snowflake like that. If it were up to me, I would make 24GB the new standard.

  6. I just had a weird idea. What if we had BIG CUDA cores? What are the heaviest threads? The big CUDA cores could focus their attention on the big stuff, or is that how it already works?
