gorbot 4 hours ago

I'm an idiot and I know nothing

But I wonder if there could be room for an ARM-like spec that Google could try to own and license, but for AI chips. ARM is to RISC CPUs as this Google thing would be to AI ASICs.

Probably a dumb idea; maybe it's better to just sell the chips, or access to them?

  • eru 4 hours ago

    I'm not sure the chip spec (or instruction set) is the level of abstraction here?

    Something like DirectX (or OpenGL) might be the better level to target? In practice, CUDA is that level of abstraction, but it only really works for Nvidia cards.

    • karmakaze 4 hours ago

It's not that it only works on Nvidia cards; it's only *allowed* to work on Nvidia cards. Non-clean-room implementations of CUDA for other hardware have been done, but they violate the EULA (of the thing that was reverse engineered), copyright on the driver binary interface, and often patents. Nvidia aggressively sends cease-and-desist letters and threatens lawsuits (it successfully killed ZLUDA and has threatened others). It's an artificial (in a technical sense) moat.

    • latchkey 4 hours ago

      > CUDA is that level of abstraction, but it only really works for Nvidia cards.

      There are people actively working on that.

      https://scale-lang.com/

shrubble 6 hours ago

Not much real data or news there.

aurareturn 4 hours ago

I think we need an analysis of tokens per dollar and tokens per second for Nvidia Blackwell vs. Ironwood.
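
The two metrics relate directly: given an hourly instance price and a sustained decode throughput, cost per token falls out as a one-liner. A back-of-envelope sketch (all numbers below are placeholders, not measured Blackwell or Ironwood figures):

```python
# Back-of-envelope: cost per million output tokens from an hourly
# instance price and a sustained throughput. Placeholder numbers only.

def cost_per_million_tokens(dollars_per_hour: float, tokens_per_second: float) -> float:
    tokens_per_hour = tokens_per_second * 3600
    return dollars_per_hour / tokens_per_hour * 1_000_000

# Hypothetical example: a $10/hr instance sustaining 5,000 tok/s
print(round(cost_per_million_tokens(10.0, 5000.0), 3))  # → 0.556
```

The hard part isn't the arithmetic, it's getting honest sustained tokens/second numbers at matched batch sizes and context lengths.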

  • ipnon 4 hours ago

    It depends on how they're utilized. Especially at these scales, you have to squeeze every bit out.

jeffbee 4 hours ago

These are only available in Iowa on GCP, which to me raises this question: do they have them all over the world for their own purposes, or does this limited geography also mean that users of Google AI features get varied experiences depending on their location?

  • wmf 2 hours ago

    Running on v6 vs. v7 should just mean different performance.

bigyabai 5 hours ago

> It’s designed for AI with AI

CUDA engineers, your job security has never felt more certain.

bgwalter 5 hours ago

So we will be getting wrong answers faster now.