• 346 Posts
  • 526 Comments
Joined 4 years ago
Cake day: June 28th, 2020

  • We are not making anyone move. All of the mods have already left, myself included, except wabooti, who has been MIA for 8 months. This is a courtesy notice to the community. Whether wabooti will continue is up to him. I will just stay here long enough for the discussion, if any.

    You can of course make your own collapse community (with blackjack and hookers), but perhaps pick an instance where the admins are not overshoot deniers. It seems, though, that you don’t quite understand the significance of that “slap on the wrist”.

  • A free-running cellular automaton (CA) approach in hardware would work, but each cell would need to be a much souped-up SRAM cell, and the interactions would all be local and 2D. Considering that Cerebras fits 40 GB of SRAM on a 300 mm wafer-scale die and is roughly at the cooling limit, I’m afraid you do not have 5 orders of magnitude of headroom. Perhaps reversible spintronics can help with the power draw, but you still have to splat a higher-dimensional network (so not just local interactions) into a 2D array.
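    To make the “all local and 2D” constraint concrete, here is a minimal software sketch (illustrative only, not the hardware design above): a 2D cellular automaton where each cell updates purely from its 8 nearest neighbours, with Conway’s Game of Life standing in for whatever per-cell logic a souped-up SRAM cell would hold.

    ```python
    def step(grid):
        """One synchronous CA update; interactions are strictly local (radius 1)."""
        h, w = len(grid), len(grid[0])
        nxt = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                # Count live neighbours in the 3x3 Moore neighbourhood
                # (toroidal wrap at the edges).
                n = sum(grid[(y + dy) % h][(x + dx) % w]
                        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                        if (dy, dx) != (0, 0))
                # Game of Life rule: birth on 3, survival on 2 or 3.
                nxt[y][x] = 1 if (n == 3 or (grid[y][x] and n == 2)) else 0
        return nxt

    # A "blinker" oscillates with period 2 under these purely local rules.
    blinker = [[0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0],
               [0, 1, 1, 1, 0],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0]]
    after_two = step(step(blinker))  # returns to the original pattern
    ```

    Nothing here reaches beyond the immediate neighbourhood, which is exactly why embedding a network with long-range (higher-dimensional) connectivity into such a fabric is the hard part.
    
    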

  • They write

    “Of course, AMD is trying to get into the AI training and inferencing game itself with the Instinct MI300 chip. And that, perhaps, is the main if modest cause for hope. If AMD can gain some traction in that huge market, it will not only be making lots of money, it will be in a position to do a similar thing to Nvidia and push some of that technology across into its gaming GPUs.”

    which strikes me as incorrect. AMD’s Instinct (MI) line is already pretty widespread in HPC. With margins lower in the consumer market, it makes sense for AMD to focus on HPC.