By Tom Abate
Smartwatches and other battery-powered electronics would be even smarter if they could run AI algorithms. But efforts to build AI-capable chips for mobile devices have so far hit a wall – the so-called “memory wall” that separates the data-processing and memory chips, which must work together to meet the massive and continually growing computational demands of AI.
Hardware and software innovations give eight chips the illusion that they’re one mega-chip working together to run AI. (Image credit: Stocksy / Drea Sullivan)
“Transactions between processors and memory can consume 95 percent of the energy needed to do machine learning and AI, and that severely limits battery life,” said computer scientist Subhasish Mitra, senior author of a new study published in Nature Electronics.
Now, a team that includes Stanford computer scientist Mary Wootters and electrical engineer H.-S. Philip Wong has designed a system that can run AI tasks faster, and with less energy, by harnessing eight hybrid chips, each with its own data processor built right next to its own memory storage.
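The benefit of placing each processor beside its own memory can be illustrated with a toy simulation. This sketch is purely illustrative and is not the authors' hardware or software: it partitions a neural-network layer's weight matrix across eight simulated chips so that each "processor" reads only the weight slice stored in its local memory, and only the small activation vector crosses chip boundaries.

```python
import numpy as np

# Toy illustration (not the Stanford team's actual system): split one
# neural-network layer across 8 simulated chips. Each chip stores a slice
# of the weight matrix in its "local memory" and computes only with that
# slice, so the large weight data never moves between chips.

rng = np.random.default_rng(0)
N_CHIPS = 8
IN_DIM, OUT_DIM = 64, 128          # OUT_DIM chosen divisible by N_CHIPS

weights = rng.standard_normal((IN_DIM, OUT_DIM))
x = rng.standard_normal(IN_DIM)    # input activations (small, broadcast)

# Each chip holds one column slice of the weight matrix locally.
local_slices = np.split(weights, N_CHIPS, axis=1)

# Each chip computes its partial output using only its local weights.
partial_outputs = [x @ w_local for w_local in local_slices]

# Concatenating the partial results reproduces the monolithic computation.
y_distributed = np.concatenate(partial_outputs)
y_monolithic = x @ weights
assert np.allclose(y_distributed, y_monolithic)
```

The point of the sketch is the data-movement accounting: only the 64-element activation vector is shared among the eight chips, while the much larger weight matrix stays put next to the compute that uses it – the kind of traffic reduction that addresses the "memory wall" described above.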