EdEarl
Posted April 18, 2016

From Wired.com:

"Today’s Watson is very different. It no longer exists solely within a wall of cabinets but is spread across a cloud of open-standard servers that run several hundred “instances” of the AI at once. Like all things cloudy, Watson is served to simultaneous customers anywhere in the world, who can access it using their phones, their desktops, or their own data servers. This kind of AI can be scaled up or down on demand. Because AI improves as people use it, Watson is always getting smarter; anything it learns in one instance can be immediately transferred to the others. And instead of one single program, it’s an aggregation of diverse software engines—its logic-deduction engine and its language-parsing engine might operate on different code, on different chips, in different locations—all cleverly integrated into a unified stream of intelligence."

The Breakthroughs
- Cheap parallel computation
- Big data
- Better AI algorithms

AI Everywhere
- Google self-driving cars
- Financial predictions
- Others in development, with billions of dollars going into AI

AI + Brain
- 1997: Deep Blue vs. Kasparov
- Today: AI plus Kasparov vs. AI
- Tomorrow: everyone is AI augmented, even more than by carrying a cell phone today

Will the Unified Theory be discovered by AI plus physicists? What is required?
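To make the "aggregation of diverse software engines" idea from the quote concrete, here is a minimal Python sketch: two stand-in engines composed behind one unified front end. Every class and method name here is a hypothetical illustration, not IBM's actual Watson architecture or API.

```python
# Sketch of diverse engines (which could run on different machines)
# integrated behind one interface. All names are invented for the demo.

from concurrent.futures import ThreadPoolExecutor

class LanguageParsingEngine:
    """Stands in for a parser that could live on its own server."""
    def parse(self, question: str) -> list[str]:
        return question.lower().rstrip("?").split()

class LogicDeductionEngine:
    """Stands in for a reasoner running on different code or hardware."""
    def deduce(self, tokens: list[str]) -> str:
        # Toy rule: answer based on keywords found by the other engine.
        if "watson" in tokens:
            return "Watson is a cloud service, not a single machine."
        return "No deduction available."

class UnifiedAssistant:
    """Integrates the engines into one 'stream of intelligence'."""
    def __init__(self):
        self.parser = LanguageParsingEngine()
        self.reasoner = LogicDeductionEngine()
        # The executor hints at engines running in parallel or remotely.
        self.pool = ThreadPoolExecutor()

    def ask(self, question: str) -> str:
        tokens = self.pool.submit(self.parser.parse, question).result()
        return self.pool.submit(self.reasoner.deduce, tokens).result()

if __name__ == "__main__":
    assistant = UnifiedAssistant()
    print(assistant.ask("What is Watson?"))
```

The point of the design is that each engine stays independently replaceable and scalable, while the caller only ever sees the unified interface.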
EdEarl
Posted May 5, 2016 (edited)

Given charlatan reviewers and the wide range of paper quality, each scientist might use a Watson assistant to read and rate papers. Its owner would train the AI by having it read papers with relevant information, then let it read the general literature and select papers for the scientist to read.

Edited May 5, 2016 by EdEarl
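At its simplest, such a paper-rating assistant is just a text classifier trained on the owner's own ratings. Here is a toy sketch using ordinary scikit-learn tools rather than Watson; the sample titles and ratings are invented placeholders, and a real assistant would be trained on papers the scientist actually rated.

```python
# Toy paper-rating assistant: learn the owner's taste from rated
# titles, then score unseen papers. All data below is made up.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1 = relevant/worth reading, 0 = not relevant (owner-provided ratings).
training_papers = [
    "Deep learning for protein structure prediction",
    "Neural networks applied to turbulence modeling",
    "A survey of medieval bookbinding techniques",
    "Marketing strategies for small retail businesses",
]
ratings = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a standard baseline ranker.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(training_papers, ratings)

# Let the assistant scan "the general literature" and rank new papers.
new_papers = [
    "Convolutional networks for galaxy classification",
    "Tax accounting rules for partnerships",
]
for paper, score in zip(new_papers, model.predict_proba(new_papers)[:, 1]):
    print(f"{score:.2f}  {paper}")
```

The scientist would then read only the papers scoring above some threshold, and each new rating becomes another training example.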
Enthalpy
Posted June 13, 2016

On "The Breakthroughs: Better AI algorithms":

Not really better algorithms... What runs at present is still gradient backpropagation (Yann LeCun, around 1987) and random annealing of neural-network coefficients (a few years later). It's mainly that computers are faster now, and by chance such algorithms fit parallel computers not too badly. You know the proverb: when programming, you can be clever and fast, or you can be dumb and slow so that hardware progress catches up with your bad software. But you shouldn't be dumb and fast. (Hum, Yann isn't dumb for sure. Impressive, in fact. But I can't resist a silly joke.)
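For concreteness, here is a minimal sketch of that 1980s-era gradient backpropagation, training a tiny sigmoid network on XOR by plain gradient descent. This is an illustrative NumPy toy under assumed choices (layer sizes, learning rate, step count), not any particular published implementation.

```python
# Minimal gradient backpropagation demo: a 2-4-1 sigmoid network
# learning XOR. Hyperparameters are arbitrary demo choices.

import numpy as np

rng = np.random.default_rng(1)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(10_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: chain rule, using sigmoid'(z) = s * (1 - s).
    d_out = (out - y) * out * (1 - out)     # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)      # error propagated back to hidden layer

    # Gradient-descent updates (a poor seed may stall in a local minimum).
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```

Nothing here needed hardware beyond a 1980s workstation; what changed is that the same matrix arithmetic now runs over millions of coefficients on parallel hardware.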