AWS re:Invent: 'We’re Laying The Groundwork For New Innovations To Take Flight'
AWS VP, AI and Data stakes out territory in the history of computing innovation.
(Dr. Swami Sivasubramanian onstage at AWS re:Invent. Credit: AWS)
If Matt Garman's AWS re:Invent keynote told a story about the building blocks of generative AI, the follow-up from Dr. Swami Sivasubramanian, AWS VP, AI and Data, examined the data foundation more closely.
Dr. Swami Sivasubramanian grounded his AWS re:Invent day three keynote with an acknowledgement that great discoveries involve building out and integrating the achievements of others. It was a useful analogy, coming as it does at an inflection point for generative AI.
Each breakthrough builds on the last and sets the foundation for the next.
"We are reaching a tipping point with generative AI," Sivasubramanian said. "We saw an enormous opportunity to harness the convergence of big data, analytics, machine learning and now generative AI."
Sivasubramanian said that, in the last year, 140 new capabilities had been added to Amazon SageMaker's suite of machine learning services, then went on to announce four more - all aimed at providing greater flexibility in toolsets and reducing model training times, with a corresponding reduction in costs.
These new features are clearly aimed at organizations eyeing up popular models like Llama 3.1 or Mixtral 8x22B and wondering how long it will take to adapt them to their specific use cases. Together they amount to a quick-start process for gen-AI-curious organizations.
For organizations with far more limited machine learning expertise to draw on, Sivasubramanian announced a whole host of updates including further additions to the impressive range of proprietary and open-source models that can be accessed via Amazon Bedrock. This now includes Luma AI's Ray 2, Poolside's Malibu and Point, and Stability AI’s Stable Diffusion 3.5 Large.
Also included are the six new Amazon Nova models Andy Jassy unveiled yesterday. AWS customers can also get access to more than 100 emerging and specialized models through Bedrock.
Sivasubramanian was careful to make sure that clear use cases were presented for technical product updates. Prompt caching, for example, is obviously useful to law firms, or any organization with a great deal of repetition in its data, and consequently in its prompts. Caching reduces latency caused by repetition at scale.
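The mechanics can be sketched with a toy cache. This is an illustration of the concept only, not Amazon Bedrock's actual prompt-caching API; the class and names below are hypothetical:

```python
import hashlib

# Toy illustration of prompt caching (hypothetical code, not Bedrock's API):
# a long prompt prefix that repeats across requests -- e.g. a standard
# contract a law firm attaches to every query -- is processed once and
# then reused from the cache, avoiding repeated expensive prefill work.

class PromptCache:
    def __init__(self):
        self._cache = {}
        self.hits = 0
        self.misses = 0

    def _key(self, prefix: str) -> str:
        # Hash the prefix so identical boilerplate maps to the same entry.
        return hashlib.sha256(prefix.encode()).hexdigest()

    def process(self, prefix: str, question: str) -> str:
        key = self._key(prefix)
        if key in self._cache:
            self.hits += 1                      # prefix seen before: reuse it
            context = self._cache[key]
        else:
            self.misses += 1                    # first sighting: pay full cost
            context = f"processed({len(prefix)} chars)"  # stand-in for prefill
            self._cache[key] = context
        return f"{context} -> answer to {question!r}"

contract = "STANDARD SERVICES AGREEMENT ... " * 100  # repeated boilerplate
cache = PromptCache()
for q in ["termination clause?", "liability cap?", "governing law?"]:
    cache.process(contract, q)

print(cache.hits, cache.misses)  # the prefix is cached after the first call
```

Only the first request pays for processing the shared prefix; the two follow-up questions reuse the cached result, which is the latency saving the keynote described.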
There are also new features allowing unstructured multimodal data, which still constitutes most enterprise data, to be transformed into usable data for AI and analytics - automatically and from a single API.
Sivasubramanian dropped some impressive names of customers who were running generative AI applications within Bedrock, including Adobe, Argo Labs, BMW Group, Octus, Symbeo, Tenovos and Zendesk.
Other customers, including the software company Autodesk and mortgage and finance disruptor Rocket, spoke live. The convergence of data analytics and AI into the SageMaker Unified Studio product was demonstrated by AWS Technical Keynote Lead Shannon Kalisky.
Neatly returning to his theme of each generation building on the achievements of those that came before, Sivasubramanian announced the AWS Education Equity Initiative: both an attempt to secure the pipeline of AWS skills and an opportunity for those in underserved communities to access learning and development opportunities that they would otherwise struggle to secure.
Sivasubramanian concluded by staking out AWS’s place in the long history of computing innovation.
“From powerful tools for training foundation models at scale to gen AI assistants that are revolutionizing productivity, we are active participants in this historical moment of convergence, building upon the dreamers that came long before us. By paving the way for the next wave of technology pioneers, we are not just shaping our present, but we are also laying the groundwork for new innovations to take flight.”
This article originally appeared on our sister site Computing.