An introduction to generative AI with Swami Sivasubramanian – All Things Distributed
In the last few months, we've seen an explosion of interest in generative AI and the underlying technologies that make it possible. It has pervaded the collective consciousness for many, spurring discussions from board rooms to parent-teacher meetings. Consumers are using it, and businesses are trying to figure out how to harness its potential. But it didn't come out of nowhere: machine learning research goes back decades. In fact, machine learning is something that we've done well at Amazon for a very long time. It's used for personalization on the Amazon retail site, it's used to control robotics in our fulfillment centers, and it's used by Alexa to improve intent recognition and speech synthesis. Machine learning is in Amazon's DNA.
To get to where we are, it's taken a few key advances. First was the cloud. This is the keystone that provided the massive amounts of compute and data that are necessary for deep learning. Next were neural nets that could understand and learn from patterns. This unlocked complex algorithms, like the ones used for image recognition. Finally, the introduction of transformers. Unlike RNNs, which process inputs sequentially, transformers can process multiple sequences in parallel, which drastically speeds up training times and allows for the creation of larger, more accurate models that can understand human knowledge and do things like write poems or even debug code.
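To make that parallelism concrete, here is a minimal sketch in PyTorch (toy dimensions, chosen only for illustration) contrasting an RNN's step-by-step loop with a self-attention step that touches every position of the sequence at once.

```python
import torch
import torch.nn.functional as F

batch, seq_len, d_model = 2, 8, 16
x = torch.randn(batch, seq_len, d_model)

# RNN-style processing: each step depends on the previous hidden state,
# so the time dimension has to be walked sequentially.
rnn = torch.nn.RNNCell(d_model, d_model)
h = torch.zeros(batch, d_model)
for t in range(seq_len):
    h = rnn(x[:, t, :], h)

# Self-attention: queries, keys, and values for *all* positions are
# computed with batched matrix multiplies, so the whole sequence is
# processed in parallel on the accelerator.
q_proj = torch.nn.Linear(d_model, d_model)
k_proj = torch.nn.Linear(d_model, d_model)
v_proj = torch.nn.Linear(d_model, d_model)

q, k, v = q_proj(x), k_proj(x), v_proj(x)
scores = q @ k.transpose(-2, -1) / d_model ** 0.5   # (batch, seq, seq)
attn = F.softmax(scores, dim=-1) @ v                 # (batch, seq, d_model)
```

The attention step is just a handful of batched matrix multiplies, which is exactly the kind of work accelerators parallelize well.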
I recently sat down with an old friend of mine, Swami Sivasubramanian, who leads database, analytics, and machine learning services at AWS. He played a major role in building the original Dynamo and later bringing that NoSQL technology to the world through Amazon DynamoDB. During our conversation I learned a lot about the broad landscape of generative AI, what we're doing at Amazon to make large language and foundation models more accessible, and last, but not least, how custom silicon can help to bring down costs, speed up training, and increase energy efficiency.
We are still in the early days, but as Swami says, large language and foundation models are going to become a core part of every application in the coming years. I'm excited to see how builders use this technology to innovate and solve hard problems.
To think, it was more than 17 years ago, on his first day, that I gave Swami two simple tasks: 1/ help build a database that meets the scale and needs of Amazon; 2/ re-examine the data strategy for the company. He says it was an ambitious first meeting. But I think he's done a wonderful job.
If you'd like to read more about what Swami's teams have built, you can do so here. The entire transcript of our conversation is available below. Now, as always, go build!
This transcript has been lightly edited for flow and readability.
***
Werner Vogels: Swami, we go back a long time. Do you remember your first day at Amazon?
Swami Sivasubramanian: I still remember… it wasn't very common for PhD students to join Amazon at that time, because we were known as a retailer or an ecommerce site.
WV: We were building things and that's quite a departure for an academic. Definitely for a PhD student. To go from thinking, to actually, how do I build?
So you brought DynamoDB to the world, and quite a few other databases since then. But now, under your purview there's also AI and machine learning. So tell me, what does your world of AI look like?
SS: After building a bunch of these databases and analytic services, I got fascinated by AI because literally, AI and machine learning puts data to work.
If you look at machine learning technology itself, broadly, it's not necessarily new. In fact, some of the first papers on deep learning were written like 30 years ago. But even in those papers, they explicitly called out that for it to get large-scale adoption, it required a massive amount of compute and a massive amount of data to actually succeed. And that's what cloud got us to: to actually unlock the power of deep learning technologies. Which led me, this is like 6 or 7 years ago, to start the machine learning organization, because we wanted to take machine learning, especially deep learning style technologies, from the hands of scientists to everyday developers.
WV: If you think about the early days of Amazon (the retailer), with similarities and recommendations and things like that, were they the same algorithms that we're seeing used today? That's a long time ago, almost 20 years.
SS: Machine learning has really gone through huge growth in the complexity of the algorithms and the applicability of use cases. Early on the algorithms were a lot simpler, like linear algorithms or gradient boosting.
The last decade, it was all around deep learning, which was essentially a step up in the ability for neural nets to actually understand and learn from the patterns, which is effectively where all the image-based or image-processing algorithms come from. And then also, personalization with different kinds of neural nets and so forth. And that's what led to the invention of Alexa, which has a remarkable accuracy compared to others. The neural nets and deep learning have really been a step up. And the next big step up is what is happening today in machine learning.
WV: So a lot of the talk these days is around generative AI, large language models, foundation models. Tell me, why is that different from, let's say, the more task-based, like vision algorithms and things like that?
SS: If you take a step back and look at all these foundation models, large language models… these are big models, which are trained with hundreds of millions of parameters, if not billions. A parameter, just to give context, is like an internal variable that the ML algorithm must learn from its data set. Now to give a sense… what is this big thing that has suddenly happened?
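As a rough illustration of what a parameter is, here is a tiny PyTorch sketch: every learned weight and bias in the model counts toward that total, and foundation models simply have billions of them.

```python
import torch.nn as nn

# A tiny two-layer network; real foundation models follow the same idea,
# just with billions of these learned weights.
model = nn.Sequential(
    nn.Linear(512, 2048),
    nn.ReLU(),
    nn.Linear(2048, 512),
)

# Each weight and bias is a "parameter": an internal variable that the
# training process adjusts to fit the data.
total = sum(p.numel() for p in model.parameters())
print(f"{total:,} parameters")  # roughly 2.1 million for this toy model
```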
A few things. One, transformers have been a big change. A transformer is a kind of neural net technology that is remarkably more scalable than previous versions like RNNs or various others. So what does this mean? Why did this suddenly lead to all this transformation? Because it is actually scalable and you can train them a lot faster, and now you can throw a lot of hardware and a lot of data [at them]. Now that means I can actually crawl the entire world wide web and actually feed it into these kinds of algorithms and start building models that can actually understand human knowledge.
WV: So the task-based models that we had before, and that we were already really good at, could you build them based on these foundation models? Task-specific models, do we still need them?
SS: The way to think about it is that the need for task-specific models is not going away. But what essentially changes is how we go about building them. You still need a model to translate from one language to another or to generate code and so forth. But how easily you can now build them is essentially a big change, because with foundation models, which are trained on the entire corpus of knowledge… that's a huge amount of data. Now, it is simply a matter of actually building on top of this and fine-tuning with specific examples.
Think about if you're running a recruiting firm, as an example, and you want to ingest all your resumes and store them in a format that is standard for you to search and index on. Instead of building a custom NLP model to do all that, now, using foundation models, you give a few examples: here is an input resume in this format, and here is the output resume. Now you can even fine-tune these models by just giving a few specific examples. And then you essentially are good to go.
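To make the resume example a bit more concrete, one common pattern is to express those few examples as input/output pairs and hand them to a fine-tuning or few-shot prompting workflow. The sketch below is illustrative only; the field names and JSONL layout are assumptions, not any specific service's API.

```python
import json

# A handful of input/output pairs: raw resume text in, the firm's
# standard structured format out. With a foundation model, a few dozen
# examples like this is often enough for fine-tuning or few-shot prompting.
examples = [
    {
        "prompt": "Extract name, skills, and years of experience:\n"
                  "Jane Doe. 7 years building data pipelines in Python and Spark.",
        "completion": json.dumps({
            "name": "Jane Doe",
            "skills": ["Python", "Spark", "data pipelines"],
            "years_experience": 7,
        }),
    },
    # ... more examples in the same shape
]

# Serialize as JSONL, a format many fine-tuning workflows accept.
with open("resume_examples.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```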
WV: So in the past, most of the work probably went into labeling the data. I mean, that was also the hardest part because that drives the accuracy.
SS: Exactly.
WV: So in this particular case, with these foundation models, labeling is no longer needed?
SS: Essentially. I mean, yes and no. As always with these things there is a nuance. But a majority of what makes these large-scale models remarkable is that they actually can be trained on a lot of unlabeled data. You actually go through what I call a pre-training phase, which is essentially you collect data sets from, let's say, the world wide web, like common crawl data or code data and various other data sets, Wikipedia, whatnot. And then actually, you don't even label them, you kind of feed them as is. But you have to, of course, go through a sanitization step in terms of making sure you cleanse the data of PII, and actually all other stuff like negative things or hate speech and whatnot. Then you actually start training on a large number of hardware clusters. Because these models, to train them can take tens of millions of dollars to actually go through that training. Finally, you get a notion of a model, and then you go through the next step of what is called inference.
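The sanitization step can be pictured as a filtering pass that runs before any document reaches the training cluster. The sketch below is deliberately simplified, using a regex scrub and a placeholder block list; real pipelines rely on far more sophisticated PII detectors and content classifiers.

```python
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")
BLOCKED_TERMS = {"example-blocked-term"}  # placeholder for a real content classifier

def sanitize(doc: str) -> str | None:
    """Scrub obvious PII and drop documents that trip the block list."""
    if any(term in doc.lower() for term in BLOCKED_TERMS):
        return None  # drop the document entirely
    doc = EMAIL.sub("[EMAIL]", doc)
    doc = PHONE.sub("[PHONE]", doc)
    return doc

corpus = ["Contact me at jane@example.com or 555-123-4567 about the role."]
cleaned = [d for d in (sanitize(doc) for doc in corpus) if d is not None]
```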
WV: Let's take object detection in video. That would be a smaller model than what we see now with the foundation models. What's the cost of running a model like that? Because now, these models with hundreds of billions of parameters are very large.
SS: Yeah, that's a great question, because there is so much talk already happening around training these models, but very little talk on the cost of running these models to make predictions, which is inference. It's a signal that very few people are actually deploying it at runtime for actual production. But once they actually deploy in production, they will realize, oh no, these models are very, very expensive to run. And that is where a few important techniques actually really come into play. So one, once you build these large models, to run them in production you need to do a few things to make them affordable to run at scale, and run in an economical fashion. I'll hit some of them. One is what we call quantization. The other one is what I call distillation, which is that you have these large teacher models, and even though they are trained on hundreds of billions of parameters, they are distilled to a smaller, fine-grained model. I'm speaking in super abstract terms, but that is the essence of these models.
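For a sense of what those two techniques look like in code: quantization stores weights at lower precision so inference is cheaper, and distillation trains a small student model to match a large teacher's output distribution. Here is a minimal PyTorch sketch of both, with toy models standing in for real LLMs.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(128, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Quantization: convert the student's Linear layers to int8 for cheaper inference.
quantized_student = torch.ao.quantization.quantize_dynamic(
    student, {nn.Linear}, dtype=torch.qint8
)

# Distillation: the student learns to match the teacher's output distribution.
x = torch.randn(32, 128)
temperature = 2.0
with torch.no_grad():
    teacher_probs = F.softmax(teacher(x) / temperature, dim=-1)
student_log_probs = F.log_softmax(student(x) / temperature, dim=-1)
distill_loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
distill_loss.backward()  # in a real training loop this feeds an optimizer step
```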
WV: So we do build… we do have custom hardware to help out with this. Normally this is all GPU-based, which are expensive, energy-hungry beasts. Tell us what we can do with custom silicon that sort of makes it so much cheaper, both in terms of cost as well as, let's say, your carbon footprint.
SS: When it comes to custom silicon, as mentioned, the cost is becoming a big issue in these foundation models, because they are very, very expensive to train and very expensive, also, to run at scale. You can actually build a playground and test your chatbot at low scale and it may not be that big a deal. But once you start deploying at scale as part of your core business operation, these things add up.
At AWS, we did invest in our custom silicon: Trainium for training and Inferentia for inference. And all these things are ways for us to actually understand the essence of which operators are making, or are involved in making, these prediction decisions, and optimizing them at the core silicon level and software stack level.
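From a developer's seat, targeting this custom silicon typically means compiling the model ahead of time with the AWS Neuron SDK. The sketch below assumes the torch-neuronx package on a Neuron-enabled (Inferentia/Trainium) instance; exact APIs and options vary by SDK version.

```python
import torch
import torch_neuronx  # AWS Neuron SDK integration for PyTorch

model = torch.nn.Sequential(
    torch.nn.Linear(256, 256), torch.nn.ReLU(), torch.nn.Linear(256, 10)
).eval()

example_input = torch.randn(1, 256)

# Ahead-of-time compile the model graph for the Neuron cores; the compiler
# fuses and schedules operators for the custom silicon.
neuron_model = torch_neuronx.trace(model, example_input)

# The compiled artifact is then used for inference like a normal module.
output = neuron_model(example_input)
```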
WV: If cost is also a reflection of energy used, because in essence that's what you're paying for, you can also see that they are, from a sustainability point of view, much more important than running it on general-purpose GPUs.
WV: So there's a lot of public interest in this recently. And it feels like hype. Is this something where we can see that this is a real foundation for future application development?
SS: First of all, we are living in very exciting times with machine learning. I have probably said this now every year, but this year it is even more special, because these large language models and foundation models truly can enable so many use cases where people don't have to staff separate teams to go build task-specific models. The speed of ML model development will really actually increase. But you won't get to that end state that you want in the coming years unless we actually make these models more accessible to everybody. This is what we did with SageMaker early on with machine learning, and that's what we need to do with Bedrock and all its applications as well.
But we do think that, while the hype cycle will subside like with any technology, these are going to become a core part of every application in the coming years. And they will be done in a grounded way, but in a responsible fashion too, because there is a lot more stuff that people need to think through in a generative AI context. What kind of data did it learn from, to actually, what response does it generate? How truthful is it as well? This is the stuff we are excited to actually help our customers [with].
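For readers curious what that accessibility looks like in practice, here is a rough sketch of calling a foundation model through Amazon Bedrock's runtime API with boto3. The model ID and request body shown are examples only; each model provider on Bedrock defines its own payload shape, so check the documentation for the model you choose.

```python
import json
import boto3

# Bedrock exposes foundation models behind a single managed runtime API.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",  # example model ID
    body=json.dumps({"inputText": "Summarize the benefits of managed ML services."}),
)

# The response body is a stream; read and parse it as JSON.
print(json.loads(response["body"].read()))
```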
WV: So when you say that this is the most exciting time in machine learning, what are you going to say next year?