An introduction to generative AI with Swami Sivasubramanian
In the last few months, we've seen an explosion of interest in generative AI and the underlying technologies that make it possible. It has pervaded the collective consciousness for many, spurring discussions from board rooms to parent-teacher meetings. Consumers are using it, and businesses are trying to figure out how to harness its potential. But it didn't come out of nowhere: machine learning research goes back decades. In fact, machine learning is something that we've done well at Amazon for a very long time. It's used for personalization on the Amazon retail site, it's used to control robotics in our fulfillment centers, and it's used by Alexa to improve intent recognition and speech synthesis. Machine learning is in Amazon's DNA.
To get to where we are, it's taken a few key advances. First was the cloud. This is the keystone that provided the massive amounts of compute and data that are necessary for deep learning. Next were neural nets that could understand and learn from patterns. This unlocked complex algorithms, like the ones used for image recognition. Finally, the introduction of transformers. Unlike RNNs, which process inputs sequentially, transformers can process multiple sequences in parallel, which drastically speeds up training times and allows for the creation of larger, more accurate models that can understand human knowledge and do things like write poems or even debug code.
I recently sat down with an old friend of mine, Swami Sivasubramanian, who leads database, analytics, and machine learning services at AWS. He played a major role in building the original Dynamo and later bringing that NoSQL technology to the world through Amazon DynamoDB. During our conversation I learned a lot about the broad landscape of generative AI, what we're doing at Amazon to make large language and foundation models more accessible, and last, but not least, how custom silicon can help to bring down costs, speed up training, and increase energy efficiency.
We are still in the early days, but as Swami says, large language and foundation models are going to become a core part of every application in the coming years. I'm excited to see how builders use this technology to innovate and solve hard problems.
To think, it was more than 17 years ago, on his first day, that I gave Swami two simple tasks: 1/ help build a database that meets the scale and needs of Amazon; 2/ re-examine the data strategy for the company. He says it was an ambitious first meeting. But I think he's done a wonderful job.
If you'd like to read more about what Swami's teams have built, you can do so here. The entire transcript of our conversation is available below. Now, as always, go build!
This transcript has been lightly edited for flow and readability.
***
Werner Vogels: Swami, we go back a long time. Do you remember your first day at Amazon?
Swami Sivasubramanian: I still remember… it wasn't very common for PhD students to join Amazon at that time, because we were known as a retailer or an ecommerce site.
WV: We were building things and that's quite a departure for an academic. Definitely for a PhD student. To go from thinking, to actually, how do I build?
So you brought DynamoDB to the world, and quite a few other databases since then. But now, under your purview there's also AI and machine learning. So tell me, what does your world of AI look like?
SS: After building a bunch of these databases and analytic services, I got fascinated by AI because literally, AI and machine learning puts data to work.
If you look at machine learning technology itself, broadly, it's not necessarily new. In fact, some of the first papers on deep learning were written like 30 years ago. But even in those papers, they explicitly called out that for it to get large scale adoption, it required a massive amount of compute and a massive amount of data to actually succeed. And that's what cloud got us to, to actually unlock the power of deep learning technologies. Which led me, this is like 6 or 7 years ago, to start the machine learning organization, because we wanted to take machine learning, especially deep learning style technologies, from the hands of scientists to everyday developers.
WV: If you think about the early days of Amazon (the retailer), with similarities and recommendations and things like that, were they the same algorithms that we're seeing used today? That's a long time ago, almost 20 years.
SS: Machine learning has really gone through huge growth in the complexity of the algorithms and the applicability of use cases. Early on the algorithms were a lot simpler, like linear algorithms or gradient boosting.
The last decade, it was all around deep learning, which was essentially a step up in the ability for neural nets to actually understand and learn from the patterns, which is effectively where all the image-based or image processing algorithms come from. And then also, personalization with different kinds of neural nets and so forth. And that's what led to the invention of Alexa, which has a remarkable accuracy compared to others. The neural nets and deep learning have really been a step up. And the next big step up is what is happening today in machine learning.
WV: So a lot of the talk these days is around generative AI, large language models, foundation models. Tell me, why is that different from, let's say, the more task-based, like vision algorithms and things like that?
SS: If you take a step back and look at all these foundation models, large language models… these are big models, which are trained with hundreds of millions of parameters, if not billions. A parameter, just to give context, is like an internal variable that the ML algorithm must learn from its data set. Now to give a sense… what is this big thing that has suddenly happened?
A few things. One, transformers have been a big change. A transformer is a kind of neural net technology that is remarkably more scalable than previous versions like RNNs or various others. So what does this mean? Why did this suddenly lead to all this transformation? Because it is actually scalable and you can train them a lot faster, and now you can throw a lot of hardware and a lot of data [at them]. Now that means I can actually crawl the entire world wide web and actually feed it into these kinds of algorithms and start building models that can actually understand human knowledge.
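To make that contrast concrete, here is a minimal sketch in plain NumPy, not any production code: an RNN has to walk the sequence one token at a time, while self-attention scores every token against every other token in a single matrix product, which is what lets transformers be parallelized across hardware.

```python
# Illustrative sketch only: sequential RNN step vs. parallel attention step.
import numpy as np

def rnn_pass(tokens, W_h, W_x):
    """Sequential: step t cannot start until step t-1 has finished."""
    h = np.zeros(W_h.shape[0])
    for x in tokens:                 # inherently serial loop over the sequence
        h = np.tanh(W_h @ h + W_x @ x)
    return h

def attention_pass(tokens, W_q, W_k, W_v):
    """Parallel: all pairwise token interactions in one matrix product."""
    X = np.stack(tokens)             # (seq_len, d_model)
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
    return weights @ V               # every position updated at once
```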
WV: So the task-based models that we had before, and that we were already really good at, could you build them based on these foundation models? Task-specific models, do we still need them?
SS: The way to think about it is that the need for task-specific models is not going away. But what has essentially changed is how we go about building them. You still need a model to translate from one language to another or to generate code and so forth. But how easily you can now build them is essentially a big change, because with foundation models, which are trained on the entire corpus of knowledge… that's a huge amount of data. Now, it is simply a matter of actually building on top of this and fine tuning with specific examples.
Think about if you're running a recruiting firm, as an example, and you want to ingest all your resumes and store them in a format that is standard for you to search and index on. Instead of building a custom NLP model to do all that, now you can use foundation models with a few examples of an input resume in this format and here is the output resume. You can even fine tune these models by just giving a few specific examples. And then you essentially are good to go.
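A rough illustration of that resume example: with a foundation model, "building" the standardizer is mostly a matter of showing a few input/output pairs. The `call_foundation_model` helper below is hypothetical and stands in for whichever hosted model you would actually call; the examples and JSON format are made up.

```python
# Sketch: few-shot prompting a foundation model to standardize resumes,
# instead of training a custom NLP model from scratch.
FEW_SHOT_EXAMPLES = [
    {"input": "Jane Doe | 5 yrs Java, AWS | MIT 2014",
     "output": '{"name": "Jane Doe", "skills": ["Java", "AWS"], "education": "MIT, 2014"}'},
    {"input": "R. Smith - Python/ML engineer, Stanford BSc 2018",
     "output": '{"name": "R. Smith", "skills": ["Python", "ML"], "education": "Stanford, 2018"}'},
]

def build_prompt(new_resume: str) -> str:
    """Assemble a few-shot prompt: worked examples first, then the new case."""
    parts = ["Convert each resume into the standard JSON format."]
    for ex in FEW_SHOT_EXAMPLES:
        parts.append(f"Resume: {ex['input']}\nJSON: {ex['output']}")
    parts.append(f"Resume: {new_resume}\nJSON:")
    return "\n\n".join(parts)

# standardized = call_foundation_model(build_prompt(raw_resume_text))  # hypothetical helper
```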
WV: So in the past, most of the work probably went into labeling the data. I mean, that was also the hardest part, because that drives the accuracy.
SS: Exactly.
WV: So in this particular case, with these foundation models, labeling is no longer needed?
SS: Essentially. I mean, yes and no. As always with these things there is a nuance. But a majority of what makes these large scale models remarkable is that they actually can be trained on a lot of unlabeled data. You actually go through what I call a pre-training phase, which is essentially where you collect data sets from, let's say, the world wide web, like common crawl data or code data and various other data sets, Wikipedia, whatnot. And then you don't even label them, you kind of feed them as is. But you have to, of course, go through a sanitization step in terms of making sure you cleanse data of PII, or other stuff like negative things or hate speech and whatnot. Then you actually start training on a large number of hardware clusters. Because these models, to train them, can take tens of millions of dollars to actually go through that training. Finally, you get a notion of a model, and then you go through the next step of what is called inference.
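As a toy illustration of that sanitization step, a pre-training pipeline might filter or scrub documents before they ever reach the trainer. The regexes and blocklist below are placeholders; real pipelines use far more sophisticated PII detection and toxicity classifiers.

```python
# Sketch: drop or scrub raw web documents before pre-training.
import re

EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
PHONE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")
BLOCKLIST = {"some-hate-speech-term"}   # placeholder for a real toxicity classifier

def sanitize(document: str):
    """Exclude documents containing blocklisted terms; mask obvious PII."""
    if any(term in document.lower() for term in BLOCKLIST):
        return None                      # excluded from the training corpus
    document = EMAIL.sub("[EMAIL]", document)
    document = PHONE.sub("[PHONE]", document)
    return document

corpus = [d for d in (sanitize(doc) for doc in
          ["Contact me at jane@example.com", "Plain text"]) if d is not None]
print(corpus)
```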
WV: Let's take object detection in video. That would be a smaller model than what we see now with the foundation models. What's the cost of running a model like that? Because now, these models with hundreds of billions of parameters are very large.
SS: Yeah, that's a great question, because there is so much talk already happening around training these models, but very little talk on the cost of running these models to make predictions, which is inference. It's a signal that very few people are actually deploying them at runtime for actual production. But once they actually deploy in production, they will realize, oh no, these models are very, very expensive to run. And that is where a few important techniques actually really come into play. So, once you build these large models, to run them in production you need to do a few things to make them affordable to run at scale, and run in an economical fashion. I'll hit some of them. One is what we call quantization. The other one is what I call distillation, which is that you have these large teacher models, and even though they are trained on hundreds of billions of parameters, they are distilled to a smaller fine-grained model. I'm speaking in super abstract terms, but that is the essence of these models.
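To put rough numbers on why quantization matters, here is a back-of-the-envelope sketch: storing weights in int8 instead of fp32 cuts memory and bandwidth by roughly 4x. The figures are illustrative and not tied to any specific model.

```python
# Sketch: memory arithmetic plus simple symmetric post-training quantization.
import numpy as np

params = 100e9                                   # an illustrative 100B-parameter model
print(f"fp32: {params * 4 / 1e9:.0f} GB, int8: {params * 1 / 1e9:.0f} GB")

def quantize_int8(weights: np.ndarray):
    """Map a float weight tensor onto int8 with a single scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale                              # dequantize later as q * scale

w = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(w)
print("max abs quantization error:", np.abs(w - q * scale).max())
```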
WV: So we do build… we do have custom hardware to help out with this. Normally this is all GPU-based, which are expensive, energy-hungry beasts. Tell us what we can do with custom silicon that sort of makes it so much cheaper, both in terms of cost as well as, let's say, your carbon footprint.
SS: When it comes to custom silicon, as mentioned, the cost is becoming a big issue in these foundation models, because they are very, very expensive to train and very expensive, also, to run at scale. You can actually build a playground and test your chatbot at low scale and it may not be that big a deal. But once you start deploying at scale as part of your core business operation, these things add up.
In AWS, we did invest in our custom silicon: Trainium for training and Inferentia for inference. And all these things are ways for us to actually understand the essence of which operators are making, or are involved in making, these prediction decisions, and optimizing them at the core silicon level and software stack level.
WV: If cost is also a reflection of energy used, because in essence that's what you're paying for, you can also see that they are, from a sustainability point of view, much more important than running it on general purpose GPUs.
WV: So there's a lot of public interest in this recently. And it feels like hype. Is this something where we can see that this is a real foundation for future application development?
SS: First of all, we are living in very exciting times with machine learning. I have probably said this now every year, but this year it is even more special, because these large language models and foundation models truly can enable so many use cases where people don't have to staff separate teams to go build task-specific models. The speed of ML model development will really actually increase. But you won't get to that end state that you want in the next coming years unless we actually make these models more accessible to everybody. This is what we did with SageMaker early on with machine learning, and that's what we need to do with Bedrock and all its applications as well.
But we do think that while the hype cycle will subside, like with any technology, these are going to become a core part of every application in the coming years. And they will be done in a grounded way, but in a responsible fashion too, because there is a lot more stuff that people need to think through in a generative AI context. What kind of data did it learn from, to actually, what response does it generate? How truthful is it as well? This is the stuff we are excited to actually help our customers [with].
WV: So when you say that this is the most exciting time in machine learning, what are you going to say next year?