Amid the breathless coverage and relentless AI hype of recent years, one of the world’s biggest tech companies—Amazon—has been notably absent.
Matt Garman, the CEO of Amazon Web Services, is looking to change that. At the recent AWS re:Invent conference, Garman announced a bunch of frontier AI models, as well as a tool designed to let AWS customers build models of their own. That tool, Nova Forge, allows companies to engage in what’s known as custom pretraining—adding their data in the process of building a base model—which should allow for vastly more customized models that suit a given company’s needs. Sure, it doesn’t quite have the sexiness of a Sora 2 announcement, but that’s not Garman’s goal: He’s less interested in broad consumer use of AI and more interested in enterprise solutions that’ll integrate AI into all of AWS’s offerings—and have a material impact on a corporate P&L.
For this week’s episode of The Big Interview, I caught up with Garman after AWS re:Invent to talk about what the company announced, whether he feels behind in the AI race, how he thinks about managing huge teams (and managing internal dissent), and why he’s not convinced that AI is (or should be) the great job thief of our era. Here’s our conversation.
This interview has been edited for length and clarity.
KATIE DRUMMOND: Matt Garman, welcome to the Big Interview.
MATT GARMAN: Thank you. Thanks for having me.
We always start these conversations with some very quick questions, like a warmup. Are you ready?
Go ahead. Fire away.
If AWS had a mascot, what would it be?
We have a big S3 bucket sometimes that goes around, so we'll call it that.
Sorry, what is an S3 bucket?
An S3 bucket is like a thing that you store your S3 objects in, but we actually have a large foam bucket that walks around and actually looks like a paint bucket.
So you do have a mascot.
Well, S3 has a bucket, it has a mascot. It's probably the closest we have, and I like it.
What's the most expensive mistake you've ever made?
Personally or professionally? That’s a good question. Personally, the most expensive mistake I ever made was playing basketball too long and I tore my Achilles. So that cost me about nine months of being able to walk. I probably should have known that into my thirties I was well past basketball-playing age. I lost a little bit of time there.
That sounds personally expensive.
Yes.
If you could rename the cloud today, what would you call it?
What is it called today?
The cloud.
I actually think the cloud is a pretty good name. So I don't know if I would rename it. I would rename what we called Amazon AWS, or Amazon Web Services. Now, no one knows what web services are. So that, I might rename a little bit, but I actually like the name of the cloud, so I'm not sure I would rename it.
Maybe Amazon Cloud Services.
Yeah, maybe.
What's the part of your job you would love to outsource to an AI agent?
I try to outsource a lot of the parts of my job to AI agents, if I can. But I find that I still haven't figured out how to outsource answering my day-to-day emails yet. That still, I find, takes up a lot of my time. If I could figure that out, I think that would be great.
But it sounds like you've tried, and I know we're supposed to be doing quick questions, but I was going to ask about this later. Tell me a bit about how you've tried to incorporate artificial intelligence into your workflow, into your personal and professional life.
There's a number of ways that I've done it. In particular for me, though, a lot of the benefits that I get in my job are taking a lot of information inputs and then kind of sharing those out to either the same or different people and connecting a lot of those dots.
For my particular role, I haven't yet found a huge shortcut for being able to do that, particularly with regards to the medium in which we communicate, like email or other places like that.
Because I find that all of the shortcuts lose some of that nuance. There are some summaries and things that work. There are definitely some tools that allow me to summarize content more quickly or learn new content more quickly, which I find to be super useful. But I haven't found a huge time win for my role in particular, where there's not as much repetitive work or other things like that.
It mostly is knowledge that I'm trying to get from a bunch of sources and then consolidate to send out to others.
That's actually very interesting to me, and reassuring in a way, because I sometimes feel like I should have found a bunch of shortcuts by now.
So, from one former AWS intern—that's you—to a future one, what is your best piece of advice?
I find that people always overestimate how much technology has already been invented and think that there's nothing left to do. What I find is that we continue to be at the early stages of evolution. As long as you're curious and looking and willing to try new technologies, new areas, we're always at that early stage of what can be invented.
Sometimes I run into interns who are like, “When you started AWS it was small, but now it's a big company and there's not the same opportunity.” And I'd say that that's just not true. I think there's just as much, if not more, opportunity than there ever has been.
Which leads me to my next question: How do people know when you're unimpressed?
When I'm unimpressed, I would tell them directly.
Fair. So, direct feedback.
Largely, though, it's not about impressing me. I like when people are thoughtful, when they've come up with the right sets of decisions. It's not like people are coming up with something that's super new every day. I like it when people have done the work so that we have all the information, whether it's customer input or data input or sales input or whatever, so that as a group we can make thoughtful decisions.
I’m impressed when people have done that work ahead of time so that when we do get together, we can make good decisions, and thoughtful decisions. I'll often tell people it's not necessarily the proposal that I'm going to be impressed by, but I want us to have all the information so that as a team, we can move forward and make great decisions.
What is a hobby you wish you had more time for? I'm assuming it's not basketball, based on what we just learned about you.
It's not; I've switched to golf. I caught the bug a couple years ago and quite love playing golf. I don't get to play as much, but I enjoy it.
Let me set the stage a little bit. We're here to talk, in particular, about a bunch of announcements that you recently made about AWS and AI and agentic AI that WIRED covered. I'm biased, but I thought we did a great job with that coverage. But I also want to learn a little about you in a professional context. So tell me, and tell all of us, about your career journey thus far. Now you’re the CEO of AWS, but you had a very long career at Amazon before that. How did you end up where you are now?
I'll start at the beginning. I worked for a couple of startups early on in my career. None of them did particularly well, but I learned a ton from them, which was great. After my second startup, my wife and I both quit our jobs and went to business school, which was a wonderful opportunity.
When I was there, one of the things during my internship that I wanted to try to explore was what entrepreneurship looked like inside of a company. My goal was always to go back and do a startup again. So as part of that, I looked at a bunch of different companies and I ran across Amazon and actually talked to [president and CEO] Andy Jassy, and they talked about building this technology services capability inside of Amazon. I thought that was exactly what I wanted to see. I wanted to see what it would look like for a successful technology company to try to build something new inside of it, because I just wanted to see what that journey looked like and how I could learn from experienced entrepreneurs and what was different than at a startup.
Got it.
So I did my internship for what turned into AWS in 2005 before we launched. I was fascinated by it. I thought it was an awesome opportunity, and I said, “Great, I wanna come back and work here for a couple years,” and then I would go back and do a startup.
So then I started full-time in 2006, effectively as the product manager for all of AWS. It was mostly defining all of the services as we launched them. So I started a couple weeks after S3 launched, and before the rest of our services launched, and helped launch them and name them and price them and do a bunch of things.
Then I kept getting more and more responsibility. I focused on EC2, which was our compute service. I started taking on engineering teams, actually launched our block storage service, kind of wrote the PR/FAQ for that and hired the first engineering team and launched that.
Then I grew to lead most of our core compute and networking and storage product areas. So all of the product and engineering teams for that. It was fun. I got to learn a lot along the way. I'm not necessarily, or I wasn't originally, a deep technology person, but I got to learn about hypervisors and kernel engineering and a bunch of these really low-level core technology pieces, which was cool. It was super interesting. It's just such a fascinating space, and as AWS grew really quickly, we grew along with it, and we grew the team pretty significantly.
We were fortunate enough to work with some of the best technology people in the world as we built the service, and a bunch of services, as the business grew. I can't remember the exact time, but it was after about 12 or 13 years, so it would've been like 2019, something like that, Andy Jassy called me into his office one day and asked if I would lead sales and marketing. I literally didn't know anything about sales and marketing. In fact, I was kind of looking around to see if he was talking to someone else. But it was a great opportunity to learn that space, and it was a unique opportunity, right?
Yes.
I was basically handed one of the world's two or three biggest enterprise sales and marketing organizations, having never done any of those jobs before. I think Amazon's a bit unique in that we trust people. If you're smart, you know how to operate, you know the business, you don't necessarily have to know the exact thing that you're going into. The team was gracious and helped me learn that. That was a great opportunity to get to learn how to run a field organization at scale and get to spend a ton more time with customers, to really understand the nuances of what a startup customer was looking for versus enterprise, versus the governments, which was awesome.
Then I took over as CEO two years ago, or a year and a half ago. So I've spent about 20 years here at Amazon, all in AWS.
How big is the organization by employee size?
I don't know the exact numbers, but it's in the hundreds of thousands. We have large data centers. We run a very large operational organization. We have data centers all around the world. So it's a pretty big team.
I love management, and it was clear fairly early in my career that that is where I wanted to go. Was that always clear for you, the idea that you wanted to be taking on more and more of the enterprise? That you were doing what you were meant to be doing?
Yes, I like it and I think that I'm fairly good at it, I guess.
We’ll ask some of your employees.
I took on more and more as it went. You can ask others. I guess that's not for me to say, but I liked it. You know, it was a ton of learning. What I really love is when I get to take on roles where I get to learn more, and get to stretch myself.
Frankly, I love building and love having an impact on what the business is doing and what our customers are doing. It's hard because you don't get some of the joy of actually physically getting to build the thing or deliver the thing yourself.
In my world, that’s like not getting to write the story.
That's right. But you can write a lot more stories in that case. And you learn to do that through others, which is an interesting and useful skill. You learn how to communicate to teams, first through direct management, then through layers of management.
So you gotta think about how you build mechanisms. This is one of the things I enjoyed learning, which is how do you think about building mechanisms that allow you to help those individual contributors make some of the right kinds of decisions, and the strategy that you're trying to drive for the team or the business or the company? I've quite enjoyed learning how to leverage some of those mechanisms at different scales. That's been fun to do, too.
You just talked about going from managing people to managing managers, and I swear I'm asking for a friend, but do you have any particular mechanisms that you have picked up in those 20 years that stand out to you as particularly effective strategies? Because I will say there is something very specifically different about managing someone who then manages people who then manage teams of people.
Yeah.
It has the potential to be a very unproductive game of telephone. What mechanisms do you employ to make it much more effective than that?
I mean, look, everybody has their own way of doing this. So you a little bit have to find what works, and I do think that as your team in your organization gets larger, you have to change some of those things. I think that's one of the common pitfalls that I see people fall into is that they will assume that the thing that worked great when they were a line manager, managing a team of six to 10 people, will work the same as when they're managing a team of a hundred people.
Those same things won't work. There's another shift that's like when you don't know the name of everyone in your team. Or your organization, which unfortunately I can't today. Usually that breaks somewhere around a hundred to 200 people, somewhere in there.
Right.
You'll run into people who are in your team that you don't know or don't know their names. You just have to think about all of those things differently. You want to give the broadest set of people in your team as possible mental models on how you would make decisions in their place as opposed to what the decision actually is.
Because if you send down edicts all day—like, we're going to do this, or you wanna do this, or I'm going to make a decision on that—it's not as empowering to your team. And there's a chance that whatever you say gets miscommunicated. But if you can give people mental models of, for example, “If I was going to approach this situation, this is how I would think about it, or these are the ways in which I would make trade-offs,” then you can empower tens of thousands of people to make decisions.
Then you can focus on making sure that you hire smart people and don't punish them when they make bad decisions, but rather course-correct. That's how I've kind of learned at scale, where your actual leverage points are ensuring that you have those right mechanisms in place.
I appreciate the free management advice. I'm sure many of our listeners do, too, so thank you.
I want to now ask you about AWS in a broad context. I would describe WIRED's audience as kind of curious generalists. Obviously, they're listening to this, or reading this, because they're interested in technology and where it's taking the world. They probably have some sense of what AWS is. But they probably don't know how big AWS is and how critical it is in terms of just infrastructure. Can you explain it to us?
Not like we're five, but like we're 21, we just graduated with a humanities degree, trying to understand this thing that you are in charge of.
At a high level, the idea behind AWS hasn't changed in the past 20 years, which is that there are a bunch of pieces of technology that were really non-differentiating that companies had to do on their own for a long time. It used to be that they had to build a data center. They had to go find servers, they had to take care of the servers. When a disk drive broke, they had to go fix it. They had to set up their networks, et cetera. There's a whole bunch of work that they had to do before they could ever write a cool application that would let them stream a movie or connect individuals with people who had rooms they wanted to rent out.
When you think about all of that work, our goal was: What if we could do that work for companies so that they didn't have to do that? That was the thesis when we started. So that someone could come and say, “Great, give me servers, give me storage, give me databases,” and we'd provision it for them. Then, through an internet connection, they could have access to that. It turns out that was a very powerful idea. I think we did a good job executing on some of the abstractions that made it really powerful.
Netflix and Airbnb and Pinterest are all some of these early customers that we had that built their business from the beginning on AWS and the cloud. They don't own data centers, they mostly just run inside of AWS. Initially, when we first launched the business, we thought this was going to be incredibly compelling for startups and some technology companies, which it was, but as we grew, we found out that enterprises and really large organizations were just as compelled by this value proposition, and we had to build a lot more capabilities for them. Whether it was like encryption capabilities or audit logs or abilities to hit particular compliance things or whatever it is.
Got it.
But now we have customers like Pfizer and JP Morgan, and the United States government and the intelligence agencies. That was a big win for us when we kind of convinced the US government that we could build a top-secret region and that they could run intelligence workloads inside of AWS.
I would've loved to have been a fly on the wall in those meetings. What does it take to convince the United States government that they should run on the back of AWS?
We walked through some architectural questions that they had, and … you know, there was a whole [request for proposal] process and there's a lot of work that we did, but it was also just some whiteboarding where we kind of walked through how it would work, and at the end of the day, the cloud sounds like it's a magical technology, but it's data centers and it's networks and it's servers, and other things that we run at very high reliability, at very high security.
We found some forward-leaning technology folks that wanted to figure out how they could get the benefits to the government. It turns out that if you can find the right person in an organization, even somewhere as large and bureaucratic as the US government, you’ll find people who want to lean forward, go faster—they want to deliver value for citizens. So, finding that right person and then being able to collaborate with them, their eyes light up just like they do for a startup company, right?
These people are oftentimes very mission-driven, and if they can see an opportunity to deliver more value for the mission that they're on they're super excited about that.
So today, now it's across about every country and every industry that you think about throughout your daily life. There's still lots more to go, but I would say there's a bunch of companies that now use AWS as their core infrastructure. NASDAQ trading markets run on AWS; financial services companies run on AWS; hospitals run on AWS; media and entertainment—live broadcasting or streaming—we power a lot of that technology.
Just when you thought you were done learning the job, along comes artificial intelligence. Which brings me to my next question. Obviously, we are in this new era for you and your organization. You held this re:Invent conference in December, and you streamed it on Fortnite, I might add …
Yep.
You announced some pretty significant changes to AWS, to your mission, to your priorities, and to what you would be offering to your consumers. I'm hoping you can talk us through that transformation.
Technology is always iterating and going through transformations. So for us staying at the forefront of every innovation is incredibly important. I think there's been almost no technology leap since maybe the cloud, and the internet before that, [as big as AI]. So we’ve been investing in AI, in AWS and Amazon, for the past decade-plus, two decades maybe. But definitely with the leap forward from generative AI over the past three years, we've just seen a massive change in what's possible for customers.
So we've had this vision that it's not just going to be, AI is over on one side and then the rest of your business is going to be over on the other side. Basically AI is going to be built into what everyone does. In order for that to happen, number one is you have to have all of your data in the cloud. And we’ve built this whole platform of tools that then allow you, once you have that data in the cloud, to deliver differentiated value to your customers.
Some of the things that we launched—that I'm quite excited about—at re:Invent are really about AI agents and the difference between the first generation of AI tools [and the next]. The first generation were really about summarization and content creation, right? We were all quite excited and got a lot of value out of those. But there's only so far that goes. I think the next phase is these agents. The real value is they can get access to your data—they can still do some summarization and content creation, but they can actually execute tasks. They're able to reason.
When you have these agents that can reason and execute tasks on your behalf, all of a sudden you can kind of force-multiply what you're able to do. So we launched a couple of things. One was called Nova Forge, where we let customers actually take their data and integrate it in at the early stages of training one of these frontier models. So that gives enterprises one of these AI models that deeply understands their data and their domain. Then we launched a number of these frontier agents that allow customers to really deliver large bodies of work, whether it's in coding or operations or security.
We've spent the time to really build this platform where it could actually deliver value. I think broadly people have understood what that long-term strategy was. They really get it. As they see projects really delivering, where there's real value, they see that AWS is the platform where they want to go do that. That's what customers were telling us over and over again [at re:Invent].
I feel like 2025—which we are, thank God, almost done with—was a confusing year in the narrative for AI. I say that because I feel like in January I went to some conferences, talked to some people, and the sentiment was “This is the year of the agent. Agentic AI is here. It's about to change everything.” That was about a year ago.
Yeah.
Am I using agentic AI right now? Absolutely not. Has that come to fruition in the way people were talking about in January? No. At the same time, you’ve seen reports from MIT, for example, finding that 95 percent of gen AI pilots in companies are failing to yield the productivity gains that I think those corporate leaders thought they would see.
How do you make sense of all of those narratives coming together? Were companies just moving too quickly?
If you jump ahead and don't build that strong foundation of I have my data, I know the workflows, I really know how some of these things are going to tie together, then you're not going to get any value out of it.
You're going to have just a chatbot, which is kind of cool and looks neat. Then everybody has a chatbot and then what? So it is those differentiated workflows, you know, and they're less sexy and interesting for people to look at, but a workflow that can help you automate insurance claim processing is super valuable, right? You can actually get people’s claims processed faster, make sure that you cut your costs, have a better customer experience, make sure you have better accuracy—you can deliver all these things.
Some of the technologies weren't available in January. This technology is moving so fast that the capabilities are much better today.
I will tell you, at re:Invent I sat in a room of executives from a wide set of companies, and I asked for a show of hands of who is either now starting to see positive ROI on their AI investments or see a clear path to meaningful positive ROI in the next six months. I think 90 percent of the hands went up.
Hmm.
That's not the answer I would've gotten a year ago, but it's because we've done a bunch of this work, and it's because we've done a bunch of this work together with customers to understand exactly what they want, how do we solve their problems, and how do we deliver solutions that deliver them real value and not just, you know, clickbait headlines that sound good.
On that note, I will say candidly, Amazon has not been a central part of a lot of these AI narratives, right? There have been other companies that have been out there making announcements. I can tell you, leading WIRED, it's this constant stream of news, “We're doing this, we're doing that.” Does that worry you? Do you worry about Amazon not being in that narrative, or do you feel like you took your time?
I think both of those things are true. I do worry about it, because I don't want customers to think we're not innovating or driving the latest technologies they need. I want to make sure that it's not just headline-grabbing stuff and it's actually great.
Jeff Bezos used to have a saying that you have to be willing to be misunderstood for long periods of time. For us, I think that's what some of the past two years entailed.
I think a lot of that narrative has changed now. If you talk to lots of analysts, if you talk to folks in the press, if I talk to customers they're saying, “Look, actually AWS has by far the strongest agentic platform to go build on. They have the broadest set of models that I can build on. They have the broadest set of data controls and compliance controls that actually, if I go put these agents in production, I can actually audit, know what they're doing, control what they're doing.” That is what we're seeing.
That’s not to take anything away from [other AI tools]. ChatGPT is an incredible consumer application, but it's just a different thing. That's not our business. Our business is to make sure that banks and health care companies and media and entertainment companies and energy companies can drive their businesses and deliver more outcomes for their customers or cut costs or whatever they want to do.
So, I'm quite pleased with where we are now, and I think we've already seen that narrative mostly shift.
I wanted to inquire you a small spot much astir Nova Forge. What was peculiarly absorbing to maine present is this thought of customized pretraining, arsenic opposed to the thought of good tuning, arsenic a mode that a institution tin instrumentality a exemplary and truly marque it their own. Can you explicate that favoritism to everybody earlier I excavation successful a small more?
It's not caller that radical person thought, “OK, there’s these out-of-the-box models that I wanna customize,” and the champion mechanics that we've had to day to customize are these open-weights models. Meta was large and released that with their archetypal Llama model, but present we've seen a assortment of these, whether it's Mistral oregon DeepSeek oregon Qwen oregon whatever.
There's a fig of these, but they're inactive achromatic boxes, right? You fundamentally get a afloat pretrained model, and past they unfastened the weights and you tin tune the weights to absorption successful peculiar areas. Or there's caller techniques similar reinforcement learning wherever you tin commencement to nonstop much accusation to these models that bid them aft the fact.
What we find is that if you put too much new data into these models at those later stages, they forget the earlier material, and so they forget what made them great at reasoning. Or they actually forget some of that data because they get what's called overtrained on some of the data you're giving them.
Right.
Then they lose what was valuable in the first place. So there's only so far that that can go. What we also find is that those techniques are very ineffective if the model wasn't already trained on your domain. So if you try to teach one of these open-weight models about protein folding and it knows nothing about protein folding, it doesn't work, because it doesn't know how to inherently reason about that thing.
What we found is that if you can train them earlier, and I used this analogy in my re:Invent talk about the human brain, it’s like learning languages. You're able to learn new languages more easily when you're younger. Much easier. Now, if I try to learn a new language, it's much harder to do. So models are somewhat similar to that.
What we've done, which is a unique thing, we refer to it as open training, but really the idea is that you have this corpus of data that is from your domain and your particular company and the ways that you do things. If you're able to insert it into the pretraining stages and then mix in a bunch of the data that was used to originally train that model, and then finish pretraining the model, once it's done the model actually inherently knows all of your stuff. It knows about your data, it knows about your company, it knows about your domain, and anything that you do with fine-tuning or post-training after that is actually much more effective.
It's just never been possible before. Because open-weight models would never expose their data, and there was no kind of mechanism for them to be able to go do this. So we did this with our Nova models and our Nova 2 models.
Take a financial services company. You can take all your data, inject your data, mix it with an Amazon-curated dataset. This is data that we have proprietary use of. We'll give you tools to easily mix that together. And then you finish pretraining the model, and you effectively have your own custom frontier model that understands your business, which you were able to train for a couple hundred thousand dollars or whatever the cost is to finish that training, versus billions of dollars of doing all the research to actually go and build a frontier model.
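Garman's description of blending a company's own corpus with the model's original training data before finishing pretraining can be sketched roughly as follows. This is a minimal illustration, not Nova Forge's actual pipeline: the function name, the 30 percent domain ratio, and sampling the base corpus with replacement are all assumptions made for the example.

```python
import random

def build_pretraining_mix(domain_docs, base_docs, domain_fraction=0.3, seed=0):
    """Interleave proprietary domain documents with a sample of the original
    base corpus, so continued pretraining sees both and the model is less
    likely to forget what made it good at general reasoning.

    Note: the 30% default ratio and sampling-with-replacement are
    illustrative choices, not a published recipe."""
    rng = random.Random(seed)
    # Size the base-corpus sample so domain data ends up at roughly
    # domain_fraction of the final mixture.
    n_base = round(len(domain_docs) * (1 - domain_fraction) / domain_fraction)
    base_sample = [base_docs[rng.randrange(len(base_docs))] for _ in range(n_base)]
    mix = list(domain_docs) + base_sample
    rng.shuffle(mix)  # shuffle so batches interleave both sources
    return mix

# Example: 3 proprietary docs mixed with base-corpus docs at ~30/70.
mix = build_pretraining_mix(["d1", "d2", "d3"], ["b1", "b2", "b3", "b4"], 0.3)
```

The point of the mixture, as Garman describes it, is the replay of original training data alongside the new domain data; tuning the ratio is what keeps the finished model from overtraining on the customer corpus.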
I'm curious about risk in the context of Nova Forge in a few different ways. You already deal with companies that have confidential proprietary data, right? You just mentioned financial services inputting all of that data into this model and into this training. You also offered a really compelling example off the back of re:Invent from Reddit, which is using Nova Forge to create a model that can be used for content moderation, which basically means training a model that breaks a lot of conventional rules about how these models are typically designed, right? They would be designed to avoid offensive or violent content entirely. Reddit, on the other hand, needs a model that can lean into that kind of content in order to do the moderation. So those are two very distinct examples that I just offered, but I'm curious what kind of risks exist once we start down the road of this kind of custom pretraining, and are there distinct risks that might be different from what we've seen historically?
I don't know if there are any particular, different risks. I think from a data protection point of view, we have all sorts of protections around this. That's one of the things we pride ourselves on, and have over the last 20 years, is really protecting customer data and making sure that it's isolated and protected.
When people are building their own model, this is true no matter what, if they're fine-tuning it, if they're doing any of that work, companies have to think about what the output is, and they have to own the output of their own models.
Mmm-hmm.
Even if they're using an off-the-shelf model, by the way, you have to own the outputs of that.
I think that's one of the key pieces: You can't just give up responsibility for whatever your technology is doing. You can't give up responsibility because a database makes a particular join and you're like, “I don't know, it's just how the database did it.” AI is no different from that.
We deliver powerful tools to customers, but they have to own those outputs and think about them. We still have safety classifiers, by the way, for things. You can't pretrain something and then get it to build a weapon or things like that. We still have a lot of those safety controls, and there's no circumventing those.
But with regards to things like content moderation, that's someone else's choice, and they can make a choice and they own what that looks like. They have to be sure that they make great choices for the content that's on their website, that's appropriate for their site.
So, we’re talking about this premise whereby AI agents will be much more integrated into enterprise settings, and I'm curious how you think about that in the context of the workforce. I'm looking back at 2025 and remembering comments that other AI executives have made about job disruption, cuts to the workforce, et cetera.
You actually made an interesting comment where you said that replacing junior employees with AI is, quote, “one of the dumbest ideas” you've ever heard, which made me laugh.
Ha!
Could you talk more about that and more about how you see artificial intelligence and agentic AI changing the workplace in the years to come? Because I think you have a point of view on this that some people might find reassuring, and that I think is different from what we hear from a lot of other leaders.
That point, by the way, was specifically about software developers, but I think it applies to lots of roles. There was this idea that you'll just replace all of your junior engineers and all of your junior employees, and you'll just have the most senior, most experienced employees and then agents.
Number one, my experience is that many of the most junior folks are actually the most experienced with the AI tools. So they're actually most able to get the most out of them. Number two, they're usually the least expensive because they're right out of college and they generally make less. So if you're thinking about cost optimization, they're not the only people you would want to optimize around.
Three, at some point that whole thing explodes on itself. If you have no talent pipeline that you're building and no junior people that you're mentoring and bringing up through the company—we often find that that's where we get some of the best ideas.
We get the fresh new blood into the company from hires out of college. There's a lot of excitement. There's a lot of new thoughts. You've gotta think longer term about the health of a company, and just saying “OK great, we're never going to hire junior people anymore,” that's just a nonstarter for anyone who's trying to build a long-term company.
What does this mean, though, for the workforce broadly? What does it mean for Amazon's workforce? What does this look like as AI agents infiltrate—which is not a generous word, but it's the word that comes to mind—the way we work and live?
One of the things that I tell our own employees is “Your job is going to change.” There's no two ways about it. If there's only one thing I can promise you, it is that the way that you did your job four years ago is not how you're going to do your job next year. You're going to be able to have a bigger impact. You're going to be able to do more things, you're going to be able to have a broader scope of responsibilities, and it's just not going to be the same things that made you successful five years ago. You're going to have to learn new skills, you're going to have to learn new ways of working. We may have to structure our teams differently. We may have to go after problems differently. So people are going to need to be flexible. There is for sure going to be disruption in how work is done, because jobs are going to change and industries are going to change.
If they don't, they'll probably get left behind by people who move faster and do change. There is going to be some disruption in there for sure. Like there is no question in my mind.
When you say disruption, do you mean in the context of job loss, in a big-picture, economic way?
That is uncertain to me. I'm very confident in the medium to longer term that AI will definitely create more jobs than it removes at first. I think anytime you find opportunities to create new economic prosperity or build new experiences or things like that, there are more jobs that get created. But they will be different. There are jobs that will be eliminated as part of it, or reduced, almost for sure.
There are some things where you just no longer need quite as many people to do a particular job. So our job is to also provide training and upskilling so that we can retrain, so that there are other roles that some of those folks can do. Not all people want to do that. There may be some churn in the short term, as people either are hesitant to learn new skills or don't wanna learn new skills or other things like that.
But this is true of almost every single technology change: It was true of industrial automation in the 1930s. It was true when personal computers came around, and when the internet came. It will be true for AI too. There'll be some jobs that there won't be as many of, that is true. And I think there'll be new jobs.
I'm curious how you're thinking about all of this from an environmental perspective. I think one of the notable pieces of agentic AI is having agents run continuously for hours or days. I would imagine there is a sustained energy demand in that context. We're already talking about a very energy-intensive technology. Amazon right now is the single biggest purchaser of new renewable energy contracts in the world, at least for the past five years.
That's right.
How do you square that commitment with this drastically increased demand for energy?
Look, part of what that is, is us investing. This is not just going out and buying existing things. That is investing in new projects and bringing new renewable energy projects online. That’s a huge commitment for us. It's something we do every single year and we'll continue to do. Because we think that's important from an environmental impact point of view, and frankly, from an energy availability point of view. I do think we're going to have to keep looking at other energy sources. I think that nuclear power is one of those that's going to be very important for us to look at in the medium to longer term.
We have to make sure that we have enough carbon-zero energy out there, but we need to look at all of those sources of energy, and it doesn't mean that no one's going to use natural gas in the interim. I'm sure some of that is true, too. Our goal is, how do we continuously, period over period, year over year, reduce the carbon intensity of the energy that we consume? We're still committed to that as a company, and we've spent an enormous amount of time on that.
How do you respond to criticism of this venture, which is an enormous undertaking, internally? I'm curious about the reaction piece there. A few weeks ago you had over a thousand Amazon employees—granted, it's a company with a seven-figure employee base, if I'm correct—but they described the company's quote, “all-costs-justified” warp-speed approach to AI development. They say that it will cause, quote, “staggering harm to democracy, to our jobs, and to the Earth.” That's quite a statement from Amazon's own employees. When you hear that read back to you, how do you address it?
Number one, we encourage our employees to have their own thoughts as to how they're thinking about things.
Well, that's good, because a lot of tech companies these days are not happy to see that happen.
I would also say that when you have an organization of any size, you have viewpoints from a lot of different places. I would say that that is not the majority opinion from our employees, or even close. I think most of our employees are excited about the technology we're building. Excited about the value that we're giving customers, and excited about the potential and the climate pledge that we have and the path that we're on. So, I think that's OK. As long as those concerns are respectful, we're willing to listen to them. But I think they’re far from the majority.
I mentioned at the top of the conversation that in January 2025 everyone promised me this was the year of agentic AI. They were wrong. And so as we look out to 2026, I'm curious for your prediction: In the context of AI, what is next year all about for us? Then in 12 months, I'm going to call you and we'll see.
Just to be super, super clear, I'm bad at predicting the future.
I am too, but I have to ask.
You could call me, I just probably won't be right. But I will say I think one of the big things that we are going to be incredibly focused on for our customers is delivering real business returns for them. Whether that's through agents, whether that's through custom models, whether that's through scalable infrastructure, whether that's eliminating tech debt through using [Amazon’s] Transform to get them off of mainframes or get them off of legacy databases. That, I think, is going to be a big focus. It's not going to be about going and experimenting. It's not going and trying new technology. It is really delivering value to the end customers and to the business.
So it's time for the P&L to look a little bit different next year.
Yep. I think that's where customers have relied on AWS to help them improve for the past 20 years. And it's what we're particularly great at.
How to Listen
You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:
If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for “Uncanny Valley.” We’re on Spotify too.