Tech

ChatGPT could cost over $700,000 per day to operate but Microsoft will make it cheaper

By Lisa Sean · Published April 20, 2023

Running OpenAI's chatbot ChatGPT, which users rely on to write cover letters, generate lesson plans, and redo their social profiles, could cost the company up to $700,000 a day because of the expensive tech infrastructure the AI runs on, Dylan Patel, chief analyst at semiconductor research firm SemiAnalysis, told The Information. That's because ChatGPT requires massive amounts of computing power to calculate responses to user prompts.

"Most of this cost is based around the expensive servers they require," Patel told the tech publication.

In a phone call with Insider, Patel said it's likely even more costly to operate now, as his initial estimate was based on OpenAI's GPT-3 model. GPT-4, the company's latest model, would be even more expensive to run, he said.

OpenAI did not immediately respond to Insider’s request for comment ahead of publication.

While training ChatGPT’s large language models likely costs tens of millions of dollars, operational expenses, or inference costs, far exceed training costs when deploying a model at any reasonable scale, Patel and Afzal Ahmad, another analyst at SemiAnalysis, told Forbes. “The costs to inference ChatGPT exceed the training costs every week,” they said.
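The scale of an inference bill like this can be sanity-checked with simple arithmetic. The sketch below is a back-of-envelope estimate, not SemiAnalysis's actual methodology; the GPU count and hourly rate are hypothetical placeholders chosen only to show how an always-on fleet compounds to the reported order of magnitude.

```python
# Back-of-envelope daily inference cost for an always-on GPU fleet.
# All inputs are illustrative assumptions, not figures from SemiAnalysis.

def daily_inference_cost(num_gpus: int, hourly_rate_usd: float) -> float:
    """Cost of running num_gpus GPUs 24 hours a day at a flat hourly rate."""
    return num_gpus * hourly_rate_usd * 24

# Hypothetical fleet: 3,600 GPUs at $8/hour of cloud pricing.
print(f"${daily_inference_cost(3_600, 8.0):,.0f} per day")  # → $691,200 per day
```

Under these assumed inputs, the fleet alone lands near the $700,000-a-day figure, before counting networking, storage, and staffing.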

Firms using OpenAI's language models have been paying steep prices for years. Nick Walton, CEO of Latitude, the startup behind an AI dungeon game that uses prompts to generate storylines, said that running the model, along with payments to Amazon Web Services servers, cost the company $200,000 a month for the AI to answer millions of user queries in 2021, CNBC reported.

The high cost is why Walton said he decided to switch to a language software provider backed by AI21 Labs, which he said cut his company’s AI costs in half to $100,000 a month.
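The savings Walton describes are easy to annualize, using only the two monthly figures from the CNBC report:

```python
# Monthly AI bills reported for Latitude (figures from the CNBC report).
openai_monthly_usd = 200_000  # cost running on OpenAI's models
ai21_monthly_usd = 100_000    # cost after switching providers

monthly_savings = openai_monthly_usd - ai21_monthly_usd
print(f"Monthly savings: ${monthly_savings:,}")       # → Monthly savings: $100,000
print(f"Annualized:      ${monthly_savings * 12:,}")  # → Annualized:      $1,200,000
```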

"We joked that we had human employees and we had AI employees, and we spent about as much on each of them," Walton told CNBC. "We spent hundreds of thousands of dollars a month on AI and we are not a big startup, so it was a very massive cost."

To reduce the cost of running generative AI models, Microsoft is developing an AI chip it calls Athena, The Information first reported. The project, which started in 2019, comes years after Microsoft made a $1 billion deal with OpenAI that required OpenAI to run its models exclusively on Microsoft's Azure cloud servers.

The idea behind the chip was twofold, according to The Information. Microsoft executives realized the company was falling behind Google and Amazon in the race to build in-house chips, a source with knowledge of the matter told The Information.

At the same time, Microsoft was reportedly looking for cheaper alternatives to Nvidia's graphics processing units (GPUs), which its AI models ran on, and decided to build a less costly chip of its own.

Nearly four years later, more than 300 Microsoft employees are working on the chip, according to the report. The chip could be released for internal use by Microsoft and OpenAI as early as next year, two sources familiar with the matter told The Information.


2023 © Distinct Post News & Media. All Rights Reserved.
