Tech

ChatGPT could cost over $700,000 per day to operate but Microsoft will make it cheaper

By Lisa Sean | Published April 20, 2023

Using OpenAI’s chatbot ChatGPT to write cover letters, generate lesson plans, and redo your social profile could cost up to $700,000 a day because of the expensive tech infrastructure the AI runs on, Dylan Patel, chief analyst at semiconductor research company SemiAnalysis, told The Information. That’s because ChatGPT requires massive amounts of computing power to calculate responses based on user prompts.

“Most of this cost is based around the expensive servers they require,” Patel told the tech publication.
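
To see how an estimate of that kind is assembled, here is a minimal back-of-envelope sketch. Every input below (daily query volume, GPU-seconds per reply, hourly accelerator cost) is an illustrative assumption rather than a figure from Patel’s analysis; the point is simply that per-query compute cost multiplied by traffic can land in the hundreds of thousands of dollars a day.

```python
# Back-of-envelope model of daily serving (inference) cost for a chatbot.
# All inputs are illustrative assumptions, not figures reported by OpenAI or SemiAnalysis.

QUERIES_PER_DAY = 200_000_000      # assumed prompts answered per day
GPU_SECONDS_PER_QUERY = 1.3        # assumed accelerator time to generate one reply
GPU_COST_PER_HOUR = 10.0           # assumed fully loaded cost of one accelerator, USD/hour

def daily_serving_cost(queries: int, gpu_seconds: float, gpu_hourly: float) -> float:
    """Cost per query (GPU-hours per reply * hourly rate), scaled to the daily query volume."""
    cost_per_query = (gpu_seconds / 3600.0) * gpu_hourly
    return queries * cost_per_query

if __name__ == "__main__":
    total = daily_serving_cost(QUERIES_PER_DAY, GPU_SECONDS_PER_QUERY, GPU_COST_PER_HOUR)
    print(f"Estimated serving cost: ${total:,.0f} per day")  # roughly $720,000 with these inputs
```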

In a phone call with Insider, Patel said it’s likely even more costly to operate now, as his initial estimate is based on OpenAI’s GPT-3 model. GPT-4 — the company’s latest model — would be even more expensive to run, he told Insider.

OpenAI did not immediately respond to Insider’s request for comment ahead of publication.

While training ChatGPT’s large language models likely costs tens of millions of dollars, operational expenses, or inference costs, far exceed training costs when deploying a model at any reasonable scale, Patel and Afzal Ahmad, another analyst at SemiAnalysis, told Forbes. “The costs to inference ChatGPT exceed the training costs every week,” they said.
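
Under similarly hedged assumptions, the break-even arrives quickly: with a one-time training bill in the tens of millions and a daily serving cost near the figure Patel cites, cumulative inference spending overtakes training within months. The sketch below uses placeholder values only.

```python
# Hypothetical break-even between a one-time training cost and ongoing inference spend.
# Both inputs are placeholder assumptions chosen to illustrate the scaling, not reported figures.

TRAINING_COST = 50_000_000        # assumed one-time training bill, USD ("tens of millions")
DAILY_INFERENCE_COST = 700_000    # daily serving cost in line with Patel's estimate, USD

days_to_break_even = TRAINING_COST / DAILY_INFERENCE_COST
print(f"Cumulative inference spend matches training cost after ~{days_to_break_even:.0f} days")
# ~71 days with these inputs; from then on, operating the model dominates total cost.
```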

Firms using OpenAI’s language models have been paying steep prices for years. Nick Walton, the CEO of Latitude, the startup behind AI Dungeon, a game that uses prompts to generate storylines, said that running the model, along with payments to Amazon Web Services servers, cost the company $200,000 a month for the AI to answer millions of user queries in 2021, CNBC reported.

The high cost is why Walton decided to switch to a language software provider backed by AI21 Labs, which he said cut his company’s AI costs in half, to $100,000 a month.

“We joked that we had human employees and we had AI employees, and we spent about as much on each of them,” Walton told CNBC. “We spent hundreds of thousands of dollars a month on AI and we are not a big startup, so it was a very massive cost.”

To reduce the cost of running generative AI models, Microsoft is developing an AI chip it calls Athena, The Information first reported. The project, which started in 2019, comes years after Microsoft made a $1 billion deal with OpenAI that required OpenAI to run its models exclusively on Microsoft’s Azure cloud servers.

The idea behind the chip was two-fold, according to The Information. Microsoft executives realized the company was falling behind Google and Amazon in efforts to build in-house chips, a source with knowledge of the matter told The Information.

At the same time, Microsoft was reportedly looking for cheaper alternatives — its AI models were run on Nvidia’s chips known as graphics processing units — and decided to build a chip that would be less costly.

Nearly four years later, more than 300 Microsoft employees are now working on the chip, according to the report. The chip could be released for internal use by Microsoft and OpenAI as early as next year, two sources familiar with the matter told The Information.
