TikTokward

How many red flags can we count in the Microsoft/TikTok AI spend news?
TikTok Spending Drove Microsoft’s Booming AI Business
TikTok, Walmart, Intuit and G42 have been among the biggest customers of Microsoft’s Azure OpenAI Service

When seemingly good news is also seemingly awkward news:

Microsoft has had success reselling OpenAI’s artificial intelligence to its cloud customers, a core part of its plan to build a new multibillion-dollar profit machine. New data show which customers are powering the business.

TikTok, for instance, was paying nearly $20 million per month to buy OpenAI’s models through Microsoft as of March, according to someone who saw internal financial documents about it. That was close to 25% of the total revenue Microsoft was generating from the business, when it was on pace to generate $1 billion annually, or $83 million per month, this person said.

Good news: Microsoft is able to resell OpenAI's tech to impressively large customers.
Bad news: those large customers are dominated by one customer in particular.
Good news: that customer is known to also be using Google's and Amazon's clouds.
Bad news: that customer is TikTok.

The risk for Microsoft is that a large fraction of those AI sales are concentrated with one customer—and the gravy train could slow down quickly. That’s because TikTok and its parent company, ByteDance, are no slouches when it comes to AI and have their own ambition to make conversational and image-generating software that would rival OpenAI’s. Those plans suggest TikTok’s spending on OpenAI’s tech could stop when its own AI is ready.

Um, also, what happens if TikTok is sold or banned in the US? If TikTok gets banned over its China ties, the US government is not going to want Microsoft selling it tens of millions of dollars in AI compute a month.

But it's okay, Microsoft has other large customers here, such as:

Walmart, a longtime Microsoft cloud customer, has said it’s using such technology to personalize shopping suggestions. Another customer that has spent millions of dollars a month on Azure OpenAI Service is G42, an Abu Dhabi–based firm that previously announced a partnership with OpenAI to develop AI for Middle East customers.

Yeah, G42 would seem to be all sorts of problematic as well. Microsoft struck a deal with them, seemingly at the behest of the US government, to move the group away from China. But now that deal is, well, awkward.

Though not quite this awkward:

Microsoft also takes a cut of OpenAI’s direct sales of its models to businesses, which unexpectedly surpassed sales from Azure’s OpenAI Service this year. Plus, Microsoft is generating billions of dollars a year in revenue—albeit without much of a profit margin—from renting out servers to OpenAI so the startup can run ChatGPT and develop related technology.

Not a strange relationship with all sorts of perverse incentives. Not at all.

But even that is not as awkward as this:

Using OpenAI’s technology to develop competing AI goes against OpenAI’s rules, but many customers do it anyway. OpenAI doesn’t appear to enforce those rules—possibly because it has been accused of violating intellectual property rules in how it trained its state-of-the-art AI.

The Verge reported last year that ByteDance was using OpenAI’s GPT-4 model to train its in-house AI models, in part by having OpenAI’s chatbot generate tracts of text that ByteDance would feed into its own model.

In response to that report, ByteDance said at the time that it was employing OpenAI’s models “to a very limited extent” to develop its own models.

Even if TikTok isn't sold or banned, it's pretty clear – as in, confirmed by its parent company, ByteDance – that it's spending so much on OpenAI through Microsoft at least in part as a way to help train its own models. Need I remind you that ByteDance is a Chinese company? And that such companies are increasingly being barred from using American AI technology for their own purposes? Including the latest NVIDIA chips. The very chips Microsoft rents out to OpenAI.