The increasing reliance on artificial intelligence (AI) by major platforms like Amazon has introduced a series of complications, particularly in product categorization and compliance. In this episode, Chris and Leah delve into the intricate world of AI-driven recategorization on Amazon, exploring how products are often incorrectly classified, the consequent compliance triggers, and the impacts these errors have on sellers.


 [00:00:00] Chris: Hey everybody. Happy almost new year. This is Chris McCabe of ecommerceChris. Welcome back to Seller Performance Solutions. I am, of course, joined once again by Leah McHugh. How are you? 

[00:00:10] Leah: Good. How are you, Chris?

[00:00:12] Chris: Good. We survived most of December, so we're grateful for that. And we've been hearing a lot over the past couple of months about AI, Amazon's AI incorrectly recategorizing products and creating compliance triggers. You work on the compliance side a lot more than I do, so maybe you can fill in that blank. An example being things reclassified as medical devices that weren't medical devices. That's something we see a lot, but maybe you've got a couple of other hot examples off the top of your head from this Q4, because these are coming to us as complaints about Amazon's AI.

I mean, is that new AI that they're using or is this kind of a rehash of a previous complaint? 

Yeah, this is one of those things where like somebody posts about this new thing that Amazon's doing and I'm like, oh, are they doing something new? And then I read it and I'm like, oh, that's the same thing they've been doing for the last four years.

 Kind of like the other day when somebody talked about a brand new email, which we had been seeing for like three years. But I mean they're just changing the terminology here. So it's now calling AI what people used to call spiders or algorithms.

 It's the same thing we've been seeing for years. We definitely saw a ramp-up during COVID, but it's not new. The reclassification thing, I mean, that's been going on for a decade, I would say.

That's been going on for a long time, so you're arguing it's not new? It's just that I heard this from sellers, I heard it from agencies, and I even had a reporter call me and ask me, what's everyone complaining about with these Amazon AI foul-ups, right?

[00:01:52] Leah: I just mean, I have been consistently working on these cases for the last three to four years, so it's really not a new thing. Perhaps people who weren't experiencing it before are experiencing it now, and that's why people are suddenly talking about it again. But it's really nothing new.

[00:02:08] Chris: What is it exactly? Why would something be medical device classified when it's not a medical device, for example? 

[00:02:14] Leah: For a number of reasons, some of them seller mistakes and some of them competitor attacks. We talk about keyword abuse all of the time. This is another form of keyword abuse where somebody goes into your listing, adds a bunch of words to get it flagged as a medical device or a drug or a pesticide and then you have no idea why it's flagged as that because it has nothing to do with that. Or the other side of it is that people are making claims about their products that they're not allowed to make. I would say 40 percent of the time when somebody comes to me saying that Amazon's misclassified their product as a medical device, turns out their product is actually a medical device.

[00:02:49] Chris: Or as we've talked about a lot, they can't just fix the listing. They've also got claims on their packaging. 

[00:02:57] Leah: Well, that's more with supplements, and that's where you run into the unregistered drug issues. So, yeah, that's something we've been seeing for a long time too, as well as the pesticide claims, which we've talked about before: if your product claims to kill pests, including bacteria, anywhere other than on the human body, that's a pesticide. So we've been seeing this for a long time, and sometimes Amazon gets it right, sometimes Amazon doesn't. The biggest change that I've seen this year (maybe? it's hard to keep track, but I want to say this year) is that it seemed like last year the compliance team was trending upward, like they were getting better at their job.

So if something was misclassified, you could just say, this is wrong, and this is why it's wrong, and this is what my product is, and they'd be like, oh yeah, that makes sense, and they would reinstate it. Whereas now it seems to be turning in the opposite direction: if they've misidentified your product, getting it reinstated through the regular appeal channel is almost impossible. And I do have a theory on that. I don't think it's just that the people on the team are half-assing it. I think that, one, they have less access to tools, because I've been hearing that from multiple Amazon teams, that they all have way less access to different Amazon tools. And I also think that they aren't being "empowered," to use a stupid business term, and I apologize in advance for that. I don't think that team is empowered anymore to make the call of, oh yeah, you're right, this isn't the device we thought it was, whereas before they did have that ability. So now you essentially have to escalate it in order to get it properly reinstated, if Amazon is wrong. 

[00:04:43] Chris: Right, they're just churning through the contacts and answering them however they can answer quickly. 

[00:04:47] Leah: I think they just have a decision tree, right? And the decision tree doesn't have a "what if we were wrong" option. So it's either send the documentation or nothing.

[00:04:58] Chris: They don't have the training or the ability anymore to be like, oh yeah, this is wrong, this is not whatever we decided it was before. 

[00:05:06] Leah: Well, I'm going to hope, for 2024's sake, that that's not the case in Q1 and that it's just a multi-month pause in training and SOP auditing. It's been like a year, okay? 

[00:05:21] Chris: Yes. In a lot of ways, we just have to turn the page on 2023, because we saw numerous alarming trends and problems, and an increasing reliance on no-investigation, copy-paste responses, which we will address in our year-end episode, where we cover some of the 2023 trends.

I, for one, hope that next year will just be better in a lot of ways. But I guess the good news in what you're saying is that there's possibly no increased reliance on AI where they take engineers out of the equation and just use bots for absolutely everything tied to listing enforcement.

[00:06:04] Leah: Well, I think at this point, most of these flags are automated. I don't think there is a human reviewing the listing prior to it being blocked. The appeals themselves, I do believe, are going to humans, but I know for a fact that a lot of the documentation review is automated. But I think the problem is, like, you know, I'm not against automation, but if your system didn't work when humans were doing it, how do you think you're going to build an efficient automation for it?

Because now it's just the same issues as before. But now there's no way to actually get somebody to be like, Oh, yeah, no, that makes more sense. 

[00:06:45] Chris: You're supposed to replace successful manual actions with successful automation. 

[00:06:50] Leah: Right, like they were never good at this when humans were doing it. Why do they think that they're going to be able to make good machine learning for it?

[00:06:58] Chris: That's an excellent point. I think we're just seeing increasing complaints about nonsensical actions taken. What I've seen on my side, since I'm not doing compliance work every day like you are, is the Amazonian answering a clear mistake, a clear error, with something like, well, you need to get approval to sell this brand.

[00:07:23] Leah: That's actually something we've been seeing a lot more of, the approval thing, and I don't know why people aren't talking about that. 

[00:07:28] Chris: Or you need a letter of authorization to sell your own brand. You need approval, you need to apply for brand approval to sell your own brand. 

[00:07:35] Leah: Well, I've had a few recently where, like, the ASIN got gated, and then they were accepting applications, and then later Amazon was like, oh yeah, that was done in error, sorry.

[00:07:44] Chris: They say it's done in error every time, ultimately. But there are some people coming to us with a dozen rejections or case replies or whatever it might be. 

[00:07:54] Leah: No, I mean, you just get the same response if you go through the regular cases. It was only an escalation that got them to admit that it was an error.

[00:08:01] Chris: Yeah. And again, we'll kind of pick this apart next week with our end of the year episode, but unfortunately now I hear and see account health representatives mimicking this just because they see it annotated somewhere on the account. And they'll say, oh, you need approval to sell that brand, but you're on a phone call with that person.

So the brand owner can say, why do I need that? I'm the brand owner, and I'm in brand registry. What is it, just dead air on the other end? 

[00:08:25] Leah: Again, I think this is a tools issue. Each of these teams has very limited access to these tools, and so the only way to get it resolved is to escalate it to somebody that has full access. Especially with the gating, I've had conversations with catalog who are like, we can see that you're approved for this, but we can also see that there's a restriction, and there's nothing we can do with our access to these tools. And then they send it to brand registry, and brand registry sends you a, like, oh yeah, that's gated, you have to apply. And it's like, yeah, we just went through this. None of these teams has the access. And it's weird because it's cyclical, right? We've gone through this before, where they didn't have any access to tools, and then they got given more access to tools, and then Amazon had some data leak issues, or people messed up, and so then they revoked the access again.

We're back there, and I've been hearing a lot of misinformation from Amazon's teams lately, a lot more than we had maybe six months ago. And it seems to have gone right back to: nobody really knows what's going on, but you're on the phone with them, so they have to say something.

But the only way to resolve a lot of these issues is to escalate it. And it's not necessarily because of incompetence. It's because of a lack of access to those, like for those teams to correct these problems. 

That's what I meant. Hopefully, in 2024, it swings back to giving more access to more people, until they get into the issue of, you know, people being bought out or doing things they're not supposed to be doing, and then they go right back to restricting.

[00:09:51] Chris: And we've said throughout the year, when sellers are baffled, you know, why don't you have access? Why can't you see more of this stuff? And they're paying five grand a month for an account manager who doesn't have access to things that would be useful.

 It all traces back to when people had access to tools that they didn't necessarily need or use every day. They were susceptible to taking money from outsiders to share information that was privileged and private. 

[00:10:16] Leah: So, just to go back to the level of competency of the compliance team, and again, you know, I've dealt with the legal parts of this team, and I'm not saying that they are incompetent.

They are highly competent, but they've been using the same templated notification for medical devices when they think your device is a Class 2 medical device. For, I mean, probably not five years, but years, they've been using this, where they say that your product has been identified as a Class 2 medical device, which is "professional use only." Class 2 medical devices are not professional use only.

So they've been giving this, essentially, legal letter to sellers for years, and it's not even the correct information, even if they were flagging your product correctly. And then, if your product is a Class 2 medical device, you're like, yeah, here's my 510(k), and then they reinstate it, because suddenly it's no longer professional use only.

Like, for years, they've been using that wrong wording on that notification. So, like, this is the level that we are dealing with. 

[00:11:25] Chris: There are old templates floating around that should...

[00:11:30] Leah: But this is the template. They use this template for all things classified as Class 2 medical devices.

[00:11:35] Chris: But what about the template for, you've engaged in illicit, deceptive, or fraudulent behavior? And using that in like 19 different outcomes, circumstances. I mean, no creation of a second version, third version, fourth version to address all these different things.

[00:11:49] Leah: I'm not even asking for different versions. I'm just asking them to not have misinformation in the one template that they use for this. 

[00:11:56] Chris: You're teasing next week's episode, and we're also starting to get into next week's episode, which I'm trying to avoid doing. I just wanted to bring this up today because, whether it's reading through LinkedIn or Facebook or different forum posts, everyone's starting to use and misuse the term AI, the way they like to say "hijacking" in all the wrong contexts.

It's becoming like that. And if they just mean they're just using a different term for Amazon's algorithms, Amazon's bots, and now everyone's going to say AI, that's fine. 

[00:12:32] Leah: No, I mean, we can change the terminology, just we probably shouldn't pretend like it's a new thing because it's the same thing, right?

[00:12:38] Chris: Well, the problem is it's creating confusion. And it sounds like, I mean, I haven't addressed it this week yet, but it sounds like the media is now catching on to it, but they think this is something new that Amazon's doing, that they've tweaked their AI, because AI is kind of clickbait at this point. 

[00:12:54] Leah: I mean if this gets the media interested in these problems that we've been seeing for years, then great. Yeah. Let's call it AI. 

[00:13:02] Chris: That's an excellent note to end on. There's nothing I could say more succinct and to the point than that. So, are there any last thoughts on this, or have we pretty much covered it? Do you have any words of warning for people who get listing restriction messaging, or something suddenly comes in saying you're selling a restricted drug or a medical device and you're confused?

[00:13:25] Leah: Yeah, well, I mean, first of all, maybe check to make sure you aren't selling a restricted drug or medical device, because, like I said, I do see that a good amount. Then, after that, you know, check your listing content, especially if it's being flagged as a drug, or even a medical device.

 Amazon is generally using an algorithm to look for keywords, and that's what's causing the flag. So check your listing for anything that could be causing that. That being said, I mean, there is kind of an entire terrifying little cottage industry around misclassifying medical devices so you don't have to do as much with the FDA.

And we have also seen instances where it's not an algorithm; it's that the FDA contacted Amazon and told them to get rid of all of these products. So, you know, you need to, one, look at your product; two, look at your listing. And then, three, I would say talk to Catalog and see if they can see anything, because a lot of times the keywords are in attributes that you can't see, which is always fun.

So, yeah. Otherwise, call me. 

[00:14:22] Chris: Otherwise, ask Leah. Thanks for listening, everybody, and we'll come back at you next week with the kind of end-of-2023 barrage of things we've learned, topics we've discussed throughout the year, and what we see happening going into next month, 2024.

[00:14:38] Leah: I think we should have a drink next time. We can do that for the end-of-2023 one.

[00:14:43] Chris: Yes. Sounds good. Bye everybody.