Yes, "selfdriving" and "driverless" hashtags exist on TikTok, and they contain exactly what you might think—videos of people using their Tesla's self-driving feature while doing anything but focusing on the road.

To deter users from abusing the self-driving feature, TikTok has decided to place a warning label on these types of videos.

TikTok Tries to Kill the Self-Driving Trend

In a tweet first spotted by The Next Web, a TikTok user pointed out a new label beneath a self-driving video. The label reads: "The actions in this video could result in serious injury or adverse health effects."

Despite this, there are still tons of dangerous self-driving videos on TikTok that don't have this label. This includes videos of people using Autopilot while in the backseat of their Teslas, as well as videos of people eating their lunch while "driving." TikTok may be unable to catch every single dangerous video, but it will likely need even stricter measures to counteract this trend.

TikTok's Community Guidelines clearly state that the platform doesn't allow users to "share content depicting them partaking in, or encouraging others to partake in, dangerous activities that may lead to serious injury or death." With this in mind, is a simple label really enough to keep users from posting dangerous self-driving videos?

On Tesla's support page, the company states that Tesla owners still have to "pay attention" when using the self-driving feature. Drivers must also agree to keep their hands on the wheel and maintain control of the car before turning on Autopilot.


Warnings like these can only do so much, unfortunately, as people will inevitably still attempt to abuse the Autopilot feature. On May 5, 2021, a man was killed in a crash while using Tesla's Autopilot feature. According to a report by Reuters, the driver frequently posted "full self driving" videos on TikTok.

If TikTok continues to allow users to post "funny" videos of themselves using the Autopilot feature while distracted, more people will likely want to do the same, resulting in more needless deaths and injuries.

Both TikTok and Tesla Need to Condemn Autopilot Abuse

In the end, it falls on Tesla's and TikTok's shoulders to condemn distracted self-driving. Tesla needs to make it harder to use the Autopilot feature while distracted, and TikTok should remove dangerous self-driving videos altogether. Until then, we'll likely keep seeing injuries and deaths related to Autopilot abuse.