YouTube is rolling out unclickable links.
Video portals like YouTube have had to deal with spam comments and bogus links for many years. With new additions to a platform come new places for scammers to go about their business. YouTube is now cracking down on links posted to the comments section of Shorts.
Shorts has been around for a few years now, but you may not have noticed the video format up until this point. They’re most commonly found on the front page of YouTube, in the form of vertically framed, TikTok-style clips. Clicking into a Shorts video will give you an endless, scrolling feed of seemingly random content. Some videos have hashtags you can click into, but for the most part it can feel like a chaotic, non-curated experience.
As with regular YouTube videos, users of the site can leave comments and replies on Shorts, but that’s introduced a new problem. In response, YouTube has introduced sweeping changes that will affect people trying to build a presence on Shorts. From the release detailing the changes:
Since introducing Shorts two years ago, the volume and speed of content published on YouTube has increased in fun and exciting ways. At the same time, this speed and level of engagement has made it easier for spammers and scammers to share links in Shorts comments and Shorts descriptions that harm the community – for example, clickable links that drive users to malware, phishing, or scam-related content.
Essentially: if you build it (and by “it”, I mean “a rapid-fire barrage of non-stop content”) they will come (and by “they”, I mean “a cavalcade of spam the likes of which the moderation team simply cannot police”).
The list of link-related casualties is as follows:
Starting on August 31st, 2023, links in Shorts comments, Shorts descriptions, and links in the vertical live feed will no longer be clickable – this change will roll out gradually. We don’t have any plans to make any other links unclickable. Because abuse tactics evolve quickly, we have to take preventative measures to make it harder for scammers and spammers to mislead or scam users via links.
YouTube also goes on to say that “clickable social media icons from all desktop channel banners will no longer show, as they can be a source of misleading links.” As The Verge notes, creators use these links to direct viewers to their accounts on other websites. Considering the Shorts platform is fairly limited in functionality compared to others of a similar nature, removing anything along these lines could cause issues for Shorts makers.
There are plans to replace these links with something, though there’s no word yet as to what form this may take.
In 2022, one of YouTube’s transparency reports showed that misinformation was a significant problem: 122,000 videos (not channels) were removed for violating misinformation policies, out of an overall total of four million removals in Q2 2022. A further 89 were removed after being classed as “Spam, misleading, and scams.” The biggest single reason for videos being removed was child safety, clocking 1,383,028 removals (31 percent of the overall tally).
With this in mind, it makes sense that YouTube would be keen to bring the banhammer down on a sudden rise in scams affecting the Shorts platform. The quick-cut video content is geared toward younger users; indeed, it’s popular with the 16–24 age group. The last thing YouTube or Google needs is a child safety issue spiralling out of control, with rogue links and dubious comments lurking in blink-and-you’ll-miss-it comments sections.
Ultimately, this could be a burden for Shorts creators, but it is a proactive move, and anything that cuts into the terrifying volume of spam on one of the biggest video platforms in the world can only be a good thing.
We don’t just report on threats—we remove them
Cybersecurity risks should never spread beyond a headline. Keep threats off your devices by downloading Malwarebytes today.