- Compatible XF Versions
- 1.3
- 1.4
Forums that allow sharing links to file-hosting sites face a serious problem with link-scraping bots, including pseudo search engines like filestube.com, which harvest links shared on forums, blogs and other sites while sending almost no traffic back: at best the source gets a small link that users rarely click.
There are also many non-public bots that scrape links and content from posts and republish them on other spam sites. Requiring registration to see full links is no longer enough, since bot operators create dozens of fake accounts to crawl newly added posts.
By wrapping the link(s) in a code block, you can avoid this.
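The core idea is simply to rewrite bare URLs into BB-code [CODE] tags before the post is rendered, so the link appears as plain text in a code box rather than as a clickable anchor that scrapers can harvest. Below is a minimal sketch of that transformation, in Python for illustration only; the actual add-on runs inside XenForo, and the regex, function name and example URL here are assumptions, not the add-on's implementation:

```python
import re

# Simplified URL matcher for illustration; real BB-code parsing is more involved.
URL_PATTERN = re.compile(r'https?://\S+', re.IGNORECASE)

def wrap_links_in_code(post_body: str) -> str:
    """Wrap every bare link in BB-code [CODE] tags so the rendered post shows
    the URL as plain text inside a code box instead of a clickable anchor."""
    return URL_PATTERN.sub(lambda m: f"[CODE]{m.group(0)}[/CODE]", post_body)

if __name__ == "__main__":
    post = "Grab the file here: https://example-filehost.com/abc123 and enjoy."
    print(wrap_links_in_code(post))
    # Grab the file here: [CODE]https://example-filehost.com/abc123[/CODE] and enjoy.
```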