Last modified: 2013-11-12 17:35:15 UTC
The Spam BlackList doesn't block links that don't start with http:, https:, or //. For example, I could add www.atbriders.com to article text even though it is listed at https://en.wikipedia.org/wiki/MediaWiki:Spam-blacklist; see https://en.wikipedia.org/wiki/User:Yamaha5/Spam-blacklist. The blacklist should also block URLs that don't start with http:, https:, or //.
Many articles are now affected by this bug, and users add their spam links to articles without any blockage!
I don't see any sane way to do this (and am tempted to close as INVALID). Text not starting with http:// simply is not a link; it is text. So you want to block any word that has two dots in it?
(In reply to comment #2)
> I don't see any sane way to do this (and am tempted to close as INVALID).
>
> Text not starting with http:// simply is not a link; it is text.
>
> So you want to block any word that has two dots in it?

No, I don't want to block any word that has two dots in it! I meant that when we define \bexample\.com\b, it should also block www.example.com, since that is a website address.
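The disagreement above comes down to where the blacklist regex is applied. A minimal sketch (not the actual SpamBlacklist code; the link-extraction regex and function names here are illustrative assumptions) of why a bare www.example.com slips through when only extracted external links are checked:

```python
import re

# Assumption for illustration: external links are recognized only when they
# start with a scheme or a protocol-relative "//", mirroring wikitext rules.
LINK_RE = re.compile(r'(?:https?:)?//\S+')

# A blacklist entry in the style of \bexample\.com\b, applied to the
# extracted links only -- never to the surrounding plain text.
BLACKLIST = [re.compile(r'\bexample\.com\b')]

def is_blocked(wikitext: str) -> bool:
    """Return True if any extracted external link matches a blacklist entry."""
    links = LINK_RE.findall(wikitext)
    return any(pat.search(link) for pat in BLACKLIST for link in links)
```

With this model, `is_blocked("see http://example.com/page")` is True, but `is_blocked("visit www.example.com today")` is False: the plain-text domain never enters the link list, so the blacklist regex never sees it.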
The spam blacklist was designed to block linking to blacklisted URLs. That's why only the added external links are checked, not the page content. You can use AbuseFilter if you want text searches for plain website names. Supporting INVALID.
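By contrast, a text-level check in the spirit of an AbuseFilter rule (e.g. one matching added lines against "\bexample\.com\b") scans all added text rather than extracted links, so it does catch bare domain mentions. A hedged sketch of that behavior (the function is hypothetical, not AbuseFilter's implementation):

```python
import re

def text_mentions_domain(added_text: str, domain: str = "example.com") -> bool:
    """Scan raw added text for the domain, the way a text-search rule would."""
    return re.search(r'\b' + re.escape(domain) + r'\b', added_text) is not None
```

Here `text_mentions_domain("visit www.example.com today")` is True, which is exactly the case the link-only blacklist misses.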
--> AbuseFilter territory by definition