mpharo Posted April 5, 2007

I am writing a reporting tool for the open source project DansGuardian. Right now I am taking a URL that a user has visited and I need to compare it to a blacklist, which contains just domain names (e.g. phpfreaks.com). I currently have some code which does this, but it does not work for https URLs. Here is the code:

while ($sql = mysql_fetch_array($select)) {
    preg_match('@^(?:http://)?([^/]+)@i', $sql, $matches);
    $host = $matches[1];
    echo "host is: $host\n";

    preg_match('/[^.]+\.[^.]+$/', $host, $matches);
    echo "domain name is: {$matches[0]}\n";
}

I then found the parse_url() function and tried using that, but it doesn't strip off the subdomains. It just returns the host. Does anyone have a snippet or a modification they could suggest? Thanks in advance...
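To show what I mean about parse_url() (using a made-up URL here), the host comes back with the subdomain still attached:

<?php
// Hypothetical URL for illustration: parse_url() returns the full
// host, subdomains included, not the bare registered domain.
$parts = parse_url('https://www.phpfreaks.com/forums/topic.php');
echo $parts['host']; // prints: www.phpfreaks.com
?>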
MadTechie Posted April 5, 2007

Try

<?php
while ($sql = mysql_fetch_array($select)) {
    preg_match('@^(?:http://|https://)?([^/]+)@i', $sql['url'], $matches);
    $host = $matches[1];
    echo "host is: $host\n";

    preg_match('/[^.]+\.[^.]+$/', $host, $matches);
    echo "domain name is: {$matches[0]}\n";
}
?>

There's probably a better way, but I think that will work.
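If you'd rather skip the scheme regex entirely, a parse_url()-based version might look like this. This is just a sketch: it assumes the result column is called url (as above) and that every blacklist entry is a simple two-label domain.

<?php
while ($sql = mysql_fetch_array($select)) {
    $url = $sql['url'];

    // parse_url() only recognises the host when a scheme is present,
    // so prepend one if it's missing (assumption: bare hosts can occur)
    if (!preg_match('@^https?://@i', $url)) {
        $url = 'http://' . $url;
    }
    $host = parse_url($url, PHP_URL_HOST);
    echo "host is: $host\n";

    // keep only the last two dot-separated labels of the host
    if (preg_match('/[^.]+\.[^.]+$/', $host, $matches)) {
        echo "domain name is: {$matches[0]}\n";
    }
}
?>

The last regex is still the weak point either way: phpfreaks.com comes out fine, but something like bbc.co.uk would be cut down to co.uk, so multi-part TLDs would need a lookup list to handle properly.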