mpharo
Posted April 5, 2007

I am writing a reporting tool for the open source project Dansguardian. Right now I am taking a URL that a user has visited and need to compare it to a blacklist, which contains just domain names (e.g. phpfreaks.com). I currently have some code that does this, but it does not work for https URLs. Here is the code:

while ($sql = mysql_fetch_array($select)) {
    // Strip the optional scheme and everything after the first slash to get the host
    preg_match('@^(?:http://)?([^/]+)@i', $sql['url'], $matches);
    $host = $matches[1];
    echo "host is: $host\n";

    // Keep only the last two dot-separated labels as the domain name
    preg_match('/[^.]+\.[^.]+$/', $host, $matches);
    echo "domain name is: {$matches[0]}\n";
}

I then found the parse_url() function and tried using that, but it doesn't strip off the subdomains. It just returns the host. Does anyone have a snippet or a modification they could suggest? Thanks in advance...
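Edit: to show what I mean about parse_url(), here is a minimal sketch (the URL is just this thread's address, used as an example):

<?php
// parse_url() splits a URL into components, but the 'host'
// component still includes any subdomains.
$parts = parse_url('https://forums.phpfreaks.com/topic/45800-domain-names/');
echo $parts['host'] . "\n"; // prints "forums.phpfreaks.com", not "phpfreaks.com"
?>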
MadTechie
Posted April 5, 2007

Try

<?php
while ($sql = mysql_fetch_array($select)) {
    // Allow an optional http:// or https:// prefix before the host
    preg_match('@^(?:http://|https://)?([^/]+)@i', $sql['url'], $matches);
    $host = $matches[1];
    echo "host is: $host\n";

    // Keep only the last two dot-separated labels as the domain name
    preg_match('/[^.]+\.[^.]+$/', $host, $matches);
    echo "domain name is: {$matches[0]}\n";
}
?>

There's probably a better way, but I think that will work.
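If you'd rather not match the scheme with a regex at all, something along these lines should also work. This is only a sketch: it assumes the column is named url (as above) and that every blacklisted domain has exactly two labels, so it will get names like example.co.uk wrong:

<?php
while ($sql = mysql_fetch_array($select)) {
    $url = $sql['url'];

    // parse_url() needs a scheme to identify the host reliably,
    // so prepend one if it is missing.
    if (!preg_match('@^https?://@i', $url)) {
        $url = 'http://' . $url;
    }

    $host = parse_url($url, PHP_URL_HOST);

    // Take the last two dot-separated labels as the domain name.
    // NOTE: this is wrong for multi-part TLDs such as example.co.uk.
    $labels = explode('.', $host);
    $domain = implode('.', array_slice($labels, -2));
    echo "domain name is: $domain\n";
}
?>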