karldesign Posted April 4, 2007

I have the following function:

    function parseUrl($str_haystack) {
        $str_haystack = str_replace('http://', '', $str_haystack);
        $str_needle = preg_replace("/(www.)([^\s,]*)/i", "<a href='http://$1$2' target='_blank' class='ld'>$1$2</a>", $str_haystack);
        return $str_needle;
    }

I am parsing URLs from a string, which works fine until I have a URL such as www.domain.com/subdomain.php?id=234... Any ideas why? Not familiar with preg_replace, to be honest...
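One quick way to narrow this down (my own test, not something posted in the thread) is to feed the problem URL to parseUrl() on its own:

    // Isolate the function from any other processing (example input is made up).
    $test = "See www.domain.com/subdomain.php?id=234 for details.";
    echo parseUrl($test);
    // If this prints a working link, the regex itself handles the query string
    // and the problem more likely lies in whatever else processes the string.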
effigy Posted April 4, 2007

Is '...' part of the URL? If so, is there any surrounding content that hints at what it represents?
dsaba Posted April 4, 2007

What is your problem?
karldesign (Author) Posted April 5, 2007

No, the '...' was just me typing.

    function nl2p($str) {
        $new_str = preg_replace('/<br \\/>\s*<br \\/>/', "</p><p>", nl2br($str));
        return '<p>' . $new_str . '</p>';
    }

This is the function I run before the parseUrl function above. So when I have www.domain.com it's fine, but if I have www.domain.com/subdomain.php?id=1234, the page it links to is www.domain.com/subdomain.php?id=1234</p><p>. I took the </p><p> out of the nl2p function and it worked, but the formatting of the text went to pot...
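The reason the markup ends up inside the link is that [^\s,]* matches any run of characters that aren't whitespace or a comma, so the </p><p> that nl2p() places directly after the URL gets swallowed into both the href and the link text. A minimal sketch of one possible fix, assuming you are free to exclude '<' from the character class as well (my suggestion, not something posted in the thread):

    function parseUrl($str_haystack) {
        // Stop the match at '<' too, so markup inserted by nl2p() is not
        // captured as part of the URL (hypothetical tweak, untested here).
        $str_haystack = str_replace('http://', '', $str_haystack);
        return preg_replace(
            "/(www\.)([^\s,<]*)/i",
            "<a href='http://$1$2' target='_blank' class='ld'>$1$2</a>",
            $str_haystack
        );
    }

Running parseUrl() before nl2p() would likely avoid the problem as well, since the paragraph markup would then be added after the links are built.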
karldesign (Author) Posted April 5, 2007

Found a solution... not too sure if it's the best way or the right way, but it's working. I added a ' ' (whitespace) before each <p> and </p>. This seemed to end the regex match. Hope this is useful to anyone else too! Feel free to use it.
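If I've read the workaround correctly, it amounts to something like this inside nl2p() (my reconstruction, not the exact code that was posted): the extra space gives the URL regex, which stops at whitespace, somewhere to end before the tag.

    function nl2p($str) {
        // Reconstruction: a space before each <p> and </p> so the URL match,
        // which stops at whitespace, ends before the paragraph markup.
        $new_str = preg_replace('/<br \\/>\s*<br \\/>/', " </p> <p>", nl2br($str));
        return ' <p>' . $new_str . ' </p>';
    }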
karldesign (Author) Posted April 5, 2007

Having said that, I have found that https: URLs are causing me massive concerns!
karldesign (Author) Posted April 5, 2007

OK, I have found a way of allowing all URLs, https: and http:. So if you are interested, here it is:

    function parseUrl($str_haystack) {
        $str_haystack = str_replace('http://', '', $str_haystack);
        $str_haystack = str_replace('www.', 'http://www.', $str_haystack);
        $str_needle = preg_replace("/(http:\/\/|https:\/\/)([^\s,]*)/i", "<a href='$1$2' target='_blank' class='ld'>$2</a>", $str_haystack);
        return $str_needle;
    }

Notice that the output doesn't show $1; this is because I didn't want to show http: or https: in the output link. I expect you could add ftp: and any others to the | (or) alternation. I would, however, suggest testing it fully before using it, as I haven't done any testing. Hope this helps!
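For anyone who wants to try it, here is a quick usage sketch (the example input and expected output are my own, not from the thread):

    // Quick sanity check of the posted parseUrl(); example input is made up.
    $text = "Visit www.domain.com/page.php?id=1234 or https://secure.example.com/login for details.";
    echo parseUrl($text);
    // Roughly:
    // Visit <a href='http://www.domain.com/page.php?id=1234' target='_blank' class='ld'>www.domain.com/page.php?id=1234</a>
    // or <a href='https://secure.example.com/login' target='_blank' class='ld'>secure.example.com/login</a> for details.

One case worth testing is a URL written as https://www.example.com: the second str_replace() turns it into https://http://www.example.com, so the href would end up with a doubled scheme.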
effigy Posted April 5, 2007

    function parseUrl($str_haystack) {
        $str_needle = preg_replace("#(https?://|www\.)([-a-z0-9+.%/?=]+)#i", "<a href='http://$1$2' target='_blank' class='ld'>$1$2</a>", $str_haystack);
        return $str_needle;
    }
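A quick test of this version against the URL from the first post (the example input is mine):

    // The comma is outside the character class, so the match stops cleanly before it.
    $text = "Details are at www.domain.com/subdomain.php?id=234, thanks.";
    echo parseUrl($text);
    // Details are at <a href='http://www.domain.com/subdomain.php?id=234' target='_blank' class='ld'>www.domain.com/subdomain.php?id=234</a>, thanks.

Note that the replacement always prepends http://, so an input that already starts with http:// or https:// will get a doubled scheme in the href; depending on the input you expect, that case may be worth handling separately.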