
Get Page Description From Page URL Without Slowing Down The Page Load

Is it possible to get the page description from a page URL without slowing down the page load, via JavaScript, PHP, or any other language? For example, I would send this input: http://www.faceb

Solution 1:

You need the function file_get_contents($url). For more help, refer to the manual: http://php.net/manual/en/function.file-get-contents.php. You may need to urlencode the URL if it contains spaces. As for the parsing part, I found some code on the web; here it is. Do let me know how it goes.

Code :

<?php
function getMetaTitle($content)
{
    $pattern = "|<[\s]*title[\s]*>([^<]+)<[\s]*/[\s]*title[\s]*>|Ui";
    if (preg_match($pattern, $content, $match)) {
        return $match[1];
    }
    return false;
}

$url = "your url here";

$str = file_get_contents($url);

$title1 = getMetaTitle($str);
echo $title1;
?>

Solution 2:

I wanted a similar feature, to build a somewhat Facebook-like preview that fetches the title, description and image. I used DOMDocument for it, so you can also try DOMDocument to parse the page. It's very useful for parsing an HTML page by its tags or attributes.

In combination with Ajax (keeping the PHP script on your own domain), you can pass the URL to a PHP script (similar to the one below), which in turn will fetch the required details from the website.
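To make the Ajax side concrete, here is a minimal sketch of the server-side script such a call would hit: the browser sends the target URL as a parameter, and the script fetches the page and returns the title as JSON. The function name buildMetaResponse and the JSON shape are illustrative, not from the answer.

```php
<?php
// Hypothetical endpoint body (e.g. meta.php, called as meta.php?url=...):
// fetches the target page server-side and returns its title as JSON.
function buildMetaResponse($targetUrl)
{
    $html = @file_get_contents($targetUrl);
    if ($html === false) {
        return json_encode(['error' => 'fetch failed']);
    }
    $doc = new DOMDocument();
    // @ suppresses warnings from malformed real-world HTML
    @$doc->loadHTML($html);
    $titles = $doc->getElementsByTagName('title');
    $title = $titles->length ? $titles->item(0)->nodeValue : '';
    return json_encode(['title' => $title]);
}

// In the real script you would output the result for the Ajax caller:
// echo buildMetaResponse($_GET['url']);
```

Because the fetch happens in this separate request, the original page load is not blocked while the remote site responds.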

Sample code:

$url = ''; // this will be your URL
$doc = new DOMDocument();
// added @ to suppress the errors
@$doc->loadHTMLFile($url);

foreach ($doc->getElementsByTagName('title') as $title)
{
    $arrDetails['title'] = $title->nodeValue;
}
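The sample above only grabs the title, but the question asks for the description. The same DOMDocument approach extends naturally to the meta tags; a sketch (the helper name getMetaDescription and the sample HTML are illustrative):

```php
<?php
// Sketch: pull the <meta name="description"> content with DOMDocument,
// mirroring the title loop above.
function getMetaDescription(DOMDocument $doc)
{
    foreach ($doc->getElementsByTagName('meta') as $meta) {
        if (strtolower($meta->getAttribute('name')) === 'description') {
            return $meta->getAttribute('content');
        }
    }
    return false;
}

$html = '<html><head><title>Demo</title>'
      . '<meta name="description" content="A demo page."></head>'
      . '<body></body></html>';
$doc = new DOMDocument();
@$doc->loadHTML($html); // @ suppresses warnings from sloppy markup
echo getMetaDescription($doc); // prints "A demo page."
```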

Solution 3:

file_get_contents($url), then parse the <title> tag or the meta description. Then save the url/description pair to a local cache to avoid requesting the page repeatedly.
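The caching idea above can be sketched with a simple JSON file of url/description pairs; repeat lookups then skip the remote fetch entirely. The file name, helper names, and JSON format are assumptions for illustration:

```php
<?php
// Parse the meta description out of an HTML string (as in Solution 2).
function parseDescription($html)
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html);
    foreach ($doc->getElementsByTagName('meta') as $meta) {
        if (strtolower($meta->getAttribute('name')) === 'description') {
            return $meta->getAttribute('content');
        }
    }
    return '';
}

// Return the cached description for $url, fetching and caching on a miss.
function getCachedDescription($url, $cacheFile = 'desc_cache.json')
{
    $cache = is_file($cacheFile)
        ? json_decode(file_get_contents($cacheFile), true)
        : [];

    if (!isset($cache[$url])) {
        // Cache miss: fetch the page once and remember the result.
        $html = @file_get_contents($url);
        $cache[$url] = ($html === false) ? '' : parseDescription($html);
        file_put_contents($cacheFile, json_encode($cache));
    }
    return $cache[$url];
}
```

A real deployment would also want an expiry time on cache entries so stale descriptions eventually refresh, but the hit/miss split above is the core of the idea.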
