PHP file_get_contents very slow when using a full URL


PHP Problem Overview


I am working with a script (that I did not create originally) that generates a PDF file from an HTML page. The problem is that it now takes a very long time, roughly 1-2 minutes, to process. Supposedly this worked fine originally, but it has slowed down within the past couple of weeks.

The script calls file_get_contents on a PHP script, writes the result into an HTML file on the server, and then runs the PDF generator on that file.

I seem to have narrowed the problem down to the file_get_contents call on a full URL, rather than a local path.

When I use

$content = file_get_contents('test.txt');

it processes almost instantaneously. However, if I use the full url

$content = file_get_contents('http://example.com/test.txt');

it takes anywhere from 30-90 seconds to process.

It's not limited to our server; it is slow when accessing any external URL, such as http://www.google.com. I believe the script uses the full URL because necessary query string variables don't work if you call the file locally.

I also tried fopen, readfile, and curl, and they were all similarly slow. Any ideas on where to look to fix this?
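
A quick way to confirm where the time goes is to wrap both calls in microtime() (a minimal sketch; the file names are the placeholders from above):

// Compare a local read against a full-URL fetch of the same file.
$start = microtime(true);
$content = file_get_contents('test.txt');
printf("local read:   %.3f s\n", microtime(true) - $start);

$start = microtime(true);
$content = file_get_contents('http://example.com/test.txt');
printf("remote fetch: %.3f s\n", microtime(true) - $start);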

PHP Solutions


Solution 1 - PHP

> Note: This has been fixed in PHP 5.6.14. A Connection: close header will now automatically be sent even for HTTP/1.0 requests. See commit 4b1dff6.

I had a hard time figuring out the cause of the slowness of file_get_contents scripts.

By analyzing it with Wireshark, the issue (in my case and probably yours too) was that the remote web server didn't close the TCP connection for 15 seconds, i.e. it kept the connection alive.

Indeed, file_get_contents doesn't send a Connection HTTP header, so the remote web server treats the connection as keep-alive by default and doesn't close the TCP stream until a timeout elapses (15 seconds is not a standard value; it depends on the server configuration).

A normal browser would consider the page fully loaded once the HTTP payload length reaches the length specified in the response's Content-Length header. file_get_contents doesn't do this, which is a shame.

SOLUTION

So here is the fix (note the double quotes around the header string, so that \r\n is sent as a real CRLF rather than as literal characters):

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
file_get_contents("http://www.something.com/somepage.html", false, $context);

This simply tells the remote web server to close the connection when the download is complete, since file_get_contents isn't smart enough to do that by itself using the response's Content-Length header.
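
For completeness, a fuller sketch of the same idea that also bounds how long the read may take; the 'timeout' stream context option is standard, and the URL is a placeholder:

$context = stream_context_create(array(
    'http' => array(
        'header'  => "Connection: close\r\n", // ask the server to close the socket when done
        'timeout' => 10,                      // give up after 10 seconds (illustrative value)
    ),
));
$content = file_get_contents('http://www.example.com/somepage.html', false, $context);
if ($content === false) {
    // handle the failure: timeout, DNS error, HTTP error, etc.
}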

Solution 2 - PHP

I would use the cURL extension to fetch external content, as it is often much quicker than file_get_contents. Not sure if this will solve the issue, but it's worth a shot.

Also note that your server's connection speed will affect the time it takes to retrieve the file.

Here is an example of usage:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://example.com/test.txt'); // target URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);                  // return the body instead of printing it
$output = curl_exec($ch);
curl_close($ch);
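
If the remote host is the bottleneck, it also helps to bound the wait explicitly and check for errors; a sketch with illustrative timeout values:

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://example.com/test.txt');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);  // max seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 15);        // max seconds for the whole transfer
$output = curl_exec($ch);
if ($output === false) {
    error_log('curl error: ' . curl_error($ch)); // log why the fetch failed
}
curl_close($ch);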

Solution 3 - PHP

Sometimes it's because DNS resolution is too slow on your server. Try this:

Replace

echo file_get_contents('http://www.google.com');

with

$context = stream_context_create(array('http' => array('header' => "Host: www.google.com\r\n")));
echo file_get_contents('http://74.125.71.103', false, $context);
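
The same idea without hard-coding an IP address (a minimal sketch; note that gethostbyname() returns the unmodified hostname on failure, and this trick does not work for HTTPS, where the certificate must match the hostname):

$host = 'www.google.com';
$ip   = gethostbyname($host);  // resolve the name once, up front
if ($ip === $host) {
    die("DNS lookup failed for {$host}");
}
// Request by IP, but keep the Host header so name-based virtual hosting still works.
$context = stream_context_create(array(
    'http' => array('header' => "Host: {$host}\r\n"),
));
echo file_get_contents("http://{$ip}/", false, $context);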

Solution 4 - PHP

I had the same issue.

The only thing that worked for me was setting a timeout in the $options array:

$options = array(
    'http' => array(
        'header'  => implode("\r\n", $headers), // $headers is assumed to be defined elsewhere
        'method'  => 'POST',
        'content' => '',
        'timeout' => 0.5,
    ),
);
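
For context, a sketch of how that options array would be used; $headers and the URL are placeholders:

$headers = array('Content-Type: application/x-www-form-urlencoded');
$options = array(
    'http' => array(
        'header'  => implode("\r\n", $headers),
        'method'  => 'POST',
        'content' => '',
        'timeout' => 0.5,  // seconds; tune to the slowest response you can tolerate
    ),
);
$context  = stream_context_create($options);
$response = file_get_contents('http://example.com/endpoint', false, $context);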

Solution 5 - PHP

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
$string = file_get_contents("http://localhost/testcall/request.php", false, $context);

Time: 50976 ms (average time over 5 attempts)

$ch = curl_init();
$timeout = 5;
curl_setopt($ch, CURLOPT_URL, "http://localhost/testcall/request.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
echo $data = curl_exec($ch);
curl_close($ch);

Time: 46679 ms (average time over 5 attempts)

Note: request.php is used to fetch some data from a MySQL database.

Solution 6 - PHP

Can you try fetching that URL, on the server, from the command line? curl or wget come to mind. If those retrieve the URL at normal speed, then it's not a network problem and is most likely something in the Apache/PHP setup.

Solution 7 - PHP

I had a large amount of data returned by an API, and reading it with file_get_contents took around 60 seconds. Using KrisWebDev's solution it took around 25 seconds. (Note that for https:// URLs the context options still belong under the 'http' key, not 'https', and the header string needs double quotes so \r\n is a real CRLF.)

$context = stream_context_create(array('http' => array('header' => "Connection: close\r\n")));
file_get_contents($url, false, $context);
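
A slightly fuller sketch for an HTTPS endpoint ($url is a placeholder); note that the 'ssl' context options are separate from the 'http' ones:

$url = 'https://api.example.com/data';  // placeholder endpoint
$context = stream_context_create(array(
    'http' => array(
        'header'  => "Connection: close\r\n",
        'timeout' => 30,                 // illustrative read timeout
    ),
    'ssl' => array(
        'verify_peer' => true,           // keep certificate verification enabled
    ),
));
$data = file_get_contents($url, false, $context);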

Solution 8 - PHP

What I would also consider with cURL is that you can "thread" the requests with curl_multi. This has helped me immensely, as I do not have access to a version of PHP that allows threading at the moment.

For example, I was getting 7 images from a remote server using file_get_contents and it was taking 2-5 seconds per request. That alone added 30 seconds or so to the process while the user waited for the PDF to be generated.

Running the requests in parallel literally reduced the total time to about that of one image. As another example, I now verify 36 URLs in the time it previously took to do one. I think you get the point. :-)

    $timeout = 30;
    $retTxfr = 1;
    $user = '';
    $pass = '';

    $master = curl_multi_init();
    $node_count = count($curlList);
    $keys = array("url");

    for ($i = 0; $i < $node_count; $i++) {
        foreach ($keys as $key) {
            if (empty($curlList[$i][$key])) continue;
            $ch[$i][$key] = curl_init($curlList[$i][$key]);
            curl_setopt($ch[$i][$key], CURLOPT_TIMEOUT, $timeout);        // -- timeout after X seconds
            curl_setopt($ch[$i][$key], CURLOPT_RETURNTRANSFER, $retTxfr); // return the body instead of printing it
            curl_setopt($ch[$i][$key], CURLOPT_HTTPAUTH, CURLAUTH_ANY);
            curl_setopt($ch[$i][$key], CURLOPT_USERPWD, "{$user}:{$pass}");
            curl_multi_add_handle($master, $ch[$i][$key]);
        }
    }

    // -- run all requests at once, finish when done or timeout met --
    // curl_multi_select() waits for socket activity instead of busy-looping
    do {
        curl_multi_exec($master, $running);
        curl_multi_select($master);
    } while ($running > 0);

Then collect and check the results (curl_multi_getcontent() retrieves the body of a handle that used CURLOPT_RETURNTRANSFER):

    for ($i = 0; $i < $node_count; $i++) {
        foreach ($keys as $key) {
            if (empty($ch[$i][$key])) continue;
            $results[$i][$key] = curl_multi_getcontent($ch[$i][$key]);
            if ((int)curl_getinfo($ch[$i][$key], CURLINFO_HTTP_CODE) > 399 || empty($results[$i][$key])) {
                unset($results[$i][$key]);
            } else {
                $results[$i]["options"] = $curlList[$i]["options"];
            }
            curl_multi_remove_handle($master, $ch[$i][$key]);
            curl_close($ch[$i][$key]);
        }
    }

Then close the multi handle:

    curl_multi_close($master);

Solution 9 - PHP

I know this is an old question, but I found it today and the earlier answers didn't work for me. I didn't see anyone mention that the server's maximum number of connections per IP may be set to 1. If that's the case, and your API request in turn requests a full URL on the same server, the second request blocks because the first one is still holding the only available connection. That's why loading directly from disk works. For me, this fixed the problem:

if (strpos($file->url, env('APP_URL')) === 0) {
    // The URL points at this same application: strip the base URL and
    // read the file via a local path instead of making a second HTTP request.
    $url = substr($file->url, strlen(env('APP_URL')));
} else {
    $url = $file->url;
}
return file_get_contents($url);

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | ecurbh | View Question on Stackoverflow
Solution 1 - PHP | KrisWebDev | View Answer on Stackoverflow
Solution 2 - PHP | Jim | View Answer on Stackoverflow
Solution 3 - PHP | diyism | View Answer on Stackoverflow
Solution 4 - PHP | Walid Ammar | View Answer on Stackoverflow
Solution 5 - PHP | Amito | View Answer on Stackoverflow
Solution 6 - PHP | Marc B | View Answer on Stackoverflow
Solution 7 - PHP | Elyor | View Answer on Stackoverflow
Solution 8 - PHP | Mike Q | View Answer on Stackoverflow
Solution 9 - PHP | ElChupacabra | View Answer on Stackoverflow