Well that's what's causing your slowdown, since it is not being disabled.
Now as to why your requests are failing, that might be a host issue or security issue.
Does this new file remove the slowdown? It should set a 100ms timeout for requests, but I've seen hosts ignore that since it's a system operation. The no-recheck option should still help if you can get rid of those stupid plugin checks; I might have to ask him wtf that is about.
2013-05-06, 06:46:25 (This post was last modified: 2013-05-06, 06:59:13 by hd.)
Sorry, I was unclear: i18n _was_ already disabled, but it is still there. So I guess disabling is not enough if this plugin is causing the problems. I'll try to remove it completely.
The new file does remove the slowdown, sorry, forgot to mention that. New load time for plugins page:
Code:
 48.57 ms  Scheduling request
  0.08 ms  Writing request header
  0.13 ms  Writing request body
614.58 ms  Waiting for response
  0.12 ms  Reading response header
286.23 ms  Reading response body
 33.61 ms  Processing response
983.32 ms  Duration
still not superfast but ok.
Well, removing the i18n plugin did not help.
I deleted the following three subfolders in plugins:
i18n_base
i18n_common
i18n_navigation
plus the files:
i18n_base.php
i18n_navigation.php
Additionally i removed this file from \data\other\:
i18n_menu_cache.xml
This should be everything, right?
Without your patched file I still get nearly the same load time (plugins page about 22 s instead of 30 s before), no matter whether GSNOVERCHECK was enabled or not.
Yeah, that should be it. I am not sure exactly what could be doing it, but in the debug log you can clearly see something requesting the plugin API. Only the plugins page should do that. Do you have any updater plugins?
I'll send a new file to fix those errors, I forgot to add the resources and mine did not complain.
I use the updater plugin for updating plugins. I disabled it now; it makes no big difference (to me...).
I'll wait for the new file to test (probably tomorrow, I'm tired...).
Maybe you can do some tests hitting another url on another host or something, narrow it down, or ask host.
I think you can modify the api url in the admin/configuration file.
3.2.2 will have plugin API calls enabled only on the plugins tab, to help alleviate the symptoms this causes and also speed up pages a bit. I'll also add in the new timeouts that were scheduled for 3.3, probably.
2013-05-07, 05:46:44 (This post was last modified: 2013-05-07, 06:38:39 by hd.)
Many thanks for this debug tool and the brilliant support!
Well, now I have one more question.
I tried to interpret the debug info and stumbled over the fact that the shown duration is much shorter than the configured timeout (I guess "API timeout: 100" means timeout = 100 ms). For example, the first call shows "Duration: 0,00109". Since the duration is in seconds with 5 decimal places, this would be about 1 ms. Of course this is only true if the system runs with a German locale, which would mean a comma is used as the decimal separator, not a point as on English systems. Similarly short times show for all the calls in the debug info.
If this is true, it would mean that a timeout gets triggered even though the actual execution time was much shorter, which would mean something else is (very) wrong. But maybe I just got the info wrong; I'm quite a noob when it comes to all this web server stuff (though being an experienced embedded programmer helps a bit).
btw, if i change the php locale to en_US, the numbers use a decimal point as expected.
But still the duration is very short:
Duration: 0.00113
These times look more realistic, just a bit more than 100ms.
But even if there's no bug in the code, it doesn't get any data either.
So in the end neither method works.
Looks like cURL is failing before timeout actually happens and returns the wrong error code.
I'll test a bit more and ask my hoster about this.
Strange thing: it's one of the biggest hosters in Germany; I cannot imagine such a problem going unnoticed.
p.s. don't get confused by the different plugins in this debug report - i did a test installation on my hoster to be able to try things without disturbing the actual website.
2013-05-09, 21:12:04 (This post was last modified: 2013-05-09, 21:17:11 by hd.)
Well, I researched a bit about cURL, played around, and these are the results:
1.
There's an explanation for the very short timeout times in cURL: "If you want cURL to timeout in less than one second, you can use CURLOPT_TIMEOUT_MS, although there is a bug/"feature" on "Unix-like systems" that causes libcurl to timeout immediately if the value is < 1000 ms with the error "cURL Error (28): Timeout was reached"." You can read more in Steve Kamerman's post here: http://www.php.net/manual/en/function.curl-setopt.php. The bottom line is to use CURLOPT_NOSIGNAL on Unix systems. I think you should integrate this option in GS:
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);
4. Looks like a 100 ms timeout is too short. Maybe it would be best to make this option configurable via gsconfig.
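Putting the two findings together, the request setup could look roughly like this (a minimal sketch, not actual GS code; the timeout value and the `$apiurl` variable are assumptions):

```php
<?php
// Sketch only: a sub-second cURL timeout that also works on
// Unix-like systems thanks to CURLOPT_NOSIGNAL.
$timeout_ms = 250; // could come from gsconfig instead of being hard-coded

$ch = curl_init($apiurl); // $apiurl = whatever endpoint GS checks
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);             // avoid the <1000 ms instant-timeout bug
curl_setopt($ch, CURLOPT_TIMEOUT_MS, $timeout_ms);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT_MS, $timeout_ms);
$data = curl_exec($ch);
if ($data === false) {
    error_log('cURL error ' . curl_errno($ch) . ': ' . curl_error($ch));
}
curl_close($ch);
```

Note that CURLOPT_NOSIGNAL disables signals for the whole transfer, which is what allows the millisecond timeouts to work on Unix-like builds.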
Even if i only try to get the GS website i get this:
Yeah, I noticed that yesterday; that default was for testing locally.
This actually comes from the configuration file in GS (or will); 200-300 ms is ideal.
This file is mostly from the development branch of GS, which is not even near an alpha release.
Thank you so much for testing further.
Brilliant finds.
Those were my next tests for today.
I wonder why it is having issues with the domain. I am looking into if we have any security that uses domain blacklists. Or ddos filter that might be returning nothing.
Doing a quick check for any blacklisting of the get-simple.info domain and IP, I did not find anything.
I've asked my hoster Strato now for support. Let's see if they come up with an answer.
(2013-05-10, 06:40:59)hd Wrote: Doing a quick check for any blacklisting for the get-simple.info domain and IP i did not found anything.
I've asked my hoster Strato now for support. Let's see if they come up with an answer.
Hey!
So as it's clear that we are at the same hoster, with the same problems, maybe we can now try to find a solution in this thread. So we don't have to switch back and forth.
I just posted a new post to my thread. But that was the last one there. So I am just "hijacking" your post with my problem HD. :-)
2013-05-14, 07:32:32 (This post was last modified: 2013-05-14, 07:40:42 by hd.)
via info.php i get this:
cURL Information 7.21.1
i guess this is the version number you're interested in?
btw, after 3 days i got a reply from Strato - just to find out that they didn't understand the problem (or are not willing to...)
I'll try again for the next days...
I tried this
Code:
curl_setopt($ch, CURLOPT_DNS_CACHE_TIMEOUT, 10);
but still get this reply:
Code:
Curl Data:
cURL error number: 7
cURL error: couldn't connect to host
Duration: 0,90062
Hi AD,
you would have to insert the line as above into the get_api_details function in the template_functions.php that Shawn posted some days ago.
Curl Data:
cURL error number: 7
cURL error: couldn't connect to host
Verbose information:
* About to connect() to get-simple.info port 80 (#0)
*   Trying 72.10.36.125...
* Timeout
* couldn't connect to host
* Closing connection #0
Ok, this sheds some new light onto the problem. If i run it on my WAMP server at home, it says this:
* Connected to get-simple.info (72.10.36.125) port 80 (#0)
Obviously a different IP address. DNS problem? Is this related to the recent server move?
Is there anything i can do to improve this?
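A quick way to check whether the two machines resolve the domain differently would be something like this (a standalone debugging snippet, not part of GS; run it on the Strato host and on the local WAMP box and compare):

```php
<?php
// DNS sanity check: which IP does this machine resolve the API host to?
$host = 'get-simple.info';
$ip   = gethostbyname($host); // returns the hostname unchanged on failure
echo $host . ' resolves to ' . $ip . PHP_EOL;

$all = gethostbynamel($host); // all A records, or false on failure
print_r($all === false ? array() : $all);
```

If the shared host returns a stale or different A record than your home machine, that would point at a DNS caching problem on the hoster's resolvers rather than at GS.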
btw: many thanks again for your support!
btw, regarding the earlier question why "GSNOVERCHECK" did not help: I checked the admin PHP scripts and found out that the get_api_details function that does the version check is called in various places: header.php, health-check.php, plugin_functions.php, install.php and plugin.php. But GSNOVERCHECK is only evaluated in header.php; all the other scripts call it without checking...
Maybe it would be better to check it in get_api_details directly?
Hope this helps a bit, though i guess you found and maybe even corrected this already.
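That suggestion would amount to a guard at the top of the function, something like this (a sketch only; the real parameters of get_api_details and the exact semantics of the GSNOVERCHECK flag in GS are assumptions here):

```php
<?php
// Sketch: guard the remote call inside get_api_details itself,
// so every caller respects GSNOVERCHECK automatically.
function get_api_details(/* ... original parameters ... */) {
    if (defined('GSNOVERCHECK') && GSNOVERCHECK) {
        return null; // skip the remote version check entirely
    }
    // ... original cURL request follows here ...
}
```

With the check centralized like this, header.php, health-check.php, plugin_functions.php, install.php and plugin.php would all honor the flag without each needing its own test.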
that file has 2 new options,
CURLOPT_HEADER TRUE to include the header in the output.
CURLINFO_HEADER_OUT TRUE to track the handle's request string. Available since PHP 5.1.3. The CURLINFO_ prefix is intentional.
It should show the actual request header in the log.
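Enabling those two options and reading the captured request header back could look like this (a sketch; `$ch` stands for the already-initialized cURL handle in the debug file):

```php
<?php
// Sketch: capture both the response header and the outgoing request
// header for the debug log.
curl_setopt($ch, CURLOPT_HEADER, true);       // include response header in output
curl_setopt($ch, CURLINFO_HEADER_OUT, true);  // record the request string
$data = curl_exec($ch);

// After the request, the header cURL actually sent is available via:
$sent = curl_getinfo($ch, CURLINFO_HEADER_OUT);
error_log("Request header sent:\n" . $sent);
```

The `curl_getinfo` call is what makes the odd CURLINFO_ prefix useful: you set it with curl_setopt but read it back with curl_getinfo after the transfer.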
A virtual host is a web server configuration that lets you run multiple hosts on the same server, and hence on the same IP. This is basically how all shared hosts work.
So you can imagine: if I use the IP address in the browser as the hostname, the server won't actually know what you are asking for. It says "I have 50 domains, which one do you want?". It figures this out by reading the Host header that the browser normally sends when you use the domain name to reach a web server. This might somehow be missing on your host, so the server just sends back nothing, although a proper HTTP status code would be nice.
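If a missing Host header is the suspect, you can force one explicitly as a test (a hypothetical experiment, not something GS does; the IP is the one from the verbose log above):

```php
<?php
// Sketch: request by IP against a name-based virtual host, but
// supply the Host header manually so the server can pick the site.
$ch = curl_init('http://72.10.36.125/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Host: get-simple.info'));
$body = curl_exec($ch);
curl_close($ch);
```

If this returns the site while a plain IP request returns nothing, the virtual-host/Host-header theory would be confirmed.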