Hi Vikram,
you mentioned a 404. I experienced a similar issue with Twitter, and the cause might be the same, so I'll describe my "Twitter case" below:
I have a plug-in that fetches the XML feed of a specific Twitter user. Once in a while I had the exact same problem: I got a 404, but could access the XML directly from my computer. It turned out that they limit the number of requests to something like 30 requests per hour per IP.
That was the case for my site: while developing I hit refresh very often, so the CMS requested the feed more often than the quota allows, and I received a 404.
To get around this I had to implement caching in the plug-in: it downloads the feed and keeps it for a configurable time before requesting the file again. I do not know how the RSS2HTML module works, but you can probably use the function I wrote (with some help from the forum):
Code: Select all
function get_xml_feed($cache_location, $external_xml, $cache_ttl) {
    // Get time difference in seconds (= cache age) between the cached file's last update and now
    // (@ suppresses the warning when the cache file does not exist yet)
    $timedif = @(time() - filemtime($cache_location));
    // Create cache info object and store cache age and time-to-live [= ttl]
    $cacheinfo = new StdClass;
    $cacheinfo->cache_age = $timedif;
    $cacheinfo->ttl = $cache_ttl;
    // Caching mechanics:
    if (file_exists($cache_location) && $timedif < $cache_ttl) {
        // Cache is younger than the ttl: hand the stored feed copy to SimpleXML for processing
        // and set the cache info status (data coming from cache)
        $get_xml_feed = simplexml_load_file($cache_location);
        $cacheinfo->status = "delivering cached data from: " . $cache_location;
    } else {
        // Cache is too old (or missing):
        // check if the external file is available
        if ($feedbuffer = @file($external_xml)) {
            if ($f = @fopen($cache_location, 'w')) {
                // External file is available: write the live URL data into the storage file
                for ($i = 0; $i < count($feedbuffer); $i++) {
                    fwrite($f, $feedbuffer[$i]);
                }
                fclose($f);
                // Close file and hand the stored/updated feed copy to SimpleXML for processing
                $get_xml_feed = simplexml_load_file($cache_location);
                $cacheinfo->status = "cache outdated - cache reloaded";
            } else {
                // Could not write the cache file: parse the freshly downloaded data directly
                $get_xml_feed = simplexml_load_string(implode('', $feedbuffer));
                $cacheinfo->status = "cache outdated - could not write cache file";
            }
        } else {
            // Cache is outdated, but we cannot renew it, so keep it and return the old data
            $get_xml_feed = @simplexml_load_file($cache_location);
            $cacheinfo->status = "cache outdated - could not reload file, using old cache data";
        }
    }
    return $get_xml_feed;
}
The function takes three parameters:
$cache_location = where you want to store the XML on your system
$external_xml = the URL of the XML feed you want to fetch
$cache_ttl = how long the data should be cached, in seconds
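In case the freshness check is unclear: it simply compares the cache file's modification time against the ttl. Here is a minimal, self-contained sketch of that check (the file path and ttl value are just made up for the demo):

```php
<?php
// Demo of the ttl check the function relies on (hypothetical path and ttl)
$cache_location = sys_get_temp_dir() . '/feed_cache_demo.xml';
$cache_ttl = 600; // keep the cached feed for 10 minutes

// Pretend a feed was just cached
file_put_contents($cache_location, '<rss><channel><title>demo</title></channel></rss>');

// Cache age = seconds since the file was last written
$cache_age = time() - filemtime($cache_location);

echo ($cache_age < $cache_ttl) ? "fresh\n" : "stale\n"; // just written, so "fresh"
```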
The function returns an XML object that you can conveniently process with PHP's XML functions. However, I do not know how that could be implemented in your case, but I guess it shouldn't be too complicated.
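For example, processing such an object with SimpleXML looks like this (using an inline sample feed here instead of the real cache file):

```php
<?php
// Iterate over the items of a (sample) RSS feed with SimpleXML
$feed = simplexml_load_string(
    '<rss><channel>' .
    '<item><title>First post</title></item>' .
    '<item><title>Second post</title></item>' .
    '</channel></rss>'
);
foreach ($feed->channel->item as $item) {
    echo $item->title, "\n";
}
// prints:
// First post
// Second post
```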
So if hitting the server too often is the reason for your 404s, this might be a solution. Have you checked whether the RSS2HTML module comes with its own caching?
Best
Nils
_____
edit: You could check the FeedBurner help or google for the keywords "quota" and "404", as well as things like "maximum hits" with FeedBurner, to see if there is a quota on FeedBurner requests per hour and site.