Yeah, you're going to need to build a script that runs on your main page, simply pulls and reads the meta title of every page in that directory, and lists them. That's a relatively simple 10-line piece of code. You can do it in PHP, Perl, Python, or even JavaScript.
if ($handle = opendir('/path/to/files')) {
    while (false !== ($entry = readdir($handle))) {
        // skip the . and .. directory entries
        if ($entry === '.' || $entry === '..') continue;
        // fetch each page and grab whatever sits between the <title> tags
        $scrape = file_get_contents('http://www.myurl.com/path/to/files/'.$entry);
        $title = get_explode('<title>', '</title>', $scrape);
        echo "<a href='http://www.myurl.com/path/to/files/".$entry."'>".$title."</a><br />";
    }
    closedir($handle);
}

function get_explode($first, $last, $text) {
    // crude "grab the text between two markers" helper
    $text_array = explode($first, $text);
    $text_array2 = @explode($last, $text_array[1]);
    return $text_array2[0];
}
In before coding purists tell me about the wonders of XPath, cURL & all the time-saving benefits of using ' vs ".
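For the record, here's roughly what that looks like. This is only a sketch reusing the same hypothetical /path/to/files layout from above, with cURL doing the fetching and DOMXPath pulling the title; fetch_title is a helper name I'm making up:

function fetch_title($url) {
    // fetch the page with cURL instead of file_get_contents
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $html = curl_exec($ch);
    curl_close($ch);

    // let DOMDocument/DOMXPath dig out the <title> instead of explode()
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // silence warnings from sloppy markup
    $xpath = new DOMXPath($doc);
    $nodes = $xpath->query('//title');
    return $nodes->length ? trim($nodes->item(0)->nodeValue) : $url;
}

if ($handle = opendir('/path/to/files')) {
    while (false !== ($entry = readdir($handle))) {
        if ($entry === '.' || $entry === '..') continue;
        $url = 'http://www.myurl.com/path/to/files/'.$entry;
        echo "<a href='".$url."'>".fetch_title($url)."</a><br />";
    }
    closedir($handle);
}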
There's this thing called "dynamic scripting on websites": the idea in this case is that the actual listing gets generated every time the page is hit. Less work, always up to date, it's the shit.
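A minimal sketch of that, assuming you save the directory-listing snippet above as listing.php (a file name I'm inventing) next to your main page; the list is rebuilt on every request, so new files show up without you editing anything:

<!-- index.php (hypothetical main page) -->
<html>
  <body>
    <h1>Files</h1>
    <?php include 'listing.php'; // the listing is regenerated on every page hit ?>
  </body>
</html>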