content / index builder
#2
Scooby Senior
iTrader: (1)
Join Date: Nov 2000
Location: Wildberg, Germany/Reading, UK
Posts: 9,706
Likes: 0
Received 73 Likes on 54 Posts
Originally Posted by ChristianR
Basically we have hundreds of folders with html content.
What I need is a program that scans all these folders for html files and indexes them into a html file and creates links to them.
Anyone know of any that exist ?
Let me know if you want it to play with.
Steve
#3
Scooby Senior
On second thoughts it might not be what you want; I wrote it to scan through a whole directory of tar files and output the contents to a webpage.
Basically it lists all the tar files and prints each file name as a link; when you click on it, it displays the contents of the tar file. You might be able to doctor it for your needs, though, if you have time to play.
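The tar-index idea Steve describes can be sketched roughly as below. This is not his script (which was presumably Perl); it is a minimal Python sketch under assumed file names (`tar_index.html`, a `.contents.html` page per archive) and layout:

```python
import html
import os
import tarfile

def build_tar_index(directory, out_name="tar_index.html"):
    """Write an HTML page that links each .tar file in `directory`
    to a generated page listing that archive's contents.
    Sketch only: page names and markup are assumptions."""
    lines = ["<html><body><ul>"]
    for name in sorted(os.listdir(directory)):
        if not name.endswith(".tar"):
            continue
        with tarfile.open(os.path.join(directory, name)) as tf:
            members = tf.getnames()
        # One "contents" page per archive, linked from the index.
        listing = name + ".contents.html"
        with open(os.path.join(directory, listing), "w") as f:
            f.write("<html><body><pre>"
                    + html.escape("\n".join(members))
                    + "</pre></body></html>")
        lines.append('<li><a href="%s">%s</a></li>'
                     % (listing, html.escape(name)))
    lines.append("</ul></body></html>")
    with open(os.path.join(directory, out_name), "w") as f:
        f.write("\n".join(lines))
```

Doctoring it for the original question would mean linking the HTML files directly instead of generating per-archive listing pages.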
#5
Scooby Regular
If you have ActiveState Perl installed, this will do most of what you want:
perl -MFile::Find -e 'find(\&wanted, "top level dir"); sub wanted { /\.html?$/ && print "<a href=\"" . $File::Find::name . "\">" . $File::Find::name . "</a>\n"; }'
Forgot to say: redirect the output somewhere.
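For anyone without Perl, the same walk-and-link idea can be sketched in Python; like the one-liner, it matches `.htm`/`.html` files (case-insensitively here, which is a slight difference) and emits one `<a>` tag per file:

```python
import os

def index_html_files(top):
    """Walk `top` and yield an <a> link for every .html/.htm file,
    mirroring the File::Find one-liner above."""
    for dirpath, _dirnames, filenames in os.walk(top):
        for name in sorted(filenames):
            if name.lower().endswith((".html", ".htm")):
                path = os.path.join(dirpath, name)
                yield '<a href="%s">%s</a>' % (path, path)
```

As with the one-liner, you would redirect the output into an index page, e.g. write each link into an `index.html` file.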