This should be relatively easy to do with iMacros. If all the pages are within the same password-protected account, you can just log in normally through the browser and then run your script. Even if they are not behind the same login and password, it is possible to automate the login process, although that adds complexity.
iMacros returns its results in CSV format. If it is difficult to get iMacros to select the precise HTML element you want, I'll often extract a larger part of the page and then pull out the exact string I need with a Mid() function in Excel. The standalone full version of iMacros has a few helpful features that are not present in the Firefox plugin (at least in the GUI), and you can use the full program free as a 30-day trial.
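If you'd rather script that post-processing than do it in Excel, the same Mid()-style substring trick is easy to mirror in Python. This is just a minimal sketch; the sample cell text and the offsets are made up for illustration:

```python
def mid(text, start, length):
    """Mimic Excel's MID(): 1-based start index, fixed length."""
    return text[start - 1:start - 1 + length]

# Suppose iMacros extracted this whole cell from the page:
cell = "Price: $19.99 USD"

# Pull out just the numeric price, as MID(cell, 9, 5) would in Excel
price = mid(cell, 9, 5)
print(price)  # -> 19.99
```

The 1-based `start` keeps the arguments identical to the Excel formula, so you can copy offsets back and forth between the two.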
Alternatively, you can use wget to download all the pages and then work with them locally. It can retrieve pages from a list of URLs, and it also supports logging in, although admittedly I haven't tried that. Once the files are local, you can process them with iMacros or even a macro-capable text editor such as Notepad++.
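A short Python script is another way to process the downloaded files instead of an editor macro. A minimal sketch, assuming the pages were saved to a local `pages/` directory and that each page marks the value you want with a `span` of class `price` (both the directory layout and the class name are hypothetical):

```python
import re

# Hypothetical target: each saved page holds the value inside a span.
# The class name and the sample HTML below are made up for illustration.
PRICE_RE = re.compile(r'<span class="price">([^<]+)</span>')

def extract_price(html):
    """Return the first price found in a page's HTML, or None."""
    match = PRICE_RE.search(html)
    return match.group(1) if match else None

# In practice you would loop over the files wget saved, e.g.:
#   for path in glob.glob("pages/*.html"):
#       print(path, extract_price(open(path, encoding="utf-8").read()))
sample = '<p>Widget</p><span class="price">$19.99</span>'
print(extract_price(sample))  # -> $19.99
```

For anything beyond a simple pattern like this, an HTML parser is more robust than a regex, but for grabbing one known string from a batch of saved pages this is usually enough.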
A more powerful tool is ScraperWiki. That, however, requires some programming experience.