What an incredible product, but.....

I love this idea and I would jump on it immediately but for one deal-killer. When the program stores a webpage, it doesn’t store the URL, which for a researcher makes it pretty close to useless, since the time involved in doing that manually is prohibitive. This just seems to be a problem that has never been worked out to my satisfaction in any operating system.

Here is the crux. Even if the program is designed to store the URL as part of the database, like a tag for example, that meta information will be lost if I ever decide to switch to another archival method. Since my current database of webpages is about 5 GB, that is not trivial. So, I need a program that will somehow “stamp” the URL/Title/Date onto the page itself. Currently, I save webpages to DevonThink, which has a script for doing just that. Unfortunately, you have to save the pages in RTF, which can be a problem. It would be much better if this could be done with HTML. When I used to work in Windows (gasp), there was a neat little toolbar that did just that, but the dev eventually went under and the program stopped working with SP2.
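For what it’s worth, the kind of “stamp” I mean could be as simple as injecting a small header into the saved HTML file. A rough sketch of the idea (the function name and styling here are just made up for illustration, not from any actual tool):

```python
from datetime import date

def stamp_html(html, url, title, saved=None):
    """Prepend a URL/title/date header to a saved HTML page, so the
    metadata travels with the file no matter which archival tool
    reads it later. Hypothetical example, not any real product's API."""
    saved = saved or date.today().isoformat()
    stamp = (
        '<div style="border-bottom:1px solid #ccc;font-size:smaller;">'
        f'Source: <a href="{url}">{url}</a> | {title} | saved {saved}'
        "</div>"
    )
    marker = "<body>"
    if marker in html:
        # Insert the stamp right after the opening <body> tag.
        return html.replace(marker, marker + stamp, 1)
    # Fall back to prepending if no <body> tag is found.
    return stamp + html

page = "<html><body><p>Hello</p></body></html>"
print(stamp_html(page, "https://example.com", "Example", "2008-01-01"))
```

Because the stamp lives in the page markup itself rather than in a database field, it survives any future migration.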

So, bottom line, if EagleFiler could ever save webpages with the URL/date/title stored on the actual page, I would sing its praises to anybody who would listen. As it stands, no can do.

EagleFiler does store the original URL. You can copy it using Edit > Copy Source URL, or choose Record > Open Source URL to open the live page in your browser.

And the metadata won’t be lost if you switch, because the URL is stored both in the database and inside the Web archive file itself.

Could you elaborate on what you want for the date and the title? Currently, the title of the Web page is stored in the Web archive file, and the date is stored in the creation date of the Web archive file.

Along those same lines, would it be possible to store the source link when using the “Import Text” service?

No, because when you use the Import Text service (or when you drag and drop text) the only information EagleFiler receives is the text itself. If you use the Import URL service, it will create a Web archive and store the URL.

Ok, not a big deal. I didn’t realize the service was the same as dragging and dropping text; I thought it would have been possible for the service to be extended to include the URL.

Thanks.

Starting with EagleFiler 1.1.1, EagleFiler does receive the URL when dragging text from a Web page (in a WebKit-based browser), and it stores this as the record’s source URL. This is not the case for the Import service, due to a bug in Mac OS X.

Are you sure it is a Mac OS X bug? If I use the Services menu with Together, the URL is stored in the comment field.

Yes, I reported it more than a year ago, and Apple has confirmed it. The bug is that if EagleFiler asks to receive the full Web information from the service, it is no longer possible to drag and drop text onto the Dock icon. This bug only affects Mac OS X 10.4. Since most EagleFiler users now have Leopard, it’s probably worth it now to flip to the other side of the tradeoff.

In EagleFiler 1.4.1, this now works when using the Import service. The OS bug is still there, which means that EagleFiler no longer supports dragging and dropping text onto the Dock icon under Mac OS X 10.4.