I am having a great deal of trouble trying to import bookmarks into EagleFiler.
I admit there are a lot (almost 8000), but my understanding is that EF should be able to handle this.
I also see some of the same problems with imports as small as 10 or 12 bookmarks.
What I do:
I have the bookmarks broken up into three folders (with sub-hierarchies) of about 2500-3000 each.
I simply drag and drop a folder from Safari bookmarks into “records” in the EF window.
Is there a different or better way to import bookmarks as webarchives?
What happens:
In the activity window I see numerous imports happen.
In the Error window I begin to see errors related to “timeouts”. Occasionally I also get a “TaskFailedException 10” error.
Out of about 8000 bookmarks, I got about 2000 timeout errors.
Why I don’t think the timeouts are legit:
If I select the timed-out URLs in the error window, they usually come up fine.
Also, I pre-screened all the bookmarks with a program called BookDog, which does a good job of verifying that every URL works.
Why I don’t think it’s my network/computer but related specifically to EagleFiler:
a. I have run ISP speed tests at about the same time these errors occur and found my cable broadband connection to be 6kB/s down and 360kB/s up.
b. I have also run BookDog, which verifies every URL. I realize a server may be down at different times of day, but I don’t think that accounts for the number of timeouts I’m seeing. Could it be that a server is trying to respond but can’t get a message through because so many downloads are occurring at the same time? Or because EagleFiler is busy processing those downloads?
c. I also ran some of these tests on a different ISP/network (at a cafe, over a wireless connection). The results were similar.
Could there be a need for a throttling rate? Other tools that queue many requests like this seem to provide one; BookDog does, for example, when it verifies URLs.
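To illustrate the kind of throttling I have in mind, here is a rough Python sketch (the limit of 2 simultaneous downloads is arbitrary, just to show the idea; this is not how EagleFiler actually works internally):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.error import URLError
from urllib.request import urlopen

def fetch(url, timeout=30):
    """Fetch one URL; return (url, succeeded, detail)."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return (url, True, resp.status)
    except (URLError, OSError) as exc:
        return (url, False, str(exc))

def throttled_fetch(urls, max_concurrent=2):
    """Download URLs with at most max_concurrent requests in flight at once."""
    with ThreadPoolExecutor(max_workers=max_concurrent) as pool:
        return list(pool.map(fetch, urls))
```

Something like this in the import queue, with a user-settable limit, might cut down on the timeouts.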
Have other people had similar problems with timeouts? Can you suggest any other ways to troubleshoot this? How can I isolate the problem? If it turns out to be something outside of EagleFiler, is there a utility you might suggest to debug it?
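For my own troubleshooting, I’ve been thinking a test like this Python sketch might help isolate things: fetch the failing URLs one at a time, with no concurrency, and see whether the timeouts still happen (the timeout and pause values are just guesses):

```python
import time
from urllib.error import URLError
from urllib.request import urlopen

def check_sequentially(urls, timeout=60, pause=1.0):
    """Fetch each URL one at a time, pausing between requests;
    return the ones that still fail, with the error and elapsed seconds."""
    failures = []
    for url in urls:
        start = time.monotonic()
        try:
            with urlopen(url, timeout=timeout):
                pass
        except (URLError, OSError) as exc:
            failures.append((url, str(exc), round(time.monotonic() - start, 1)))
        time.sleep(pause)
    return failures
```

If the same URLs succeed when fetched sequentially but time out during a bulk import, that would point at concurrency rather than at my network or the servers.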
Workaround Option 1. I thought perhaps I could import the same folder again, and EagleFiler would simply warn me about duplicates while it went ahead and loaded many, if not all, of the ones that had previously failed with timeouts. **This didn’t work**, because apparently duplicate checking only works on text files, not on webarchives or URLs. It seems like it would be very useful to **allow an option to check for duplicate URLs (not content)** on import and skip them. Why load a page again if I already have it? I understand the user would have to accept that two identical URLs may generate different content depending on when they were accessed, for example. [Update] Perhaps duplicate checking does work on webarchives in certain cases, or only when importing from OmniWeb (see Workaround Option 4 below)?
Workaround Option 2. Copy the error log into a TextEdit file, then manually open each of the 2000 failed URLs in Safari and capture it with F1. This is not practical. Maybe it could be scripted, but then there might be the same throttling issue, and there are other problems besides:
the error log doesn’t include the folder hierarchy the bookmark belonged to, so even if I could get the bookmark into EF, I’d lose its place.
Another problem with the error window (for me) is that the only way to see what caused an error is to click on it; the status bar then shows “timeout”, “host not found”, etc. But to save this information I have to select all in the window and copy it to a TextEdit document, which loses all of the “reason codes”. All I get is “Could not Import URL: http://www.apple.com”.
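If that copied text is all I have to work with, pulling the URLs back out for a scripted re-try might look like this Python sketch (the line format is just what I see on my clipboard; I’m assuming it’s consistent):

```python
import re

# Format of the lines copied from the error window (my assumption, based
# on what ends up on the clipboard): "Could not Import URL: <url>"
ERROR_LINE = re.compile(r"Could not Import URL:\s*(\S+)")

def failed_urls(error_text):
    """Extract the failed URLs from text copied out of the error window."""
    return [m.group(1) for m in ERROR_LINE.finditer(error_text)]
```

The resulting list could then be fed back through a throttled importer, but the folder-hierarchy problem above would remain.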
Workaround Option 3. Provide a way to go back and re-try failed imports. Requires EagleFiler enhancements. Possible approaches:
a. Add an option to create a record even when there is a timeout. The record would retain the proper folder hierarchy from the original bookmarks, and it would have the URL. The user would at least keep his sources and be no worse off (no loss of information); he could manually re-import each URL as time permits or as he needs it.
b. Add an option to keep a log of failed imports, with the ability to re-try them at the user’s convenience.
Workaround Option 4. Try importing from OmniWeb instead of Safari. I decided to try this quickly and was surprised by the results.
a. Drag and drop from OmniWeb does not work as well as from Safari. Specifically, I could not drag a folder to import; I could only select groups of bookmarks and drag them, so all hierarchy information is lost.
b. There DOES seem to be some kind of throttling of webarchive downloads in EagleFiler, and it appears to differ when importing from OmniWeb. When I imported the same URLs from Safari and watched the Activity window, I saw 6 webarchives being processed simultaneously; with OmniWeb I only see 4. This is consistent with my informal observation that I get fewer timeout errors using OmniWeb, but I still get some occasionally, as well as the dreaded TaskFailedException.
c. I also noticed many more duplicate errors for this set of URLs when I imported them more than once, although still just a fraction of the total; with Safari, for the same set, no duplicates are detected.
Conclusion:
Even if the timeouts are not related to EagleFiler at all (probably true, since no one else seems to have this problem), I was hoping it would support some way to re-try failed imports (multiple passes). I’d also like to resolve the **TaskFailedException**. Ultimately I just want to get all my bookmarks loaded.
Sorry for the long message.
And just to clarify something … I am using the trial version of the software.
I think it’s great software! I really do! … I’m just a little frustrated trying to get my information into it.
(and I still need to tackle email)