Download files
How can I download all files?
If the filenames are known, use the IntGetFile function or the Http class. See the Http class help.

This macro downloads 3 files and saves them in the "qm downl files" folder on the desktop.

;declare variables; localfolder is the destination folder on the desktop
str sf sfloc localfolder="$desktop$\qm downl files"
;filenames to download, one per line
str files=

;create the destination folder and an images subfolder
mkdir localfolder
mkdir sf.from(localfolder "\images")

;connect to the server and download each file to its local path
Http h.Connect("")
foreach sf files
,sfloc.from(localfolder "\" sf)
,h.FileGet(sf sfloc)
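For readers who don't use QM, the same idea can be sketched in Python with the standard library. The base URL, filenames, and folder name below are hypothetical placeholders, not values from the macro above.

```python
import os
import urllib.parse
import urllib.request

def plan_downloads(base_url, filenames, local_folder):
    """Map each filename to its (remote URL, local path) pair."""
    return [(urllib.parse.urljoin(base_url, name),
             os.path.join(local_folder, name))
            for name in filenames]

def download_files(base_url, filenames, local_folder):
    """Download each file from base_url into local_folder."""
    for url, local_path in plan_downloads(base_url, filenames, local_folder):
        os.makedirs(os.path.dirname(local_path) or ".", exist_ok=True)
        urllib.request.urlretrieve(url, local_path)

# Example (hypothetical server and files):
# download_files("http://example.com/files/", ["a.htm", "b.zip"], "qm downl files")
```

Splitting the URL/path planning from the actual transfer keeps the network-free part easy to check.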
And if the filenames are unknown (all links of a webpage)?
About 6 months ago, I wanted to automate opening multiple links in a web page. Some browsers can do this, but you have to either open all links or select from a long list of URLs. So I created a macro that opens the links within the selected text in the page. This macro does not save these pages, it just opens them; but once you know the URLs, you can download and save them with IntGetFile.

Note: this macro is not perfect. It will not extract links that are opened by scripts.

;Opens selected links.
;Works with IE, does not work with Mozilla (can't get base url).

str s su
int i j

;get selection in HTML format
s.getsel(0 "HTML Format")
if(!s.len) mes- "Please select one or more links first"
;out s

;find base url for relative links
if(findrx(s "^(SourceURL:)(.+/).*$" 0 9 su 2)<0) ret
j=find(su "/cgi-bin/" 0 1)+1; if(j) su.fix(j)
;out su

;extract links
ARRAY(str) a
lpstr linkrx="(?si)<A\s+[^>]*href\s*=\s*[''']?([^''' >]+)[''' >]"
if(!findrx(s linkrx 0 4 a)) ret

;filter unuseful links and scripts, make full urls, open
for i 0 a.len
,;out a[0 i]
,s=a[1 i]
,if(s[0]='/') s.get(s 1)
,;out s
,j=findc(s '#')
,if(j=0) continue ;;within this page
,else if(j>0) s.fix(j)
,if(findrx(s "^\w+:")<0) s-su ;;relative url
,else if(!s.begi("http")) continue ;;mailto, javascript, etc
,;out s
,run s
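The extraction and filtering steps above (find href values, drop same-page `#` links and non-http schemes, resolve relative URLs against the base) can be sketched in Python for illustration; the HTML and base URL in the example are hypothetical.

```python
import re
import urllib.parse

# Roughly the same pattern as the macro's linkrx: capture the href value.
HREF_RX = re.compile(r"""<a\s+[^>]*href\s*=\s*["']?([^"' >]+)""",
                     re.IGNORECASE | re.DOTALL)

def extract_links(html, base_url):
    """Return absolute http(s) URLs of links found in an HTML fragment."""
    links = []
    for href in HREF_RX.findall(html):
        href = href.split('#', 1)[0]           # strip the #fragment part
        if not href:
            continue                           # link within this page
        url = urllib.parse.urljoin(base_url, href)
        if not url.lower().startswith("http"):
            continue                           # mailto:, javascript:, etc.
        links.append(url)
    return links
```

`urljoin` handles both root-relative (`/x.htm`) and relative (`p2.htm`) links, which the macro does by hand with `su` and string concatenation.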
Your code has given me an idea.
