EU BAST / YAWSM

Encore Un Branleur avec un Avis Sur Tout / Yet Another Wanker Speaking his Mind.

2004-04-19

Suprnova down...


.. the fees still pending.
The printer is in for maintenance, but Decade of Aggression is on the iPod!!!
It gets the blood pumping.
"dance with the dead in my dreams, listen to their hallowed screams"

2004-04-06

suze and a scraper


Aye, I'm subscribed to suze.net.
I had to code this to avoid carpal tunnel syndrome.
If you want it, use it. No support, no help.

#!/usr/bin/python
from urllib2 import *
from string import *
import re
# basic-auth opener for the members area
x=HTTPPasswordMgrWithDefaultRealm()
x.add_password(None,'http://members.suze.net',"YOUR USER","YOUR PASSWORD")
auth=HTTPBasicAuthHandler(x)
opener=build_opener(auth)
install_opener(opener)
# f=urlopen('http://members.suze.net/area3/ahbabes/2664IDX.HTM')
# f.readlines()
baseurl='http://members.suze.net/area3/'
page_index='/ahotshots/A2653IDX.HTM'
pages=('http://members.suze.net/area3/intpages/ipstars/B2578MP.HTM',
'http://members.suze.net/area3/intpages/iggirls/2638MP.HTM',
'http://members.suze.net/area3/intpages/ihbabes/0996MP.HTM',
'http://members.suze.net/area3/intpages/iredheads/B2555MP.HTM',
'http://members.suze.net/area3/intpages/ipstars/B2540MP.HTM',
'http://members.suze.net/area3/intpages/iblast/0174MP.HTM',
'http://members.suze.net/area3/intpages/ibrunettes/1481MP.HTM',
'http://members.suze.net/area3/intpages/ihotshots/2639MP.HTM',
'http://members.suze.net/area3/intpages/ihotshots/A2531MP.HTM',
'http://members.suze.net/area3/intpages/iggirls/A2635MP.HTM',
'http://members.suze.net/area3/intpages/iblondes/2629MP.HTM',
'http://members.suze.net/area3/intpages/ibrunettes/1513MP.HTM',
'http://members.suze.net/area3/intpages/ibrunettes/B2655MP.HTM',
'http://members.suze.net/area3/intpages/iblast/B857MP.HTM')
# length of the common URL prefix; page[drop:-6] keeps e.g. 'pstars/B2578'
drop=len('http://members.suze.net/area3/intpages/i')
# matches a link whose text looks like a thumbnail file name (...s<number>.jpg)
is_href=re.compile('.*href=\"(.*)\">(.*s\d+.jpg)<',re.IGNORECASE)
for page in pages:
    # turn .../intpages/i<section>/<id>MP.HTM into the matching index page /a<section>/<id>IDX.HTM
    page_index='/a%sIDX.HTM' % (page[drop:-6],)
    print page_index
    html_i=urlopen('%s%s' % (baseurl,page_index))

    print is_href
    for ligne_html in html_i.readlines():
        '''I get an HTML table back'''
        yop=is_href.match(ligne_html)
        if yop:
            # debug output
            print yop.groups()
            print len(ligne_html),ligne_html[0],lower(lstrip(ligne_html)),
            # open the output file; 'wb' because the pictures are binary
            jpeg=open('YOUR DIRECTORY/%s' % yop.groups()[1],'wb')
            # drop the leading '../' from the href and fetch the picture
            target=yop.groups()[0][3:]
            suze_side=urlopen('%s%s' % (baseurl,target))
            for l in suze_side:
                jpeg.write(l)
            jpeg.close()
            suze_side.close()
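
For what it's worth, here is a rough Python 3 sketch of the same idea, since urllib2 is gone there: a basic-auth opener plus a regex over one index page. The credentials, output directory and index path are placeholders, and it assumes the site layout described above; it is not a drop-in replacement for the script.

#!/usr/bin/env python3
import os
import re
from urllib.request import (HTTPPasswordMgrWithDefaultRealm,
                            HTTPBasicAuthHandler, build_opener,
                            install_opener, urlopen)

# Basic-auth opener, same idea as the urllib2 version above.
password_mgr = HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, 'http://members.suze.net',
                          'YOUR USER', 'YOUR PASSWORD')   # placeholders
install_opener(build_opener(HTTPBasicAuthHandler(password_mgr)))

BASEURL = 'http://members.suze.net/area3/'
OUTDIR = 'YOUR DIRECTORY'   # placeholder

# Same kind of pattern: a link whose text looks like a thumbnail file name.
IS_HREF = re.compile(r'.*href="(.*)">(.*s\d+\.jpg)<', re.IGNORECASE)

def rip_index(index_path):
    """Download every picture linked from one index page."""
    index = urlopen(BASEURL + index_path)
    for line in index.read().decode('latin-1').splitlines():
        match = IS_HREF.match(line)
        if not match:
            continue
        href, name = match.groups()
        picture = urlopen(BASEURL + href[3:])   # drop the leading '../'
        with open(os.path.join(OUTDIR, name), 'wb') as out:
            out.write(picture.read())
        picture.close()
    index.close()

if __name__ == '__main__':
    rip_index('ahotshots/A2653IDX.HTM')   # example index page from the post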