Hello Everybody!
A friend and I managed to develop our first useful tool using Python.
Proxies are Elite 🙂
Enjoy
[HIDE]
import urllib2, re

opener = urllib2.build_opener()
opener.addheaders = [('User-agent', 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.90 Safari/537.36')]
urllib2.install_opener(opener)

def gatherlist1():
    # Scrape the proxy table on sslproxies.org and keep only
    # elite proxies that support HTTPS.
    url = "https://www.sslproxies.org"
    read = urllib2.urlopen(url).read()
    tables = re.findall(r'<tr><td>(.*?)</td></tr>', str(read))
    for table in tables:
        # Columns: 0=IP, 1=Port, 2=Code, 3=Country, 4=Anonymity, 5=Google, 6=Https, 7=Last Checked
        rows = table.replace('</td>', '').replace('<td>', ',').split(',')
        if rows[6] == 'yes' and rows[4] == 'elite proxy':
            with open('proxylist.txt', 'a') as f:
                f.write('{}:{}\n'.format(rows[0], rows[1]))

def gatherlist2(offset):
    # Walk the proxydb.net result pages, 50 proxies per page,
    # and save the ip:port text of each proxy link.
    for x in xrange(0, offset + 1, 50):
        url = "http://proxydb.net/?protocol=https&anonlvl=3&anonlvl=4&offset={}".format(x)
        read = urllib2.urlopen(url).read()
        links = re.findall(r'<a href="(.*?)">(.*?)</a>', str(read))
        for link in links:
            with open('proxylist.txt', 'a') as f:
                f.write('{}\n'.format(link[1]))

if __name__ == '__main__':
    gatherlist1()
    gatherlist2(1572)
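Once the file exists, you can plug the scraped proxies back into urllib. A minimal sketch of that (assuming Python 3 this time; `load_proxies` and `make_opener` are hypothetical helper names, not part of the scraper above):

```python
# Sketch: read proxies from the proxylist.txt file the scraper produces
# and build a urllib opener that routes traffic through one of them.
# The file format is one "ip:port" entry per line.
import urllib.request

def load_proxies(path='proxylist.txt'):
    """Return a list of 'ip:port' strings, skipping blank or malformed lines."""
    proxies = []
    with open(path) as f:
        for line in f:
            entry = line.strip()
            if entry.count(':') == 1:  # crude sanity check for ip:port
                proxies.append(entry)
    return proxies

def make_opener(proxy):
    """Build an opener that sends both http and https traffic through the proxy."""
    handler = urllib.request.ProxyHandler({'http': proxy, 'https': proxy})
    return urllib.request.build_opener(handler)

# Usage (needs network access, so commented out):
# for proxy in load_proxies():
#     opener = make_opener(proxy)
#     # try opener.open(...) and move on if the proxy is dead
```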
How it works:
You run it, and it will create a text file named proxylist.txt, where all scraped proxies are stored.
Proxies are written to the file as they are scraped, so the longer you leave the program running, the more proxies you will get. (Of course, some are going to be dead or banned, since they are public proxies.)
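One caveat: the script appends on every run, so proxylist.txt collects duplicates over time. A quick way to clean it up, keeping first-seen order (a sketch in Python 3; `dedupe_file` is a hypothetical helper, not part of the scraper):

```python
def dedupe_file(path='proxylist.txt'):
    """Rewrite the proxy list with duplicates removed; return the unique count."""
    seen = set()
    unique = []
    with open(path) as f:
        for line in f:
            entry = line.strip()
            if entry and entry not in seen:
                seen.add(entry)
                unique.append(entry)
    with open(path, 'w') as f:
        f.write('\n'.join(unique) + '\n')
    return len(unique)
```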
[/HIDE]