Collecting proxy IPs with Python

We need one module for this:

BeautifulSoup

1. Download it from http://www.crummy.com/software/BeautifulSoup/

2. After the download finishes, extract the archive and place it under the Python installation directory.

3. In a cmd window, cd into the directory where BeautifulSoup was extracted.

4. Run the following commands:

python setup.py build

python setup.py install

OK, the module is installed.
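
As a quick sanity check (a minimal example, assuming the install finished without errors), start a Python shell and try importing the package:

from bs4 import BeautifulSoup

# Parse a tiny HTML snippet to confirm the module works
soup = BeautifulSoup('<html><body><p>hello</p></body></html>', 'html.parser')
print soup.p.text  # should print: hello

If no ImportError is raised, BeautifulSoup is ready to use.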

Below is the scraping code:

 
import urllib2
from bs4 import BeautifulSoup

# Scrape proxy IPs from xicidaili.com and save them to proxy.txt
of = open('proxy.txt', 'w')
for page in range(1, 50):
    url = 'http://www.xicidaili.com/nn/%s' % page
    user_agent = "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/43.0.2357.134 Safari/537.36"
    # Send the request with a browser User-Agent so the site does not reject it
    request = urllib2.Request(url)
    request.add_header("User-Agent", user_agent)
    content = urllib2.urlopen(request)
    soup = BeautifulSoup(content, 'html.parser')
    # Each proxy is one row of the table with id "ip_list"; skip the header row
    trs = soup.find('table', {"id": "ip_list"}).findAll('tr')
    for tr in trs[1:]:
        tds = tr.findAll('td')
        ip = tds[2].text.strip()
        port = tds[3].text.strip()
        protocol = tds[6].text.strip()
        if protocol == 'HTTP' or protocol == 'HTTPS':
            of.write('%s=%s:%s\n' % (protocol, ip, port))
            print '%s://%s:%s' % (protocol, ip, port)
of.close()

The collected IPs are written to proxy.txt automatically.
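
To actually use one of the saved proxies, you can load a line from proxy.txt and route a request through it with urllib2's ProxyHandler. This is just a rough sketch, assuming the PROTOCOL=ip:port format written above and using http://www.baidu.com purely as an example test URL:

import urllib2

# Read the first proxy saved by the script (format: PROTOCOL=ip:port)
with open('proxy.txt') as f:
    protocol, address = f.readline().strip().split('=')

# Build an opener that sends requests through this proxy
proxy_handler = urllib2.ProxyHandler({protocol.lower(): address})
opener = urllib2.build_opener(proxy_handler)

response = opener.open('http://www.baidu.com', timeout=10)
print response.getcode()  # 200 means the proxy responded

Free proxies go stale quickly, so expect some of them to time out.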

 
