This article explains in detail how to build a parallel download crawler in Python and how to solve the problem of the crawler hanging ("freezing"). I found it quite practical, so I am sharing it for reference; I hope you will get something out of it.
Python version: 3.5.4
System: Windows 10 x64
The `urllib.request.urlretrieve` function needs only two arguments to download content to a local file: the URL and the save path.
```python
import urllib.request

url = 'http://xxx.com/xxx.mp4'
file = 'xxx.mp4'
urllib.request.urlretrieve(url, file)
```
However, I found in practice that this function has no timeout parameter. A network problem during a download can therefore leave it hanging indefinitely!
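A commonly used workaround, not mentioned in the original post: `urlretrieve` honors the module-level socket default timeout, so setting that globally prevents the indefinite hang. A minimal sketch (the URL and path are placeholders):

```python
import socket
import urllib.request

# urlretrieve accepts no timeout argument, but it uses the
# process-wide socket default timeout, so set that instead.
socket.setdefaulttimeout(5)  # seconds

def fetch(url, path):
    """Download url to path, giving up instead of hanging forever."""
    try:
        urllib.request.urlretrieve(url, path)
    except socket.timeout:
        print('Timed out: %s' % url)
```

Note that this changes the timeout for every socket the process creates, which is a coarser tool than a per-call timeout.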
Instead, `urllib.request.urlopen` can be used, like so:
```python
import urllib.request

url = 'http://xxx.com/xxx.mp4'
file = 'xxx.mp4'
response = urllib.request.urlopen(url, timeout=5)
data = response.read()
with open(file, 'wb') as video:
    video.write(data)
```
This function accepts a timeout setting, which avoids the hang.
Pseudocode for the multithreaded downloader is as follows:
```python
import os
import socket
import urllib.request
from urllib import error
from queue import Queue
from threading import Thread


class DownloadWorker(Thread):
    # Subclass Thread and override its run() method; the standard
    # multithreading pattern is built around a Queue.
    def __init__(self, queue):
        Thread.__init__(self)
        self.queue = queue

    def run(self):
        while True:
            # Take a (url, save path) pair from the queue
            link, file = self.queue.get()
            try:
                # Handle the various failure modes with try/except
                response = urllib.request.urlopen(link, timeout=5)
                data = response.read()
                with open(file, 'wb') as video:
                    video.write(data)
            except error.HTTPError as err:
                print('HTTPError, code: %s' % err.code)
            except error.URLError as err:
                print('URLError, reason: %s' % err.reason)
            except socket.timeout:
                print('Time Out!')
            except:
                print('Unknown Error!')
            # Mark this queue item as processed
            self.queue.task_done()


def main():
    queue = Queue()
    for x in range(8):  # start 8 worker threads
        worker = DownloadWorker(queue)
        worker.daemon = True
        worker.start()
    # txtData is assumed to be an iterable of (url, save path) pairs
    for lineData in txtData:
        link = lineData[0]
        file = lineData[1]
        queue.put((link, file))
    queue.join()  # block until every queued item has been processed


if __name__ == '__main__':
    main()
```
Supplement: a summary of problems encountered in a large-scale Python crawler
A few days ago I came across some very interesting material on a forum and wanted to crawl it. After estimating the scale, the goal was: crawl every article on the entire forum and save each one as a local txt file.
I started by writing a crawler along these rough lines:
Start from the forum's front page and collect the URLs of all board sections.
From each board, read its total page count, then derive the URL of every page of that board from the URL pattern.
From each board page, collect the URLs of all articles on that page, then fetch those articles and save them as local txt files.
This is a typical top-down approach, and the first version of the code was written accordingly.
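The top-down steps above can be sketched as a small pipeline. The parsing helpers here are toy stand-ins (hypothetical, not from the original post); the real ones would parse the forum's HTML:

```python
def get_board_urls(start_url):
    # Placeholder: would scrape the forum front page for board links
    return ['%s/board%d' % (start_url, i) for i in (1, 2)]

def get_page_count(board_url):
    # Placeholder: would read the board's pagination widget
    return 2

def get_article_urls(page_url):
    # Placeholder: would extract article links from one board page
    return ['%s/article' % page_url]

def crawl_forum(start_url):
    """Board index -> per-board pages -> per-page article links."""
    articles = []
    for board_url in get_board_urls(start_url):
        for n in range(1, get_page_count(board_url) + 1):
            page_url = '%s?page=%d' % (board_url, n)  # assumed URL pattern
            articles.extend(get_article_urls(page_url))
    return articles
```

With the toy helpers, `crawl_forum('http://example.com')` yields 2 boards × 2 pages × 1 article = 4 article URLs; the structure, not the numbers, is the point.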
Now to the main point: a summary of the problems encountered.
The crawler behaved well during debugging, but in a real run it started throwing HTTP errors after running for a while. Since I was on a wired connection and ruled out the network itself, I concluded the site had blocked me, and started investigating.
Generally, a Python crawler masquerades as a browser by passing a headers argument to urllib2.Request, i.e. a user_agent string such as
"Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1"
But in large-scale crawling, a single user_agent makes the traffic look to the site like one client making rapid, sustained requests, which invites a ban. The original code already slept 0.5 s between page requests to guard against exactly this, yet it was not enough; increasing the delay would hurt the crawler's throughput, and at this scale there would be no telling when the crawl might finish.
So I considered masquerading as many different browsers: collect a large number of user_agent strings into a list, and cycle through them on each page request, so the crawler appears to be many browsers. The list used is attached below:
```python
user_agent_list = [
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
    "Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6",
    "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5",
    "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3",
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
    "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.2; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; Media Center PC 6.0; InfoPath.2; MS-RTC LM 8)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; InfoPath.2)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0 Zune 3.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.3)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.2; MS-RTC LM 8)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET CLR 4.0.20402; MS-RTC LM 8)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET CLR 1.1.4322; InfoPath.2)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Win64; x64; Trident/4.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; Tablet PC 2.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Win64; x64; Trident/4.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET CLR 3.0.04506; Media Center PC 5.0; SLCC1)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Win64; x64; Trident/4.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Win64; x64; Trident/4.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; Tablet PC 2.0; .NET CLR 3.0.04506; Media Center PC 5.0; SLCC1)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; FDM; Tablet PC 2.0; .NET CLR 4.0.20506; OfficeLiveConnector.1.4; OfficeLivePatch.1.3)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET CLR 3.0.04506; Media Center PC 5.0; SLCC1; Tablet PC 2.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET CLR 1.1.4322; InfoPath.2)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.3029; Media Center PC 6.0; Tablet PC 2.0)',
    'Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; SLCC2)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 6.0)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 3.0.04506.30)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; Media Center PC 3.0; .NET CLR 1.0.3705; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.1)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; FDM; .NET CLR 1.1.4322)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; InfoPath.1; .NET CLR 2.0.50727)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; InfoPath.1)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; Alexa Toolbar; .NET CLR 2.0.50727)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; Alexa Toolbar)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.40607)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322)',
    'Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.0.3705; Media Center PC 3.1; Alexa Toolbar; .NET CLR 1.1.4322; .NET CLR 2.0.50727)',
    'Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US)',
    'Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; el-GR)',
    'Mozilla/5.0 (MSIE 7.0; Macintosh; U; SunOS; X11; gu; SV1; InfoPath.2; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648)',
    'Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; WOW64; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; c .NET CLR 3.0.04506; .NET CLR 3.5.30707; InfoPath.1; el-GR)',
    'Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; c .NET CLR 3.0.04506; .NET CLR 3.5.30707; InfoPath.1; el-GR)',
    'Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; fr-FR)',
    'Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; en-US)',
    'Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 5.2; WOW64; .NET CLR 2.0.50727)',
    'Mozilla/4.79 [en] (compatible; MSIE 7.0; Windows NT 5.0; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 1.1.4322; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648)',
    'Mozilla/4.0 (Windows; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727)',
    'Mozilla/4.0 (Mozilla/4.0; MSIE 7.0; Windows NT 5.1; FDM; SV1; .NET CLR 3.0.04506.30)',
    'Mozilla/4.0 (Mozilla/4.0; MSIE 7.0; Windows NT 5.1; FDM; SV1)',
    'Mozilla/4.0 (compatible;MSIE 7.0;Windows NT 6.0)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0;)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; YPC 3.2.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; InfoPath.2; .NET CLR 3.5.30729; .NET CLR 3.0.30618)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; YPC 3.2.0; SLCC1; .NET CLR 2.0.50727; .NET CLR 3.0.04506)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; WOW64; SLCC1; Media Center PC 5.0; .NET CLR 2.0.50727)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; WOW64; SLCC1; .NET CLR 3.0.04506)',
    'Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; WOW64; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; InfoPath.2; .NET CLR 3.5.30729; .NET CLR 3.0.30618; .NET CLR 1.1.4322)',
]
```
The list above holds 60-odd user_agent strings, so the crawler masquerades as that many browsers. After switching to this approach, errors and slowdowns during long crawls became rare; the problem was basically solved.
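A minimal sketch of the rotation idea in Python 3 urllib (the original post used urllib2); the two sample entries stand in for the full list above:

```python
import random
import urllib.request

# Sample entries only; in practice use the full list above.
user_agent_list = [
    'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1',
    'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5',
]

def build_request(url):
    """Attach a randomly chosen User-Agent header to each request."""
    headers = {'User-Agent': random.choice(user_agent_list)}
    return urllib.request.Request(url, headers=headers)
```

Random choice is a slight variation on the post's round-robin indexing; either way, consecutive requests no longer present a single fixed identity.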
Note, however, that if the site bans by IP rather than by user_agent, this does not help. The solutions suggested online involve cloud services and the like, which seemed rather cumbersome and not well suited to an individual user; interested readers can look up the relevant material.
Since the crawl was fairly large, I could not sit in front of the computer the whole time, so the code needed good stability (fault tolerance). Here Python's try…except syntax served very well.
A few days of practice showed that most errors came from momentary network instability, and the fix was simple: just request the page again. So the page-fetching function was written in the following form:
```python
def get_page_first(url):
    global user_agent_index
    user_agent_index += 1
    user_agent_index %= len(user_agent_list)
    user_agent = user_agent_list[user_agent_index]
    # print user_agent
    print user_agent_index
    headers = {'User-Agent': user_agent}
    print u"Fetching " + url
    req = urllib2.Request(url, headers=headers)
    try:
        response = urllib2.urlopen(req, timeout=30)
        page = response.read()
    except:
        # retry once; most failures are momentary network glitches
        response = urllib2.urlopen(req, timeout=30)
        page = response.read()
    print u"Fetched " + url
    return page
```
Here, if a page does not respond within 30 s, it is requested again. This basically solved the problem.
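A hedged Python 3 variant of the retry idea above (my addition, not the original code): instead of a single bare retry, cap the number of attempts so a persistently dead URL cannot fail twice and crash, or loop forever:

```python
import socket
import urllib.request
from urllib import error

def fetch_with_retry(url, attempts=3, timeout=30):
    """Fetch url, retrying up to `attempts` times on transient errors."""
    last_err = None
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return resp.read()
        except (error.URLError, socket.timeout) as err:
            last_err = err  # remember the failure and try again
    raise last_err  # all attempts failed; surface the last error
```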
The txt files were named in the form "date--author——title", but some post titles contain characters such as ? that are not allowed in file names, which caused errors. The fix here: if saving fails, first retry with the author dropped ("date--title"); if that still fails, fall back to "date--number". The code:
```python
try:
    if news_author[0] == '':
        save_file(path + '//' + news_time[0] + '--' + news_title + '.txt', news)
    else:
        save_file(path + '//' + news_time[0] + '--' + news_author[0] + u"——" + news_title + '.txt', news)
except:
    try:
        # first fallback: drop the author from the name
        save_file(path + '//' + news_time[0] + '--' + news_title + '.txt', news)
    except:
        # second fallback: use a numeric name
        save_file(path + '//' + news_time[0] + '--' + str(j) + '-' + str(index) + '.txt', news)
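An alternative to the fallback renaming above (my addition, not from the original post) is to sanitize the title before saving, so names containing ? * : and so on never raise in the first place:

```python
import re

def sanitize_filename(name, replacement='_'):
    """Replace characters Windows forbids in file names."""
    # \ / : * ? " < > | are illegal on Windows;
    # trailing dots and spaces are also disallowed.
    cleaned = re.sub(r'[\\/:*?"<>|]', replacement, name)
    return cleaned.rstrip('. ')
```

This keeps the informative "date--author——title" names even for titles with punctuation, at the cost of slightly altering them.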
The initial code did not anticipate that posts from the same day could share both author and title; I only discovered the problem later, when the total article counts on some boards did not match the number of saved txt files. The save function was then modified as follows. The idea: before saving, check whether a file with the same name exists; if not, save directly; if it does, append "(i)" to the name (with i counting up from 1) and repeat until the name no longer collides:
```python
def save_file(path, inf):
    if not os.path.exists(path):
        f = file(path, 'w')
        f.write(inf)
        f.close()
    else:
        i = 0
        while 1:
            i += 1
            tpath = path[:-4]  # strip the '.txt' extension
            tpath += '(' + str(i) + ')' + '.txt'
            if not os.path.exists(tpath):
                break
        f = file(tpath, 'w')
        f.write(inf)
        f.close()
```
In theory, a large-scale crawler could use multithreading to speed up fetching, but to avoid putting too much pressure on the site, and to avoid an IP ban, the main program does not use threads. To speed things up anyway, I manually opened several command-line windows and ran the crawler in each, fetching different boards at the same time. This also improves fault tolerance: when one process crashes, the others keep running.
After running for a while, it became clear that most of the time was spent in the fetching step, i.e. downloading the pages, so that was the place to improve efficiency. While I was considering how to solve this, I noticed the forum also offers a no-image version of its pages (similar to a mobile version). Each page is then much smaller, while the article content and other needed information is still present, so I revised the code accordingly, and the speedup was indeed dramatic. The lesson: before crawling a site, always check whether a no-image (mobile) version of the pages exists; it can greatly improve speed.
關(guān)于“如何基于python分布式爬蟲并解決假死的問題”這篇文章就分享到這里了,希望以上內(nèi)容可以對(duì)大家有一定的幫助,使各位可以學(xué)到更多知識(shí),如果覺得文章不錯(cuò),請(qǐng)把它分享出去讓更多的人看到。