Due to hardware and other constraints, I needed to move roughly 1.7 million Weibo images, about 2 TB in total, into MySQL. I had always kept Weibo data in MongoDB, a non-relational database, so my unfamiliarity with MySQL led me into one pit after another; it took three days of back and forth to get this done.
The first step is schema design. I planned three tables:

blog table: [id, userid, blog_text, lat, lng, created_time, reserve]   pkey: id
pics table: [md5, pic_url, pic_bin, exif, reserve]   pkey: md5
relationship table: [id, md5, reserve]   pkey: (id, md5)   fkey: (id → blog.id), (md5 → pics.md5)
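Spelled out as MySQL DDL, the design might look like this (a sketch: the exact column types and sizes are my assumptions, not something the schema above pins down):

```sql
-- sketch of the three tables; `reserve` types and sizes are assumptions
CREATE TABLE blog (
    id           VARCHAR(32)  NOT NULL,
    userid       VARCHAR(32),
    blog_text    BLOB,                 -- BLOB rather than TEXT, see below
    lat          DOUBLE,
    lng          DOUBLE,
    created_time VARCHAR(64),
    reserve      VARCHAR(255),
    PRIMARY KEY (id)
);

CREATE TABLE pics (
    md5     CHAR(32)    NOT NULL,
    pic_url VARCHAR(255),
    pic_bin MEDIUMBLOB,                -- up to 16 MB per image
    exif    TEXT,
    reserve VARCHAR(255),
    PRIMARY KEY (md5)
);

CREATE TABLE relationship (
    id      CHAR(32) NOT NULL,
    md5     CHAR(32) NOT NULL,
    reserve VARCHAR(255),
    PRIMARY KEY (id, md5),
    FOREIGN KEY (id)  REFERENCES blog (id),
    FOREIGN KEY (md5) REFERENCES pics (md5)
);
```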
Creating the tables was mostly painless; the troublesome columns were pic_bin and blog_text. I first declared pic_bin as BLOB, only to find at runtime that a MySQL BLOB holds at most 64 KB, nowhere near enough for Weibo images; switching to MEDIUMBLOB (up to 16 MB) was sufficient. Then came blog_text, my first big pit.
Naturally I first made blog_text a TEXT column, but once the job was running, some rows failed to insert with errors. Sifting through them showed the failing posts contained emoji. After much digging, the reason: MySQL's utf8 charset stores at most three bytes per character, while emoji need four, so the encoding has to be changed to utf8mb4. However, every attempt to edit the MySQL config file on my Mac produced bizarre errors, and since I work on a Mac, connect to a remote Windows machine, and only from there reach the MySQL instance on a Synology NAS, changing my local config was pointless anyway. In frustration I changed TEXT to BLOB, and everything worked, since a BLOB stores raw bytes without any charset validation.
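For the record, the conventional fix for the emoji problem is to move to utf8mb4 end to end rather than to BLOB; where the server config is editable, it looks roughly like this (a sketch of the usual approach, not the route I ended up taking):

```sql
-- in my.cnf, under [mysqld]:
--   character-set-server = utf8mb4
--   collation-server     = utf8mb4_unicode_ci

-- convert an existing table's charset in place:
ALTER TABLE blog CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;

-- then connect from pymysql with charset='utf8mb4' instead of 'utf8'
```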
And then the really big pit!!! Under Python 3, reading an image yields bytes, whose repr starts with b'. When that repr is spliced into an SQL string, the quote right after the b closes the SQL string literal early, so the rest of the binary data falls outside the quotes and the statement errors out. Unlike str, raw bytes can't simply be escaped character by character, and every approach I tried failed. In the end I gave up fighting it: base64-encode the binary into a plain string, store that in the database, and decode it again whenever the image is needed.
pic_bin = str(base64.b64encode(pic_bin))[2:-1]
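That one-liner can be factored into a small helper pair (a sketch; `encode_blob` and `decode_blob` are hypothetical names, not from my script):

```python
import base64


def encode_blob(raw: bytes) -> str:
    # base64-encode the raw image bytes, then decode the ASCII result to
    # str -- this sidesteps the b'...' repr problem when the value is
    # spliced into an SQL string
    return base64.b64encode(raw).decode('ascii')


def decode_blob(stored: str) -> bytes:
    # reverse the transformation when reading the column back from MySQL
    return base64.b64decode(stored)


pic_bin = b'\xff\xd8\xff\xe0 fake jpeg bytes'
stored = encode_blob(pic_bin)
assert decode_blob(stored) == pic_bin
```

Note that `.decode('ascii')` is a cleaner way to get the string than slicing the repr with `str(...)[2:-1]`. (Passing the bytes as a bound parameter via `cursor.execute(sql, args)` would also avoid the quoting problem entirely, since pymysql escapes parameters itself.)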
Because I used Python multiprocessing, the pipeline pushed about 8 GB per hour, and with images this large the packets sent to MySQL exceeded the server's default limit, producing "MySQL server has gone away". The fix is to raise the limits in the MySQL config file:
max_allowed_packet = 600M
wait_timeout = 60000
Even so, the program kept hitting that error from time to time. I hunted for the cause for ages, tried every remedy, read piles of material, and still got errors. Eventually I stopped caring about the cause: if the connection is lost, just reconnect. conn.ping(True) checks whether the MySQL connection is alive and reconnects if it isn't. That finally solved the problem. The full script:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Created by Baoyi on 2017/10/16
from multiprocessing.pool import Pool

import pymysql
import requests
import json
import exifread
from io import BytesIO
import configparser
import hashlib
import logging
import base64

# configure logging
logging.basicConfig(level=logging.WARNING,
                    format='%(asctime)s %(filename)s[line:%(lineno)d] %(levelname)s %(message)s',
                    datefmt='%a, %d %b %Y %H:%M:%S',
                    filename='weibo.log',
                    filemode='w')

cf = configparser.ConfigParser()
cf.read("ConfigParser.conf")

# read the MySQL settings from the config file
db_host = cf.get("mysql", "db_host")
db_port = cf.getint("mysql", "db_port")
db_user = cf.get("mysql", "db_user")
db_pass = cf.get("mysql", "db_pass")
db = cf.get("mysql", "db")

# create the connection
conn = pymysql.connect(host=db_host, user=db_user, passwd=db_pass, db=db, port=db_port, charset='utf8')
# get a cursor
cursor = conn.cursor()

# INSERT statement templates
insert_blog_sql = (
    "INSERT IGNORE INTO blog(userid, id, blog_text, lat, lng, created_time) "
    "VALUES('{uid}', '{id}', '{blog_text}', '{lat}', '{lng}', '{created_time}')"
)
insert_pic_sql = (
    "INSERT IGNORE INTO pics(pic_url, pic_bin, md5, exif) "
    "VALUES ('{pic_url}', '{pic_bin}', '{md5}', '{exif}')"
)
insert_relationship_sql = (
    "INSERT IGNORE INTO relationship(id, md5) VALUES ('{id}', '{md5}')"
)

uid = []
with open('./data/final_id.txt', 'r') as f:
    for i in f.readlines():
        uid.append(i.strip('\r\n'))


# download the full-size version of a picture
def handle_pic(pic_url):
    large_pic_url = pic_url.replace('thumbnail', 'large')
    large_bin = requests.get(large_pic_url)
    return large_bin.content


def get_poiid_info(uid):
    try:
        url = 'https://api.weibo.com/2/statuses/user_timeline.json'
        load = {
            'access_token': 'xxxxxxxxxx',
            'uid': uid,
            'count': 100,
            'feature': 2,
            'trim_user': 1
        }
        get_info = requests.get(url=url, params=load, timeout=(10, 10))
        if get_info.status_code != 200:
            logging.warning('HTTP %d for uid %s' % (get_info.status_code, uid))
            return
        info_json = json.loads(get_info.content)
        info_json['uid'] = uid
        statuses = info_json['statuses']
        # filter and process the statuses
        for status in statuses:
            id = status['idstr']
            if status['geo'] is not None:
                lat = status['geo']['coordinates'][0]
                lng = status['geo']['coordinates'][1]
                pic_urls = status['pic_urls']

                # keep only statuses posted within Beijing
                if (115.7 < lng < 117.4) and (39.4 < lat < 41.6):
                    # inside Beijing: insert the blog record
                    blog_text = status['text'].replace('\'', '\'\'')
                    created_time = status['created_at']
                    try:
                        cursor.execute(
                            insert_blog_sql.format(uid=uid, id=id, blog_text=blog_text, lat=lat, lng=lng,
                                                   created_time=created_time))
                    except pymysql.err.OperationalError as e_blog:
                        logging.warning(e_blog.args[1])
                    # conn.commit()

                    # process the pictures
                    for pic_url in pic_urls:
                        # fetch the full-size image bytes
                        pic_bin = handle_pic(pic_url['thumbnail_pic'])

                        # wrap the bytes in a file object so exifread can parse them
                        pic_file = BytesIO(pic_bin)
                        tag1 = exifread.process_file(pic_file, details=False, strict=True)
                        tag = {}
                        for key, value in tag1.items():
                            if key not in (
                                    'JPEGThumbnail', 'TIFFThumbnail', 'Filename',
                                    'EXIF MakerNote'):  # drop four bulky, unneeded EXIF entries
                                tag[key] = str(value)
                        tags = json.dumps(tag)  # the EXIF data as a JSON string

                        # compute the MD5 from the raw bytes (exifread may have moved
                        # the pic_file pointer, so don't read from pic_file here)
                        MD5 = hashlib.md5(pic_bin).hexdigest()

                        # base64-encode the binary image into a string before storing it
                        try:
                            cursor.execute(
                                insert_pic_sql.format(pic_url=pic_url['thumbnail_pic'].replace('thumbnail', 'large'),
                                                      pic_bin=str(base64.b64encode(pic_bin))[2:-1], md5=MD5,
                                                      exif=tags))
                        except pymysql.err.OperationalError as e_pic:
                            logging.warning(e_pic.args[1])
                        try:
                            cursor.execute(insert_relationship_sql.format(id=id, md5=MD5))
                        except pymysql.err.OperationalError as e_relation:
                            logging.warning(e_relation.args[1])
                        conn.commit()
                else:
                    logging.info(id + " is not in Beijing")
            else:
                logging.info(id + ' geo is null')
    except pymysql.err.OperationalError as e:
        logging.error(e.args[1])


def judge_conn(i):
    global conn
    try:
        conn.ping(True)  # reconnect automatically if the connection was lost
        get_poiid_info(i)
    except pymysql.err.OperationalError:
        logging.error('Reconnect')
        conn = pymysql.connect(host=db_host, user=db_user, passwd=db_pass, db=db, port=db_port, charset='utf8')
        get_poiid_info(i)


def handle_tuple(a_tuple):
    read_uid_set = []
    for i in a_tuple:
        read_uid_set.append(i[0])
    return set(read_uid_set)


if __name__ == '__main__':
    # skip uids whose blogs are already in the database
    sql_find_uid = "SELECT userid FROM blog"
    cursor.execute(sql_find_uid)
    read_uid_tuple = cursor.fetchall()
    read_list = handle_tuple(read_uid_tuple)
    print(len(read_list))

    new_uid = set(uid).difference(read_list)
    print(len(new_uid))

    pool = Pool()
    pool.map(judge_conn, list(new_uid))
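When the images are needed again, the base64 text read from pics.pic_bin has to be decoded back to bytes; since the table's key is the MD5 of the original image, it can double as an integrity check on the way out. A sketch (`restore_pic` and the simulated row are illustrative, not part of the script above):

```python
import base64
import hashlib


def restore_pic(stored_b64: str, stored_md5: str) -> bytes:
    # decode the base64 text column back into raw image bytes
    raw = base64.b64decode(stored_b64)
    # recompute the MD5 and compare with the stored key to detect corruption
    if hashlib.md5(raw).hexdigest() != stored_md5:
        raise ValueError('MD5 mismatch: stored blob is corrupt')
    return raw


# simulated row, as it might come back from `SELECT pic_bin, md5 FROM pics`
raw = b'\xff\xd8\xff\xe0 fake jpeg'
row = (base64.b64encode(raw).decode('ascii'), hashlib.md5(raw).hexdigest())
assert restore_pic(row[0], row[1]) == raw
```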