├── .gitignore ├── LICENSE ├── README.md ├── UGC_analysis ├── UGC_Analysis.py ├── crawler.txt ├── emo_pic.py ├── resource.txt ├── test_spider.py └── txt_analysis │ ├── __pycache__ │ ├── picturing.cpython-36.pyc │ └── spider.cpython-36.pyc │ ├── picturing.py │ └── spider.py └── pic ├── 价格.png ├── 分量.png ├── 味道.png ├── 情感分析.png ├── 程序结构.png ├── 统计.png ├── 配送.png └── 采集到的数据样式.png /.gitignore: -------------------------------------------------------------------------------- 1 | # Byte-compiled / optimized / DLL files 2 | __pycache__/ 3 | *.py[cod] 4 | *$py.class 5 | 6 | # C extensions 7 | *.so 8 | 9 | # Distribution / packaging 10 | .Python 11 | build/ 12 | develop-eggs/ 13 | dist/ 14 | downloads/ 15 | eggs/ 16 | .eggs/ 17 | lib/ 18 | lib64/ 19 | parts/ 20 | sdist/ 21 | var/ 22 | wheels/ 23 | *.egg-info/ 24 | .installed.cfg 25 | *.egg 26 | MANIFEST 27 | 28 | # PyInstaller 29 | # Usually these files are written by a python script from a template 30 | # before PyInstaller builds the exe, so as to inject date/other infos into it. 31 | *.manifest 32 | *.spec 33 | 34 | # Installer logs 35 | pip-log.txt 36 | pip-delete-this-directory.txt 37 | 38 | # Unit test / coverage reports 39 | htmlcov/ 40 | .tox/ 41 | .coverage 42 | .coverage.* 43 | .cache 44 | nosetests.xml 45 | coverage.xml 46 | *.cover 47 | .hypothesis/ 48 | .pytest_cache/ 49 | 50 | # Translations 51 | *.mo 52 | *.pot 53 | 54 | # Django stuff: 55 | *.log 56 | local_settings.py 57 | db.sqlite3 58 | 59 | # Flask stuff: 60 | instance/ 61 | .webassets-cache 62 | 63 | # Scrapy stuff: 64 | .scrapy 65 | 66 | # Sphinx documentation 67 | docs/_build/ 68 | 69 | # PyBuilder 70 | target/ 71 | 72 | # Jupyter Notebook 73 | .ipynb_checkpoints 74 | 75 | # pyenv 76 | .python-version 77 | 78 | # celery beat schedule file 79 | celerybeat-schedule 80 | 81 | # SageMath parsed files 82 | *.sage.py 83 | 84 | # Environments 85 | .env 86 | .venv 87 | env/ 88 | venv/ 89 | ENV/ 90 | env.bak/ 91 | venv.bak/ 92 | 93 | # Spyder project settings 94 | .spyderproject 95 | .spyproject 96 | 97 | # Rope project settings 98 | .ropeproject 99 | 100 | # mkdocs documentation 101 | /site 102 | 103 | # mypy 104 | .mypy_cache/ 105 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | MIT License 2 | 3 | Copyright (c) 2018 CarryChang 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy 6 | of this software and associated documentation files (the "Software"), to deal 7 | in the Software without restriction, including without limitation the rights 8 | to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 9 | copies of the Software, and to permit persons to whom the Software is 10 | furnished to do so, subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 17 | FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 18 | AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 19 | LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 20 | OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 21 | SOFTWARE. 
22 | 
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | [![996.icu](https://img.shields.io/badge/link-996.icu-red.svg)](https://996.icu)
2 | 
3 | [![Stargazers over time](https://starchart.cc/CarryChang/UGC-Analysis.svg)](https://starchart.cc/CarryChang/UGC-Analysis)
4 | 
5 | ##### UGC analysis for the ele.me Star Select (饿了么星选) platform
6 | 1. Real-time data collection and preprocessing
7 | 2. Dictionary-based topic (aspect) extraction
8 | 3. Sentiment analysis with SnowNLP
9 | 4. Visualization
10 | 
11 | 
12 | ##### Program structure: ![structure diagram](https://github.com/CarryChang/UGC-Analysis/blob/master/pic/%E7%A8%8B%E5%BA%8F%E7%BB%93%E6%9E%84.png)
13 | ##### UGC_Analysis.py is the main module. It builds the GUI with Tkinter and orchestrates the crawler (spider.py) and the visualization module (picturing.py): the unstructured review text collected by spider.py is run through sentiment scoring, the scores are handed to picturing.py for plotting, and the resulting charts are displayed in the main window; structured data collected by spider.py, such as user ratings, is likewise passed to picturing.py for statistical plots that are then shown in the main window.
14 | ##### This software grew out of the rapid expansion of the internet industry. As the user bases of online shopping, travel and other service platforms grow exponentially, the platforms accumulate large volumes of UGC (User Generated Content): product reviews, user photos, user ratings and so on. UGC carries opinions about the product or service, so mining it helps merchants on the platform make the business adjustments they need, and displaying it helps consumers understand the goods or services on offer. A significant share of UGC, however, shows star ratings that do not match the review text. To keep potential customers from being misled, the platform also needs to process and present UGC carefully, since this reflects on the quality of both the platform and the goods it sells. This software therefore takes the platform's point of view: it uses tkinter for the interface and matplotlib for the charts, collects raw reviews online, computes and categorizes review sentiment, and visualizes structured data such as user ratings and service scores, helping the platform present reviews and sales information more effectively and increase sales.
15 | ![dish analysis](https://github.com/CarryChang/UGC-Analysis/blob/master/pic/统计.png)
16 | #### Features
17 | >1. A hardened crawler: fake_useragent rotates randomly chosen browser User-Agent headers so that crawling stays stable and efficient.
![collected data sample](https://github.com/CarryChang/UGC-Analysis/blob/master/pic/采集到的数据样式.png)
19 | >2. SnowNLP is used as the sentiment-analysis library; sentiment scores are printed directly in the output box.
20 | ![sentiment analysis](https://github.com/CarryChang/UGC-Analysis/blob/master/pic/%E6%83%85%E6%84%9F%E5%88%86%E6%9E%90.png)
21 | >3. Topics are identified with a keyword dictionary, which makes it easy to filter reviews in real time.
22 | ![topic analysis](https://github.com/CarryChang/UGC-Analysis/blob/master/pic/配送.png)
23 | ![taste analysis](https://github.com/CarryChang/UGC-Analysis/blob/master/pic/味道.png)
--------------------------------------------------------------------------------
/UGC_analysis/UGC_Analysis.py:
--------------------------------------------------------------------------------
1 | import re
2 | import tkinter as tk
3 | from threading import Thread
4 | from txt_analysis import picturing
5 | from txt_analysis.spider import crawl
6 | from txt_analysis import spider
7 | from snownlp import SnowNLP
8 | import json
9 | import matplotlib.pyplot as plt
10 | # matplotlib setup so Chinese text renders correctly
11 | plt.rcParams['font.sans-serif'] = ['SimHei']  # render Chinese labels correctly
12 | plt.rcParams['axes.unicode_minus'] = False  # render the minus sign correctly
13 | # (under Python 2, Chinese string literals would need the u'' prefix)
14 | import numpy as np
15 | # Identifiers used to dispatch the button events below
16 | ALL = "all comments"
17 | ALL1 = "all comments"
18 | positive = "good comments"
19 | medium = "medium comments"
20 | negative = "bad comments"
21 | taste_pos = " taste_pos"
22 | taste_neg = " taste_neg"
23 | taste = " taste"
24 | speed_pos = " speed_pos"
25 | speed_neg = " speed_neg"
26 | speed = " speed"
27 | weight_pos = " weight_pos"
28 | weight_neg = " weight_neg"
29 | weight = " weight"
30 | service_pos = " service_pos"
31 | service_neg = " service_neg"
32 | service = " service"
33 | price_pos = " price_pos"
34 | price_neg = " price_neg"
35 | price = " price"
36 | start = 7
37 | height = 30
38 | width = 60
39 | re_space = re.compile(r'(\s+)')
40 | all_direction = tk.E + tk.N + tk.W + tk.S
41 | result = None
42 | def get_result():
43 | global result
44 | try:
45 | ## pass the URL to the crawler and save the collected data locally
46 | result1 = spider.crawl(url_tv.get())
47 | with open("resource.txt", "w", encoding="utf-8") as f:
48 | json.dump(result1, f, ensure_ascii=False, indent=2)
49 | result = result1  # reuse the crawled data instead of fetching the page a second time
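# A sketch of the structure of `result`, inferred from the sample dumps
# crawler.txt / resource.txt bundled with this repo; the exact fields are
# whatever spider.crawl returns and may change if the upstream API changes:
#   result["content"]            -> list[str], raw review texts ("" when the user wrote nothing)
#   result["score"]              -> list[int], overall star ratings (1-5)
#   result["dish_score"], result["service_score"] -> list[int], per-aspect ratings
#   result["useful_comment_id"], result["rubbish_comment_id"] -> lists of [index, flag] pairs;
#                                   only the index is used below, to keep reviews with real text
#   result["recommend_dishes"]   -> dict mapping dish name to recommendation count
#   result["average_score"], result["score_detail"], result["weeks_score"] -> aggregate statistics
#   result["cost_time"], result["sfrom"], result["create_time"], result["arrive_time"]
#                                -> delivery-time values, client type, order/arrival timestamps
#                                   (plotted by picturing.cost_time and picturing.s_from)
#   result["comment_num"]        -> total number of reviews
# The sentiment functions below all follow the pattern SnowNLP(text).sentiments,
# which returns a float in [0, 1]; values closer to 1 indicate a more positive review.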
50 | prompt_text.set("在线评论数据采集完毕,可以进行后续分析") 51 | except ValueError: 52 | prompt_text.set("地址有误,请重新输入!") 53 | def data_collecting(): 54 | #点击输入框时,开启多线程,将信息发送给prompt_text 55 | prompt_text.set("正在采集信息,请稍后......") 56 | ############采集功能使用多线程来维护界面的流畅 57 | t = Thread(target=get_result) 58 | t.start() 59 | ###########使用传参 60 | def test_tag(parse_result, sentence, type_, foreground, i, check_tv, j): 61 | index = start 62 | if check_tv.get(): 63 | result = parse_result[type_] 64 | for a in result: 65 | index = sentence.index(a, index) 66 | text.tag_add("tag%d_%d" % (i, j), "%d.%d" % (i, index), "%d.%d" % (i, index + len(a))) 67 | text.tag_config("tag%d_%d" % (i, j), foreground=foreground) 68 | index += len(a) 69 | j += 1 70 | return j 71 | def text_tag_config(sentence, i): 72 | sentence = re_space.sub(r' ', sentence) 73 | # print(i, sentence) 74 | #########增加情感极性分数显示 75 | sentence = "%5d. %s" % (i, sentence) 76 | text.insert(tk.END, "%s\n" % sentence) 77 | # 增加对于积极和消极的极性评价 78 | def text_tag_config1(sentence, i, score): 79 | # sentence = re_space.sub(r' ', sentence) 80 | # print(i, sentence) 81 | #########增加情感极性分数显示 82 | sentence_1 = "%5d. %s %s" % (i, sentence, score) 83 | text.insert(tk.END, "%s\n" % sentence_1) 84 | ########进行情感趋势画图 85 | def emotion_analysis(which): 86 | col = 20 87 | if result is not None: 88 | sentiments_list = [] 89 | text.delete(1.0, tk.END) 90 | comments = result["content"] 91 | scores = result["score"] 92 | comments = [comments[a[0]] for a in result["useful_comment_id"]] 93 | scores = [scores[a[0]] for a in result["useful_comment_id"]] 94 | if which == ALL1: 95 | j_1 = 0 96 | for i in range(len(comments)): 97 | print(comments[i]) 98 | s = SnowNLP(comments[i]) 99 | # score = round(s.sentiments,3) 100 | score = s.sentiments 101 | sentiments_list.append(score) 102 | j_1 += 1 103 | text_tag_config1(comments[i], j_1, score) 104 | plt.hist(sentiments_list, bins=col) 105 | plt.xlabel("情感值") 106 | plt.ylabel("评论数目") 107 | plt.title('整体情感极性分布图') 108 | plt.show() 109 | plt.close() 110 | def emotion_pic(which): 111 | col = 20 112 | # 使用关键字找取对应的评论 113 | taste_keywords = ["闻","口感","吃",'喝','尝','味','下饭','咬','味道','卖相','新鲜','嫩'] 114 | speed_keywords = ["送",'速','到达','快','慢','催','快递','送达'] 115 | weight_keywords = ['量','分量','轻重','斤','少'] 116 | service_keywords = ["服务","态度",'语气'] 117 | price_keywords = ["价格", "钱",'实惠','价','贵','便宜'] 118 | if result is not None: 119 | sentiments_list = [] 120 | text.delete(1.0, tk.END) 121 | comments = result["content"] 122 | scores = result["score"] 123 | comments = [comments[a[0]] for a in result["useful_comment_id"]] 124 | scores = [scores[a[0]] for a in result["useful_comment_id"]] 125 | if which == taste: 126 | j1 = 1 127 | for i in range(len(scores)): 128 | for keyword in taste_keywords: 129 | if keyword in comments[i]: 130 | ###对符合条件的抓取出来,并计算情感极性 131 | s = SnowNLP(comments[i]) 132 | score = s.sentiments 133 | sentiments_list.append(score) 134 | text_tag_config1(comments[i], j1, score) 135 | j1 += 1 136 | plt.hist(sentiments_list, bins=col) 137 | plt.xlabel("情感值") 138 | plt.ylabel("评论数目") 139 | plt.title('味道情感极性分布图') 140 | plt.show() 141 | plt.close() 142 | elif which == speed: 143 | j2 = 1 144 | for i in range(len(scores)): 145 | for keyword in speed_keywords: 146 | if keyword in comments[i]: 147 | ###对符合条件的抓取出来,并计算情感极性 148 | s = SnowNLP(comments[i]) 149 | score = s.sentiments 150 | sentiments_list.append(score) 151 | text_tag_config1(comments[i], j2, score) 152 | j2 += 1 153 | plt.hist(sentiments_list, bins=col) 154 | plt.xlabel("情感值") 155 | 
plt.ylabel("评论数目") 156 | plt.title('配送情感极性分布图') 157 | plt.show() 158 | plt.close() 159 | elif which == weight: 160 | j3 = 1 161 | for i in range(len(scores)): 162 | for keyword in weight_keywords: 163 | if keyword in comments[i]: 164 | ###对符合条件的抓取出来,并计算情感极性 165 | s = SnowNLP(comments[i]) 166 | score = s.sentiments 167 | sentiments_list.append(score) 168 | text_tag_config1(comments[i], j3, score) 169 | j3 += 1 170 | plt.hist(sentiments_list, bins=col) 171 | plt.xlabel("情感值") 172 | plt.ylabel("评论数目") 173 | plt.title('份量情感极性分布图') 174 | plt.show() 175 | plt.close() 176 | elif which == service: 177 | j4 = 1 178 | for i in range(len(scores)): 179 | for keyword in service_keywords: 180 | if keyword in comments[i]: 181 | ###对符合条件的抓取出来,并计算情感极性 182 | s = SnowNLP(comments[i]) 183 | score = s.sentiments 184 | sentiments_list.append(score) 185 | text_tag_config1(comments[i], j4, score) 186 | j4 += 1 187 | plt.hist(sentiments_list, bins=col) 188 | plt.xlabel("情感值") 189 | plt.ylabel("评论数目") 190 | plt.title('服务情感极性分布图') 191 | plt.show() 192 | plt.close() 193 | elif which == price: 194 | j5 = 1 195 | for i in range(len(scores)): 196 | for keyword in price_keywords: 197 | if keyword in comments[i]: 198 | ###对符合条件的抓取出来,并计算情感极性 199 | s = SnowNLP(comments[i]) 200 | score = s.sentiments 201 | sentiments_list.append(score) 202 | text_tag_config1(comments[i], j5, score) 203 | j5 += 1 204 | plt.hist(sentiments_list, bins=col) 205 | plt.xlabel("情感值") 206 | plt.ylabel("评论数目") 207 | plt.title('价格情感极性分布图') 208 | plt.show() 209 | plt.close() 210 | def all_display(which): 211 | if result is not None: 212 | text.delete(1.0, tk.END) 213 | comments = result["content"] 214 | if which == ALL: 215 | i = 1 216 | for comment in comments: 217 | text_tag_config(comment, i) 218 | i += 1 219 | def all_button_event(which): 220 | # 使用关键字找取对应的评论 221 | taste_keywords = ["闻","口感","吃",'喝','尝','味','下饭','咬','味道','卖相','新鲜','嫩'] 222 | speed_keywords = ["送",'速','到达','快','慢','催','快递','送达'] 223 | weight_keywords = ['量','分量','轻重','斤','少'] 224 | service_keywords = ["服务","态度",'语气'] 225 | price_keywords = ["价格", "钱",'实惠','价','贵','便宜'] 226 | if result is not None: 227 | text.delete(1.0, tk.END) 228 | comments = result["content"] 229 | scores = result["score"] 230 | comments = [comments[a[0]] for a in result["useful_comment_id"]] 231 | scores = [scores[a[0]] for a in result["useful_comment_id"]] 232 | # 按照评分和情感分数(snownlp展示)展示评论极性分数 233 | if which == positive: 234 | j6 = 1 235 | for i in range(len(scores)): 236 | s = SnowNLP(comments[i]) 237 | score = s.sentiments 238 | if scores[i] >= 4 and score > 0.7: 239 | text_tag_config1(comments[i], j6, score) 240 | j6 += 1 241 | elif which == medium: 242 | j7 = 1 243 | for i in range(len(scores)): 244 | s = SnowNLP(comments[i]) 245 | score = s.sentiments 246 | if 2 <= scores[i] < 4 and 0.3 <= score <= 0.7: 247 | text_tag_config1(comments[i], j7, score) 248 | j7 += 1 249 | elif which == negative: 250 | j8 = 1 251 | for i in range(len(scores)): 252 | s = SnowNLP(comments[i]) 253 | score = s.sentiments 254 | if scores[i] < 2 and score < 0.3: 255 | text_tag_config1(comments[i], j8, score) 256 | j8 += 1 257 | ###################后续分类打分 258 | elif which == taste_pos: 259 | j9 = 1 260 | for i in range(len(scores)): 261 | for keyword in taste_keywords: 262 | if keyword in comments[i]: 263 | ###对符合条件的抓取出来,并计算情感极性 264 | s = SnowNLP(comments[i]) 265 | score = s.sentiments 266 | if scores[i] >= 4 and score > 0.5: 267 | text_tag_config1(comments[i], j9, score) 268 | j9 += 1 269 | elif which == taste_neg: 270 | j10 = 1 271 | 
for i in range(len(scores)): 272 | for keyword in taste_keywords: 273 | if keyword in comments[i]: 274 | ###对符合条件的抓取出来,并计算情感极性 275 | s = SnowNLP(comments[i]) 276 | score = s.sentiments 277 | if scores[i] < 4 and score <= 0.5: 278 | text_tag_config1(comments[i], j10, score) 279 | j10 += 1 280 | elif which == speed_pos: 281 | j11 = 1 282 | for i in range(len(scores)): 283 | for keyword in speed_keywords: 284 | if keyword in comments[i]: 285 | ###对符合条件的抓取出来,并计算情感极性 286 | s = SnowNLP(comments[i]) 287 | score = s.sentiments 288 | if scores[i] >= 4 and score > 0.5: 289 | text_tag_config1(comments[i], j11, score) 290 | j11 += 1 291 | elif which == speed_neg: 292 | j12 = 1 293 | for i in range(len(scores)): 294 | for keyword in speed_keywords: 295 | if keyword in comments[i]: 296 | ###对符合条件的抓取出来,并计算情感极性 297 | s = SnowNLP(comments[i]) 298 | score = s.sentiments 299 | if scores[i] < 4 and score <= 0.5: 300 | text_tag_config1(comments[i], j12, score) 301 | j12 += 1 302 | elif which == weight_pos: 303 | j13 = 1 304 | for i in range(len(scores)): 305 | for keyword in weight_keywords: 306 | if keyword in comments[i]: 307 | ###对符合条件的抓取出来,并计算情感极性 308 | s = SnowNLP(comments[i]) 309 | score = s.sentiments 310 | if scores[i] >= 4 and score > 0.5: 311 | text_tag_config1(comments[i], j13, score) 312 | j13 += 1 313 | elif which == weight_neg: 314 | j14 = 1 315 | for i in range(len(scores)): 316 | for keyword in weight_keywords: 317 | if keyword in comments[i]: 318 | ###对符合条件的抓取出来,并计算情感极性 319 | s = SnowNLP(comments[i]) 320 | score = s.sentiments 321 | if scores[i] < 4 and score <= 0.5: 322 | text_tag_config1(comments[i], j14, score) 323 | j14 += 1 324 | elif which == service_pos: 325 | j15 = 1 326 | for i in range(len(scores)): 327 | for keyword in service_keywords: 328 | if keyword in comments[i]: 329 | ###对符合条件的抓取出来,并计算情感极性 330 | s = SnowNLP(comments[i]) 331 | score = s.sentiments 332 | if scores[i] >= 4 and score > 0.5: 333 | text_tag_config1(comments[i], j15, score) 334 | j15 += 1 335 | elif which == service_neg: 336 | j16 = 1 337 | for i in range(len(scores)): 338 | for keyword in service_keywords: 339 | if keyword in comments[i]: 340 | ###对符合条件的抓取出来,并计算情感极性 341 | s = SnowNLP(comments[i]) 342 | score = s.sentiments 343 | if scores[i] < 4 and score <= 0.5: 344 | text_tag_config1(comments[i], j16, score) 345 | j16 += 1 346 | elif which == price_pos: 347 | j17 = 1 348 | for i in range(len(scores)): 349 | for keyword in price_keywords: 350 | if keyword in comments[i]: 351 | ###对符合条件的抓取出来,并计算情感极性 352 | s = SnowNLP(comments[i]) 353 | score = s.sentiments 354 | if scores[i] >= 4 and score > 0.5: 355 | text_tag_config1(comments[i], j17, score) 356 | j17 += 1 357 | elif which == price_neg: 358 | j18 = 1 359 | for i in range(len(scores)): 360 | for keyword in price_keywords: 361 | if keyword in comments[i]: 362 | ###对符合条件的抓取出来,并计算情感极性 363 | s = SnowNLP(comments[i]) 364 | score = s.sentiments 365 | if scores[i] < 4 and score <= 0.5: 366 | text_tag_config1(comments[i], j18, score) 367 | j18 += 1 368 | root = tk.Tk() 369 | root.resizable(False, False) 370 | # 开始文本处理 371 | frame1 = tk.Frame(root, bd=1, relief=tk.SUNKEN) 372 | frame1.pack(fill=tk.BOTH, expand=tk.YES, anchor=tk.CENTER) 373 | # 输入框定义 374 | row_num = 0 375 | url_tv = tk.StringVar() 376 | url_tv_column_span = 9 377 | tk.Entry(frame1,textvariable=url_tv).grid( 378 | row=row_num, column=0, columnspan=url_tv_column_span,padx=2, sticky=all_direction) 379 | ####绑定按钮事件,将采集绑定到多线程开始采集按钮 380 | tk.Button(frame1, text="开始采集",command=data_collecting).grid( 381 | 
row=row_num, column=url_tv_column_span, sticky=all_direction) 382 | row_num = 1 383 | prompt_text = tk.StringVar() 384 | tk.Label(frame1, textvariable=prompt_text).grid(row=row_num, column=0, columnspan=8, pady=5, sticky=all_direction) 385 | prompt_text.set("请在上面输入数据采集的链接") 386 | # 设置按钮 387 | row_num = 2 388 | columnspan = 2 389 | tk.Button(frame1, text="所有评论展示", command=lambda: all_display(ALL)).grid( 390 | row=row_num, column=columnspan * 0, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 391 | ##使用commands绑定动作 392 | tk.Button(frame1, text="总体情感趋势", command=lambda: emotion_analysis(ALL1)).grid( 393 | row=row_num, column=columnspan * 1, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 394 | tk.Button(frame1, text="积极评论分析", command=lambda: all_button_event(positive)).grid( 395 | row=row_num, column=columnspan * 2, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 396 | tk.Button(frame1, text="一般评论分析", command=lambda: all_button_event(medium)).grid( 397 | row=row_num, column=columnspan * 3, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 398 | tk.Button(frame1, text="消极评论分析", command=lambda: all_button_event(negative)).grid( 399 | row=row_num, column=columnspan * 4, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 400 | row_num = 3 401 | tk.Button(frame1, text="味道积极评论", command=lambda: all_button_event(taste_pos)).grid( 402 | row=row_num, column=columnspan * 0, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 403 | tk.Button(frame1, text="配送积极评论", command=lambda: all_button_event(speed_pos)).grid( 404 | row=row_num, column=columnspan * 1, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 405 | tk.Button(frame1, text="份量积极评论", command=lambda: all_button_event(weight_pos)).grid( 406 | row=row_num, column=columnspan * 2, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 407 | tk.Button(frame1, text="服务积极评论", command=lambda: all_button_event(service_pos)).grid( 408 | row=row_num, column=columnspan * 3, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 409 | tk.Button(frame1, text="价格积极评论", command=lambda: all_button_event(price_pos)).grid( 410 | row=row_num, column=columnspan * 4, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 411 | row_num = 4 412 | tk.Button(frame1, text="味道消极评论", command=lambda: all_button_event(taste_neg)).grid( 413 | row=row_num, column=columnspan * 0, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 414 | tk.Button(frame1, text="配送消极评论", command=lambda: all_button_event(speed_neg)).grid( 415 | row=row_num, column=columnspan * 1, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 416 | tk.Button(frame1, text="份量消极评论", command=lambda: all_button_event(weight_neg)).grid( 417 | row=row_num, column=columnspan * 2, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 418 | tk.Button(frame1, text="服务消极评论", command=lambda: all_button_event(service_neg)).grid( 419 | row=row_num, column=columnspan * 3, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 420 | tk.Button(frame1, text="价格消极评论", command=lambda: all_button_event(price_neg)).grid( 421 | row=row_num, column=columnspan * 4, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 422 | row_num = 5 423 | tk.Button(frame1, text="味道情感趋势", command=lambda: emotion_pic(taste)).grid( 424 | row=row_num, column=columnspan * 0, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 425 | tk.Button(frame1, text="配送情感趋势", command=lambda: emotion_pic(speed)).grid( 426 | 
row=row_num, column=columnspan * 1, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 427 | tk.Button(frame1, text="份量情感趋势", command=lambda: emotion_pic(weight)).grid( 428 | row=row_num, column=columnspan * 2, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 429 | tk.Button(frame1, text="服务情感趋势", command=lambda: emotion_pic(service)).grid( 430 | row=row_num, column=columnspan * 3, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 431 | tk.Button(frame1, text="价格情感趋势", command=lambda: emotion_pic(price)).grid( 432 | row=row_num, column=columnspan * 4, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 433 | #统计分析 434 | frame0 = tk.LabelFrame(root, text="评论数字图表区", padx=2, pady=2, relief=tk.GROOVE) 435 | frame0.pack(fill=tk.BOTH, expand=tk.YES) 436 | columnspan = 7 437 | # 调节按钮之间的行距,可视化文本之中的数据 438 | ##配送时间汇总,推荐商品汇总,终端分布,质量分布,按钮绑定事件,row表示行数 439 | tk.Button(frame0, text="整体评分分析", command=lambda: picturing.score_detail(result)).grid( 440 | row=1, column=columnspan * 4, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 441 | tk.Button(frame0, text="指标评分分析", command=lambda: picturing.average_score(result)).grid( 442 | row=1, column=columnspan * 0, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 443 | tk.Button(frame0, text="终端统计分析", command=lambda: picturing.s_from(result)).grid( 444 | row=1, column=columnspan * 1, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 445 | tk.Button(frame0, text="热推商品展示", command=lambda: picturing.recommend_dishes2(result)).grid( 446 | row=1, column=columnspan * 2, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 447 | tk.Button(frame0, text="配送时间分析", command=lambda: picturing.cost_time(result)).grid( 448 | row=1, column=columnspan * 3, columnspan=columnspan, padx=2, pady=2, sticky=all_direction) 449 | # 开始文本处理 450 | frame2 = tk.Frame(root, bd=1, relief=tk.SUNKEN) 451 | frame2.pack(fill=tk.BOTH, expand=tk.YES, anchor=tk.CENTER) 452 | row_num = 5 453 | #文本框更改hight和width更改大小 454 | text = tk.Text(frame2, height=30, width=56) 455 | text.grid(row=row_num, column=0, columnspan=11, padx=10, pady=10) 456 | scrollbar = tk.Scrollbar(frame2, orient=tk.VERTICAL, command=text.yview) 457 | scrollbar.grid(row=row_num, column=11, rowspan=1, sticky=all_direction) 458 | text.configure(yscrollcommand=scrollbar.set) 459 | root.title('情感分析在电商评价中的应用demo') 460 | root.mainloop() -------------------------------------------------------------------------------- /UGC_analysis/crawler.txt: -------------------------------------------------------------------------------- 1 | {"service_score": [5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 4, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 4, 5, 3, 5, 5, 4, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 2, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 
5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 3, 4, 5, 5, 5, 4], "useful_comment_id": [[1, 1], [3, 1], [4, 1], [10, 1], [11, 1], [13, 1], [14, 1], [16, 1], [24, 1], [25, 1], [26, 1], [27, 1], [29, 1], [31, 1], [32, 1], [33, 1], [34, 1], [35, 1], [38, 1], [39, 1], [40, 1], [43, 1], [46, 1], [48, 1], [49, 1], [50, 1], [51, 1], [52, 1], [56, 1], [57, 1], [58, 1], [59, 1], [60, 1], [61, 1], [62, 1], [63, 1], [66, 1], [67, 1], [71, 1], [72, 1], [74, 1], [75, 1], [76, 1], [77, 1], [78, 1], [80, 1], [82, 1], [83, 1], [84, 1], [85, 1], [88, 1], [89, 1], [91, 1], [93, 1], [94, 1], [95, 1], [96, 1], [97, 1], [98, 1], [99, 1], [100, 1], [101, 1], [102, 1], [103, 1], [104, 1], [106, 1], [107, 1], [109, 1], [110, 1], [111, 1], [112, 1], [114, 1], [116, 1], [117, 1], [118, 1], [119, 1], [120, 1], [122, 1], [124, 1], [125, 1], [127, 1], [128, 1], [129, 1], [131, 1], [134, 1], [137, 1], [138, 1], [140, 1], [141, 1], [142, 1], [144, 1], [146, 1], [149, 1], [153, 1], [157, 1], [159, 1], [164, 1], [165, 1], [170, 1], [173, 1], [175, 1], [176, 1], [177, 1], [179, 1], [180, 1], [187, 1], [188, 1], [190, 1], [191, 1], [192, 1], [193, 1], [195, 1], [199, 1], [201, 1], [202, 1], [204, 1], [212, 1], [215, 1], [216, 1], [218, 1], [219, 1], [220, 1], [222, 1], [223, 1], [224, 1], [227, 1], [228, 0], [231, 1], [232, 1], [233, 1], [235, 1], [241, 1], [243, 1], [244, 1], [245, 1], [247, 1], [250, 1], [251, 1], [252, 1], [257, 1], [258, 1], [259, 1], [260, 1], [261, 1], [262, 1], [263, 1], [265, 1], [266, 1], [269, 1], [270, 1], [271, 1], [272, 1], [273, 1], [274, 1], [275, 1], [276, 1], [277, 1], [278, 1], [279, 1], [284, 1], [286, 1], [287, 1], [288, 1], [291, 1], [292, 1], [293, 1], [294, 1], [295, 1], [297, 1], [298, 1], [300, 1], [302, 1], [305, 1], [307, 1], [308, 1], [309, 1], [311, 1], [317, 1], [318, 1], [319, 1], [320, 1], [321, 1], [322, 1], [324, 1], [326, 1], [327, 1], [330, 1], [334, 1], [335, 1], [339, 1], [341, 1], [342, 1], [346, 1], [348, 1], [349, 1], [351, 1], [352, 1], [353, 0], [354, 1], [355, 1], [356, 1], [357, 1], [359, 1], [361, 1], [363, 1], [364, 1], [365, 1], [366, 1], [367, 1], [368, 1], [369, 1], [370, 1], [371, 1], [372, 1], [373, 1], [374, 1], [379, 1], [380, 1], [381, 1], [382, 1], [383, 1], [384, 1], [385, 1], [387, 1], [388, 1], [389, 1], [390, 1], [391, 1]], "recommend_dishes": {"2磅四重奏生日蛋糕": 4, "榴芒双拼节日蛋糕(2磅)": 4, "芒果千层(10块装)": 5, "浪漫果纷鲜果蛋糕_1磅_134800": 1, "1.5磅小熊先生芒果慕斯蛋糕": 1, "潘多拉之心蛋糕_2磅_121671": 1, "芒果千层下午茶": 2, "莓果乐悠悠下午茶(约8寸)": 1, "萌趣咕咕蛋糕": 1, "芒果拿破仑9块装限时优惠": 1, "(芒果千层+榴莲千层)套餐": 2, "芒果千层10块装": 6, "芒果茫茫生日蛋糕-2磅": 1, "雪顶榴心(约8寸,共9块)": 3, "2磅爱尔兰百利甜榛果蛋糕": 1, "榴莲千层(10块装)": 10, "雪顶榴心9块装限时特惠": 1, "举个栗子(约8寸,共9块)": 2, "2磅四重奏蛋糕": 1, "2磅草莓白天使蛋糕": 3, "黑白恋曲_2磅": 1, "(特价)榴莲千层10块装": 1, "榴莲千层10块装": 9, "雪顶榴心(9块装)": 2, "芒果拿破仑9块装": 5, "2磅草莓乐园蛋糕": 1, "举个栗子(9块装)": 1, "2磅四重奏生日蛋糕C": 2, "芒果拿破仑(9块装)": 3, "2磅浪漫果纷蛋糕": 3, "榴莲千层(约8寸/10块装)": 23, "四重奏蛋糕C_2磅": 2, "鲜果嘉年华蛋糕(2磅)": 1, "雪顶榴心9块装s": 1, "单恋黑森林_2磅": 1, "森林果王鲜果蛋糕C_2磅": 1, "全心全意鲜果蛋糕C_2磅_105579": 1, "3磅四重奏生日蛋糕": 2, "鲜果嘉年华蛋糕": 1, "芒果拿破仑下午茶": 6, "2磅黄金彼岸蛋糕": 1, "芒果千层(约8寸/10块装)": 9, "海盐脆脆下午茶(9块装)": 1, "2磅芒果茫茫鲜果蛋糕": 2, "香醇魔方(约8寸,共9块)": 3, "四重奏生日蛋糕 2磅": 3, "榴莲千层(10块装C)": 6, "2磅全心全意水果蛋糕": 3, "芒果千层+雪顶榴心": 1, "榴莲香雪生日蛋糕C_2磅_101013": 1, "2磅莓园草莓蛋糕": 2, "森林果王鲜果蛋糕C_3磅": 2, "芒果千层(10块装C)": 2, "四重奏蛋糕": 1, "四重奏蛋糕_2磅": 1, "2磅榴芒双拼节日蛋糕(新款)": 2, "两情相悦_2磅": 1, "2磅芒果茫茫生日蛋糕": 1, "草莓千层(约8寸/10块装)": 2}, 
"average_score": {"average_service_score": 4.9, "average_dish_score": 4.8, "average_score": 4.9}, "score": [3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 4, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 4, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 3, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 4, 5, 5, 5, 5, 5, 3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 4, 5, 5, 5, 4], "comment_num": 392, "weeks_score": {"last_three_week": 3.0, "last_two_week": 0.0, "last_one_week": 0.0}, "score_detail": {"2": 0, "3": 6, "4": 23, "5": 363, "1": 0}, "cost_time": [420, 421, 180, 421, 420, 3811, 421, 420, 421, 2039, 421, 203, 420, 421, 421, 324, 421, 420, 445, 420, 567, 669, 179, 421, 307, 420, 421, 1202, 778, 180, 420, 421, 420, 420, 230, 421, 700, 945, 310, 5330, 421, 180, 421, 421, 179, 420, 239, 420, 420, 179, 634, 420, 179, 1787, 0, 420, 420, 181, 646, 421, 598, 420, 420, 3332, 462, 585, 181, 430, 420, 180, 270, 419, 179, 420, 420, 420, 420, 830, 32, 420, 652, 546, 420, 1558, 1260, 421, 420, 419, 421, 420, 420, 120, 180, 421, 180, 180, 421, 241, 0, 421, 421, 421, 420, 420, 420, 420, 1197, 421, 420, 420, 39, 473, 289, 2485, 421, 180, 420, 522, 1378, 185, 420, 421, 419, 421, 421, 5955, 647, 179, 0, 421, 420, 179, 180, 420, 179, 421, 421, 1694, 420, 419, 1702, 330, 420, 419, 640, 0, 246, 186, 421, 421, 179, 420, 420, 421, 421, 208, 420, 673, 180, 484, 420, 1035, 1517, 179, 591, 420, 180, 3095, 420, 420, 421, 420, 420, 421, 179, 179, 420, 421, 1314, 0, 421, 179, 420, 421, 420, 421, 421, 421, 421, 420, 421, 420, 1899, 420, 572, 421, 1061, 180, 1648, 420, 421, 161, 180, 164, 288, 420, 642, 4534, 420, 81, 443, 0, 421, 179, 516, 139, 421, 113, 512, 118, 423, 137, 236, 420, 421, 82, 421, 185, 254, 420, 420, 420, 421, 420, 153, 173, 104, 108, 147, 423, 420, 355, 201, 1596, 378, 106, 329, 115, 119, 421, 1511, 84, 336, 113, 180, 140, 119, 180, 251, 420, 374, 89, 158, 420, 192, 136, 195, 420, 275, 474, 420, 94, 179, 98, 179, 421, 547, 59, 420, 1582, 179, 421, 153, 183, 175, 520, 358, 420, 179, 67, 420, 111, 152, 1529, 234, 80, 421, 0, 84, 421, 300, 407, 145, 271, 121, 138, 142, 130, 116, 174, 421, 179, 216, 344, 421, 541, 0, 129, 1263, 420, 231, 541, 207, 131, 541, 2132, 540, 540, 2012, 540, 252, 701, 220, 4860, 571, 299, 540, 115, 540, 541, 445, 540, 351, 541, 541, 541, 0, 540, 540, 3558, 0, 581, 1067, 540, 142, 541, 0, 540, 541, 541, 541, 277, 176, 332, 333, 139, 0, 300, 233, 295, 540, 246, 108, 82, 297, 540, 540, 299, 0, 303, 325, 86, 540, 540, 540, 174, 541, 540, 541, 1157, 299, 366], "rubbish_comment_id": [[0, 0], [2, 1], [5, 1], [6, 1], [7, 1], [8, 1], [9, 1], [12, 1], [15, 1], [17, 1], [18, 1], [19, 1], [20, 1], 
[21, 1], [22, 1], [23, 1], [28, 1], [30, 1], [36, 1], [37, 1], [41, 1], [42, 0], [44, 1], [45, 1], [47, 1], [53, 1], [54, 1], [55, 1], [64, 1], [65, 1], [68, 1], [69, 1], [70, 1], [73, 1], [79, 1], [81, 1], [86, 1], [87, 1], [90, 1], [92, 1], [105, 1], [108, 1], [113, 1], [115, 1], [121, 1], [123, 1], [126, 1], [130, 1], [132, 1], [133, 1], [135, 1], [136, 1], [139, 1], [143, 1], [145, 1], [147, 1], [148, 1], [150, 1], [151, 0], [152, 1], [154, 1], [155, 1], [156, 1], [158, 1], [160, 1], [161, 1], [162, 1], [163, 1], [166, 1], [167, 1], [168, 1], [169, 1], [171, 1], [172, 1], [174, 1], [178, 1], [181, 1], [182, 1], [183, 1], [184, 1], [185, 1], [186, 1], [189, 1], [194, 1], [196, 1], [197, 1], [198, 1], [200, 1], [203, 1], [205, 1], [206, 1], [207, 1], [208, 1], [209, 1], [210, 1], [211, 1], [213, 1], [214, 1], [217, 1], [221, 1], [225, 1], [226, 1], [229, 1], [230, 1], [234, 1], [236, 1], [237, 1], [238, 1], [239, 1], [240, 1], [242, 1], [246, 1], [248, 1], [249, 1], [253, 1], [254, 1], [255, 1], [256, 1], [264, 1], [267, 1], [268, 1], [280, 1], [281, 1], [282, 1], [283, 1], [285, 1], [289, 1], [290, 1], [296, 1], [299, 1], [301, 1], [303, 1], [304, 1], [306, 1], [310, 1], [312, 1], [313, 1], [314, 1], [315, 1], [316, 1], [323, 1], [325, 1], [328, 1], [329, 1], [331, 1], [332, 1], [333, 1], [336, 1], [337, 1], [338, 1], [340, 1], [343, 1], [344, 1], [345, 1], [347, 1], [350, 1], [358, 1], [360, 1], [362, 1], [375, 1], [376, 1], [377, 1], [378, 1], [386, 0]], "sfrom": ["iphone", "bn-ios", "android", "bn-android", "bn-android", "bn-android", "bn-android", "bn-android", "bn-android", "bn-ios", "bn-android", "iphone", "bn-android", "bn-android", "bn-ios", "android", "bn-ios", "iphone", "iphone", "bn-android", "android", "android", "android", "iphone", "android", "bn-android", "iphone", "iphone", "iphone", "android", "iphone", "iphone", "bn-android", "iphone", "bn-ios", "bn-android", "bn-android", "iphone", "bn-ios", "iphone", "bn-ios", "android", "bn-ios", "iphone", "android", "iphone", "android", "iphone", "iphone", "android", "iphone", "bn-ios", "android", "iphone", "android", "iphone", "bn-android", "android", "bn-android", "iphone", "bn-ios", "iphone", "bn-android", "iphone", "bn-ios", "bn-ios", "webapp", "bn-android", "bn-ios", "android", "bn-ios", "bn-android", "android", "bn-android", "iphone", "bn-android", "bn-ios", "bn-ios", "android", "bn-ios", "bn-android", "iphone", "iphone", "bn-android", "webapp", "iphone", "bn-ios", "iphone", "iphone", "iphone", "bn-android", "webapp", "android", "bn-ios", "android", "android", "iphone", "bn-ios", "android", "bn-ios", "bn-ios", "bn-android", "bn-android", "bn-android", "bn-android", "iphone", "bn-ios", "bn-ios", "iphone", "bn-ios", "bn-ios", "bn-ios", "bn-ios", "bn-android", "iphone", "android", "bn-android", "bn-ios", "bn-android", "bn-ios", "bn-android", "iphone", "bn-ios", "iphone", "bn-android", "bn-android", "bn-ios", "android", "android", "iphone", "iphone", "android", "android", "iphone", "android", "bn-ios", "iphone", "iphone", "iphone", "iphone", "bn-android", "android", "bn-ios", "bn-ios", "bn-ios", "android", "android", "android", "bn-ios", "bn-ios", "android", "iphone", "bn-android", "bn-ios", "bn-ios", "bn-ios", "bn-ios", "bn-ios", "android", "bn-android", "iphone", "bn-android", "bn-android", "android", "bn-ios", "bn-android", "android", "bn-ios", "bn-android", "iphone", "bn-android", "bn-android", "iphone", "bn-android", "android", "android", "bn-ios", "bn-android", "iphone", "bn-android", "bn-ios", "android", "bn-ios", 
"iphone", "bn-ios", "bn-ios", "iphone", "bn-ios", "bn-ios", "iphone", "bn-android", "bn-ios", "iphone", "bn-android", "iphone", "iphone", "iphone", "android", "bn-ios", "iphone", "bn-android", "bn-ios", "android", "iphone", "bn-ios", "iphone", "iphone", "bn-ios", "bn-android", "bn-ios", "iphone", "android", "bn-ios", "android", "bn-android", "bn-ios", "bn-android", "bn-android", "bn-ios", "iphone", "iphone", "bn-android", "bn-ios", "iphone", "iphone", "iphone", "bn-android", "android", "iphone", "bn-ios", "iphone", "bn-ios", "bn-android", "bn-android", "bn-android", "bn-android", "android", "android", "bn-ios", "bn-ios", "iphone", "iphone", "iphone", "bn-ios", "bn-ios", "iphone", "iphone", "bn-ios", "iphone", "iphone", "iphone", "bn-ios", "bn-android", "iphone", "android", "android", "android", "android", "bn-android", "bn-android", "bn-android", "bn-ios", "bn-ios", "bn-android", "iphone", "bn-android", "bn-android", "iphone", "bn-ios", "bn-ios", "bn-ios", "iphone", "android", "iphone", "android", "bn-android", "android", "bn-android", "bn-android", "bn-ios", "android", "bn-ios", "android", "iphone", "iphone", "iphone", "iphone", "bn-android", "iphone", "iphone", "bn-android", "iphone", "bn-ios", "bn-android", "iphone", "bn-ios", "bn-ios", "android", "iphone", "iphone", "iphone", "iphone", "android", "iphone", "bn-ios", "bn-ios", "iphone", "iphone", "iphone", "bn-ios", "bn-ios", "android", "bn-ios", "android", "bn-ios", "bn-ios", "android", "bn-ios", "iphone", "iphone", "bn-android", "bn-android", "android", "bn-ios", "iphone", "bn-ios", "iphone", "bn-ios", "iphone", "bn-ios", "bn-android", "iphone", "bn-ios", "bn-ios", "bn-ios", "iphone", "bn-ios", "bn-ios", "bn-ios", "bn-ios", "iphone", "bn-ios", "bn-ios", "bn-android", "bn-android", "bn-android", "android", "iphone", "bn-android", "bn-ios", "android", "bn-ios", "bn-ios", "bn-android", "bn-ios", "bn-android", "android", "bn-android", "bn-android", "bn-android", "iphone", "bn-android", "android", "bn-ios", "iphone", "bn-ios", "android", "android", "bn-ios", "bn-android", "iphone", "iphone", "iphone", "bn-ios", "bn-ios", "iphone", "iphone", "iphone", "android", "bn-ios", "iphone", "iphone", "bn-android", "iphone", "bn-android", "iphone", "iphone", "bn-android", "bn-ios", "bn-android", "iphone", "iphone"], "create_time": ["2018-06-11 18:56:56", "2018-05-11 17:11:07", "2018-05-09 09:03:39", "2018-04-27 18:35:49", "2018-04-18 22:31:54", "2018-04-16 08:50:14", "2018-04-14 19:31:30", "2018-04-05 11:24:23", "2018-04-02 21:52:44", "2018-03-31 11:38:15", "2018-03-26 10:01:56", "2018-03-25 20:38:43", "2018-03-13 09:07:06", "2018-03-11 21:25:49", "2018-03-07 09:07:23", "2018-03-03 08:54:44", "2018-02-13 14:45:23", "2018-02-12 10:34:28", "2018-02-08 12:18:29", "2018-02-05 16:09:01", "2018-01-26 14:45:01", "2018-01-26 14:44:46", "2018-01-26 14:44:24", "2018-01-24 22:08:07", "2018-01-23 21:17:19", "2018-01-21 13:39:16", "2018-01-17 15:34:18", "2018-01-16 11:56:16", "2018-01-15 09:25:26", "2018-01-15 09:14:55", "2018-01-14 23:58:48", "2018-01-12 00:18:28", "2018-01-11 11:34:14", "2018-01-10 22:20:37", "2018-01-08 22:17:06", "2018-01-08 20:45:43", "2018-01-06 00:21:33", "2018-01-03 22:30:25", "2017-12-31 08:02:02", "2017-12-29 01:26:33", "2017-12-28 16:45:45", "2017-12-24 10:09:39", "2017-12-22 12:40:58", "2017-12-22 12:23:12", "2017-12-22 10:54:14", "2017-12-22 10:05:38", "2017-12-20 20:37:32", "2017-12-20 18:50:08", "2017-12-19 21:32:26", "2017-12-18 19:11:06", "2017-12-11 22:14:33", "2017-12-07 17:59:15", "2017-12-07 12:04:33", "2017-12-06 22:57:10", 
"2017-11-25 17:17:58", "2017-11-23 13:28:15", "2017-11-20 10:36:09", "2017-11-19 15:55:12", "2017-11-18 08:59:28", "2017-11-15 19:39:41", "2017-11-14 22:20:08", "2017-11-14 12:25:53", "2017-11-08 09:13:39", "2017-11-03 16:24:30", "2017-10-27 04:59:53", "2017-10-22 11:01:32", "2017-10-21 02:15:32", "2017-10-17 18:02:56", "2017-10-09 13:21:35", "2017-10-04 15:16:15", "2017-10-02 19:28:19", "2017-09-29 18:16:33", "2017-09-19 09:08:43", "2017-09-17 23:27:32", "2017-09-11 17:19:21", "2017-09-09 11:40:19", "2017-09-05 06:56:57", "2017-09-04 21:36:46", "2017-09-02 21:33:03", "2017-08-31 10:57:00", "2017-08-29 09:01:58", "2017-08-28 23:27:03", "2017-08-27 14:35:47", "2017-08-26 12:19:57", "2017-08-24 11:49:05", "2017-08-21 09:09:36", "2017-08-19 13:05:22", "2017-08-17 19:03:56", "2017-08-15 22:54:50", "2017-08-15 11:40:14", "2017-08-14 12:53:06", "2017-08-14 11:42:53", "2017-08-13 16:39:33", "2017-08-13 10:27:12", "2017-08-12 09:11:59", "2017-08-12 09:11:00", "2017-08-11 16:44:59", "2017-08-10 16:14:15", "2017-08-10 15:50:26", "2017-08-10 10:02:02", "2017-08-09 15:37:56", "2017-08-08 19:50:08", "2017-08-07 21:30:43", "2017-08-07 17:06:15", "2017-08-06 11:09:29", "2017-08-05 14:25:59", "2017-08-05 13:37:59", "2017-08-03 12:27:27", "2017-08-02 11:47:32", "2017-08-02 11:15:59", "2017-08-01 11:26:50", "2017-08-01 08:04:55", "2017-08-01 08:03:04", "2017-07-31 16:26:51", "2017-07-30 13:00:18", "2017-07-28 15:00:18", "2017-07-25 22:31:06", "2017-07-23 20:31:42", "2017-07-20 15:45:16", "2017-07-19 21:49:38", "2017-07-18 23:12:11", "2017-07-18 00:05:30", "2017-07-17 21:54:12", "2017-07-17 08:20:23", "2017-07-16 23:58:08", "2017-07-15 13:36:35", "2017-07-14 11:27:08", "2017-07-14 10:41:57", "2017-06-30 22:22:08", "2017-06-26 16:02:34", "2017-06-24 13:46:39", "2017-06-23 11:02:36", "2017-06-20 12:23:34", "2017-06-18 10:57:39", "2017-06-18 07:39:00", "2017-06-17 22:34:16", "2017-06-16 15:51:29", "2017-06-16 15:19:00", "2017-06-12 13:37:37", "2017-06-11 10:54:01", "2017-06-08 23:57:57", "2017-06-05 20:22:00", "2017-06-03 00:05:08", "2017-05-30 23:01:14", "2017-05-29 09:41:02", "2017-05-28 18:11:30", "2017-05-28 09:51:59", "2017-05-27 10:12:56", "2017-05-26 17:40:49", "2017-05-24 20:13:07", "2017-05-22 14:43:00", "2017-05-22 12:53:49", "2017-05-20 19:35:08", "2017-05-17 04:50:29", "2017-05-12 06:47:15", "2017-05-05 16:52:36", "2017-04-29 14:51:33", "2017-04-28 22:22:18", "2017-04-28 09:26:30", "2017-04-27 23:30:59", "2017-04-26 13:29:43", "2017-04-18 10:05:02", "2017-04-18 10:04:55", "2017-04-15 20:43:37", "2017-04-14 21:07:45", "2017-04-10 12:54:01", "2017-04-09 14:19:46", "2017-04-08 21:24:45", "2017-04-08 15:35:15", "2017-04-07 19:38:24", "2017-04-04 16:49:50", "2017-03-28 15:36:12", "2017-03-26 11:19:20", "2017-03-25 18:09:56", "2017-03-25 16:37:11", "2017-03-24 23:38:49", "2017-03-24 00:02:56", "2017-03-23 20:49:06", "2017-03-23 20:05:07", "2017-03-22 21:51:08", "2017-03-21 15:19:33", "2017-03-16 16:53:50", "2017-03-10 01:05:33", "2017-03-08 15:34:45", "2017-03-07 12:24:12", "2017-02-28 13:03:52", "2017-02-25 21:34:14", "2017-02-25 11:20:52", "2017-02-25 00:42:58", "2017-02-21 10:44:37", "2017-02-18 13:09:04", "2017-02-16 20:58:55", "2017-02-16 11:31:49", "2017-02-14 16:25:14", "2017-02-03 08:25:45", "2017-02-01 12:25:25", "2017-01-24 21:36:57", "2017-01-22 21:26:11", "2017-01-20 08:31:56", "2017-01-17 09:47:52", "2017-01-14 15:04:09", "2017-01-11 19:15:35", "2017-01-11 16:42:10", "2017-01-10 17:06:14", "2017-01-03 22:15:16", "2016-12-30 11:35:13", "2016-12-28 14:08:09", "2016-12-26 23:16:23", "2016-12-23 
11:35:51", "2016-12-23 10:40:14", "2016-12-20 20:52:19", "2016-12-13 13:06:46", "2016-12-13 10:57:57", "2016-12-12 21:42:25", "2016-12-12 20:31:23", "2016-12-12 14:13:11", "2016-12-11 10:23:51", "2016-12-09 17:05:00", "2016-12-07 21:00:45", "2016-12-07 16:59:47", "2016-12-06 16:18:40", "2016-12-06 14:03:54", "2016-12-05 20:35:35", "2016-12-02 11:18:50", "2016-11-29 18:51:33", "2016-11-28 19:12:27", "2016-11-26 03:05:13", "2016-11-25 20:48:53", "2016-11-23 19:36:42", "2016-11-23 01:34:06", "2016-11-22 12:32:08", "2016-11-20 11:24:54", "2016-11-20 10:57:14", "2016-11-19 23:18:44", "2016-11-19 19:59:11", "2016-11-19 14:25:26", "2016-11-16 17:34:26", "2016-11-16 14:48:01", "2016-11-14 15:42:19", "2016-11-13 19:57:44", "2016-11-13 15:29:05", "2016-11-11 19:24:08", "2016-11-09 18:37:29", "2016-11-08 12:06:06", "2016-11-07 19:31:07", "2016-11-07 18:12:26", "2016-11-04 18:21:38", "2016-11-04 18:20:13", "2016-10-23 17:04:53", "2016-10-22 18:29:23", "2016-10-21 16:43:35", "2016-10-20 15:37:27", "2016-10-19 19:38:59", "2016-10-18 11:35:53", "2016-10-18 11:09:51", "2016-10-17 21:46:52", "2016-10-13 16:31:02", "2016-10-10 20:48:18", "2016-10-10 13:11:50", "2016-10-09 18:27:03", "2016-10-09 16:39:30", "2016-10-08 14:07:16", "2016-10-08 13:01:17", "2016-10-07 22:51:21", "2016-10-06 22:40:22", "2016-10-06 20:55:12", "2016-10-06 18:08:37", "2016-10-05 01:14:21", "2016-09-29 12:45:02", "2016-09-27 12:16:52", "2016-09-27 11:12:18", "2016-09-27 10:48:34", "2016-09-25 18:59:02", "2016-09-25 15:01:59", "2016-09-23 11:29:41", "2016-09-22 20:15:28", "2016-09-22 18:32:43", "2016-09-20 18:07:05", "2016-09-20 17:54:01", "2016-09-20 00:50:24", "2016-09-19 18:13:44", "2016-09-18 15:11:14", "2016-09-17 15:00:44", "2016-09-17 12:36:11", "2016-09-16 14:06:52", "2016-09-13 09:00:33", "2016-09-09 16:44:05", "2016-09-07 22:00:49", "2016-09-03 23:09:21", "2016-08-31 18:38:34", "2016-08-27 23:32:22", "2016-08-22 17:12:08", "2016-08-21 20:28:49", "2016-08-20 18:18:47", "2016-08-20 18:16:02", "2016-08-18 17:54:11", "2016-08-18 16:48:40", "2016-08-18 16:23:22", "2016-08-16 21:35:55", "2016-08-16 15:26:21", "2016-08-15 21:49:47", "2016-08-15 20:44:35", "2016-08-15 18:46:53", "2016-08-13 21:17:40", "2016-08-13 18:31:33", "2016-08-13 12:05:19", "2016-08-11 13:57:33", "2016-08-06 13:54:46", "2016-08-05 12:13:01", "2016-08-04 16:03:03", "2016-08-03 11:42:38", "2016-08-02 18:32:53", "2016-07-30 15:51:04", "2016-07-29 13:29:12", "2016-07-26 13:31:22", "2016-07-23 15:19:36", "2016-07-20 18:47:01", "2016-07-19 20:23:14", "2016-07-18 06:43:26", "2016-07-16 23:13:31", "2016-07-16 13:37:47", "2016-07-13 13:37:41", "2016-07-08 18:50:04", "2016-07-03 17:03:50", "2016-07-03 15:06:28", "2016-07-03 00:12:33", "2016-07-02 17:12:16", "2016-07-02 16:10:13", "2016-06-30 19:27:04", "2016-06-30 12:55:59", "2016-06-24 19:52:20", "2016-06-24 14:11:06", "2016-06-23 13:57:11", "2016-06-22 11:04:08", "2016-06-21 22:32:30", "2016-06-21 15:43:50", "2016-06-19 10:54:39", "2016-06-18 21:49:18", "2016-06-18 20:36:15", "2016-06-18 16:18:04", "2016-06-17 21:34:14", "2016-06-17 16:21:44", "2016-06-16 20:28:49", "2016-06-12 15:44:19", "2016-06-11 19:12:02", "2016-06-11 12:26:40", "2016-06-11 10:31:15", "2016-06-10 14:35:35", "2016-06-10 14:28:42", "2016-06-06 12:57:21", "2016-06-05 14:57:25", "2016-06-04 21:39:30", "2016-06-04 14:43:32", "2016-06-03 17:56:14", "2016-06-03 14:06:35", "2016-06-03 10:57:29", "2016-06-01 20:18:21", "2016-05-30 18:46:09", "2016-05-29 17:35:14", "2016-05-28 20:47:33", "2016-05-28 19:14:05", "2016-05-28 15:48:44", "2016-05-27 19:22:16", 
"2016-05-26 20:50:21", "2016-05-26 15:48:46", "2016-05-23 19:24:00", "2016-05-22 19:53:18", "2016-05-22 16:06:46", "2016-05-20 15:06:31", "2016-05-20 12:55:22", "2016-05-19 23:08:07", "2016-05-15 17:10:19", "2016-05-14 17:39:53", "2016-05-11 13:05:39", "2016-05-08 15:47:07", "2016-05-08 15:46:04", "2016-05-08 15:45:48", "2016-05-08 14:11:33", "2016-05-06 00:00:20", "2016-05-03 23:49:58", "2016-04-30 21:33:18", "2016-04-30 15:01:53", "2016-04-30 11:43:03", "2016-04-28 10:11:34", "2016-04-23 22:50:01", "2016-04-18 10:52:42", "2016-04-13 17:22:42", "2016-04-10 23:46:33", "2016-04-09 16:03:41", "2016-04-04 15:08:13", "2016-03-31 21:08:10", "2016-03-26 17:29:08"], "content": ["", "买过很多次了,非常好", "", "很好吃,送货很快,到手时冰冰的很新鲜。", "很好吃,味道不错,会再次购买", "", "", "", "", "", "多次购买了,很棒,很好吃", "材料还是多新鲜的,就是这个送达时间不说迟到一分钟一元吗,感觉只是一个噱头,感觉派送员也不容易,算了吧", "", "第二次购买了,送货超快,当天下午买晚上就收到了,味道不错,榴莲再多点就更好了。", "还挺快的 我以为要等很久呢 味道也不错的 榴莲量也够", "", "你以为我是猪么,那么大一块,能不能做小一点,品质还一般", "", "", "", "", "", "", "", "很好,就是水果有点酸。不过水果很多这点很满意", "很不错的,很好吃,又买了一个", "这是第二次买芒果千层,不晓得是上午定的原因还是什么,里面的芒果都是被冰包着的,吃起都没有芒果的味道……", "嗯嗯。这个蛋糕还是非常不错的 而且很好吃 价格超级的实惠的 #2磅四重奏蛋糕#", "", "还是很不错的,满意", "", "很大一个。。。芒果千层比榴莲千层分量足。感觉榴莲千层里面都没榴莲。建议改善一下。", "味道不错,服务周到。", "还不错,配送员也很好", "真的非常赞", "榴莲味很浓,我和弟弟都特喜", "", "", "谢谢配送员大冬天送餐,很好吃。活动划算。还会再来", "女儿说很好吃,下次继续买", "配送及时,比在其他地方下单便宜很多。", "", "", "好吃", "", "", "超级划算,喜欢幸福的蛋糕。", "", "虽然送迟咯 态度还可以 味道也很赞", "很大一份,而且很便宜,味道也还可以,下次想试一下榴莲的", "非常好!!!", "很好包装送货味道很赞", "很好吃,就是你们能不能,,,处理一下我的退款", "", "", "", "蛋糕很新鲜送了盘子吃蛋糕也很合适", "其他都还好,就是中间榴莲夹心太少了,有点偷工减料。", "本来是冲着可爱去买的,担心不好吃,结果意外的好吃,不是那种太甜的,大家都赞味道棒", "超級好!一如既往的好吃!服務也很好!", "味道很好,芒果很新鲜,真心不错哦", "好好吃", "还不错,送货速度快,蛋糕味道好~", "反正没以前好吃了,感觉水果也不那么新鲜,感觉放了2,3天了", "", "", "味道相当不错~送得也快~", "好尴尬,送来的蛋糕忘了放刀叉盘子,蜡烛都算了嘛,可以想象怎么吃的了吧~”", "", "", "", "味道不错,服务一流。", "喜欢 好吃", "", "好吃的了", "蛋糕很好!这个价格能买到,真的很划算!", "味道很不错!就是希望价格再实惠点就好了!", "吃过一次,这是第二次,味道可以。可以吃四种口味。包装赠品都很好。", "很好吃。很喜欢 下次还来买", "", "蛋糕看着好喜欢,口感也很不错,快递员好暖心,强烈推荐", "", "#榴莲千层10块装# 味道很棒,榴莲的量比之前多了很多。希望店家多多保持٩(˃̶͈̀௰˂̶͈́)و", "味道很不错价格很便宜特别是配送上门。", "很好很好很好啊", "非常好吃啊", "", "", "还会再光顾 确实不错", "怎么说呢 #榴莲千层10块装# 量比上一次少太多了,希望下次改进", "", "配送员态度很好,榴莲感觉味道不太浓。", "", "很好吃喔……关键是便宜还快……", "#雪顶榴心下午茶# 真的很棒吶,榴莲味的蛋糕很好吃", "#芒果千层下午茶# 跟图一样,味道很好", "榴莲 #榴莲千层10块装# 千层炒鸡棒", "一如既往的好吃,个人还是最喜欢芒果千层和芒果拿破仑~有活动价格也很便宜,送达时间都是按要求送的,很满意~", "味道很好 就是奶油太多了很腻", "很快送达,东东也很新鲜,不错的~", "芒果千层和芒果拿破仑都很喜欢,小哥送达时间也很合适,几乎每周都要点一个~~好吃", "非常快 也好吃 值得回购 下次安排闺蜜聚餐可以在家搞了", "态度很好,蛋糕存放的条件说得很清楚", "以前吃过,这次超便宜,老板怎么赚钱", "这次很好有注意到强调少奶油了,吃起来就没那么腻。", "", "包装好,味道不错,配送也很好", "好好吃,榴莲味儿十足,最爱榴莲了,关键是没有奶油,我最不爱吃奶油了。快递员也不错", "", "很满意,很新鲜,水果很多", "配送时间很快,拿破仑很新鲜,里面有满满的芒果,大家都觉得很好吃的,就是吃多会有点腻,配红茶、绿茶正好。", "四重奏里面最喜欢的还是芒果漫漫,榴莲如果吃得惯的也不错哦。配送也很准时,按我备注的时间送的,挺好的", "幸福西饼的蛋糕一直不错,这次买的芒果拿破仑,口感极佳,外卖小哥也特别好,早了家里没人,给快递小哥说了之后,就晚上专门送过来,蛋糕和服务都很好", "", "里面外面都是芒果,真的是一分钱一分货,不喜欢,但是包装精美,送货很快", "", "挺好吃的,一直喜欢这家的千层蛋糕", "很好,非常好吃,也很方便", "太好吃了,两点个都吃光了", "味道不错,里面的芒果也新鲜,孩子们很喜欢吃,下次还来", "很好吃~超级好吃喔~", "", "幺儿6岁,蛋糕很好吃 每次都是幸福西饼", "", "很好吃的哟,芒果很新鲜,满满的芒果…", "非常准确,味道也很不错,很喜欢", "", "感觉越来越小了", "贼好吃,给你满分不怕你骄傲", "确实很好吃,配送也很快,好评!", "", "蛋糕很漂亮和图上一模一样,比街边蛋糕店实惠。", "", "", "第一次用外卖来点蛋糕", "", "", "特别好吃,味道很赞,快递也很准时,每次都提醒我吃不完放冰箱,经常做活动就好了", "实惠又好吃,本来想买榴莲,结果自己点成芒果了,也很好吃", "", "好吃呢,价格也公道,跟同事几个人分享的,大家都很开心,一起品尝美味!", "特别好吃哎!包装也超级漂亮~感觉也不腻,外皮特别香然后配上芒果~不会让人有奶油的油腻感,建议食用之前稍加冷藏~反正以后还会继续定这款来吃!店家态度也特别好,还有公众号客服~", "第二次购买了,好吃又划算", "", "特别好,味道好。送到及时!", "", "真的很不错,吃起来一点也不腻 ,为啥不能上传照片呢?", "", "", "竟然秒杀到了。我下单的时候 不信真的 接蛋糕的时候还不敢信哈哈哈 超好吃的感觉 真的真的 完美", "", "", "", "电梯坏了,送蛋糕的小哥爬了十多层楼送上来的,非常感谢。千层买很多次了,满意。", "", "", "", "好吃—现在蛋糕都在他家买,还快递到家方便", "", "味道很好吃,有4种口味各有各的特色,下次再买,送货很准时。", "", "", "", "", "一直信赖、物美价廉,好吃!", "份量很足,就是偏甜了", "", "", "", "", 
"第一次买真是不错,强烈推荐", "", "", "买了很多次了,味道没话说,家人都很喜欢。性价也比高。又很方便", "", "送朋友的,朋友很喜欢。\n我是在外地订的,只是想送给朋友吃,希望心情好点,真的不是过生日呀。餐具选了两份,看到写的给10份,联系了一下客服。希望减少餐具,但是这个并没有做到,其实帮我减少一下餐具很容易的一件事情,不知道为什么显得这么难。ok,是不好拆五个装一套,我也说去掉五个都行,就尽量减少,最后还是没能做到。连这最直接的诉求都没有满足,那么更不会帮我把那不合时宜的“生日快乐”四字标签取掉了。朋友说东西很好,我希望在服务上面可以更好吧。", "真的很好,买过两次。很好", "送达挺准时的,由于带娃不方便下楼取,配送员暖心的送到家,味道也挺赞,下次再", "", "不错,活动价,只是比较喜欢芒果千层,榴莲味还不错,有点大", "蛋糕很不错,四种口味。值得购买!", "", "", "", "", "", "", "一如既往的好 希里产品做得更加好 货真价实", "提前送达的,蛋糕很新鲜,味道好", "", "味道不错,送货也快,就是小了点", "很喜欢这家店的甜点,多次光顾了", "生日蛋糕很好吃,分量足,甜而不腻,还要再下一单榴莲千层蛋糕,希望味道一如既往的好吃", "蛋糕很好吃,和图片一样。配送小哥人一直叫我打开看看坏没坏。总之很nice", "", "永远那么好,快递非常热情,谢谢那句祝你幸福", "", "", "", "特别好 性价比超高 吃得很开心 下次再来", "", "完好无损 一如既往的味道", "很好吃!也感谢快递员!快递员很热情!", "", "超级好吃 很新鲜哦 怎么发图啊", "", "", "", "好好好好好好好好好好好好好", "", "", "", "很好吃 四个不同的口味 每一个口味都很好吃", "", "", "很好吃,速度很快。经济实惠", "蛋糕很好吃,水果很多,很丰富", "", "超级好吃,很新鲜,会继续购买", "挺好吃的满意", "很好很不错。", "", "快递小哥,服务态度简直好,蛋糕也特别好吃", "超级好吃", "已经是二次购买了,各种满意", "", "", "就是不好吃,浪费了一大半,十几个人都没有吃完,包装很好,什么都配齐全了", "配送超时了一个小时~", "", "", "不错就是太大,一个人吃撑了。", "蛋糕好吃,价格也漂亮,配送小哥更是敬业,雨下的很大,也及时送到,谢谢", "很准时,一早就送到了,忘了确定,不好意思。", "", "蛋糕很漂亮,服务也很好。很满意", "", "", "", "", "", "很好吃", "", "这个蛋糕是芒果口味里最好吃的", "朋友推荐的,味道真的好吃。", "快递小哥很客气...蛋糕也不错噢", "", "这次不送到家。 下楼到小区门口拿的。关键是碟子和叉子没拿给我。 这叫我怎么吃。 还是这个套餐不含碟叉。", "", "", "送成了榴莲千层+芒果千层", "送的很快呀!原本还想着晚点送过来当作晚饭呢", "蛋糕很好吃,第二次买了,大家都很喜欢哦。但是送达时间还需要改进", "", "", "", "", "很不错,挺漂亮的", "蛋糕", "蛋糕好好吃,水果非常新鲜。以后公司人员生日,全部找这个蛋糕店了!", "味道很不错,很好吃,但是这个分量怎么也不够6个人吃吧。还是很喜欢的。以后还会买的。", "很好啊,配送也很快,下午茶不错的选择", "完全有一种幸福送到家得感觉,配送人员真是很热情耐心,蛋糕也特别好吃,浓浓的榴莲味,太棒了,而且到手里的蛋糕真是很新鲜美味,而且餐具绝对是良心用品,比外面的质量好的多,感觉都可以再次利用了。第一次在幸福西饼买蛋糕,以后肯定都选择了!", "及时,超好,价格也超好……", "", "服务好,味道不错。优惠幅度大", "真的好好吃,配送员还提示吃不完放冰箱!好评还会再来,值得回购!", "", "", "雪顶榴心超好吃,千层不怎样", "可以,配送时间再少一点就好了", "榴莲香雪很好吃,巧克力慕斯有点腻", "我觉得7到10个人吃正好,一个人顶多吃两块就腻,不过味道还是很正的,榴恋香气浓郁。连锁的挺正规的", "好吃好吃", "送货真的超快的,味道太好吃了,满满的榴莲喝芒果,满满的幸福感!大爱,以后生日定蛋糕就这家了!!", "太好吃了,新鲜的芒果,一层又一层,吃在嘴里感觉爽极了!配送又快又好!值得信赖的商家!特棒!", "安排了隔我地址最近的店铺送过来!虽然时间提前了几个小时!但是也给我电话确认过的!送餐小哥态度也超级好!必须给好评!跟图片也吻合!还没吃,味道还不知道!", "超级满意,超级好吃,超快的", "非常好吃,超级好吃 快递晚了5分钟也补钱了", "送的很准时,蛋糕味道也不错哦,该有的东西也有,赞一个哈!", "", "", "", "", "包装完好,服务很好,虽然还没吃,但看着就觉得不错。", "", "以前就一直吃幸福西饼,以前在三峡广场那片,真心喜欢,哈哈", "同事推荐的,味道确实不错,值得!", "很好", "", "", "这么热的天气,送货的师傅真是太辛苦了!", "如果榴莲再多点就更好了!很棒,很好吃,很满意,快递也很好,谢谢!", "很不错,送得也及时,\n还好吃", "好吃", "很好吃,但是这次送来就散开了", "", "很久没吃榴莲千层了,味道很赞!", "大学城很快就收到了~服务好~量足还好吃", "", "好吃!\n", "", "特别好吃,感觉比蛋糕还好吃就是没给我盘子叉子。。。", "", "", "很不错,好吃,真的好吃。", "", "速度快,东西不错。", "好吃不贵。", "味道非常好,价格也很美丽!!送货也很及时?如果能一直不涨价,就会常光顾哦,喜欢所有榴莲味的东西", "", "好吃,就是这8寸挺水的,估计就15厘米的正方形,39.5还是可以的,79就小贵,比较不大,确实挺好吃", "", "", "", "", "", "太好吃了,很新鲜,肉很多,反正很好", "榴莲味挺重,食材也新鲜,快递准时送达", "大家都不相信那么便宜,而且大老远从沙坪坝送到南岸区,真心不错的商家", "超好的,全是新鲜的芒果,蛋糕的口感也棒棒哒,全家都赞", "值得购买,以后朋友过生日还订他们家", "味道不错", "", "送货人员服务态度很好,味道不错,娃儿特别喜欢!", "", "很不错", "很棒、速度很快,包装也很好,希望榴莲可以多点,价格翻倍也愿意~", "", "", "蛋糕好吃,只是天气热,蛋糕有点走形了,不过没关系。送外卖的师傅特意确认了送货时间,不错。", "", "", "", "水果大点吧,太腻人了", "分量很足很好吃,吃一块就饱啦。会继续光顾", "", "", "", "好吃,小了点,就是不喜欢上面一层可可粉,都要推掉", "", "大热天的,真的很感谢那么及时的送到,拿到手的时候都还是冰冰的,很开心。包装也很精美,美中不足的是榴莲真的太少了,和我吃过的其它榴莲千层相比,可能是价格原因。阿门,原谅我的重口味吧,只是太爱榴莲。祝生意越来越好", "分量小,这款除了一层巧克力几乎都是蛋糕,夹心内容少了点,还是提拉米苏好吃些", "", "", "", "我还没吃TAT我妈妈说很好吃!", "", "很好吃,一点都不腻,朋友很喜欢", "好吃的不得了,下次又买", "", "这款可不可以有黑森林的口味啊,还是觉得一次吃一坨小了点", "好吃,也不贵,还送货上门", "第一次买不好吃 味道怪怪的", "蛋糕很好吃,比预计送达时间要找很多!", "好吃,又便宜又好吃,巴适惨了", "很好吃,味道特别正,送货员也很好,态度好", "第二次买了,价格实惠,味道也好。以前不吃榴莲的,在这里吃了两次就喜欢上了。以后买蛋糕都在你这里买了。", "", "很实惠,榴莲比芒果好吃。包装很漂亮,强烈推荐。", "", "好多水果,舒服", "", "大写的赞,非常好吃、很不错,水果再多点大点就更好了", "不甜不腻,口感很好,挺满意的", "比我想象的快得多,物美价廉", "没想到这个大个,味道很好。送得也准时。", "配送师傅很赞,提前送达,并贴心提醒保存事项,蛋糕美味,赞", "很好吃。好喜欢。以后还会回购的", 
"到手都还是冷藏的,第一天吃起来中间有点沙冰的口感特别好。榴莲很新鲜量足,就是蛋皮不够柔软,太韧了。希望能够改进〜\n配送的餐具都好有逼格,点赞!", "不错 吃多了有点腻 下次换其他口味", "不甜腻。很好。价格实惠。", "服务态度好,味道也特别正宗,这也是我吃过最好吃的", "好吃得不得了,今天看就涨价了,过几天再定一个.", "都不错 提前送到 给妈妈买来母亲节吃的 还是比较喜欢", "", "", "", "", "口感很不错,很实惠的", "味道还是不错的,下次试榴莲的,快递也很给力", "配送小哥非常棒吖,东西也不错", "好吃,实惠,以后蛋糕就在你家定了", "不是很甜很甜的那种,但真的可以吃出幸福味道(˶ ̄᷄ ⁻̫  ̄᷅˵)好好吃", "服务不错,味道也可以!", "味道不错,如果有保温措施就更好了。单位没冰箱,吃不完的拿回家都化掉了。奶油确实有点儿少噢,不过榴莲味很浓。", "", "还行!奶油少了点!挺大的!因为就2个人吃所以", "真心的奶油太少了!不过还是好吃", "不错,很值得,送货及时,服务很好,很好吃", "很好吃,好吃哭了", "味道不错,就是奶油少,还是挺值的"], "arrive_time": ["2017-07-17 22:38", "2018-04-22 21:58", "2018-04-28 14:44", "2018-04-06 16:06", "2018-04-17 23:36", "2018-04-06 15:00", "2018-04-05 19:28", "2018-01-27 17:44", "2018-03-28 18:00", "2018-03-11 20:46", "2018-03-25 17:56", "2018-03-25 13:10", "2018-03-10 20:52", "2018-03-10 23:14", "2018-03-06 23:32", "2018-03-02 21:30", "2018-02-12 21:56", "2018-01-30 19:56", "2018-01-29 23:00", "2018-01-30 22:20", "2018-01-09 20:00", "2018-01-10 21:15", "2018-01-15 13:12", "2018-01-12 16:02", "2018-01-23 16:45", "2018-01-17 19:48", "2018-01-14 17:14", "2018-01-15 15:46", "2018-01-13 00:00", "2018-01-14 15:49", "2018-01-14 19:50", "2018-01-11 16:16", "2018-01-10 17:00", "2018-01-09 23:08", "2018-01-08 16:00", "2018-01-07 18:54", "2018-01-05 22:00", "2018-01-03 16:30", "2017-12-30 11:40", "2017-12-27 18:30", "2017-11-28 21:26", "2017-12-22 12:04", "2017-12-01 21:22", "2017-12-04 18:32", "2017-09-07 20:58", "2017-12-01 16:56", "2017-12-19 15:30", "2017-12-20 16:12", "2017-12-19 21:10", "2017-12-09 19:41", "2017-12-11 20:00", "2017-11-18 20:42", "2017-12-06 17:34", "2017-12-06 16:00", "2017-11-25 11:45", "2017-11-19 17:56", "2017-11-18 20:40", "2017-11-18 16:02", "2017-11-16 20:00", "2017-11-12 21:44", "2017-11-14 21:30", "2017-11-14 00:02", "2017-11-07 23:16", "2017-11-01 20:00", "2017-10-26 16:30", "2017-10-21 20:00", "2017-10-14 12:09", "2017-10-17 16:00", "2017-10-03 00:26", "2017-10-02 17:35", "2017-09-04 21:34", "2017-09-28 23:40", "2017-09-18 18:53", "2017-09-17 19:46", "2017-09-09 18:12", "2017-09-09 00:52", "2017-08-07 18:16", "2017-09-03 10:40", "2017-09-02 14:25", "2017-08-14 18:40", "2017-08-28 23:02", "2017-08-28 19:30", "2017-08-21 17:06", "2017-07-30 20:00", "2017-02-25 11:00", "2017-08-08 18:10", "2017-08-16 21:50", "2017-06-29 20:12", "2017-08-15 21:16", "2017-08-11 20:40", "2017-08-11 16:18", "2017-08-13 17:07", "2017-08-12 15:52", "2017-08-13 00:30", "2017-08-11 20:53", "2017-08-05 16:09", "2017-07-30 23:50", "2017-08-09 19:46", "2017-08-05 10:15", "2017-08-08 17:42", "2017-08-04 21:44", "2017-08-08 00:20", "2017-08-07 20:18", "2017-08-07 16:50", "2017-08-05 20:18", "2017-07-31 23:26", "2017-07-29 14:16", "2017-08-01 21:38", "2017-07-26 00:12", "2017-07-31 00:38", "2017-07-30 16:00", "2017-07-22 21:46", "2017-07-31 20:00", "2017-07-31 11:02", "2017-07-29 16:56", "2017-07-24 19:40", "2017-07-19 21:28", "2017-07-11 21:30", "2017-07-15 15:30", "2017-07-18 19:28", "2017-07-18 21:24", "2017-07-12 23:22", "2017-07-16 22:06", "2017-07-16 21:56", "2017-07-16 21:54", "2017-07-15 13:36", "2017-07-12 22:00", "2017-06-23 13:00", "2017-06-30 10:15", "2017-06-25 20:36", "2017-06-18 17:38", "2017-06-22 17:58", "2017-06-14 20:02", "2017-06-16 18:14", "2017-06-17 14:40", "2017-06-15 16:00", "2017-06-01 19:58", "2017-06-03 14:30", "2017-06-05 22:34", "2017-05-20 19:04", "2017-06-08 20:46", "2017-06-05 16:00", "2017-05-26 23:32", "2017-05-30 19:00", "2017-05-28 21:00", "2017-05-28 13:15", "2017-05-26 18:00", "2017-05-26 16:51", "2017-05-25 20:54", "2017-05-24 20:02", "2017-05-21 19:35", 
"2017-05-22 00:04", "2017-05-20 17:26", "2017-05-17 00:28", "2017-05-05 19:42", "2017-05-05 13:20", "2017-04-27 17:56", "2017-04-28 21:00", "2017-04-27 19:02", "2017-04-27 21:00", "2017-04-25 22:38", "2017-04-10 14:16", "2017-03-26 14:16", "2017-04-09 17:15", "2017-04-14 20:30", "2017-04-07 18:14", "2017-03-21 12:24", "2017-04-06 21:00", "2017-04-02 20:06", "2017-04-02 21:52", "2017-03-12 20:16", "2017-03-01 17:10", "2017-03-25 18:00", "2017-03-22 00:22", "2017-02-26 12:10", "2017-03-24 17:27", "2017-03-24 00:02", "2017-03-23 16:42", "2017-03-23 16:46", "2017-03-21 16:25", "2017-03-12 16:30", "2017-03-11 12:08", "2017-03-03 21:16", "2017-02-18 19:26", "2017-02-25 18:46", "2017-02-26 23:32", "2017-02-25 19:08", "2017-02-24 22:56", "2017-02-22 23:28", "2017-02-13 21:26", "2017-02-18 00:04", "2017-02-15 22:46", "2017-02-14 21:30", "2017-02-08 17:00", "2017-01-11 21:00", "2017-01-31 20:06", "2017-01-23 14:16", "2017-01-22 17:20", "2017-01-19 19:00", "2017-01-05 00:38", "2016-12-14 17:52", "2017-01-11 19:14", "2017-01-11 16:39", "2017-01-10 17:05", "2017-01-03 22:14", "2016-12-21 17:42", "2016-12-24 22:00", "2016-12-23 17:16", "2016-11-24 00:12", "2016-12-23 10:40", "2016-12-20 20:51", "2016-12-13 12:15", "2016-12-12 23:02", "2016-12-12 19:28", "2016-12-12 20:31", "2016-12-12 14:11", "2016-12-10 18:32", "2016-12-09 17:04", "2016-12-07 21:00", "2016-12-07 16:59", "2016-11-24 16:08", "2016-12-06 14:03", "2016-12-05 20:34", "2016-11-25 19:54", "2016-11-28 17:18", "2016-11-28 17:12", "2016-11-19 17:54", "2016-11-24 16:00", "2016-11-23 19:35", "2016-11-18 00:42", "2016-11-13 17:22", "2016-11-16 20:18", "2016-11-07 21:30", "2016-11-19 23:10", "2016-11-19 19:58", "2016-11-19 14:24", "2016-11-16 17:34", "2016-11-16 13:30", "2016-11-14 15:21", "2016-11-09 16:38", "2016-11-08 00:14", "2016-11-11 19:23", "2016-11-09 18:37", "2016-11-07 18:00", "2016-11-07 19:30", "2016-11-07 18:12", "2016-11-04 16:55", "2016-11-04 18:17", "2016-10-23 17:04", "2016-09-29 22:48", "2016-10-21 16:20", "2016-10-20 15:36", "2016-10-19 19:37", "2016-10-18 11:00", "2016-10-05 12:01", "2016-10-17 18:59", "2016-10-10 11:40", "2016-10-09 17:55", "2016-10-09 17:50", "2016-10-08 20:24", "2016-10-09 16:38", "2016-10-08 14:06", "2016-10-08 12:57", "2016-10-05 21:52", "2016-10-06 12:15", "2016-10-06 20:01", "2016-10-06 18:07", "2016-10-01 00:02", "2016-09-27 14:26", "2016-09-21 18:30", "2016-09-15 16:48", "2016-09-26 14:10", "2016-09-23 15:37", "2016-09-25 14:55", "2016-09-22 17:36", "2016-09-21 21:32", "2016-09-22 18:30", "2016-09-20 18:06", "2016-09-15 23:04", "2016-09-20 20:10", "2016-09-18 14:14", "2016-09-11 00:12", "2016-09-17 15:00", "2016-09-17 12:15", "2016-09-16 14:05", "2016-09-11 20:00", "2016-09-09 16:43", "2016-09-07 21:48", "2016-08-29 17:12", "2016-08-30 15:06", "2016-08-20 16:24", "2016-08-22 17:11", "2016-08-21 20:10", "2016-05-31 19:30", "2016-08-19 18:30", "2016-08-18 17:53", "2016-08-13 23:20", "2016-08-17 10:30", "2016-08-16 19:00", "2016-08-13 23:14", "2016-08-15 21:48", "2016-08-15 20:43", "2016-08-15 18:15", "2016-08-13 21:17", "2016-08-13 18:31", "2016-08-13 11:30", "2016-08-11 13:57", "2016-08-06 11:42", "2016-08-03 13:10", "2016-08-04 14:00", "2016-08-02 00:46", "2016-08-02 18:24", "2016-07-30 15:50", "2016-07-25 15:00", "2016-07-09 18:56", "2016-06-20 00:36", "2016-07-21 06:40", "2016-07-19 20:22", "2016-07-15 17:16", "2016-07-16 01:30", "2016-07-16 13:34", "2016-06-19 20:40", "2016-07-08 13:15", "2016-04-20 19:33", "2016-07-03 03:08", "2016-05-14 00:30", "2016-07-02 01:08", "2016-05-28 21:00", "2016-06-30 
19:26", "2016-06-27 01:28", "2016-06-24 19:50", "2016-06-07 22:00", "2016-05-29 20:46", "2016-04-04 02:04", "2016-06-19 19:16", "2016-06-21 14:08", "2016-05-28 20:44", "2016-05-19 11:40", "2016-05-13 19:24", "2016-06-17 20:00", "2016-06-17 21:34", "2016-06-16 21:18", "2016-06-16 20:26", "2016-06-06 23:42", "2016-04-21 00:26", "2016-05-20 23:06", "2016-06-11 18:51", "2016-05-22 01:02", "2016-06-06 04:54", "2016-06-01 00:00", "2016-06-05 11:34", "2016-05-31 19:16", "2016-06-04 14:42", "2016-05-29 20:02", "2016-06-02 19:30", "2016-05-08 01:32", "2016-06-01 13:49", "2016-05-29 20:22", "2016-05-02 02:32", "2016-05-21 00:18", "2016-05-08 03:34", "2016-05-28 15:46", "2016-05-27 19:20", "2016-05-26 20:49", "2016-05-26 15:47", "2016-05-23 18:54", "2016-05-22 15:33", "2016-05-21 19:03", "2016-05-20 14:04", "2016-05-15 17:27", "2016-05-18 02:28", "2016-05-15 17:10", "2016-05-14 17:39", "2016-05-06 16:52", "2016-05-08 15:46", "2016-05-05 20:38", "2016-05-06 22:14", "2016-05-06 20:54", "2016-05-05 16:50", "2016-05-03 19:50", "2016-04-30 15:00", "2016-04-30 15:01", "2016-04-27 00:44", "2016-04-28 03:24", "2016-04-22 22:32", "2016-04-17 13:48", "2016-04-06 03:42", "2016-04-09 22:46", "2016-04-09 00:48", "2016-04-04 15:06", "2016-03-31 20:37", "2016-03-26 17:28"], "dish_score": [1, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 4, 5, 5, 5, 3, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 2, 5, 5, 3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 5, 5, 5, 5, 4, 2, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 5, 4, 5, 5, 5, 5, 5, 5, 3, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 1, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 4, 5, 4, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 4, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 4, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 4, 5, 5, 3, 5, 3, 5, 5, 5, 5, 5, 2, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 3, 4, 5, 5, 5, 4]} -------------------------------------------------------------------------------- /UGC_analysis/emo_pic.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | import numpy as np 3 | # 显示中文 4 | plt.rcParams['font.sans-serif']=['SimHei'] #用来正常显示中文标签 5 | plt.rcParams['axes.unicode_minus']=False #用来正常显示负号 6 | 7 | sentiments_list = [0.994,0.01] 8 | plt.hist(sentiments_list, bins=20) 9 | plt.xlabel("情感值") 10 | plt.ylabel("评论数目") 11 | plt.title('整体情感极性分布图') 12 | plt.show() 13 | plt.close() -------------------------------------------------------------------------------- /UGC_analysis/resource.txt: -------------------------------------------------------------------------------- 1 | { 2 | "average_score": { 3 | "average_dish_score": 5.0, 4 | "average_service_score": 4.6, 5 | "average_score": 4.9 6 | }, 7 | "score_detail": { 8 | "5": 6, 9 | "4": 1, 10 | "1": 0, 11 | "2": 0, 12 | "3": 0 13 | }, 14 | "weeks_score": { 15 | "last_three_week": 5.0, 16 | "last_two_week": 0.0, 17 | "last_one_week": 0.0 18 | }, 19 | 
"recommend_dishes": { 20 | "鸡汁鲜肉包": 4, 21 | "茴香豆腐包": 1, 22 | "雪菜黄豆包": 2, 23 | "青椒豆干包": 4, 24 | "精品卤鸡蛋": 5, 25 | "小米粥": 5, 26 | "香菇菜心包": 4, 27 | "八宝粥": 2, 28 | "韭菜鸡蛋包": 3, 29 | "玫瑰豆沙包": 3 30 | }, 31 | "comment_num": 5, 32 | "content": [ 33 | "", 34 | "", 35 | "", 36 | "", 37 | "#鸡汁鲜肉包# #香菇菜心包# #玫瑰豆沙包# 都挺好吃的,美中不足就是粥太甜了", 38 | "", 39 | "" 40 | ], 41 | "cost_time": [ 42 | 27, 43 | 23, 44 | 29, 45 | 20, 46 | 23, 47 | 36, 48 | 33 49 | ], 50 | "service_score": [ 51 | 5, 52 | 0, 53 | 5, 54 | 5, 55 | 3, 56 | 0, 57 | 5 58 | ], 59 | "dish_score": [ 60 | 5, 61 | 0, 62 | 5, 63 | 5, 64 | 5, 65 | 0, 66 | 5 67 | ], 68 | "sfrom": [ 69 | "android", 70 | "pc", 71 | "android", 72 | "bn-android", 73 | "iphone", 74 | "pc", 75 | "iphone" 76 | ], 77 | "score": [ 78 | 5, 79 | 5, 80 | 5, 81 | 5, 82 | 4, 83 | 5, 84 | 5 85 | ], 86 | "create_time": [ 87 | "2018-12-02 06:02:50", 88 | "2018-11-27 08:19:40", 89 | "2018-11-14 09:18:10", 90 | "2018-10-16 12:40:13", 91 | "2018-10-04 10:12:53", 92 | "2018-09-24 16:01:19", 93 | "2018-09-17 13:06:34" 94 | ], 95 | "arrive_time": [ 96 | "2018-11-27 07:47", 97 | "2018-11-27 08:13", 98 | "2018-11-14 09:17", 99 | "2018-10-16 09:06", 100 | "2018-10-04 09:54", 101 | "2018-09-11 08:48", 102 | "2018-09-17 09:20" 103 | ], 104 | "useful_comment_id": [ 105 | [ 106 | 4, 107 | 1 108 | ] 109 | ], 110 | "rubbish_comment_id": [ 111 | [ 112 | 0, 113 | 1 114 | ], 115 | [ 116 | 1, 117 | 1 118 | ], 119 | [ 120 | 2, 121 | 1 122 | ], 123 | [ 124 | 3, 125 | 1 126 | ], 127 | [ 128 | 5, 129 | 1 130 | ], 131 | [ 132 | 6, 133 | 1 134 | ] 135 | ] 136 | } -------------------------------------------------------------------------------- /UGC_analysis/test_spider.py: -------------------------------------------------------------------------------- 1 | from txt_analysis import spider 2 | import json 3 | def test1(): 4 | id = "1438461721" 5 | result = spider.crawl(id) 6 | # 1k + 7 | # https://star.ele.me/shopui/?qt=shopcomment&shop_id=1553452950 8 | ##http://waimai.baidu.com/waimai/comment/getshop?display=json&shop_id=1570745324&page=0&count=99 9 | # https://star.ele.me/shopui/?qt=shopcomment&shop_id=1438461721 10 | with open("crawler.txt", "w", encoding="utf-8") as f: 11 | # f.write(str(result)) 12 | # json.dump(result, f, ensure_ascii=False, indent=4, sort_keys=True, separators=(",", ":")) 13 | json.dump(result, f, ensure_ascii=False) 14 | 15 | if __name__ == "__main__": 16 | test1() 17 | -------------------------------------------------------------------------------- /UGC_analysis/txt_analysis/__pycache__/picturing.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/UGC_analysis/txt_analysis/__pycache__/picturing.cpython-36.pyc -------------------------------------------------------------------------------- /UGC_analysis/txt_analysis/__pycache__/spider.cpython-36.pyc: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/UGC_analysis/txt_analysis/__pycache__/spider.cpython-36.pyc -------------------------------------------------------------------------------- /UGC_analysis/txt_analysis/picturing.py: -------------------------------------------------------------------------------- 1 | import matplotlib.pyplot as plt 2 | from pylab import mpl 3 | from txt_analysis import spider 4 | from random import choice 5 | ###加载score_detail 6 | def 
score_detail(result): 7 | if result is not None: 8 | # 导入json数据 9 | label_scores = sorted(result["score_detail"].items()) 10 | # labels 11 | labels = "1分", "2分", "3分", "4分", "5分" 12 | # sizes 13 | sizes = [pair[1] for pair in label_scores] 14 | # explode 15 | the_max = max(sizes) 16 | the_index = sizes.index(the_max) 17 | explode = [0, 0, 0, 0, 0] 18 | explode[the_index] = 0.1 19 | explode = tuple(explode) 20 | plt.pie(sizes, explode=explode, labels=labels, 21 | autopct="%1.2f%%", shadow=True, startangle=0) 22 | plt.title("整体评分分析", loc="left", fontsize=20) 23 | plt.axis("equal") 24 | plt.show() 25 | plt.close() 26 | #dish_score统计 27 | def dish_score_detail(result): 28 | if result is not None: 29 | # 导入 30 | label_scores = result["dish_score"] 31 | # labels 32 | labels = "1分", "2分", "3分", "4分", "5分" 33 | # sizes 34 | sizes = [label_scores.count(i) for i in range(5)] 35 | # explode 36 | the_max = max(sizes) 37 | the_index = sizes.index(the_max) 38 | explode = [0, 0, 0, 0, 0] 39 | explode[the_index] = 0.1 40 | explode = tuple(explode) 41 | plt.pie(sizes, explode=explode, labels=labels, 42 | autopct="%1.2f%%", shadow=True, startangle=0) 43 | plt.title("商品质量评分分析", loc="left", fontsize=20) 44 | plt.axis("equal") 45 | plt.show() 46 | plt.close() 47 | def bar_auto_label(rects, suffix="分"): 48 | colors = ["g", "r", "c", "m", "y", "b", "chartreuse", "lightgreen", "skyblue", 49 | "dodgerblue", "slateblue", "blueviolet", "purple", "mediumorchid", 50 | "fuchsia", "hotpink", "lightcoral", "coral", "darkorange", "olive", 51 | "lawngreen", "yellowgreen", "springgreen", "cyan", "indigo", "darkmagenta", 52 | "orchid", "lightpink", "darkred", "orangered", "goldenrod", "lime", "aqua", 53 | "steelblue", "plum", "m", "tomato", "greenyellow", "darkgreen", "darkcyan", 54 | "violet", "crimson"] 55 | for i, rect in enumerate(rects): 56 | height = rect.get_height() 57 | plt.text(rect.get_x() + rect.get_width() / 2, 1.01 * height, "%s%s" % (float(height), suffix)) 58 | color = choice(colors) 59 | colors.remove(color) 60 | rect.set_color(color) 61 | #各项评价平均指标统计 62 | def average_score(result): 63 | import numpy as np 64 | if result is not None: 65 | # 导入json数据 66 | average_scores = result["average_score"] 67 | title = "评价指标平均分数" 68 | y_label = "分数" 69 | labels = ("质量", "服务", "整体") 70 | label_pos = (0, 1, 2) 71 | heights = (average_scores["average_dish_score"], 72 | average_scores["average_service_score"], 73 | average_scores["average_score"]) 74 | plt.title(title, fontsize=20) 75 | plt.ylabel(y_label) 76 | plt.ylim(0, 5) 77 | plt.xticks(label_pos, labels) 78 | rect = plt.bar(label_pos, heights, width=0.35, align="center") 79 | bar_auto_label(rect) 80 | plt.show() 81 | plt.close() 82 | #订餐平台分析统计 83 | def s_from(result): 84 | if result is not None: 85 | sfrom = result["sfrom"] 86 | title = "使用终端分析" 87 | # total sources 88 | sources = tuple(set(sfrom)) 89 | # size 90 | sizes = [sfrom.count(source) for source in sources] 91 | plt.pie(sizes, labels=sources, autopct="%1.2f%%", shadow=True, startangle=0) 92 | plt.title(title, loc="left", fontsize=20) 93 | plt.axis("equal") 94 | plt.show() 95 | plt.close() 96 | def barh_auto_label(rects, suffix="次"): 97 | colors = ["g", "r", "c", "m", "y", "b", "chartreuse", "lightgreen", "skyblue", 98 | "dodgerblue", "slateblue", "blueviolet", "purple", "mediumorchid", 99 | "fuchsia", "hotpink", "lightcoral", "coral", "darkorange", "olive", 100 | "lawngreen", "yellowgreen", "springgreen", "cyan", "indigo", "darkmagenta", 101 | "orchid", "lightpink", "darkred", "orangered", "goldenrod", 
"lime", "aqua", 102 | "steelblue", "plum", "m", "tomato", "greenyellow", "darkgreen", "darkcyan", 103 | "violet", "crimson"] 104 | for i, rect in enumerate(rects): 105 | width = rect.get_width() 106 | plt.text(1.01 * width, rect.get_y(), "%s%s" % (int(width), suffix)) 107 | color = choice(colors) 108 | colors.remove(color) 109 | rect.set_color(color) 110 | ####获取热卖单品 111 | def recommend_dishes2(result): 112 | if result is not None: 113 | # 导入json数据 114 | ##########直接导入推荐单品数据 115 | recommend_dishes = sorted(result["recommend_dishes"].items(), key=lambda dish: dish[1])[-30:] 116 | title = "推荐单品展示" 117 | x_label = "次" 118 | labels = [dish[0] for dish in recommend_dishes] 119 | label_pos = tuple(range(len(labels))) 120 | heights = tuple([dish[1] for dish in recommend_dishes]) 121 | plt.title(title, fontsize=20) 122 | plt.xlabel(x_label) 123 | plt.yticks(label_pos, labels) 124 | rects = plt.barh(label_pos, width=heights, alpha=0.35, align="center") 125 | barh_auto_label(rects) 126 | plt.show() 127 | plt.close() 128 | def cost_time(result): 129 | if result is not None: 130 | # 导入json数据 131 | cost_times = result["cost_time"] 132 | title = "送餐时间可视化分布" 133 | sources = ("<30min)", "30-45min)", "45-60min)", "60-75min", "大于75min") 134 | sizes = [0] * len(sources) 135 | for a_time in cost_times: 136 | ############使用限定时间 137 | if a_time <= 30: 138 | sizes[0] += 1 139 | elif a_time <= 45: 140 | sizes[1] += 1 141 | elif a_time <= 60: 142 | sizes[2] += 1 143 | elif a_time <= 75: 144 | sizes[3] += 1 145 | else: 146 | sizes[4] += 1 147 | the_max = max(sizes) 148 | the_index = sizes.index(the_max) 149 | explode = [0, 0, 0, 0, 0] 150 | explode[the_index] = 0.1 151 | explode = tuple(explode) 152 | # 使用饼图 153 | plt.pie(sizes, labels=sources, explode=explode, autopct="%1.2f%%", shadow=True, startangle=0) 154 | plt.title(title, loc="left", fontsize=20) 155 | ####使用正圆 156 | plt.axis("equal") 157 | plt.show() 158 | plt.close() 159 | def _test(): 160 | shop_id = "1452459851" 161 | # 采集结果 162 | result = spider.crawl(shop_id) 163 | # 1采集score_detail 164 | score_detail(result) 165 | # 2采集average_score 166 | average_score(result) 167 | # 3采集sfrom 168 | s_from(result) 169 | # 4采集recommend_dishes 170 | recommend_dishes2(result) 171 | # 5采集cost_time 172 | cost_time(result) 173 | if __name__ == "__main__": 174 | pass 175 | _test() -------------------------------------------------------------------------------- /UGC_analysis/txt_analysis/spider.py: -------------------------------------------------------------------------------- 1 | import re 2 | import json 3 | import requests 4 | # from fake_useragent import UserAgent 5 | #首先先在输入的地址中提取店铺ID,然后将店铺ID和找到的接口进行匹配得到一个地址链接 6 | ##然后将得到的地址链接导入到网络爬虫中去得到json1文件 7 | #最后解析jason文件得到内容 8 | class Crawler: 9 | def __init__(self): 10 | #1625734074 11 | self.base_url = "http://waimai.baidu.com/waimai/comment/getshop?display=json&shop_id=%s&page=%s&count=99" 12 | self.shop_id = None 13 | self.page_num = 1 14 | self.info = {} 15 | def crawl(self, url=None, shop_id=None): 16 | self._get_shop_id(url, shop_id) 17 | i = 0 18 | while i < self.page_num: 19 | self._get_json_request(self.base_url % (self.shop_id, i + 1)) 20 | i += 1 21 | self.page_num = 1 22 | self._filter() 23 | return self.info 24 | def _filter(self): 25 | for i, sentence in enumerate(self.info["content"]): 26 | rubbish_comment = False 27 | if self._is_english(sentence): 28 | rubbish_comment = True 29 | elif self._is_numeric(sentence): 30 | rubbish_comment = True 31 | elif self._is_too_short(sentence): 32 | rubbish_comment = True 33 | 
elif self._is_word_repeat(sentence): 34 | rubbish_comment = True 35 | if rubbish_comment: 36 | if self.info["score"][i] >= 4: 37 | self.info["rubbish_comment_id"].append((i, 1)) 38 | else: 39 | self.info["rubbish_comment_id"].append((i, 0)) 40 | else: 41 | if self.info["score"][i] >= 4: 42 | self.info["useful_comment_id"].append((i, 1)) 43 | else: 44 | self.info["useful_comment_id"].append((i, 0)) 45 | @staticmethod 46 | def _is_too_short(sentence): 47 | #########过滤文本字数 48 | if len(sentence) < 2: 49 | return True 50 | if len(re.findall(r'[\u4e00-\u9fa5]', sentence)) <= len(sentence) * 0.4: 51 | return True 52 | return False 53 | @staticmethod 54 | def _is_numeric(sentence): 55 | match = re.findall("\d+", sentence) 56 | if match is not None and sum([len(m) for m in match]) >= len(sentence) * 0.75: 57 | return True 58 | return False 59 | @staticmethod 60 | def _is_english(sentence): 61 | match = re.findall("[a-zA-Z]+", sentence) 62 | if match is not None and sum([len(m) for m in match]) >= len(sentence) * 0.75: 63 | return True 64 | return False 65 | @staticmethod 66 | def _is_word_repeat(sentence): 67 | repeat_words, length = [], 0 68 | for word in sentence: 69 | times = sentence.count(word) 70 | if times >= 4 and word not in repeat_words: 71 | repeat_words.append(word) 72 | length += times 73 | if length > len(sentence) / 2: 74 | return True 75 | return False 76 | def _get_shop_id(self, url, id): 77 | if url is not None: 78 | shop_id = re.search("\d+", url) 79 | if shop_id is None: 80 | raise ValueError("Bad url") 81 | self.shop_id = shop_id.group() 82 | elif id is not None: 83 | self.shop_id = id 84 | else: 85 | raise ValueError("Bad url") 86 | def _get_json_request(self, url): 87 | try: 88 | headers = { 89 | 'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8', 90 | 'accept-encoding': 'gzip, deflate, br', 91 | 'accept-language': 'zh-CN,zh;q=0.9', 92 | 'cache-control': 'max-age=0', 93 | 'cookie': 'WMID=279740f5f48dad8d6c5a2981bfe66b48; WMST=1544622441', 94 | 'dnt': '1', 95 | 'upgrade-insecure-requests': '1', 96 | 'user-agent': 'Mozilla/5.0 (Windows NT 6.3; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.110 Safari/537.36', 97 | } 98 | result = requests.get(url, headers=headers) 99 | except requests.ConnectionError: 100 | raise ValueError("requests.ConnectionError") 101 | content = json.loads(result.text) 102 | result = content["result"] 103 | if self.page_num == 1: 104 | self._get_initial_info(result) 105 | content = result["content"] 106 | for a_json in content: 107 | self._get_a_json_info(a_json) 108 | def _get_initial_info(self, result): 109 | # 提取输入翻页地址 110 | self.page_num = result["comment_num"] // 99 + 1 111 | average_score = {} 112 | average_score["average_dish_score"] = float(result["average_dish_score"]) 113 | average_score["average_service_score"] = float(result["average_service_score"]) 114 | average_score["average_score"] = float(result["average_score"]) 115 | self.info["average_score"] = average_score 116 | # get the score detail 117 | self.info["score_detail"] = result["score_detail"] 118 | # get the weeks score 119 | weeks_score = {} 120 | for key, value in result["weeks_score"].items(): 121 | weeks_score[key] = float(value) 122 | self.info["weeks_score"] = weeks_score 123 | # get the recommend dished 124 | self.info["recommend_dishes"] = result["recommend_dishes"] 125 | # get the comment num 126 | self.info["comment_num"] = result['comment_num'] 127 | # initialize the self.info variable 128 | 
self.info["content"] = [] 129 | self.info["cost_time"] = [] 130 | self.info["service_score"] = [] 131 | self.info["dish_score"] = [] 132 | self.info["sfrom"] = [] 133 | self.info["score"] = [] 134 | self.info["create_time"] = [] 135 | self.info["arrive_time"] = [] 136 | self.info["useful_comment_id"] = [] 137 | self.info["rubbish_comment_id"] = [] 138 | def _get_a_json_info(self, a_json): 139 | self.info["content"].append(a_json["content"]) 140 | self.info["cost_time"].append(a_json["cost_time"]) 141 | self.info["service_score"].append(int(a_json["service_score"])) 142 | self.info["dish_score"].append(int(a_json["dish_score"])) 143 | self.info["score"].append(int(a_json["score"])) 144 | self.info["sfrom"].append(a_json["sfrom"][3:] if "na-" in a_json["sfrom"] else a_json["sfrom"]) 145 | self.info["create_time"].append(a_json["create_time"]) 146 | self.info["arrive_time"].append(a_json["arrive_time"]) 147 | _crawler = Crawler() 148 | crawl = _crawler.crawl 149 | def __test1(): 150 | shop_id = "1430806214" 151 | # shop_id = "1452459851" 152 | crawler = Crawler() 153 | a = crawler.crawl(shop_id) 154 | pass 155 | if __name__ == "__main__": 156 | pass 157 | __test1() 158 | -------------------------------------------------------------------------------- /pic/价格.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/价格.png -------------------------------------------------------------------------------- /pic/分量.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/分量.png -------------------------------------------------------------------------------- /pic/味道.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/味道.png -------------------------------------------------------------------------------- /pic/情感分析.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/情感分析.png -------------------------------------------------------------------------------- /pic/程序结构.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/程序结构.png -------------------------------------------------------------------------------- /pic/统计.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/统计.png -------------------------------------------------------------------------------- /pic/配送.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/配送.png -------------------------------------------------------------------------------- /pic/采集到的数据样式.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/CarryChang/UGC-Analysis/65f019a79625cc00839ca56451c0d3cc0fa7cbb4/pic/采集到的数据样式.png 
--------------------------------------------------------------------------------
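
Putting the pieces together: the following is a minimal, illustrative sketch (not a file in this repository) of how the modules listed above can be chained as a plain script. It crawls a shop's comments with spider.crawl, scores each non-empty comment with SnowNLP, plots the sentiment histogram the same way emo_pic.py does for its hard-coded sample, and reuses the structured-data charts from picturing.py. It assumes it is run from the UGC_analysis directory so that txt_analysis is importable (as in test_spider.py), that requests, snownlp and matplotlib with the SimHei font are installed, and that the waimai.baidu.com comment endpoint used by spider.py still responds; the shop id is the sample id taken from test_spider.py, and the file name run_pipeline.py is purely hypothetical.

# run_pipeline.py -- illustrative sketch only, not part of the repository.
# Chains the crawler, a SnowNLP sentiment pass and the picturing charts.
import json

import matplotlib.pyplot as plt
from snownlp import SnowNLP

from txt_analysis import picturing, spider

# Same font setup as emo_pic.py so Chinese labels and the minus sign render correctly.
plt.rcParams["font.sans-serif"] = ["SimHei"]
plt.rcParams["axes.unicode_minus"] = False

SHOP_ID = "1438461721"  # sample shop id reused from test_spider.py


def main():
    # 1. Crawl the comment pages and keep a local JSON copy (same shape as crawler.txt).
    result = spider.crawl(SHOP_ID)
    with open("crawler.txt", "w", encoding="utf-8") as f:
        json.dump(result, f, ensure_ascii=False, indent=2)

    # 2. Score every non-empty comment with SnowNLP: 0.0 is negative, 1.0 is positive.
    sentiments_list = [SnowNLP(text).sentiments for text in result["content"] if text.strip()]

    # 3. Plot the sentiment distribution, mirroring the histogram in emo_pic.py.
    plt.hist(sentiments_list, bins=20)
    plt.xlabel("情感值")
    plt.ylabel("评论数目")
    plt.title("整体情感极性分布图")
    plt.show()
    plt.close()

    # 4. Reuse the structured-data charts defined in picturing.py.
    picturing.score_detail(result)
    picturing.average_score(result)
    picturing.cost_time(result)


if __name__ == "__main__":
    main()

picturing.dish_score_detail, picturing.s_from and picturing.recommend_dishes2 can be called the same way; each chart opens in its own window because every plotting function in picturing.py ends with plt.show() followed by plt.close().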