What I'm doing:
An RSS reader with a handful (10 to 15) of fixed feeds.
The problem:
When I hit refresh in the browser, the page takes about 15 seconds to load.
I know most of that time is spent waiting while the server iterates over every feed and downloads all of its entries.
Maybe AJAX is the solution?
The code:
Here is the view:
@app.route('/')
def index():
    RSS_URLS = [
        'http://feeds.feedburner.com/RockPaperShotgun',
        'http://www.gameinformer.com/b/MainFeed.aspx?Tags=preview',
        'http://www.polygon.com/rss/group/news/index.xml',
    ]
    entries = []
    for url in RSS_URLS:
        entries.extend(feedparser.parse(url).entries)
    entries_sorted = sorted(
        entries,
        key=lambda e: e.published_parsed,
        reverse=True)
    return render_template(
        'index.html',
        entries=entries_sorted
    )
Here is the template:
{% block content %}
<div class="row">
  {% for e in entries %}
  <div class="col-md-4 col-lg-3">
    <h1><a href="{{ e.link }}">{{ e.title }}</a></h1>
    <h5>Published on: {{ e.published }}</h5>
    {% for content in e.content %}
    <p>{{ content.value|safe }}</p>
    {% else %}
    <p>{{ e.summary_detail.value|safe }}</p>
    {% endfor %}
  </div>
  {% endfor %}
</div>
{% endblock %}
Best answer
You can fetch the feeds in parallel: see "Practical threaded programming with Python" and Eventlet. Here are some code examples:
import feedparser

def parallel_with_gevent():
    import gevent.monkey
    gevent.monkey.patch_all()
    from gevent.pool import Pool

    # limit ourselves to max 10 simultaneous outstanding requests
    pool = Pool(10)

    def handle_one_url(url):
        parsed = feedparser.parse(url)
        if parsed.entries:
            print('Found entry:', parsed.entries[0])

    for url in LIST_OF_URLS:
        pool.spawn(handle_one_url, url)
    pool.join()
I use a cache file for the same scenario.
import os
import time

def update_cache(tmp_file, cache):
    """ logic to update cache """
    pass

def return_cache(tmp_file, update_time_sec):
    # Serve the cache only while the file is newer than the cutoff.
    if os.path.getctime(tmp_file) > (time.time() - update_time_sec):
        with open(tmp_file, "r") as data:
            return data.read()
    return None
@app.route('/')
def index():
    entries_sorted = return_cache(tmp_file, update_time_sec)
    if entries_sorted is not None:
        return render_template(
            'index.html',
            entries=entries_sorted
        )
    RSS_URLS = [
        'http://feeds.feedburner.com/RockPaperShotgun',
        'http://www.gameinformer.com/b/MainFeed.aspx?Tags=preview',
        'http://www.polygon.com/rss/group/news/index.xml',
    ]
    entries = []
    for url in RSS_URLS:
        entries.extend(feedparser.parse(url).entries)
    entries_sorted = sorted(
        entries,
        key=lambda e: e.published_parsed,
        reverse=True)
    update_cache(tmp_file, entries_sorted)
    return render_template(
        'index.html',
        entries=entries_sorted
    )
Regarding "python - Flask + feedparser RSS reader takes too long to load (15 seconds). How can I reduce this?", there is a similar question on Stack Overflow: https://stackoverflow.com/questions/28719920/