What is the best way to benchmark a simple Python web service deployed with uWSGI, nginx, and PyPy?
How can I know what average results to expect given my hardware specs?
I am using ab:
root@# ab -kc 10 -n 1000 http://domain:8080/
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking domain (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests
Server Software:
Server Hostname: domain
Server Port: 8080
Document Path: /
Document Length: 11 bytes
Concurrency Level: 10
Time taken for tests: 21.567 seconds
Complete requests: 1000
Failed requests: 0
Write errors: 0
Keep-Alive requests: 0
Total transferred: 55000 bytes
HTML transferred: 11000 bytes
Requests per second: 46.37 [#/sec] (mean)
Time per request: 215.673 [ms] (mean)
Time per request: 21.567 [ms] (mean, across all concurrent requests)
Transfer rate: 2.49 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 103 109 63.1 104 1107
Processing: 103 105 1.6 105 109
Waiting: 103 105 1.6 105 109
Total: 206 214 63.2 209 1215
Percentage of the requests served within a certain time (ms)
50% 209
66% 210
75% 210
80% 215
90% 216
95% 216
98% 216
99% 216
100% 1215 (longest request)
But I don't know whether these values are normal. To me this seems very, very slow: 46 req/s with only 10 concurrent connections?
When I use 100 concurrent connections, my web server crashes and stops responding.
Where am I going wrong?
When I use 1000 concurrent connections, ab cannot finish the test:
Completed 1000 requests
Completed 2000 requests
Completed 3000 requests
Completed 4000 requests
Completed 5000 requests
Completed 6000 requests
Completed 7000 requests
Completed 8000 requests
Completed 9000 requests
apr_socket_recv: Connection reset by peer (104)
Total of 9894 requests completed
uWSGI is started inside a virtualenv:
uwsgi --http :8080 --wsgi-file foobar.py --master --processes 4 --threads 2 --stats :9191
My hardware: 2 GB RAM, 2 cores (Linode 2GB)
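The contents of foobar.py are not shown in the question, but the benchmark reports "Document Length: 11 bytes", which is consistent with a minimal "Hello World" WSGI app. A plausible sketch of such a file (the function body is an assumption, not the asker's actual code):

```python
# Hypothetical reconstruction of foobar.py: a minimal WSGI app whose
# response body is 11 bytes, matching "Document Length: 11 bytes" above.
def application(environ, start_response):
    body = b"Hello World"  # 11 bytes
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]
```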
Best answer
I'm new here, but I'll do my best to help. The numbers you are seeing are normal, and I don't think you've necessarily misconfigured anything. A concurrency level of 10 means the benchmark runs 10 concurrent threads issuing requests at the same time. The requests-per-second figure you are seeing is a function of that offered load, not the server's maximum response rate.
To understand how fast the server is, look at "Time per request", i.e. the response time. This is how long one concurrent connection's thread takes to issue a request and receive the complete response. Here, a mean of 215 ms is decent. The percentile breakdown below it is especially useful: it shows that 90% of requests were served within 216 ms, close to your mean, which means the server's response times are quite consistent. Try increasing the number of concurrent connections or the number of requests each one sends. That will raise your requests-per-second figure and put more stress on the server. You can then measure what response time to expect at a given requests-per-second rate (a realistic traffic load).
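The relationship between the two numbers above follows Little's law: sustained throughput is roughly concurrency divided by mean response time. A quick sanity check against the figures reported by ab in the question:

```python
# Little's law: throughput (req/s) ≈ concurrency / mean response time.
# Values taken from the ab output in the question.
concurrency = 10                 # "Concurrency Level: 10"
mean_latency_s = 0.215673        # "Time per request: 215.673 [ms] (mean)"

expected_rps = concurrency / mean_latency_s
print(round(expected_rps, 2))    # matches the reported 46.37 [#/sec]
```

This is why 46 req/s at 10 connections is not evidence of a slow server: with ~215 ms per round trip, 10 connections simply cannot submit requests any faster. Raising concurrency (while the per-request latency holds) raises throughput proportionally.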
Hope this helps!
Regarding "python - benchmarking a uwsgi + nginx + pypy web service", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/24373007/