Target
Handle 100K TPS per core for a JSON-based service on 64-bit Linux (Red Hat).
After deep investigation
Comparison of the following web servers:
Undertow with Servlet
Tomcat with Servlet
Jetty with Servlet
Frameworks
Play http://www.playframework.com/
spray.io http://spray.io/
Low-level frameworks
Netty
ZeroMQ
Conclusions
1. To reach this load, the service I/O thread has to be released as soon as possible.
2. Requests with a single entry vs. requests in bulk mode (see the client-side sketch below).
3. Zero calculation in the I/O thread.
That's how I reached the performance target.
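To make point 2 concrete, here is a minimal client-side sketch of single-entry vs. bulk mode. The /ingest endpoint, port 8080 and the {id, value} entry format are made up for illustration; it uses java.net.http.HttpClient (Java 11+).

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class BulkVsSingle {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Single-entry mode: one HTTP round trip per entry (100K requests for 100K entries).
        HttpRequest single = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/ingest"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("[{\"id\":1,\"value\":42}]"))
                .build();
        client.send(single, HttpResponse.BodyHandlers.discarding());

        // Bulk mode: the same 100K entries packed into one JSON array in one request.
        String bulkBody = IntStream.range(0, 100_000)
                .mapToObj(i -> "{\"id\":" + i + ",\"value\":42}")
                .collect(Collectors.joining(",", "[", "]"));
        HttpRequest bulk = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/ingest"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(bulkBody))
                .build();
        client.send(bulk, HttpResponse.BodyHandlers.discarding());
    }
}

Packing 100K entries into one POST body pays the HTTP and dispatch overhead once per batch instead of once per entry.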
Final architecture:
1. Jetty with a servlet-based service (POST implementation).
2. Bulk mode with 100K entries per request.
3. Release the request ASAP (return 200 as soon as possible, then do the processing; see the servlet sketch after this list).
4. Still a synchronous servlet.
5. JMeter load testing.
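A minimal sketch of this architecture, assuming embedded Jetty 9 with javax.servlet; the /ingest path, port 8080 and the queue size are placeholders, and the actual processing of the bulk payload is only indicated by a comment.

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;

public class BulkIngestServer {

    // Bounded queue decouples the servlet (I/O) thread from the processing thread.
    private static final BlockingQueue<String> QUEUE = new ArrayBlockingQueue<>(1024);

    // Synchronous servlet: read the bulk payload, enqueue it, answer 200 immediately.
    public static class BulkIngestServlet extends HttpServlet {
        @Override
        protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // Read the raw request body; no JSON parsing here (zero calculation in the I/O thread).
            byte[] body = req.getInputStream().readAllBytes();
            // Hand the payload off; if the queue is full, the payload is dropped in this sketch.
            QUEUE.offer(new String(body, StandardCharsets.UTF_8));
            resp.setStatus(HttpServletResponse.SC_OK);
        }
    }

    public static void main(String[] args) throws Exception {
        // Worker thread does the actual parsing/processing outside the I/O thread.
        Thread worker = new Thread(() -> {
            while (true) {
                try {
                    String bulkPayload = QUEUE.take();
                    // process(bulkPayload): parse the 100K entries and handle them here.
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        worker.setDaemon(true);
        worker.start();

        Server server = new Server(8080);
        ServletContextHandler context = new ServletContextHandler();
        context.setContextPath("/");
        context.addServlet(new ServletHolder(new BulkIngestServlet()), "/ingest");
        server.setHandler(context);
        server.start();
        server.join();
    }
}

The servlet thread only copies the raw body into the queue and returns 200; parsing of the 100K entries happens on the worker thread, which is what keeps the I/O thread free.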
Measuring
On the server side, by defining an int[120], taking System.currentTimeMillis() / 1000 modulo 120, and incrementing the appropriate slot in the array:
myArray[(int) ((System.currentTimeMillis() / 1000) % 120)]++;
then printing the array once every 2 minutes and zeroing it.
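A fuller sketch of that counter. The AtomicIntegerArray (instead of the plain int[]) and the ScheduledExecutorService for the 2-minute print are my additions for thread-safety and scheduling.

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicIntegerArray;

public class TpsCounter {
    // One slot per second over a 120-second window.
    private static final AtomicIntegerArray COUNTS = new AtomicIntegerArray(120);

    // Call this for every counted request (or once per entry of a bulk request).
    public static void record() {
        int slot = (int) ((System.currentTimeMillis() / 1000) % 120);
        COUNTS.incrementAndGet(slot);
    }

    // Print the per-second counts every 2 minutes, then zero the window.
    public static void startReporting() {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            StringBuilder sb = new StringBuilder("tps per second: ");
            for (int i = 0; i < COUNTS.length(); i++) {
                sb.append(COUNTS.getAndSet(i, 0)).append(' ');
            }
            System.out.println(sb);
        }, 2, 2, TimeUnit.MINUTES);
    }
}

Call TpsCounter.record() from the servlet's doPost and TpsCounter.startReporting() at startup; each slot then holds the number of requests counted in that second of the 120-second window.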
Limiting the service to a single core
taskset -c 0 mycommand --option # start a command with the given affinity
taskset -c 0 -p 1234 # set the affinity of a running process
BTW: when the process was already running, taskset -p <PID> didn't work.
Future investigation
1. Async servlet.
2. Akka-based async service.
3. Netty + RESTEasy framework.