
Conversation

@brianc (Owner) commented Apr 1, 2020

One of the things causing slowdowns was using a transform stream to parse messages. It's not a huge slowdown on its own, but there is overhead in pushing every message through an additional event emitter versus handing it to a single callback. There is a lot more work to do here, but I'm going to merge this into master so CI can start running on the new code path and keep things from drifting again.
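For illustration, a rough sketch of the two shapes being compared; this is not the actual pg-protocol code, and `parseMessages` is a hypothetical stand-in for the message parsing logic:

```js
const { Transform } = require('stream')

// Transform-stream shape: every parsed message is re-emitted through a stream,
// paying EventEmitter overhead ('data' events, backpressure bookkeeping) per message.
class MessageStream extends Transform {
  constructor() {
    super({ readableObjectMode: true })
  }
  _transform(chunk, _enc, done) {
    for (const message of parseMessages(chunk)) {
      this.push(message) // each push becomes a 'data' event downstream
    }
    done()
  }
}

// Callback shape: the socket's 'data' handler passes each parsed message
// straight to one callback, with no intermediate stream in the hot path.
function parse(stream, callback) {
  stream.on('data', (chunk) => {
    for (const message of parseMessages(chunk)) {
      callback(message)
    }
  })
}
```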

I also brought the new connection code in line w/ the existing code & changes from 8.0. I'm adding the new parsing code path to Travis so I can ensure it works in all supported node versions.
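For context, a hypothetical sketch of how a Travis matrix entry for the new code path might look; the repo's actual .travis.yml is organized differently, and the Node versions below are placeholders:

```yaml
# Hypothetical sketch, not the repo's actual .travis.yml.
language: node_js
node_js:
  - "10"
  - "12"
env:
  # run the suite once on the default path and once on the new parser
  - PG_FAST_CONNECTION=false
  - PG_FAST_CONNECTION=true
```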

I'll try to include my very, very crude benchmark numbers w/ PRs to track perf progress.

$ node bench && PG_FAST_CONNECTION=true node bench
warmup done

little queries: 3136
qps 627.2
on my laptop best so far seen 733 qps

sequence queries: 5305
qps 1061
on my laptop best so far seen 1209 qps

insert queries: 28998
qps 5799.6
on my laptop best so far seen 5600 qps

using faster connection
warmup done

little queries: 3607
qps 721.4
on my laptop best so far seen 733 qps

sequence queries: 6048
qps 1209.6
on my laptop best so far seen 1209 qps

insert queries: 24976
qps 4995.2
on my laptop best so far seen 5600 qps
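
For reference, a minimal sketch of what a crude fixed-window qps harness like this could look like. The actual bench script isn't shown in this PR; the 5-second window is an assumption inferred from the numbers above (e.g. 3136 / 5 = 627.2), and the query text is a placeholder:

```js
// Rough sketch only; the real bench script may differ. Assumes the pg Client,
// a reachable database, and a 5 second measurement window per benchmark.
const { Client } = require('pg')

async function bench(name, client, text, seconds = 5) {
  const stop = Date.now() + seconds * 1000
  let count = 0
  while (Date.now() < stop) {
    await client.query(text)
    count++
  }
  console.log(`${name}: ${count}`)
  console.log(`qps ${count / seconds}`)
}

async function run() {
  const client = new Client()
  await client.connect()
  console.log('warmup done')
  await bench('little queries', client, 'SELECT now()')
  await client.end()
}

run().catch((err) => {
  console.error(err)
  process.exit(1)
})
```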

@brianc merged commit 2013d77 into master on Apr 2, 2020
@brianc deleted the bmc/messing-with-parser-speed branch on Apr 2, 2020 at 21:48
