Message from JavaScript discussions
February 2018
— And then you can write some more
So assuming you are using MongoDB or even MySQL and you select more than 1 million records at once... do you mean Node will use JS arrays to start streaming the data back to the user? Hence the statistics we see online about Node being faster than some languages like Python and PHP?
— Well, using JSON you'd have to load it all into Node at once first
— But yeah, that's the idea: using a simpler format you can load the records one at a time and write each to the response stream
— That makes it possible to send 1M records without ever keeping 1M in memory at once
— You'll be receiving data from the DB and at the same time writing data to the response stream
— Pumping it through
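A minimal sketch of that pumping idea, using only Node's core http module. The async generator below is a stand-in for a real database cursor (a MongoDB or MySQL driver would hand rows back one at a time in much the same way), and the record shape, count, and port are illustrative assumptions rather than anything from the discussion:

```js
const http = require('http');

// Stand-in for a driver cursor: yields one record at a time.
// A real cursor from a DB driver would be consumed the same way.
async function* fakeCursor(count) {
  for (let i = 0; i < count; i++) {
    yield { id: i, name: `record-${i}` };
  }
}

http.createServer(async (req, res) => {
  // One JSON document per line (newline-delimited JSON) instead of
  // one giant JSON array, so nothing has to be buffered in full.
  res.writeHead(200, { 'Content-Type': 'application/x-ndjson' });

  // Read from the "DB" and write to the response at the same time,
  // keeping only one record in memory per iteration.
  for await (const record of fakeCursor(1_000_000)) {
    const ok = res.write(JSON.stringify(record) + '\n');
    // Respect backpressure: wait for the response stream to drain
    // before pulling the next record.
    if (!ok) await new Promise((resolve) => res.once('drain', resolve));
  }

  res.end();
}).listen(3000);
```

Writing newline-delimited JSON is what lets the response be produced record by record, and the drain check keeps the database read from outrunning a slow client.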
— Whoa... That sounds awesome... I need to learn these things in more depth... Do you have any recommendations for reading materials?
— https://nodejs.org/api/ is a must
— Also the book Node.js Design Patterns
— Also ^
— (For general language stuff, also some Node stuff there)