Why does MongoDB not return more than 100 records?

Tell me why the query fails when the collection has more than 100 entries; with 100 or fewer everything works fine.
db.collection('allowedmacs').find().toArray(function(err, docs) {

The err argument contains this error:
name: 'MongoError',
message: 'connection destroyed, not possible to instantiate cursor'
July 4th 19 at 23:51
1 answer
July 4th 19 at 23:53
You need to use a cursor here.

var MongoClient = require('mongodb').MongoClient;
var assert = require('assert');

var findMacs = function(db, callback) {
 var cursor = db.collection('allowedmacs').find();
 cursor.each(function(err, doc) {
 assert.equal(err, null);
 if (doc != null) {
 console.dir(doc); // process one document at a time
 } else {
 callback(); // cursor exhausted
 }
 });
};

MongoClient.connect(url, function(err, db) {
 assert.equal(null, err);
 findMacs(db, function() {
 db.close();
 });
});
I read about that, but I need a JSON array to send to the browser, so what is the point of handling it record by record? Besides, inside the callback another query is made, whose output forms the final answer. - Assunta57 commented on July 4th 19 at 23:56
: the point is that you could run out of memory when fetching a large dataset. Think about what would happen if a similar query hit a database with a billion records?
Make a request with a limit, db.collection('allowedmacs').find().limit(1000), and then walk the cursor in a loop, pushing the documents into one big array. - raou commented on July 4th 19 at 23:59
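The pattern raou describes can be sketched as follows. To keep the sketch self-contained, a stub cursor stands in for a real driver cursor; the collection name allowedmacs and the limit of 1000 come from the thread, while the helper names makeStubCursor and collectDocs are made up for illustration:

```javascript
// Stub cursor mimicking the driver's cursor.each(callback) contract:
// the callback fires once per document, then once more with doc === null.
function makeStubCursor(docs) {
  return {
    each: function (cb) {
      docs.forEach(function (doc) { cb(null, doc); });
      cb(null, null); // end-of-results marker
    }
  };
}

// Walk the cursor, pushing every document into one array, then hand the
// finished array to the callback so it can be serialized for the browser.
function collectDocs(cursor, done) {
  var results = [];
  cursor.each(function (err, doc) {
    if (err) { return done(err); }
    if (doc !== null) {
      results.push(doc);
    } else {
      done(null, results); // cursor exhausted
    }
  });
}

// With the real driver the cursor would come from:
//   db.collection('allowedmacs').find().limit(1000)
collectDocs(makeStubCursor([{ mac: 'aa:bb' }, { mac: 'cc:dd' }]), function (err, macs) {
  console.log(JSON.stringify(macs)); // one JSON array for the browser
});
```

With the real driver you would pass the limited cursor to collectDocs and JSON-serialize the finished array in the response; the limit caps how much memory a single request can consume.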
