Message from JavaScript discussions
July 2017
— Heh, you're the only person I know that optimizes by adding complexity
— Whenever I design algorithms they tend to get really fat before they get smaller; I do a lot of refactoring after I make a (usually too complex) minimally functional example
— This thing *used* to be a thunk, now it's some magical thing...
// Formats & inserts data into `templDocs`, then inserts it into `destElems`.
function insertionThunk(templDoc) {
    var templDocs = [templDoc];
    if (conf.paged && conf.elemPerPage > 1) {
        // Clone the template until templDocs holds one document per element on the page
        // (the original templDoc already counts as the first one).
        for (var loc = 1; loc < conf.elemPerPage; loc++) {
            templDocs[templDocs.length] = templDoc.cloneNode(true);
        }
    }
    return insertTemplates(injectData(templDocs, conf), destElems, conf);
}
— templDocs[templDocs.length] = templDoc.cloneNode(true);
— Is this really faster than .push?
— Push is extremely slow
— Never use push when you need performance
— Huh
— Interesting
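
A minimal micro-benchmark sketch of the two append styles (the benchIndexAssign/benchPush helpers are hypothetical, not from the discussion); absolute numbers depend heavily on the engine and JIT warm-up, so treat any single run as anecdotal rather than proof that push is slow.

// Appends n integers with direct index assignment and returns the elapsed milliseconds.
function benchIndexAssign(n) {
    var arr = [];
    var start = Date.now();
    for (var i = 0; i < n; i++) {
        arr[arr.length] = i;
    }
    return Date.now() - start;
}

// Same workload, but using Array.prototype.push.
function benchPush(n) {
    var arr = [];
    var start = Date.now();
    for (var i = 0; i < n; i++) {
        arr.push(i);
    }
    return Date.now() - start;
}

console.log('arr[arr.length] = x: ' + benchIndexAssign(5000000) + 'ms');
console.log('arr.push(x):         ' + benchPush(5000000) + 'ms');

Each version is written as its own inlined loop rather than passing an append callback, so the timing isn't dominated by function-call overhead.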
— Also, length on arrays is the same as storing your own variable, no overhead at all
— Well, only if you're modifying it
— I mean, for reading
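
A similarly hedged sketch for the length claim: reading arr.length on every iteration versus caching it in a local (function names here are made up). For plain arrays in modern engines the gap is usually negligible, but this is one way to check.

// Sums an array, reading arr.length on every iteration.
function sumReadingLength(arr) {
    var total = 0;
    for (var i = 0; i < arr.length; i++) {
        total += arr[i];
    }
    return total;
}

// Same sum, with the length cached in a local variable up front.
function sumCachedLength(arr) {
    var total = 0;
    for (var i = 0, len = arr.length; i < len; i++) {
        total += arr[i];
    }
    return total;
}

var data = [];
for (var i = 0; i < 1000000; i++) {
    data[data.length] = i;
}

console.time('arr.length every iteration');
sumReadingLength(data);
console.timeEnd('arr.length every iteration');

console.time('cached length');
sumCachedLength(data);
console.timeEnd('cached length');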