Tuesday, January 17, 2017

How to determine the time it takes to call a function x number of times

I'm working on a function that measures how long it takes to run another function, so I can see how long my code takes to execute.

Here is the code:

public function profileCall(name:String, repeatCount:int, method:Function, ...args):Number {
    var time:int = getTimer(); // returns time in milliseconds since app start
    var average:Number;

    for (var i:int = 0; i < repeatCount; i++) {
        method.apply(null, args);
    }

    time = getTimer() - time;
    average = repeatCount > 0 ? time / repeatCount : 0;

    return average;
}

var result:Number = profileCall("myCode", 1000, myCode);
trace("Time taken: " + result); // Time taken: .01

I thought it was working, but then I noticed in one test that running it 100 times reports 0.1, running it 1000 times reports 0.01, and running it 10000 times reports 0.001.

Is it getting more accurate, or is there a miscalculation? I know that on repeated calls the CPU can cache work and decrease execution time, but these numbers don't look right. Forgive me if this is a basic question; it has been a long day.
