The latter point is of some interest: how do you meaningfully compare the performance of tools that interact heavily with operating system components exhibiting high and variable latency?
The answer I've used for many years relies (on Windows) on the ptime tool. The command-lines are (for the C++ program):
ptime --r3 --ah -- ...\CSharpSourceFinder\cpp\CSharpSourceFinder\vc9\Release\CSharpSourceFinder.cpp.exe
and (for the C# program):
ptime --r3 --ah -- ...\CSharpSourceFinder\dotnet\CSharpSourceFinder\bin\Release\CSharpSourceFinder.exe
I'll leave you to peruse the ptime usage, but will note that each of these command-lines runs the search three times, discards the largest time (which we can reasonably presume is the first run), and averages the remaining runs. It's not ultimate precision, but it's pretty effective at discarding unrepresentative measurements in such conditions, and gives figures within (I'm guessing now) 5% of their true values.
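For readers who prefer to see the strategy spelled out, here is a minimal C++ sketch of the same idea: time a child process several times, drop the largest measurement, and average the rest. This is not ptime's implementation; the command string and run count are illustrative assumptions only.

// Sketch of the run-N-times / discard-max / average-the-rest timing strategy.
// The target command and run count here are hypothetical placeholders.
#include <algorithm>
#include <chrono>
#include <cstdlib>
#include <iostream>
#include <numeric>
#include <vector>

int main()
{
    char const* const command = "CSharpSourceFinder.cpp.exe"; // hypothetical target
    int const         runs    = 3;

    std::vector<double> times;
    for (int i = 0; i != runs; ++i)
    {
        auto const start = std::chrono::steady_clock::now();
        std::system(command);                                  // run the search
        auto const stop  = std::chrono::steady_clock::now();
        times.push_back(std::chrono::duration<double>(stop - start).count());
    }

    // Drop the largest time (presumed the cold first run), then average the rest.
    times.erase(std::max_element(times.begin(), times.end()));
    double const average = std::accumulate(times.begin(), times.end(), 0.0) / times.size();

    std::cout << "average of remaining runs: " << average << "s\n";
    return 0;
}

The discard-the-maximum step is what does the useful work here: it removes the one measurement most likely to be inflated by cold file-system caches or other start-up costs before the average is taken.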