On Thu, Jul 30, 2015 at 10:51 AM, Dimitris Chloupis <kilon.alios@gmail.com> wrote:

> Frankly I don't mind big images or big data, nor do I share the obsession
> with shrinking things down to a few MBs at a time when we are talking in TBs.

I usually do it because the time it takes to save an image grows with its size
(instead of being ~instant for 60 MB, it takes a couple of seconds for 600 MB)...
since I have an HDD.

> I was wondering whether it would be worth the effort, beyond the unit tests
> that check the behaviour of the code, to also have benchmark tests that must
> meet specific standards, so that a given method has to perform its tasks
> within a strict time budget. This way CI could automagically alert us not
> only to failing unit tests but also to failing benchmarks.

We already sort of have that:

TestAsserter>>should: aBlock notTakeMoreThanMilliseconds: anInteger

Although I see a problem: every machine performs differently. So we would need
to establish a baseline and then derive the allowed execution time from it
(e.g. execution shouldn't take more than 300% of the baseline).
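To make the idea concrete, here is a rough sketch of what such a
baseline-relative benchmark test could look like in SUnit. Only
should:notTakeMoreThanMilliseconds: comes from the thread; the class name,
the baseline block, the measured block, and the 3x factor are invented for
illustration.

TestCase subclass: #MooseBenchmarkTest
    instanceVariableNames: 'baselineMs'
    classVariableNames: ''
    category: 'Moose-Tests-Benchmarks'

MooseBenchmarkTest >> setUp
    super setUp.
    "Measure a cheap reference computation on the current machine."
    baselineMs := [ 1 to: 1000000 do: [ :i | i sqrt ] ] millisecondsToRun

MooseBenchmarkTest >> testSortingStaysWithinThreeTimesBaseline
    "Fails when the measured block takes more than 300% of the baseline.
    The sort below is just a placeholder for the real code under test."
    self
        should: [ (1 to: 100000) asOrderedCollection sort ]
        notTakeMoreThanMilliseconds: baselineMs * 3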
Peter
_______________________________________________
Moose-dev mailing list
Moose-dev@iam.unibe.ch
https://www.iam.unibe.ch/mailman/listinfo/moose-dev